KaBOB: ontology-based semantic integration of biomedical databases.
Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E
2015-04-23
The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration: establishing shared identity and shared meaning across heterogeneous biomedical data sources remains difficult. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrate it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats.
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
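The identifier aggregation and declarative forward-chaining described above can be sketched in a few lines. This is a minimal illustration of the technique, not KaBOB's actual rule language; the predicate name "sameAs" and the example identifiers are illustrative assumptions.

```python
# Minimal forward-chaining over RDF-style (subject, predicate, object) triples.
# Rule and predicate names are illustrative; KaBOB's real rule syntax differs.

def forward_chain(triples, rules):
    """Apply every rule repeatedly until no new triples are derived (fixed point)."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for new in rule(facts):
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

# Example rule: source-specific records denoting the same gene are aggregated
# under one concept by closing "sameAs" under symmetry and transitivity.
def same_as_closure(facts):
    derived = set()
    same = [(s, o) for (s, p, o) in facts if p == "sameAs"]
    for s, o in same:
        derived.add((o, "sameAs", s))            # symmetry
        for s2, o2 in same:
            if o == s2:
                derived.add((s, "sameAs", o2))   # transitivity
    return derived

triples = {
    ("uniprot:P04637", "sameAs", "hgnc:TP53"),
    ("hgnc:TP53", "sameAs", "ncbigene:7157"),
}
closed = forward_chain(triples, [same_as_closure])
```

Running the closure links all three source identifiers into one equivalence set, which is the effect the abstract describes for aggregating identifiers that denote the same biomedical concept.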
Integration of Schemas on the Pre-Design Level Using the KCPM-Approach
NASA Astrophysics Data System (ADS)
Vöhringer, Jürgen; Mayr, Heinrich C.
Integration is a central research and operational issue in information system design and development. It can be conducted on the system, schema, view, or data level. On the system level, integration deals with the progressive linking and testing of system components to merge their functional and technical characteristics and behavior into a comprehensive, interoperable system. Schema integration comprises the comparison and merging of two or more schemas, usually conceptual database schemas. Data integration deals with merging the contents of multiple sources of related data. View integration is similar to schema integration but focuses on views, and the queries over them, rather than on schemas. All these types of integration have in common that two or more sources are first compared, in order to identify matches and mismatches as well as conflicts and inconsistencies, and then merged. The sources may stem from heterogeneous companies, organizational units, or projects. Integration enables the reuse and combined use of source components.
SPARQL-enabled identifier conversion with Identifiers.org
Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille
2015-01-01
Motivation: On the semantic web, in life sciences in particular, data are often distributed across multiple resources. Each of these sources is likely to use its own Internationalized Resource Identifier (IRI) for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809
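The core idea, validating a local identifier against a registry pattern and expanding it into every known URI variant so that source and target identifiers can be matched, can be sketched as follows. The registry entry, its regular expression, and the URI forms below are simplified assumptions for illustration, not the live Identifiers.org registry contents.

```python
import re

# Toy registry in the spirit of Identifiers.org: each prefix carries a
# validation pattern and the URI forms under which its records appear.
REGISTRY = {
    "uniprot": {
        "pattern": r"^[OPQ][0-9][A-Z0-9]{3}[0-9]$",   # simplified, not UniProt's full rule
        "uri_forms": [
            "http://identifiers.org/uniprot/{id}",
            "http://purl.uniprot.org/uniprot/{id}",
        ],
    },
}

def identifier_variants(prefix, local_id):
    """Validate a local identifier against its registry pattern and expand it
    into all known URI variants, enabling cross-source matching."""
    entry = REGISTRY[prefix]
    if not re.match(entry["pattern"], local_id):
        raise ValueError(f"{local_id} does not match the {prefix} pattern")
    return [form.format(id=local_id) for form in entry["uri_forms"]]

variants = identifier_variants("uniprot", "P04637")
```

In a federated SPARQL query, each generated variant would be tried as a match against the target endpoint's identifiers; here the expansion itself is the point being illustrated.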
Transmission Integration | Grid Modernization | NREL
NREL researchers are identifying how grids with high penetrations of renewable energy sources operate; wind integration studies examine the effects of high renewable penetrations on Hawaiian grids.
Techniques to Access Databases and Integrate Data for Hydrologic Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.
2009-06-17
This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and retrieve the required data, and their ability to integrate the data into environmental models using the FRAMES environment.
EMISSION FACTORS FOR IRON AND STEEL SOURCES: CRITERIA AND TOXIC POLLUTANTS
The report provides a comprehensive set of emission factors for sources of both criteria and toxic air pollutants in integrated iron and steel plants and specialty electric arc shops (minimills). Emission factors are identified for process sources, and process and open source fug...
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
Brettin, Thomas S.; Cottingham, Robert W.; Griffith, Shelton D.; Quest, Daniel J.
2015-09-08
A system and method of integrating diverse sources of data and data streams is presented. The method can include: selecting a scenario based on a topic; creating a multi-relational directed graph based on the scenario; identifying and converting resources in accordance with the scenario and updating the graph based on those resources; identifying data feeds in accordance with the scenario and updating the graph based on the data feeds; identifying analytical routines in accordance with the scenario and updating the graph using those routines; and identifying data outputs in accordance with the scenario and defining queries to produce those outputs from the graph.
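The multi-relational directed graph at the center of the method above can be sketched as an edge map keyed by source node and relation label. The class, relation names, and scenario nodes below are invented for illustration and do not come from the patent.

```python
from collections import defaultdict

# A minimal multi-relational directed graph: the same pair of nodes can be
# connected by edges carrying different relation labels.
class MultiRelGraph:
    def __init__(self):
        self.edges = defaultdict(set)   # (src, relation) -> set of destinations

    def add(self, src, relation, dst):
        """Update the graph with one labeled edge (resource, feed, routine, ...)."""
        self.edges[(src, relation)].add(dst)

    def query(self, src, relation):
        """Return destinations reachable from src via the given relation."""
        return sorted(self.edges[(src, relation)])

# Hypothetical scenario: a disease-outbreak topic wired to data feeds and
# an analytical routine, mirroring the update steps in the abstract.
g = MultiRelGraph()
g.add("scenario:flu-outbreak", "uses_feed", "feed:er-admissions")
g.add("scenario:flu-outbreak", "uses_feed", "feed:pharmacy-sales")
g.add("feed:er-admissions", "analyzed_by", "routine:anomaly-detect")
```

Queries over (node, relation) pairs are then the mechanism for producing the data outputs the abstract mentions.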
Microgravity Experiments Safety and Integration Requirements Document Tree
NASA Technical Reports Server (NTRS)
Hogan, Jean M.
1995-01-01
This report is a document tree of the safety and integration documents required to develop a space experiment. Pertinent document information for each of the top level (tier one) safety and integration documents, and their applicable and reference (tier two) documents has been identified. This information includes: document title, revision level, configuration management, electronic availability, listed applicable and reference documents, source for obtaining the document, and document owner. One of the main conclusions of this report is that no single document tree exists for all safety and integration documents, regardless of the Shuttle carrier. This document also identifies the need for a single point of contact for customers wishing to access documents. The data in this report serves as a valuable information source for the NASA Lewis Research Center Project Documentation Center, as well as for all developers of space experiments.
NASA Astrophysics Data System (ADS)
Niu, H., Jr.
2015-12-01
Volatile organic compounds (VOCs) in the atmosphere have adverse impacts via three main pathways: photochemical ozone formation, secondary organic aerosol production, and direct toxicity to humans. Few studies have integrated these effects to prioritize control measures for VOC sources. In this study, we developed a multi-effect evaluation methodology based on updated emission inventories and source profiles, combined with ozone formation potential (OFP), secondary organic aerosol potential (SOAP), and VOC toxicity data to identify important emission sources and key species. We derived species-specific emission inventories for 152 sources. The OFPs, SOAPs, and toxicity of each source were determined, and the contribution and share of each source to each of these adverse effects was calculated. Weightings were assigned to the three adverse effects by expert scoring, and the integrated impact was determined. Using 2012 as the base year, solvent usage and industrial processes were found to be the most important anthropogenic sources, accounting for 24.2 and 23.1% of the integrated environmental effect, respectively. These were followed by biomass burning, transportation, and fossil fuel combustion, all of which had similar contributions ranging from 16.7 to 18.6%. The top five industrial sources, including plastic products, rubber products, chemical fiber products, the chemical industry, and oil refining, accounted for nearly 70.0% of industrial emissions. In China, emissions reductions are required for styrene, toluene, ethylene, benzene, and m/p-xylene. The 10 most abundant chemical species contributed 76.5% of the integrated impact. Beijing, Chongqing, Shanghai, Jiangsu, and Guangdong were the five leading provinces when considering the integrated effects. In addition, the chemical mass balance (CMB) model was used to verify the VOC inventories of 47 cities in China and to refine our evaluation results. We suggest that multi-effect evaluation is necessary to identify the need for abatement at the source type and substance levels.
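The weighted multi-effect aggregation described above reduces to a weighted sum of each source's share of the three effects. The weights and per-source shares below are invented numbers for illustration; the study's expert-scored weights are not given in the abstract.

```python
# Sketch of multi-effect source prioritization: combine each source's share of
# ozone formation (OFP), SOA formation (SOAP), and toxicity with expert weights.
def integrated_score(shares, weights):
    """shares and weights are dicts keyed by effect name; weights sum to 1."""
    return sum(weights[e] * shares[e] for e in weights)

# Hypothetical expert weights and per-source effect shares.
weights = {"ofp": 0.4, "soap": 0.35, "toxicity": 0.25}
sources = {
    "solvent_usage": {"ofp": 0.22, "soap": 0.30, "toxicity": 0.20},
    "fossil_fuel":   {"ofp": 0.18, "soap": 0.12, "toxicity": 0.15},
}
scores = {name: integrated_score(s, weights) for name, s in sources.items()}
```

Ranking sources by this score is what lets a single priority list reflect all three adverse effects at once.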
Gabr, Hesham; Chen, Xi; Zevallos-Carrasco, Oscar M; Viehland, Christian; Dandrige, Alexandria; Sarin, Neeru; Mahmoud, Tamer H; Vajzovic, Lejla; Izatt, Joseph A; Toth, Cynthia A
2018-01-10
To evaluate the use of live volumetric (4D) intraoperative swept-source microscope-integrated optical coherence tomography in vitrectomy for proliferative diabetic retinopathy complications. In this prospective study, we analyzed a subgroup of patients with proliferative diabetic retinopathy complications who required vitrectomy and who were imaged by the research swept-source microscope-integrated optical coherence tomography system. In near real time, images were displayed on a stereo heads-up display, facilitating intraoperative surgeon feedback. Postoperative review included scoring image quality, identifying different diabetic retinopathy-associated pathologies, and reviewing the intraoperatively documented surgeon feedback. Twenty eyes were included. Indications for vitrectomy were tractional retinal detachment (16 eyes), combined tractional-rhegmatogenous retinal detachment (2 eyes), and vitreous hemorrhage (2 eyes). Useful, good-quality 2D (B-scan) and 4D images were obtained in 16/20 eyes (80%). In these eyes, multiple diabetic retinopathy complications could be imaged. Swept-source microscope-integrated optical coherence tomography provided surgical guidance, e.g., in identifying dissection planes under fibrovascular membranes and in determining residual membranes and traction that would benefit from additional peeling. In 4/20 eyes (20%), acceptable images were captured, but they were not useful because the tractional retinal detachments were highly elevated, which made imaging challenging. Swept-source microscope-integrated optical coherence tomography can provide important guidance during surgery for proliferative diabetic retinopathy complications through intraoperative identification of different complications and facilitation of intraoperative decision making.
Multisource Data Integration in Remote Sensing
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1991-01-01
Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer-) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.
1990-05-01
shrinking water supply, and shrinking fuel reserves. As these challenges multiply, the source of solutions becomes difficult to identify. A cooperative ... movement of the people through political channels seems to be the emerging source.
Integrating Multiple Data Sources for Combinatorial Marker Discovery: A Study in Tumorigenesis.
Bandyopadhyay, Sanghamitra; Mallik, Saurav
2018-01-01
Identification of combinatorial markers from multiple data sources is a challenging task in bioinformatics. Here, we propose a novel computational framework for identifying significant combinatorial markers using both gene expression and methylation data. The gene expression and methylation data are integrated into a single continuous dataset as well as a (post-discretized) boolean dataset based on their intrinsic (i.e., inverse) relationship. A novel combined score of methylation and expression data is introduced, computed on the integrated continuous data, for identifying an initial non-redundant set of genes. Thereafter, (maximal) frequent closed homogeneous genesets are identified using a well-known biclustering algorithm applied to the integrated boolean data of the determined non-redundant set of genes. A novel sample-based weighted support is then proposed, calculated on the integrated boolean data of the non-redundant set of genes, in order to identify the non-redundant significant genesets. The top few resulting genesets are identified as potential combinatorial markers. Since our proposed method generates a smaller number of significant non-redundant genesets than other popular methods, it is also much faster. Application of the proposed technique to expression and methylation data for uterine tumor or prostate carcinoma produces a set of significant combinations of markers. We expect that such combinations of markers will produce fewer false positives than individual markers.
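The integration step above, combining expression and methylation per gene via their inverse relationship and then discretizing to a boolean state, can be sketched as follows. The scoring rule, threshold, and example z-scores are illustrative assumptions, not the paper's exact combined score.

```python
# Sketch: integrate expression and methylation z-scores per gene using their
# inverse relationship, then discretize to a boolean state for biclustering.

def combined_score(expr_z, meth_z):
    """High expression with low methylation (or vice versa) scores high;
    the averaging rule here is an illustrative stand-in."""
    return (expr_z - meth_z) / 2.0

def discretize(score, threshold=0.5):
    """Boolean state: True if the gene looks 'on' (over-expressed, under-methylated)."""
    return score > threshold

# Hypothetical per-gene (expression z, methylation z) pairs.
genes = {"TP53": (1.8, -1.2), "GAPDH": (0.1, 0.2)}
states = {g: discretize(combined_score(e, m)) for g, (e, m) in genes.items()}
```

The resulting boolean matrix over samples is what a biclustering algorithm would then mine for frequent closed homogeneous genesets.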
Integration of a clinical trial database with a PACS
NASA Astrophysics Data System (ADS)
van Herk, M.
2014-03-01
Many clinical trials use Electronic Case Report Forms (ECRF), e.g., from OpenClinica. Trial data is augmented if DICOM scans, dose cubes, etc. from the Picture Archiving and Communication System (PACS) are included for data mining. Unfortunately, there is as yet no structured way to collect DICOM objects in trial databases. In this paper, we obtain a tight integration of ECRF and PACS using open source software. Methods: DICOM identifiers for selected images/series/studies are stored in associated ECRF events (e.g., baseline) as follows: 1) JavaScript added to OpenClinica communicates using HTML with a gateway server inside the hospital's firewall; 2) On this gateway, an open source DICOM server runs scripts to query and select the data, returning anonymized identifiers; 3) The scripts then collect, anonymize, zip, and transmit selected data to a central trial server; 4) Here, data is stored in a DICOM archive which allows authorized ECRF users to view and download the anonymous images associated with each event. Results: All integration scripts are open source. The PACS administrator configures the anonymization script and decides to use the gateway in passive (receiving) mode or in an active mode going out to the PACS to gather data. Our ECRF-centric approach supports automatic data mining by iterating over the cases in the ECRF database, providing the identifiers to load images and the clinical data to correlate with image analysis results. Conclusions: Using open source software and web technology, a tight integration has been achieved between PACS and ECRF.
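The anonymization step in the gateway workflow above can be sketched as deriving a stable pseudonym from a DICOM UID plus a site-held secret, so the ECRF can reference images without exposing identity. The function name, prefix, and secret below are hypothetical; real DICOM de-identification also requires remapping many other attributes.

```python
import hashlib

# Sketch of a gateway anonymization step: a deterministic pseudonym is derived
# from a DICOM study/series UID and a site secret, so the same UID always maps
# to the same anonymized identifier without being reversible by outsiders.
def anonymize_uid(dicom_uid, site_secret):
    """Return a stable pseudonymous identifier for a DICOM UID."""
    digest = hashlib.sha256((site_secret + dicom_uid).encode()).hexdigest()
    return "anon-" + digest[:16]

pseudonym = anonymize_uid("1.2.840.113619.2.55.3", "trial-42-secret")
```

Determinism matters here: repeated exports of the same study must yield the same anonymized identifier so ECRF events keep pointing at the right images.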
Visual Feature Integration Indicated by Phase-Locked Frontal-Parietal EEG Signals
Phillips, Steven; Takeda, Yuji; Singh, Archana
2012-01-01
The capacity to integrate multiple sources of information is a prerequisite for complex cognitive ability, such as finding a target uniquely identifiable by the conjunction of two or more features. Recent studies identified greater frontal-parietal synchrony during conjunctive than non-conjunctive (feature) search. Whether this difference also reflects greater information integration, rather than just differences in cognitive strategy (e.g., top-down versus bottom-up control of attention), or task difficulty is uncertain. Here, we examine the first possibility by parametrically varying the number of integrated sources from one to three and measuring phase-locking values (PLV) of frontal-parietal EEG electrode signals, as indicators of synchrony. Linear regressions, under hierarchical false-discovery rate control, indicated significant positive slopes for number of sources on PLV in the 30–38 Hz, 175–250 ms post-stimulus frequency-time band for pairs in the sagittal plane (i.e., F3-P3, Fz-Pz, F4-P4), after equating conditions for behavioural performance (to exclude effects due to task difficulty). No such effects were observed for pairs in the transverse plane (i.e., F3-F4, C3-C4, P3-P4). These results provide support for the idea that anterior-posterior phase-locking in the lower gamma-band mediates integration of visual information. They also provide a potential window into cognitive development, seen as developing the capacity to integrate more sources of information. PMID:22427847
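The phase-locking value used above has a standard definition: the magnitude of the trial-averaged unit phasor of the phase difference between two signals. The toy phases below are made up for illustration; this is the textbook formula, not the study's full analysis pipeline.

```python
import cmath
import math

# Phase-locking value (PLV) between two electrode signals, given their
# instantaneous phases (one value per trial):
#   PLV = | (1/N) * sum over trials of exp(i * (phi_a - phi_b)) |
# PLV = 1 means a perfectly constant phase lag; PLV near 0 means random lags.
def plv(phases_a, phases_b):
    n = len(phases_a)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / n

locked = plv([0.1, 1.0, 2.0], [0.6, 1.5, 2.5])   # constant 0.5 rad lag
```

With a constant lag across trials the phasors align and the PLV is exactly 1; opposing lags cancel and drive it toward 0, which is what makes PLV an indicator of synchrony rather than of raw amplitude.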
Integration and Utilization of Nuclear Systems on the Moon and Mars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houts, Michael G.; Schmidt, George R.; Bragg-Sitton, Shannon
2006-01-20
Over the past five decades numerous studies have identified nuclear energy as an enhancing or enabling technology for planetary surface exploration missions. This includes both radioisotope and fission sources for providing both heat and electricity. Nuclear energy sources were used to provide electricity on Apollo missions 12, 14, 15, 16, and 17, and on the Mars Viking landers. Very small nuclear energy sources were used to provide heat on the Mars Pathfinder, Spirit, and Opportunity rovers. Research has been performed at NASA MSFC to help assess potential issues associated with surface nuclear energy sources, and to generate data that could be useful to a future program. Research areas include System Integration, use of Regolith as Radiation Shielding, Waste Heat Rejection, Surface Environmental Effects on the Integrated System, Thermal Simulators, Surface System Integration / Interface / Interaction Testing, End-to-End Breadboard Development, Advanced Materials Development, Surface Energy Source Coolants, and Planetary Surface System Thermal Management and Control. This paper provides a status update on several of these research areas.
An Integrated Forensics Approach To Fingerprint PCB Sources In Sediments Using RSC And ACF
Determining the original source of contamination to a heterogeneous matrix such as sediment is a requirement for both clean-up and compliance programs. Identifying the source of sediment contaminants in industrial settings is a pre-requisite to implementing any proposed se...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bragg-Sitton, Shannon; Boardman, Richard; Ruth, Mark
The U.S. Department of Energy (DOE) recognizes the need to transform the energy infrastructure of the U.S. and elsewhere to systems that can drastically reduce environmental impacts in an efficient and economically viable manner while utilizing both hydrocarbon resources and clean energy generation sources. Thus, DOE is supporting research and development that could lead to more efficient utilization of clean energy generation sources, including renewable and nuclear options. A concept being advanced by the DOE Offices of Nuclear Energy (NE) and Energy Efficiency and Renewable Energy (EERE) is tighter coupling of nuclear and renewable energy sources in a manner that produces new energy currency for the combined electricity grid, industrial manufacturing, and the transportation energy sectors. This integration concept has been referred to as a “hybrid system” that is capable of providing the right type of energy, at the right time, in the right place. At the direction of DOE-NE and DOE-EERE leadership, project leads at Idaho National Laboratory (INL), National Renewable Energy Laboratory (NREL) and Massachusetts Institute of Technology (MIT) have identified and engaged stakeholders in discussing integrated energy systems that would optimize renewable and nuclear energy integration on a region-by-region basis. Subsequent work will entail conduct of technical, economic, environmental and socio-political evaluations of the leading integrated system options based on a set of criteria established with stakeholder input. The Foundational Workshop for Integrated Nuclear – Renewable Energy Systems was organized around the following objectives: 1. Identify and refine priority region-specific opportunities for integrated nuclear-renewable energy systems in the U.S.; 2. Select Figures of Merit (FOM) to rank and prioritize candidate systems; 3. Discuss enabling technology development needs; 4. Identify analysis requirements, capabilities and gaps to estimate FOM for integrated system options; 5. Identify experimental needs to develop and demonstrate nuclear-renewable energy systems.
EnRICH: Extraction and Ranking using Integration and Criteria Heuristics.
Zhang, Xia; Greenlee, M Heather West; Serb, Jeanne M
2013-01-15
High throughput screening technologies enable biologists to generate candidate genes at a rate that, due to time and cost constraints, cannot be matched by experimental approaches in the laboratory. Thus, it has become increasingly important to prioritize candidate genes for experiments. To accomplish this, researchers need to apply selection requirements based on their knowledge, which necessitates qualitative integration of heterogeneous data sources and filtration using multiple criteria. A similar approach can also be applied to putative candidate gene relationships. While automation can assist in this routine and imperative procedure, flexibility of data sources and criteria must not be sacrificed. A tool that can optimize the trade-off between automation and flexibility to simultaneously filter and qualitatively integrate data is needed to prioritize candidate genes and generate composite networks from heterogeneous data sources. We developed the Java application EnRICH (Extraction and Ranking using Integration and Criteria Heuristics) to address this need. Here we present a case study in which we used EnRICH to integrate and filter multiple candidate gene lists in order to identify potential retinal disease genes. As a result of this procedure, a candidate pool of several hundred genes was narrowed down to five candidate genes, of which four are confirmed retinal disease genes and one is associated with a retinal disease state. We developed a platform-independent tool that is able to qualitatively integrate multiple heterogeneous datasets and use different selection criteria to filter each of them, provided the datasets are tables that have distinct identifiers (required) and attributes (optional). With the flexibility to specify data sources and filtering criteria, EnRICH automatically prioritizes candidate genes or gene relationships for biologists based on their specific requirements.
Here, we also demonstrate that this tool can be effectively and easily used to apply highly specific user-defined criteria and can efficiently identify high quality candidate genes from relatively sparse datasets.
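The per-source filtering and qualitative integration described above can be sketched as a vote count: each source applies its own criterion, and genes are ranked by how many filtered sources retain them. The data tables, column names, and criteria below are invented for illustration and are not EnRICH's actual interface.

```python
# Sketch of criteria-based filtering plus integration of candidate gene tables.
# Each source is a list of row dicts with a distinct "gene" identifier column.
def integrate(sources, criteria):
    """sources: {name: [row dicts]}; criteria: {name: predicate on a row}.
    Returns genes ranked by how many filtered sources retained them."""
    votes = {}
    for name, rows in sources.items():
        keep = criteria.get(name, lambda r: True)   # no criterion = keep all
        for row in rows:
            if keep(row):
                votes[row["gene"]] = votes.get(row["gene"], 0) + 1
    return sorted(votes, key=lambda g: -votes[g])

# Hypothetical candidate lists and user-defined selection criteria.
sources = {
    "expression": [{"gene": "RHO", "fold": 3.2}, {"gene": "ABC", "fold": 1.1}],
    "gwas":       [{"gene": "RHO", "p": 1e-6},   {"gene": "XYZ", "p": 0.2}],
}
criteria = {
    "expression": lambda r: r["fold"] > 2.0,
    "gwas":       lambda r: r["p"] < 0.05,
}
ranked = integrate(sources, criteria)
```

Only genes that survive their source-specific criteria accumulate votes, which is how a pool of hundreds of candidates can be narrowed to a handful.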
Time-integrated (typically 24-hr) filter-based methods (historical methods) form the underpinning of our understanding of the fate, impact of source emissions at receptor locations (source impacts), and potential health and welfare effects of particulate matter (PM) in air. Over...
Source recognition by stimulus content in the MTL.
Park, Heekyeong; Abellanoza, Cheryl; Schaeffer, James; Gandy, Kellen
2014-03-17
Source memory is considered to be the cornerstone of episodic memory that enables us to discriminate similar but different events. In the present fMRI study, we investigated whether neural correlates of source retrieval differed by stimulus content in the medial temporal lobe (MTL) when the item and context had been integrated as a perceptually unitized entity. Participants were presented with a list of items either in verbal or pictorial form overlaid on a colored square and instructed to integrate both the item and context into a single image. At test, participants judged the study status of test items and the color in which studied items were presented. Source recognition invariant of stimulus content elicited retrieval activity in both the left anterior hippocampus extending to the perirhinal cortex and the right posterior hippocampus. Word-selective source recognition was related to activity in the left perirhinal cortex, whereas picture-selective source recognition was identified in the left posterior hippocampus. Neural activity sensitive to novelty detection common to both words and pictures was found in the left anterior and right posterior hippocampus. Novelty detection selective to words was associated with the left perirhinal cortex, while activity sensitive to new pictures was identified in the bilateral hippocampus and adjacent MTL cortices, including the parahippocampal, entorhinal, and perirhinal cortices. These findings provide further support for the integral role of the hippocampus both in source recognition and in detection of new stimuli across stimulus content. Additionally, novelty effects in the MTL reveal the integral role of the MTL cortex as the interface for processing new information. Collectively, the present findings demonstrate the importance of the MTL for both previously experienced and novel events. Copyright © 2014 Elsevier B.V. All rights reserved.
Ray, Sumanta; Maulik, Ujjwal
2016-12-20
Detecting perturbation in modular structure during HIV-1 disease progression is an important step toward understanding the stage-specific infection pattern of the HIV-1 virus in human cells. In this article, we propose a novel methodology that integrates multiple types of biological information to identify such disruptions in human gene modules during different stages of HIV-1 infection. We integrate three different types of biological information - gene expression information, protein-protein interaction (PPI) information, and gene ontology information - into single gene meta-modules through non-negative matrix factorization (NMF). Because the identified meta-modules inherit this information, detecting their perturbation reflects changes in expression pattern, in PPI structure, and in the functional similarity of genes as infection progresses. NMF-based clustering is used to integrate modules from the different data sources into strong meta-modules. Perturbation in meta-modular structure is identified by investigating topological and intramodular properties and ranking the meta-modules with a rank aggregation algorithm. We also analyzed the preservation of significant GO terms in which the human proteins of the meta-modules participate. Moreover, we performed an analysis showing how the coregulation pattern of identified transcription factors (TFs) changes over the HIV progression stages.
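The rank aggregation step above, merging per-property rankings of the meta-modules into one consensus ordering, can be sketched with a Borda count, a common aggregation scheme; the abstract does not name the specific algorithm used, so this choice and the module names are illustrative.

```python
# Borda-count rank aggregation: each property (topological change,
# intramodular change, ...) orders all meta-modules best-first, and a module
# earns (n - position) points from each ranking; totals give the consensus.
def borda(rankings):
    """rankings: list of lists, each a best-first ordering of all modules."""
    n = len(rankings[0])
    score = {}
    for ranking in rankings:
        for pos, module in enumerate(ranking):
            score[module] = score.get(module, 0) + (n - pos)
    return sorted(score, key=lambda m: -score[m])

consensus = borda([
    ["M2", "M1", "M3"],   # hypothetical rank by topological disruption
    ["M2", "M3", "M1"],   # hypothetical rank by intramodular property change
])
```

A module ranked highly by several independent properties rises to the top of the consensus, which is the behavior needed to flag the most perturbed meta-modules.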
Chronic disease surveillance systems within the US Associated Pacific Island jurisdictions.
Hosey, Gwen; Ichiho, Henry; Satterfield, Dawn; Dankwa-Mullan, Irene; Kuartei, Stevenson; Rhee, Kyu; Belyeu-Camacho, Tayna; deBrum, Ione; Demei, Yorah; Lippwe, Kipier; Luces, Patrick Solidum; Roby, Faiese
2011-07-01
In recent years, illness and death due to chronic disease in the US Associated Pacific Islands (USAPI) jurisdictions have dramatically increased. Effective chronic disease surveillance can help monitor disease trends, evaluate public policy, prioritize resource allocation, and guide program planning, evaluation, and research. Although chronic disease surveillance is being conducted in the USAPI, no recently published capacity assessments for chronic disease surveillance are available. The objective of this study was to assess the quality of existing USAPI chronic disease data sources and identify jurisdictional capacity for chronic disease surveillance. The assessment included a chronic disease data source inventory, literature review, and review of surveillance documentation available from the web or through individual jurisdictions. We used the World Health Organization's Health Metric Network Framework to assess data source quality and to identify jurisdictional capacity. Results showed that USAPI data sources are generally aligned with widely accepted chronic disease surveillance indicators and use standardized data collection methodology to measure chronic disease behavioral risks, preventive practices, illness, and death. However, all jurisdictions need to strengthen chronic disease surveillance through continued assessment and expanded support for valid and reliable data collection, analysis and reporting, dissemination, and integration among population-based and institution-based data sources. For sustained improvement, we recommend investment and technical assistance in support of a chronic disease surveillance system that integrates population-based and institution-based data sources. An integrated strategy that bridges and links USAPI data sources can support evidence-based policy and population health interventions.
Anguita, Alberto; García-Remesal, Miguel; Graf, Norbert; Maojo, Victor
2016-04-01
Modern biomedical research relies on the semantic integration of heterogeneous data sources to find data correlations. Researchers access multiple datasets of disparate origin, and identify elements (e.g., genes, compounds, pathways) that lead to interesting correlations. Normally, they must refer to additional public databases in order to enrich the information about the identified entities (e.g., scientific literature, published clinical trial results). While semantic integration techniques have traditionally focused on providing homogeneous access to private datasets, thus helping automate the first part of the research, and different solutions exist for browsing public data, there is still a need for tools that facilitate merging public repositories with private datasets. This paper presents a framework that automatically locates public data of interest to the researcher and semantically integrates it with existing private datasets. The framework has been designed as an extension of traditional data integration systems, and has been validated with an existing data integration platform from a European research project by integrating a private biological dataset with data from the National Center for Biotechnology Information (NCBI). Copyright © 2016 Elsevier Inc. All rights reserved.
Stead, William W.; Miller, Randolph A.; Musen, Mark A.; Hersh, William R.
2000-01-01
The vision of integrating information—from a variety of sources, into the way people work, to improve decisions and process—is one of the cornerstones of biomedical informatics. Thoughts on how this vision might be realized have evolved as improvements in information and communication technologies, together with discoveries in biomedical informatics, have changed the art of the possible. This review identified three distinct generations of “integration” projects. First-generation projects create a database and use it for multiple purposes. Second-generation projects integrate by bringing information from various sources together through enterprise information architecture. Third-generation projects inter-relate disparate but accessible information sources to provide the appearance of integration. The review suggests that the ideas developed in the earlier generations have not been supplanted by ideas from subsequent generations. Instead, the ideas represent a continuum of progress along the three dimensions of workflow, structure, and extraction. PMID:10730596
Bayesian Integrated Microbial Forensics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, Kristin H.; Kreuzer-Martin, Helen W.; Wunschel, David S.
2008-06-01
In the aftermath of the 2001 anthrax letters, researchers have been exploring ways to predict the production environment of unknown source microorganisms. Different mass spectral techniques are being developed to characterize components of a microbe’s culture medium including water, carbon and nitrogen sources, metal ions added, and the presence of agar. Individually, each technique has the potential to identify one or two ingredients in a culture medium recipe. However, by integrating data from multiple mass spectral techniques, a more complete characterization is possible. We present a Bayesian statistical approach to integrated microbial forensics and illustrate its application on spores grown in different culture media.
Power source selection for neutral particle beam systems
NASA Astrophysics Data System (ADS)
Silverman, Sidney W.; Chi, John W. H.; Hill, Gregory
Space-based neutral particle beams (NPB) are being considered for use as an SDI weapon as well as a mid-course discriminator. These systems require a radio frequency (RF) power source. Five types of amplifiers were considered for the RF power source: the klystron, the klystrode, the tetrode, the cross field amplifier, and the solid state amplifier. A number of different types of power source systems (nuclear and non-nuclear) were considered for integration with these amplifiers. The most attractive amplifier power system concepts were identified through comparative evaluations that took into account the total masses of integrated amplifier power source systems as well as a number of other factors that consisted of development cost, technology risk, vulnerability, survivability, reliability, and impacts on spacecraft stabilization. These concepts are described and conclusions drawn.
Situational Leadership, Perception, and the Impact of Power.
ERIC Educational Resources Information Center
Hersey, Paul; And Others
1979-01-01
Integrates the concept of power with situational leadership by relating the perception of a leader's power bases with leadership styles. Sources of power are identified; situational leadership is reviewed; and the Power Perception Profile is discussed. Maturity levels and their relationships to power sources and leadership styles are discussed.…
Facility Registry Service (FRS)
This is a centrally managed database that identifies facilities either subject to environmental regulations or of environmental interest, providing an integrated source of air, water, and waste environmental data.
NASA Astrophysics Data System (ADS)
Shakak, N. B. I.
2018-04-01
Geographical information systems (GIS) and remote sensing are tools for acquiring data from space and for storing, analyzing, and displaying spatial data; they can also be used to investigate sources of environmental pollution that affect health. A Sudan Landsat mosaic image acquired in 2013 was used in this study to develop land-use and land-cover maps for two selected study areas: the Khartoum urban area, and Bara locality in North Kordofan state, western Sudan. The main objective was to assess the source of nitrate pollution in the shallow aquifer. ERDAS software was used to create land-cover/land-use maps for the study areas. For Khartoum town we used the 2013 Landsat mosaic image with supervised classification, which is more closely controlled than unsupervised classification: pixels are selected that represent recognizable patterns, drawing on knowledge of the data, the desired classes, and the algorithm to be used. In this paper we integrated GIS and remote sensing with stable-isotope methods for fingerprinting nitrate sources in shallow boreholes. The global positioning system (GPS) was used in the field to record shallow borehole locations in three-dimensional coordinates (latitude, longitude, and altitude). Water samples were collected from 19 shallow boreholes in the study areas according to the standard sampling method and sent to the laboratory to measure stable nitrogen (δ15N-nitrate) and nitrate-oxygen (δ18O-nitrate) isotopes. Analyses were conducted using isotope ratio mass spectrometry (IRMS). We conclude that spatial distribution and the integration of GIS and remote sensing help to identify the source of nitrate pollution.
O’Connor, Richard J.; Cummings, K. Michael; Rees, Vaughan W.; Connolly, Gregory N.; Norton, Kaila J.; Sweanor, David; Parascandola, Mark; Hatsukami, Dorothy K.; Shields, Peter G.
2015-01-01
Tobacco products are widely sold and marketed, yet integrated data systems for identifying, tracking, and characterizing products are lacking. Tobacco manufacturers recently have developed potential reduced-exposure products (PREPs) with implied or explicit health claims. Currently, a systematic approach for identifying, defining, and evaluating PREPs sold at the local, state or national levels in the US has not been developed. Identifying, characterizing, and monitoring new tobacco products could be greatly enhanced with a responsive surveillance system. This paper critically reviews available surveillance data sources for identifying and tracking tobacco products, including PREPs, evaluating strengths and weaknesses of potential data sources in light of their reliability and validity. Absent regulations mandating disclosure of product-specific information, it is likely that public health officials will need to rely on a variety of imperfect data sources to help identify, characterize, and monitor tobacco products, including PREPs. PMID:19959680
Matrix factorization-based data fusion for the prediction of lncRNA-disease associations.
Fu, Guangyuan; Wang, Jun; Domeniconi, Carlotta; Yu, Guoxian
2018-05-01
Long non-coding RNAs (lncRNAs) play crucial roles in complex disease diagnosis, prognosis, prevention and treatment, but only a small portion of lncRNA-disease associations have been experimentally verified. Various computational models have been proposed to identify lncRNA-disease associations by integrating heterogeneous data sources. However, existing models generally ignore the intrinsic structure of data sources or treat them as equally relevant, while they may not be. To accurately identify lncRNA-disease associations, we propose a Matrix Factorization based LncRNA-Disease Association prediction model (MFLDA in short). MFLDA decomposes data matrices of heterogeneous data sources into low-rank matrices via matrix tri-factorization to explore and exploit their intrinsic and shared structure. MFLDA can select and integrate the data sources by assigning different weights to them. An iterative solution is further introduced to simultaneously optimize the weights and low-rank matrices. Next, MFLDA uses the optimized low-rank matrices to reconstruct the lncRNA-disease association matrix and thus to identify potential associations. In 5-fold cross validation experiments to identify verified lncRNA-disease associations, MFLDA achieves an area under the receiver operating characteristic curve (AUC) of 0.7408, at least 3% higher than those given by state-of-the-art data fusion based computational models. An empirical study on identifying masked lncRNA-disease associations again shows that MFLDA can identify potential associations more accurately than competing models. A case study on identifying lncRNAs associated with breast, lung and stomach cancers shows that 38 out of 45 (84%) associations predicted by MFLDA are supported by recent biomedical literature, and further proves the capability of MFLDA in identifying novel lncRNA-disease associations.
MFLDA is a general data fusion framework, and as such it can be adopted to predict associations between other biological entities. The source code for MFLDA is available at: http://mlda.swu.edu.cn/codes.php?name=MFLDA. gxyu@swu.edu.cn. Supplementary data are available at Bioinformatics online.
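The matrix tri-factorization at the heart of the abstract above can be sketched as follows. This is a generic non-negative tri-factorization with multiplicative updates, in the spirit of MFLDA's low-rank decomposition, not the authors' implementation; the toy matrix, ranks, and variable names are all hypothetical.

```python
import numpy as np

def tri_factorize(X, k1, k2, n_iter=300, eps=1e-9, seed=0):
    """Decompose a non-negative matrix X ~= F @ S @ G.T with
    multiplicative updates (generic tri-factorization sketch)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    F = rng.random((n, k1)) + eps
    S = rng.random((k1, k2)) + eps
    G = rng.random((m, k2)) + eps
    for _ in range(n_iter):
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
    return F, S, G

# Toy lncRNA x disease association matrix (synthetic, for illustration).
X = np.random.default_rng(1).random((20, 15))
F, S, G = tri_factorize(X, k1=4, k2=3)
X_hat = F @ S @ G.T  # reconstructed scores rank candidate associations
```

High entries of `X_hat` that were unobserved in `X` would be the candidate novel associations under this scheme.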
NASA Astrophysics Data System (ADS)
Francisco, Glen; Brown, Todd
2012-06-01
Integrated security systems are essential to pre-empting criminal assaults. Nearly 500,000 sites have been identified (source: US DHS) as critical infrastructure sites that would suffer severe damage if a security breach should occur. One major breach in any of 123 U.S. facilities, identified as "most critical", threatens more than 1,000,000 people. The vulnerabilities of critical infrastructure are expected to continue and even heighten over the coming years.
Yu, Weiyu; Wardrop, Nicola A; Bain, Robert; Wright, Jim A
2017-07-01
Sustainable Development Goal (SDG) 6 has expanded the Millennium Development Goals' focus from improved drinking-water to safely managed water services. This expanded focus to include issues such as water quality requires richer monitoring data and potentially integration of datasets from different sources. Relevant data sets include water point mapping (WPM), i.e. surveys of boreholes, wells and other water points, as well as census and household survey data. This study examined inconsistencies between population census and WPM datasets for Cambodia, Liberia and Tanzania, and identified potential barriers to integrating the two datasets to meet monitoring needs. Literature on the number of people served per water point was used to convert WPM data to population served by water source type per area and compared with census reports. For Cambodia and Tanzania, discrepancies with census data suggested incomplete WPM coverage. In Liberia, where the data sets were consistent, WPM-derived data on functionality, quantity and quality of drinking water were further combined with census area statistics to generate an enhanced drinking-water access measure for protected wells and springs. The process revealed barriers to integrating census and WPM data, including exclusion of water points not used for drinking by households; difficulty matching census and WPM source types; temporal mismatches between data sources; data quality issues such as missing or implausible data values; and underlying assumptions about population served by different water point technologies. However, integration of these two data sets could be used to identify and rectify gaps in WPM coverage. If WPM databases become more complete and the above barriers are addressed, it could also be used to develop more realistic measures of household drinking-water access for monitoring. Copyright © 2017 Elsevier GmbH. All rights reserved.
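The WPM-to-population conversion described above is simple arithmetic; a minimal sketch follows. All counts, the persons-per-point figures, and the census total are synthetic placeholders, not values from the study (which draws its figures from the literature).

```python
# Hypothetical persons-served-per-water-point assumptions (illustrative only).
PERSONS_PER_POINT = {"borehole": 300, "protected_well": 200, "protected_spring": 150}

# Counts of functional water points in one census area (synthetic numbers).
wpm_counts = {"borehole": 12, "protected_well": 30, "protected_spring": 8}

# Convert mapped water points to an estimate of population served,
# then compare against the census population of the same area.
served = sum(PERSONS_PER_POINT[t] * n for t, n in wpm_counts.items())
census_population = 15000  # total population of the area (synthetic)
coverage = served / census_population
# A coverage estimate far below other evidence may indicate incomplete
# WPM mapping rather than genuinely low access.
```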
Oak Ridge Spallation Neutron Source (ORSNS) target station design integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamy, T.; Booth, R.; Cleaves, J.
1996-06-01
The conceptual design for a 1- to 3-MW short pulse spallation source with a liquid mercury target has been started recently. The design tools and methods being developed to define requirements, integrate the work, and provide early cost guidance will be presented with a summary of the current target station design status. The initial design point was selected with performance and cost estimate projections by a systems code. This code was developed recently using cost estimates from the Brookhaven Pulsed Spallation Neutron Source study and experience from the Advanced Neutron Source Project's conceptual design. It will be updated and improved as the design develops. Performance was characterized by a simplified figure of merit based on a ratio of neutron production to costs. A work breakdown structure was developed, with simplified systems diagrams used to define interfaces and system responsibilities. A risk assessment method was used to identify potential problems, to identify required research and development (R&D), and to aid contingency development. Preliminary 3-D models of the target station are being used to develop remote maintenance concepts and to estimate costs.
Brazhnik, Olga; Jones, John F.
2007-01-01
Producing reliable information is the ultimate goal of data processing. The ocean of data created with the advances of science and technologies calls for integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, Semantic Web, standards, ontology, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of produced information is largely defined by how well the data represent the reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142
Kernel-PCA data integration with enhanced interpretability
2014-01-01
Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly the right kernel is chosen for each data set; secondly the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge. PMID:25032747
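The two-step kernel integration described in this abstract (one kernel per source, then a combination) can be sketched as follows. This is a minimal illustration on synthetic data with an unweighted kernel sum; the RBF kernel choice, the `gamma` values, and the data shapes are assumptions, not the authors' settings.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the RBF kernel for samples in the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def center_kernel(K):
    """Double-center a Gram matrix (required before kernel PCA)."""
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return J @ K @ J

def kernel_pca(K, n_components=2):
    """Project samples onto the top principal axes of a centered kernel."""
    Kc = center_kernel(K)
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.clip(vals, 0, None))

# Two "data sources" measured on the same 40 samples (synthetic).
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(40, 5)), rng.normal(size=(40, 8))
# Step 1: one kernel per source; step 2: combine (here, an unweighted sum).
K = rbf_kernel(X1, gamma=0.1) + rbf_kernel(X2, gamma=0.1)
Z = kernel_pca(K, n_components=2)  # low-dimensional embedding of all sources
```

Weighted kernel sums, rather than the plain sum used here, are a common refinement when sources differ in relevance.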
Flat-Spectrum Radio Sources as Likely Counterparts of Unidentified INTEGRAL Sources (Research Note)
NASA Technical Reports Server (NTRS)
Molina, M.; Landi, R.; Bassani, L.; Malizia, A.; Stephen, J. B.; Bazzano, A.; Bird, A. J.; Gehrels, N.
2012-01-01
Many sources in the fourth INTEGRAL/IBIS catalogue are still unidentified since they lack an optical counterpart. An important tool that can help in identifying and classifying these sources is the cross-correlation with radio catalogues, which are very sensitive and positionally accurate. Moreover, the radio properties of a source, such as the spectrum or morphology, could provide further insight into its nature. In particular, flat-spectrum radio sources at high Galactic latitudes are likely to be AGN, possibly associated with a blazar or with the compact core of a radio galaxy. Here we present a small sample of 6 sources extracted from the fourth INTEGRAL/IBIS catalogue that are still unidentified or unclassified, but which are very likely associated with a bright, flat-spectrum radio object. To confirm the association and to study the source X-ray spectral parameters, we performed X-ray follow-up observations with Swift/XRT of all objects. We report in this note the overall results obtained from this search and discuss the nature of each individual INTEGRAL source. We find that 5 of the 6 radio associations are also detected in X-rays; furthermore, in 3 cases they are the only counterpart found. More specifically, IGR J06073-0024 is a flat-spectrum radio quasar at z = 1.08, IGR J14488-4008 is a newly discovered radio galaxy, while IGR J18129-0649 is an AGN of a still unknown type. The nature of two sources (IGR J07225-3810 and IGR J19386-4653) is less well defined, since in both cases we find another X-ray source in the INTEGRAL error circle; nevertheless, the flat-spectrum radio source, likely to be a radio loud AGN, remains a viable and, in fact, a more convincing association in both cases. Only for the last object (IGR J11544-7618) could we not find any convincing counterpart since the radio association is not an X-ray emitter, while the only X-ray source seen in the field is a G star and therefore unlikely to produce the persistent emission seen by INTEGRAL.
CIRSS vertical data integration, San Bernardino study
NASA Technical Reports Server (NTRS)
Hodson, W.; Christenson, J.; Michel, R. (Principal Investigator)
1982-01-01
The creation and use of a vertically integrated data base, including LANDSAT data, for local planning purposes in a portion of San Bernardino County, California are described. The project illustrates that a vertically integrated approach can benefit local users, can be used to identify and rectify discrepancies in various data sources, and that the LANDSAT component can be effectively used to identify change, perform initial capability/suitability modeling, update existing data, and refine existing data in a geographic information system. Local analyses were developed which produced data of value to planners in the San Bernardino County Planning Department and the San Bernardino National Forest staff.
Zhou, Peiyu; Chen, Changshu; Ye, Jianjun; Shen, Wenjie; Xiong, Xiaofei; Hu, Ping; Fang, Hongda; Huang, Chuguang; Sun, Yongge
2015-04-15
Oil fingerprints have been a powerful tool widely used for determining the source of spilled oil. In most cases, this tool works well. However, it is usually difficult to identify the source if the oil spill accident occurs during offshore petroleum exploration due to the highly similar physiochemical characteristics of suspected oils from the same drilling platform. In this report, a case study from the waters of the South China Sea is presented, and multidimensional scaling analysis (MDS) is introduced to demonstrate how oil fingerprints can be combined with mathematical methods to identify the source of spilled oil from highly similar suspected sources. The results suggest that the MDS calculation based on oil fingerprints and subsequently integrated with specific biomarkers in spilled oils is the most effective method with a great potential for determining the source in terms of highly similar suspected oils. Copyright © 2015 Elsevier Ltd. All rights reserved.
Can we estimate total magnetization directions from aeromagnetic data using Helbig's integrals?
Phillips, J.D.
2005-01-01
An algorithm that implements Helbig's (1963) integrals for estimating the vector components (mx, my, mz) of the magnetic dipole moment from the first-order moments of the vector magnetic field components (ΔX, ΔY, ΔZ) is tested on real and synthetic data. After a grid of total field aeromagnetic data is converted to vector component grids using Fourier filtering, Helbig's infinite integrals are evaluated as finite integrals in small moving windows using a quadrature algorithm based on the 2-D trapezoidal rule. Prior to integration, best-fit planar surfaces must be removed from the component data within the data windows in order to make the results independent of the coordinate system origin. Two different approaches are described for interpreting the results of the integration. In the "direct" method, results from pairs of different window sizes are compared to identify grid nodes where the angular difference between solutions is small. These solutions provide valid estimates of total magnetization directions for compact sources such as spheres or dipoles, but not for horizontally elongated or 2-D sources. In the "indirect" method, which is more forgiving of source geometry, results of the quadrature analysis are scanned for solutions that are parallel to a specified total magnetization direction.
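The plane removal and 2-D trapezoidal quadrature steps described above can be sketched as follows. This is an illustrative toy on a synthetic grid, not the published algorithm; grid size, spacing, and the integrand are assumptions.

```python
import numpy as np

def remove_plane(z, x, y):
    """Subtract the best-fit plane a*x + b*y + c from gridded values z."""
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
    coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    return z - (A @ coef).reshape(z.shape)

def trapz2d(z, dx, dy):
    """2-D trapezoidal rule on a regular grid (half weights on edges)."""
    w = np.ones(z.shape[0]); w[0] = w[-1] = 0.5
    v = np.ones(z.shape[1]); v[0] = v[-1] = 0.5
    return dx * dy * (w[:, None] * v[None, :] * z).sum()

# One moving-window's worth of synthetic component data on a regular grid.
x, y = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21), indexing="ij")
z = 2.0 * x + 3.0 * y + 1.0        # a pure plane: detrending should null it
zd = remove_plane(z, x, y)
moment = trapz2d(zd * x, dx=0.05, dy=0.05)  # first-order moment of detrended field
```

Because the test field is exactly planar, both the detrended field and its first-order moment vanish, illustrating why detrending makes the window integrals independent of the coordinate origin.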
SNPit: a federated data integration system for the purpose of functional SNP annotation.
Shen, Terry H; Carlson, Christopher S; Tarczy-Hornoch, Peter
2009-08-01
Genome wide association studies can potentially identify the genetic causes behind the majority of human diseases. With the advent of more advanced genotyping techniques, there is now an explosion of data gathered on single nucleotide polymorphisms (SNPs). The need exists for an integrated system that can provide up-to-date functional annotation information on SNPs. We have developed the SNP Integration Tool (SNPit) system to address this need. Built upon a federated data integration system, SNPit provides current information on a comprehensive list of SNP data sources. Additional logical inference analysis was included through an inference engine plug in. The SNPit web servlet is available online for use. SNPit allows users to go to one source for up-to-date information on the functional annotation of SNPs. A tool that can help to integrate and analyze the potential functional significance of SNPs is important for understanding the results from genome wide association studies.
Flipping the Audience Script: An Activity That Integrates Research and Audience Analysis
ERIC Educational Resources Information Center
Lam, Chris; Hannah, Mark A.
2016-01-01
This article describes a flipped classroom activity that requires students to integrate research and audience analysis. The activity uses Twitter as a data source. In the activity, students identify a sample, collect customer tweets, and analyze the language of the tweets in an effort to construct knowledge about an audience's values, needs, and…
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Ground Penetrating Radar as a Contextual Sensor for Multi-Sensor Radiological Characterisation
Ukaegbu, Ikechukwu K.; Gamage, Kelum A. A.
2017-01-01
Radioactive sources exist in environments or contexts that influence how they are detected and localised. For instance, the context of a moving source is different from a stationary source because of the effects of motion. The need to incorporate this contextual information in the radiation detection and localisation process has necessitated the integration of radiological and contextual sensors. The benefits of the successful integration of both types of sensors is well known and widely reported in fields such as medical imaging. However, the integration of both types of sensors has also led to innovative solutions to challenges in characterising radioactive sources in non-medical applications. This paper presents a review of such recent applications. It also identifies that these applications mostly use visual sensors as contextual sensors for characterising radiation sources. However, visual sensors cannot retrieve contextual information about radioactive wastes located in opaque environments encountered at nuclear sites, e.g., underground contamination. Consequently, this paper also examines ground-penetrating radar (GPR) as a contextual sensor for characterising this category of wastes and proposes several ways of integrating data from GPR and radiological sensors. Finally, it demonstrates combined GPR and radiation imaging for three-dimensional localisation of contamination in underground pipes using radiation transport and GPR simulations. PMID:28387706
FIA: An Open Forensic Integration Architecture for Composing Digital Evidence
NASA Astrophysics Data System (ADS)
Raghavan, Sriram; Clark, Andrew; Mohay, George
The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. In this paper, we present the forensic integration architecture (FIA) which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology independent approach. FIA is also open and extensible making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings into the field.
Köster, Lennart; Krupka, Kai; Höcker, Britta; Rahmel, Axel; Samuel, Undine; Zanen, Wouter; Opelz, Gerhard; Süsal, Caner; Döhler, Bernd; Plotnicki, Lukasz; Kohl, Christian D; Knaup, Petra; Tönshoff, Burkhard
2015-01-01
Patient registries are a useful tool to measure outcomes and compare the effectiveness of therapies in a specific patient population. High data quality and completeness are therefore advantageous for registry analysis. Data integration from multiple sources may increase completeness of the data. The pediatric renal transplantation registry CERTAIN identified Eurotransplant (ET) and the Collaborative Transplant Study (CTS) as possible partners for data exchange. Import and export interfaces with CTS and ET were implemented. All parties reached their projected goals and benefit from the exchange.
NASA Astrophysics Data System (ADS)
Masetti, N.; Parisi, P.; Jiménez-Bailón, E.; Palazzi, E.; Chavushyan, V.; Bassani, L.; Bazzano, A.; Bird, A. J.; Dean, A. J.; Galaz, G.; Landi, R.; Malizia, A.; Minniti, D.; Morelli, L.; Schiavone, F.; Stephen, J. B.; Ubertini, P.
2012-02-01
Since its launch in October 2002, the INTEGRAL satellite has revolutionized our knowledge of the hard X-ray sky thanks to its unprecedented imaging capabilities and source detection positional accuracy above 20 keV. Nevertheless, many of the newly-detected sources in the INTEGRAL sky surveys are of unknown nature. The combined use of available information at longer wavelengths (mainly soft X-rays and radio) and of optical spectroscopy on the putative counterparts of these new hard X-ray objects allows us to pinpoint their exact nature. Continuing our long-standing program that has been running since 2004, and using 6 different telescopes of various sizes together with data from an online spectroscopic survey, here we report the classification through optical spectroscopy of 22 more unidentified or poorly studied high-energy sources detected with the IBIS instrument onboard INTEGRAL. We found that 16 of them are active galactic nuclei (AGNs), while the remaining 6 objects are within our Galaxy. Among the identified extragalactic sources, the large majority (14) is made up of type 1 AGNs (i.e. with broad emission lines); of these, 6 lie at redshift larger than 0.5 and one (IGR J12319-0749) has z = 3.12, which makes it the second farthest object detected in the INTEGRAL surveys up to now. The remaining AGNs are of type 2 (that is, with narrow emission lines only), and one of the two cases is confirmed as a pair of interacting Seyfert 2 galaxies. The Galactic objects are identified as two cataclysmic variables, one high-mass X-ray binary, one symbiotic binary and two chromospherically active stars, possibly of RS CVn type. The main physical parameters of these hard X-ray sources were also determined using the multiwavelength information available in the literature. We thus still find that AGNs are the most abundant population among hard X-ray objects identified through optical spectroscopy. 
Moreover, we note that the higher sensitivity of the more recent INTEGRAL surveys is now enabling the detection of high-redshift AGNs, thus allowing the exploration of the most distant hard X-ray emitting sources and possibly of the most extreme blazars. Based on observations collected at the following observatories: Cerro Tololo Interamerican Observatory (Chile); Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias (Canary Islands, Spain); Astronomical Observatory of Bologna in Loiano (Italy); Astronomical Observatory of Asiago (Italy); Observatorio Astronómico Nacional (San Pedro Mártir, Mexico); Anglo-Australian Observatory (Siding Spring, Australia).
Tian, Yu; Kang, Xiaodong; Li, Yunyi; Li, Wei; Zhang, Aiqun; Yu, Jiangchen; Li, Yiping
2013-01-01
This article presents a strategy for identifying the source location of a chemical plume in near-shore oceanic environments where the plume develops under the influence of turbulence, tides and waves. The strategy includes two modules, source declaration (or identification) and source verification, embedded in a subsumption architecture. Algorithms for source identification are derived from moth-inspired plume-tracing strategies based on a chemical sensor. The in-water test missions, conducted in November 2002 at San Clemente Island (California, USA), in June 2003 at Duck (North Carolina, USA) and in October 2010 at Dalian Bay (China), successfully identified the source locations after autonomous underwater vehicles tracked rhodamine dye plumes with a significant meander over 100 meters. The objective of the verification module is to verify the declared plume source using a visual sensor. Because images taken in near-shore oceanic environments are very vague and colors in the images are not well defined, we adopt a fuzzy color extractor to segment the color components and recognize the chemical plume and its source by measuring color similarity. The source verification module is tested on images taken during the chemical plume tracing (CPT) missions. PMID:23507823
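The fuzzy color extractor described above can be sketched very simply: score each pixel's similarity to a reference dye color with per-channel fuzzy membership functions and threshold the result. This is only an illustrative sketch; the triangular membership shape, channel width, and reference RGB tint for rhodamine are assumptions, not values from the study.

```python
# Hypothetical sketch of a fuzzy color extractor: pixels are matched to a
# reference dye color with triangular membership functions per RGB channel,
# combined by fuzzy AND (min). Thresholding the similarity segments the plume.

def triangular(x, center, width):
    """Triangular fuzzy membership: 1 at center, falling to 0 at +/- width."""
    return max(0.0, 1.0 - abs(x - center) / width)

def color_similarity(pixel, reference, width=60):
    """Fuzzy similarity of an RGB pixel to a reference color, in [0, 1]."""
    return min(triangular(p, r, width) for p, r in zip(pixel, reference))

def segment(image, reference, threshold=0.5):
    """Return a binary mask of pixels sufficiently similar to the reference."""
    return [[color_similarity(px, reference) >= threshold for px in row]
            for row in image]

RHODAMINE = (230, 60, 120)  # assumed reference tint, not a measured value
image = [[(228, 65, 118), (30, 90, 140)],
         [(210, 40, 100), (231, 58, 125)]]
mask = segment(image, RHODAMINE)  # True where the dye color is detected
```

Taking the minimum across channels means a pixel must be close to the reference in every channel at once, which is the usual fuzzy-AND reading of "color similarity".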
Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P
2012-01-01
Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists with the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Framework (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results.
The SPSE helps parasitologists leverage the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques while keeping the increase in their workload minimal.
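The core idea of the SPSE, answering one question across internal lab data and an external database once their identifiers are linked, can be illustrated with a toy triple store. All URIs, predicates, and values below are invented for the sketch; a real deployment would use an RDF store and SPARQL rather than in-memory tuples.

```python
# Toy illustration of cross-source triple integration: statements from an
# internal lab source and a public database share linked identifiers, so a
# single pattern query can span both. All identifiers are invented.

internal = [
    ("lab:gene42", "ex:knockout_phenotype", "reduced_virulence"),
    ("lab:gene42", "ex:same_as", "pub:TcCLB.506529.310"),
]
public = [
    ("pub:TcCLB.506529.310", "ex:organism", "Trypanosoma cruzi"),
    ("pub:TcCLB.506529.310", "ex:expressed_in", "amastigote"),
]
graph = internal + public  # the "integrated" knowledge base

def query(graph, s=None, p=None, o=None):
    """Match (subject, predicate, object) triples; None acts as a wildcard."""
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Follow the same_as link so a lab-side question reaches public-side facts.
links = query(graph, s="lab:gene42", p="ex:same_as")
stages = [o for link in links
          for (_, _, o) in query(graph, s=link[2], p="ex:expressed_in")]
```

The join through `ex:same_as` is the step that unified querying buys: without the link, each source would have to be searched and reconciled by hand.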
ERIC Educational Resources Information Center
Calinger, Ronald, Ed.
This book brings together papers by scholars from around the globe on the historiography and history of mathematics and their integration with mathematical pedagogy. Of the three articles in Part 1, "Historiography and Sources", one identifies research trends in the history of mathematics, the second discusses the centrality of problems, and the…
Yao, Hong; Li, Weixin; Qian, Xin
2015-01-01
Environmental safety in multi-district boundary regions has been a focus in China and is mentioned many times in the Environmental Protection Act of 2014. Five types of risk source for surface water pollution were categorized in the multi-provincial boundary region of the Taihu basin: production enterprises, waste disposal sites, chemical storage sites, agricultural non-point sources and waterway transportation. Considering the hazard of risk sources, the purification property of the environmental medium and the vulnerability of risk receptors, 52 specific attributes bearing on the risk level of each type of risk source were screened out. A continuous piecewise linear function model, an expert consultation method and a fuzzy integral model were used to calculate integrated risk indexes (RI) characterizing the risk levels of pollution sources. In the studied area, 2716 pollution sources were characterized by RI values. Fifty-six high-risk sources were screened out as major risk sources, accounting for about 2% of the total. The numbers of sources with high-moderate, moderate, moderate-low and low pollution risk were 376, 1059, 101 and 1124, respectively, accounting for 14%, 38%, 5% and 41% of the total. The proposed procedure could be included in the integrated risk management systems of the multi-district boundary region of the Taihu basin and could help decision makers identify major risk sources for the prevention and reduction of surface water pollution. PMID:26308032
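The scoring pipeline above, normalize each attribute with a continuous piecewise linear function, then aggregate into a single RI, can be sketched as follows. Note the aggregation here is a plain weighted average standing in for the fuzzy integral used in the study, and all breakpoints, weights, and attribute values are invented.

```python
# Sketch of an integrated risk index (RI): each raw attribute is mapped to
# [0, 1] with a continuous piecewise linear function, then the scores are
# aggregated. A weighted average stands in for the study's fuzzy integral;
# breakpoints and weights below are illustrative assumptions.

def piecewise_linear(x, xs, ys):
    """Interpolate x on ascending breakpoints xs mapped to scores ys."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def risk_index(attributes, scales, weights):
    """Normalize each attribute on its own scale, then weight and average."""
    scores = [piecewise_linear(a, *scale) for a, scale in zip(attributes, scales)]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# One hypothetical enterprise: chemical storage volume (t), distance to river (m).
scales = [([0, 100, 1000], [0.0, 0.5, 1.0]),   # more storage  -> higher risk
          ([0, 500, 2000], [1.0, 0.5, 0.0])]   # farther away  -> lower risk
ri = risk_index([550, 250], scales, weights=[0.6, 0.4])
```

Ranking the 2716 sources then reduces to sorting by RI and cutting at the chosen thresholds for the high/moderate/low classes.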
Exploring the Characteristics and Diverse Sources of Students' Mental Models of Acids and Bases
ERIC Educational Resources Information Center
Lin, Jing-Wen; Chiu, Mei-Hung
2007-01-01
This study was part of a 6-year integrated project designed to build a databank of students' science conceptions in Taiwan. The main purpose of this study was to identify the characteristics of students' mental models regarding acids/bases, understand their changes in mental models, and explore sources that might influence students in constructing…
Mungall, Christopher J.; McMurry, Julie A.; Köhler, Sebastian; Balhoff, James P.; Borromeo, Charles; Brush, Matthew; Carbon, Seth; Conlin, Tom; Dunn, Nathan; Engelstad, Mark; Foster, Erin; Gourdine, J.P.; Jacobsen, Julius O.B.; Keith, Dan; Laraway, Bryan; Lewis, Suzanna E.; NguyenXuan, Jeremy; Shefchek, Kent; Vasilevsky, Nicole; Yuan, Zhou; Washington, Nicole; Hochheiser, Harry; Groza, Tudor; Smedley, Damian; Robinson, Peter N.; Haendel, Melissa A.
2017-01-01
The correlation of phenotypic outcomes with genetic variation and environmental factors is a core pursuit in biology and biomedicine. Numerous challenges impede our progress: patient phenotypes may not match known diseases, candidate variants may be in genes that have not been characterized, model organisms may not recapitulate human or veterinary diseases, filling evolutionary gaps is difficult, and many resources must be queried to find potentially significant genotype–phenotype associations. Non-human organisms have proven instrumental in revealing biological mechanisms. Advanced informatics tools can identify phenotypically relevant disease models in research and diagnostic contexts. Large-scale integration of model organism and clinical research data can provide a breadth of knowledge not available from individual sources and can provide contextualization of data back to these sources. The Monarch Initiative (monarchinitiative.org) is a collaborative, open science effort that aims to semantically integrate genotype–phenotype data from many species and sources in order to support precision medicine, disease modeling, and mechanistic exploration. Our integrated knowledge graph, analytic tools, and web services enable diverse users to explore relationships between phenotypes and genotypes across species. PMID:27899636
Mining integrated semantic networks for drug repositioning opportunities
Mullen, Joseph; Tipney, Hannah
2016-01-01
Current research and development approaches to drug discovery have become less fruitful and more costly. One alternative paradigm is that of drug repositioning. Many marketed examples of repositioned drugs have been identified through serendipitous or rational observations, highlighting the need for more systematic methodologies to tackle the problem. Systems-level approaches have the potential to enable the development of novel methods to understand the action of therapeutic compounds, but they require an integrative approach to biological data. Integrated networks can facilitate systems-level analyses by combining multiple sources of evidence to provide a rich description of drugs, their targets and their interactions. Classically, such networks can be mined manually, with a skilled person identifying portions of the graph (semantic subgraphs) that are indicative of relationships between drugs and highlighting possible repositioning opportunities. However, this approach is not scalable. Automated approaches are required to systematically mine integrated networks for these subgraphs and bring them to the attention of the user. We introduce a formal framework for the definition of integrated networks and their associated semantic subgraphs for drug interaction analysis and describe DReSMin, an algorithm for mining semantically rich networks for occurrences of a given semantic subgraph. This algorithm allows instances of complex semantic subgraphs that contain data about putative drug repositioning opportunities to be identified in a computationally tractable fashion, scaling close to linearly with network data. We demonstrate the utility of our approach by mining an integrated drug interaction network built from 11 sources. This work identified and ranked 9,643,061 putative drug-target interactions, showing a strong correlation between highly scored associations and those supported by literature.
We discuss the 20 top-ranked associations in more detail, of which 14 are novel and 6 are supported by the literature. We also show that our approach prioritizes known drug-target interactions better than other state-of-the-art approaches for predicting such interactions. PMID:26844016
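A semantic subgraph query of the kind DReSMin mines can be shown on a toy typed network: the pattern Drug -targets-> Protein <-associated_with- Disease yields a candidate repositioning pair (drug, disease). The entities and edges below are invented; the point is only the shape of the match, not real pharmacology.

```python
# Minimal sketch of mining a semantic network for one subgraph pattern:
#   Drug --targets--> Protein <--associated_with-- Disease
# Each match suggests a (drug, disease) repositioning hypothesis.
# All entities and edges here are invented toy data.

edges = [
    ("aspirin",           "targets",         "PTGS2"),
    ("celecoxib",         "targets",         "PTGS2"),
    ("warfarin",          "targets",         "VKORC1"),
    ("colorectal_cancer", "associated_with", "PTGS2"),
]

def match_pattern(edges):
    """Join the two edge types on their shared protein node."""
    targets = [(drug, prot) for drug, rel, prot in edges if rel == "targets"]
    assoc = [(dis, prot) for dis, rel, prot in edges if rel == "associated_with"]
    return sorted({(drug, disease)
                   for drug, prot in targets
                   for disease, prot2 in assoc if prot == prot2})

hypotheses = match_pattern(edges)  # candidate (drug, disease) pairs
```

Real integrated networks make this join explode combinatorially, which is why a dedicated mining algorithm with near-linear scaling matters.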
Integrating Stomach Content and Stable Isotope Analyses to Quantify the Diets of Pygoscelid Penguins
Polito, Michael J.; Trivelpiece, Wayne Z.; Karnovsky, Nina J.; Ng, Elizabeth; Patterson, William P.; Emslie, Steven D.
2011-01-01
Stomach content analysis (SCA) and, more recently, stable isotope analysis (SIA) integrated with isotopic mixing models have become common methods for dietary studies and provide insight into the foraging ecology of seabirds. However, both methods have drawbacks and biases that may make it difficult to quantify inter-annual and species-specific differences in diets. We used these two methods to simultaneously quantify the chick-rearing diet of Chinstrap (Pygoscelis antarctica) and Gentoo (P. papua) penguins and highlight methods of integrating SCA data to increase the accuracy of diet composition estimates using SIA. SCA biomass estimates were highly variable and underestimated the importance of soft-bodied prey such as fish. Two-source isotopic mixing model predictions were less variable and identified inter-annual and species-specific differences in the relative amounts of fish and krill in penguin diets not readily apparent using SCA. In contrast, multi-source isotopic mixing models had difficulty estimating the dietary contribution of fish species occupying similar trophic levels without refinement using SCA-derived otolith data. Overall, our ability to track inter-annual and species-specific differences in penguin diets using SIA was enhanced by integrating SCA data into isotopic mixing models in three ways: 1) selecting appropriate prey sources, 2) weighting combinations of isotopically similar prey in two-source mixing models and 3) refining predicted contributions of isotopically similar prey in multi-source models. PMID:22053199
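A two-source isotopic mixing model of the kind used above is a one-line linear solve: the consumer's isotopic signature, corrected for trophic enrichment, is a weighted mean of the two prey signatures. The delta-15N values and the 3.4 per-mil enrichment factor below are generic illustrative numbers, not values from the study.

```python
# Worked two-source linear mixing model, standard in SIA diet studies:
# after removing trophic enrichment, d_mix = f * d_A + (1 - f) * d_B,
# so f = (d_mix - enrichment - d_B) / (d_A - d_B).
# All isotope values here are illustrative assumptions.

def two_source_fraction(d_mix, d_a, d_b, enrichment=3.4):
    """Estimated diet proportion of source A, clamped to [0, 1]."""
    corrected = d_mix - enrichment          # remove trophic enrichment
    f = (corrected - d_b) / (d_a - d_b)
    return min(1.0, max(0.0, f))

# Hypothetical delta-15N values (per mil): fish-rich vs krill end-members.
f_fish = two_source_fraction(d_mix=10.2, d_a=10.0, d_b=4.0)
f_krill = 1.0 - f_fish
```

With more than two isotopically similar sources the system becomes under-determined, which is exactly why the study fell back on SCA-derived otolith data to constrain multi-source models.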
SNPit: a federated data integration system for the purpose of functional SNP annotation
Shen, Terry H; Carlson, Christopher S; Tarczy-Hornoch, Peter
2009-01-01
Genome-wide association studies can potentially identify the genetic causes behind the majority of human diseases. With the advent of more advanced genotyping techniques, there is now an explosion of data gathered on single nucleotide polymorphisms (SNPs). The need exists for an integrated system that can provide up-to-date functional annotation information on SNPs. We have developed the SNP Integration Tool (SNPit) to address this need. Built upon a federated data integration system, SNPit provides current information from a comprehensive list of SNP data sources. Additional logical inference analysis is included through an inference-engine plug-in. The SNPit web servlet is available online for use. SNPit allows users to go to one source for up-to-date information on the functional annotation of SNPs. A tool that helps to integrate and analyze the potential functional significance of SNPs is important for understanding the results of genome-wide association studies. PMID:19327864
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency, a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
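Step 3 of the tool chain above, finding possible paths from hazard sources to vulnerable entities in an architecture model, is a standard directed-graph search. The toy component graph below is invented; the real models are derived from IIRDs and FMEA documents.

```python
# Sketch of the graph-analysis step: enumerate all simple (cycle-free)
# paths from a hazard source node to a vulnerable function through the
# component/connection graph. The architecture below is an invented toy.

def simple_paths(graph, start, goal, path=None):
    """Depth-first enumeration of all cycle-free paths start -> goal."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    found = []
    for nxt in graph.get(start, []):
        if nxt not in path:          # avoid revisiting nodes (no cycles)
            found += simple_paths(graph, nxt, goal, path)
    return found

arch = {
    "thermal_runaway": ["battery"],
    "battery": ["power_bus"],
    "power_bus": ["flight_computer", "comms"],
    "comms": ["flight_computer"],
}
paths = simple_paths(arch, "thermal_runaway", "flight_computer")
```

Each returned path is a candidate propagation scenario; in the workflow described above, these become inputs to simulation and to the selection of software integration test cases.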
Comet: an open-source MS/MS sequence database search tool.
Eng, Jimmy K; Jahan, Tahmina A; Hoopmann, Michael R
2013-01-01
Proteomics research routinely involves identifying peptides and proteins via MS/MS sequence database search. Thus the database search engine is an integral tool in many proteomics research groups. Here, we introduce the Comet search engine to the existing landscape of commercial and open-source database search tools. Comet is open source, freely available, and based on one of the original sequence database search tools that has been widely used for many years. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Exploring the Hard and Soft X-ray Emission of Magnetic Cataclysmic Variables
NASA Astrophysics Data System (ADS)
de Martino, D.; Anzolin, G.; Bonnet-Bidaud, J.-M.; Falanga, M.; Matt, G.; Mouchet, M.; Mukai, K.; Masetti, N.
2009-05-01
A non-negligible fraction of galactic hard (>20 keV) X-ray sources has been identified as CVs of the magnetic Intermediate Polar type in INTEGRAL, SWIFT and RXTE surveys, which suggests a still hidden but potentially important population of faint hard X-ray sources. Simbol-X has the unique potential to simultaneously characterize their variable and complex soft and hard X-ray emission, thus allowing us to understand their putative role in galactic populations of X-ray sources.
Building a Predictive Capability for Decision-Making that Supports MultiPEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel
Multi-phenomenological explosion monitoring (multiPEM) is a developing science that uses multiple geophysical signatures of explosions to better identify and characterize their sources. MultiPEM researchers seek to integrate explosion signatures together to provide stronger detection, parameter estimation, or screening capabilities between different sources or processes. This talk will address forming a predictive capability for screening waveform explosion signatures to support multiPEM.
Integrative Literature Review: Ascertaining Discharge Readiness for Pediatrics After Anesthesia.
Whitley, Deborah R
2016-02-01
Unplanned hospital readmissions after the administration of general anesthesia for ambulatory procedures may contribute to loss of reimbursement and assessment of financial penalties. Pediatric patients represent a unique anesthetic risk. The purpose of this integrative literature review was to ascertain specific criteria used to evaluate discharge readiness for pediatric patients after anesthesia. An integrative literature search was conducted covering sources dated January 2008 to November 2013. Key words included pediatric, anesthesia, discharge, criteria, standards, assessment, recovery, postoperative, postanesthesia, scale, score, outpatient, and ambulatory. Eleven literature sources that contributed significantly to the research question were identified. Levels of evidence included three systematic reviews, one randomized controlled trial, three cohort studies, two case series, and two expert opinions. The review revealed that evidence-based discharge criteria endorsing home readiness for postanesthesia pediatric patients should incorporate consideration of physiological baselines, professional judgment with regard to infant consciousness, and professional practice standards/guidelines. Additionally, identifying and ensuring discharge to a competent adult was considered imperative. Nurses should be aware that frequently used anesthesia scoring systems originated in the 1970s, and this review was unable to locate current literature examining the reliability and validity of their use in conjunction with modern anesthesia-related health care practices. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Cimpian, Andrei; Meltzer, Trent J.; Markman, Ellen M.
2011-01-01
Generic sentences (e.g., "Birds lay eggs") convey generalizations about entire categories and may thus be an important source of knowledge for children. However, these sentences cannot be identified by a simple rule, requiring instead the integration of multiple cues. The present studies focused on 3- to 5-year-olds' (N = 91) use of…
Wagholikar, Amol S; Fung, Maggie; Nelson, Colleen C
2012-01-01
Effective management of chronic diseases is a global health priority. A healthcare information system offers opportunities to address the challenges of chronic disease management. However, the requirements of health information systems are often not well understood. The accuracy of requirements has a direct impact on the successful design and implementation of a health information system. Our research describes methods used to understand the requirements of health information systems for advanced prostate cancer management. We conducted a survey to identify heterogeneous sources of clinical records. It showed that the general practitioner was the most common source of patients' clinical records (41%), followed by the urologist (14%) and other clinicians (14%). We describe a method to identify diverse data sources and propose a novel patient journey browser prototype that integrates disparate data sources.
An integrated approach to assess heavy metal source apportionment in peri-urban agricultural soils.
Huang, Ying; Li, Tingqiang; Wu, Chengxian; He, Zhenli; Japenga, Jan; Deng, Meihua; Yang, Xiaoe
2015-12-15
Three techniques (isotope ratio analysis, GIS mapping, and multivariate statistical analysis) were integrated to assess heavy metal pollution and source apportionment in peri-urban agricultural soils. The soils in the study area were moderately polluted with cadmium (Cd) and mercury (Hg) and lightly polluted with lead (Pb) and chromium (Cr). GIS mapping suggested Cd pollution originates from point sources, whereas Hg, Pb and Cr could be traced back to both point and non-point sources. Principal component analysis (PCA) indicated aluminum (Al), manganese (Mn) and nickel (Ni) were mainly inherited from natural sources, while Hg, Pb, and Cd were associated with two different kinds of anthropogenic sources. Cluster analysis (CA) further identified fertilizers, waste water, industrial solid wastes, road dust, and atmospheric deposition as potential sources. Based on isotope ratio analysis (IRA), organic fertilizers and road dusts accounted for 74-100% and 0-24% of the total Hg input, while road dusts and solid wastes contributed 0-80% and 19-100% of the Pb input. This study provides a reliable approach for heavy metal source apportionment in this particular peri-urban area, with a clear potential for future application in other regions. Copyright © 2015 Elsevier B.V. All rights reserved.
Laukka, Elina; Rantakokko, Piia; Suhonen, Marjo
2017-04-01
The aim of the review was to describe consumer-led health-related online sources and their impact on consumers. The review was carried out as an integrative literature review. Quantisation and qualitative content analysis were used as the analysis method. The most common method used by the included studies was qualitative content analysis. This review identified the consumer-led health-related online sources used between 2009 and 2016 as health-related online communities, health-related social networking sites and health-related rating websites. These sources had an impact on peer support; empowerment; health literacy; physical, mental and emotional wellbeing; illness management; and relationships between healthcare organisations and consumers. The knowledge of the existence of the health-related online sources provides healthcare organisations with an opportunity to listen to their consumers' 'voice'. The sources make healthcare consumers more competent actors in relation to healthcare, and the knowledge of them is a valuable resource for healthcare organisations. Additionally, these health-related online sources might create an opportunity to reduce the need for drifting among the healthcare services. Healthcare policymakers and organisations could benefit from having a strategy of increasing their health-related online sources.
Magner, J A; Brooks, K N
2008-03-01
Section 303(d) of the Clean Water Act requires States and Tribes to list waters not meeting water quality standards. A total maximum daily load must be prepared for waters identified as impaired with respect to water quality standards. Historically, the management of pollution in Minnesota has focused on point-source regulation, and regulatory effort has improved water quality over the last three decades. Non-point-source pollution has become the largest driver of conventional 303(d) listings in the 21st century. Conventional pollutants, i.e., organic, sediment and nutrient imbalances, can often be associated with poor land-use management practices. However, the cause-and-effect relationship can be elusive because of natural watershed-system influences that vary with scale. Elucidation is complex because the current water quality standards in Minnesota were designed to work best with water quality permits that control point sources of pollution. This paper presents a sentinel watershed-systems approach (SWSA) to the monitoring and assessment of Minnesota waterbodies. SWSA integrates physical, chemical, and biological data over space and time using advanced technologies at selected small watersheds across Minnesota to potentially improve understanding of natural and anthropogenic watershed processes and the management of point and non-point sources of pollution. Long-term, state-of-the-art monitoring and assessment is needed to advance and improve water quality standards. Advanced water quality or ecologically based standards that integrate physical, chemical, and biological numeric criteria offer the potential to better understand, manage, protect, and restore Minnesota's waterbodies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, D.S.; Kienzle, M.A.; Ferris, D.C.
1996-12-31
The objective of this study is to identify potential long-range sources of mercury within the southeast region of the United States. Preliminary results of a climatological study using the Short-range Layered Atmospheric Model (SLAM) transport model from a select source in the southeast U.S. are presented. The potential for long-range transport from Oak Ridge, Tennessee to Florida is discussed. The transport and transformation of mercury during periods of favorable transport to south Florida is modeled using the Organic Chemistry Integrated Dispersion (ORCHID) model, which contains the transport model used in the climatology study. SLAM/ORCHID results indicate the potential for mercury reaching southeast Florida from the source and the atmospheric oxidation of mercury during transport.
System for Secure Integration of Aviation Data
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Wang, Yao; Keller, Rich; Chidester, Tom; Statler, Irving; Lynch, Bob; Patel, Hemil; Windrem, May; Lawrence, Bob
2007-01-01
The Aviation Data Integration System (ADIS) of Ames Research Center has been established to promote analysis of aviation data by airlines and other interested users for purposes of enhancing the quality (especially safety) of flight operations. The ADIS is a system of computer hardware and software for collecting, integrating, and disseminating aviation data pertaining to flights and specified flight events that involve one or more airline(s). The ADIS is secure in the sense that care is taken to ensure the integrity of sources of collected data and to verify the authorizations of requesters to receive data. Most importantly, the ADIS removes a disincentive to collection and exchange of useful data by providing for automatic removal of information that could be used to identify specific flights and crewmembers. Such information, denoted sensitive information, includes flight data (here signifying data collected by sensors aboard an aircraft during flight), weather data for a specified route on a specified date, date and time, and any other information traceable to a specific flight. The removal of information that could be used to perform such tracing is called "deidentification." Airlines are often reluctant to keep flight data in identifiable form because of concerns about loss of anonymity. Hence, one of the things needed to promote retention and analysis of aviation data is an automated means of de-identification of archived flight data to enable integration of flight data with non-flight aviation data while preserving anonymity. Preferably, such an automated means would enable end users of the data to continue to use pre-existing data-analysis software to identify anomalies in flight data without identifying a specific anomalous flight. It would then also be possible to perform statistical analyses of integrated data. These needs are satisfied by the ADIS, which enables an end user to request aviation data associated with de-identified flight data. 
The ADIS includes client software integrated with other software running on flight-operations quality-assurance (FOQA) computers for purposes of analyzing data to study specified types of events or exceedences (departures of flight parameters from normal ranges). In addition to ADIS client software, ADIS includes server hardware and software that provide services to the ADIS clients via the Internet (see figure). The ADIS server receives and integrates flight and non-flight data pertaining to flights from multiple sources. The server accepts data updates from authorized sources only and responds to requests from authorized users only. In order to satisfy security requirements established by the airlines, (1) an ADIS client must not be accessible from the Internet by an unauthorized user and (2) non-flight data such as airport terminal information system (ATIS) and weather data must be displayed without any identifying flight information. The ADIS hardware and software architecture, as well as its encryption and data-display schemes, are designed to meet these requirements. When a user requests one or more selected aviation data characteristics associated with an event (e.g., a collision, near miss, equipment malfunction, or exceedence), the ADIS client augments the request with date and time information from encrypted files and submits the augmented request to the server. Once the user's authorization has been verified, the server returns the requested information in de-identified form.
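The de-identification step described above can be sketched in a few lines. The record field names below are illustrative assumptions for the sketch, not ADIS's actual schema:

```python
# Minimal sketch of flight-data de-identification as described in the
# abstract: drop fields that could trace a record to a specific flight
# or crew, while retaining the parameters needed for anomaly analysis.
# All field names here are hypothetical.

SENSITIVE_FIELDS = {"flight_id", "tail_number", "crew_ids", "date", "time"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

record = {
    "flight_id": "UA123-20070101",   # identifying: removed
    "tail_number": "N12345",         # identifying: removed
    "crew_ids": ["C1", "C2"],        # identifying: removed
    "date": "2007-01-01",            # identifying: removed
    "time": "14:32Z",                # identifying: removed
    "altitude_ft": 31000,            # retained: analysis parameter
    "exceedence": "pitch",           # retained: event category
}

clean = deidentify(record)
```

A real system would also need to scrub free text and quasi-identifiers (e.g. rare parameter combinations traceable to one flight), not merely drop named columns.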
Ontology-based knowledge representation for resolution of semantic heterogeneity in GIS
NASA Astrophysics Data System (ADS)
Liu, Ying; Xiao, Han; Wang, Limin; Han, Jialing
2017-07-01
Lack of semantic interoperability in geographical information systems has been identified as the main obstacle to data sharing and database integration. New methods are needed to overcome the problems of semantic heterogeneity. Ontologies are considered one approach to supporting geographic information sharing. This paper presents an ontology-driven integration approach that helps detect and, where possible, resolve semantic conflicts. Its originality is that each data source participating in the integration process contains an ontology that defines the meaning of its own data. This approach automates the integration through a semantic-integration algorithm. Finally, land classification in a field GIS is described as an example.
100-F Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
100-K Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
Yerramilli, Anjaneyulu; Dodla, Venkata Bhaskar Rao; Challa, Venkata Srinivas; Myles, Latoya; Pendergrass, William R; Vogel, Christoph A; Dasari, Hari Prasad; Tuluri, Francis; Baham, Julius M; Hughes, Robert L; Patrick, Chuck; Young, John H; Swanier, Shelton J; Hardy, Mark G
2012-12-01
Fine particulate matter (PM(2.5)) is formed largely from precursor gases, such as sulfur dioxide (SO(2)) and nitrogen oxides (NO(x)), which are emitted mainly by intense industrial operations and transportation activities. PM(2.5) has been shown to affect respiratory health in humans. Evaluation of source regions and assessment of emission source contributions in the Gulf Coast region of the USA will be useful for the development of PM(2.5) regulatory and mitigation strategies. In the present study, the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model driven by the Weather Research & Forecasting (WRF) model is used to identify the emission source locations and transport trends. Meteorological observations as well as PM(2.5) sulfate and nitric acid concentrations were collected at two sites during the Mississippi Coastal Atmospheric Dispersion Study, a summer 2009 field experiment along the Mississippi Gulf Coast. Meteorological fields during the campaign were simulated using WRF with three nested domains of 36, 12, and 4 km horizontal resolution and 43 vertical levels and were validated against the North American Mesoscale Analysis. The HYSPLIT model was integrated with meteorological fields derived from the WRF model to identify the source locations using backward trajectory analysis. The backward trajectories for a 24-h period were plotted at 1-h intervals starting from the two observation locations to identify probable sources. The back trajectories clearly indicated sources lying in directions between south and west, implying origins in local Mississippi, neighboring Louisiana, and the Gulf of Mexico. Of the eight power plants located within a 300-km radius of the two monitoring sites examined as sources, only the Watson, Cajun, and Morrow power plants fall in the path of the derived back trajectories.
Forward dispersion patterns computed using HYSPLIT were plotted from each of these source locations, using hourly mean emission concentrations computed from past annual emission-strength data, to assess the extent of their contribution. An assessment of the relative contributions from the eight sources reveals that only the Cajun and Morrow power plants contribute to the observations at the Wiggins Airport to a certain extent, while none of the eight power plants contributes to the observations at Harrison Central High School. As these observations represent a moderate event, with daily average values of 5-8 μg m(-3) for sulfate and 1-3 μg m(-3) for HNO(3) and differences between the two spatially separated sites, local sources may also be significant contributors to the observed values of PM(2.5).
Morley, Katherine I; Wallace, Joshua; Denaxas, Spiros C; Hunter, Ross J; Patel, Riyaz S; Perel, Pablo; Shah, Anoop D; Timmis, Adam D; Schilling, Richard J; Hemingway, Harry
2014-01-01
National electronic health records (EHR) are increasingly used for research, but identifying disease cases is challenging due to differences in the information captured by different sources (e.g. primary and secondary care). Our objective was to provide a transparent, reproducible model for integrating these data using atrial fibrillation (AF), a chronic condition diagnosed and managed in multiple ways in different healthcare settings, as a case study. Potentially relevant codes for AF screening, diagnosis, and management were identified in four coding systems: Read (primary care diagnoses and procedures), British National Formulary (BNF; primary care prescriptions), ICD-10 (secondary care diagnoses) and OPCS-4 (secondary care procedures). From these we developed a phenotype algorithm via expert review and analysis of linked EHR data from 1998 to 2010 for a cohort of 2.14 million UK patients aged ≥ 30 years. The cohort was also used to evaluate the phenotype by examining associations between incident AF and known risk factors. The phenotype algorithm incorporated 286 codes: 201 Read, 63 BNF, 18 ICD-10, and four OPCS-4. Incident AF diagnoses were recorded for 72,793 patients, but only 39.6% (N = 28,795) were recorded in both primary care and secondary care. An additional 7,468 potential cases were inferred from data on treatment and pre-existing conditions. The proportion of cases identified from each source differed by diagnosis age; inferred diagnoses contributed a greater proportion of younger cases (≤ 60 years), while older patients (≥ 80 years) were mainly diagnosed in secondary care. Associations of risk factors (hypertension, myocardial infarction, heart failure) with incident AF defined using different EHR sources were comparable in magnitude to those from traditional consented cohorts. A single EHR source is not sufficient to identify all patients, nor will it provide a representative sample.
Combining multiple data sources and integrating information on treatment and comorbid conditions can substantially improve case identification.
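The multi-source phenotype algorithm this abstract describes amounts to checking each patient's linked records against per-source code lists. A minimal sketch, using placeholder code values rather than the study's actual 286 codes:

```python
# Hedged sketch of multi-source case identification: a patient counts as
# an AF case if any record carries a code from any source's list. The
# code values below are illustrative placeholders, not the study's lists.

CODE_LISTS = {
    "read":  {"G573.00"},       # primary care diagnosis (example code)
    "icd10": {"I48"},           # secondary care diagnosis (example code)
    "bnf":   {"0208020V0"},     # prescription code (example)
    "opcs4": {"K62.1"},         # secondary care procedure (example)
}

def is_af_case(records):
    """records: iterable of (source, code) pairs for one patient."""
    return any(code in CODE_LISTS.get(source, set())
               for source, code in records)

# A patient with a secondary-care diagnosis code is flagged even if the
# primary-care record carries only unrelated codes:
patient = [("icd10", "I48"), ("read", "H33..00")]
flagged = is_af_case(patient)
```

The study's actual algorithm additionally infers cases from treatment and comorbidity data; this sketch shows only the code-list lookup step.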
Lala, Sanjay G.; Little, Kristen M.; Tshabangu, Nkeko; Moore, David P.; Msandiwa, Reginah; van der Watt, Martin; Chaisson, Richard E.; Martinson, Neil A.
2015-01-01
Background Contact tracing, to identify source cases with untreated tuberculosis (TB), is rarely performed in high disease burden settings when the index case is a young child with TB. As TB is strongly associated with HIV infection in these settings, we used source case investigation to determine the prevalence of undiagnosed TB and HIV in the caregivers and household contacts of hospitalised young children diagnosed with TB in South Africa. Methods Caregivers and household contacts of 576 young children (age ≤7 years) with TB diagnosed between May 2010 and August 2012 were screened for TB and HIV. The primary outcome was the detection of laboratory-confirmed, newly-diagnosed TB disease and/or HIV-infection in close contacts. Results Of 576 caregivers, 301 (52·3%) self-reported HIV-positivity. Newly-diagnosed HIV infection was detected in 63 (22·9%) of the remaining 275 caregivers who self-reported an unknown or negative HIV status. Screening identified 133 (23·1%) caregivers eligible for immediate anti-retroviral therapy (ART). Newly-diagnosed TB disease was detected in 23 (4·0%) caregivers. In non-caregiver household contacts (n = 1341), the prevalence of newly-diagnosed HIV infection and TB disease was 10·0% and 3·2% respectively. On average, screening contacts of every nine children with TB resulted in the identification of one case of newly-diagnosed TB disease, three cases of newly diagnosed HIV-infection, and three HIV-infected persons eligible for ART. Conclusion In high burden countries, source case investigation yields high rates of previously undiagnosed HIV and TB infection in the close contacts of hospitalised young children diagnosed with TB. Furthermore, integrated screening identifies many individuals who are eligible for immediate ART. Similar studies, with costing analyses, should be undertaken in other high burden settings–integrated source case investigation for TB and HIV should be routinely undertaken if our findings are confirmed. 
PMID:26378909
ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES
LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.
2008-01-01
Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
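One standard way to combine per-source reliabilities into a single interaction confidence, shown here purely as an illustration and not necessarily the method the paper proposes, is the noisy-OR rule, which assumes the sources err independently:

```python
# Noisy-OR combination of source reliabilities: the probability that at
# least one reporting source is correct, under independence. This is a
# common convention for scoring protein interactions supported by
# multiple evidence sources; it is an illustrative choice, not
# necessarily the combination method proposed in the paper.

def combined_confidence(reliabilities):
    """reliabilities: per-source probabilities that a reported
    interaction is real. Returns the noisy-OR combined confidence."""
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

# An interaction reported by two sources with reliabilities 0.6 and 0.5:
conf = combined_confidence([0.6, 0.5])  # 1 - 0.4 * 0.5 = 0.8
```

Note how the combined score exceeds either individual reliability: independent corroboration raises confidence, which is the intuition behind integrating multiple interaction sources in the first place.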
78 FR 35038 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-11
..., reliable, and transparent method for identifying high-quality programs that can receive continuing five... the system is working. The study will employ a mixed-methods design that integrates and layers administrative and secondary data sources, observational measures, and interviews to develop a rich knowledge...
NASA Technical Reports Server (NTRS)
Pratt, D. T.; Radhakrishnan, K.
1986-01-01
The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
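The accuracy argument above can be made concrete on the simplest stiff problem, pure exponential decay y' = -ky: a classical polynomial-type step goes badly wrong once k·h exceeds its stability limit, while an exponentially fitted step reproduces the decay exactly. A minimal sketch:

```python
import math

# Contrast a classical explicit Euler step (polynomial interpolant) with
# an exponentially fitted step on the stiff linear decay y' = -k*y, the
# regime the report describes (species decaying exponentially toward
# equilibrium).

def euler_step(y, k, h):
    """Classical first-order polynomial update; stable only for k*h < 2."""
    return y + h * (-k * y)

def exp_fitted_step(y, k, h):
    """Exponentially fitted update; exact for pure exponential decay."""
    return y * math.exp(-k * h)

k, h, y0 = 1000.0, 0.01, 1.0          # k*h = 10: far outside Euler's limit
y_euler = euler_step(y0, k, h)        # 1 - 10 = -9.0: unstable overshoot
y_exp = exp_fitted_step(y0, k, h)     # exp(-10) ≈ 4.5e-5: correct decay
```

The exponential-fitted step is exact here by construction; for general nonlinear kinetics it is only fitted, which is why the report stresses that step-size control logic still matters.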
Greenwood, Daniel; Davids, Keith; Renshaw, Ian
2014-01-01
Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.
Integrated system for automated financial document processing
NASA Astrophysics Data System (ADS)
Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai
1997-02-01
A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
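The blackboard architecture described above can be sketched as a shared store to which each recognition knowledge source posts candidate readings, with the best-scoring candidate selected; engine names, values, and confidences below are illustrative:

```python
# Minimal sketch of the blackboard pattern from the abstract: multiple
# recognition engines ("knowledge sources") contribute candidate readings
# of a field (e.g. the courtesy amount) incrementally and independently,
# and a control step picks the most confident result. All engine names
# and scores here are hypothetical.

class Blackboard:
    def __init__(self):
        self.candidates = []            # list of (value, confidence, engine)

    def post(self, value, confidence, engine):
        """A knowledge source contributes one candidate reading."""
        self.candidates.append((value, confidence, engine))

    def best(self):
        """Control strategy: pick the highest-confidence candidate."""
        return max(self.candidates, key=lambda c: c[1])

board = Blackboard()
board.post("125.00", 0.92, "courtesy-engine-A")
board.post("126.00", 0.40, "courtesy-engine-B")
board.post("125.00", 0.88, "legal-amount-engine")
value, conf, engine = board.best()
```

A production system would reconcile agreeing candidates (here two engines read "125.00") rather than simply taking the maximum, but the opportunistic, incremental contribution of independent sources is the core of the pattern.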
Cladé, Thierry; Snyder, Joshua C.
2010-01-01
Clinical trials which use imaging typically require data management and workflow integration across several parties. We identify opportunities for all parties involved to realize benefits with a modular interoperability model based on service-oriented architecture and grid computing principles. We discuss middleware products for implementation of this model, and propose caGrid as an ideal candidate due to its healthcare focus; free, open source license; and mature developer tools and support. PMID:20449775
ERIC Educational Resources Information Center
Lane, Kathleen Lynne; Oakes, Wendy Peia; Ennis, Robin Parks; Hirsch, Shanna Eisner
2014-01-01
In comprehensive, integrated, three-tiered models, it is essential to have a systematic method for identifying students who need supports at Tier 2 or Tier 3. This article provides explicit information on how to use multiple sources of data to determine which students might benefit from these supports. First, the authors provide an overview of how…
Cruz, Tess Boley
2009-01-01
This Vector paper (IV of V on monitoring the tobacco use epidemic) presents the data sources and methods that can be used to monitor tobacco marketing and makes recommendations for creating a national surveillance system. In 2002, the Vector Work Group of the National Tobacco Monitoring, Research and Evaluation Workshop identified priority indicators of tobacco marketing: tobacco brand pricing strategies, retail environment advertising and promotional allowances, gray market or smuggling activities, lobbying, direct mail marketing, tobacco brand placements in films, Internet promotions, and sponsorship at bars and events. This paper reviews and identifies data sources and gaps for these priority indicators and for 12 other indicators of interest. There are 38 commercial data sites and Internet sources, as well as individual research efforts that address the priority indicators. These sources are not integrated, often costly, and limited in standardization. Tobacco marketing could be more effectively monitored with the development of a national research network. Surveillance of the tobacco industry's methods to push tobacco and pull consumers can help the public health community identify new markets and campaigns, justify and tailor effective tobacco control strategies, and evaluate existing counter-marketing efforts.
Co, Manuel C; Boden-Albala, Bernadette; Quarles, Leigh; Wilcox, Adam; Bakken, Suzanne
2012-01-01
In designing informatics infrastructure to support comparative effectiveness research (CER), it is necessary to implement approaches for integrating heterogeneous data sources such as clinical data typically stored in clinical data warehouses and those that are normally stored in separate research databases. One strategy to support this integration is the use of a concept-oriented data dictionary with a set of semantic terminology models. The aim of this paper is to illustrate the use of the semantic structure of Clinical LOINC (Logical Observation Identifiers, Names, and Codes) in integrating community-based survey items into the Medical Entities Dictionary (MED) to support the integration of survey data with clinical data for CER studies.
100-N Area Decision Unit Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-N Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
NASA Technical Reports Server (NTRS)
King, R. B.; Fordyce, J. S.; Antoine, A. C.; Leibecki, H. F.; Neustadter, H. E.; Sidik, S. M.; Burr, J. C.; Craig, G. T.; Cornett, C. L.
1974-01-01
Preliminary review of a study of trace elements and compound concentrations in the ambient suspended particulate matter in Cleveland, Ohio, measured from August 1971 through June 1973, as a function of source, monitoring location, and meteorological conditions. The study is aimed at the development of techniques for identifying specific pollution sources which could be integrated into a practical system readily usable by an enforcement agency.
Triangulation in aetiological epidemiology
Lawlor, Debbie A; Tilling, Kate; Davey Smith, George
2016-01-01
Triangulation is the practice of obtaining more reliable answers to research questions through integrating results from several different approaches, where each approach has different key sources of potential bias that are unrelated to each other. With respect to causal questions in aetiological epidemiology, if the results of different approaches all point to the same conclusion, this strengthens confidence in the finding. This is particularly the case when the key sources of bias of some of the approaches would predict that findings would point in opposite directions if they were due to such biases. Where there are inconsistencies, understanding the key sources of bias of each approach can help to identify what further research is required to address the causal question. The aim of this paper is to illustrate how triangulation might be used to improve causal inference in aetiological epidemiology. We propose a minimum set of criteria for use in triangulation in aetiological epidemiology, summarize the key sources of bias of several approaches and describe how these might be integrated within a triangulation framework. We emphasize the importance of being explicit about the expected direction of bias within each approach, whenever this is possible, and seeking to identify approaches that would be expected to bias the true causal effect in different directions. We also note the importance, when comparing results, of taking account of differences in the duration and timing of exposures. We provide three examples to illustrate these points. PMID:28108528
SPIDER: A Framework for Understanding Driver Distraction.
Strayer, David L; Fisher, Donald L
2016-02-01
The objective was to identify key cognitive processes that are impaired when drivers divert attention from driving. Driver distraction is increasingly recognized as a significant source of injuries and fatalities on the roadway. A "SPIDER" model is developed that identifies key cognitive processes that are impaired when drivers divert attention from driving. SPIDER is an acronym standing for scanning, predicting, identifying, decision making, and executing a response. When drivers engage in secondary activities unrelated to the task of driving, SPIDER-related processes are impaired, situation awareness is degraded, and the ability to safely operate a motor vehicle may be compromised. The pattern of interference helps to illuminate the sources of driver distraction and may help guide the integration of new technology into the automobile. © 2015, Human Factors and Ergonomics Society.
Feasibility study on the use of groupware support for NASA source evaluation boards
NASA Technical Reports Server (NTRS)
Bishop, Peter C.; Yoes, Cissy
1991-01-01
Groupware is a class of computer-based systems that support groups engaged in a common task (or goal) and that provide an interface to a shared environment. A potential application for groupware is the source evaluation board (SEB) process used in the procurement of government contracts. This study was undertaken to (1) identify parts of the SEB process which are candidates for groupware support; and (2) identify tools which could be used to support the candidate process. Two processes of the SEB were identified as good candidates for groupware support: (1) document generation - a coordination and communication process required to present and document the findings of an SEB; and (2) group decision making - a highly analytical and integrative decision process requiring a clear and supportable outcome.
Reconstruction of the experimentally supported human protein interactome: what can we learn?
Klapa, Maria I; Tsafou, Kalliopi; Theodoridis, Evangelos; Tsakalidis, Athanasios; Moschonas, Nicholas K
2013-10-02
Understanding the topology and dynamics of the human protein-protein interaction (PPI) network will significantly contribute to biomedical research, therefore its systematic reconstruction is required. Several meta-databases integrate source PPI datasets, but the protein node sets of their networks vary depending on the PPI data combined. Due to this inherent heterogeneity, the way in which the human PPI network expands via multiple dataset integration has not been comprehensively analyzed. We aim at assembling the human interactome in a global structured way and exploring it to gain insights of biological relevance. First, we defined the UniProtKB manually reviewed human "complete" proteome as the reference protein-node set and then we mined five major source PPI datasets for direct PPIs exclusively between the reference proteins. We updated the protein and publication identifiers and normalized all PPIs to the UniProt identifier level. The reconstructed interactome covers approximately 60% of the human proteome and has a scale-free structure. No apparent differentiating gene functional classification characteristics were identified for the unrepresented proteins. The source dataset integration augments the network mainly in PPIs. Polyubiquitin emerged as the highest-degree node, but the inclusion of most of its identified PPIs may be reconsidered. The high number (>300) of connections of the subsequent fifteen proteins correlates well with their essential biological role. According to the power-law network structure, the unrepresented proteins should mainly have up to four connections with equally poorly-connected interactors. Reconstructing the human interactome based on the a priori definition of the protein nodes enabled us to identify the currently included part of the human "complete" proteome, and discuss the role of the proteins within the network topology with respect to their function. 
As the network expansion has to comply with the scale-free theory, we suggest that the core of the human interactome has essentially emerged. Thus, it could be employed in systems biology and biomedical research, despite the considerable number of currently unrepresented proteins. The latter are probably involved in specialized physiological conditions, justifying the scarcity of related PPI information, and their identification can assist in designing relevant functional experiments and targeted text mining algorithms.
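The identifier-normalization step described above, mapping heterogeneous source identifiers to UniProt accessions so that duplicate PPIs from different datasets collapse to a single edge, can be sketched as follows; the mapping-table entries are illustrative examples, not drawn from the actual source datasets:

```python
# Hedged sketch of PPI identifier normalization: source datasets may
# reference the same protein via Ensembl, Entrez, or UniProt entry
# names; mapping both interaction partners to UniProt accessions and
# canonicalizing the undirected pair lets duplicates merge. The mapping
# entries below are illustrative.

ID_TO_UNIPROT = {
    "ENSP00000269305": "P04637",   # Ensembl protein id -> UniProt (TP53)
    "7157": "P04637",              # Entrez gene id     -> UniProt (TP53)
    "EGFR_HUMAN": "P00533",        # UniProt entry name -> accession
}

def normalize_ppi(a, b):
    """Map both ends to UniProt accessions and return a canonical
    undirected pair, or None if either identifier cannot be mapped."""
    ua, ub = ID_TO_UNIPROT.get(a), ID_TO_UNIPROT.get(b)
    if ua is None or ub is None:
        return None                # outside the reference protein set
    return tuple(sorted((ua, ub)))

# Two source records using different identifier schemes collapse to the
# single reference-level edge ('P00533', 'P04637'):
merged = {p for p in (normalize_ppi("ENSP00000269305", "EGFR_HUMAN"),
                      normalize_ppi("7157", "EGFR_HUMAN")) if p}
```

Discarding unmappable identifiers, as the `None` branch does, is one way to enforce the a priori reference protein-node set the authors describe.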
Integrated omics analysis of specialized metabolism in medicinal plants.
Rai, Amit; Saito, Kazuki; Yamazaki, Mami
2017-05-01
Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. As the demand for medicinal compounds increases with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important for identifying or developing reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.
Vermeiren, Peter; Muñoz, Cynthia C; Ikejima, Kou
2016-12-15
Micro- and macroplastic accumulation threatens estuaries worldwide because of the often dense human populations, diverse plastic inputs and high potential for plastic degradation and storage in these ecosystems. Nonetheless, our understanding of plastic sources and sinks remains limited. We designed conceptual models of the local and estuary-wide transport of plastics. We identify processes affecting the position of plastics in the water column; processes related to the mixing of fresh and salt water; and processes resulting from the influences of wind, topography, and organism-plastic interactions. The models identify gaps in the spatial context of plastic-organisms interactions, the chemical behavior of plastics in estuaries, effects of wind on plastic suspension-deposition cycles, and the relative importance of processes affecting the position in the water column. When interpreted in the context of current understanding, sinks with high management potential can be identified. However, source-sink patterns vary among estuary types and with local scale processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cake: a bioinformatics pipeline for the integrated analysis of somatic variants in cancer genomes
Rashid, Mamunur; Robles-Espinoza, Carla Daniela; Rust, Alistair G.; Adams, David J.
2013-01-01
Summary: We have developed Cake, a bioinformatics software pipeline that integrates four publicly available somatic variant-calling algorithms to identify single nucleotide variants with higher sensitivity and accuracy than any one algorithm alone. Cake can be run on a high-performance computer cluster or used as a stand-alone application. Availability: Cake is open-source and is available from http://cakesomatic.sourceforge.net/. Contact: da1@sanger.ac.uk. Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23803469
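The core of a consensus pipeline like Cake can be sketched as simple voting over the callers' outputs. The threshold, the variant keys, and the caller sets below are illustrative assumptions, not Cake's actual merging logic, which also applies post-processing filters:

```python
from collections import Counter

def consensus_variants(callsets, min_callers=2):
    """Keep variants reported by at least `min_callers` of the input callsets.

    Each callset is a set of (chrom, pos, ref, alt) tuples, as might be
    parsed from one somatic variant caller's output.
    """
    counts = Counter(v for callset in callsets for v in set(callset))
    return {v for v, n in counts.items() if n >= min_callers}

# Hypothetical outputs from four callers on the same tumor/normal pair.
caller_a = {("chr1", 100, "A", "T"), ("chr2", 200, "G", "C")}
caller_b = {("chr1", 100, "A", "T")}
caller_c = {("chr3", 300, "C", "G"), ("chr1", 100, "A", "T")}
caller_d = {("chr2", 200, "G", "C")}

consensus = consensus_variants([caller_a, caller_b, caller_c, caller_d])
print(sorted(consensus))
```

Requiring agreement among callers trades a little sensitivity for a large reduction in caller-specific false positives, which is the rationale the abstract describes.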
Silviculture research: The intersection of science and art across generations
Theresa B. Jain
2013-01-01
A research silviculturist's work is firmly grounded in the scientific method as a means of acquiring knowledge of forest dynamics. Research silviculturists also integrate information from numerous sources to produce new knowledge not readily identified by single studies. Results and their interpretation subsequently provide the scientific foundation for developing management decisions and strategies....
78 FR 49760 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-15
... transparent method for identifying high-quality programs that can receive continuing five-year grants without... will employ a mixed-methods design that integrates and layers administrative and secondary data sources, observational measures, and interviews to develop a rich knowledge base about what the DRS accomplishes and how...
Road-Killed Animals as Resources for Ecological Studies.
ERIC Educational Resources Information Center
Adams, Clark E.
1983-01-01
Summarizes 19 literature sources identifying road-killed vertebrates and frequency of kill by numbers. Examples of how these animals can be incorporated into curricula (integrating biology, society, people, and values) are given, followed by an illustrated example of how a road-killed raccoon's skull demonstrated a human/wildlife interaction prior…
Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning
2007-10-18
Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. 
The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.
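The query options described above (an accession plus source-database and taxonomic filters) can be illustrated with a small URL builder for the REST interface. The endpoint path and parameter names used here are assumptions for illustration only; consult the PICR documentation at the URL above for the actual interface:

```python
from urllib.parse import urlencode

# Assumed REST path; the real endpoint is described in the PICR documentation.
PICR_BASE = "http://www.ebi.ac.uk/Tools/picr/rest/getUPIForAccession"

def build_picr_query(accession, databases, taxon_id=None):
    """Build a PICR-style REST query URL for one protein accession.

    `databases` limits the mapping to specific source databases and
    `taxon_id` restricts by species, mirroring the filters described above.
    """
    params = [("accession", accession)]
    params += [("database", db) for db in databases]  # repeated parameter per DB
    if taxon_id is not None:
        params.append(("taxonid", str(taxon_id)))
    return PICR_BASE + "?" + urlencode(params)

url = build_picr_query("P12345", ["SWISSPROT", "ENSEMBL"], taxon_id=9606)
print(url)
```

A client would then fetch this URL and parse the returned cross-references; the SOAP interface exposes the same mapping programmatically for tighter integration.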
Robustness, evolvability, and the logic of genetic regulation.
Payne, Joshua L; Moore, Jason H; Wagner, Andreas
2014-01-01
In gene regulatory circuits, the expression of individual genes is commonly modulated by a set of regulating gene products, which bind to a gene's cis-regulatory region. This region encodes an input-output function, referred to as signal-integration logic, that maps a specific combination of regulatory signals (inputs) to a particular expression state (output) of a gene. The space of all possible signal-integration functions is vast and the mapping from input to output is many-to-one: For the same set of inputs, many functions (genotypes) yield the same expression output (phenotype). Here, we exhaustively enumerate the set of signal-integration functions that yield identical gene expression patterns within a computational model of gene regulatory circuits. Our goal is to characterize the relationship between robustness and evolvability in the signal-integration space of regulatory circuits, and to understand how these properties vary between the genotypic and phenotypic scales. Among other results, we find that the distributions of genotypic robustness are skewed, so that the majority of signal-integration functions are robust to perturbation. We show that the connected set of genotypes that make up a given phenotype are constrained to specific regions of the space of all possible signal-integration functions, but that as the distance between genotypes increases, so does their capacity for unique innovations. In addition, we find that robust phenotypes are (i) evolvable, (ii) easily identified by random mutation, and (iii) mutationally biased toward other robust phenotypes. We explore the implications of these latter observations for mutation-based evolution by conducting random walks between randomly chosen source and target phenotypes. We demonstrate that the time required to identify the target phenotype is independent of the properties of the source phenotype.
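The genotype space described above can be enumerated directly for a small circuit. The sketch below is a simplification of the paper's model, with assumed parameters: two regulatory inputs, a genotype represented as a truth table, a phenotype defined as the outputs on a chosen subset of "realized" input combinations, and robustness as the fraction of single-entry truth-table mutations that leave the phenotype unchanged:

```python
from itertools import product

K = 2                  # number of regulatory inputs (assumption)
ROWS = 2 ** K          # truth-table rows: all input combinations
REALIZED = (0, 1, 3)   # input combinations the circuit actually encounters (assumption)

def phenotype(genotype):
    # Expression pattern over the realized inputs only.
    return tuple(genotype[i] for i in REALIZED)

def robustness(genotype):
    # Fraction of single-bit truth-table mutations that preserve the phenotype.
    neutral = 0
    for i in range(ROWS):
        mutant = list(genotype)
        mutant[i] ^= 1
        if phenotype(tuple(mutant)) == phenotype(genotype):
            neutral += 1
    return neutral / ROWS

# All 2^(2^K) signal-integration functions (genotypes).
genotypes = list(product((0, 1), repeat=ROWS))

# The many-to-one genotype-to-phenotype map: group genotypes by phenotype.
by_phenotype = {}
for g in genotypes:
    by_phenotype.setdefault(phenotype(g), []).append(g)

print(len(genotypes), len(by_phenotype))  # 16 genotypes map onto 8 phenotypes
print(robustness((0, 0, 0, 0)))           # 0.25: only the unrealized row is neutral
```

Mutations in truth-table rows the circuit never encounters are neutral, which is one intuition behind the skewed robustness distributions the study reports.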
NASA Astrophysics Data System (ADS)
Vermeulen, A. T.; Kutsch, W. L.; Lavric, J. V.; Juurola, E.
2016-12-01
Fluxnet is facing a transition from single-PI and project engagement to a cooperation of infrastructures such as ICOS, Ameriflux, NEON, Chinaflux or TERN. Each of these infrastructures has developed its own data life cycle, data license and data policy, which will have implications for future cooperation within Fluxnet and other global data integration efforts such as SOCAT in the ocean community. This presentation will introduce the recent developments of the ICOS data policy and show perspectives for future cooperation in global networks. The challenge in developing the ICOS data policy has been to find the best compromise between optimized access for users and sufficient visibility and acknowledgement of data providers. ICOS data will be provided under the Creative Commons 4.0 BY license through the ICOS Carbon Portal. Data usage will be unrestricted; data need only be attributed as ICOS data. With the attribution, ICOS will provide a persistent identifier (PID, sometimes also called a digital object identifier, DOI) that will direct to a landing page where the data provider and, if necessary, also the funding organisations are identified. In cooperation with other environmental research infrastructures in the framework of the European cluster project ENVRIplus and the Research Data Alliance (RDA), the ICOS Carbon Portal is currently developing a data citation system. This includes developing recommendations for data citation of integrated data sets from different sources.
Townsend-Small, Amy; Marrero, Josette E; Lyon, David R; Simpson, Isobel J; Meinardi, Simone; Blake, Donald R
2015-07-07
A growing dependence on natural gas for energy may exacerbate emissions of the greenhouse gas methane (CH4). Identifying fingerprints of these emissions is critical to our understanding of potential impacts. Here, we compare stable isotopic and alkane ratio tracers of natural gas, agricultural, and urban CH4 sources in the Barnett Shale hydraulic fracturing region near Fort Worth, Texas. Thermogenic and biogenic sources were compositionally distinct, and emissions from oil wells were enriched in alkanes and isotopically depleted relative to natural gas wells. Emissions from natural gas production varied in δ13C and alkane ratio composition, with δD-CH4 representing the most consistent tracer of natural gas sources. We integrated our data into a bottom-up inventory of CH4 for the region, resulting in an inventory of ethane (C2H6) sources for comparison to top-down estimates of CH4 and C2H6 emissions. Methane emissions in the Barnett are a complex mixture of urban, agricultural, and fossil fuel sources, which makes source apportionment challenging. For example, spatial heterogeneity in gas composition and high C2H6/CH4 ratios in emissions from conventional oil production add uncertainty to top-down models of source apportionment. Future top-down studies may benefit from the addition of δD-CH4 to distinguish thermogenic and biogenic sources.
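A minimal version of isotopic source apportionment is a two-endmember mixing calculation; the δD-CH4 end-member values below are illustrative assumptions, not the campaign's measured values:

```python
def thermogenic_fraction(delta_sample, delta_biogenic=-300.0, delta_thermogenic=-175.0):
    """Two-endmember isotope mixing: fraction of CH4 from the thermogenic source.

    All values are δD-CH4 in per mil; the end-member values are illustrative
    assumptions, not the Barnett campaign's results.
    """
    return (delta_sample - delta_biogenic) / (delta_thermogenic - delta_biogenic)

# A sample isotopically midway between the two end members.
print(thermogenic_fraction(-237.5))  # -> 0.5
```

Real apportionment is harder than this sketch suggests, for exactly the reasons the abstract gives: spatial heterogeneity in gas composition blurs the end-member values themselves.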
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
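Task 3 above, finding paths from hazard sources to vulnerable entities, amounts to reachability search on a directed graph of the system model. A minimal sketch, using a hypothetical model rather than the report's actual tool set:

```python
from collections import deque

def hazard_paths(graph, sources, targets):
    """Find one shortest propagation path from each hazard source to each
    vulnerable entity in a directed system model (adjacency-list dict)."""
    paths = {}
    for s in sources:
        # Breadth-first search from s, recording each node's predecessor.
        prev = {s: None}
        queue = deque([s])
        while queue:
            node = queue.popleft()
            for nxt in graph.get(node, ()):
                if nxt not in prev:
                    prev[nxt] = node
                    queue.append(nxt)
        for t in targets:
            if t in prev:
                # Walk predecessors back from the target to reconstruct the path.
                path, node = [], t
                while node is not None:
                    path.append(node)
                    node = prev[node]
                paths[(s, t)] = path[::-1]
    return paths

# Hypothetical system-software model for illustration.
model = {
    "leak": ["valve"],
    "valve": ["controller"],
    "controller": ["pump", "display"],
    "short_circuit": ["controller"],
}
found = hazard_paths(model, ["leak"], ["pump"])
print(found)
```

Each recovered path is then a candidate scenario for the simulation and integration-testing steps that follow.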
Co, Manuel C.; Boden-Albala, Bernadette; Quarles, Leigh; Wilcox, Adam; Bakken, Suzanne
2012-01-01
In designing informatics infrastructure to support comparative effectiveness research (CER), it is necessary to implement approaches for integrating heterogeneous data sources, such as clinical data typically stored in clinical data warehouses and data normally stored in separate research databases. One strategy to support this integration is the use of a concept-oriented data dictionary with a set of semantic terminology models. The aim of this paper is to illustrate the use of the semantic structure of Clinical LOINC (Logical Observation Identifiers Names and Codes) in integrating community-based survey items into the Medical Entities Dictionary (MED) to support the integration of survey data with clinical data for CER studies. PMID:24199059
A reproducible approach to high-throughput biological data acquisition and integration
Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher
2015-01-01
Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642
The contribution of different information sources for adverse effects data.
Golder, Su; Loke, Yoon K
2012-04-01
The aim of this study is to determine the relative value and contribution of searching different sources to identify adverse effects data. The process of updating a systematic review and meta-analysis of thiazolidinedione-related fractures in patients with type 2 diabetes mellitus was used as a case study. For each source searched, a record was made for each relevant reference included in the review noting whether it was retrieved with the search strategy used and whether it was available but not retrieved. The sensitivity, precision, and number needed to read from searching each source and from different combinations of sources were also calculated. There were 58 relevant references which presented sufficient numerical data to be included in a meta-analysis of fractures and bone mineral density. The highest number of relevant references were retrieved from Science Citation Index (SCI) (35), followed by BIOSIS Previews (27) and EMBASE (24). The precision of the searches varied from 0.88% (Scirus) to 41.67% (CENTRAL). With the search strategies used, the minimum combination of sources required to retrieve all the relevant references was; the GlaxoSmithKline (GSK) website, Science Citation Index (SCI), EMBASE, BIOSIS Previews, British Library Direct, Medscape DrugInfo, handsearching and reference checking, AHFS First, and Thomson Reuters Integrity or Conference Papers Index (CPI). In order to identify all the relevant references for this case study a number of different sources needed to be searched. The minimum combination of sources required to identify all the relevant references did not include MEDLINE.
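The per-source metrics reported here follow directly from the overlap between a source's retrieved records and the review's relevant references. The sketch below uses the review's total of 58 relevant references with an invented retrieved set for illustration:

```python
def search_performance(retrieved, relevant):
    """Sensitivity, precision, and number needed to read (NNR) for one source.

    `retrieved` and `relevant` are sets of reference identifiers; NNR is the
    reciprocal of precision, i.e. how many records must be screened per
    relevant reference found.
    """
    hits = retrieved & relevant
    sensitivity = len(hits) / len(relevant)
    precision = len(hits) / len(retrieved)
    nnr = 1 / precision if precision else float("inf")
    return sensitivity, precision, nnr

retrieved = set(range(1, 101))  # hypothetical: 100 records returned by one search
relevant = set(range(1, 59))    # the review's 58 relevant references
print(search_performance(retrieved, relevant))
```

Computing these per source, and then over unions of sources, is how a minimum combination retrieving all relevant references can be identified.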
Contaminant source and release history identification in groundwater: A multi-step approach
NASA Astrophysics Data System (ADS)
Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.
2014-02-01
The paper presents a new multi-step approach aimed at source identification and release history estimation. The new approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The present paper shows the results obtained from the application of the approach to a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland, in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War, producing various chemicals. From an environmental point of view, the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in the knowledge of contamination at the site. Some suspected contamination sources have been shown to have only a minor effect on the overall contamination, while other suspected sources have proven to be of key significance. Some areas not previously taken into consideration have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven to be effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.
Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S
2015-03-15
The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfill the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (an integrated stormwater quality model with uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.
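A conceptual accumulation/washoff model of the kind mentioned above can be sketched in a few lines: pollutant builds up on the catchment in dry weather and is washed off in proportion to rainfall and the accumulated stock. The coefficients here are illustrative assumptions, not calibrated values from the study:

```python
def simulate_washoff(rain_mm, dt_days=1.0, accum_rate=0.5, washoff_coeff=0.02, b0=0.0):
    """Minimal conceptual accumulation/washoff model for one pollutant.

    Buildup (mass per unit area) grows linearly in dry weather; during rain
    a fraction proportional to rainfall depth is washed off. All coefficients
    are illustrative, not the study's calibrated values.
    """
    buildup, washed = b0, []
    for rain in rain_mm:
        buildup += accum_rate * dt_days        # dry-weather accumulation
        flux = washoff_coeff * rain * buildup  # washoff ~ rain depth * stock
        flux = min(flux, buildup)              # cannot wash off more than is there
        buildup -= flux
        washed.append(flux)
    return washed, buildup

# Three dry days interrupted by one 10 mm event.
washed, remaining = simulate_washoff([0, 0, 10, 0], washoff_coeff=0.2)
print(washed, remaining)
```

Even this toy version shows why such models are uncertain: the fluxes depend strongly on accumulation and washoff coefficients that monitoring data only loosely constrain, which is what motivates the pseudo-Bayesian uncertainty estimation.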
In Silico Gene Prioritization by Integrating Multiple Data Sources
Zhou, Yingyao; Shields, Robert; Chanda, Sumit K.; Elston, Robert C.; Li, Jing
2011-01-01
Identifying disease genes is crucial to the understanding of disease pathogenesis, and to the improvement of disease diagnosis and treatment. In recent years, many researchers have proposed approaches to prioritize candidate genes by considering the relationship of candidate genes and existing known disease genes, reflected in other data sources. In this paper, we propose an expandable framework for gene prioritization that can integrate multiple heterogeneous data sources by taking advantage of a unified graphic representation. Gene-gene relationships and gene-disease relationships are then defined based on the overall topology of each network using a diffusion kernel measure. These relationship measures are in turn normalized to derive an overall measure across all networks, which is utilized to rank all candidate genes. Based on the informativeness of available data sources with respect to each specific disease, we also propose an adaptive threshold score to select a small subset of candidate genes for further validation studies. We performed large-scale cross-validation analysis on 110 disease families using three data sources. Results have shown that our approach consistently outperforms the two other state-of-the-art programs. A case study using Parkinson disease (PD) has identified four candidate genes (UBB, SEPT5, GPR37 and TH) that ranked higher than our adaptive threshold, all of which are involved in the PD pathway. In particular, a very recent study has observed a deletion of TH in a patient with PD, which supports the importance of the TH gene in PD pathogenesis. A web tool has been implemented to assist scientists in their genetic studies. PMID:21731658
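The diffusion-kernel scoring described above can be sketched on a toy network: compute the kernel K = exp(-βL) from the graph Laplacian L, then rank candidates by their kernel similarity to the known disease genes. The five-gene network, β value, and gene labels below are assumptions for illustration, not data from the paper:

```python
import numpy as np
from scipy.linalg import expm

# Toy gene-gene network: symmetric adjacency matrix over 5 genes (assumption).
genes = ["g0", "g1", "g2", "g3", "g4"]
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

L = np.diag(A.sum(axis=1)) - A  # graph Laplacian
beta = 1.0                       # diffusion rate (assumption)
K = expm(-beta * L)              # diffusion kernel

known_disease = [0]              # index of the known disease gene
candidates = [2, 3, 4]

# Score each candidate by summed kernel similarity to the known disease genes.
scores = {genes[c]: K[c, known_disease].sum() for c in candidates}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # g2 (direct neighbor of g0) should rank first, distant g4 last
```

In the full framework, such scores would be computed per network, normalized, and combined across data sources before applying the adaptive threshold.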
The faint X-ray sources in and out of omega Centauri: X-ray observations and optical identifications
NASA Technical Reports Server (NTRS)
Cool, Adrienne M.; Grindlay, Jonathan E.; Bailyn, Charles D.; Callanan, Paul J.; Hertz, Paul
1995-01-01
We present the results of an observation of the globular cluster omega Cen (NGC 5139) with the Einstein high-resolution imager (HRI). Of the five low-luminosity X-ray sources toward omega Cen which were first identified with the Einstein imaging proportional counter (IPC) (Hertz and Grindlay 1983a, b), two are detected in the Einstein HRI observation: IPC sources A and D. These detections provide source positions accurate to 3-4 arcsec; the positions are confirmed in a ROSAT HRI observation reported here. Using CCD photometry and spectroscopy, we have identified both sources as foreground dwarf M stars with emission lines (dMe). The chance projection of two dMe stars within approximately 13 arcmin of the center of omega Cen is not extraordinary, given the space density of these stellar coronal X-ray sources. We discuss the possible nature of the three as yet unidentified IPC sources toward omega Cen, and consider the constraints that the Einstein observations place on the total population of X-ray sources in this cluster. The integrated luminosity from faint X-ray sources in omega Cen appears to be low relative to both the old open cluster M67 and the post-core-collapse globular cluster NGC 6397.
On the Diversity of Linguistic Data and the Integration of the Language Sciences.
D'Alessandro, Roberta; van Oostendorp, Marc
2017-01-01
An integrated science of language is usually advocated as a step forward for linguistic research. In this paper, we maintain that integration of this sort is premature, and cannot take place before we identify a common object of study. We advocate instead a science of language that is inherently multi-faceted, and takes into account the different viewpoints as well as the different definitions of the object of study. We also advocate the use of different data sources, which, if non-contradictory, can provide more solid evidence for linguistic analysis. Last, we argue that generative grammar is an important tile in the puzzle.
An integrative process model of leadership: examining loci, mechanisms, and event cycles.
Eberly, Marion B; Johnson, Michael D; Hernandez, Morela; Avolio, Bruce J
2013-09-01
Utilizing the locus (source) and mechanism (transmission) of leadership framework (Hernandez, Eberly, Avolio, & Johnson, 2011), we propose and examine the application of an integrative process model of leadership to help determine the psychological interactive processes that constitute leadership. In particular, we identify the various dynamics involved in generating leadership processes by modeling how the loci and mechanisms interact through a series of leadership event cycles. We discuss the major implications of this model for advancing an integrative understanding of what constitutes leadership and its current and future impact on the field of psychological theory, research, and practice. © 2013 APA, all rights reserved.
Dealing with the Data Deluge: Handling the Multitude Of Chemical Biology Data Sources
Guha, Rajarshi; Nguyen, Dac-Trung; Southall, Noel; Jadhav, Ajit
2012-01-01
Over the last 20 years, there has been an explosion in the amount and type of biological and chemical data that has been made publicly available in a variety of online databases. While this means that vast amounts of information can be found online, there is no guarantee that it can be found easily (or at all). A scientist searching for a specific piece of information is faced with a daunting task - many databases have overlapping content, use their own identifiers and, in some cases, have arcane and unintuitive user interfaces. In this overview, a variety of well known data sources for chemical and biological information are highlighted, focusing on those most useful for chemical biology research. The issue of using multiple data sources together and the associated problems such as identifier disambiguation are highlighted. A brief discussion is then provided on Tripod, a recently developed platform that supports the integration of arbitrary data sources, providing users a simple interface to search across a federated collection of resources. PMID:26609498
Wei, Lin; Tang, Ruqi; Lian, Baofeng; Zhao, Yingjun; He, Xianghuo; Xie, Lu
2014-01-01
Background Recently, a number of studies have performed genome or exome sequencing of hepatocellular carcinoma (HCC) and identified hundreds or even thousands of mutations in protein-coding genes. However, these studies have only focused on a limited number of candidate genes, and many important mutation resources remain to be explored. Principal Findings In this study, we integrated mutation data obtained from various sources and performed pathway and network analysis. We identified 113 pathways that were significantly mutated in HCC samples and found that the mutated genes included in these pathways contained high percentages of known cancer genes and damaging genes, and also demonstrated high conservation scores, indicating their important roles in liver tumorigenesis. Five classes of pathways that were mutated most frequently included (a) proliferation and apoptosis related pathways, (b) tumor microenvironment related pathways, (c) neural signaling related pathways, (d) metabolic related pathways, and (e) circadian related pathways. Network analysis further revealed that the mutated genes with the highest betweenness coefficients, such as the well-known cancer genes TP53, CTNNB1 and recently identified novel mutated genes GNAL and the ADCY family, may play key roles in these significantly mutated pathways. Finally, we highlight several key genes (e.g., RPS6KA3 and PCLO) and pathways (e.g., axon guidance) in which the mutations were associated with clinical features. Conclusions Our workflow illustrates the increased statistical power of integrating multiple studies of the same subject, which can provide biological insights that would otherwise be masked under individual sample sets. This type of bioinformatics approach is consistent with the necessity of making the best use of the ever increasing data provided in valuable databases, such as TCGA, to enhance the speed of deciphering human cancers. PMID:24988079
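"Significantly mutated" pathways are conventionally assessed with a hypergeometric (one-sided Fisher) test on the overlap between the mutated gene set and the pathway's members. The sketch below uses toy counts and may differ from the study's exact statistics:

```python
from math import comb

def pathway_enrichment_p(n_genome, n_pathway, n_mutated, n_overlap):
    """Hypergeometric upper-tail probability that at least `n_overlap` of the
    mutated genes fall in the pathway by chance, given `n_pathway` pathway
    genes out of `n_genome` total and `n_mutated` mutated genes overall.
    A sketch of the standard enrichment test, not the study's exact method.
    """
    total = comb(n_genome, n_mutated)
    p = 0.0
    for k in range(n_overlap, min(n_pathway, n_mutated) + 1):
        p += comb(n_pathway, k) * comb(n_genome - n_pathway, n_mutated - k) / total
    return p

# Toy counts: 2000 genes, a 100-gene pathway, 50 mutated genes, 12 in the pathway
# (expected overlap by chance is only 50 * 100 / 2000 = 2.5).
p = pathway_enrichment_p(2000, 100, 50, 12)
print(p)
```

With 113 pathways tested, such raw p-values would then need multiple-testing correction before declaring a pathway significantly mutated.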
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
Guetterman, Timothy C; Fetters, Michael D; Creswell, John W
2015-11-01
Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.
Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.
2015-01-01
PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895
Sparse Method for Direction of Arrival Estimation Using Denoised Fourth-Order Cumulants Vector.
Fan, Yangyu; Wang, Jianshu; Du, Rui; Lv, Guoyun
2018-06-04
Fourth-order cumulants (FOCs) vector-based direction of arrival (DOA) estimation methods for non-Gaussian sources may suffer from poor performance for limited snapshots or difficulty in setting parameters. In this paper, a novel FOCs vector-based sparse DOA estimation method is proposed. Firstly, by utilizing the concept of a fourth-order difference co-array (FODCA), an advanced FOCs vector denoising or dimension reduction procedure is presented for arbitrary array geometries. Then, a novel single measurement vector (SMV) model is established by the denoised FOCs vector, and efficiently solved by an off-grid sparse Bayesian inference (OGSBI) method. The estimation errors of FOCs are integrated in the SMV model, and are approximately estimated in a simple way. A necessary condition on the number of identifiable sources is presented: to uniquely identify all sources, the number of sources K must satisfy K ≤ (M⁴ − 2M³ + 7M² − 6M)/8. The proposed method suits any geometry, does not need prior knowledge of the number of sources, is insensitive to associated parameters, and has maximum identifiability O(M⁴), where M is the number of sensors in the array. Numerical simulations illustrate the superior performance of the proposed method.
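The identifiability bound above is simple arithmetic in the sensor count M; a minimal sketch (the function name is mine, not from the paper):

```python
def max_identifiable_sources(M: int) -> int:
    """Upper bound on the number of uniquely identifiable sources for an
    M-sensor array, per the condition K <= (M^4 - 2M^3 + 7M^2 - 6M) / 8
    quoted in the abstract. The numerator is divisible by 8 for integer M."""
    if M < 2:
        raise ValueError("need at least 2 sensors")
    return (M**4 - 2 * M**3 + 7 * M**2 - 6 * M) // 8

# The O(M^4) growth means even modest arrays resolve many sources.
for M in (4, 6, 8):
    print(M, max_identifiable_sources(M))
```

For example, M = 4 sensors already admit up to 27 identifiable sources under this bound.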
McCarthy, Kathleen A.; Alvarez, David A.
2014-01-01
The Eugene Water & Electric Board (EWEB) supplies drinking water to approximately 200,000 people in Eugene, Oregon. The sole source of this water is the McKenzie River, which has consistently excellent water quality relative to established drinking-water standards. To ensure that this quality is maintained as land use in the source basin changes and water demands increase, EWEB has developed a proactive management strategy that includes a combination of conventional point-in-time discrete water sampling and time‑integrated passive sampling with a combination of chemical analyses and bioassays to explore water quality and identify where vulnerabilities may lie. In this report, we present the results from six passive‑sampling deployments at six sites in the basin, including the intake and outflow from the EWEB drinking‑water treatment plant (DWTP). This is the first known use of passive samplers to investigate both the source and finished water of a municipal DWTP. Results indicate that low concentrations of several polycyclic aromatic hydrocarbons and organohalogen compounds are consistently present in source waters, and that many of these compounds are also present in finished drinking water. The nature and patterns of compounds detected suggest that land-surface runoff and atmospheric deposition act as ongoing sources of polycyclic aromatic hydrocarbons, some currently used pesticides, and several legacy organochlorine pesticides. Comparison of results from point-in-time and time-integrated sampling indicate that these two methods are complementary and, when used together, provide a clearer understanding of contaminant sources than either method alone.
Open Targets: a platform for therapeutic target identification and validation
Koscielny, Gautier; An, Peter; Carvalho-Silva, Denise; Cham, Jennifer A.; Fumis, Luca; Gasparyan, Rippa; Hasan, Samiul; Karamanis, Nikiforos; Maguire, Michael; Papa, Eliseo; Pierleoni, Andrea; Pignatelli, Miguel; Platt, Theo; Rowland, Francis; Wankar, Priyanka; Bento, A. Patrícia; Burdett, Tony; Fabregat, Antonio; Forbes, Simon; Gaulton, Anna; Gonzalez, Cristina Yenyxe; Hermjakob, Henning; Hersey, Anne; Jupe, Steven; Kafkas, Şenay; Keays, Maria; Leroy, Catherine; Lopez, Francisco-Javier; Magarinos, Maria Paula; Malone, James; McEntyre, Johanna; Munoz-Pomer Fuentes, Alfonso; O'Donovan, Claire; Papatheodorou, Irene; Parkinson, Helen; Palka, Barbara; Paschall, Justin; Petryszak, Robert; Pratanwanich, Naruemon; Sarntivijal, Sirarat; Saunders, Gary; Sidiropoulos, Konstantinos; Smith, Thomas; Sondka, Zbyslaw; Stegle, Oliver; Tang, Y. Amy; Turner, Edward; Vaughan, Brendan; Vrousgou, Olga; Watkins, Xavier; Martin, Maria-Jesus; Sanseau, Philippe; Vamathevan, Jessica; Birney, Ewan; Barrett, Jeffrey; Dunham, Ian
2017-01-01
We have designed and developed a data integration and visualization platform that provides evidence about the association of known and potential drug targets with diseases. The platform is designed to support identification and prioritization of biological targets for follow-up. Each drug target is linked to a disease using integrated genome-wide data from a broad range of data sources. The platform provides either a target-centric workflow to identify diseases that may be associated with a specific target, or a disease-centric workflow to identify targets that may be associated with a specific disease. Users can easily transition between these target- and disease-centric workflows. The Open Targets Validation Platform is accessible at https://www.targetvalidation.org. PMID:27899665
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. O. Tuemer; L. Doan; C. W. Su
2000-06-04
This paper describes the design and operation of a Compact Integrated Narcotics Detection Instrument (CINDI), which utilizes neutrons emitted from ²⁵²Cf. Neutrons emitted from the front face of CINDI penetrate dense compartment barrier materials with little change in energy but are backscattered by hydrogen-rich materials such as drugs. CINDI has led to a new technology that shows promise for identifying concealed contraband. Carriers such as vehicles, marine vessels, airplanes, containers, cargo, and luggage will be scanned using both neutron and gamma-ray sources. The signal from both the neutron and gamma-ray backscattering and/or transmission can be used simultaneously to detect and possibly identify the contraband it has been trained for.
32 CFR Appendix A to Part 806b - Definitions
Code of Federal Regulations, 2010 CFR
2010-07-01
... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...
32 CFR Appendix A to Part 806b - Definitions
Code of Federal Regulations, 2011 CFR
2011-07-01
... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...
Activities for Students: Biology as a Source for Algebra Equations--The Heart
ERIC Educational Resources Information Center
Horak, Virginia M.
2005-01-01
This article discusses a high school course that integrated first-year algebra with an introductory environmental biology/anatomy and physiology course in order to solve algebra problems. Lessons and activities for the course were developed by identifying areas where mathematics and biology content intersect, which may help students understand biology concepts…
ERIC Educational Resources Information Center
Scheiner, Thorsten
2015-01-01
The guiding philosophy of this theoretical work lies in the argument that mathematics teachers' professional knowledge is the integration of various knowledge facets derived from different sources, including teaching experience and research. This paper goes beyond past trends identifying what teachers' knowledge is about (content) by providing…
ERIC Educational Resources Information Center
Gerjets, Peter; Kammerer, Yvonne; Werner, Benita
2011-01-01
Web searching for complex information requires appropriately evaluating diverse sources of information. Information science studies have identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…
Triangulation in aetiological epidemiology.
Lawlor, Debbie A; Tilling, Kate; Davey Smith, George
2016-12-01
Triangulation is the practice of obtaining more reliable answers to research questions through integrating results from several different approaches, where each approach has different key sources of potential bias that are unrelated to each other. With respect to causal questions in aetiological epidemiology, if the results of different approaches all point to the same conclusion, this strengthens confidence in the finding. This is particularly the case when the key sources of bias of some of the approaches would predict that findings would point in opposite directions if they were due to such biases. Where there are inconsistencies, understanding the key sources of bias of each approach can help to identify what further research is required to address the causal question. The aim of this paper is to illustrate how triangulation might be used to improve causal inference in aetiological epidemiology. We propose a minimum set of criteria for use in triangulation in aetiological epidemiology, summarize the key sources of bias of several approaches and describe how these might be integrated within a triangulation framework. We emphasize the importance of being explicit about the expected direction of bias within each approach, whenever this is possible, and seeking to identify approaches that would be expected to bias the true causal effect in different directions. We also note the importance, when comparing results, of taking account of differences in the duration and timing of exposures. We provide three examples to illustrate these points. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
Nesbitt, A; Ravel, A; Murray, R; McCormick, R; Savelli, C; Finley, R; Parmley, J; Agunos, A; Majowicz, S E; Gilmour, M
2012-10-01
Salmonella enteritidis has emerged as the most prevalent cause of human salmonellosis in Canada. Recent trends of S. enteritidis subtypes and their potential sources were described by integrating Salmonella data from several Canadian surveillance and monitoring programmes. A threefold increase in S. enteritidis cases from 2003 to 2009 was identified to be primarily associated with phage types 13, 8 and 13a. Other common phage types (4, 1, 6a) showed winter seasonality and were more likely to be associated with cases linked to international travel. Conversely, phage types 13, 8 and 13a had summer seasonal peaks and were associated with cases of domestically acquired infections. During agri-food surveillance, S. enteritidis was detected in various commodities, most frequently in chicken (with PT13, PT8 and PT13a predominating). Antimicrobial resistance was low in human and non-human isolates. Continued integrated surveillance and collaborative prevention and control efforts are required to mitigate future illness.
NESBITT, A.; RAVEL, A.; MURRAY, R.; McCORMICK, R.; SAVELLI, C.; FINLEY, R.; PARMLEY, J.; AGUNOS, A.; MAJOWICZ, S. E.; GILMOUR, M.
2012-01-01
SUMMARY Salmonella Enteritidis has emerged as the most prevalent cause of human salmonellosis in Canada. Recent trends of S. Enteritidis subtypes and their potential sources were described by integrating Salmonella data from several Canadian surveillance and monitoring programmes. A threefold increase in S. Enteritidis cases from 2003 to 2009 was identified to be primarily associated with phage types 13, 8 and 13a. Other common phage types (4, 1, 6a) showed winter seasonality and were more likely to be associated with cases linked to international travel. Conversely, phage types 13, 8 and 13a had summer seasonal peaks and were associated with cases of domestically acquired infections. During agri-food surveillance, S. Enteritidis was detected in various commodities, most frequently in chicken (with PT13, PT8 and PT13a predominating). Antimicrobial resistance was low in human and non-human isolates. Continued integrated surveillance and collaborative prevention and control efforts are required to mitigate future illness. PMID:22166269
A Possible Magnetar Nature for IGR J16358-4726
NASA Technical Reports Server (NTRS)
Patel, S.; Zurita, J.; DelSanto, M.; Finger, M.; Koueliotou, C.; Eichler, D.; Gogus, E.; Ubertini, P.; Walter, R.; Woods, P.
2006-01-01
We present detailed spectral and timing analysis of the hard X-ray transient IGR J16358-4726 using multi-satellite archival observations. A study of the source flux time history over 6 years suggests that this transient's outbursts may recur at intervals of at most 1 year. Joint spectral fits using simultaneous Chandra/ACIS and INTEGRAL/ISGRI data reveal a spectrum well described by an absorbed cut-off power-law model plus an Fe line. We detected the pulsations initially reported using Chandra/ACIS also in the INTEGRAL/ISGRI light curve and in subsequent XMM-Newton observations. Using the INTEGRAL data we identified a pulse spin-up of 94 s (Ṗ = 1.6 × 10⁻⁴), which strongly points to a neutron star nature for IGR J16358-4726. Assuming that the spin-up is due to disc accretion, we estimate that the source magnetic field ranges between approximately 10¹³ and 10¹⁵ G depending on its distance, possibly supporting a magnetar nature for IGR J16358-4726.
iGC-an integrated analysis package of gene expression and copy number alteration.
Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y
2017-01-14
With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package (iGC). Multiple input formats are supported and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients with their gene expression levels and copy numbers than those genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html .
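The iGC criterion described above rests on correlating a gene's expression with its copy number across the same samples. The following is a toy sketch of that idea in Python with NumPy, not the package's actual R implementation; the function name, threshold, and data layout are my assumptions:

```python
import numpy as np

def cna_driven_genes(expr, cn, genes, r_threshold=0.5):
    """Toy criterion in the spirit of iGC: flag a gene as CNA-driven when
    its expression correlates with its copy number across samples above a
    chosen Pearson threshold.

    expr, cn : arrays of shape (n_genes, n_samples), matched row-for-row.
    genes    : gene labels for the rows.
    Returns a list of (gene, r) pairs for flagged genes."""
    flagged = []
    for i, gene in enumerate(genes):
        # Pearson correlation between this gene's expression and copy number
        r = float(np.corrcoef(expr[i], cn[i])[0, 1])
        if r > r_threshold:
            flagged.append((gene, round(r, 3)))
    return flagged

# Synthetic example: GENE_X tracks its copy number, GENE_Y does not.
expr = np.array([[2.0, 4.0, 6.0, 8.0],   # rises with copy number
                 [4.0, 3.0, 2.0, 1.0]])  # anti-correlated
cn = np.array([[1.0, 2.0, 3.0, 4.0],
               [1.0, 2.0, 3.0, 4.0]])
print(cna_driven_genes(expr, cn, ["GENE_X", "GENE_Y"]))
```

A real analysis would also apply statistical testing rather than a bare correlation cutoff.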
The Nature of the X-Ray Binary IGR J19294+1816 from INTEGRAL, RXTE, and Swift Observations
NASA Technical Reports Server (NTRS)
Rodriquez, J.; Tomsick, J. A.; Bodaghee, A.; ZuritaHeras, J.-A.; Chaty, S.; Paizis, A.; Corbel, S.
2009-01-01
We report the results of a high-energy multi-instrumental campaign with INTEGRAL, RXTE, and Swift of the recently discovered INTEGRAL source IGR J19294+1816. The Swift/XRT data allow us to refine the position of the source to R.A. (J2000) = 19h 29m 55.9s, Decl. (J2000) = +18°18′38″.4 (±3″.5), which in turn permits us to identify a candidate infrared counterpart. The Swift and RXTE spectra are well fitted with absorbed power laws with hard (Γ ≈ 1) photon indices. During the longest Swift observation, we obtained evidence of absorption in excess of the Galactic value, which may indicate some intrinsic absorption in this source. We detected strong (P = 40%) pulsations at 12.43781 (±0.00003) s that we interpret as the spin period of a pulsar. All these results, coupled with the possible 117-day orbital period, point to IGR J19294+1816 being a high-mass X-ray binary (HMXB) with a Be companion star. However, while the long-term INTEGRAL/IBIS/ISGRI 18-40 keV light curve shows that the source spends most of its time in an undetectable state, we detect occurrences of short (2000-3000 s) and intense flares that are more typical of supergiant fast X-ray transients (SFXTs). We therefore cannot make firm conclusions on the type of system, and we discuss the possible implications of IGR J19294+1816 being an SFXT.
Building an Ontology for Identity Resolution in Healthcare and Public Health.
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Our objectives were to develop and validate an ontology to: 1) identify components of identity and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage.
Remnants of an Ancient Deltaretrovirus in the Genomes of Horseshoe Bats (Rhinolophidae).
Hron, Tomáš; Farkašová, Helena; Gifford, Robert J; Benda, Petr; Hulva, Pavel; Görföl, Tamás; Pačes, Jan; Elleder, Daniel
2018-04-10
Endogenous retrovirus (ERV) sequences provide a rich source of information about the long-term interactions between retroviruses and their hosts. However, most ERVs are derived from a subset of retrovirus groups, while ERVs derived from certain other groups remain extremely rare. In particular, only a single ERV sequence has been identified that shows evidence of being related to an ancient Deltaretrovirus, despite the large number of vertebrate genome sequences now available. In this report, we identify a second example of an ERV sequence putatively derived from a past deltaretroviral infection in the genomes of several species of horseshoe bats (Rhinolophidae). This sequence represents a fragment of viral genome derived from a single integration. The time of the integration was estimated to be 11-19 million years ago. This finding, together with the previously identified endogenous Deltaretrovirus in long-fingered bats (Miniopteridae), suggests a close association of bats with ancient deltaretroviruses.
The importance of civilian nursing organizations: integrative literature review.
Santos, James Farley Estevam Dos; Santos, Regina Maria Dos; Costa, Laís de Miranda Crispim; Almeida, Lenira Maria Wanderley Santos de; Macêdo, Amanda Cavalcante de; Santos, Tânia Cristina Franco
2016-06-01
To identify and analyze evidence from studies about the importance of civilian nursing organizations. An integrative literature review, for which searches were conducted in the databases LILACS, PubMed/MEDLINE, SciELO, BDENF, and Scopus. Sixteen articles published between 2004 and 2013 were selected, 68.75% of which were sourced from Brazilian journals and 31.25% from American journals. Civilian nursing organizations are important and necessary, because they have collaborated decisively in nursing struggles in favor of the working class and society in general, and these contributions influence different axes of professional performance.
The Arithmetic of Emotion: Integration of Incidental and Integral Affect in Judgments and Decisions
Västfjäll, Daniel; Slovic, Paul; Burns, William J.; Erlandsson, Arvid; Koppel, Lina; Asutay, Erkin; Tinghög, Gustav
2016-01-01
Research has demonstrated that two types of affect have an influence on judgment and decision making: incidental affect (affect unrelated to a judgment or decision such as a mood) and integral affect (affect that is part of the perceiver’s internal representation of the option or target under consideration). So far, these two lines of research have seldom crossed so that knowledge concerning their combined effects is largely missing. To fill this gap, the present review highlights differences and similarities between integral and incidental affect. Further, common and unique mechanisms that enable these two types of affect to influence judgment and choices are identified. Finally, some basic principles for affect integration when the two sources co-occur are outlined. These mechanisms are discussed in relation to existing work that has focused on incidental or integral affect but not both. PMID:27014136
Contraindications for superficial heat and therapeutic ultrasound: do sources agree?
Batavia, Mitchell
2004-06-01
To determine the amount of agreement among general rehabilitation sources for both superficial heating and therapeutic ultrasound contraindications. English-language textbook and peer-reviewed journal sources, from January 1992 to July 2002. Searches of computerized databases (HealthSTAR, CINAHL, MEDLINE, Embase) as well as Library of Congress Online Catalogs, Books in Print, and AcqWeb's Directory of Publishers and Venders. Sources were excluded if they (1) were published before 1992, (2) failed to address general rehabilitation audiences, or (3) were identified as a researcher's related publication with similar information on the topic. Type and number of contraindications, type of audience, year of publication, number of references, rationales, and alternative treatment strategies. Eighteen superficial heat and 20 ultrasound sources identified anywhere from 5 to 22 and 9 to 36 contraindications/precautions, respectively. Agreement among sources was generally high but ranged from 11% to 95%, with lower agreement noted for pregnancy, metal implants, edema, skin integrity, and cognitive/communicative concerns. Seventy-two percent of superficial heat sources and 25% of ultrasound sources failed to reference at least 1 contraindication claim. Agreement among contraindication sources was generally good for both superficial heat and therapeutic ultrasound. Sources varied with regard to the number of contraindications, references, and rationales cited. Greater reliance on objective data and standardized classification systems may serve to develop more uniform guidelines for superficial heat and therapeutic ultrasound.
NOAA's Approach to Community Building and Governance for Data Integration and Standards Within IOOS
NASA Astrophysics Data System (ADS)
Willis, Z.; Shuford, R.
2007-12-01
This presentation will review NOAA's current approach to the Integrated Ocean Observing System (IOOS) at a national and regional level within the context of our United States Federal and Non-Federal partners. Further, it will discuss the context of integrating data and the necessary standards definition that must be done not only within the United States but in a larger global context. IOOS is the U.S. contribution to the Global Ocean Observing System (GOOS), which itself is the ocean contribution to the Global Earth Observation System of Systems (GEOSS). IOOS is a nationally important network of distributed systems that forms an infrastructure providing many different users with the diverse information they require to characterize, understand, predict, and monitor changes in dynamic coastal and open ocean environments. NOAA recently established an IOOS Program Office to provide a focal point for its ocean observation programs and assist with coordination of regional and national IOOS activities. One of the Program's initial priorities is the development of a data integration framework (DIF) proof-of-concept for IOOS data. The initial effort will focus on NOAA sources of data and be implemented incrementally over the course of three years. The first phase will focus on the integration of five core IOOS variables being collected, and disseminated, for independent purposes and goals by multiple NOAA observing sources. The goal is to ensure that data from different sources is interoperable to enable rapid and routine use by multiple NOAA decision-support tool developers and other end users. During the second phase we expect to ingest these integrated variables into four specific NOAA data products used for decision-support. Finally, we will systematically test and evaluate enhancements to these products, and verify, validate, and benchmark new performance specifications. 
The outcome will be an extensible product for operational use that allows for broader community applicability to include additional variables, applications, and non-NOAA sources of data. NOAA is working with Ocean.US to implement an interagency process for the submission, proposal, and recommendation of IOOS data standards. In order to achieve the broader goals of data interoperability of GEOSS, communication of this process and the identified standards needs to be coordinated at the international level. NOAA is participating in the development of a series of IODE workshops with the objective to achieve broad agreement and commitment to ocean data management and exchange standards. The first of these meetings will use the five core variables identified by the NOAA DIF as a focus.
NASA Astrophysics Data System (ADS)
Schmitt, R. J. P.; Bizzi, S.; Kondolf, G. M.; Rubin, Z.; Castelletti, A.
2016-12-01
Field and laboratory evidence indicates that the spatial distribution of transport in both alluvial and bedrock rivers is an adaptation to sediment supply. Sediment supply, in turn, depends on the spatial distribution and properties (e.g., grain sizes and supply rates) of individual sediment sources. Analyzing the distribution of transport capacity in a river network could hence clarify the spatial distribution and properties of sediment sources. Yet, challenges include (a) identifying the magnitude and spatial distribution of transport capacity for each of multiple grain sizes being simultaneously transported, and (b) estimating source grain sizes and supply rates, both at network scales. Herein, we approach the problem of identifying the spatial distribution of sediment sources and the resulting network sediment fluxes in a major, poorly monitored tributary (80,000 km2) of the Mekong. To do so, we apply the CASCADE modeling framework (Schmitt et al. (2016)). CASCADE calculates transport capacities and sediment fluxes for multiple grain sizes at the network scale based on remotely sensed morphology and modelled hydrology. CASCADE is run in an inverse Monte Carlo approach for 7500 random initializations of source grain sizes. In all runs, the supply of each source is inferred from the minimum downstream transport capacity for the source grain size. Results for each realization are compared to the sparse available sedimentary records. Only 1% of initializations reproduced the sedimentary record. Results for these realizations revealed a spatial pattern in source supply rates, grain sizes, and network sediment fluxes that correlated well with map-derived patterns in lithology and river morphology. Hence, we propose that observable river hydro-morphology contains information on upstream source properties that can be back-calculated using an inverse modeling approach. 
Such an approach could be coupled to more detailed models of hillslope processes in the future to derive integrated models of hillslope production and fluvial transport processes, which would be particularly useful for identifying sediment provenance in poorly monitored river basins.
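The abstract's inference loop (draw random source grain sizes, predict flux from the minimum downstream transport capacity, keep realizations matching the sedimentary record) can be sketched as follows. The transport-capacity formula, grain-size range, and tolerance below are illustrative assumptions, not the actual CASCADE implementation:

```python
import random

def transport_capacity(reach_slopes, grain_size):
    # Toy stand-in for a capacity model: capacity rises with channel
    # slope and falls with grain size (assumed form, not CASCADE's).
    return [s / grain_size for s in reach_slopes]

def simulate_flux(source_grain_size, downstream_slopes):
    # As in the abstract: source supply is inferred from the minimum
    # transport capacity along the downstream path for that grain size.
    return min(transport_capacity(downstream_slopes, source_grain_size))

def inverse_monte_carlo(observed_flux, downstream_slopes,
                        n_draws=7500, tol=0.05):
    # Draw random source grain sizes; keep realizations whose predicted
    # flux matches the (sparse) sedimentary record within tolerance.
    accepted = []
    for _ in range(n_draws):
        gs = random.uniform(0.001, 0.1)  # grain size in m (assumed range)
        flux = simulate_flux(gs, downstream_slopes)
        if abs(flux - observed_flux) / observed_flux < tol:
            accepted.append(gs)
    return accepted
```

In the study, only about 1% of the 7500 initializations survived such a filter; the accepted set then characterizes the plausible source grain sizes and supply rates.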
Review on open source operating systems for internet of things
NASA Astrophysics Data System (ADS)
Wang, Zhengmin; Li, Wei; Dong, Huiliang
2017-08-01
The Internet of Things (IoT) is an environment in which every device becomes smart in a smart world. The Internet of Things is growing rapidly; it is an integrated system of uniquely identifiable communicating devices that exchange information over a connected network to provide extensive services. IoT devices have very limited memory, computational power, and power supply, and traditional operating systems (OS) cannot meet the needs of IoT systems. In this paper, we therefore analyze the challenges of IoT OSs and survey applicable open-source OSs.
Integrating social and value dimensions into sustainability assessment of lignocellulosic biofuels
Raman, Sujatha; Mohr, Alison; Helliwell, Richard; Ribeiro, Barbara; Shortall, Orla; Smith, Robert; Millar, Kate
2015-01-01
The paper clarifies the social and value dimensions for integrated sustainability assessments of lignocellulosic biofuels. We develop a responsible innovation approach, looking at technology impacts and implementation challenges, assumptions and value conflicts influencing how impacts are identified and assessed, and different visions for future development. We identify three distinct value-based visions. From a techno-economic perspective, lignocellulosic biofuels can contribute to energy security with improved GHG implications and fewer sustainability problems than fossil fuels and first-generation biofuels, especially when biomass is domestically sourced. From socio-economic and cultural-economic perspectives, there are concerns about the capacity to support UK-sourced feedstocks in a global agri-economy, difficulties monitoring large-scale supply chains and their potential for distributing impacts unfairly, and tensions between domestic sourcing and established legacies of farming. To respond to these concerns, we identify the potential for moving away from a one-size-fits-all biofuel/biorefinery model to regionally-tailored bioenergy configurations that might lower large-scale uses of land for meat, reduce monocultures and fossil-energy needs of farming and diversify business models. These configurations could explore ways of reconciling some conflicts between food, fuel and feed (by mixing feed crops with lignocellulosic material for fuel, combining livestock grazing with energy crops, or using crops such as miscanthus to manage land that is no longer arable); different bioenergy applications (with on-farm use of feedstocks for heat and power and for commercial biofuel production); and climate change objectives and pressures on farming. Findings are based on stakeholder interviews, literature synthesis and discussions with an expert advisory group. PMID:26664147
Spallation Neutron Source Second Target Station Integrated Systems Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ankner, John Francis; An, Ke; Blokland, Willem
The Spallation Neutron Source (SNS) was designed from the beginning to accommodate both an accelerator upgrade to increase the proton power and a second target station (STS). Four workshops were organized in 2013 and 2014 to identify key science areas and challenges where neutrons will play a vital role [1-4]. Participants concluded that the addition of STS to the existing ORNL neutron sources was needed to complement the strengths of the High Flux Isotope Reactor (HFIR) and the SNS first target station (FTS). To address the capability gaps identified in the workshops, a study was undertaken to identify instrument concepts that could provide the required new science capabilities. The study outlined 22 instrument concepts and presented an initial science case for STS [5]. These instrument concepts formed the basis of a planning suite of instruments whose requirements determined an initial site layout and moderator selection. An STS Technical Design Report (TDR) documented the STS concept based on those choices [6]. Since issue of the TDR, the STS concept has significantly matured, as described in this document.
Cheminformatics and the Semantic Web: adding value with linked data and enhanced provenance
Frey, Jeremy G; Bird, Colin L
2013-01-01
Cheminformatics is evolving from being a field of study associated primarily with drug discovery into a discipline that embraces the distribution, management, access, and sharing of chemical data. The relationship with the related subject of bioinformatics is becoming stronger and better defined, owing to the influence of Semantic Web technologies, which enable researchers to integrate heterogeneous sources of chemical, biochemical, biological, and medical information. These developments depend on a range of factors: the principles of chemical identifiers and their role in relationships between chemical and biological entities; the importance of preserving provenance and properly curated metadata; and an understanding of the contribution that the Semantic Web can make at all stages of the research lifecycle. The movements toward open access, open source, and open collaboration all contribute to progress toward the goals of integration. PMID:24432050
Vision Based Autonomous Robotic Control for Advanced Inspection and Repair
NASA Technical Reports Server (NTRS)
Wehner, Walter S.
2014-01-01
The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.
García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J
2010-01-01
Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused, possibly subject to minor modifications, to integrate differently structured sources.
Reconstruction of the experimentally supported human protein interactome: what can we learn?
2013-01-01
Background: Understanding the topology and dynamics of the human protein-protein interaction (PPI) network will significantly contribute to biomedical research; its systematic reconstruction is therefore required. Several meta-databases integrate source PPI datasets, but the protein node sets of their networks vary depending on the PPI data combined. Due to this inherent heterogeneity, the way in which the human PPI network expands via multiple dataset integration has not been comprehensively analyzed. We aim at assembling the human interactome in a global structured way and exploring it to gain insights of biological relevance. Results: First, we defined the UniProtKB manually reviewed human “complete” proteome as the reference protein-node set, and then we mined five major source PPI datasets for direct PPIs exclusively between the reference proteins. We updated the protein and publication identifiers and normalized all PPIs to the UniProt identifier level. The reconstructed interactome covers approximately 60% of the human proteome and has a scale-free structure. No apparent differentiating gene functional classification characteristics were identified for the unrepresented proteins. The source dataset integration augments the network mainly in PPIs. Polyubiquitin emerged as the highest-degree node, but the inclusion of most of its identified PPIs may be reconsidered. The high number (>300) of connections of the subsequent fifteen proteins correlates well with their essential biological role. According to the power-law network structure, the unrepresented proteins should mainly have up to four connections with equally poorly connected interactors. Conclusions: Reconstructing the human interactome based on the a priori definition of the protein nodes enabled us to identify the currently included part of the human “complete” proteome, and to discuss the role of the proteins within the network topology with respect to their function. 
As the network expansion has to comply with the scale-free theory, we suggest that the core of the human interactome has essentially emerged. Thus, it could be employed in systems biology and biomedical research, despite the considerable number of currently unrepresented proteins. The latter are probably involved in specialized physiological conditions, justifying the scarcity of related PPI information, and their identification can assist in designing relevant functional experiments and targeted text mining algorithms. PMID:24088582
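The normalization step described in the abstract (mapping source-specific protein identifiers to UniProt accessions and deduplicating undirected pairs) can be sketched as below; the identifier map and example IDs are hypothetical:

```python
from collections import defaultdict

def normalize_ppis(raw_ppis, to_uniprot):
    # Map each interactor to its UniProt accession and keep only
    # mapped, non-self pairs; sorting makes the pair undirected.
    edges = set()
    for a, b in raw_ppis:
        ua, ub = to_uniprot.get(a), to_uniprot.get(b)
        if ua and ub and ua != ub:
            edges.add(tuple(sorted((ua, ub))))
    return edges

def degree_distribution(edges):
    # Node degrees, e.g. for checking the scale-free structure or
    # spotting extreme hubs such as polyubiquitin.
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return dict(degree)
```

Because duplicate pairs from different source datasets collapse to a single edge, integrating a further dataset mainly adds PPIs among already-covered proteins, consistent with the abstract's observation that integration augments the network mainly in PPIs rather than in nodes.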
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-29
... scientific knowledge useful in indicating the kind and extent of all identifiable effects on public health or... effects of the pollutant on public health and welfare. EPA is also to revise the NAAQS, if appropriate... physics, sources and emissions, analytical methodology, transport and transformation in the environment...
Toward an Integrated System of Income Acquisition and Management: Four Community College Responses.
ERIC Educational Resources Information Center
Birmingham, Kathryn M.
This study argues that community college funding and resource development must become a long-term core function of the institution due to changes in the source of revenue for community colleges. The research problem was: (1) to identify and describe how organizational structure and management activities have changed in four community colleges in…
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-81
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-81. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-81 and the resulting effect on the Space Shuttle Program.
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-83
NASA Technical Reports Server (NTRS)
Lin, Jill D.; Katnik, Gregory N.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-83. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-83 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-71
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-71. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-71 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-102
NASA Technical Reports Server (NTRS)
Rivera, Jorge E.; Kelly, J. David (Technical Monitor)
2001-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-102. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-102 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-94
NASA Technical Reports Server (NTRS)
Bowen, Barry C.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-94. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-94 and the resulting effect on the Space Shuttle Program.
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-79
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1996-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-79. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-79 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-73
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-73. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle Mission STS-73 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-50
NASA Technical Reports Server (NTRS)
Higginbotham, Scott A.; Davis, J. Bradley; Katnik, Gregory N.
1992-01-01
A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-50. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-50, and the resulting effect on the Space Shuttle Program are documented.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis for Shuttle Mission STS-49
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1992-01-01
A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-49. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. Debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-49, and the resulting effect on the Space Shuttle Program are discussed.
Debris/Ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-77
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D. (Compiler)
1996-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-77. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-77 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-70
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-70. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-70 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-51
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1993-01-01
A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-51. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle mission STS-51 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-55
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1993-01-01
A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle mission STS-55. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/Frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle mission STS-55, and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-69
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-69. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system condition and integrated photographic analysis of Shuttle Mission STS-69 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-52
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1992-01-01
A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-52. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-52, and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-106
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Kelley, J. David (Technical Monitor)
2000-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-106. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-106 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS assessment and integrated photographic analysis of shuttle mission STS-76
NASA Technical Reports Server (NTRS)
Lin, Jill D.
1996-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-76. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-76 and the resulting effect on the Space Shuttle Program.
Influence of branding on preference-based decision making.
Philiastides, Marios G; Ratcliff, Roger
2013-07-01
Branding has become one of the most important determinants of consumer choices. Intriguingly, the psychological mechanisms of how branding influences decision making remain elusive. In the research reported here, we used a preference-based decision-making task and computational modeling to identify which internal components of processing are affected by branding. We found that a process of noisy temporal integration of subjective value information can model preference-based choices reliably and that branding biases are explained by changes in the rate of the integration process itself. This result suggests that branding information and subjective preference are integrated into a single source of evidence in the decision-making process, thereby altering choice behavior.
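The noisy temporal integration described above is the core of drift-diffusion-style models of choice. As an illustrative sketch only (not the authors' fitted model), the following simulates evidence accumulation to a decision bound, with a branding bias modeled as a change in the drift rate; all parameter values are arbitrary:

```python
import random

def simulate_choice(drift, threshold=1.0, noise=0.5, dt=0.01, max_steps=10000, rng=None):
    """Accumulate noisy evidence until a decision boundary is crossed.

    Returns (choice, decision_time): choice is +1 (upper bound) or -1 (lower bound),
    or 0 if no bound is reached within the time limit.
    """
    rng = rng or random.Random()
    x = 0.0
    for step in range(1, max_steps + 1):
        # Evidence step: mean drift plus Gaussian noise, scaled for the time step.
        x += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5
        if x >= threshold:
            return +1, step * dt
        if x <= -threshold:
            return -1, step * dt
    return 0, max_steps * dt

def choice_rate(drift, n=2000, seed=1):
    """Fraction of simulated trials choosing the upper-bound option."""
    rng = random.Random(seed)
    picks = [simulate_choice(drift, rng=rng)[0] for _ in range(n)]
    return sum(1 for p in picks if p == +1) / n
```

Raising the drift toward the branded option increases its choice probability, mirroring the rate change the study reports as the locus of the branding effect.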
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-53
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1993-01-01
A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-53. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/Frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-53, and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-54
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1993-01-01
A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-54. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-54, and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-61
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1994-01-01
A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-61. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/TPS conditions and integrated photographic analysis of shuttle mission STS-61, and the resulting effect on the space shuttle program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-72
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.
1996-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-72. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-72 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle mission STS-58
NASA Technical Reports Server (NTRS)
Davis, J. Bradley; Rivera, Jorge E.; Katnik, Gregory N.; Bowen, Barry C.; Speece, Robert F.; Rosado, Pedro J.
1994-01-01
A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for Shuttle mission STS-58. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The ice/debris/TPS conditions and integrated photographic analysis of Shuttle mission STS-58, and the resulting effect on the Space Shuttle Program are documented.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle mission STS-47
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1992-01-01
A debris/ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-47. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-47, and the resulting effect on the Space Shuttle Program.
Zou, Bin; Jiang, Xiaolu; Duan, Xiaoli; Zhao, Xiuge; Zhang, Jing; Tang, Jingwen; Sun, Guoqing
2017-03-23
Traditional sampling for soil pollution evaluation is cost-intensive and has limited representativeness. Therefore, developing methods that can accurately and rapidly identify at-risk areas and the contributing pollutants is imperative for soil remediation. In this study, we propose an innovative integrated H-G scheme, combining human health risk assessment and geographical detector methods based on geographical information system technology, and validate its feasibility in a renewable resource industrial park in mainland China. With a discrete site investigation of cadmium (Cd), arsenic (As), copper (Cu), mercury (Hg) and zinc (Zn) concentrations, the continuous surfaces of carcinogenic risk and non-carcinogenic risk caused by these heavy metals were estimated and mapped. Source apportionment analysis using geographical detector methods further revealed that these risks were primarily attributed to As, according to the power of the determinant and its associated synergic actions with other heavy metals. Concentrations of the critical elements As and Cd, and the associated carcinogenic risks, were close to the safe thresholds after remediating the risk areas identified by the integrated H-G scheme. Therefore, the integrated H-G scheme provides an effective approach to support decision-making for regional contaminated soil remediation at fine spatial resolution with limited sampling data over a large geographical extent.
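The "power of the determinant" used in geographical detector analysis is conventionally computed as the q-statistic: the share of total variance in a risk surface explained by a categorical stratification. A minimal sketch (the actual H-G implementation is GIS-based and not detailed in the abstract):

```python
def q_statistic(values, strata):
    """Geographical detector q-statistic: fraction of the variance in `values`
    explained by the partition `strata` (0 = explains nothing, 1 = explains all).

    q = 1 - SSW / SST, where SSW is the within-stratum sum of squares and
    SST is the total sum of squares.
    """
    n = len(values)
    mean = sum(values) / n
    sst = sum((v - mean) ** 2 for v in values)
    groups = {}
    for v, s in zip(values, strata):
        groups.setdefault(s, []).append(v)
    ssw = 0.0
    for vs in groups.values():
        m = sum(vs) / len(vs)
        ssw += sum((v - m) ** 2 for v in vs)
    return 1.0 - ssw / sst
```

A determinant (e.g. a classed As-concentration map) with high q would be flagged as a dominant driver of the mapped risk.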
Derse, E.; Knee, K.L.; Wankel, Scott D.; Kendall, C.; Berg, C.J.; Paytan, A.
2007-01-01
Sewage effluent, storm runoff, discharge from polluted rivers, and inputs of groundwater have all been suggested as potential sources of land-derived nutrients into Hanalei Bay, Kauai. We determined the nitrogen isotopic signatures (δ15N) of different nitrate sources to Hanalei Bay along with the isotopic signature recorded by 11 species of macroalgae collected in the Bay. The macroalgae integrate the isotopic signatures of the nitrate sources over time; these data, along with the nitrate to dissolved inorganic phosphate molar ratios (N:P) of the macroalgae, were used to determine the major nitrate source to the bay ecosystem and which of the macro-nutrients is limiting algal growth, respectively. Relatively low δ15N values (average -0.5‰) were observed in all algae collected throughout the Bay, implicating fertilizer, rather than domestic sewage, as an important external source of nitrogen to the coastal water around Hanalei. The N:P ratio in the algae compared to the ratio in the Bay waters implies that the Hanalei Bay coastal ecosystem is nitrogen limited and thus increased nitrogen input may potentially impact this coastal ecosystem and specifically the coral reefs in the Bay. Identifying the major source of nutrient loading to the Bay is important for risk assessment and potential remediation plans. © 2007 American Chemical Society.
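Attributing algal δ15N to a dominant source typically relies on a linear two-end-member mixing model. A minimal sketch; the end-member values used in the comment (fertilizer ≈ -1‰, sewage ≈ +9‰) are illustrative assumptions, not the study's calibrated signatures:

```python
def source_fraction(delta_mix, delta_a, delta_b):
    """Fraction of nitrogen derived from source A under linear
    two-end-member mixing of δ15N signatures:
    f_A = (δ_mix - δ_B) / (δ_A - δ_B)."""
    if delta_a == delta_b:
        raise ValueError("end members must be isotopically distinct")
    return (delta_mix - delta_b) / (delta_a - delta_b)

# With hypothetical end members of -1 per mil (fertilizer) and +9 per mil
# (sewage), an algal value of -0.5 per mil implies the nitrogen pool is
# overwhelmingly fertilizer-derived.
```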
Integrating Microarray Data and GRNs.
Koumakis, L; Potamias, G; Tsiknakis, M; Zervakis, M; Moustakis, V
2016-01-01
With the completion of the Human Genome Project and the emergence of high-throughput technologies, a vast amount of molecular and biological data are being produced. Two of the most important and significant data sources come from microarray gene-expression experiments and respective databanks (e.g., Gene Expression Omnibus-GEO (http://www.ncbi.nlm.nih.gov/geo)), and from molecular pathways and Gene Regulatory Networks (GRNs) stored and curated in public (e.g., Kyoto Encyclopedia of Genes and Genomes-KEGG (http://www.genome.jp/kegg/pathway.html), Reactome (http://www.reactome.org/ReactomeGWT/entrypoint.html)) as well as in commercial repositories (e.g., Ingenuity IPA (http://www.ingenuity.com/products/ipa)). The association of these two sources aims to give new insight into disease understanding and reveal new molecular targets in the treatment of specific phenotypes. Three major research lines and respective efforts that try to utilize and combine data from both of these sources can be identified, namely: (1) de novo reconstruction of GRNs, (2) identification of gene signatures, and (3) identification of differentially expressed GRN functional paths (i.e., sub-GRN paths that distinguish between different phenotypes). In this chapter, we give an overview of the existing methods that support the different types of gene-expression and GRN integration, with a focus on methodologies that aim to identify phenotype-discriminant GRNs or subnetworks, and we also present our methodology.
Digging Back In Time: Integrating Historical Data Into an Operational Ocean Observing System
NASA Astrophysics Data System (ADS)
McCammon, M.
2016-02-01
Modern technologies allow near-real-time reporting and display of data from in situ instrumentation live on the internet. This has given users fast access to critical information for scientific applications, marine safety, planning, and numerous other activities. Equally valuable is access to historical data sets. However, it is challenging to identify and access sources of historical data of interest, as they exist in many different locations depending on the funding source and provider. Time-varying formats can also make it difficult to data-mine and display historical data. There is also the issue of data quality, and the need for a systematic means of assessing the credibility of historical data sets. The Alaska Ocean Observing System (AOOS) data management system demonstrates the successful ingestion of historical data, both old and new (as recent as yesterday), and has integrated numerous historical data streams into user-friendly data portals, available for data upload and display on the AOOS Website. An example is the inclusion of non-real-time (e.g. day old) AIS (Automatic Identification System) ship tracking data, important for scientists working in marine mammal migration regions. Other examples include historical sea ice data, and various data streams from previous research projects (e.g. moored time series, HF Radar surface currents, weather, shipboard CTD). Most program or project websites only offer access to data specific to their agency or project alone, but do not have the capacity to provide access to the plethora of other data that might be available for the region and be useful for integration, comparison and synthesis. AOOS offers end users a one-stop shop for data in the area they want to research, helping them identify other sources of information and access. Demonstrations of data portals using historical data illustrate these benefits.
Predicting Adverse Drug Effects from Literature- and Database-Mined Assertions.
La, Mary K; Sedykh, Alexander; Fourches, Denis; Muratov, Eugene; Tropsha, Alexander
2018-06-06
Given that adverse drug effects (ADEs) have led to post-market patient harm and subsequent drug withdrawal, failure of candidate agents in the drug development process, and other negative outcomes, it is essential to attempt to forecast ADEs and other relevant drug-target-effect relationships as early as possible. Current pharmacologic data sources, providing multiple complementary perspectives on the drug-target-effect paradigm, can be integrated to facilitate the inference of relationships between these entities. This study aims to identify both existing and unknown relationships between chemicals (C), protein targets (T), and ADEs (E) based on evidence in the literature. Cheminformatics and data mining approaches were employed to integrate and analyze publicly available clinical pharmacology data and literature assertions interrelating drugs, targets, and ADEs. Based on these assertions, a C-T-E relationship knowledge base was developed. Known pairwise relationships between chemicals, targets, and ADEs were collected from several pharmacological and biomedical data sources. These relationships were curated and integrated according to Swanson's paradigm to form C-T-E triangles. Missing C-E edges were then inferred as C-E relationships. Unreported associations between drugs, targets, and ADEs were inferred, and inferences were prioritized as testable hypotheses. Several C-E inferences, including testosterone → myocardial infarction, were identified using inferences based on the literature sources published prior to confirmatory case reports. Timestamping approaches confirmed the predictive ability of this inference strategy on a larger scale. The presented workflow, based on free-access databases and an association-based inference scheme, provided novel C-E relationships that have been validated post hoc in case reports. 
With refinement of prioritization schemes for the generated C-E inferences, this workflow may provide an effective computational method for the early detection of potential drug candidate ADEs that can be followed by targeted experimental investigations.
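Swanson's ABC paradigm, as applied in this workflow, infers candidate chemical-effect (C-E) edges by composing known chemical-target (C-T) and target-effect (T-E) relationships. A minimal sketch of that closure step (the published pipeline adds curation, filtering of known C-E pairs, and hypothesis prioritization):

```python
def infer_ce(ct_pairs, te_pairs):
    """Compose C-T and T-E edges into candidate C-E relationships.

    Returns a dict mapping (chemical, effect) to the number of distinct
    targets supporting the inference, a crude prioritization score.
    """
    targets_by_chem = {}
    for c, t in ct_pairs:
        targets_by_chem.setdefault(c, set()).add(t)
    effects_by_target = {}
    for t, e in te_pairs:
        effects_by_target.setdefault(t, set()).add(e)
    inferred = {}
    for c, ts in targets_by_chem.items():
        for t in ts:
            for e in effects_by_target.get(t, ()):
                inferred[(c, e)] = inferred.get((c, e), 0) + 1
    return inferred

# Known C-E pairs would then be subtracted to leave novel, testable hypotheses.
```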
Extension of optical lithography by mask-litho integration with computational lithography
NASA Astrophysics Data System (ADS)
Takigawa, T.; Gronlund, K.; Wiley, J.
2010-05-01
Wafer lithography process windows can be enlarged by using source mask co-optimization (SMO). Recently, SMO including freeform wafer scanner illumination sources has been developed. Freeform sources are generated by a programmable illumination system using a micro-mirror array or by custom Diffractive Optical Elements (DOE). The combination of freeform sources and complex masks generated by SMO shows an increased wafer lithography process window and reduced MEEF. Full-chip mask optimization using a source optimized by SMO can generate complex masks with small, variable-feature-size sub-resolution assist features (SRAFs). These complex masks create challenges for accurate mask pattern writing and low false-defect inspection. The accuracy of the small variable-sized mask SRAF patterns is degraded by short-range mask process proximity effects. To address the accuracy needed for these complex masks, we developed a highly accurate mask process correction (MPC) capability. It is also difficult to achieve low false-defect inspections of complex masks with conventional mask defect inspection systems. A printability check system, Mask Lithography Manufacturability Check (M-LMC), was developed and integrated with the 199-nm high-NA inspection system, NPI. M-LMC successfully identifies printable defects from the mass of raw defect images collected during the inspection of a complex mask. Long-range mask CD uniformity errors are compensated by scanner dose control. A mask CD uniformity error map obtained by a mask metrology system is used as input data to the scanner. Using this method, wafer CD uniformity is improved. As reviewed above, mask-litho integration technology with computational lithography is becoming increasingly important.
GNormPlus: An Integrative Approach for Tagging Genes, Gene Families, and Protein Domains
Lu, Zhiyong
2015-01-01
The automatic recognition of gene names and their associated database identifiers from biomedical text has been widely studied in recent years, as these tasks play an important role in many downstream text-mining applications. Despite significant previous research, only a small number of tools are publicly available and these tools are typically restricted to detecting only mention level gene names or only document level gene identifiers. In this work, we report GNormPlus: an end-to-end and open source system that handles both gene mention and identifier detection. We created a new corpus of 694 PubMed articles to support our development of GNormPlus, containing manual annotations for not only gene names and their identifiers, but also closely related concepts useful for gene name disambiguation, such as gene families and protein domains. GNormPlus integrates several advanced text-mining techniques, including SimConcept for resolving composite gene names. As a result, GNormPlus compares favorably to other state-of-the-art methods when evaluated on two widely used public benchmarking datasets, achieving 86.7% F1-score on the BioCreative II Gene Normalization task dataset and 50.1% F1-score on the BioCreative III Gene Normalization task dataset. The GNormPlus source code and its annotated corpus are freely available, and the results of applying GNormPlus to the entire PubMed are freely accessible through our web-based tool PubTator. PMID:26380306
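The benchmark figures quoted above (86.7% and 50.1%) are F1-scores: the harmonic mean of precision and recall over gold-standard annotations. For reference, the computation from raw counts is:

```python
def f1(tp, fp, fn):
    """F1-score from true-positive, false-positive, and false-negative counts.

    precision = tp / (tp + fp); recall = tp / (tp + fn);
    F1 = 2 * precision * recall / (precision + recall).
    """
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```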
Yang, Zeyu; Hollebone, Bruce P; Wang, Zhendi; Yang, Chun; Brown, Carl; Landriault, Mike
2013-06-01
A case study is presented for the forensic identification of several spilled biodiesels and their blends with petroleum oil using integrated forensic oil fingerprinting techniques. The integrated fingerprinting techniques combined SPE with GC/MS to obtain individual petroleum hydrocarbons (aliphatic hydrocarbons, polycyclic aromatic hydrocarbons and their alkylated derivatives, and biomarkers) and biodiesel hydrocarbons (fatty acid methyl esters, free fatty acids, glycerol, monoacylglycerides, and free sterols). HPLC equipped with an evaporative light scattering detector was also used to identify compounds that conventional GC/MS could not resolve. The three environmental samples (E1, E2, and E3) and one suspected source sample (S2) were dominated by vegetable oil with high acid values and low concentrations of fatty acid methyl esters. The suspected source sample S2 was responsible for the three spilled samples, although E1 was slightly contaminated by petroleum oil with light hydrocarbons. The suspected source sample S1 exhibited a high content of glycerol, a low content of glycerides, and high polarity, indicating its difference from the other samples. These samples may be separated byproducts of biodiesel production. Canola oil is the most likely feedstock for the three environmental samples and the suspected source sample S2. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Power system monitoring and source control of the Space Station Freedom DC power system testbed
NASA Technical Reports Server (NTRS)
Kimnach, Greg L.; Baez, Anastacio N.
1992-01-01
Unlike a terrestrial electric utility which can purchase power from a neighboring utility, the Space Station Freedom (SSF) has strictly limited energy resources; as a result, source control, system monitoring, system protection, and load management are essential to the safe and efficient operation of the SSF Electric Power System (EPS). These functions are being evaluated in the DC Power Management and Distribution (PMAD) Testbed which NASA LeRC has developed at the Power System Facility (PSF) located in Cleveland, Ohio. The testbed is an ideal platform to develop, integrate, and verify power system monitoring and control algorithms. State Estimation (SE) is a monitoring tool used extensively in terrestrial electric utilities to ensure safe power system operation. It uses redundant system information to calculate the actual state of the EPS, to isolate faulty sensors, to determine source operating points, to verify faults detected by subsidiary controllers, and to identify high impedance faults. Source control and monitoring safeguard the power generation and storage subsystems and ensure that the power system operates within safe limits while satisfying user demands with minimal interruptions. System monitoring functions, in coordination with hardware implemented schemes, provide for a complete fault protection system. The objective of this paper is to overview the development and integration of the state estimator and the source control algorithms.
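State estimation of the kind described exploits measurement redundancy: a weighted least-squares fit over redundant sensor readings yields both an estimate of the true state and residuals that expose faulty sensors. A deliberately simplified single-state sketch (a real EPS estimator solves a networked, multi-state problem with a measurement model):

```python
def estimate_state(measurements, weights):
    """Weighted least-squares estimate of one quantity from redundant readings.

    Weights are typically the inverse measurement variances. Returns the
    estimate and per-sensor residuals for bad-data detection.
    """
    num = sum(w * z for z, w in zip(measurements, weights))
    den = sum(weights)
    x_hat = num / den
    residuals = [z - x_hat for z in measurements]
    return x_hat, residuals

def flag_bad_sensors(measurements, weights, threshold):
    """Indices of sensors whose residual magnitude exceeds the threshold."""
    _, residuals = estimate_state(measurements, weights)
    return [i for i, r in enumerate(residuals) if abs(r) > threshold]
```

A reading far from the consensus estimate (e.g. a stuck or biased voltage sensor) produces a large residual and is flagged for isolation, analogous to the faulty-sensor isolation described above.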
On the periodic Toda lattice hierarchy with an integral source
NASA Astrophysics Data System (ADS)
Babajanov, Bazar; Fečkan, Michal; Urazboev, Gayrat
2017-11-01
This work is devoted to the application of the inverse spectral problem to the integration of the periodic Toda lattice hierarchy with an integral-type source. An effective method is presented for constructing solutions of the periodic Toda lattice hierarchy with an integral source.
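For context, the unperturbed periodic Toda lattice in Flaschka variables reads as below; the integral-type source studied in the paper adds further terms on the right-hand sides, whose exact form is given there:

```latex
\[
\dot{a}_n = a_n\,(b_{n+1} - b_n), \qquad
\dot{b}_n = 2\,(a_n^2 - a_{n-1}^2), \qquad
a_{n+N} = a_n, \quad b_{n+N} = b_n .
\]
```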
Yang, Jae-Seong; Kwon, Oh Sung; Kim, Sanguk; Jang, Sung Key
2013-01-01
Successful viral infection requires intimate communication between virus and host cell, a process that absolutely requires various host proteins. However, current efforts to discover novel host proteins as therapeutic targets for viral infection are difficult. Here, we developed an integrative-genomics approach to predict human genes involved in the early steps of hepatitis C virus (HCV) infection. By integrating HCV and human protein associations, co-expression data, and tight junction-tetraspanin web specific networks, we identified host proteins required for the early steps in HCV infection. Moreover, we validated the roles of newly identified proteins in HCV infection by knocking down their expression using small interfering RNAs. Specifically, a novel host factor CD63 was shown to directly interact with HCV E2 protein. We further demonstrated that an antibody against CD63 blocked HCV infection, indicating that CD63 may serve as a new therapeutic target for HCV-related diseases. The candidate gene list provides a source for identification of new therapeutic targets. PMID:23593195
Valentijn, Pim P; Biermann, Claus; Bruijnzeels, Marc A
2016-08-02
Integrated care services are considered a vital strategy for improving the Triple Aim values for people with chronic kidney disease. However, a solid scholarly explanation of how to develop, implement and evaluate such value-based integrated renal care services is limited. The aim of this study was to develop a framework to identify the strategies and outcomes for the implementation of value-based integrated renal care. First, the theoretical foundations of the Rainbow Model of Integrated Care and the Triple Aim were united into one overarching framework through an iterative process of key-informant consultations. Second, a rapid review approach was conducted to identify the published research on integrated renal care, and the Cochrane Library, Medline, Scopus, and Business Source Premier databases were searched for pertinent articles published between 2000 and 2015. Based on the framework, a coding schema was developed to synthesise the included articles. The overarching framework distinguishes the integration domains 1) type of integration and 2) enablers of integration, and the interrelated outcome domains 3) experience of care, 4) population health and 5) costs. The literature synthesis indicated that integrated renal care implementation strategies have particularly focused on micro-level clinical processes and physical outcomes, while little emphasis has been placed on meso-level organisational and macro-level system integration processes. In addition, evidence regarding patients' perceived outcomes and economic outcomes has been weak. These results underscore that the future challenge for researchers is to explore which integrated care implementation strategies achieve better health and an improved experience of care at a lower cost within a specific context. For this purpose, this study's framework and evidence synthesis have set a developmental agenda for both integrated renal care practice and research.
Accordingly, we plan further work to develop an implementation model for value-based integrated renal services.
The Network Structure Underlying the Earth Observation Assessment
NASA Astrophysics Data System (ADS)
Vitkin, S.; Doane, W. E. J.; Mary, J. C.
2017-12-01
The Earth Observations Assessment (EOA 2016) is a multiyear project designed to assess the effectiveness of civil earth observation data sources (instruments, sensors, models, etc.) on societal benefit areas (SBAs) for the United States. Subject matter experts (SMEs) provided input and scored how data sources inform products, product groups, key objectives, SBA sub-areas, and SBAs in an attempt to quantify the relationships between data sources and SBAs. The resulting data were processed by Integrated Applications Incorporated (IAI) using MITRE's PALMA software to create normalized relative impact scores for each of these relationships. However, PALMA processing obscures the natural network representation of the data. Any network analysis that might identify patterns of interaction among data sources, products, and SBAs is therefore impossible. Collaborating with IAI, we cleaned and recreated a network from the original dataset. Using R and Python we explore the underlying structure of the network and apply frequent itemset mining algorithms to identify groups of data sources and products that interact. We reveal interesting patterns and relationships in the EOA dataset that were not immediately observable from the EOA 2016 report and provide a basis for further exploration of the EOA network dataset.
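Frequent itemset mining of the kind applied to the EOA network looks for sets of items (here, data sources) that co-occur across many transactions (here, the products they jointly inform). A naive sketch with made-up item names; production analyses would use an Apriori or FP-growth implementation rather than this exhaustive enumeration:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support, max_size=3):
    """Return itemsets (as sorted tuples) appearing in at least
    `min_support` transactions, with their support counts."""
    items = sorted({i for t in transactions for i in t})
    result = {}
    for size in range(1, max_size + 1):
        for combo in combinations(items, size):
            count = sum(1 for t in transactions if set(combo) <= set(t))
            if count >= min_support:
                result[combo] = count
    return result

# Hypothetical example: products informed by co-occurring data sources.
portfolios = [{"landsat", "modis"}, {"landsat", "modis", "goes"}, {"landsat"}]
```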
Standardised survey method for identifying catchment risks to water quality.
Baker, D L; Ferguson, C M; Chier, P; Warnecke, M; Watkinson, A
2016-06-01
This paper describes the development and application of a systematic methodology to identify and quantify risks in drinking water and recreational catchments. The methodology assesses microbial and chemical contaminants from both diffuse and point sources within a catchment using Escherichia coli, protozoan pathogens and chemicals (including fuel and pesticides) as index contaminants. Hazard source information is gathered by a defined sanitary survey process involving use of a software tool which groups hazards into six types: sewage infrastructure, on-site sewage systems, industrial, stormwater, agriculture and recreational sites. The survey estimates the likelihood of the site affecting catchment water quality, and the potential consequences, enabling the calculation of risk for individual sites. These risks are integrated to calculate a cumulative risk for each sub-catchment and the whole catchment. The cumulative risks process accounts for the proportion of potential input sources surveyed and for transfer of contaminants from upstream to downstream sub-catchments. The output risk matrices show the relative risk sources for each of the index contaminants, highlighting those with the greatest impact on water quality at a sub-catchment and catchment level. Verification of the sanitary survey assessments and prioritisation is achieved by comparison with water quality data and microbial source tracking.
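The cumulative risk calculation described can be sketched as a likelihood-consequence product per hazard site, summed per sub-catchment and scaled by survey coverage. The coverage-scaling heuristic below is an assumption for illustration, not the published method:

```python
def site_risk(likelihood, consequence):
    """Risk score for a single hazard site: likelihood times consequence."""
    return likelihood * consequence

def subcatchment_risk(sites, surveyed_fraction):
    """Cumulative sub-catchment risk: summed site risks, scaled up to account
    for the proportion of potential input sources actually surveyed.

    `sites` is a list of (likelihood, consequence) pairs;
    `surveyed_fraction` is in (0, 1].
    """
    if not 0 < surveyed_fraction <= 1:
        raise ValueError("surveyed_fraction must be in (0, 1]")
    total = sum(site_risk(l, c) for l, c in sites)
    return total / surveyed_fraction
```

Catchment-level risk would then aggregate sub-catchment scores, propagating contaminant loads from upstream to downstream as the survey method describes.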
Origin and sources of dissolved organic matter in snow on the East Antarctic ice sheet.
Antony, Runa; Grannas, Amanda M; Willoughby, Amanda S; Sleighter, Rachel L; Thamban, Meloth; Hatcher, Patrick G
2014-06-03
Polar ice sheets hold a significant pool of the world's carbon reserve and are an integral component of the global carbon cycle. Yet, organic carbon composition and cycling in these systems is least understood. Here, we use ultrahigh resolution mass spectrometry to elucidate, at an unprecedented level, molecular details of dissolved organic matter (DOM) in Antarctic snow. Tens of thousands of distinct molecular species are identified, providing clues to the nature and sources of organic carbon in Antarctica. We show that many of the identified supraglacial organic matter formulas are consistent with material from microbial sources, and terrestrial inputs of vascular plant-derived materials are likely more important sources of organic carbon to Antarctica than previously thought. Black carbon-like material apparently originating from biomass burning in South America is also present, while a smaller fraction originated from soil humics and appears to be photochemically or microbially modified. In addition to remote continental sources, we document signals of oceanic emissions of primary aerosols and secondary organic aerosol precursors. The new insights on the diversity of organic species in Antarctic snowpack reinforce the importance of studying organic carbon associated with the Earth's polar regions in the face of changing climate.
Development, Integration and Utilization of Surface Nuclear Energy Sources for Exploration Missions
NASA Technical Reports Server (NTRS)
Houts, Michael G.; Schmidt, George R.; Bragg-Sitton, Shannon; Hickman, Robert; Hissam, Andy; Houston, Vance; Martin, Jim; Mireles, Omar; Reid, Bob; Schneider, Todd
2005-01-01
Throughout the past five decades numerous studies have identified nuclear energy as an enhancing or enabling technology for human surface exploration missions. Nuclear energy sources were used to provide electricity on Apollo missions 12, 14, 15, 16, and 17, and on the Mars Viking landers. Nuclear energy sources were used to provide heat on the Pathfinder, Spirit, and Discovery rovers. Scenarios have been proposed that utilize ~1 kWe radioisotope systems for early missions, followed by fission systems in the 10 - 30 kWe range when energy requirements increase. A fission energy source unit size of approximately 150 kWt has been proposed based on previous lunar and Mars base architecture studies. Such a unit could support both early and advanced bases through a building block approach.
Building an Ontology for Identity Resolution in Healthcare and Public Health
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P.; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Objectives: Our objectives were to: 1) identify components of identity and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. Methods: We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. Results: We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. Conclusion: The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage. PMID:26392849
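The event-modeling idea can be sketched with a minimal, stdlib-only triple store in the style of the Simple Event Model (SEM) named above; the URIs, the demographic property, and the tiny store itself are illustrative assumptions, not the authors' actual schema.

```python
# Represent identity-changing events after birth as SEM-style triples,
# then ask which events touched a given person. All instance data is invented.
SEM = "http://semanticweb.cs.vu.nl/2009/11/sem/"
EX = "http://example.org/identity/"

graph = set()

def add(s, p, o):
    graph.add((s, p, o))

# A birth record followed by a legal name change two years later.
add(EX + "birth-001", "rdf:type", SEM + "Event")
add(EX + "birth-001", SEM + "hasActor", EX + "child-42")
add(EX + "birth-001", SEM + "hasTime", "2013-05-14")
add(EX + "rename-007", "rdf:type", SEM + "Event")
add(EX + "rename-007", SEM + "hasActor", EX + "child-42")
add(EX + "rename-007", SEM + "hasTime", "2015-03-01")
add(EX + "rename-007", EX + "newFamilyName", "Smith")

def events_for(person):
    """All events whose sem:hasActor is the given person, sorted by time."""
    evs = {s for s, p, o in graph if p == SEM + "hasActor" and o == person}
    return sorted((next(o for s, p, o in graph
                        if s == e and p == SEM + "hasTime"), e) for e in evs)

timeline = events_for(EX + "child-42")
```

Because both events share one actor URI, a single query reconstructs the identity timeline across what would otherwise be separate source records.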
Approaches to integrating germline and tumor genomic data in cancer research
Feigelson, Heather Spencer; Goddard, Katrina A.B.; Hollombe, Celine; Tingle, Sharna R.; Gillanders, Elizabeth M.; Mechanic, Leah E.; Nelson, Stefanie A.
2014-01-01
Cancer is characterized by a diversity of genetic and epigenetic alterations occurring in both the germline and somatic (tumor) genomes. Hundreds of germline variants associated with cancer risk have been identified, and large amounts of data identifying mutations in the tumor genome that participate in tumorigenesis have been generated. Increasingly, these two genomes are being explored jointly to better understand how cancer risk alleles contribute to carcinogenesis and whether they influence development of specific tumor types or mutation profiles. To understand how data from germline risk studies and tumor genome profiling is being integrated, we reviewed 160 articles describing research that incorporated data from both genomes, published between January 2009 and December 2012, and summarized the current state of the field. We identified three principal types of research questions being addressed using these data: (i) use of tumor data to determine the putative function of germline risk variants; (ii) identification and analysis of relationships between host genetic background and particular tumor mutations or types; and (iii) use of tumor molecular profiling data to reduce genetic heterogeneity or refine phenotypes for germline association studies. We also found descriptive studies that compared germline and tumor genomic variation in a gene or gene family, and papers describing research methods, data sources, or analytical tools. We identified a large set of tools and data resources that can be used to analyze and integrate data from both genomes. Finally, we discuss opportunities and challenges for cancer research that integrates germline and tumor genomics data. PMID:25115441
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; McNair, Wade; Sukumar, Sreenivas R
2014-01-01
In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
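The scoring idea can be illustrated with a toy ILM-style readiness calculation; the data-management functions, weights, thresholds, and 1-5 maturity scale below are assumptions made for illustration, not the actual DAMA-DMBOK instrument.

```python
# Toy readiness score: weight per-function maturity scores (1-5) and map the
# weighted total to the highest integration level whose threshold it meets.
FUNCTIONS = {
    "data_governance": 0.25,
    "data_quality": 0.25,
    "metadata_management": 0.20,
    "data_security": 0.15,
    "data_architecture": 0.15,
}
LEVELS = ["data accessibility", "common data platform", "consolidated data model"]

def readiness(scores, thresholds=(2.0, 3.0, 4.0)):
    """Return the weighted maturity score and the deepest level reached."""
    total = sum(FUNCTIONS[f] * scores[f] for f in FUNCTIONS)
    level = None
    for name, cut in zip(LEVELS, thresholds):
        if total >= cut:
            level = name
    return total, level

score, level = readiness({"data_governance": 4, "data_quality": 3,
                          "metadata_management": 3, "data_security": 4,
                          "data_architecture": 2})
```

Higher weighted scores admit the data system to deeper levels of integration, matching the paper's claim that better-managed sources can be brought in at a higher level.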
NASA Astrophysics Data System (ADS)
Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.
2014-05-01
In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
Research considerations when studying disasters.
Cox, Catherine Wilson
2008-03-01
Nurses play an integral role during disasters because they are called upon more than any other health care professional during disaster response efforts; consequently, nurse researchers are interested in studying the issues that impact nurses in the aftermath of a disaster. This article offers research considerations for nurse scientists when developing proposals related to disaster research and identifies resources and possible funding sources for their projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
As part of the U.S. Navy's overall energy strategy, the National Renewable Energy Laboratory (NREL) partnered with the Naval Facilities Engineering Command (NAVFAC) to demonstrate market-ready energy efficiency measures, renewable energy generation, and energy systems integration. One such technology - retrofitting rooftop air-conditioning units with an advanced rooftop control system - was identified as a promising source for reducing energy use and costs, and can contribute to increasing energy security.
The effect of object processing in content-dependent source memory
2013-01-01
Background Previous studies have suggested that the study condition of an item influences how the item is encoded. However, it is still unclear whether subsequent source memory effects are dependent upon stimulus content when the item and context are unitized. The present fMRI study investigated the effect of encoding activity sensitive to stimulus content in source memory via unitization. In the scanner, participants were instructed to integrate a study item, an object in either a word or a picture form, with perceptual context into a single image. Results Subsequent source memory effects independent of stimulus content were identified in the left lateral frontal and parietal regions, bilateral fusiform areas, and the left perirhinal cortex extending to the anterior hippocampus. Content-dependent subsequent source memory effects were found only with words in the left medial frontal lobe, the ventral visual stream, and bilateral parahippocampal regions. Further, neural activity for source memory with words extensively overlapped with the region where pictures were preferentially processed than words, including the left mid-occipital cortex and the right parahippocampal cortex. Conclusions These results indicate that words that were accurately remembered with correct contextual information were processed more like pictures mediated by integrated imagery operation, compared to words that were recognized with incorrect context. In contrast, such processing did not discriminate subsequent source memory with pictures. Taken together, these findings suggest that unitization supports source memory for both words and pictures and that the requirement of the study task interacts with the nature of stimulus content in unitized source encoding. PMID:23848969
TISSUES 2.0: an integrative web resource on mammalian tissue expression
Palasca, Oana; Santos, Alberto; Stolte, Christian; Gorodkin, Jan; Jensen, Lars Juhl
2018-01-01
Abstract Physiological and molecular similarities between organisms make it possible to translate findings from simpler experimental systems—model organisms—into more complex ones, such as human. This translation facilitates the understanding of biological processes under normal or disease conditions. Researchers aiming to identify the similarities and differences between organisms at the molecular level need resources collecting multi-organism tissue expression data. We have developed a database of gene–tissue associations in human, mouse, rat and pig by integrating multiple sources of evidence: transcriptomics covering all four species and proteomics (human only), manually curated and mined from the scientific literature. Through a scoring scheme, these associations are made comparable across all sources of evidence and across organisms. Furthermore, the scoring produces a confidence score assigned to each of the associations. The TISSUES database (version 2.0) is publicly accessible through a user-friendly web interface and as part of the STRING app for Cytoscape. In addition, we analyzed the agreement between datasets, across and within organisms, and identified that the agreement is mainly affected by the quality of the datasets rather than by the technologies used or organisms compared. Database URL: http://tissues.jensenlab.org/ PMID:29617745
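The cross-source scoring idea can be sketched in a few lines; the evidence channels, weights, and 0-5 cap below are illustrative assumptions, not the actual TISSUES scoring scheme.

```python
# Toy version of a unified confidence score for gene-tissue associations,
# making heterogeneous evidence sources comparable on one 0-5 scale.
def confidence(evidence):
    """Map per-channel scores (0-1) to a single 0-5 confidence, letting the
    strongest evidence channel dominate so sources remain comparable."""
    weights = {"curated": 5.0, "proteomics": 4.0,
               "transcriptomics": 3.0, "text_mining": 2.0}
    return min(max(weights[ch] * s for ch, s in evidence.items()), 5.0)

assoc = confidence({"transcriptomics": 0.9, "text_mining": 0.8})
curated = confidence({"curated": 1.0})
```

Manually curated evidence saturates the scale, while weaker channels such as text mining can only reach part of it, which is one simple way to make associations comparable across sources and organisms.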
Multi-Omics Factor Analysis-a framework for unsupervised integration of multi-omics data sets.
Argelaguet, Ricard; Velten, Britta; Arnol, Damien; Dietrich, Sascha; Zenz, Thorsten; Marioni, John C; Buettner, Florian; Huber, Wolfgang; Stegle, Oliver
2018-06-20
Multi-omics studies promise the improved characterization of biological processes across molecular layers. However, methods for the unsupervised integration of the resulting heterogeneous data sets are lacking. We present Multi-Omics Factor Analysis (MOFA), a computational method for discovering the principal sources of variation in multi-omics data sets. MOFA infers a set of (hidden) factors that capture biological and technical sources of variability. It disentangles axes of heterogeneity that are shared across multiple modalities and those specific to individual data modalities. The learnt factors enable a variety of downstream analyses, including identification of sample subgroups, data imputation and the detection of outlier samples. We applied MOFA to a cohort of 200 patient samples of chronic lymphocytic leukaemia, profiled for somatic mutations, RNA expression, DNA methylation and ex vivo drug responses. MOFA identified major dimensions of disease heterogeneity, including immunoglobulin heavy-chain variable region status, trisomy of chromosome 12 and previously underappreciated drivers, such as response to oxidative stress. In a second application, we used MOFA to analyse single-cell multi-omics data, identifying coordinated transcriptional and epigenetic changes along cell differentiation. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.
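A toy numerical illustration of the idea behind MOFA: samples measured in two "omics" views share latent factors, which can be recovered from the stacked data matrix. Plain SVD stands in here for MOFA's probabilistic factor model, and all dimensions and noise levels are invented for the demonstration.

```python
# Simulate two views driven by shared latent factors, stack them, and recover
# the factor subspace with SVD. This is a sketch of the concept, not MOFA.
import numpy as np

rng = np.random.default_rng(0)
n_samples, k = 100, 2
z = rng.normal(size=(n_samples, k))        # shared latent factors
w_rna = rng.normal(size=(k, 30))           # loadings for view 1 (e.g. RNA)
w_meth = rng.normal(size=(k, 20))          # loadings for view 2 (e.g. methylation)
x = np.hstack([z @ w_rna, z @ w_meth]) + 0.1 * rng.normal(size=(n_samples, 50))

u, s, vt = np.linalg.svd(x, full_matrices=False)
factors = u[:, :k] * s[:k]                 # estimated per-sample factors

# With only two true factors, the top two components dominate the variance.
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

MOFA goes further than this sketch by modeling each view separately, which lets it tell shared axes of heterogeneity apart from view-specific ones; the SVD of the stacked matrix only recovers the shared subspace.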
India RE Grid Integration Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cochran, Jaquelin M
The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established a target of 175 gigawatts (GW) of installed RE capacity by 2022, including 60 GW of wind and 100 GW of solar, up from 29 GW wind and 9 GW solar at the beginning of 2017. Using advanced weather and power system modeling developed for this project, the study team explores the operational impacts of meeting India's RE targets and identifies actions that may be favorable for integration.
A Nondestructive Method to Identify POP Contamination Sources in Omnivorous Seabirds.
Michielsen, Rosanne J; Shamoun-Baranes, Judy; Parsons, John R; Kraak, Michiel H S
2018-03-13
Persistent organic pollutants (POPs) are present in almost all environments due to their high bioaccumulation potential. Especially species that adapted to human activities, like gulls, might be exposed to harmful concentrations of these chemicals. The nature and degree of the exposure to POPs greatly vary between individual gulls, due to their diverse foraging behavior and specialization in certain foraging tactics. Therefore, in order to clarify the effect of POP-contaminated areas on gull populations, it is important to identify the sources of POP contamination in individual gulls. Conventional sampling methods applied when studying POP contamination are destructive and ethically undesirable. The aim of this literature review was to evaluate the potential of using feathers as a nondestructive method to determine sources of POP contamination in individual gulls. The reviewed data showed that high concentrations of PCBs and PBDEs in feathers together with a large proportion of less bioaccumulative congeners may indicate that the contamination originates from landfills. Low PCB and PBDE concentrations in feathers and a large proportion of more bioaccumulative congeners could indicate that the contamination originates from marine prey. We propose a nondestructive approach to identify the source of contamination in individual gulls based on individual contamination levels and PCB and PBDE congener profiles in feathers. Despite some uncertainties that might be reduced by future research, we conclude that especially when integrated with other methods like GPS tracking and the analysis of stable isotopic signatures, identifying the source of POP contamination based on congener profiles in feathers could become a powerful nondestructive method.
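The proposed decision logic can be stated as a simple rule; the numeric thresholds below are placeholders, since the review argues from relative concentrations and congener proportions rather than fixed cutoffs.

```python
# Sketch of the proposed source attribution for an individual gull, from total
# feather PCB concentration and the share of less-bioaccumulative congeners.
# Both thresholds are invented for illustration.
def pop_source(total_pcb_ng_g, frac_less_bioaccumulative):
    """Classify the likely POP source for one bird's feather profile."""
    if total_pcb_ng_g > 100 and frac_less_bioaccumulative > 0.5:
        return "landfill"
    if total_pcb_ng_g <= 100 and frac_less_bioaccumulative <= 0.5:
        return "marine prey"
    return "indeterminate"
```

In practice such a rule would be combined with GPS tracking and stable-isotope data, as the review suggests, rather than applied on its own.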
A Possible Magnetar Nature for IGR J16358-4726
NASA Technical Reports Server (NTRS)
Patel, S. K.; Zurita, J.; DelSanto, M.; Finger, M.; Kouveliotou, C.; Eichler, D.; Gogus, E.; Ubertini, P.; Walter, R.; Woods, P.;
2007-01-01
We present detailed spectral and timing analysis of the hard X-ray transient IGR J16358-4726 using multisatellite archival observations. A study of the source flux time history over 6 yr suggests that lower luminosity transient outbursts can be occurring in intervals of at most 1 yr. Joint spectral fits of the higher luminosity outburst using simultaneous Chandra ACIS and INTEGRAL ISGRI data reveal a spectrum well described by an absorbed power-law model with a high-energy cutoff plus an Fe line. We detected the 1.6 hr pulsations initially reported using Chandra ACIS also in the INTEGRAL ISGRI light curve and in subsequent XMM-Newton observations. Using the INTEGRAL data, we identified a spin-up of 94 s (Ṗ = 1.6 × 10^-4), which strongly points to a neutron star nature for IGR J16358-4726. Assuming that the spin-up is due to disk accretion, we estimate that the source magnetic field ranges between 10^13 and 10^15 G, depending on its distance, possibly supporting a magnetar nature for IGR J16358-4726.
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
NASA Astrophysics Data System (ADS)
Terminanto, A.; Swantoro, H. A.; Hidayanto, A. N.
2017-12-01
Enterprise Resource Planning (ERP) is an integrated information system for managing the business processes of companies of various scales. Because of the high cost of ERP investment, ERP implementation is usually undertaken by large enterprises. Due to the complexity of implementation problems, the success rate of ERP implementation is still low. Open-source software (OSS) ERP has become an alternative ERP choice for SME companies in terms of cost and customization. This study aims to identify the characteristics and configuration of an OSS ERP payroll module implementation at KKPS (Employee Cooperative PT SRI) using OSS ERP Odoo and the ASAP method. The study is classified as both case study research and action research. The OSS ERP payroll module was implemented because the HR section of KKPS had not been integrated with other parts of the organization. The results of this study are the characteristics and configuration of the OSS ERP payroll module at KKPS.
Annotating novel genes by integrating synthetic lethals and genomic information
Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter
2008-01-01
Background Large scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members in this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Different from existing methods that require large amounts of synthetic lethal data, our method merely relies on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assign the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates and we present she1 (YBL031W) as a novel gene involved in spindle migration. We applied the statistical methodology also to TOR2 signaling as another example. Conclusion We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
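The candidate-selection step can be illustrated with a compact stand-in: fit a two-component Gaussian mixture to an integrated per-gene feature score and treat genes assigned to the higher-scoring component as candidates. This 1-D EM is a simplification of the paper's multivariate model, and the scores are simulated.

```python
# Two-component 1-D Gaussian mixture fit by EM, then candidate selection.
import math
import random

def em_gmm_1d(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.
    Returns component means, standard deviations, and weights."""
    m = [min(xs), max(xs)]          # initialize means at the data extremes
    s = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            p = [w[k] / s[k] * math.exp(-(x - m[k]) ** 2 / (2 * s[k] ** 2))
                 for k in (0, 1)]
            r.append(p[1] / (p[0] + p[1]))
        # M-step: re-estimate parameters from the soft assignments
        for k, rk in ((0, [1 - q for q in r]), (1, r)):
            n = sum(rk)
            m[k] = sum(q * x for q, x in zip(rk, xs)) / n
            s[k] = max(1e-3, math.sqrt(
                sum(q * (x - m[k]) ** 2 for q, x in zip(rk, xs)) / n))
            w[k] = n / len(xs)
    return m, s, w

random.seed(0)
scores = ([random.gauss(0.0, 0.5) for _ in range(80)] +      # background genes
          [random.gauss(3.0, 0.5) for _ in range(20)])       # promising genes
m, s, w = em_gmm_1d(scores)
candidates = [i for i, x in enumerate(scores) if x > (m[0] + m[1]) / 2]
```

The point of the exercise, as in the paper, is triage: the mixture separates a small high-scoring group worth experimental follow-up from the bulk of screen hits.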
19 CFR 10.532 - Integrated Sourcing Initiative.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Integrated Sourcing Initiative. 10.532 Section 10... Trade Agreement Rules of Origin § 10.532 Integrated Sourcing Initiative. (a) For purposes of General... Sourcing Initiative if: (1) The good, in its condition as imported, is both classified in a tariff...
Semantic Web meets Integrative Biology: a survey.
Chen, Huajun; Yu, Tong; Chen, Jake Y
2013-01-01
Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of the data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.
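The data-integration problem labeled (i) above often reduces to unifying identifiers across sources. A stdlib-only toy triple store can sketch the pattern: two sources use different identifiers for the same gene, an owl:sameAs-style mapping unifies them, and a query over the merged graph sees both facts. The identifiers and predicates are invented for illustration.

```python
# Merge triples from two sources after canonicalizing equivalent identifiers.
SAME_AS = "owl:sameAs"

source_a = {("ncbigene:7157", "encodes", "protein:P04637")}
source_b = {("hgnc:11998", "associatedWith", "disease:li-fraumeni")}
mappings = {("ncbigene:7157", SAME_AS, "hgnc:11998")}

def canonicalize(triples, mappings):
    """Rewrite every identifier to a canonical representative chosen from
    the sameAs mapping (here: the subject of each mapping triple)."""
    canon = {o: s for s, _, o in mappings}
    rewrite = lambda t: canon.get(t, t)
    return {(rewrite(s), p, rewrite(o)) for s, p, o in triples}

merged = canonicalize(source_a | source_b, mappings)
facts_about_gene = {(p, o) for s, p, o in merged if s == "ncbigene:7157"}
```

After canonicalization, one query against one identifier retrieves facts that originated in separate databases, which is the practical payoff the survey attributes to SW-based integration.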
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-103
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
2000-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-103. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-103 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-91
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1998-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-91. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-91 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-93
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1999-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-93. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis findings of Space Shuttle mission STS-93 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-95
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1999-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-95. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-95 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-90
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1998-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-90. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-90 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-80
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for Shuttle mission STS-80. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission Space Transportation System (STS-80) and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-89
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1998-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-89. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-89 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-112
NASA Technical Reports Server (NTRS)
Oliu, Armando
2002-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-112. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-112 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-74
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.
1996-01-01
A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-74. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-74 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-87
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1998-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-87. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-87 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-96
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1999-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-96. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-96 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-101
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
2000-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle Mission STS-101. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-101 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-88
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
1999-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-88. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-88 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-64
NASA Technical Reports Server (NTRS)
Davis, J. Bradley; Bowen, Barry C.; Rivera, Jorge E.; Speece, Robert F.; Katnik, Gregory N.
1994-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-64. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-64, and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-68
NASA Technical Reports Server (NTRS)
Rivera, Jorge E.; Bowen, Barry C.; Davis, J. Bradley; Speece, Robert F.
1994-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-68. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-68, and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-111
NASA Technical Reports Server (NTRS)
Oliu, Armando
2005-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-111. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-111 and the resulting effect on the Space Shuttle Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cochran, Jaquelin
This fact sheet overviews the Greening the Grid India grid integration study. The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established a target of 175 gigawatts (GW) of installed RE capacity by 2022, including 60 GW of wind and 100 GW of solar, up from 29 GW wind and 9 GW solar at the beginning of 2017. Thanks to advanced weather and power system modeling made for this project, the study team is able to explore operational impacts of meeting India's RE targets and identify actions that may be favorable for integration.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-99
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
2000-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-99. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-99 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-98
NASA Technical Reports Server (NTRS)
Speece, Robert F.
2004-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle Mission STS-98. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-98 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of shuttle mission STS-63
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for shuttle mission STS-63. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-63, and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-66
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-66. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-66, and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-97
NASA Technical Reports Server (NTRS)
Rivera, Jorge E.; Kelly, J. David (Technical Monitor)
2001-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-97. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-97 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-86
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-86. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-86 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-100
NASA Technical Reports Server (NTRS)
Oliu, Armando
2004-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-100. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-100 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-92
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.
2000-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-92. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-92 and the resulting effect, if any, on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-65
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1994-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for shuttle mission STS-65. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-65, and the resulting effect on the Space Shuttle Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, N. Jill
2002-09-17
These proceedings contain papers prepared for the 24th Seismic Research Review: Nuclear Explosion Monitoring: Innovation and Integration, held 17-19 September, 2002 in Ponte Vedra Beach, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.
How can land-use modelling tools inform bioenergy policies?
Davis, Sarah C.; House, Joanna I.; Diaz-Chavez, Rocio A.; Molnar, Andras; Valin, Hugo; DeLucia, Evan H.
2011-01-01
Targets for bioenergy have been set worldwide to mitigate climate change. Although feedstock sources are often ambiguous, pledges in European nations, the United States and Brazil amount to more than 100 Mtoe of biorenewable fuel production by 2020. As a consequence, the biofuel sector is developing rapidly, and it is increasingly important to distinguish bioenergy options that can address energy security and greenhouse gas mitigation from those that cannot. This paper evaluates how bioenergy production affects land-use change (LUC), and to what extent land-use modelling can inform sound decision-making. We identified local and global internalities and externalities of biofuel development scenarios, reviewed relevant data sources and modelling approaches, identified sources of controversy about indirect LUC (iLUC) and then suggested a framework for comprehensive assessments of bioenergy. Ultimately, plant biomass must be managed to produce energy in a way that is consistent with the management of food, feed, fibre, timber and environmental services. Bioenergy production provides opportunities for improved energy security, climate mitigation and rural development, but the environmental and social consequences depend on feedstock choices and geographical location. The most desirable solutions for bioenergy production will include policies that incentivize regionally integrated management of diverse resources with low inputs, high yields, co-products, multiple benefits and minimal risks of iLUC. Many integrated assessment models include energy resources, trade, technological development and regional environmental conditions, but do not account for biodiversity and lack detailed data on the location of degraded and underproductive lands that would be ideal for bioenergy production. Specific practices that would maximize the benefits of bioenergy production regionally need to be identified before a global analysis of bioenergy-related LUC can be accomplished. 
PMID:22482028
Introducing Darwinism to Toronto's post-1887 reconstituted medical school.
Court, John P M
2011-01-01
Charles Darwin's scientific paradigm was largely welcomed in Canadian academic biology and medicine, while reaction among other faculty and laypeople ranged from interest to outrage. In 1874, Ramsay Wright, a Darwinian-era biologist from Edinburgh, was appointed to the University of Toronto's Chair of Natural History. Over his 38-year career Wright integrated the evolutionary perspective into medical and biology teaching without accentuating its controversial source. He also applied the emerging German experimental research model and laboratory technology. This study identifies five categories of scientific and personal influences upon Wright through archival research on biographical sources and his writings.
An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets.
It has been developed in Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as a standalone free software.
2013-01-01
Background Many large-scale studies analyzed high-throughput genomic data to identify altered pathways essential to the development and progression of specific types of cancer. However, no previous study has been extended to provide a comprehensive analysis of pathways disrupted by copy number alterations across different human cancers. Towards this goal, we propose a network-based method to integrate copy number alteration data with human protein-protein interaction networks and pathway databases to identify pathways that are commonly disrupted in many different types of cancer. Results We applied our approach to a data set of 2,172 cancer patients across 16 different types of cancers, and discovered a set of commonly disrupted pathways, which are likely essential for tumor formation in the majority of the cancers. We also identified pathways that are only disrupted in specific cancer types, providing molecular markers for different human cancers. Analysis with independent microarray gene expression datasets confirms that the commonly disrupted pathways can be used to identify patient subgroups with significantly different survival outcomes. We also provide a network view of disrupted pathways to explain how copy number alterations affect pathways that regulate cell growth, cycle, and differentiation for tumorigenesis. Conclusions In this work, we demonstrated that the network-based integrative analysis can help to identify pathways disrupted by copy number alterations across 16 types of human cancers, which are not readily identifiable by conventional overrepresentation-based and other pathway-based methods. All the results and source code are available at http://compbio.cs.umn.edu/NetPathID/. PMID:23822816
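The core tally this abstract describes, flagging pathways whose member genes are recurrently hit by copy number alterations across a patient cohort, can be illustrated with a toy sketch. Gene and pathway names here are hypothetical, and the real method additionally weights genes via protein-protein interaction networks; this shows only the simple disruption-frequency idea:

```python
# Toy sketch: fraction of patients in which each pathway is "disrupted",
# i.e. has at least one copy-number-altered member gene.
pathways = {
    "cell_cycle": {"G1", "G2", "G3"},
    "apoptosis": {"G4", "G5"},
}

# Per-patient sets of copy-number-altered genes (hypothetical).
patients = [
    {"G1", "G4"},
    {"G2"},
    {"G5"},
    {"G6"},  # alteration outside both pathways
]

def disruption_frequency(pathways, patients):
    """Fraction of patients in which each pathway has >= 1 altered gene."""
    freq = {}
    for name, genes in pathways.items():
        hit = sum(1 for altered in patients if altered & genes)
        freq[name] = hit / len(patients)
    return freq

print(disruption_frequency(pathways, patients))
# → {'cell_cycle': 0.5, 'apoptosis': 0.5}
```

A pathway disrupted at high frequency across many cancer types would be a candidate "commonly disrupted" pathway in the sense of the abstract.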
ERIC Educational Resources Information Center
Sacks, Risa
This book presents interviews with 12 of the best primary researchers in the business. These research professionals reveal their strategies for integrating online and offline resources, identifying experts, and getting past gatekeepers to obtain information that exists only in someone's head. Topics include how searchers use a combination of…
ERIC Educational Resources Information Center
Jarrell, Michele E.; And Others
Procedures used in Doctor of Education (Ed.D.) dissertations at the University of Alabama (Tuscaloosa) were studied. Focus was on identifying: (1) characteristics of the research designs used; (2) sources of the instruments used to collect data; (3) reports of reliability estimates and evidence of validity of the instruments; and (4) types of…
A strong-lensing elliptical galaxy in the MaNGA survey
NASA Astrophysics Data System (ADS)
Smith, Russell J.
2017-01-01
I report discovery of a new galaxy-scale gravitational lens system, identified using public data from the Mapping Galaxies at Apache Point Observatory (MaNGA) survey, as part of a systematic search for lensed background line emitters. The lens is SDSS J170124.01+372258.0, a giant elliptical galaxy with velocity dispersion σ = 256 km s-1, at a redshift of zl = 0.122. After modelling and subtracting the target galaxy light, the integral-field data cube reveals [O II], [O III] and Hβ emission lines corresponding to a source at zs = 0.791, forming an identifiable ring around the galaxy centre. If the ring is formed by a single lensed source, then the Einstein radius is REin ≈ 2.3 arcsec, projecting to ˜5 kpc at the distance of the lens. The total projected lensing mass is MEin = (3.6 ± 0.6) × 1011 M⊙, and the total J-band mass-to-light ratio is 3.0 ± 0.7 solar units. Plausible estimates of the likely dark matter content could reconcile this with a Milky Way-like initial mass function (IMF), for which M/L ≈ 1.5 is expected, but heavier IMFs are by no means excluded with the present data. An alternative interpretation of the system, with a more complex source plane, is also discussed. The discovery of this system bodes well for future lens searches based on MaNGA and other integral-field spectroscopic surveys.
A Close Investigation into Source Use in Integrated Second Language Writing Tasks
ERIC Educational Resources Information Center
Plakans, Lia; Gebril, Atta
2012-01-01
An increasing number of writing programs and assessments are employing writing-from-sources tasks in which reading and writing are integrated. The integration of reading and writing in such contexts raises a number of questions with regard to writers' use of sources in their writing, the functions these sources serve, and how proficiency affects…
EPA Facility Registry Service (FRS): PCS_NPDES
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Permit Compliance System (PCS) or the National Pollutant Discharge Elimination System (NPDES) module of the Integrated Compliance Information System (ICIS). PCS tracks NPDES surface water permits issued under the Clean Water Act. This system is being incrementally replaced by the NPDES module of ICIS. Under NPDES, all facilities that discharge pollutants from any point source into waters of the United States are required to obtain a permit. The permit will likely contain limits on what can be discharged, impose monitoring and reporting requirements, and include other provisions to ensure that the discharge does not adversely affect water quality. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NPDES facilities once the PCS or ICIS-NPDES data has been integrated into the FRS database. Additional information on FRS is available
Biomedical Ontologies in Action: Role in Knowledge Management, Data Integration and Decision Support
Bodenreider, O.
2008-01-01
Summary Objectives To provide typical examples of biomedical ontologies in action, emphasizing the role played by biomedical ontologies in knowledge management, data integration and decision support. Methods Biomedical ontologies selected for their practical impact are examined from a functional perspective. Examples of applications are taken from operational systems and the biomedical literature, with a bias towards recent journal articles. Results The ontologies under investigation in this survey include SNOMED CT, the Logical Observation Identifiers Names and Codes (LOINC), the Foundational Model of Anatomy, the Gene Ontology, RxNorm, the National Cancer Institute Thesaurus, the International Classification of Diseases, the Medical Subject Headings (MeSH) and the Unified Medical Language System (UMLS). The roles played by biomedical ontologies are classified into three major categories: knowledge management (indexing and retrieval of data and information, access to information, mapping among ontologies); data integration, exchange and semantic interoperability; and decision support and reasoning (data selection and aggregation, decision support, natural language processing applications, knowledge discovery). Conclusions Ontologies play an important role in biomedical research through a variety of applications. While ontologies are used primarily as a source of vocabulary for standardization and integration purposes, many applications also use them as a source of computable knowledge. Barriers to the use of ontologies in biomedical applications are discussed. PMID:18660879
PRECISION INTEGRATOR FOR MINUTE ELECTRIC CURRENTS
Hemmendinger, A.; Helmer, R.J.
1961-10-24
An integrator is described for measuring the time integral of minute electrical currents. The device consists of a source capacitor connected in series with the source of such currents; a second capacitor of accurately known capacitance; a source of accurately known and constant potential; and means, responsive to the potential developed across the source capacitor, for reversibly connecting the second capacitor in series with the known-potential source and the source capacitor at a rate proportional to the source-capacitor potential, thereby holding the magnitude of that potential at approximately zero. (AEC)
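The charge-balance principle described in this patent abstract can be sketched numerically. The toy simulation below is an illustration only: the circuit values, threshold, and current waveform are invented, not taken from the patent. The unknown current charges the source capacitor, and each time its potential exceeds a small threshold a known charge packet Q = C_ref × V_ref is subtracted; the packet count then measures the integrated charge.

```python
# Toy charge-balance integrator (assumed component values, not the patent's).
C_REF, V_REF = 1e-9, 1.0          # known capacitance (F) and potential (V)
Q_PACKET = C_REF * V_REF          # charge removed per reversal: 1 nC
C_SRC = 10e-9                     # source capacitor (F)
THRESHOLD = 0.05                  # volts; keeps source-cap potential near zero

def integrate(current_samples, dt):
    """Return integrated charge (C) as counted charge packets."""
    q, count = 0.0, 0
    for i in current_samples:     # unknown minute current, amperes
        q += i * dt               # current charges the source capacitor
        while q / C_SRC > THRESHOLD:
            q -= Q_PACKET         # inject known opposing charge packet
            count += 1
    return count * Q_PACKET

# 1 microamp for 10 ms -> 10 nC of integrated charge
measured = integrate([1e-6] * 1000, dt=1e-5)
print(measured)
```

The residual charge left on the source capacitor bounds the quantization error to one packet, which is why a small, accurately known C_ref gives a precise measurement.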
Cheng, Meng-Dawn; Kabela, Erik D.
2016-04-30
The Potential Source Contribution Function (PSCF) model has been used successfully to identify distant emission source regions; it relies on backward trajectories calculated by the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model. In this study, we investigated the impacts of grid resolution and Planetary Boundary Layer (PBL) parameterization (e.g., turbulent transport of pollutants) on the PSCF analysis. The Mellor-Yamada-Janjic (MYJ) and Yonsei University (YSU) parameterization schemes were selected to model turbulent transport in the PBL within the Weather Research and Forecasting (WRF version 3.6) model. Two separate domain grid sizes (83 and 27 km) were chosen in the WRF downscaling to generate the wind data driving the HYSPLIT calculation. The effects of grid size and PBL parameterization are important in incorporating the influence of regional and local meteorological processes, such as jet streaks, blocking patterns, Rossby waves, and terrain-induced convection, on the transport of pollutants along a wind trajectory. We found that high-resolution PSCF did discover and locate source areas more precisely than lower-resolution meteorological inputs. The lack of anticipated improvement could also be because the PBL scheme chosen to produce the WRF data was only a local parameterization and unable to faithfully duplicate the real atmosphere on a global scale. The MYJ scheme was able to replicate the PSCF source identification obtained with the Reanalysis data and to discover additional source areas that were not identified by the Reanalysis data. In conclusion, a potential benefit of using high-resolution wind data in PSCF modeling is that it can discover new source locations in addition to those identified using the Reanalysis data input.
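The PSCF statistic this study relies on is a simple conditional count over grid cells: PSCF(i,j) = m_ij / n_ij, where n_ij is the number of back-trajectory endpoints falling in cell (i,j) and m_ij counts only endpoints from days when the receptor concentration exceeded a criterion value. A minimal sketch follows; the grid resolution, data format, and criterion here are assumptions for illustration, not the study's configuration.

```python
from collections import defaultdict

def pscf(endpoints, high_days, cell=1.0):
    """Potential Source Contribution Function on a lat/lon grid.

    endpoints: iterable of (day, lat, lon) back-trajectory endpoints
    high_days: set of days on which the receptor concentration
               exceeded the chosen criterion value
    cell:      grid cell size in degrees (hypothetical resolution)
    """
    n = defaultdict(int)  # all endpoints per cell
    m = defaultdict(int)  # endpoints from high-concentration days
    for day, lat, lon in endpoints:
        key = (int(lat // cell), int(lon // cell))
        n[key] += 1
        if day in high_days:
            m[key] += 1
    return {k: m[k] / n[k] for k in n}

# Toy example: three endpoints land in one cell; two are from high days.
pts = [(1, 33.2, -84.5), (2, 33.4, -84.1), (3, 33.9, -84.3)]
print(pscf(pts, high_days={1, 2}))
```

In practice a weighting function is usually applied to down-weight cells with few endpoints, which this sketch omits.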
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
Hyam, Roger; Hagedorn, Gregor; Chagnoux, Simon; Röpert, Dominik; Casino, Ana; Droege, Gabi; Glöckler, Falko; Gödderz, Karsten; Groom, Quentin; Hoffmann, Jana; Holleman, Ayco; Kempa, Matúš; Koivula, Hanna; Marhold, Karol; Nicolson, Nicky; Smith, Vincent S.; Triebel, Dagmar
2017-01-01
With biodiversity research activities being increasingly shifted to the web, the need for a system of persistent and stable identifiers for physical collection objects becomes increasingly pressing. The Consortium of European Taxonomic Facilities agreed on a common system of HTTP-URI-based stable identifiers which is now rolled out to its member organizations. The system follows Linked Open Data principles and implements redirection mechanisms to human-readable and machine-readable representations of specimens facilitating seamless integration into the growing semantic web. The implementation of stable identifiers across collection organizations is supported with open source provider software scripts, best practices documentations and recommendations for RDF metadata elements facilitating harmonized access to collection information in web portals. Database URL: http://cetaf.org/cetaf-stable-identifiers PMID:28365724
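The redirection mechanism described above can be sketched as a single content-negotiation function. The URL patterns below are hypothetical; only the mechanism follows the abstract: one stable HTTP URI per specimen, dereferenced via the Accept header to either a human-readable page or a machine-readable RDF representation.

```python
def redirect_target(pid, accept_header):
    """Return the redirect target for a stable specimen identifier.

    Sketch only: the '/rdf' and '/view' suffixes are invented, but the
    mechanism mirrors the Linked Open Data recipe the abstract describes.
    """
    accept = accept_header.lower()
    if "application/rdf+xml" in accept or "text/turtle" in accept:
        return f"{pid}/rdf"      # machine-readable representation
    return f"{pid}/view"         # human-readable HTML page

# Hypothetical specimen identifier, dereferenced two ways.
pid = "http://example.org/specimen/B100001234"
print(redirect_target(pid, "text/html"))
print(redirect_target(pid, "text/turtle"))
```

A production deployment would issue HTTP 303 redirects so that the stable URI itself never serves content directly and remains independent of any one representation.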
Identifying equivalent sound sources from aeroacoustic simulations using a numerical phased array
NASA Astrophysics Data System (ADS)
Pignier, Nicolas J.; O'Reilly, Ciarán J.; Boij, Susann
2017-04-01
An application of phased array methods to numerical data is presented, aimed at identifying equivalent flow sound sources from aeroacoustic simulations. Based on phased array data extracted from compressible flow simulations, sound source strengths are computed on a set of points in the source region using phased array techniques assuming monopole propagation. Two phased array techniques are used to compute the source strengths: an approach using a Moore-Penrose pseudo-inverse and a beamforming approach using dual linear programming (dual-LP) deconvolution. The first approach gives a model of correlated sources for the acoustic field generated from the flow expressed in a matrix of cross- and auto-power spectral values, whereas the second approach results in a model of uncorrelated sources expressed in a vector of auto-power spectral values. The accuracy of the equivalent source model is estimated by computing the acoustic spectrum at a far-field observer. The approach is tested first on an analytical case with known point sources. It is then applied to the example of the flow around a submerged air inlet. The far-field spectra obtained from the source models for two different flow conditions are in good agreement with the spectra obtained with a Ffowcs Williams-Hawkings integral, showing the accuracy of the source model from the observer's standpoint. Various configurations for the phased array and for the sources are used. The dual-LP beamforming approach shows better robustness to changes in the number of probes and sources than the pseudo-inverse approach. The good results obtained with this simulation case demonstrate the potential of the phased array approach as a modelling tool for aeroacoustic simulations.
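The first of the two techniques, the Moore-Penrose pseudo-inverse step, can be sketched directly: build the matrix of monopole transfer functions from candidate source points to array probes, then recover the source strengths by applying the pseudo-inverse to the measured pressures. The geometry, frequency, and source strengths below are invented for illustration, and the dual-LP deconvolution step is not included.

```python
import numpy as np

def transfer_matrix(mics, srcs, k):
    """A[m, s] = free-field monopole Green's function exp(-ikr)/(4*pi*r)."""
    r = np.linalg.norm(mics[:, None, :] - srcs[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

rng = np.random.default_rng(0)
mics = rng.uniform(-1, 1, (16, 3)) + np.array([0.0, 0.0, 2.0])  # probe plane
srcs = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0]])             # scan points
q_true = np.array([1.0 + 0.0j, 0.5 - 0.2j])                     # "unknown" strengths

A = transfer_matrix(mics, srcs, k=2 * np.pi * 1000 / 343.0)     # 1 kHz in air
p = A @ q_true                                                  # simulated probe pressures
q_est = np.linalg.pinv(A) @ p                                   # pseudo-inverse recovery
print(np.round(q_est, 6))
```

Because the recovered strengths are complex, their cross products give the matrix of cross- and auto-power spectral values mentioned in the abstract, i.e. a correlated-source model.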
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomsick, John A.; Bodaghee, Arash; Chaty, Sylvain
2012-08-01
We report on Chandra observations of 18 hard X-ray (>20 keV) sources discovered with the INTEGRAL satellite near the Galactic plane. For 14 of the INTEGRAL sources, we have uncovered one or two potential Chandra counterparts per source. These provide soft X-ray (0.3-10 keV) spectra and subarcsecond localizations, which we use to identify counterparts at other wavelengths, providing information about the nature of each source. Despite the fact that all of the sources are within 5° of the plane, four of the IGR sources are active galactic nuclei (AGNs; IGR J01545+6437, IGR J15391-5307, IGR J15415-5029, and IGR J21565+5948) and four others are likely AGNs (IGR J03103+5706, IGR J09189-4418, IGR J16413-4046, and IGR J16560-4958) based on each of them having a strong IR excess and/or extended optical or near-IR emission. We compare the X-ray and near-IR fluxes of this group of sources to those of AGNs selected by their 2-10 keV emission in previous studies and find that these IGR AGNs are in the range of typical values. There is evidence in favor of four of the sources being Galactic (IGR J12489-6243, IGR J15293-5609, IGR J16173-5023, and IGR J16206-5253), but only IGR J15293-5609 is confirmed as a Galactic source, as it has a unique Chandra counterpart and a parallax measurement from previous optical observations that puts its distance at 1.56 ± 0.12 kpc. The 0.3-10 keV luminosity for this source is (1.4 +1.0/-0.4) × 10^32 erg s^-1, and its optical/IR spectral energy distribution is well described by a blackbody with a temperature of 4200-7000 K and a radius of 12.0-16.4 R_Sun. These values suggest that IGR J15293-5609 is a symbiotic binary with an early K-type giant and a white dwarf accretor. We also obtained likely Chandra identifications for IGR J13402-6428 and IGR J15368-5102, but follow-up observations are required to constrain their source types.
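As a consistency check on the quoted blackbody fit (our own computation, not one from the paper), the bolometric luminosity implied by L = 4πR²σT⁴ for the stated temperature and radius ranges comes out at tens to a few hundred solar luminosities, as expected for a K-type giant and far above the quoted soft X-ray luminosity of the accretor.

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
R_SUN = 6.957e8          # solar radius, m
L_SUN = 3.828e26         # solar luminosity, W

def blackbody_luminosity(T_kelvin, R_solar):
    """Bolometric luminosity (W) of a blackbody: L = 4*pi*R^2*sigma*T^4."""
    R = R_solar * R_SUN
    return 4 * math.pi * R**2 * SIGMA * T_kelvin**4

# The two ends of the quoted fit range for IGR J15293-5609.
for T, R in [(4200, 16.4), (7000, 12.0)]:
    print(f"T={T} K, R={R} Rsun -> L = {blackbody_luminosity(T, R) / L_SUN:.0f} Lsun")
```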
NASA Astrophysics Data System (ADS)
Kim, Eugene; Hopke, Philip K.; Edgerton, Eric S.
Daily integrated PM2.5 (particulate matter ⩽2.5 μm in aerodynamic diameter) composition data, including eight individual carbon fractions, collected at the Jefferson Street monitoring site in Atlanta were analyzed with positive matrix factorization (PMF). Particulate carbon was analyzed using the thermal optical reflectance method, which divides carbon into four organic carbon (OC) fractions, pyrolyzed organic carbon (OP), and three elemental carbon (EC) fractions. A total of 529 samples and 28 variables were measured between August 1998 and August 2000. PMF identified 11 sources in this study: sulfate-rich secondary aerosol I (50%), on-road diesel emissions (11%), nitrate-rich secondary aerosol (9%), wood smoke (7%), gasoline vehicle (6%), sulfate-rich secondary aerosol II (6%), metal processing (3%), airborne soil (3%), railroad traffic (3%), cement kiln/carbon-rich (2%), and bus maintenance facility/highway traffic (2%). Differences from previous studies using only the traditional OC and EC data (J. Air Waste Manag. Assoc. 53 (2003a) 731; Atmos. Environ. (2003b)) include four traffic-related combustion sources (gasoline vehicle, on-road diesel, railroad, and bus maintenance facility) containing carbon fractions whose abundances differed among the various sources. This study indicates that temperature-resolved fractional carbon data can be used to enhance source apportionment studies, especially with respect to the separation of diesel emissions from gasoline vehicle sources. Conditional probability functions using surface wind data and identified source contributions aid the identification of local point sources.
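PMF factorizes the samples-by-species data matrix X into nonnegative source contributions G and source profiles F, X ≈ GF, with residuals weighted by measurement uncertainty. The sketch below uses plain multiplicative-update NMF on synthetic data to show the model form only; real PMF adds the uncertainty weighting and rotational controls, and all data here are invented.

```python
import numpy as np

def nmf(X, n_sources, iters=2000, seed=0):
    """Nonnegative factorization X ~= G @ F via Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, n_sources))   # sample-by-source contributions
    F = rng.random((n_sources, m))   # source-by-species profiles
    eps = 1e-12                      # guard against division by zero
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Synthetic check: two "sources" with known profiles over three species.
F_true = np.array([[0.8, 0.2, 0.0],
                   [0.0, 0.3, 0.7]])
G_true = np.random.default_rng(1).random((50, 2))
X = G_true @ F_true
G, F = nmf(X, n_sources=2)
print(np.linalg.norm(X - G @ F) / np.linalg.norm(X))  # relative reconstruction error
```

Choosing the number of sources (11 in the study above) is the analyst's decision, typically guided by the residuals and the physical interpretability of the profiles.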
Semantic integration to identify overlapping functional modules in protein interaction networks
Cho, Young-Rae; Hwang, Woochang; Ramanathan, Murali; Zhang, Aidong
2007-01-01
Background: The systematic analysis of protein-protein interactions can enable a better understanding of cellular organization, processes and functions. Functional modules can be identified from the protein interaction networks derived from experimental data sets. However, these analyses are challenging because of the presence of unreliable interactions and the complex connectivity of the network. The integration of protein-protein interactions with the data from other sources can be leveraged for improving the effectiveness of functional module detection algorithms. Results: We have developed novel metrics, called semantic similarity and semantic interactivity, which use Gene Ontology (GO) annotations to measure the reliability of protein-protein interactions. The protein interaction networks can be converted into a weighted graph representation by assigning the reliability values to each interaction as a weight. We presented a flow-based modularization algorithm to efficiently identify overlapping modules in the weighted interaction networks. The experimental results show that the semantic similarity and semantic interactivity of interacting pairs were positively correlated with functional co-occurrence. The effectiveness of the algorithm for identifying modules was evaluated using functional categories from the MIPS database. We demonstrated that our algorithm had higher accuracy compared to other competing approaches. Conclusion: The integration of protein interaction networks with GO annotation data and the capability of detecting overlapping modules substantially improve the accuracy of module identification. PMID:17650343
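A much-simplified version of GO-based interaction weighting (the paper's semantic similarity and semantic interactivity metrics are more elaborate) is to score each interacting pair by the overlap of the partners' annotation sets. The proteins and GO annotations below are hypothetical.

```python
# Hypothetical GO annotations per protein (invented for illustration).
go = {
    "P1": {"GO:0006412", "GO:0005840"},
    "P2": {"GO:0006412", "GO:0005840", "GO:0003735"},
    "P3": {"GO:0016020"},
}

def reliability(a, b):
    """Jaccard overlap of GO annotation sets, used as an interaction weight."""
    ta, tb = go[a], go[b]
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

# Convert an unweighted interaction list into a weighted graph.
edges = [("P1", "P2"), ("P1", "P3")]
weighted = {e: reliability(*e) for e in edges}
print(weighted)
```

Module detection then runs on the weighted graph, so low-overlap (likely spurious) edges contribute little to the flow-based modularization.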
U.S. Army PEM fuel cell programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patil, A.S.; Jacobs, R.
The United States Army has identified the need for lightweight power sources to provide the individual soldier with continuous power for extended periods without resupply. Due to the high cost of primary batteries and the high weight of rechargeable batteries, fuel cell technology is being developed to provide a power source for the individual soldier, sensors, communications equipment and other various applications in the Army. Current programs are in the tech base area and will demonstrate Proton Exchange Membrane (PEM) Fuel Cell Power Sources with low weight and high energy densities. Fuel Cell Power Sources underwent user evaluations in 1996 that showed a power source weight reduction of 75%. The quiet operation along with the ability to refuel much like an engine was well accepted by the user and numerous applications were investigated. These programs are now aimed at further weight reduction for applications that are weight critical; system integration that will demonstrate a viable military power source; refining the user requirements; and planning for a transition to engineering development.
Multiple fingerprinting analyses in quality control of Cassiae Semen polysaccharides.
Cheng, Jing; He, Siyu; Wan, Qiang; Jing, Pu
2018-03-01
Quality control issues overshadow the potential health benefits of Cassiae Semen due to analytic limitations. In this study, multiple-fingerprint analysis integrated with several chemometric methods was performed to assess the polysaccharide quality of Cassiae Semen harvested from different locations. FT-IR, HPLC, and GC fingerprints of polysaccharide extracts from the authentic source were established as standard profiles and applied to assess the quality of foreign sources. Analyses of FT-IR fingerprints of polysaccharide extracts using either Pearson correlation analysis or principal component analysis (PCA), or of HPLC fingerprints of partially hydrolyzed polysaccharides with PCA, distinguished the foreign sources from the authentic source. However, HPLC or GC fingerprints of completely hydrolyzed polysaccharides could not identify all foreign sources, and the GC methodology is quite limited in determining monosaccharide composition. This indicates that FT-IR/HPLC fingerprints of non- or partially hydrolyzed polysaccharides, accompanied by multiple chemometric methods, might be applied in detecting and differentiating sources of Cassiae Semen.
Stepwise Connectivity of the Modal Cortex Reveals the Multimodal Organization of the Human Brain
Sepulcre, Jorge; Sabuncu, Mert R.; Yeo, Thomas B.; Liu, Hesheng; Johnson, Keith A.
2012-01-01
How human beings integrate information from external sources and internal cognition to produce a coherent experience is still not well understood. During the past decades, anatomical, neurophysiological and neuroimaging research on multimodal integration has stood out in the effort to understand the perceptual binding properties of the brain. Areas in the human lateral occipito-temporal, prefrontal and posterior parietal cortices have been associated with sensory multimodal processing. Although this rather patchy organization of brain regions gives us a glimpse of the perceptual convergence, the articulation of the flow of information from modality-related to the more parallel cognitive processing systems remains elusive. Using a method called Stepwise Functional Connectivity analysis, the present study analyzes the functional connectome and transitions from primary sensory cortices to higher-order brain systems. We identify the large-scale multimodal integration network and essential connectivity axes for perceptual integration in the human brain. PMID:22855814
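The core idea of a stepwise connectivity analysis can be illustrated on a toy graph: powers of the adjacency matrix A count the walks of a given length, so reading the seed node's row of A^k shows which nodes are reached at each step outward from a primary sensory seed. The 4-node chain below is illustrative only, not brain data.

```python
import numpy as np

# Toy binary connectivity graph: a chain 0 - 1 - 2 - 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

def stepwise(A, seed, steps):
    """Walk counts of each length from `seed` to every node: rows of A^k."""
    return [np.linalg.matrix_power(A, k)[seed] for k in range(1, steps + 1)]

# From node 0, nodes become reachable one link further at each step.
for k, row in enumerate(stepwise(A, seed=0, steps=3), start=1):
    print(k, row)
```

Real stepwise functional connectivity works on thresholded, weighted correlation matrices and normalizes the counts, but the matrix-power backbone is the same.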
Moertl, Peter M; Canning, John M; Gronlund, Scott D; Dougherty, Michael R P; Johansson, Joakim; Mills, Scott H
2002-01-01
Prior research examined how controllers plan in their traditional environment and identified various information uncertainties as detriments to planning. A planning aid was designed to reduce this uncertainty by perceptually representing important constraints. This included integrating spatial information on the radar screen with discrete information (planned sequences of air traffic). Previous research reported improved planning performance and decreased workload in the planning aid condition. The purpose of this paper was to determine the source of these performance improvements. Analysis of computer interactions using log-linear modeling showed that the planning interface led to less repetitive--but more integrated--information retrieval compared with the traditional planning environment. Ecological interface design principles helped explain how the integrated information retrieval gave rise to the performance improvements. Actual or potential applications of this research include the design and evaluation of interface automation that keeps users in active control by modification of perceptual task characteristics.
Multisource geological data mining and its utilization of uranium resources exploration
NASA Astrophysics Data System (ADS)
Zhang, Jie-lin
2009-10-01
Nuclear energy, as a clean energy source, plays an important role in China's economic development, and according to the national long-term development strategy, many more nuclear power plants will be built in the next few years, so uranium resources exploration faces a great challenge. Research and practice in mineral exploration demonstrate that utilizing modern Earth Observation System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting integrated with field remote sensing geological survey. Multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high-spatial-resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data; related data mining methods have been developed, such as data fusion of optical data and Radarsat imagery and information integration of remote sensing and geophysical data. Based on the above approaches, multi-geoscience information on uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and several prospective areas have been located.
Branson, Richard A
2009-01-01
The purpose of this article is to describe a model of chiropractic integration developed over a 10-year period within a private hospital system in Minnesota. Needs were assessed by surveying attitudes and behaviors related to chiropractic and complementary and alternative medicine (CAM) of physicians associated with the hospital. Chiropractic integration into the hospital system was assessed by analyzing referral and utilization patterns. One hundred five surveys were returned after 2 mailings for a response rate of 74%. Seventy-four percent of respondents supported integration of CAM into the hospital system, although 45% supported the primary care physician as the gatekeeper for CAM use. From 2006 to 2008, there were 8294 unique new patients in the chiropractic program. Primary care providers (medical doctors and physician assistants) were the most common referral source, followed by self-referred patients, sports medicine physicians, and orthopedic physicians. Overall examination of the program identified that facilitators of chiropractic integration were (1) growth in interest in CAM, (2) establishing relationships with key administrators and providers, (3) use of evidence-based practice, (4) adequate physical space, and (5) creation of an integrated spine care program. Barriers were (1) lack of understanding of chiropractic professional identity by certain providers and (2) certain financial aspects of third-party payment for chiropractic. This article describes the process of integrating chiropractic into one of the largest private hospital systems in Minnesota from a business and professional perspective and the results achieved once chiropractic was integrated into the system. This study identified key factors that facilitated integration of services and demonstrates that chiropractic care can be successfully integrated within a hospital system.
An integrated workflow for analysis of ChIP-chip data.
Weigelt, Karin; Moehle, Christoph; Stempfl, Thomas; Weber, Bernhard; Langmann, Thomas
2008-08-01
Although ChIP-chip is a powerful tool for genome-wide discovery of transcription factor target genes, the steps involving raw data analysis, identification of promoters, and correlation with binding sites are still laborious processes. Therefore, we report an integrated workflow for the analysis of promoter tiling arrays with the Genomatix ChipInspector system. We compare this tool with open-source software packages to identify PU.1 regulated genes in mouse macrophages. Our results suggest that ChipInspector data analysis, comparative genomics for binding site prediction, and pathway/network modeling significantly facilitate and enhance whole-genome promoter profiling to reveal in vivo sites of transcription factor-DNA interactions.
Developing a response to family violence in primary health care: the New Zealand experience.
Gear, Claire; Koziol-McLain, Jane; Wilson, Denise; Clark, Faye
2016-08-20
Despite primary health care being recognised as an ideal setting to effectively respond to those experiencing family violence, responses are not widely integrated as part of routine health care. A lack of evidence testing models and approaches for health sector integration, alongside challenges of transferability and sustainability, means the best approach in responding to family violence is still unknown. The Primary Health Care Family Violence Responsiveness Evaluation Tool was developed as a guide to implement a formal systems-led response to family violence within New Zealand primary health care settings. Given the difficulties integrating effective, sustainable responses to family violence, we share the experience of primary health care sites that embarked on developing a response to family violence, presenting the enablers, barriers and resources required to maintain, progress and sustain family violence response development. In this qualitative descriptive study data were collected from two sources. First, semi-structured focus group interviews were conducted during 24-month follow-up evaluation visits of primary health care sites to capture the enablers, barriers and resources required to maintain, progress and sustain a response to family violence. Second, the outcomes of a group activity to identify response development barriers and implementation strategies were recorded during a network meeting of primary health care professionals interested in family violence prevention and intervention; findings were triangulated across the two data sources. Four sites, representing three PHOs and four general practices participated in the focus group interviews; 35 delegates from across New Zealand attended the network meeting representing a wider perspective on family violence response development within primary health care.
Enablers and barriers to developing a family violence response were identified across four themes: 'Getting started', 'Building effective relationships', 'Sourcing funding' and 'Shaping a national approach to family violence'. The strong commitment of key people dedicated to addressing family violence is essential for response sustainability and would be strengthened by prioritising family violence response as a national health target with dedicated resourcing. Further analysis of the health care system as a complex adaptive system may provide insight into effective approaches to response development and health system integration.
All-source Information Management and Integration for Improved Collective Intelligence Production
2011-06-01
• Electronic Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT). These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... (All-Source Information Integration and Management) R&D Project 3 All-Source Intelligence
NASA Astrophysics Data System (ADS)
Nowicki, Cassandre; Gosselin, Louis
2012-08-01
Efficient smelters currently consume roughly 13 MWh of electricity per ton of aluminum, while roughly half of that energy is lost as thermal waste. Although waste heat is abundant, current thermal integration in primary aluminum facilities remains limited. This is due to both the low quality of waste heat available and the shortage of potential uses within reasonable distance of identified waste heat sources. In this article, we present a mapping of both heat dissipation processes and heat demands around a sample facility (Alcoa Deschambault Quebec smelter). Our primary aim is to report opportunities for heat recovery and integration in the primary aluminum industry. We consider potential heat-to-sink pairings individually and assess their thermodynamic potential for producing energy savings.
NASA Astrophysics Data System (ADS)
Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.
2017-05-01
An integrated geophysical investigation was performed at the S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key methodology of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream) and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented with resistivity logging to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.
The continued development of the Spallation Neutron Source external antenna H- ion source
NASA Astrophysics Data System (ADS)
Welton, R. F.; Carmichael, J.; Desai, N. J.; Fuga, R.; Goulding, R. H.; Han, B.; Kang, Y.; Lee, S. W.; Murray, S. N.; Pennisi, T.; Potter, K. G.; Santana, M.; Stockli, M. P.
2010-02-01
The U.S. Spallation Neutron Source (SNS) is an accelerator-based, pulsed neutron-scattering facility, currently in the process of ramping up neutron production. In order to ensure that the SNS will meet its operational commitments as well as provide for future facility upgrades with high reliability, we are developing a rf-driven, H- ion source based on a water-cooled, ceramic aluminum nitride (AlN) plasma chamber. To date, early versions of this source have delivered up to 42 mA to the SNS front end and unanalyzed beam currents up to ˜100 mA (60 Hz, 1 ms) to the ion source test stand. This source was operated on the SNS accelerator from February to April 2009 and produced ˜35 mA (beam current required by the ramp up plan) with availability of ˜97%. During this run several ion source failures identified reliability issues, which must be addressed before the source re-enters production: plasma ignition, antenna lifetime, magnet cooling, and cooling jacket integrity. This report discusses these issues, details proposed engineering solutions, and notes progress to date.
SZDB: A Database for Schizophrenia Genetic Research
Wu, Yong; Yao, Yong-Gang
2017-01-01
Schizophrenia (SZ) is a debilitating brain disorder with a complex genetic architecture. Genetic studies, especially recent genome-wide association studies (GWAS), have identified multiple variants (loci) conferring risk to SZ. However, how to efficiently extract meaningful biological information from bulk genetic findings of SZ remains a major challenge. There is a pressing need to integrate multiple layers of data from various sources, e.g., genetic findings from GWAS, copy number variations (CNVs), association and linkage studies, gene expression, protein–protein interaction (PPI), co-expression, expression quantitative trait loci (eQTL), and Encyclopedia of DNA Elements (ENCODE) data, to provide a comprehensive resource to facilitate the translation of genetic findings into SZ molecular diagnosis and mechanism study. Here we developed the SZDB database (http://www.szdb.org/), a comprehensive resource for SZ research. SZ genetic data, gene expression data, network-based data, brain eQTL data, and SNP function annotation information were systematically extracted, curated and deposited in SZDB. In-depth analyses and systematic integration were performed to identify top prioritized SZ genes and enriched pathways. Multiple types of data from various layers of SZ research were systematically integrated and deposited in SZDB. In-depth data analyses and integration identified top prioritized SZ genes and enriched pathways. We further showed that genes implicated in SZ are highly co-expressed in human brain and that proteins encoded by the prioritized SZ risk genes are significantly interconnected. The user-friendly SZDB provides high-confidence candidate variants and genes for further functional characterization. More importantly, SZDB provides convenient online tools for data search and browse, data integration, and customized data analyses. PMID:27451428
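One simple way to picture the cross-layer prioritization a resource like SZDB performs (a hypothetical sketch, not SZDB's actual scoring scheme) is to count how many independent evidence layers support each gene. The gene names and layer contents below are invented for illustration.

```python
# Hypothetical evidence layers, each naming the genes it supports.
layers = {
    "GWAS":         {"GENE_A", "GENE_B", "GENE_C"},
    "CNV":          {"GENE_B", "GENE_D"},
    "eQTL":         {"GENE_A", "GENE_B"},
    "coexpression": {"GENE_A", "GENE_E"},
}

def prioritize(layers):
    """Rank genes by the number of evidence layers supporting them."""
    support = {}
    for genes in layers.values():
        for g in genes:
            support[g] = support.get(g, 0) + 1
    # Sort by descending support, then alphabetically for stable ties.
    return sorted(support.items(), key=lambda kv: (-kv[1], kv[0]))

print(prioritize(layers))
```

Real prioritization pipelines weight layers by reliability and test for enrichment rather than counting votes, but the integration pattern — many heterogeneous sources converging on one ranked gene list — is the same.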
Saux, Gaston; Britt, Anne; Le Bigot, Ludovic; Vibert, Nicolas; Burin, Debora; Rouet, Jean-François
2017-01-01
According to the documents model framework (Britt, Perfetti, Sandak, & Rouet, 1999), readers' detection of contradictions within texts increases their integration of source-content links (i.e., who says what). This study examines whether conflict may also strengthen the relationship between the respective sources. In two experiments, participants read brief news reports containing two critical statements attributed to different sources. In half of the reports, the statements were consistent with each other, whereas in the other half they were discrepant. Participants were tested for source memory and source integration in an immediate item-recognition task (Experiment 1) and a cued recall task (Experiments 1 and 2). In both experiments, discrepancies increased readers' memory for sources. We found that discrepant sources enhanced retrieval of the other source compared to consistent sources (using a delayed recall measure; Experiments 1 and 2). However, discrepant sources failed to prime the other source as evidenced in an online recognition measure (Experiment 1). We argue that discrepancies promoted the construction of links between sources, but that integration did not take place during reading.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Kelly K.; Zavala-Araiza, Daniel
Here, we summarize an effort to develop a global oil and gas infrastructure (GOGI) taxonomy and geodatabase, using a combination of big data computing, custom search and data integration algorithms, and expert-driven spatio-temporal analytics to identify, access, and evaluate open oil and gas data resources and uncertainty trends worldwide. This approach leveraged custom National Energy Technology Laboratory (NETL) tools and capabilities in collaboration with Environmental Defense Fund (EDF) and Carbon Limits subject matter expertise, to identify over 380 datasets and integrate more than 4.8 million features into the GOGI database. In addition to acquisition of open oil and gas infrastructure data, information was collected and analyzed to assess the spatial, temporal, and source quality of these resources, and estimate their completeness relative to the top 40 hydrocarbon producing and consuming countries.
[Playful strategies for data collection with child cancer patients: an integrative review].
Sposito, Amanda Mota Pacciulio; de Sparapani, Valéria Cássia; Pfeifer, Luzia Iara; de Lima, Regina Aparecida Garcia; Nascimento, Lucila Castanheira
2013-09-01
Children are the best sources of information on their own experiences and opinions, and qualitative studies have favored the development and application of techniques that facilitate their self-expression and their approach to the researcher. Through an integrative literature review, the objective of this research was to identify playful resources used in qualitative research data collection with child cancer patients, and their forms of application. Systematized searches of electronic databases and a virtual library were undertaken which, combined with a non-systematized sample, totaled 15 studies spanning the period from 2000 to 2010. Drawing, toys, puppets, photography, and creativity and sensitivity dynamics were identified which, whether or not combined with interviews, directly or indirectly facilitated data collection, broadening the interaction with the children and permitting further expression of their feelings. The advantages and limitations of using these resources are presented, thus contributing to the planning of research with children.
Harrison, Jolie; Ferguson, Megan; Gedamke, Jason; Hatch, Leila; Southall, Brandon; Van Parijs, Sofie
2016-01-01
To help manage chronic and cumulative impacts of human activities on marine mammals, the National Oceanic and Atmospheric Administration (NOAA) convened two working groups, the Underwater Sound Field Mapping Working Group (SoundMap) and the Cetacean Density and Distribution Mapping Working Group (CetMap), with overarching effort of both groups referred to as CetSound, which (1) mapped the predicted contribution of human sound sources to ocean noise and (2) provided region/time/species-specific cetacean density and distribution maps. Mapping products were presented at a symposium where future priorities were identified, including institutionalization/integration of the CetSound effort within NOAA-wide goals and programs, creation of forums and mechanisms for external input and funding, and expanded outreach/education. NOAA is subsequently developing an ocean noise strategy to articulate noise conservation goals and further identify science and management actions needed to support them.
Shieh, Fwu-Shan; Jongeneel, Patrick; Steffen, Jamin D; Lin, Selena; Jain, Surbhi; Song, Wei; Su, Ying-Hsiu
2017-01-01
Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods to analyze NGS data of virus-host junction sites have been limited in terms of their accessibility to a broad user base. In this study, we developed a software application, ChimericSeq, the first program of its kind to offer a graphical user interface, compatibility with both Windows and Mac operating systems, and optimization for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq's pipeline implements custom filtering to remove artifacts and detects reads with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI on both Windows and Mac has the potential to expand NGS analytical support to a broader spectrum of the scientific community. PMID:28829778
NASA Technical Reports Server (NTRS)
2005-01-01
The Baseline Report captures range and spaceport capabilities at five sites: KSC, CCAFS, VAFB, Wallops, and Kodiak. The Baseline depicts a future state that relies on existing technology, planned upgrades, and straight-line recapitalization at these sites projected through 2030. The report presents an inventory of current spaceport and range capabilities at these five sites. The baseline is the first part of analyzing a business case for a set of capabilities designed to transform U.S. ground and space launch operations toward a single, integrated national "system" of space transportation systems. The second part of the business case compares current capabilities with the technologies needed to support the integrated national "system". The final part, a return-on-investment analysis, identifies the technologies that best lead to the integrated national system and reduce recurring costs. Numerous data sources were used to define and describe the baseline spaceport and range by identifying major systems and elements and describing their capabilities and limitations.
Simulation of networks of spiking neurons: A review of tools and strategies
Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami
2009-01-01
We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those that are freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
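The clock-driven strategy benchmarked in the review can be illustrated with a minimal leaky integrate-and-fire neuron advanced by fixed-step Euler updates. This is a sketch only: the function name and parameter values are illustrative defaults, not taken from the review's benchmark suite.

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau_m=10.0, v_rest=-65.0, v_reset=-65.0,
                 v_thresh=-50.0, r_m=10.0):
    """Clock-driven (fixed-step Euler) integration of a leaky
    integrate-and-fire neuron driven by an external current trace i_ext.
    Units are illustrative: ms for time, mV for voltage."""
    v = v_rest
    spike_times = []
    vs = np.empty(len(i_ext))
    for t, i in enumerate(i_ext):
        # Membrane equation: tau_m * dv/dt = -(v - v_rest) + r_m * i
        v += dt * (-(v - v_rest) + r_m * i) / tau_m
        if v >= v_thresh:            # threshold crossing -> emit a spike
            spike_times.append(t * dt)
            v = v_reset              # reset the membrane potential
        vs[t] = v
    return vs, spike_times
```

Event-driven strategies instead solve for the exact time of the next threshold crossing, trading the fixed time grid for spike-time precision, which is the distinction the review's benchmarks probe.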
Managing and mitigating conflict in healthcare teams: an integrative review.
Almost, Joan; Wolff, Angela C; Stewart-Pyne, Althea; McCormick, Loretta G; Strachan, Diane; D'Souza, Christine
2016-07-01
To review empirical studies examining antecedents (sources, causes, predictors) in the management and mitigation of interpersonal conflict. Providing quality care requires positive, collaborative working relationships among healthcare team members. In today's increasingly stress-laden work environments, such relationships can be threatened by interpersonal conflict. Identifying the underlying causes of conflict and the choice of conflict management style will help practitioners, leaders and managers build an organizational culture that fosters collegiality and create the best possible environment to engage in effective conflict management. Integrative literature review. CINAHL, MEDLINE, PsycINFO, ProQuest ABI/Inform, the Cochrane Library and the Joanna Briggs Institute Library were searched for empirical studies published between 2002 and May 2014. The review was informed by the approach of Whittemore and Knafl. Findings were extracted, critically examined and grouped into themes. Forty-four papers met the inclusion criteria. Several antecedents influence conflict and the choice of conflict management style, including individual characteristics, contextual factors and interpersonal conditions. The sources most frequently identified include lack of emotional intelligence, certain personality traits, poor work environment, role ambiguity, lack of support and poor communication. Very few published interventions were found. By synthesizing the knowledge and identifying antecedents, this review offers evidence to support recommendations on managing and mitigating conflict. As inevitable as conflict is, it is the responsibility of everyone to increase their own awareness, accountability and active participation in understanding conflict and minimizing it. Future research should investigate the testing of interventions to minimize these antecedents and, subsequently, reduce conflict. © 2016 John Wiley & Sons Ltd.
Understanding Transitions Toward Sustainable Urban Water Management: Miami, Las Vegas, Los Angeles
NASA Astrophysics Data System (ADS)
Garcia, M. E.; Manago, K. F.; Treuer, G.; Deslatte, A.; Koebele, E.; Ernst, K.
2016-12-01
Cities in the United States face numerous threats to their long-term water supplies, including preserving ecosystems, competing uses, and climate change. Yet it is unclear why only some cities have transitioned toward more sustainable water management. These transitions include strategies such as water conservation, water supply portfolio diversification, long-term planning, and integrated resource management. While the circumstances that motivate or moderate transition may vary greatly across cities' physical and institutional contexts, identifying common factors associated with transition can help resource managers capitalize on windows of opportunity for change. To begin the process of identifying such factors, we ask two questions: 1) what combinations of conditions are associated with water management transitions? and 2) what are the outcomes of these transitions? We examine three cases of utility-level water management in Miami, Las Vegas, and Los Angeles to create data-driven narratives detailing each city's transition. These narratives systematically synthesize multiple data sources to enable cross-case comparison and provide insights into how and why cities transition. Using the foundational concepts from the exposure-based theory of urban change, we focus our analysis on three broad categories of variables that influence urban water management transition: biophysical, political, and regulatory exposures. First, we compare these factors across time and across cities using metrics that standardize diverse data sources. Next, we incorporate qualitative factors that capture a city's unique conditions by integrating these metrics with salient contextual information. Then, through cross-city comparison, we identify factors associated with transition.
Adaptable Information Models in the Global Change Information System
NASA Astrophysics Data System (ADS)
Duggan, B.; Buddenberg, A.; Aulenbach, S.; Wolfe, R.; Goldstein, J.
2014-12-01
The US Global Change Research Program has sponsored the creation of the Global Change Information System (GCIS).
Improving postapproval drug safety surveillance: getting better information sooner.
Hennessy, Sean; Strom, Brian L
2015-01-01
Adverse drug events (ADEs) are an important public health concern, accounting for 5% of all hospital admissions and two-thirds of all complications occurring shortly after hospital discharge. There are often long delays between when a drug is approved and when serious ADEs are identified. Recent and ongoing advances in drug safety surveillance include the establishment of government-sponsored networks of population databases, the use of data mining approaches, and the formal integration of diverse sources of drug safety information. These advances promise to reduce delays in identifying drug-related risks and in providing reassurance about the absence of such risks.
XMM-Newton discovery of pulsations from IGR J21237+4218=V2069 Cyg
NASA Astrophysics Data System (ADS)
de Martino, D.; Bonnet-Bidaud, J. M.; Falanga, M.; Mouchet, M.; Motch, C.
2009-06-01
We report on a preliminary analysis of an XMM-Newton observation of the INTEGRAL source IGR J21237+4218, identified as the cataclysmic variable RX J2123.7+4217=V2069 Cyg (Motch et al. 1996, A&A 307, 459; Barlow et al. 2006, MNRAS 372, 224). This observation was performed on April 30, 2009 (start time: 2009-04-30T10:45:58.000) for a total of 28 ksec (Observation ID: 0601270101). The source is detected in the EPIC cameras at an average net count rate of 1.05 cts/sec (EPIC-pn) and 0.65 cts/sec (EPIC-MOS).
EPA FRS Facilities Combined File CSV Download for the Marshall Islands
The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using vigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.
EPA FRS Facilities Single File CSV Download for the Marshall Islands
The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using vigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.
Analytical Approach to the Characterization of Military Lubricants
1976-03-01
As an integral part of the Army's overall power train lubrication research effort...favorable results. Technology has progressed to where it is now possible to qualitatively analyze and quantitate the major base stock components in hybrid...lubricant component to equipment performance, and identify sources of new, used, synthetic and re-refined lubricants, power train and hydraulic
Common source cascode amplifiers for integrating IR-FPA applications
NASA Technical Reports Server (NTRS)
Woolaway, James T.; Young, Erick T.
1989-01-01
Space-based astronomical infrared measurements place stringent performance requirements on infrared detector arrays and their associated readout circuitry. To evaluate the usefulness of commercial CMOS technology for astronomical readout applications, a theoretical and experimental evaluation was performed on source-follower and common-source cascode integrating amplifiers. Theoretical analysis indicates that for conditions where the input amplifier integration capacitance is limited by the detector's capacitance, the input-referred rms noise electrons of each amplifier should be equivalent. For conditions of input-gate-limited capacitance, the source follower should provide lower noise. Measurements of test circuits containing both source-follower and common-source cascode circuits showed substantially lower input-referred noise for the common-source cascode input circuits. Noise measurements yielded 4.8 input-referred rms noise electrons for an 8.5-minute integration. The signal and noise gain of the common-source cascode amplifier appears to offer substantial advantages in achieving predicted noise levels.
Development and Performance of a Filter Radiometer Monitor System for Integrating Sphere Sources
NASA Technical Reports Server (NTRS)
Ding, Leibo; Kowalewski, Matthew G.; Cooper, John W.; Smith, GIlbert R.; Barnes, Robert A.; Waluschka, Eugene; Butler, James J.
2011-01-01
The NASA Goddard Space Flight Center (GSFC) Radiometric Calibration Laboratory (RCL) maintains several large integrating sphere sources covering the visible to shortwave infrared wavelength range. Two critical functional requirements of an integrating sphere source are short- and long-term operational stability and repeatability. Monitoring the source is essential in determining the origin of systematic errors, thus increasing confidence in source performance and quantifying repeatability. If monitor data fall outside the established parameters, this could be an indication that the source requires maintenance or re-calibration against the National Institute of Standards and Technology (NIST) irradiance standard. The GSFC RCL has developed a Filter Radiometer Monitor System (FRMS) to continuously monitor the performance of its integrating sphere calibration sources in the 400 to 2400 nm region. Sphere output change mechanisms include lamp aging, coating (e.g., BaSO4) deterioration, and ambient water vapor level. The FRMS wavelength bands are selected to quantify changes caused by these mechanisms. The FRMS design and operation are presented, as well as data from monitoring four of the RCL's integrating sphere sources.
Photochemical grid model implementation and application of ...
For the purposes of developing optimal emissions control strategies, efficient approaches are needed to identify the major sources or groups of sources that contribute to elevated ozone (O3) concentrations. Source-based apportionment techniques implemented in photochemical grid models track sources through the physical and chemical processes important to the formation and transport of air pollutants. Photochemical model source apportionment has been used to track source impacts of specific sources, groups of sources (sectors), sources in specific geographic areas, and stratospheric and lateral boundary inflow on O3. The implementation and application of a source apportionment technique for O3 and its precursors, nitrogen oxides (NOx) and volatile organic compounds (VOCs), for the Community Multiscale Air Quality (CMAQ) model are described here. The Integrated Source Apportionment Method (ISAM) O3 approach is a hybrid of source apportionment and source sensitivity in that O3 production is attributed to precursor sources based on O3 formation regime (e.g., for a NOx-sensitive regime, O3 is apportioned to participating NOx emissions). This implementation is illustrated by tracking multiple emissions source sectors and lateral boundary inflow. NOx, VOC, and O3 attribution to tracked sectors in the application are consistent with spatial and temporal patterns of precursor emissions. The O3 ISAM implementation is further evaluated through comparisons of apportioned am
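The regime-dependent attribution that distinguishes ISAM's hybrid approach can be caricatured as follows. This is a sketch under stated assumptions: the function and sector names are invented for illustration, and the real implementation operates per grid cell and time step inside the CMAQ chemistry solver, not on aggregate totals.

```python
def apportion_o3(delta_o3, nox_by_sector, voc_by_sector, nox_sensitive):
    """Attribute an increment of O3 production (delta_o3) to tracked
    precursor sectors according to the local formation regime: under
    NOx-sensitive chemistry the increment is split by each sector's
    share of participating NOx, otherwise by its share of VOC."""
    pool = nox_by_sector if nox_sensitive else voc_by_sector
    total = sum(pool.values())
    if total == 0:
        return {sector: 0.0 for sector in pool}
    return {sector: delta_o3 * share / total for sector, share in pool.items()}
```

The key property, mirrored in the sketch, is that the sector shares always sum to the total O3 increment, so apportionment conserves mass while the choice of precursor pool encodes the sensitivity regime.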
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas face many potential water pollution risks, and risk assessment is an effective method for evaluating them. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, with the weights of indicators determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where the Danjiangkou Reservoir, the water source for the middle route of China's South-to-North Water Diversion Project, is located. The results identified eleven sources with relatively high risk values. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge, whereas Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV), and three at the highest risk level (level V); risks from industrial discharge were higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
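The entropy weight method used above to weight the risk indicators can be sketched in a few lines: indicators whose values vary more across the assessed sources carry more information and receive larger weights. The function name and the toy matrix in the test are illustrative, not data from the study.

```python
import numpy as np

def entropy_weights(x):
    """Entropy weight method for an (n_sources, m_indicators) matrix of
    non-negative indicator values: compute the Shannon entropy of each
    indicator column and weight indicators by their divergence (1 - e)."""
    x = np.asarray(x, dtype=float)
    n, m = x.shape
    p = x / x.sum(axis=0)                      # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)         # entropy of each indicator
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()                         # normalized weights
```

An indicator that is identical across all sources has maximum entropy and therefore zero weight, which is the intended behavior: it cannot discriminate between high- and low-risk sources.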
Technologies for autonomous integrated lab-on-chip systems for space missions
NASA Astrophysics Data System (ADS)
Nascetti, A.; Caputo, D.; Scipinotti, R.; de Cesare, G.
2016-11-01
Lab-on-chip devices are ideal candidates for use in space missions, where experiment automation, system compactness, limited weight, and low sample and reagent consumption are required. Currently, however, most microfluidic systems require external desktop instrumentation to operate and interrogate the chip, strongly limiting their use as stand-alone systems. To overcome these limitations, our research group is currently working on the design and fabrication of "true" lab-on-chip systems that integrate in a single device all the analytical steps, from sample preparation to detection, without the need for bulky external components such as pumps, syringes, radiation sources, or optical detection systems. Three critical points can be identified in achieving "true" lab-on-chip devices: sample handling, analytical detection, and signal transduction. For each critical point, feasible solutions are presented and evaluated. The proposed microfluidic actuation and control are based on electrowetting on dielectrics, autonomous capillary networks, and active valves. Analytical detection based on highly specific chemiluminescent reactions is used to avoid external radiation sources. Finally, the integration on the same chip of thin-film sensors based on hydrogenated amorphous silicon is discussed, showing practical results achieved in different sensing tasks.
Events as power source: wireless sustainable corrosion monitoring.
Sun, Guodong; Qiao, Guofu; Zhao, Lin; Chen, Zhibo
2013-12-17
This study presents and implements a corrosion-monitoring wireless sensor platform, EPS (Events as Power Source), which monitors corrosion events in reinforced concrete (RC) structures while being powered by the micro-energy released by the corrosion process itself. In EPS, the proposed corrosion-sensing device serves both as the signal source for identifying corrosion and as the power source for driving the sensor mote, because the corrosion process (event) releases electric energy, a novel idea proposed in this study. To accumulate the micro-corrosion energy, we integrate EPS with a COTS (Commercial Off-The-Shelf) energy-harvesting chip that recharges a supercapacitor. In particular, this study designs automatic energy management and adaptive transmitted-power control policies to efficiently use the constrained accumulated energy. Finally, a set of preliminary experiments based on concrete pore solution is conducted to evaluate the feasibility and efficacy of EPS.
The nature of the embedded population in the Rho Ophiuchi dark cloud - Mid-infrared observations
NASA Technical Reports Server (NTRS)
Lada, C. J.; Wilking, B. A.
1984-01-01
In combination with previous IR and optical data, the present 10-20 micron observations of previously identified members of the embedded population of the Rho Ophiuchi dark cloud allow determinations to be made of the broadband energy distributions for 32 of the 44 sources. The majority of the sources are found to emit the bulk of their luminosity in the 1-20 micron range and to be surrounded by dust shells. Because, in light of these characteristics, they are probably pre-main-sequence in nature, relatively accurate bolometric luminosities for these objects can be obtained through integration of their energy distributions. It is found that 44 percent of the sources are less luminous than the sun and are among the lowest-luminosity pre-main-sequence/protostellar objects observed to date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bragg-Sitton, Shannon; Boardman, Richard; Ruth, Mark
The U.S. Department of Energy (DOE) recognizes the need to transform the energy infrastructure of the U.S. and elsewhere to systems that can significantly reduce environmental impacts in an efficient and economically viable manner while utilizing both clean energy generation sources and hydrocarbon resources. Thus, DOE is supporting research and development that could lead to more efficient utilization of clean nuclear and renewable energy generation sources. A concept being advanced by the DOE Offices of Nuclear Energy (NE) and Energy Efficiency and Renewable Energy (EERE) is tighter coupling of nuclear and renewable energy sources in a manner that better optimizes energy use for the combined electricity, industrial manufacturing, and the transportation sectors. This integration concept has been referred to as a “hybrid system” that is capable of providing energy (thermal or electrical) where it is needed, when it is needed. For the purposes of this work, the hybrid system would integrate two or more energy resources to generate two or more products, one of which must be an energy commodity, such as electricity or transportation fuel. This definition requires coupling of subsystems “behind” the electrical transmission bus, where energy flows are dynamically apportioned as necessary to meet demand and the system has a single connection to the grid that provides dispatchable electricity as required while capital intensive generation assets operate at full capacity. Development of integrated energy systems for an “energy park” must carefully consider the intended location and the associated regional resources, traditional industrial processes, energy delivery infrastructure, and markets to identify viable region-specific system configurations.
This paper will provide an overview of the current status of regional hybrid energy system design, development and application of dynamic analysis tools to assess technical and economic performance, and roadmap development to identify and prioritize component, subsystem and system testing that will lead to prototype demonstration.
Wu, Yiping; Chen, Ji
2013-01-01
Understanding the physical processes of point source (PS) and nonpoint source (NPS) pollution is critical to evaluate river water quality and identify major pollutant sources in a watershed. In this study, we used the physically-based hydrological/water quality model, Soil and Water Assessment Tool, to investigate the influence of PS and NPS pollution on the water quality of the East River (Dongjiang in Chinese) in southern China. Our results indicate that NPS pollution was the dominant contribution (>94%) to nutrient loads except for mineral phosphorus (50%). A comprehensive Water Quality Index (WQI) computed using eight key water quality variables demonstrates that water quality is better upstream than downstream despite the higher level of ammonium nitrogen found in upstream waters. Also, the temporal (seasonal) and spatial distributions of nutrient loads clearly indicate the critical time period (from late dry season to early wet season) and pollution source areas within the basin (middle and downstream agricultural lands), which resource managers can use to accomplish substantial reduction of NPS pollutant loadings. Overall, this study helps our understanding of the relationship between human activities and pollutant loads and further contributes to decision support for local watershed managers to protect water quality in this region. In particular, the methods presented, such as integrating WQI with watershed modeling and identifying the critical time period and pollution source areas, can be valuable for other researchers worldwide.
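A weighted-aggregation WQI of the kind mentioned above can be sketched as a weighted average of sub-indices. The sub-index formula here (a simple ratio of each observed value to its standard) and the function name are assumptions for illustration; the paper's exact eight-variable formulation is not reproduced here.

```python
def water_quality_index(values, standards, weights):
    """Hedged sketch of a weighted-average WQI: each variable is scaled
    against its water-quality standard to form a sub-index q_i, then the
    sub-indices are combined with indicator weights w_i."""
    qs = [100.0 * v / s for v, s in zip(values, standards)]
    return sum(w * q for w, q in zip(weights, qs)) / sum(weights)
```

By construction, a site whose variables all sit exactly at their standards scores 100 regardless of the weights; real WQI schemes differ mainly in how each sub-index curve is defined and in whether high scores denote good or poor quality.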
NASA Astrophysics Data System (ADS)
Twohig, Sarah; Pattison, Ian; Sander, Graham
2017-04-01
Fine sediment poses a significant threat to UK river systems in terms of vegetation, aquatic habitats, and morphology. Deposition of fine sediment onto the river bed reduces channel capacity, resulting in decreased volume to contain high flow events. Once the in-channel problem has been identified, managers are under pressure to sustainably mitigate flood risk. With climate change and land use adaptations increasing future pressures on river catchments, it is important to consider the connectivity of fine sediment throughout the river catchment and its influence on channel capacity, particularly in systems experiencing long-term aggradation. Fine sediment erosion is a continuing concern in the River Eye, Leicestershire. The predominantly rural catchment has a history of flooding within the town of Melton Mowbray. Fine sediment from agricultural fields has been identified as a major contributor to sediment delivery into the channel. Current mitigation measures are neither sustainable nor successful in preventing the continued transfer of sediment through the catchment. Identifying the potential sources and connections of fine sediment would provide insight into targeted catchment management. 'Sensitive Catchment Integrated Modelling Analysis Platforms' (SCIMAP) is a tool often used by UK catchment managers to identify potential sources and routes of sediment within a catchment. SCIMAP is a risk-based model that combines hydrological (rainfall) and geomorphic controls (slope, land cover) to identify the risk of fine sediment being transported from source into the channel. A desktop version of SCIMAP was run for the River Eye at the catchment scale using 5 m terrain, rainfall, and land cover data. A series of SCIMAP model runs was conducted, changing individual parameters to determine the sensitivity of the model. Climate change prediction data for the catchment were used to identify potential areas of future connectivity and erosion risk for catchment managers.
The results have been subjected to field validation as part of a wider research project, which provides an indication of the robustness of widespread models as effective management tools.
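The cell-by-cell risk mapping that SCIMAP performs can be sketched as a combination of hydrological and geomorphic raster layers. The multiplicative form, the normalized inputs, and the hotspot threshold below are illustrative assumptions, not SCIMAP's exact formulation:

```python
import numpy as np

# Synthetic, normalized (0-1) raster layers standing in for real catchment data.
rng = np.random.default_rng(4)
slope = rng.random((50, 50))                 # geomorphic control
land_cover_erodibility = rng.random((50, 50))  # geomorphic control
hydro_connectivity = rng.random((50, 50))    # rainfall-driven flow routing

# Combine controls cell-by-cell into a relative erosion/delivery risk surface.
risk = slope * land_cover_erodibility * hydro_connectivity

# Flag the top 1% of cells as candidate hotspots for targeted management.
hotspots = np.argwhere(risk > np.quantile(risk, 0.99))
```

A real SCIMAP run would additionally route risk downslope along flow paths to the channel; this sketch only shows the layer-combination step.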
A fast and high performance multiple data integration algorithm for identifying human disease genes
2015-01-01
Background Integrating multiple data sources is indispensable in improving disease gene identification, not only because disease genes associated with similar genetic diseases tend to lie close to each other in various biological networks, but also because gene-disease associations are complex. Although various algorithms have been proposed to identify disease genes, their prediction performance and computational time should be further improved. Results In this study, we propose a fast, high-performance multiple-data-integration algorithm for identifying human disease genes. A posterior probability of each candidate gene being associated with individual diseases is calculated using a Bayesian analysis method and a binary logistic regression model. Two prior-probability estimation strategies and two feature-vector construction methods are developed to test the performance of the proposed algorithm. Conclusions The proposed algorithm not only generates predictions with high AUC scores but also runs very fast. When only a single PPI network is employed, the AUC score is 0.769 using F2 as feature vectors, and the average running time for each leave-one-out experiment is only around 1.5 seconds. When three biological networks are integrated, the AUC score using F3 as feature vectors increases to 0.830, and the average running time for each leave-one-out experiment is only about 12.54 seconds. This is better than many existing algorithms. PMID:26399620
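The core of such an approach, a posterior probability of association obtained from a binary logistic regression over network-derived feature vectors, can be sketched as follows. The features and all numbers are invented for illustration; this is not the authors' F2/F3 pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical feature vectors: each row summarizes a candidate gene's
# proximity to known disease genes in a PPI network (illustrative only).
X_pos = rng.normal(1.0, 0.5, size=(50, 3))   # known disease genes
X_neg = rng.normal(0.0, 0.5, size=(50, 3))   # control genes
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 50 + [0] * 50)

# Logistic regression yields P(disease gene | features) directly.
model = LogisticRegression().fit(X, y)
candidate = np.array([[0.9, 1.1, 0.8]])
posterior = model.predict_proba(candidate)[0, 1]
```

A leave-one-out evaluation would repeat the fit once per known disease gene, holding that gene out; the speed claims in the abstract concern exactly that loop.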
Teachers' Voices on Integrating Metacognition into Science Education
NASA Astrophysics Data System (ADS)
Ben-David, Adi; Orion, Nir
2013-12-01
This study is an attempt to gain new insight, from science teachers themselves, into the integration of metacognition (MC) into science education. Participants were 44 elementary school science teachers attending an in-service teacher-training (INST) program. Data collection was carried out through several data sources: recordings of all verbal discussions that took place during the program, teachers' written reflections, and semi-structured individual interviews. Our study provides a qualitative analysis of the 44 teachers' voices as a group, as well as a detailed case-study narrative analysis of three teachers' stories. The findings show that the teachers' intuitive (pre-instructional) thinking was incomplete and unsatisfactory and that their voices were skeptical of, and against, the integration of MC. After the teachers had mastered the notion of MC in the INST program, the following outcomes were identified: (a) teachers expressed amazement at how such an important and relevant issue had been almost invisible to them; (b) teachers identified the affective character of metacognitive experiences as the most significant facet of MC, which acts as a mediator between teaching and learning; (c) the complete lack of learning materials addressing MC and the absence of supportive in-classroom guidance were identified as the major obstacles to its implementation; (d) teachers expressed a willingness to continue their professional development toward expanding their abilities to integrate MC as an inseparable component of the science curriculum. The implications of the findings for professional development courses in the field of MC are discussed.
On space of integrable quantum field theories
Smirnov, F. A.; Zamolodchikov, A. B.
2016-12-21
Here, we study deformations of 2D Integrable Quantum Field Theories (IQFT) which preserve integrability (the existence of infinitely many local integrals of motion). The IQFT are understood as “effective field theories”, with finite ultraviolet cutoff. We show that for any such IQFT there are infinitely many integrable deformations generated by scalar local fields X_s, which are in one-to-one correspondence with the local integrals of motion; moreover, the scalars X_s are built from the components of the associated conserved currents in a universal way. The first of these scalars, X_1, coincides with the composite field (TT̄) built from the components of the energy–momentum tensor. The deformations of quantum field theories generated by X_1 are “solvable” in a certain sense, even if the original theory is not integrable. In a massive IQFT the deformations X_s are identified with the deformations of the corresponding factorizable S-matrix via the CDD factor. The situation is illustrated by explicit construction of the form factors of the operators X_s in sine-Gordon theory. Lastly, we also make some remarks on the problem of UV completeness of such integrable deformations.
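The composite field (TT̄) is built from the energy–momentum tensor via a point-splitting limit; its standard definition in the literature, reproduced here for clarity with T = T_{zz}, T̄ = T_{z̄z̄}, and Θ = T_{zz̄} the components of the energy–momentum tensor, is:

```latex
X_1 \;=\; T\bar{T} \;=\; \lim_{z' \to z}\Bigl( T(z)\,\bar{T}(\bar{z}') \;-\; \Theta(z)\,\Theta(\bar{z}') \Bigr)
```

The limit exists up to total-derivative ambiguities, which is what makes the deformation well defined at the level of expectation values.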
Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Hector J.
2016-01-01
The majority of restoration strategies in the wake of large-scale disasters have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies to reconnect urban areas to national supply chain interdependent critical infrastructure systems (SCICI). These SCICI promote the effective flow of goods, services, and information vital to the economic vitality of an urban environment. To re-establish the connectivity that has been broken during a disaster between the different SCICI, relationships between these systems must be identified, formulated, and added to a common framework to form a system-level restoration plan. To accomplish this goal, a considerable collection of SCICI data is necessary. The aim of this paper is to review what data are required for model construction, the accessibility of these data, and their integration with each other. While a review of publicly available data reveals a dearth of real-time data to assist modeling long-term recovery following an extreme event, a significant amount of static data does exist, and these data can be used to model the complex interdependencies needed. For the sake of illustration, a particular SCICI (transportation) is used to highlight the challenges of determining the interdependencies and creating models capable of describing the complexity of an urban environment with the publicly available data. Integration of such data as is derived from public-domain sources is readily achieved in a geospatial environment; after all, geospatial infrastructure data are the most abundant data source. However, while significant quantities of data can be acquired through public sources, a significant effort is still required to gather, develop, and integrate these data from multiple sources to build a complete model. 
Therefore, while continued availability of high quality, public information is essential for modeling efforts in academic as well as government communities, a more streamlined approach to a real-time acquisition and integration of these data is essential.
Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.
Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua
2015-01-01
A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease-lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources achieved a recall of 51.40%, when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.
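The evaluation logic described here, pooling relations from several sources and scoring them against a reference set, reduces to simple set arithmetic. The relations below are invented toy examples, not entries from the actual knowledge base:

```python
# Toy disease-lab test relation sets from three hypothetical source extractions.
labtestsonline = {("diabetes", "HbA1c"), ("anemia", "CBC")}
medlineplus    = {("diabetes", "HbA1c"), ("hypothyroidism", "TSH")}
wikipedia      = {("anemia", "CBC"), ("gout", "uric acid")}

# Union the three sources, then score against a reference-book relation set.
combined = labtestsonline | medlineplus | wikipedia
reference = {("diabetes", "HbA1c"), ("anemia", "CBC"),
             ("hypothyroidism", "TSH"), ("lupus", "ANA")}

tp = len(combined & reference)          # relations both extracted and in reference
precision = tp / len(combined)          # 3 of 4 extracted are correct -> 0.75
recall = tp / len(reference)            # 3 of 4 reference relations found -> 0.75
```

The "complementary sources" observation in the abstract corresponds to relations like ("gout", "uric acid") above: extracted but absent from the reference, which lowers measured precision even when the relation is genuinely valid.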
Data mining: childhood injury control and beyond.
Tepas, Joseph J
2009-08-01
Data mining is defined as the automatic extraction of useful, often previously unknown information from large databases or data sets. It has become a major part of modern life and is extensively used in industry, banking, government, and health care delivery. The process requires a data collection system that integrates input from multiple sources containing critical elements that define outcomes of interest. Appropriately designed data mining processes identify and adjust for confounding variables. The statistical modeling used to manipulate accumulated data may involve any number of techniques. As predicted results are periodically analyzed against those observed, the model is consistently refined to optimize precision and accuracy. Whether applying integrated sources of clinical data to inferential probabilistic prediction of risk of ventilator-associated pneumonia or population surveillance for signs of bioterrorism, it is essential that modern health care providers have at least a rudimentary understanding of what the concept means, how it basically works, and what it means for current and future health care.
Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G
2011-01-01
A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype is successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation is performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query service.
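The XML2RDF transformation step can be sketched as parsing a drug-event record and emitting subject-predicate-object triples. The record shape, namespace, and predicate names below are illustrative inventions, not ADEpedia's actual schema:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical ADE report record (not a real ADEpedia input).
record = "<ade><drug>warfarin</drug><event>bleeding</event></ade>"
root = ET.fromstring(record)

EX = "http://example.org/ade#"  # illustrative namespace
# Each XML field becomes one RDF triple about the report resource.
triples = [
    (EX + "report1", EX + "hasDrug", root.findtext("drug")),
    (EX + "report1", EX + "hasEvent", root.findtext("event")),
]
```

In the real framework the object values would then be normalized to ontology concept URIs (via the NCBO Annotator) rather than left as literals, before loading into the RDF store.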
Sensory regulation of C. elegans male mate-searching behaviour
Barrios, Arantza; Nurrish, Stephen; Emmons, Scott W.
2009-01-01
How do animals integrate internal drives and external environmental cues to coordinate behaviours? We address this question by studying mate-searching behaviour in C. elegans. C. elegans males explore their environment in search of mates (hermaphrodites) and will leave food if mating partners are absent. However, when mates and food coincide, male exploratory behaviour is suppressed and males are retained on the food source. We show that the drive to explore is stimulated by male-specific neurons in the tail, the ray neurons. Periodic contact with the hermaphrodite detected through ray neurons changes the male’s behaviour during periods of no contact and prevents the male from leaving the food source. The hermaphrodite signal is conveyed by male-specific interneurons that are post-synaptic to the rays and that send processes to the major integrative center in the head. This study identifies key parts of the neural circuit that regulates a sexual appetitive behaviour in C. elegans. PMID:19062284
Composition and Sources of Fine and Coarse Particles Collected during 2002–2010 in Boston, MA
Masri, Shahir; Kang, Choong-Min; Koutrakis, Petros
2016-01-01
Identifying the sources, composition, and temporal variability of fine (PM2.5) and coarse (PM2.5-10) particles is a crucial component in understanding PM toxicity and establishing proper PM regulations. In this study, a Harvard Impactor was used to collect daily integrated fine and coarse particle samples every third day for nine years at a single site in Boston, MA. A total of 1,960 filters were analyzed for elements, black carbon (BC), and total PM mass. Positive Matrix Factorization (PMF) was used to identify source types and quantify their contributions to ambient PM2.5 and PM2.5-10. BC and 17 elements were identified as the main constituents in our samples. Results showed that BC, S, and Pb were associated exclusively with the fine particle mode, while 84% of V and 79% of Ni were associated with this mode. Elements found mostly (over 80%) in the coarse mode included Ca, Mn (road dust), and Cl (sea salt). PMF identified six source types for PM2.5 and three source types for PM2.5-10. Source types for PM2.5 included regional pollution, motor vehicles, sea salt, crustal/road dust, oil combustion, and wood burning. Regional pollution contributed the most, accounting for 48% of total PM2.5 mass, followed by motor vehicles (21%) and wood burning (19%). Source types for PM2.5-10 included crustal/road dust (62%), motor vehicles (22%), and sea salt (16%). A linear decrease in PM concentrations with time was observed for both fine (−5.2%/yr) and coarse (−3.6%/yr) particles. The fine-mode trend was mostly related to oil combustion and regional pollution contributions. Average PM2.5 concentrations peaked in summer (10.4 μg/m3) while PM2.5-10 concentrations were lower and demonstrated little seasonal variability. The findings of this study show that PM2.5 is decreasing more sharply than PM2.5-10 over time. This suggests the increasing importance of PM2.5-10 and traffic-related sources for PM exposure and future policies. PMID:25947125
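The factorization at the heart of PMF can be sketched with scikit-learn's NMF as a stand-in: both decompose a nonnegative samples-by-species concentration matrix into source contributions and source chemical profiles, though PMF additionally weights residuals by measurement uncertainty. All data here are synthetic:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Synthetic daily concentrations: 200 samples x 18 chemical species,
# generated from 3 hidden "sources" plus small noise.
true_profiles = rng.random((3, 18))
contributions = rng.random((200, 3))
X = contributions @ true_profiles + 0.01 * rng.random((200, 18))

model = NMF(n_components=3, init="nndsvda", random_state=0, max_iter=500)
G = model.fit_transform(X)     # source contributions per sample (nonnegative)
F = model.components_          # source chemical profiles (nonnegative)
reconstruction = G @ F
```

In a real apportionment, F rows would be matched to physical source types (road dust, sea salt, combustion) by inspecting which species dominate each profile, and G columns would give each source's daily mass contribution.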
Whiteway, Matthew R; Butts, Daniel A
2017-03-01
The activity of sensory cortical neurons is not only driven by external stimuli but also shaped by other sources of input to the cortex. Unlike external stimuli, these other sources of input are challenging to experimentally control, or even observe, and as a result contribute to variability of neural responses to sensory stimuli. However, such sources of input are likely not "noise" and may play an integral role in sensory cortex function. Here we introduce the rectified latent variable model (RLVM) in order to identify these sources of input using simultaneously recorded cortical neuron populations. The RLVM is novel in that it employs nonnegative (rectified) latent variables and is much less restrictive in the mathematical constraints on solutions because of the use of an autoencoder neural network to initialize model parameters. We show that the RLVM outperforms principal component analysis, factor analysis, and independent component analysis, using simulated data across a range of conditions. We then apply this model to two-photon imaging of hundreds of simultaneously recorded neurons in mouse primary somatosensory cortex during a tactile discrimination task. Across many experiments, the RLVM identifies latent variables related to both the tactile stimulation as well as nonstimulus aspects of the behavioral task, with a majority of activity explained by the latter. These results suggest that properly identifying such latent variables is necessary for a full understanding of sensory cortical function and demonstrate novel methods for leveraging large population recordings to this end. NEW & NOTEWORTHY The rapid development of neural recording technologies presents new opportunities for understanding patterns of activity across neural populations. Here we show how a latent variable model with appropriate nonlinear form can be used to identify sources of input to a neural population and infer their time courses. 
Furthermore, we demonstrate how these sources are related to behavioral contexts outside of direct experimental control. Copyright © 2017 the American Physiological Society.
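The key ingredient of the RLVM, nonnegative (rectified) latent variables learned autoencoder-style, can be sketched with a one-hidden-layer ReLU autoencoder trained by plain gradient descent. This is a minimal stand-in on synthetic "population activity", not the authors' model or initialization scheme:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy population activity: 20 "neurons" driven by 2 nonnegative latent inputs.
Z_true = rng.random((500, 2))
W_true = rng.normal(size=(2, 20))
X = Z_true @ W_true + 0.05 * rng.normal(size=(500, 20))

# One-hidden-layer autoencoder with rectified (ReLU) latents.
W_enc = 0.1 * rng.normal(size=(20, 2))
W_dec = 0.1 * rng.normal(size=(2, 20))
lr = 0.01

def forward(X):
    Z = np.maximum(X @ W_enc, 0.0)      # rectified latent variables
    return Z, Z @ W_dec                 # latents and reconstruction

losses = []
for _ in range(200):
    Z, X_hat = forward(X)
    err = X_hat - X
    losses.append((err ** 2).mean())
    # Backprop through the decoder and the ReLU mask of the encoder.
    W_dec -= lr * (Z.T @ err) / len(X)
    W_enc -= lr * (X.T @ ((err @ W_dec.T) * (Z > 0))) / len(X)
```

After training, the rows of the learned latent matrix Z play the role of inferred input time courses; the rectification is what lets them be interpreted as nonnegative drive, analogous to the RLVM's constraint.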
Penalized differential pathway analysis of integrative oncogenomics studies.
van Wieringen, Wessel N; van de Wiel, Mark A
2014-04-01
Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving cancer patients. The two molecular levels involved are linked through the central dogma of molecular biology. DNA copy number aberrations abound in the cancer cell. Here we investigate how these aberrations affect gene expression levels within a pathway using observational integrative genomics data of cancer patients. In particular, we aim to identify differential edges between regulatory networks of two groups involving these molecular levels. Motivated by the rate equations, the regulatory mechanism between DNA copy number aberrations and gene expression levels within a pathway is modeled by a simultaneous-equations model, for the one- and two-group cases. The latter facilitates the identification of differential interactions between the two groups. Model parameters are estimated by penalized least squares using the lasso (L1) penalty to obtain a sparse pathway topology. Simulations show that the inclusion of DNA copy number data benefits the discovery of gene-gene interactions. In addition, the simulations reveal that cis-effects tend to be over-estimated in a univariate (single gene) analysis. In the application to real data from integrative oncogenomic studies, we show that inclusion of prior information on the regulatory network architecture benefits the reproducibility of all edges. Furthermore, analyses of the TP53 and TGFβ signaling pathways between ER+ and ER- samples from an integrative genomics breast cancer study identify reproducible differential regulatory patterns that corroborate existing literature.
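The sparsity-inducing estimation step, lasso-penalized least squares selecting which copy-number effects on expression are nonzero, can be sketched on simulated data. The dimensions, effect sizes, and penalty value below are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
# Simulated DNA copy number for 30 genes across 150 tumors; expression of
# one target gene truly depends on only two of them (a sparse topology).
CN = rng.normal(size=(150, 30))
expr = 2.0 * CN[:, 0] - 1.5 * CN[:, 4] + 0.1 * rng.normal(size=150)

# The L1 penalty drives most coefficients exactly to zero.
fit = Lasso(alpha=0.1).fit(CN, expr)
selected = np.flatnonzero(np.abs(fit.coef_) > 1e-6)
```

Repeating such a fit for every gene in the pathway, in both patient groups, and comparing which edges survive the penalty is the essence of the differential-network analysis described above.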
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
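The basic inference pattern in such models, updating belief in a hypothesis from multiple observed indicators, can be sketched as a tiny two-indicator Bayesian network evaluated by enumeration. Every probability below is invented for illustration; these are emphatically not PNNL's model parameters:

```python
# Prior belief in the hypothesis node.
p_program = 0.1                        # P(program exists)

# Conditional probabilities of observing each indicator given the hypothesis.
p_ind = {
    "procurement":  {True: 0.7, False: 0.05},   # P(seen | program), P(seen | no program)
    "publications": {True: 0.6, False: 0.2},
}

def posterior(observed):
    """P(program | observed indicators), by enumerating the two hypothesis states."""
    num = p_program
    den_no = 1 - p_program
    for ind, seen in observed.items():
        num *= p_ind[ind][True] if seen else (1 - p_ind[ind][True])
        den_no *= p_ind[ind][False] if seen else (1 - p_ind[ind][False])
    return num / (num + den_no)

p = posterior({"procurement": True, "publications": True})
```

Breaking a large model into interrelated sub-models, as the abstract describes, amounts to composing many such conditional tables so that different analysts can own different indicator groups.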
Development of a High Dynamic Range Pixel Array Detector for Synchrotrons and XFELs
NASA Astrophysics Data System (ADS)
Weiss, Joel Todd
Advances in synchrotron radiation light source technology have opened new lines of inquiry in material science, biology, and everything in between. However, x-ray detector capabilities must advance in concert with light source technology to fully realize experimental possibilities. X-ray free electron lasers (XFELs) place particularly large demands on the capabilities of detectors, and developments towards diffraction-limited storage ring sources also necessitate detectors capable of measuring very high flux [1-3]. The detector described herein builds on the Mixed Mode Pixel Array Detector (MM-PAD) framework, developed previously by our group to perform high dynamic range imaging, and the Adaptive Gain Integrating Pixel Detector (AGIPD) developed for the European XFEL by a collaboration between Deutsches Elektronen-Synchrotron (DESY), the Paul Scherrer Institute (PSI), the University of Hamburg, and the University of Bonn, led by Heinz Graafsma [4, 5]. The feasibility of combining adaptive gain with charge removal techniques to increase dynamic range in XFEL experiments is assessed by simulating XFEL scatter with a pulsed infrared laser. The strategy is incorporated into pixel prototypes which are evaluated with direct current injection to simulate very high incident x-ray flux. A fully functional 16x16 pixel hybrid integrating x-ray detector featuring several different pixel architectures based on the prototypes was developed. This dissertation describes its operation and characterization. To extend dynamic range, charge is removed from the integration node of the front-end amplifier without interrupting integration. The number of times this process occurs is recorded by a digital counter in the pixel. The parameter limiting full well is thereby shifted from the size of an integration capacitor to the depth of a digital counter. 
The result is similar to that achieved by counting pixel array detectors, but the integrators presented here are designed to tolerate a sustained flux >10¹¹ x-rays/pixel/second. In addition, digitization of residual analog signals allows sensitivity for single x-rays or low flux signals. Pixel high flux linearity is evaluated by direct exposure to an unattenuated synchrotron source x-ray beam and flux measurements of more than 10¹⁰ 9.52 keV x-rays/pixel/s are made. Detector sensitivity to small signals is evaluated and dominant sources of error are identified. These new pixels boast multiple orders of magnitude improvement in maximum sustained flux over the MM-PAD, which is capable of measuring a sustained flux in excess of 10⁸ x-rays/pixel/second while maintaining sensitivity to smaller signals, down to single x-rays.
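The shift of the full-well limit from capacitor size to counter depth is simple arithmetic: the maximum recordable signal is the counter's maximum count times the charge removed per event, plus whatever the residual analog digitization resolves. The specific numbers below are hypothetical, chosen only to illustrate the scaling:

```python
# Hypothetical pixel parameters (not the dissertation's actual values).
xrays_per_removal = 200            # x-rays' worth of charge removed per event
counter_depth_bits = 18            # in-pixel digital counter width
residual_full_scale = 200          # x-rays resolvable by the residual ADC

max_removals = 2 ** counter_depth_bits - 1

# Full well is now set by counter depth, not integration-capacitor size:
max_signal = max_removals * xrays_per_removal + residual_full_scale
```

With these illustrative values the pixel tops out above 5 x 10⁷ x-rays per frame, while the residual digitization still resolves single x-rays at the low end, which is the dynamic-range combination the abstract describes.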
A novel integrated framework and improved methodology of computer-aided drug design.
Chen, Calvin Yu-Chian
2013-01-01
Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprised of comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict the ligand's entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.
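The machine-learning QSAR component of such a framework can be sketched as a regressor mapping molecular descriptors to activity. The descriptors, the linear ground truth, and the model choice below are all illustrative assumptions, not the paper's actual QSAR setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
# Toy QSAR data: 120 compounds x 5 descriptors (e.g. logP, MW, H-bond counts),
# with a pIC50-like activity depending on only some descriptors.
descriptors = rng.normal(size=(120, 5))
activity = descriptors @ np.array([0.8, -0.5, 0.3, 0.0, 0.0]) \
           + 0.1 * rng.normal(size=120)

# Train on 100 compounds, hold out 20 for validation.
qsar = RandomForestRegressor(n_estimators=100, random_state=0)
qsar.fit(descriptors[:100], activity[:100])
r2 = qsar.score(descriptors[100:], activity[100:])
```

In the integrated framework, predictions from a model like this would be combined with docking scores and pharmacophore matches, with per-model applicability restrictions filtering out candidates outside each model's reliable domain.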
Biomarkers in Transit Reveal the Nature of Fluvial Integration
NASA Astrophysics Data System (ADS)
Ponton, C.; West, A.; Feakins, S. J.; Galy, V.
2013-12-01
The carbon and hydrogen isotopic composition of vascular plant leaf waxes are common proxies for hydrologic and vegetation change. Sedimentary archives off major river systems are prime targets for continental paleoclimate studies under the assumption that rivers integrate changes in terrestrial organic carbon (OC) composition over their drainage basin. However, the proportional contribution of sources within the basin (e.g. head waters vs. floodplain) and the transit times of OC through the fluvial system remain largely unknown. This lack of quantifiable information about the proportions and timescales of integration within large catchments poses a challenge for paleoclimate reconstructions. To examine the sources of terrestrial OC eroded and supplied to a river system and the spatial distribution of these sources, we use compound specific isotope analysis (i.e. δ¹³C, Δ¹⁴C, and δD) on plant-derived leaf waxes, filtered from large volumes of river water (20-200L) along a major river system. We selected the Kosñipata River that drains the western flank of the Andes in Peru, joins the Madre de Dios River across the Amazonian floodplain, and ultimately contributes to the Amazon River. Our study encompassed an elevation gradient of >4 km, in an almost entirely forested catchment. Precipitation δD values vary by >50‰ due to the isotopic effect of elevation, a feature we exploit to identify the sources of plant wax n-alkanoic acids transported by the river. We used the δD plant wax values from tributary rivers as source constraints and the main stem values as the integrated signal. In addition, compound specific radiocarbon on individual chain length n-alkanoic acids provides unprecedented detail on the integrated age of these compounds. 
Preliminary results have established that 1) most of the OC transport occurs in the wet season; 2) total carbon transport in the Madre de Dios is dominated by lowland sources because of the large floodplain area, but initial data suggest that OC from high elevations may be proportionally overrepresented relative to areal extent, with possibly important implications for biomarker isotope composition; 3) timescales of different biomarkers vary considerably; 4) the composition of OC varies downstream and with depth stratification within large rivers. We filtered >1000L of river water in this remote location during the wet season, and are presently replicating that study during the dry season, providing a seasonal comparison of OC transport in this major river system.
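The source-apportionment logic, tributary δD values as end members and the main stem as their mixture, reduces to a two-end-member isotope mixing model. The δD values below are invented round numbers for illustration, not the study's measurements:

```python
# Hypothetical leaf-wax dD end members and a main-stem measurement (per mil).
dD_highland = -180.0   # Andean tributary end member
dD_lowland = -130.0    # floodplain tributary end member
dD_mainstem = -145.0   # measured downstream mixture

# Mass balance: dD_mainstem = f * dD_highland + (1 - f) * dD_lowland
# Solve for f, the fraction of highland-derived wax in the main stem.
f_highland = (dD_mainstem - dD_lowland) / (dD_highland - dD_lowland)
```

With more than two tributaries the same balance becomes an (often over-determined) linear system, which is why the >50‰ elevation-driven spread in δD is what makes the unmixing tractable.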
Design for Connecting Spatial Data Infrastructures with Sensor Web (SENSDI)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is about research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and conduct case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, and Metadata on one side and the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS), on the other. Hence, in conclusion, it is of importance to geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
Capability for Integrated Systems Risk-Reduction Analysis
NASA Technical Reports Server (NTRS)
Mindock, J.; Lumpkins, S.; Shelhamer, M.
2016-01-01
NASA's Human Research Program (HRP) is working to increase the likelihood of human health and performance success during long-duration missions, and of crews' subsequent long-term health. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.
Assessment of rockfall susceptibility by integrating statistical and physically-based approaches
NASA Astrophysics Data System (ADS)
Frattini, Paolo; Crosta, Giovanni; Carrara, Alberto; Agliardi, Federico
In Val di Fassa (Dolomites, Eastern Italian Alps) rockfalls constitute the most significant gravity-induced natural disaster, threatening both the valley's few inhabitants and the thousands of tourists who populate the area in summer and winter. To assess rockfall susceptibility, we developed an integrated statistical and physically-based approach that aimed to predict both the susceptibility to onset and the probability that rockfalls will attain specific reaches. Through field checks and multi-temporal aerial photo-interpretation, we prepared a detailed inventory of both rockfall source areas and associated scree-slope deposits. Using an innovative technique based on GIS tools and a 3D rockfall simulation code, grid cells pertaining to the rockfall source-area polygons were classified as active or inactive, based on the state of activity of the associated scree-slope deposits. The simulation code allows one to link each source grid cell with scree deposit polygons by calculating the trajectory of each simulated launch of blocks. By means of discriminant analysis, we then identified the mix of environmental variables that best identifies grid cells with low or high susceptibility to rockfalls. Among these variables, structural setting, land use, and morphology were the most important factors that led to the initiation of rockfalls. We developed 3D simulation models of the runout distance, intensity and frequency of rockfalls, whose source grid cells corresponded either to the geomorphologically-defined source polygons (the geomorphological scenario) or to study area grid cells with slope angle greater than an empirically-defined value of 37° (the empirical scenario). For each scenario, we assigned to the source grid cells either a fixed or a variable onset susceptibility; the latter was derived from the discriminant model group (active/inactive) membership probabilities. 
Comparison of these four models indicates that the geomorphological scenario with variable onset susceptibility appears to be the most realistic model. Nevertheless, political and legal issues seem to guide local administrators, who tend to select the more conservative empirically-based scenario as a land-planning tool.
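The active/inactive discrimination step described above can be sketched with a minimal two-variable Fisher linear discriminant. This is an illustrative stand-in, not the study's actual method or data: the two predictor variables (e.g. a slope angle and a land-use score) and all sample values are hypothetical.

```python
# Minimal Fisher linear discriminant for classifying rockfall source cells as
# active vs. inactive from two environmental variables. Hypothetical sketch;
# the Val di Fassa study used its own variable set and software.

def fisher_discriminant(active, inactive):
    """Return weight vector w and a midpoint threshold for 2-D samples."""
    def mean(pts):
        n = len(pts)
        return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]

    m1, m0 = mean(active), mean(inactive)

    # Within-class scatter matrix Sw = sum over classes of (x - m)(x - m)^T
    s = [[0.0, 0.0], [0.0, 0.0]]
    for pts, m in ((active, m1), (inactive, m0)):
        for x, y in pts:
            dx, dy = x - m[0], y - m[1]
            s[0][0] += dx * dx; s[0][1] += dx * dy
            s[1][0] += dy * dx; s[1][1] += dy * dy

    # w = Sw^{-1} (m1 - m0), via the 2x2 matrix inverse
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    d = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [( s[1][1] * d[0] - s[0][1] * d[1]) / det,
         (-s[1][0] * d[0] + s[0][0] * d[1]) / det]

    # Threshold at the projection of the midpoint between the class means
    mid = [(m1[0] + m0[0]) / 2, (m1[1] + m0[1]) / 2]
    thresh = w[0] * mid[0] + w[1] * mid[1]
    return w, thresh

def classify(cell, w, thresh):
    """True -> 'active' (high onset susceptibility)."""
    return w[0] * cell[0] + w[1] * cell[1] > thresh
```

In the study's workflow, the discriminant model's group membership probabilities (rather than a hard threshold) supplied the variable onset susceptibility.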
How Could Conjunctive Use of Surface and Ground Water Increase Resiliency in the US?
NASA Astrophysics Data System (ADS)
Josset, L.; Rising, J. A.; Russo, T. A.; Troy, T. J.; Lall, U.; Allaire, M.
2016-12-01
Optimized management practices are crucial to ensuring water availability in the future. However, this presents a tremendous challenge due to the many functions of water: water is not only central for our survival as drinking water or for irrigation, but it is also valued for industrial and recreational use. Sources of water meeting these needs range from rain water harvesting to reservoirs, water reuse, groundwater abstraction and desalination. A global conjunctive management approach is thus necessary to develop sustainable practices, as all sectors are strongly coupled. Policy-makers and researchers have identified pluralism in water sources as a key solution to reach water security. We propose a novel approach to sustainable water management that accounts for multiple sources of water in an integrated manner. We formulate this challenge as an optimization problem where the choice of water sources is driven both by the availability of the sources and by their relative cost. The results determine the optimal operational decisions for each source (e.g., reservoir releases, surface water withdrawals, groundwater abstraction and/or desalination water use) at each time step for a given time horizon. The physical surface and ground water systems are simulated inside the optimization by setting state equations as constraints. Additional constraints may be added to the model to represent the influence of policy decisions. To account for uncertainty in weather conditions and its impact on availability, the optimization is performed for an ensemble of climate scenarios. While many sectors and their interactions are represented, the computational cost is limited because the problem remains linear, which enables large-scale applications and the propagation of uncertainty. The formulation is implemented within the model "America's Water Analysis, Synthesis and Heuristic", an integrated model for the conterminous US discretized at the county scale.
This enables a systematic evaluation of stresses on water resources. In particular, we explore geographic and temporal trends as a function of user type to develop a better understanding of the dynamics at play. We conclude with a comparison between the optimization results and current water use to identify potential solutions to increase resiliency.
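As a toy illustration of the cost-driven source choice described above: for a single time step with linear costs and independent capacity limits (i.e., ignoring the state equations that couple time steps), the linear program reduces to merit-order dispatch, filling demand from the cheapest source first. Source names, unit costs and capacities below are hypothetical, not values from the America's Water model.

```python
# Single-time-step sketch of cost-driven water source selection. With linear
# costs and independent capacity limits, the LP solution is merit-order
# dispatch. All numbers are illustrative assumptions.

def allocate(demand, sources):
    """sources: {name: (unit_cost, capacity)}; returns {name: withdrawal}."""
    plan = {}
    remaining = demand
    for name, (cost, cap) in sorted(sources.items(), key=lambda kv: kv[1][0]):
        take = min(cap, remaining)
        if take > 0:
            plan[name] = take
            remaining -= take
    if remaining > 1e-9:
        raise ValueError("demand exceeds total source capacity")
    return plan
```

In the full model, reservoir and aquifer state equations link the time steps, so the cheapest-first rule no longer holds and a genuine LP solver is required.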
Meshkati, Najmedin; Tabibzadeh, Maryam; Farshid, Ali; Rahimi, Mansour; Alhanaee, Ghena
2016-02-01
The aim of this study is to identify the interdependencies of human and organizational subsystems of multiple complex, safety-sensitive technological systems and their interoperability in the context of sustainability and resilience of an ecosystem. Recent technological disasters with severe environmental impact are attributed to human factors and safety culture causes. One of the most populous and environmentally sensitive regions in the world, the (Persian) Gulf, lies at the confluence of two exponentially growing industries, nuclear power and seawater desalination plants, that are changing its land- and seascape. Building upon Rasmussen's model, a macrosystem integrative framework, based on the broader context of human factors, is developed, which can be considered in this context as a "meta-ergonomics" paradigm, for the analysis of interactions, design of interoperability, and integration of decisions of major actors whose actions can affect safety and sustainability of the focused industries during routine and nonroutine (emergency) operations. Based on the emerging realities in the Gulf region, it is concluded that without such a systematic approach toward addressing the interdependencies of water and energy sources, sustainability will be only a short-lived dream and prosperity will be a disappearing mirage for millions of people in the region. This multilayered framework for the integration of people, technology, and ecosystem, which has been applied to the (Persian) Gulf, offers a viable and vital approach to the design and operation of large-scale complex systems wherever the nexus of water, energy, and food sources is concerned, such as the Black Sea. © 2016, Human Factors and Ergonomics Society.
NASA Astrophysics Data System (ADS)
Schuetze, C.; Sauer, U.; Dietrich, P.
2015-12-01
Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. In particular, optical remote sensing tools for atmospheric monitoring have the potential to integrally measure CO2 emissions over larger scales (>10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites with the aim of establishing a modular observation strategy including investigations in the shallow subsurface, at ground surface level and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were the observation of gas dispersion into the near-surface atmosphere, the determination of maximum concentration values, and the identification of the main challenges associated with monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared spectroscopy (OP-FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, in terms of integral determination of GHG concentrations, and for identifying target areas that need to be investigated in more detail. In particular, data interpretation should closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, experimental setup and fast data processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.
SemMat: Federated Semantic Services Platform for Open Materials Science and Engineering
2017-01-01
identified the following two important tasks to remedy the data heterogeneity challenge to promote data integration: (1) creating the semantic ... sourced from the structural and bio-materials domains. For structural materials data, we reviewed and used MIL-HDBK-5J [11] and MIL-HDBK-17. Furthermore ... documents about composite materials provided by our domain expert. Based on the suggestions given by domain experts in bio-materials, the following
Integrated Warfighter Biodefense Program (IWBP) - Phase 2
2011-03-03
one another or with environments to spread disease, such as indirect transmission of cholera via water. In this paper only localized mixing is ... from WHO reports and other sources. Cholera, Pneumonia, Malaria, and Hepatitis A were selected as representative diseases as their methods of ... A(H1N1) and US NORTHCOM In March 2009, the US identified its first cases of "swine flu" in New York. Initial reports of cases in Mexico, where the
Toward a complete dataset of drug-drug interaction information from publicly available sources.
Ayvaz, Serkan; Horn, John; Hassanzadeh, Oktie; Zhu, Qian; Stan, Johann; Tatonetti, Nicholas P; Vilar, Santiago; Brochhausen, Mathias; Samwald, Matthias; Rastegar-Mojarad, Majid; Dumontier, Michel; Boyce, Richard D
2015-06-01
Although potential drug-drug interactions (PDDIs) are a significant source of preventable drug-related harm, there is currently no single complete source of PDDI information. In the current study, all publicly available sources of PDDI information that could be identified using a comprehensive and broad search were combined into a single dataset. The combined dataset merged fourteen different sources, including 5 clinically-oriented information sources, 4 Natural Language Processing (NLP) corpora, and 5 Bioinformatics/Pharmacovigilance information sources. As a comprehensive PDDI source, the merged dataset might benefit the pharmacovigilance text mining community by making it possible to compare the representativeness of NLP corpora for PDDI text extraction tasks and by specifying elements that can be useful for future PDDI extraction purposes. An analysis of the overlap between and across the data sources showed that there was little overlap. Even comprehensive PDDI lists such as DrugBank, KEGG, and the NDF-RT had less than 50% overlap with each other. Moreover, all of the comprehensive lists had incomplete coverage of two data sources that focus on PDDIs of interest in most clinical settings. Based on this information, we think that systems that provide access to the comprehensive lists, such as APIs into RxNorm, should be careful to inform users that the lists may be incomplete with respect to PDDIs that drug experts suggest clinicians be aware of. In spite of the low degree of overlap, several dozen cases were identified where PDDI information provided in drug product labeling might be augmented by the merged dataset. Moreover, the combined dataset was also shown to improve the performance of an existing PDDI NLP pipeline and a recently published PDDI pharmacovigilance protocol.
Future work will focus on improvement of the methods for mapping between PDDI information sources, identifying methods to improve the use of the merged dataset in PDDI NLP algorithms, integrating high-quality PDDI information from the merged dataset into Wikidata, and making the combined dataset accessible as Semantic Web Linked Data. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
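The overlap analysis described above can be sketched as follows: each source is reduced to a set of unordered drug pairs (normalized drug identifiers assumed), and the coverage of source A by source B is |A ∩ B| / |A|. The drug names below are illustrative, not entries from the merged dataset.

```python
# Sketch of pairwise PDDI source overlap. Interactions are treated as
# unordered pairs; coverage is asymmetric by design, which is why even two
# "comprehensive" lists can each cover less than 50% of the other.

def norm_pairs(pairs):
    """Treat each interaction as an unordered pair: (a, b) == (b, a)."""
    return {frozenset(p) for p in pairs}

def coverage(a, b):
    """Fraction of source a's interactions that also appear in source b."""
    a, b = norm_pairs(a), norm_pairs(b)
    return len(a & b) / len(a) if a else 0.0
```

In practice, the hard part is the identifier mapping step (drug name normalization across sources) that precedes this set arithmetic, which is what the planned future work addresses.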
Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?
Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C
2017-09-01
Integrated care is the combination of different healthcare services with the goal of providing comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access.
Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely been surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys to assess health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often impacted by deeper aspects than those measured by existing surveys. Incorporation of targeted items within these domains in the design of surveys should enhance the capture of data that are relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.
Innovative financing for health: what is truly innovative?
Atun, Rifat; Knaul, Felicia Marie; Akachi, Yoko; Frenk, Julio
2012-12-08
Development assistance for health has increased every year between 2000 and 2010, particularly for HIV/AIDS, tuberculosis, and malaria, to reach US$26.66 billion in 2010. The continued global economic crisis means that increased external financing from traditional donors is unlikely in the near term. Hence, new funding has to be sought from innovative financing sources to sustain the gains made in global health, to achieve the health Millennium Development Goals, and to address the emerging burden from non-communicable diseases. We use the value chain approach to conceptualise innovative financing. With this framework, we identify three integrated innovative financing mechanisms (GAVI, the Global Fund, and UNITAID) that have reached a global scale. These three financing mechanisms have innovated along each step of the innovative finance value chain (namely resource mobilisation, pooling, channelling, resource allocation, and implementation) and integrated these steps to channel large amounts of funding rapidly to low-income and middle-income countries to address HIV/AIDS, malaria, tuberculosis, and vaccine-preventable diseases. However, resources mobilised from international innovative financing sources are relatively modest compared with donor assistance from traditional sources. Instead, the real innovation has been the establishment of new organisational forms as integrated financing mechanisms that link elements of the financing value chain to more effectively and efficiently mobilise, pool, allocate, and channel financial resources to low-income and middle-income countries and to create incentives to improve implementation and performance of national programmes. These mechanisms provide platforms for health funding in the future, especially as efforts to grow innovative financing have faltered.
The lessons learnt from these mechanisms can be used to develop and expand innovative financing from international sources to address health needs in low-income and middle-income countries. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cochran, Jaquelin
The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established a target of 175 gigawatts (GW) of installed RE capacity by 2022, including 60 GW of wind and 100 GW of solar, up from 29 GW of wind and 9 GW of solar at the beginning of 2017. Thanks to advanced weather and power system modeling developed for this project, the study team is able to explore the operational impacts of meeting India's RE targets and identify actions that may be favorable for integration.
Schill, Steven R; Raber, George T; Roberts, Jason J; Treml, Eric A; Brenner, Jorge; Halpin, Patrick N
2015-01-01
We integrated coral reef connectivity data for the Caribbean and Gulf of Mexico into a conservation decision-making framework for designing a regional-scale marine protected area (MPA) network that provides insight into ecological and political contexts. We used an ocean circulation model and regional coral reef data to simulate eight spawning events from 2008-2011, applying a maximum 30-day pelagic larval duration and 20% mortality rate. Coral larval dispersal patterns were analyzed between coral reefs across jurisdictional marine zones to identify spatial relationships between larval sources and destinations within countries and territories across the region. We applied our results in Marxan, a conservation planning software tool, to identify a regional coral reef MPA network design that meets conservation goals, minimizes underlying threats, and maintains coral reef connectivity. Our results suggest that approximately 77% of coral reefs identified as having a high regional connectivity value are not included in the existing MPA network. This research is unique because we quantify and report coral larval connectivity data by marine ecoregions and Exclusive Economic Zones (EEZs) and use this information to identify gaps in the current Caribbean-wide MPA network by integrating asymmetric connectivity information in Marxan to design a regional MPA network that includes important reef network connections. The identification of important reef connectivity metrics guides the selection of priority conservation areas and supports resilience at the whole system level into the future.
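The aggregation step behind the asymmetric connectivity data can be sketched as follows: simulated larval trajectories (source reef to destination reef) are tallied into a zone-to-zone matrix, which is the kind of directed connectivity input a planning tool like Marxan can use. Reef IDs and zone names below are hypothetical, not the study's data.

```python
# Sketch: tally simulated larval source->destination links into an asymmetric
# zone-to-zone connectivity matrix. Asymmetry matters because ocean currents
# make larval export from zone A to B different from B to A.
from collections import defaultdict

def connectivity_matrix(trajectories, zone_of):
    """trajectories: iterable of (source_reef, dest_reef) ids;
    zone_of: maps reef id -> jurisdictional zone (e.g. an EEZ)."""
    m = defaultdict(int)
    for src, dst in trajectories:
        m[(zone_of[src], zone_of[dst])] += 1
    return dict(m)
```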
Innovative Technologies for Global Space Exploration
NASA Technical Reports Server (NTRS)
Hay, Jason; Gresham, Elaine; Mullins, Carie; Graham, Rachael; Williams-Byrd; Reeves, John D.
2012-01-01
Under the direction of NASA's Exploration Systems Mission Directorate (ESMD), Directorate Integration Office (DIO), The Tauri Group with NASA's Technology Assessment and Integration Team (TAIT) completed several studies and white papers that identify novel technologies for human exploration. These studies provide technical inputs to space exploration roadmaps, identify potential organizations for exploration partnerships, and detail crosscutting technologies that may meet some of NASA's critical needs. These studies are supported by a relational database of more than 400 externally funded technologies relevant to current exploration challenges. The identified technologies can be integrated into existing and developing roadmaps to leverage external resources, thereby reducing the cost of space exploration. This approach to identifying potential spin-in technologies and partnerships could apply to other national space programs, as well as international and multi-government activities. This paper highlights innovative technologies and potential partnerships from economic sectors that historically are less connected to space exploration. It includes breakthrough concepts that could have a significant impact on space exploration and discusses the role of breakthrough concepts in technology planning. Technologies and partnerships are from NASA's Technology Horizons and Technology Frontiers game-changing and breakthrough technology reports as well as the External Government Technology Dataset, briefly described in the paper. The paper highlights example novel technologies that could be spun in from government and commercial sources, including virtual worlds, synthetic biology, and human augmentation. It considers how these technologies can impact space exploration and discusses ongoing activities for planning and preparing them.
Cai, Minggang; Lin, Yan; Chen, Meng; Yang, Weifeng; Du, Huihong; Xu, Ye; Cheng, Shayen; Xu, Fangjian; Hong, Jiajun; Chen, Mian; Ke, Hongwei
2017-12-31
To obtain the historical changes of pyrogenic sources, integrated source apportionment methods, including PAH compositions, diagnostic ratios (DRs), Pb isotopic ratios, and a positive matrix factorization (PMF) model, were developed and applied to sediments of the northern South China Sea. These methods provided a gradually clearer picture of energy structural change. Spatially, Σ15PAH (11.3 to 95.5 ng/g) and Pb (10.2 to 74.6 μg/g) generally exhibited decreasing concentration gradients offshore, while the highest levels of PAHs and Pb were observed near the southern Taiwan Strait, which may be induced by accumulation of different fluvial inputs. Historical records of pollutants followed closely the economic development of China, with fast growth of Σ15PAH and Pb occurring since the 1980s and 1990s, respectively. The phasing-out of leaded gasoline in China was captured by a sharp decrease of Pb after the mid-1990s. PAHs and Pb correlated well with TOC and clay content for core sediments, which was not observed for surface sediments. There was an up-core increase of high-molecular-weight PAH proportions. Coal and biomass burning were then qualitatively identified as the major sources of PAHs with DRs. Furthermore, a shift toward less radiogenic signatures of Pb isotopic ratios after 1900 revealed the start and growing importance of industrial sources. Finally, a greater separation and quantification of the various inputs was achieved by a three-factor PMF model, which made it clear that biomass burning, coal combustion, and vehicle emissions accounted for 40±20%, 41±13%, and 19±12% of PAHs through the core. Biomass and coal combustion acted as major sources before 2000, while contributions from vehicle emissions soared thereafter. The integrated multi-methodology here improved the source apportionment by reducing biases with a step-down and cross-validation perspective, and could be similarly applied to other aquatic systems. Copyright © 2017 Elsevier B.V. All rights reserved.
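PMF factorizes the concentration data without assuming known source profiles, so it cannot be reproduced in a few lines. As a simpler illustration of the underlying apportionment idea, the sketch below solves a two-source chemical mass balance with known (entirely hypothetical) marker fingerprints via an exact 2x2 linear solve.

```python
# Chemical-mass-balance sketch: given two source fingerprints (marker
# concentrations per unit source mass), solve for the source contributions
# that reproduce a sample. A stand-in for PMF, which instead estimates both
# profiles and contributions from the full data matrix.

def unmix(sample, src_a, src_b):
    """sample, src_a, src_b: (marker1, marker2) concentrations.
    Returns (mass_a, mass_b) with mass_a*src_a + mass_b*src_b = sample."""
    det = src_a[0] * src_b[1] - src_a[1] * src_b[0]
    mass_a = (sample[0] * src_b[1] - sample[1] * src_b[0]) / det
    mass_b = (src_a[0] * sample[1] - src_a[1] * sample[0]) / det
    return mass_a, mass_b
```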
cyclostratigraphy, sequence stratigraphy and organic matter accumulation mechanism
NASA Astrophysics Data System (ADS)
Cong, F.; Li, J.
2016-12-01
The first member of the Maokou Formation of the Sichuan basin is composed of well-preserved carbonate ramp couplets of limestone and marlstone/shale. It acts as one of the potential shale gas source rocks and is suitable for time-series analysis. We conducted time-series analysis to identify high-frequency sequences, reconstruct high-resolution sedimentation rates, estimate detailed primary productivity for the first time in the study intervals, and discuss the organic matter accumulation mechanism of the source rock under a sequence stratigraphic framework. Using the theory of cyclostratigraphy and sequence stratigraphy, the high-frequency sequences of one outcrop profile and one drilling well are identified. Two third-order sequences and eight fourth-order sequences are distinguished on the outcrop profile based on the cycle stacking patterns. For the drilling well, the sequence boundary and four system tracts are distinguished by "integrated prediction error filter analysis" (INPEFA) of gamma-ray logging data, and eight fourth-order sequences are identified by the 405-ka long-eccentricity curve in the depth domain, which is quantified and filtered by integrated analysis of MTM spectral analysis, evolutive harmonic analysis (EHA), evolutive average spectral misfit (eASM) and band-pass filtering. This suggests that high-frequency sequences correlate well with Milankovitch orbital signals recorded in sediments, and that it is applicable to use cyclostratigraphy theory in dividing high-frequency (4th- to 6th-order) sequence stratigraphy. High-resolution sedimentation rates are reconstructed through the study interval by tracking the highly statistically significant short-eccentricity component (123 ka) revealed by EHA. Based on sedimentation rates, measured TOC and density data, the burial flux, delivery flux and primary productivity of organic carbon were estimated.
By integrating redox proxies, we can discuss the relative controls of primary production and preservation on organic matter accumulation under the high-resolution sequence stratigraphic framework. Results show that the high average organic carbon contents in the study interval are mainly attributed to high primary production. The results also show a good correlation between high organic carbon accumulation and intervals of transgression.
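The cycle-detection idea behind the spectral methods above can be sketched with a plain DFT periodogram on an evenly sampled proxy series. This is a minimal stand-in for the study's MTM/EHA workflow, and the synthetic series in the test is illustrative rather than real log data.

```python
# Sketch: find the dominant cycle period in an evenly sampled proxy series
# (e.g. gamma-ray values at fixed depth spacing) with a brute-force DFT
# periodogram. The actual study used MTM spectra, EHA and eASM.
import math

def dominant_period(series, spacing):
    """Return the period (in the units of `spacing`) with the largest
    nonzero-frequency DFT power."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return n * spacing / best_k
```

Tracking how the dominant period drifts with depth is what lets the sedimentation rate be reconstructed: a fixed-duration orbital cycle expressed over more metres implies faster deposition.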
Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim
2016-01-01
Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
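The late-integration idea can be sketched as follows: one model per data source, and a meta-level combiner over their outputs. The "meta-learner" below is deliberately minimal, a weighted average with weights derived from each source model's held-out accuracy; the ICD-9-CM codes and probabilities are hypothetical, and the paper's actual meta-learner is a trained model rather than a fixed vote.

```python
# Minimal late-integration sketch: combine per-source model outputs (dicts of
# code -> probability) with a weighted average and pick the top code.

def accuracy(preds, truth):
    """Held-out accuracy of one source model; used to derive its weight."""
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

def late_integrate(source_probs, weights):
    """source_probs: one {code: probability} dict per source model.
    Returns the code with the highest weighted average probability."""
    combined = {}
    total_w = sum(weights)
    for probs, w in zip(source_probs, weights):
        for code, p in probs.items():
            combined[code] = combined.get(code, 0.0) + w * p / total_w
    return max(combined, key=combined.get)
```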
An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data
Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.
2014-01-01
Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily used for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools along with a graphical user interface called Integrated SNP Mining and Utilization (ISMU) for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction (SAMtools/SOAPsnp/CNS2snp and CbCC) methods and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data at a fast speed. The pipeline is very useful for the plant genetics and breeding community, enabling researchers with no computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge datasets of next generation sequencing.
It has been developed in the Java language and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software. PMID:25003610
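A pipeline like this ends in a quality-filtering step before SNPs are passed to genotyping assay design. The sketch below shows one such filter, keeping only candidates with sufficient read depth and allele support; the thresholds and record layout are hypothetical, not ISMU's actual scoring scheme.

```python
# Hypothetical final-stage SNP filter: depth and alternate-allele-fraction
# thresholds of the kind a discovery pipeline applies before reporting
# "high quality" SNPs. Not ISMU's actual confidence-score computation.

def filter_snps(candidates, min_depth=10, min_alt_frac=0.8):
    """candidates: (position, ref, alt, depth, alt_reads) tuples.
    Returns (position, ref, alt) for candidates passing both thresholds."""
    kept = []
    for pos, ref, alt, depth, alt_reads in candidates:
        if depth >= min_depth and alt_reads / depth >= min_alt_frac:
            kept.append((pos, ref, alt))
    return kept
```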
Using isotopes to investigate hydrological flow pathways and sources in a remote Arctic catchment
NASA Astrophysics Data System (ADS)
Lessels, Jason; Tetzlaff, Doerthe; Dinsmore, Kerry; Street, Lorna; Billet, Mike; Baxter, Robert; Subke, Jens-Arne; Wookey, Phillip
2014-05-01
Stable water isotopes allow for the identification of flow paths and stream water sources. This is beneficial for improving understanding in catchments with dynamic spatial and temporal sources. Arctic catchments are characterised by strong seasonality, where the dominant flow paths change throughout the short summer season. Therefore, the identification of stream water sources through time and space is necessary in order to accurately quantify these dynamics. Stable isotope tracers are useful tools that integrate processes over time and space and are therefore particularly useful for identifying flow pathways and runoff sources at remote sites. This work presents stable isotope data collected from a small (1 km2) catchment in Northwest Canada. The aims of this study are to 1) identify sources of stream water through time and space, and 2) provide information which will be incorporated into hydrological and transit time models. Sampling of snowmelt, surface runoff, ice-wedge polygons, stream and soil water was undertaken throughout the 2013 summer. The results of this sampling reveal the dominant flow paths in the catchment and the strong influence of aspect in controlling these processes. After the spring freshet, late-lying snow packs on north-facing slopes and thawing permafrost on south-facing slopes are the dominant sources of stream water. Progressively through the season, thawing permafrost and precipitation become the largest contributing sources. The depth of the thawing active layer, and consequently the contribution to the stream, is heavily dependent on aspect. The collection of precipitation, soil and stream isotope samples throughout the summer period provides valuable information for transit time estimates. The combination of spatial and temporal sampling of stable isotopes has revealed clear differences between the main stream sources in the studied catchment and reinforced the importance of slope aspect in these catchments.
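The standard calculation behind this kind of source identification is a two-component tracer mass balance: the fraction of streamflow from one end-member follows directly from the stream and end-member isotope values. The δ2H values in the test are illustrative, not measurements from this catchment.

```python
# Two-component end-member mixing: for a conservative tracer,
# f_a = (C_stream - C_b) / (C_a - C_b), where C_a and C_b are the tracer
# signatures of the two end-members (e.g. snowmelt vs. thaw water).

def end_member_fraction(stream, source_a, source_b):
    """Fraction of source_a in the stream sample, from a single tracer."""
    return (stream - source_b) / (source_a - source_b)
```

Real catchments with more than two sources require additional tracers (one extra tracer per extra end-member), which is one motivation for combining isotopes with other chemistry.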
Liang, Jian; Jiang, Tao; Wei, Shi-Qiang; Lu, Song; Yan, Jin-Long; Wang, Qi-Lei; Gao, Jie
2015-03-01
This study aimed at evaluating the variability of the optical properties, including UV-Vis and fluorescence characteristics, of dissolved organic matter (DOM) from rainwater in the summer and winter seasons. UV-Vis and fluorescence spectroscopy, together with the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model and fire events maps, were used to characterize DOM and investigate its sources and contributions. The results showed that, compared with aquatic and soil DOM, rainwater DOM showed similar spectral characteristics, suggesting DOM in precipitation is also an important contributor to the DOM pool in terrestrial and aquatic systems. The concentrations of DOC in rainwater were 0.88-12.80 mg x L(-1), and the CDOM concentrations were 3.17-21.11 mg x L(-1). Differences between summer and winter DOM samples were significant (P < 0.05). In comparison to summer, DOM samples in winter had lower molecular weight and aromaticity, and also lower humification. Input of DOM in winter was predominantly derived from local and short-distance transport, while non-specific, scattered sources were identified as the main contributors in summer. Although absorption and fluorescence spectroscopy could be used to identify DOM composition and sources, there were obvious differences in spectra and source analysis between rainwater DOM and DOM from other sources. Thus, the classic differentiation between "allochthonous (terrigenous) and autochthonous (authigenic)" DOM is possibly too simple and arbitrary for characterizing DOM in rainwater.
Water sources and mixing in riparian wetlands revealed by tracers and geospatial analysis.
Lessels, Jason S; Tetzlaff, Doerthe; Birkel, Christian; Dick, Jonathan; Soulsby, Chris
2016-01-01
Mixing of waters within riparian zones has been identified as an important influence on runoff generation and water quality. Improved understanding of the controls on the spatial and temporal variability of water sources, and how they mix in riparian zones, is therefore of both fundamental and applied interest. In this study, we have combined topographic indices derived from a high-resolution Digital Elevation Model (DEM) with repeated, spatially high-resolution synoptic sampling of multiple tracers to investigate such dynamics of source water mixing. We use geostatistics to estimate concentrations of three different tracers (deuterium, alkalinity, and dissolved organic carbon) across an extended riparian zone in a headwater catchment in NE Scotland, to identify spatial and temporal influences on the mixing of source waters. The various biogeochemical tracers and stable isotopes helped constrain the sources of runoff and their temporal dynamics. Results show that spatial variability in all three tracers was evident in all sampling campaigns, but more pronounced in warmer, drier periods. The extent of mixing areas within the riparian area reflected strong hydroclimatic controls and showed large degrees of expansion and contraction that were not strongly related to topographic indices. The integrated approach of using multiple tracers, geospatial statistics, and topographic analysis allowed us to classify three main riparian source areas and mixing zones. This study underlines the importance of riparian zones for mixing soil water and groundwater and introduces a novel approach by which this mixing can be quantified and its effect on downstream chemistry assessed.
Mapping the spatio-temporal risk of lead exposure in apex species for more effective mitigation
Mateo-Tomás, Patricia; Olea, Pedro P.; Jiménez-Moreno, María; Camarero, Pablo R.; Sánchez-Barbudo, Inés S.; Rodríguez Martín-Doimeadios, Rosa C.; Mateo, Rafael
2016-01-01
Effective mitigation of the risks posed by environmental contaminants for ecosystem integrity and human health requires knowing their sources and spatio-temporal distribution. We analysed exposure to lead (Pb) in the griffon vulture Gyps fulvus, an apex species valuable as a biomonitoring sentinel. We determined the vultures' lead exposure and its main sources by combining isotope signatures and modelling analyses of 691 bird blood samples collected over 5 years. We made yearlong, spatially explicit predictions of the species' risk of lead exposure. Our results highlight elevated lead exposure of griffon vultures (i.e. 44.9% of the studied population, approximately 15% of the European population, showed blood lead levels above 200 ng ml−1), partly owing to environmental lead (e.g. geological sources). These exposures to environmental lead of geological origin increased in those vultures exposed to point sources (e.g. lead-based ammunition). These spatial models and pollutant risk maps are powerful tools that identify areas of wildlife exposure to potentially harmful sources of lead that could affect ecosystem and human health. PMID:27466455
Noise analysis for CCD-based ultraviolet and visible spectrophotometry.
Davenport, John J; Hodgkinson, Jane; Saffell, John R; Tatam, Ralph P
2015-09-20
We present the results of a detailed analysis of the noise behavior of two CCD spectrometers in common use, an AvaSpec-3648 CCD UV spectrometer and an Ocean Optics S2000 Vis spectrometer. Light sources used include a deuterium UV/Vis lamp and UV and visible LEDs. Common noise phenomena include source fluctuation noise, photoresponse nonuniformity, dark current noise, fixed pattern noise, and read noise. These were identified and characterized by varying light source, spectrometer settings, or temperature. A number of noise-limiting techniques are proposed, demonstrating a best-case spectroscopic noise equivalent absorbance of 3.5×10(-4) AU for the AvaSpec-3648 and 5.6×10(-4) AU for the Ocean Optics S2000 over a 30 s integration period. These techniques can be used on other CCD spectrometers to optimize performance.
Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.
Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N
2007-12-07
A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
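The DRASTIC index referred to above is a weighted sum of ratings for seven standard hydrogeological factors. The following is a minimal sketch using the conventional weights only; the paper's modified methodology and its specific ratings are not reproduced here, and all function and variable names are illustrative:

```python
# Conventional DRASTIC weights for the seven hydrogeological factors
# (the acronym's capital letters). Ratings are site-specific, 1-10.
DRASTIC_WEIGHTS = {
    "Depth to water": 5,
    "net Recharge": 4,
    "Aquifer media": 3,
    "Soil media": 2,
    "Topography": 1,
    "Impact of vadose zone": 5,
    "hydraulic Conductivity": 3,
}

def drastic_index(ratings):
    """Weighted sum of factor ratings: higher index = higher intrinsic vulnerability."""
    return sum(DRASTIC_WEIGHTS[f] * r for f, r in ratings.items())

# Toy example: uniform mid-range ratings of 5 for every factor.
ratings = {factor: 5 for factor in DRASTIC_WEIGHTS}
print(drastic_index(ratings))  # 115 (sum of weights 23, times rating 5)
```

A vulnerability map is then produced by evaluating this index cell by cell over the study area grid.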
Atmospheric mercury (Hg) in the Adirondacks: Concentrations and sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyun-Deok Choi; Thomas M. Holsen; Philip K. Hopke
2008-08-15
Hourly averaged gaseous elemental Hg (GEM) concentrations and hourly integrated reactive gaseous Hg (RGM) and particulate Hg (HgP) concentrations in the ambient air were measured at Huntington Forest in the Adirondacks, New York, from June 2006 to May 2007. The average concentrations of GEM, RGM, and HgP were 1.4 ± 0.4 ng m(-3), 1.8 ± 2.2 pg m(-3), and 3.2 ± 3.7 pg m(-3), respectively. RGM represents <3.5% of total atmospheric Hg or total gaseous Hg (TGM: GEM + RGM), and HgP represents <3.0% of the total atmospheric Hg. The highest mean concentrations of GEM, RGM, and HgP were measured during winter and summer, whereas the lowest mean concentrations were measured during spring and fall. Significant diurnal patterns were apparent in warm seasons for all species, whereas diurnal patterns were weak in cold seasons. RGM was better correlated with ozone concentration and temperature in warm seasons than the other species. Potential source contribution function (PSCF) analysis was applied to identify possible Hg sources. This method identified areas in Pennsylvania, West Virginia, Ohio, Kentucky, Texas, Indiana, and Missouri, which coincided well with sources reported in a 2002 U.S. mercury emissions inventory. 51 refs., 7 figs., 1 tab.
Lane, Katie; Derbyshire, Emma; Li, Weili; Brennan, Charles
2014-01-01
Presently alpha-linolenic acid (ALA) is the most widely used vegetarian LC3PUFA, but only marginal amounts are converted into eicosapentaenoic (EPA) and docosahexaenoic acid (DHA); both of which are strongly related to human health. Currently, fish oils represent the most prominent dietary sources of EPA and DHA; however, these are unsuitable for vegetarians. Alternative sources include flaxseed, echium, walnut, and algal oil but their conversion to EPA and DHA must be considered. The present systematic review sets out to collate information from intervention studies examining the bioavailability of alternative vegetarian long chain omega-3 (n-3) polyunsaturated fatty acids (LC3PUFA) sources. Ten key papers published over the last 10 years were identified with seven intervention studies reporting that ALA from nut and seed oils was not converted to DHA at all. Three studies showed that ingestion of micro-algae oil led to significant increases in blood erythrocyte and plasma DHA. Further work is now needed to identify optimal doses of alternative vegetarian LC3PUFAs and how these can be integrated within daily diets. The potential role of algal oils appears to be particularly promising and an area in which further research is warranted.
Lindblad, Anne S; Manukyan, Zorayr; Purohit-Sheth, Tejashri; Gensler, Gary; Okwesili, Paul; Meeker-O'Connell, Ann; Ball, Leslie; Marler, John R
2014-04-01
Site monitoring and source document verification account for 15%-30% of clinical trial costs. An alternative is to streamline site monitoring to focus on correcting trial-specific risks identified by central data monitoring. This risk-based approach could preserve or even improve the quality of clinical trial data and human subject protection compared to site monitoring focused primarily on source document verification. To determine whether a central review by statisticians using data submitted to the Food and Drug Administration (FDA) by clinical trial sponsors can identify problem sites and trials that failed FDA site inspections. An independent Analysis Center (AC) analyzed data from four anonymous new drug applications (NDAs) where FDA had performed site inspections overseen by FDA's Office of Scientific Investigations (OSI). FDA team members in the OSI chose the four NDAs from among all NDAs with data in Study Data Tabulation Model (SDTM) format. Two of the NDAs had data that OSI had deemed unreliable in support of the application after FDA site inspections identified serious data integrity problems. The other two NDAs had clinical data that OSI deemed reliable after site inspections. At the outset, the AC knew only that the experimental design specified two NDAs with significant problems. FDA gave the AC no information about which NDAs had problems, how many sites were inspected, or how many were found to have problems until after the AC analysis was complete. The AC evaluated randomization balance, enrollment patterns, study visit scheduling, variability of reported data, and last digit reference. The AC classified sites as 'High Concern', 'Moderate Concern', 'Mild Concern', or 'No Concern'. The AC correctly identified the two NDAs with data deemed unreliable by OSI. 
In addition, central data analysis correctly identified 5 of 6 (83%) sites for which FDA recommended rejection of data and 13 of 15 sites (87%) for which any regulatory deviations were identified during inspection. Of the six sites for which OSI reviewed inspections and found no deviations, the central process flagged four at the lowest level of concern, one at a moderate level, and one was not flagged. Central data monitoring during the conduct of a trial while data checking was in progress was not evaluated. Systematic central monitoring of clinical trial data can identify problems at the same trials and sites identified during FDA site inspections. Central data monitoring in conjunction with an overall monitoring process that adapts to identify risks as a trial progresses has the potential to reduce the frequency of site visits while increasing data integrity and decreasing trial costs compared to processes that are dependent primarily on source documentation.
An integrated approach using high time-resolved tools to study the origin of aerosols.
Di Gilio, A; de Gennaro, G; Dambruoso, P; Ventrella, G
2015-10-15
Long-range transport of natural and/or anthropogenic particles can contribute significantly to PM10 and PM2.5 concentrations, and some European cities often fail to comply with PM daily limit values due to the additional impact of particles from remote sources. For this reason, reliable methodologies to identify long-range transport (LRT) events would be useful to better understand air pollution phenomena and support proper decision-making. This study explores the potential of an integrated, high time-resolved monitoring approach for the identification and characterization of local, regional and long-range transport events of high PM. In particular, a further goal of this work was the identification of time-limited events. For this purpose, a high time-resolved monitoring campaign was carried out at an urban background site in Bari (southern Italy) for about 20 days (1st-20th October 2011). The integration of the collected data, such as the hourly measurements of inorganic ions in PM2.5 and their gas precursors and of natural radioactivity, with analyses of aerosol maps and hourly back trajectories (BT), provided useful information for the identification and chemical characterization of local sources and trans-boundary intrusions. Non-sea-salt (nss) sulfate levels were found to increase when air masses came from northeastern Europe and higher dispersive conditions of the atmosphere were detected. Instead, higher nitrate and lower nss-sulfate concentrations were registered during air mass stagnation and attributed to local traffic sources. In some cases, combinations of local and trans-boundary sources were observed. 
Finally, statistical investigations such as principal component analysis (PCA) applied to the hourly ion concentrations and cluster analyses, together with the Potential Source Contribution Function (PSCF) and Concentration Weighted Trajectory (CWT) models computed on hourly back-trajectories, completed the picture and confirmed the influence of aerosol transported from heavily polluted areas on the receptor site.
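PSCF, used both in this study and in the Adirondacks mercury record above, assigns each grid cell (i,j) the ratio m_ij/n_ij, where n_ij counts all back-trajectory endpoints falling in the cell and m_ij counts the endpoints of trajectories whose receptor concentration exceeded a chosen threshold. A minimal sketch, assuming a simple lat/lon grid; the grid size, threshold, and all names are illustrative, not from either paper:

```python
from collections import defaultdict

def pscf(trajectories, threshold, cell_size=1.0):
    """trajectories: list of (receptor_concentration, [(lat, lon), ...]) pairs.
    Returns {(i, j): m_ij / n_ij} over all cells visited by any trajectory."""
    n = defaultdict(int)  # all endpoints per cell
    m = defaultdict(int)  # endpoints of "polluted" trajectories per cell
    for conc, endpoints in trajectories:
        polluted = conc > threshold
        for lat, lon in endpoints:
            cell = (int(lat // cell_size), int(lon // cell_size))
            n[cell] += 1
            if polluted:
                m[cell] += 1
    return {cell: m[cell] / n[cell] for cell in n}

# Toy example: two trajectories crossing the same cell, one associated
# with a high receptor concentration, so that cell scores 1/2.
trajs = [(12.0, [(50.2, 10.1)]), (3.0, [(50.7, 10.3)])]
print(pscf(trajs, threshold=5.0))  # -> {(50, 10): 0.5}
```

Operationally, PSCF studies often also down-weight cells with few endpoints to avoid spurious high ratios.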
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palchak, David; Cochran, Jaquelin; Deshmukh, Ranjit
The use of renewable energy (RE) sources, primarily wind and solar generation, is poised to grow significantly within the Indian power system. The Government of India has established an installed capacity target of 175 gigawatts (GW) of RE by 2022 that includes 60 GW of wind and 100 GW of solar, up from current capacities of 29 GW wind and 9 GW solar. India's contribution to global efforts on climate mitigation extends this ambition to 40% non-fossil-based generation capacity by 2030. Global experience demonstrates that power systems can integrate wind and solar at this scale; however, evidence-based planning is important to achieve wind and solar integration at least cost. The purpose of this analysis is to evaluate the operation of India's power grid with 175 GW of RE in order to identify potential cost and operational concerns and the actions needed to efficiently integrate this level of wind and solar generation.
Establishment of the Northeast Coastal Watershed Geospatial Data Network (NECWGDN)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hannigan, Robyn
The goals of NECWGDN were to establish integrated geospatial databases that interfaced with existing open-source environmental data server technologies (e.g., HydroDesktop) and included ecological and human data to enable evaluation, prediction, and adaptation in coastal environments to climate- and human-induced threats to the coastal marine resources within the Gulf of Maine. We have completed the development and testing of a "test bed" architecture that is compatible with HydroDesktop and have identified key metadata structures that will enable seamless integration and delivery of environmental, ecological, and human data as well as models to predict threats to end-users. Uniquely, this database integrates point as well as model data and so offers capacities to end-users that are unique among databases. Future efforts will focus on the development of integrated environmental-human dimension models that can serve, in near real time, visualizations of threats to coastal resources and habitats.
BacillOndex: an integrated data resource for systems and synthetic biology.
Misirli, Goksel; Wipat, Anil; Mullen, Joseph; James, Katherine; Pocock, Matthew; Smith, Wendy; Allenby, Nick; Hallinan, Jennifer S
2013-04-10
BacillOndex is an extension of the Ondex data integration system, providing a semantically annotated, integrated knowledge base for the model Gram-positive bacterium Bacillus subtilis. This application allows a user to mine a variety of B. subtilis data sources, and analyse the resulting integrated dataset, which contains data about genes, gene products and their interactions. The data can be analysed either manually, by browsing using Ondex, or computationally via a Web services interface. We describe the process of creating a BacillOndex instance, and describe the use of the system for the analysis of single nucleotide polymorphisms in B. subtilis Marburg. The Marburg strain is the progenitor of the widely-used laboratory strain B. subtilis 168. We identified 27 SNPs with predictable phenotypic effects, including genetic traits for known phenotypes. We conclude that BacillOndex is a valuable tool for the systems-level investigation of, and hypothesis generation about, this important biotechnology workhorse. Such understanding contributes to our ability to construct synthetic genetic circuits in this organism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, C. S.; Gaensler, B. M.; Feain, I. J., E-mail: craiga@physics.usyd.edu.au
We present a broadband polarization analysis of 36 discrete polarized radio sources over a very broad, densely sampled frequency band. Our sample was selected on the basis of polarization behavior apparent in narrowband archival data at 1.4 GHz: half the sample shows complicated frequency-dependent polarization behavior (i.e., Faraday complexity) at these frequencies, while half shows comparatively simple behavior (i.e., they appear Faraday simple). We re-observed the sample using the Australia Telescope Compact Array in full polarization, with 6 GHz of densely sampled frequency coverage spanning 1.3-10 GHz. We have devised a general polarization modeling technique that allows us to identify multiple polarized emission components in a source, and to characterize their properties. We detect Faraday complex behavior in almost every source in our sample. Several sources exhibit particularly remarkable polarization behavior. By comparing our new and archival data, we have identified temporal variability in the broadband integrated polarization spectra of some sources. In a number of cases, the characteristics of the polarized emission components, including the range of Faraday depths over which they emit, their temporal variability, spectral index, and the linear extent of the source, allow us to argue that the spectropolarimetric data encode information about the magneto-ionic environment of active galactic nuclei themselves. Furthermore, the data place direct constraints on the geometry and magneto-ionic structure of this material. We discuss the consequences of restricted frequency bands on the detection and interpretation of polarization structures, and the implications for upcoming spectropolarimetric surveys.
Distributed design approach in persistent identifiers systems
NASA Astrophysics Data System (ADS)
Golodoniuc, Pavel; Car, Nicholas; Klump, Jens
2017-04-01
The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. 
To achieve large-scale decentralisation, we propose using Distributed Hash Tables (DHT), Peer Exchange networks (PEX), Magnet Links, and peer-to-peer (P2P) file sharing networks - the technologies that enable applications such as BitTorrent (Wu et al., 2010). The proposed approach introduces reliable information replication and caching mechanisms, eliminating the need for a central PID data store, and increases overall system fault tolerance due to the lack of a single point of failure. The proposed PID system's design aims to ensure trustworthiness of the system and incorporates important aspects of governance, such as the notion of the authoritative source, data integrity, caching, and data replication control.
A two-channel, spectrally degenerate polarization entangled source on chip
NASA Astrophysics Data System (ADS)
Sansoni, Linda; Luo, Kai Hong; Eigner, Christof; Ricken, Raimund; Quiring, Viktor; Herrmann, Harald; Silberhorn, Christine
2017-12-01
Integrated optics provides the platform for the experimental implementation of highly complex and compact circuits for quantum information applications. In this context, integrated waveguide sources represent a powerful resource for the generation of quantum states of light due to their high brightness and stability. However, the confinement of the light in a single spatial mode limits the realization of multi-channel sources. Owing to this challenge, one of the most widely adopted sources in quantum information processing, namely a source that generates spectrally indistinguishable polarization-entangled photons in two different spatial modes, has not yet been realized in a fully integrated platform. Here we overcome this limitation by suitably engineering two periodically poled waveguides and an integrated polarization splitter in lithium niobate. This source produces polarization-entangled states with a fidelity of F = 0.973 ± 0.003, and a test of Bell's inequality results in a violation larger than 14 standard deviations. It can work in both pulsed and continuous-wave regimes. This device represents a new step toward the implementation of fully integrated circuits for quantum information applications.
NASA Astrophysics Data System (ADS)
Neill, Aaron; Tetzlaff, Doerthe; Strachan, Norval; Hough, Rupert; Soulsby, Chris
2016-04-01
In order to comply with legislation such as the Water Framework Directive and to safeguard public health, there is a critical need to maintain the quality of water sources that are used to supply drinking water. Private water supplies (PWS) are still common in many rural areas in the UK, and are especially vulnerable to poor water quality, owing to the limited treatment they often receive and variable raw water quality in groundwater and surface water sources. A significant issue affecting PWS quality is contamination by faecal pathogens derived from grazing animals or agricultural practices. In Scotland, approximately 20,000 PWS serve around 200,000 people, with a number of these PWS consistently failing to meet water quality targets relating to coliform bacteria and E. coli, both of which can be indicative of faecal contamination (faecal indicator organisms - FIOs). The purpose of our study was to employ integrated empirical and modelling approaches from hydrology and microbiology to elucidate the nature of the still poorly understood interplay between the hydrological flow pathways that connect sources of pathogens to PWS sources, antecedent conditions, seasonality and pathogen transfer risk, for two catchments with contrasting land uses in Scotland: an agricultural catchment (Tarland Burn) and a montane catchment (Bruntland Burn). In the Tarland Burn, 15 years of spatially distributed, catchment-scale samples of FIO counts were analysed alongside hydrometric data to identify "hot spots" of faecal pathogen transfer risk and possible spatial and temporal controls. We also used a combination of tracer-based and numerical modelling approaches to identify the relationship between hydrological connectivity, flow pathways, and the mobilisation of faecal pathogens from different sources. 
In the Bruntland Burn, we coupled a pathogen storage, mobilisation and transport scheme to a previously developed tracer-informed hydrological model for the catchment to investigate temporal patterns and controls of pathogen transfer risk from different hydrological source areas identified from extensive past tracer and numerical modelling work: groundwater, hillslopes and the dynamic riparian zone.
Incorporation of Novel MRI and Biomarkers into Prostate Cancer Active Surveillance Risk Assessment
2016-09-01
that are integral in the preparation for a career in clinical research. These goals are 1) training in T1 translational research; 2) training in the...Texas. In addition to basic mentoring tasks, we have identified four sources of mentorship in particular to my career, which include opportunities at...clinical schedule for protected time to ensure I meet the goals of my grant and career development. Research: We have successfully implemented
Integration of an Apple II Plus Computer into an Existing Dual Axis Sun Tracker System.
1984-06-01
Keywords: Sun Tracker System; Solar Energy; Apple II Plus Computer. [Figure list: Dual Axis Sun Tracker (Side View); Solar Tracker System Block Diagram; Plug Wiring Diagram for Top...] sources will be competitive. Already many homes have solar collectors and other devices designed to decrease the consumption of gas, oil, and
Survey of Potential Radio Frequency Interference Sources.
1980-05-13
[Recovered fragments: resolution - 121 km; ice field maps, resolution - 21 km; measurement of integrated atmospheric water vapor and liquid matter in a column along the...] frequency allocation matters. 3. Enclosure (2) reports a telecon with Mr. William Shaffer of NASA. The status report contains the results of decisions...have been identified; they exist in the bands 1.215-1.30, 3.1-3.3, 5.25-5.35, and 9.5-9.8 MHz. The matter of satisfying these requirements remains under
The Impact of Biogenic and Anthropogenic Atmospheric Aerosol on Climate in Egypt
NASA Astrophysics Data System (ADS)
Ibrahim, A. I.; Zakey, A.; Steiner, A. L.; Shokr, M. E.; El-Raey, M.; Ahmed, Y.; Al-Hadidi, A.; Zakey, A.
2014-12-01
Aerosols are indicators of air quality, as they reduce visibility and adversely affect public health. Aerosol optical depth (AOD) is a measure of radiation extinction due to the interaction of radiation with aerosol particles in the atmosphere. Using this optical measure of atmospheric aerosols, we explore the seasonal and annual patterns of aerosols from both anthropogenic and biogenic sources over Egypt. Here, we use an integrated environment-climate-aerosol model in conjunction with an inversion technique to identify the aerosol particle size distribution over different locations in Egypt. The online-integrated Environment-Climate-Aerosol model (EnvClimA), which is based on the International Center for Theoretical Physics Regional Climate Model (ICTP-RegCM), is used to study the emission of different aerosols and their impact on climate parameters in a long-term baseline simulation run over Egypt and North Africa. The global emission inventory is downscaled and remapped over Egypt using local factors such as population, traffic and industrial activities to identify the sources of anthropogenic and biogenic emissions. The results indicate that the dominant natural aerosols over Egypt are dust emissions, which frequently occur during the transitional seasons (spring and autumn). From local observations we identify the number of dust and sand storm occurrences over Egypt. The Multiangle Imaging SpectroRadiometer (MISR) is used to identify the optical characteristics of different types of aerosols over Egypt. Modeled and MISR-observed aerosol optical depth (at 555 nm) are compared from March 2000 through November 2013. The results show that MISR AOD captures the maximum peaks of AOD in March/April that coincide with the Khamasin dust storms. However, peaks in May are due either to photochemical reactions or to anthropogenic activities. 
Note: This presentation is for a Partnerships for Enhanced Engagement in Research (PEER) project sponsored by USAID/NSF/NAS. Project link (at National Academies website): http://sites.nationalacademies.org/PGA/dsc/peerscience/PGA_084046.htm; project website: http://CleanAirEgypt.org
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data has long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if >5% of data variables were incorrect, a second 10% random sample would be extracted from the trial dataset. Each variable was coded as: correct; incorrect (valid or invalid); not recorded; or not entered. Audit-1 had a total error rate of 33% and audit-2 of 36%. The physiological section was the only audit section to have <5% error. Data not recorded on case report forms had the greatest impact on error calculations. A significant association (p = 0.00) was found between audit round (audit-1 vs audit-2) and whether data were deemed correct or incorrect. Our study developed a straightforward method to perform an SDV audit. An audit rule was identified and error coding was implemented. The findings demonstrate that monitoring data quality through an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented for future research.
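The audit rule described above (100% verification on a 10% random sample, with a second sample triggered when more than 5% of variables are in error) can be sketched as follows. The error codes and per-file data layout are assumptions for illustration, not the study's actual implementation:

```python
import random

# Illustrative error codes, loosely following the coding scheme in the text.
ERROR_CODES = {"incorrect_valid", "incorrect_invalid", "not_recorded", "not_entered"}

def sdv_audit(files, sample_frac=0.10, threshold=0.05, seed=42):
    """Verify 100% of variables in a random sample of participant files;
    flag a second audit when the error rate exceeds the threshold."""
    rng = random.Random(seed)
    sample = rng.sample(files, max(1, int(len(files) * sample_frac)))
    variables = [v for f in sample for v in f]
    error_rate = sum(v in ERROR_CODES for v in variables) / len(variables)
    return error_rate, error_rate > threshold

# Every file identical here, so the sampled error rate is exactly 30%.
rate, needs_second_audit = sdv_audit(
    [["correct"] * 7 + ["not_recorded"] * 3 for _ in range(20)])
```

With a 30% error rate the rule fires, mirroring the study's finding that both audits (33% and 36%) exceeded the 5% threshold.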
NASA Astrophysics Data System (ADS)
Zare, Fateme; Elsawah, Sondoss; Iwanaga, Takuya; Jakeman, Anthony J.; Pierce, Suzanne A.
2017-09-01
There are substantial challenges facing humanity in the water and related sectors and purposeful integration of the disciplines, connected sectors and interest groups is now perceived as essential to address them. This article describes and uses bibliometric analysis techniques to provide quantitative insights into the general landscape of Integrated Water Resource Assessment and Modelling (IWAM) research over the last 45 years. Keywords, terms in titles, abstracts and the full texts are used to distinguish the 13,239 IWAM articles in journals and other non-grey literature. We identify the major journals publishing IWAM research, influential authors through citation counts, as well as the distribution and strength of source countries. Encouragingly, we find that the growth in numbers of such publications has continued to accelerate, and attention to both the biophysical and socioeconomic aspects has also been growing. On the other hand, our analysis strongly indicates that the former continue to dominate, partly by embracing integration with other biophysical sectors related to water - environment, groundwater, ecology, climate change and agriculture. In the social sciences the integration is occurring predominantly through economics, with the others, including law, policy and stakeholder participation, much diminished in comparison. We find there has been increasing attention to management and decision support systems, but a much weaker focus on uncertainty, a pervasive concern whose criticalities must be identified and managed for improving decision making. It would seem that interdisciplinary science still has a long way to go before crucial integration with the non-economic social sciences and uncertainty considerations are achieved more routinely.
NASA Astrophysics Data System (ADS)
Silva, K. M.; Flagey, N.; Noriega-Crespo, A.; Carey, S.; Ingallinera, A.
2017-03-01
We present Very Large Telescope/Spectrograph for INtegral Field Observations in the Near Infrared H- and K-band spectra of potential central stars within the inner 8″-by-8″ regions of 55 MIPSGAL “bubbles” (MBs), sub-arcminute circumstellar shells discovered in the mid-IR survey of the Galactic plane with Spitzer/MIPS. At magnitudes brighter than 15, we detect a total of 230 stars in the K band and 179 stars in the H band. We spectrally identify 145 stars in all but three MBs, with average magnitudes of 13.8 and 12.7, respectively, using spectral libraries and previous studies of near-IR stellar spectra. We also use tabulated intrinsic stellar magnitudes and colors to derive distances and extinction values, and to better constrain the classifications of the stars. We reliably identify the central sources for 21 of the 55 MBs, which we classify as follows: one Wolf-Rayet, three luminous blue variable candidates, four early-type (O to F), and 15 late-type (G to M) stars. The 21 central sources are, on average, one magnitude fainter than those in the most recent study of MBs, and we notice a significant drop in the fraction of massive star candidates. For the 34 remaining MBs in our sample, we are unable to identify the central sources due to confusion, low spectroscopic signal-to-noise ratio, and/or lack of detections in the images near the centers of the bubbles. We discuss how our findings compare with previous studies and, for the most part, support the trend between the shells’ mid-IR morphologies and the spectral types of their central sources.
NASA Astrophysics Data System (ADS)
Estima, Jacinto Paulo Simoes
Traditional geographic information has been produced by mapping agencies and corporations, using highly skilled people as well as expensive precision equipment and procedures, in a very costly approach. The production of land use and land cover databases is just one example of such a traditional approach. On the other hand, the amount of geographic information created and shared by citizens through the Web has been increasing exponentially during the last decade, resulting from the emergence and popularization of technologies such as Web 2.0, cloud computing, GPS, and smartphones, among others. Such a comprehensive amount of free geographic data may contain valuable information, opening great possibilities to significantly improve the production of land use and land cover databases. In this thesis we explored the feasibility of using geographic data from different user-generated spatial content initiatives in the process of land use and land cover database production. Data from Panoramio, Flickr and OpenStreetMap were explored in terms of their spatial and temporal distribution, and their distribution over the different land use and land cover classes. We then proposed a conceptual model to integrate data from suitable user-generated spatial content initiatives, based on identified dissimilarities among a comprehensive list of initiatives. Finally, we developed a prototype implementing the proposed integration model, which was then validated by using the prototype to solve four identified use cases. We concluded that data from user-generated spatial content initiatives have great value but should be integrated to increase their potential. The feasibility of integrating data from such initiatives through an integration model was demonstrated, and, using the developed prototype, the relevance of the integration model was also shown for different use cases.
Plagiarism in nursing education: an integrative review.
Lynch, Joan; Everett, Bronwyn; Ramjan, Lucie M; Callins, Renee; Glew, Paul; Salamonson, Yenna
2017-10-01
To identify the prevalence and antecedents of plagiarism within nursing education and approaches to prevention and management. There has been growing media attention highlighting the prevalence of plagiarism in universities, including the academic integrity of undergraduate nursing students. A breach of academic integrity among nursing students also raises further concern with the potential transfer of this dishonest behaviour to the clinical setting. Integrative review. A systematic search of five electronic databases including CINAHL, MEDLINE, SCOPUS, ProQuest Nursing & Allied Health Source, and ERIC was undertaken. Only primary studies related to plagiarism and nursing students (undergraduate or postgraduate) studying at a tertiary education institution or nursing faculty were included. Both qualitative and quantitative study designs were included. Twenty studies were included in this review with six key themes identified: (1) prevalence; (2) knowledge, understanding and attitudes; (3) types of plagiarism; (4) antecedents to plagiarism; (5) interventions to reduce or prevent plagiarism; and (6) the relationship between academic honesty and professional integrity. Plagiarism is common among university nursing students, with a difference in perception of this behaviour between students and academics. The review also highlighted the importance of distinguishing between inadvertent and deliberate plagiarism, with differing strategies suggested to address this behaviour. Nevertheless, interventions to reduce plagiarism have not been shown to be effective. The current punitive approach to plagiarism within nursing faculties has not reduced its occurrence. There is a need to promote awareness, knowledge and provide students with the appropriate referencing skills, to reduce the significant amount of inadvertent plagiarism. The importance of promoting honesty and academic integrity in nursing education is highlighted. 
Cheating within the academic setting has been associated with dishonesty in the clinical setting, which highlights the importance of nurturing a culture of honesty and integrity at university. © 2016 John Wiley & Sons Ltd.
New Mathematical Functions for Vacuum System Analysis
NASA Technical Reports Server (NTRS)
Woronowicz, Michael S.
2017-01-01
A new bivariate function has been found that provides solutions of integrals having the form ∫ u^(-η) e^u du, which arise when developing predictions for the behavior of pressure within a rigid volume under high vacuum conditions in the presence of venting, as well as sources characterized by power-law transient decay, over the range [0, 1] for η and for u ≥ 0. A few properties of the new function are explored in this work. For instance, the η = 1/2 case reproduces the Dawson function. In addition, a slight variation of the solution technique reproduces the exponential integral for η = 1. The technique used to generate these functions leads to an approach for solving a more general class of nonlinear ordinary differential equations, with the potential for identifying other new functions that solve other integrals.
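As a numerical illustration (not the paper's method) of the η = 1/2 remark: substituting u = t² turns ∫₀ˣ u^(-1/2) e^u du into 2∫₀^√x e^(t²) dt, which equals 2 eˣ F(√x), where F is the Dawson function. A short stdlib-only sketch checks this identity:

```python
import math

def integral_eta_half(x, n=2000):
    """Left-hand side: Simpson's rule on 2*exp(t**2) over [0, sqrt(x)];
    the substitution u = t**2 removes the integrable singularity at u = 0."""
    s = math.sqrt(x)
    h = s / n  # n must be even for composite Simpson
    total = math.exp(0.0) + math.exp(s * s)
    for i in range(1, n):
        t = i * h
        total += (4 if i % 2 else 2) * math.exp(t * t)
    return 2 * (h / 3) * total

def dawson(s, terms=40):
    """Maclaurin series F(s) = sum (-2)^n s^(2n+1) / (2n+1)!!,
    adequate for small |s|."""
    term, total = s, s
    for n in range(terms):
        term *= -2 * s * s / (2 * n + 3)
        total += term
    return total

x = 0.9
lhs = integral_eta_half(x)          # direct quadrature
rhs = 2 * math.exp(x) * dawson(math.sqrt(x))  # Dawson-function form
```

For x = 0.9 the two sides agree to well below 1e-6, consistent with the abstract's statement that the η = 1/2 case reduces to the Dawson function.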
NASA Technical Reports Server (NTRS)
Phillips, T. J.; Semtner, A. J., Jr.
1984-01-01
Anomalies in ocean surface temperature have been identified as possible causes of variations in the climate of particular seasons or as a source of interannual climatic variability, and attempts have been made to forecast seasonal climate by using ocean temperatures as predictor variables. However, the seasonal atmospheric response to ocean temperature anomalies has not yet been systematically investigated with nonlinear models. The present investigation is concerned with ten-year integrations involving a model of intermediate complexity, the Held-Suarez climate model. The calculations have been performed to investigate the changes in seasonal climate which result from a fixed anomaly imposed on a seasonally varying, global ocean temperature field. Part I of the paper provides a report on the results of these decadal integrations. Attention is given to model properties, the experimental design, and the anomaly experiments.
Integration of the EventIndex with other ATLAS systems
NASA Astrophysics Data System (ADS)
Barberis, D.; Cárdenas Zárate, S. E.; Gallas, E. J.; Prokoshin, F.
2015-12-01
The ATLAS EventIndex System, developed for use in LHC Run 2, is designed to index every processed event in ATLAS, replacing the TAG System used in Run 1. Its storage infrastructure, based on the Hadoop open-source software framework, necessitates revamping how information in this system relates to other ATLAS systems. It will store more indexes, since the fundamental mechanisms for retrieving them will be better integrated into all stages of data processing, allowing more events from later stages of processing to be indexed than was possible with the previous system. Connections with other systems (conditions database, monitoring) are critical for assessing dataset completeness, identifying data duplication, and checking data integrity, and also enhance access to information in the EventIndex through user and system interfaces. This paper gives an overview of the ATLAS systems involved and the relevant metadata, and describes the technologies we are deploying to complete these connections.
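A toy sketch (not ATLAS code) of the duplication check described above: index each processed event by a key and flag any key indexed twice at the same processing stage. The record fields and stage names are invented for illustration:

```python
from collections import defaultdict

def build_index(records):
    """Map (run, event) -> list of processing stages seen for it."""
    index = defaultdict(list)
    for rec in records:
        index[(rec["run"], rec["event"])].append(rec["stage"])
    return index

def duplicated(index):
    """An event indexed twice at the same stage signals duplication."""
    return {key for key, stages in index.items()
            if len(stages) != len(set(stages))}

idx = build_index([
    {"run": 1, "event": 7, "stage": "RAW"},
    {"run": 1, "event": 7, "stage": "AOD"},  # same event at a later stage: fine
    {"run": 1, "event": 8, "stage": "RAW"},
    {"run": 1, "event": 8, "stage": "RAW"},  # duplicate at the same stage
])
```

The same index, joined against a conditions database, would support the completeness checks the abstract mentions.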
Seeking Synthesis: The Integrative Problem in Understanding Language and Its Evolution.
Dale, Rick; Kello, Christopher T; Schoenemann, P Thomas
2016-04-01
We discuss two problems for a general scientific understanding of language, sequences and synergies: how language is an intricately sequenced behavior and how language is manifested as a multidimensionally structured behavior. Though both are central in our understanding, we observe that the former tends to be studied more than the latter. We consider very general conditions that hold in human brain evolution and its computational implications, and identify multimodal and multiscale organization as two key characteristics of emerging cognitive function in our species. This suggests that human brains, and cognitive function specifically, became more adept at integrating diverse information sources and operating at multiple levels for linguistic performance. We argue that framing language evolution, learning, and use in terms of synergies suggests new research questions, and it may be a fruitful direction for new developments in theory and modeling of language as an integrated system. Copyright © 2016 Cognitive Science Society, Inc.
Analysis of large system black box verification test data
NASA Technical Reports Server (NTRS)
Clapp, Kenneth C.; Iyer, Ravishankar Krishnan
1993-01-01
Issues regarding black box verification of large systems are explored. The analysis begins by collecting data from several testing teams. An integrated database containing test, fault, repair, and source file information is generated. Intuitive effectiveness measures are generated using conventional black box testing results analysis methods. Conventional analysis methods indicate that the testing was effective in the sense that as more tests were run, more faults were found. Average behavior and individual data points are analyzed. The data are categorized, and average behavior shows a very wide variation in the number of tests run and in pass rates (which ranged from 71 percent to 98 percent). The 'white box' data contained in the integrated database are studied in detail. Conservative measures of effectiveness are discussed. Testing efficiency (ratio of repairs to number of tests) is measured at 3 percent, fault record effectiveness (ratio of repairs to fault records) is measured at 55 percent, and test script redundancy (ratio of number of failed tests to minimum number of tests needed to find the faults) ranges from 4.2 to 15.8. Error-prone source files and subsystems are identified. A correlational mapping of test functional area to product subsystem is completed. A new adaptive testing process based on real-time generation of the integrated database is proposed.
An integrated approach to identify the origin of PM10 exceedances.
Amodio, M; Andriani, E; de Gennaro, G; Demarinis Loiotile, A; Di Gilio, A; Placentino, M C
2012-09-01
This study was aimed at the development of an integrated approach for the characterization of particulate matter (PM) pollution events in the South of Italy. PM(10) and PM(2.5) daily samples were collected from June to November 2008 at an urban background site located in Bari (Puglia Region, South of Italy). Meteorological data, particle size distributions and atmospheric dispersion conditions were also monitored in order to provide information concerning the different features of PM sources. The collected data suggested four indicators for characterizing different PM(10) exceedances: the PM(2.5)/PM(10) ratio, natural radioactivity, aerosol maps with back-trajectory analysis, and particle size distributions. These were considered in order to evaluate the contribution of local anthropogenic sources and to determine the different origins of intrusive air masses coming from long-range transport, such as African dust outbreaks and aerosol particles from Central and Eastern Europe. The obtained results were confirmed by applying principal component analysis to the particle number concentration dataset and by the chemical characterization of the samples (PM(10) and PM(2.5)). The integrated approach to PM study suggested in this paper can support air quality managers in the development of cost-effective control strategies and the application of more suitable risk management approaches.
Sensor Fusion Techniques for Phased-Array Eddy Current and Phased-Array Ultrasound Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowood, Lloyd F.
Sensor (or data) fusion is the process of integrating multiple data sources to produce more consistent, accurate and comprehensive information than is provided by a single data source. Sensor fusion may also be used to combine multiple signals from a single modality to improve the performance of a particular inspection technique. Industrial nondestructive testing may utilize multiple sensors to acquire inspection data, depending upon the object under inspection and the anticipated types of defects that can be identified. Sensor fusion can be performed at various levels of signal abstraction, each having its strengths and weaknesses. A multimodal data fusion strategy, first proposed by Heideklang and Shokouhi, that combines spatially scattered detection locations to improve detection performance for surface-breaking and near-surface cracks in ferromagnetic metals is shown using a surface inspection example and is then extended to volumetric inspections. Utilizing data acquired with an Olympus Omniscan MX2 from both phased-array eddy current and ultrasound probes on test phantoms, single- and multilevel fusion techniques are employed to integrate signals from the two modalities. Preliminary results demonstrate how confidence in defect identification and interpretation benefits from sensor fusion techniques. Lastly, techniques for integrating data into radiographic and volumetric imagery from computed tomography are described and results are presented.
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background: Biology is rapidly becoming a data-intensive, data-driven science. It is essential that data are represented and connected in ways that best represent their full conceptual content and allow both automated integration and data-driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results: This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions: We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario-derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements.
Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
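The multi-relational directed graph at the heart of SDDM can be pictured as a set of RDF-style (subject, predicate, object) triples. This stdlib-only sketch invents a few NDM-1-flavoured nodes and shows the kind of traversal that links a media stream to genetics data; none of the node or predicate names come from the paper:

```python
# Invented NDM-1-style triples; nodes and predicates are illustrative only.
triples = {
    ("NDM-1", "is_a", "resistance_gene"),
    ("NDM-1", "confers_resistance_to", "carbapenems"),
    ("isolate_42", "carries", "NDM-1"),
    ("news_report_7", "mentions", "isolate_42"),
    ("news_report_7", "published_in", "traditional_media"),
}

def objects_of(subject, predicate, graph):
    """Follow one edge type out of a subject node."""
    return {o for (s, p, o) in graph if s == subject and p == predicate}

def reports_on_resistant_isolates(graph):
    """Traversal linking the media stream to the genetics data: reports
    that mention an isolate carrying a known resistance gene."""
    genes = {s for (s, p, o) in graph if p == "is_a" and o == "resistance_gene"}
    isolates = {s for (s, p, o) in graph if p == "carries" and o in genes}
    return {s for (s, p, o) in graph if p == "mentions" and o in isolates}
```

In a real SDDM deployment the triples would live in an RDF store and the traversal would be a SPARQL query, but the set-of-tuples view captures the data model.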
Design structure for in-system redundant array repair in integrated circuits
Bright, Arthur A.; Crumley, Paul G.; Dombrowa, Marc; Douskey, Steven M.; Haring, Rudolf A.; Oakland, Steven F.; Quellette, Michael R.; Strissel, Scott A.
2008-11-25
A design structure for repairing an integrated circuit during operation of the integrated circuit. The integrated circuit comprises a multitude of memory arrays and a fuse box holding control data for controlling the redundancy logic of the arrays. The design structure provides the integrated circuit with a control data selector for passing the control data from the fuse box to the memory arrays; provides a source of alternate control data, external to the integrated circuit; and connects the source of alternate control data to the control data selector. The design structure further passes the alternate control data from its source, through the control data selector, and to the memory arrays to control the redundancy logic of the memory arrays.
Systems and methods for an integrated electrical sub-system powered by wind energy
Liu, Yan [Ballston Lake, NY; Garces, Luis Jose [Niskayuna, NY
2008-06-24
Various embodiments relate to systems and methods related to an integrated electrically-powered sub-system and wind power system including a wind power source, an electrically-powered sub-system coupled to and at least partially powered by the wind power source, the electrically-powered sub-system being coupled to the wind power source through power converters, and a supervisory controller coupled to the wind power source and the electrically-powered sub-system to monitor and manage the integrated electrically-powered sub-system and wind power system.
Hong, Hongwei; Rahal, Mohamad; Demosthenous, Andreas; Bayford, Richard H
2009-10-01
Multi-frequency electrical impedance tomography (MF-EIT) systems require current sources that are accurate over a wide frequency range (up to 1 MHz) and with large load impedance variations. The most commonly employed current source design in EIT systems is the modified Howland circuit (MHC). The MHC requires tight matching of resistors to achieve high output impedance and may suffer from instability over a wide frequency range in an integrated solution. In this paper, we introduce a new integrated current source design in CMOS technology and compare its performance with the MHC. The new integrated design has advantages over the MHC in terms of power consumption and area. The output current and the output impedance of both circuits were determined through simulations and measurements over the frequency range of 10 kHz to 1 MHz. For frequencies up to 1 MHz, the measured maximum variation of the output current for the integrated current source is 0.8%, whereas for the MHC the corresponding value is 1.5%. Although the integrated current source has an output impedance greater than 1 MΩ up to 1 MHz in simulations, in practice the impedance is greater than 160 kΩ up to 1 MHz due to the presence of stray capacitance.
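The practical effect of finite output impedance can be seen with a simple Norton-source calculation: the delivered current is scaled by z_out / (z_out + z_load), so the current variation across loads shrinks as output impedance grows. The load range below is a hypothetical EIT-like range; only the 160 kΩ figure comes from the abstract:

```python
def load_current(i_source, z_out, z_load):
    """Norton model: current divider between output impedance and load."""
    return i_source * z_out / (z_out + z_load)

def percent_variation(i_source, z_out, z_loads):
    """Spread of delivered current across loads, as % of the maximum."""
    currents = [load_current(i_source, z_out, z) for z in z_loads]
    return 100 * (max(currents) - min(currents)) / max(currents)

# Hypothetical load sweep from 100 ohm to 1 kohm at the measured 160 kOhm.
variation = percent_variation(1.0, 160e3, [100, 1000])
```

This back-of-the-envelope variation is below 1%, of the same order as the 0.8% measured for the integrated source; a higher output impedance would shrink it further.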
English, Lacey; Miller, James S; Mbusa, Rapheal; Matte, Michael; Kenney, Jessica; Bwambale, Shem; Ntaro, Moses; Patel, Palka; Mulogo, Edgar; Stone, Geren S
2016-04-29
In Uganda, over half of under-five child mortality is attributed to three infectious diseases: malaria, pneumonia and diarrhoea. Integrated community case management (iCCM) trains village health workers (VHWs) to provide in-home diagnosis and treatment of these common childhood illnesses. For severely ill children, iCCM relies on a functioning referral system to ensure timely treatment at a health facility. However, referral completion rates vary widely among iCCM programmes and are difficult to monitor. The Bugoye Integrated Community Case Management Initiative (BIMI) is an iCCM programme operating in Bugoye sub-county, Uganda. This case study describes BIMI's experience with monitoring referral completion at Bugoye Health Centre III (BHC), and outlines improvements to be made within iCCM referral systems. This study triangulated multiple data sources to evaluate the strengths and gaps in the BIMI referral system. Three quantitative data sources were reviewed: (1) VHW report of referred patients, (2) referral forms found at BHC, and (3) BHC patient records. These data sources were collated and triangulated from January-December 2014. The goal was to determine if patients were completing their referrals and if referrals were adequately documented using routine data sources. From January-December 2014, there were 268 patients referred to BHC, as documented by VHWs. However, only 52 of these patients had referral forms stored at BHC. Of the 52 referral forms found, 22 of these patients were also found in BHC register books recorded by clinic staff. Thus, the study found a mismatch between VHW reports of patient referrals and the referral visits documented at BHC. This discrepancy may indicate several gaps: (1) referred patients may not be completing their referral, (2) referral forms may be getting lost at BHC, and, (3) referred patients may be going to other health facilities or drug shops, rather than BHC, for their referral. 
This study demonstrates the challenges of effectively monitoring iCCM referral completion, given identified limitations such as discordant data sources, incomplete record keeping and lack of unique identifiers. There is a need to innovate and improve the ways by which referral compliance is monitored using routine data, in order to improve the percentage of referrals completed. Through research and field experience, this study proposes programmatic and technological solutions to rectify these gaps within iCCM programmes facing similar challenges. With improved monitoring, VHWs will be empowered to increase referral completion, allowing critically ill children to access needed health services.
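The triangulation described above (VHW reports vs. referral forms vs. facility registers) amounts to set intersection on patient identifiers. A sketch, assuming each source can be reduced to a set of IDs; the real data lacked unique identifiers, which is exactly the limitation the study reports:

```python
def triangulate(vhw_reported, forms_at_facility, register_entries):
    """Compare three routine data sources on (assumed) patient IDs."""
    forms_found = vhw_reported & forms_at_facility
    fully_documented = forms_found & register_entries
    return {
        "referred": len(vhw_reported),
        "forms_found": len(forms_found),
        "fully_documented": len(fully_documented),
        "missing_forms": len(vhw_reported - forms_at_facility),
    }

# Toy IDs; in 2014 the real counts were 268 referred, 52 forms, 22 in registers.
summary = triangulate({"p1", "p2", "p3", "p4", "p5"}, {"p2", "p3", "p9"}, {"p3"})
```

The gap between "referred" and "fully_documented" is the quantity the programme needs to explain: incomplete referrals, lost forms, or visits to other facilities.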
NASA Astrophysics Data System (ADS)
Mohammad, R.; Ramsey, M.; Scheidt, S. P.
2010-12-01
Prior to mineral dust deposition affecting albedo, aerosols can have direct and indirect effects on local to regional scale climate by changing both the shortwave and longwave radiative forcing. In addition, mineral dust causes health hazards, such as respiratory-related illnesses and deaths, loss of agricultural soil, and safety hazards to aviation and motorists due to reduced visibility. Previous work utilized satellite and ground-based TIR data to describe the direct longwave radiative effect of the Saharan Air Layer (SAL) over the Atlantic Ocean originating from dust storms in the Western Sahara. TIR emission spectroscopy was used to identify the spectral absorption features of that dust. The current research focuses on Kuwait and utilizes a comprehensive set of spatial, analytical and geological tools to characterize dust emissions and their radiative effects. Surface mineral composition maps for the Kuwait region were created using ASTER images and GIS datasets in order to identify the possible sources of wind-blown dust. Backward trajectory analysis using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model suggests the dust source areas were located in Iraq, Syria, Jordan and Saudi Arabia. Samples collected from two dust storms (May and July 2010) were analyzed for their mineral composition and to validate the dust source areas identified by the modeling and remote sensing analysis. These air-fall dust samples were collected in glass containers on a 13 meter high rooftop in the suburb of Rumaithiya in Kuwait. Additional samples will be collected to expand the analysis and their chemical compositions will be characterized by a combination of laboratory X-ray fluorescence (XRF), Scanning Electron Microscopy (SEM) and TIR emission spectroscopy.
The overarching objective of this ongoing research is to both characterize the effects of mineral dust on climate as well as establish a predictive tool that can identify dust storm sources and potentially aid in establishing a more accurate prediction and warning system in the Middle East region.
Demonstration of an ethane spectrometer for methane source identification.
Yacovitch, Tara I; Herndon, Scott C; Roscioli, Joseph R; Floerchinger, Cody; McGovern, Ryan M; Agnese, Michael; Pétron, Gabrielle; Kofler, Jonathan; Sweeney, Colm; Karion, Anna; Conley, Stephen A; Kort, Eric A; Nähle, Lars; Fischer, Marc; Hildebrandt, Lars; Koeth, Johannes; McManus, J Barry; Nelson, David D; Zahniser, Mark S; Kolb, Charles E
2014-07-15
Methane is an important greenhouse gas and tropospheric ozone precursor. Simultaneous observation of ethane with methane can help identify specific methane source types. Aerodyne Ethane-Mini spectrometers, employing recently available mid-infrared distributed feedback tunable diode lasers (DFB-TDL), provide 1 s ethane measurements with sub-ppb precision. In this work, an Ethane-Mini spectrometer has been integrated into two mobile sampling platforms, a ground vehicle and a small airplane, and used to measure ethane/methane enhancement ratios downwind of methane sources. Methane emissions with precisely known sources are shown to have ethane/methane enhancement ratios that differ greatly depending on the source type. Large differences between biogenic and thermogenic sources are observed. Variations within thermogenic sources are detected and tabulated. Methane emitters are classified by their expected ethane content. Categories include the following: biogenic (<0.2%), dry gas (1-6%), wet gas (>6%), pipeline grade natural gas (<15%), and processed natural gas liquids (>30%). Regional scale observations in the Dallas/Fort Worth area of Texas show two distinct ethane/methane enhancement ratios bridged by a transitional region. These results demonstrate the usefulness of continuous and fast ethane measurements in experimental studies of methane emissions, particularly in the oil and natural gas sector.
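The source categories above can be read as cut points on the ethane fraction. A sketch of such a classifier; the quoted ranges overlap (e.g. pipeline-grade gas is <15% while wet gas is >6%), so the single boundaries here are a simplification for illustration, not the authors' algorithm:

```python
def classify_methane_source(ethane_pct):
    """Rough source class from ethane content (% relative to methane),
    using simplified, non-overlapping cut points."""
    if ethane_pct < 0.2:
        return "biogenic"
    if ethane_pct < 6:
        return "dry gas"
    if ethane_pct < 30:
        return "wet gas"  # pipeline-grade gas (<15%) also falls in here
    return "processed natural gas liquids"
```

In practice the enhancement ratio is obtained from the slope of correlated ethane and methane time series, not from a single reading.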
Improving Data Catalogs with Free and Open Source Software
NASA Astrophysics Data System (ADS)
Schweitzer, R.; Hankin, S.; O'Brien, K.
2013-12-01
The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well-known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous, or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to crawl, analyze, and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM).
We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor which describes the quality of data catalogs. This rubric, used in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
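A catalog-cleaning crawler of the kind described has to walk THREDDS catalog XML, collecting datasets and following nested catalog references. A minimal sketch of that first step (not the UAF tool itself; `list_datasets` is a hypothetical helper, and only the standard THREDDS InvCatalog and XLink namespaces are assumed):

```python
import xml.etree.ElementTree as ET

THREDDS_NS = "{http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0}"
XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

def list_datasets(catalog_xml):
    """Walk one THREDDS catalog document and return two lists:
    (name, urlPath) pairs for datasets that expose an access path, and
    catalogRef hrefs that a crawler would fetch and recurse into next."""
    root = ET.fromstring(catalog_xml)
    datasets = [
        (ds.get("name"), ds.get("urlPath"))
        for ds in root.iter(THREDDS_NS + "dataset")
        if ds.get("urlPath")  # container datasets without urlPath are skipped
    ]
    refs = [ref.get(XLINK_HREF) for ref in root.iter(THREDDS_NS + "catalogRef")]
    return datasets, refs
```

A real crawler would fetch each returned href over HTTP and repeat, checking each dataset's declared services (OPeNDAP, WCS, WMS) along the way.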
Understanding potential exposure sources of perfluorinated carboxylic acids in the workplace.
Kaiser, Mary A; Dawson, Barbara J; Barton, Catherine A; Botelho, Miguel A
2010-11-01
This paper integrates perspectives from analytical chemistry, environmental engineering, and industrial hygiene to better understand how workers may be exposed to perfluorinated carboxylic acids when handling them in the workplace in order to identify appropriate exposure controls. Due to the dramatic difference in physical properties of the protonated acid form and the anionic form, this family of chemicals provides unique industrial hygiene challenges. Workplace monitoring, experimental data, and modeling results were used to ascertain the most probable workplace exposure sources and transport mechanisms for perfluorooctanoic acid (PFOA) and its ammonium salt (APFO). PFOA is biopersistent and its measurement in the blood has been used to assess human exposure since it integrates exposure from all routes of entry. Monitoring suggests that inhalation of airborne material may be an important exposure route. Transport studies indicated that, under low pH conditions, PFOA, the undissociated (acid) species, actively partitions from water into air. In addition, solid-phase PFOA and APFO may also sublime into the air. Modeling studies determined that contributions from surface sublimation and loss from low pH aqueous solutions can be significant potential sources of workplace exposure. These findings suggest that keeping surfaces clean, preventing accumulation of material in unventilated areas, removing solids from waste trenches and sumps, and maintaining neutral pH in sumps can lower workplace exposures.
The classification of flaring states of blazars
NASA Astrophysics Data System (ADS)
Resconi, E.; Franco, D.; Gross, A.; Costamante, L.; Flaccomio, E.
2009-08-01
Aims: The time evolution of the electromagnetic emission from blazars, in particular high-frequency peaked sources (HBLs), displays irregular activity that has not yet been understood. In this work we report a methodology capable of characterizing the time behavior of these variable objects. Methods: The maximum likelihood blocks (MLB) method is a model-independent estimator that subdivides the light curve into time blocks whose length and amplitude are compatible with states of constant emission rate of the observed source. The MLB method yields the statistical significance of the rate variations and strongly suppresses noise fluctuations in the light curves. We applied the MLB method for the first time to the long-term X-ray light curves (RXTE/ASM) of Mkn 421, Mkn 501, 1ES 1959+650, and 1ES 2155-304, covering more than 10 years of observational data (1996-2007). Using the MLB interpretation of RXTE/ASM data, the integrated time-flux distribution is determined for each source considered. We identify in these distributions the characteristic level, as well as the flaring states, of the blazars. Results: All the distributions show a significant component at negative flux values, most probably caused by an uncertainty in the background subtraction and by intrinsic fluctuations of RXTE/ASM. This effect concerns short time observations in particular. To quantify the probability that the intrinsic fluctuations give rise to a false identification of a flare, we study a population of very faint sources and their integrated time-flux distribution. We determine the duty cycle, i.e. the fraction of time spent in the flaring state, for Mkn 421, Mkn 501, 1ES 1959+650, and 1ES 2155-304. Moreover, we study random coincidences between flares and generic sporadic events such as high-energy neutrinos or flares at other wavelengths.
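The core idea of segmenting a light curve into constant-rate blocks can be conveyed with a deliberately simplified greedy pass. This is not the maximum likelihood blocks estimator of the paper, which optimizes a likelihood over block partitions; it only illustrates collapsing noisy rate points into blocks of compatible constant emission:

```python
def merge_blocks(rates, errors, nsigma=3.0):
    """Greedy toy segmentation of a light curve: sweep the points and
    start a new block whenever a point deviates from the running block
    mean by more than nsigma times its error; otherwise fold it into
    the current block. Returns (start_index, end_index, mean_rate) tuples."""
    blocks = []
    start, total, n = 0, rates[0], 1
    for i in range(1, len(rates)):
        mean = total / n
        if abs(rates[i] - mean) > nsigma * errors[i]:
            blocks.append((start, i - 1, mean))   # close the current block
            start, total, n = i, rates[i], 1      # open a new one
        else:
            total += rates[i]
            n += 1
    blocks.append((start, len(rates) - 1, total / n))
    return blocks
```

Applied to a light curve that jumps from a quiescent to a flaring rate, the sweep returns two blocks, from which a duty cycle could then be read off as the fraction of time covered by high-rate blocks.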
ERIC Educational Resources Information Center
Fry, Michelle L.
2010-01-01
Until recently, few K-12 teachers outside of social studies have integrated primary sources in classroom instruction. Integrating primary sources in educational practice does require an uncommon pedagogical understanding. Addressing this K-12 educator need is the Library of Congress. Recently, the Library implemented a national educator…
Database of potential sources for earthquakes larger than magnitude 6 in Northern California
1996-01-01
The Northern California Earthquake Potential (NCEP) working group, composed of many contributors and reviewers in industry, academia, and government, has pooled its collective expertise and knowledge of regional tectonics to identify potential sources of large earthquakes in northern California. We have created a map and database of active faults, both surficial and buried, that form the basis for the northern California portion of the national map of probabilistic seismic hazard. The database contains 62 potential sources, including fault segments and areally distributed zones. The working group has integrated constraints from broadly based plate tectonic and VLBI models with local geologic slip rates, geodetic strain rates, and microseismicity. Our earthquake source database derives from a scientific consensus that accounts for conflicts in the diverse data. Our preliminary product, as described in this report, brings to light many gaps in the data, including a need for better information on the proportion of deformation in fault systems that is aseismic.
Leveraging contemporary species introductions to test phylogenetic hypotheses of trait evolution.
Lu-Irving, Patricia; Marx, Hannah E; Dlugosch, Katrina M
2018-05-10
Plant trait evolution is a topic of interest across disciplines and scales. Phylogenetic studies are powerful for generating hypotheses about the mechanisms that have shaped plant traits and their evolution. Introduced plants are a rich source of data on contemporary trait evolution. Introductions could provide especially useful tests of a variety of evolutionary hypotheses because the environments selecting on evolving traits are still present. We review phylogenetic and contemporary studies of trait evolution and identify areas of overlap and areas for further integration. Emerging tools which can promote integration include broadly focused repositories of trait data, and comparative models of trait evolution that consider both intra- and interspecific variation. Copyright © 2018 Elsevier Ltd. All rights reserved.
An Open Source Tool for Game Theoretic Health Data De-Identification.
Prasser, Fabian; Gaupp, James; Wan, Zhiyu; Xia, Weiyi; Vorobeychik, Yevgeniy; Kantarcioglu, Murat; Kuhn, Klaus; Malin, Brad
2017-01-01
Biomedical data continues to grow in quantity and quality, creating new opportunities for research and data-driven applications. To realize these activities at scale, data must be shared beyond its initial point of collection. To maintain privacy, healthcare organizations often de-identify data, but they assume worst-case adversaries, inducing high levels of data corruption. Recently, game theory has been proposed to account for the incentives of data publishers and recipients (who attempt to re-identify patients), but this perspective has been more hypothetical than practical. In this paper, we report on a new game theoretic data publication strategy and its integration into the open source software ARX. We evaluate our implementation with an analysis of the relationship between data transformation, utility, and efficiency for over 30,000 demographic records drawn from the U.S. Census Bureau. The results indicate that our implementation is scalable and can be combined with various data privacy risk and quality measures.
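The publisher's side of such a game can be caricatured in a few lines: choose the de-identification level that maximizes data utility minus the expected adversary payoff (re-identification probability times the penalty of a successful attack). This is a toy sketch of the game-theoretic trade-off, not ARX's implementation; all names and numbers are illustrative:

```python
def best_generalization(levels, utility, risk, penalty):
    """Pick the generalization level with the highest publisher payoff,
    modeled as data utility minus expected re-identification loss
    (per-record risk times the cost of a successful attack)."""
    return max(levels, key=lambda lvl: utility[lvl] - risk[lvl] * penalty)
```

With made-up numbers, releasing raw data maximizes utility but carries so much risk that a partially generalized release wins once the penalty is factored in.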
Global Solar Magnetology and Reference Points of the Solar Cycle
NASA Astrophysics Data System (ADS)
Obridko, V. N.; Shelting, B. D.
2003-11-01
The solar cycle can be described as a complex interaction of large-scale/global and local magnetic fields. In general, this approach agrees with the traditional dynamo scheme, although there are numerous discrepancies in the details. Integrated magnetic indices introduced earlier are studied over long time intervals, and the epochs of the main reference points of the solar cycles are refined. A hypothesis proposed earlier concerning global magnetometry and the natural scale of the cycles is verified. Variations of the heliospheric magnetic field are determined by both the integrated photospheric i(B r )ph and source surface i(B r )ss indices, however, their roles are different. Local fields contribute significantly to the photospheric index determining the total increase in the heliospheric magnetic field. The i(B r )ss index (especially the partial index ZO, which is related to the quasi-dipolar field) determines narrow extrema. These integrated indices supply us with a “passport” for reference points, making it possible to identify them precisely. A prominent dip in the integrated indices is clearly visible at the cycle maximum, resulting in the typical double-peak form (the Gnevyshev dip), with the succeeding maximum always being higher than the preceding maximum. At the source surface, this secondary maximum significantly exceeds the primary maximum. Using these index data, we can estimate the progression expected for the 23rd cycle and predict the dates of the ends of the 23rd and 24th cycles (the middle of 2007 and December 2018, respectively).
Kirst, Maritt; Im, Jennifer; Burns, Tim; Baker, G. Ross; Goldhar, Jodeme; O'Campo, Patricia; Wojtak, Anne; Wodchis, Walter P
2017-01-01
Abstract Purpose A realist review of the evaluative evidence was conducted on integrated care (IC) programs for older adults to identify key processes that lead to the success or failure of these programs in achieving outcomes such as reduced healthcare utilization, improved patient health, and improved patient and caregiver experience. Data sources International academic literature was searched in 12 indexed, electronic databases and gray literature through internet searches, to identify evaluative studies. Study selection Inclusion criteria included evaluative literature on integrated, long-stay health and social care programs, published between January 1980 and July 2015, in English. Data extraction Data were extracted on the study purpose, period, setting, design, population, sample size, outcomes, and study results, as well as explanations of mechanisms and contextual factors influencing outcomes. Results of data synthesis A total of 65 articles, representing 28 IC programs, were included in the review. Two context-mechanism-outcome configurations (CMOcs) were identified: (i) trusting multidisciplinary team relationships and (ii) provider commitment to and understanding of the model. Contextual factors such as strong leadership that sets clear goals and establishes an organizational culture in support of the program, along with joint governance structures, supported team collaboration and subsequent successful implementation. Furthermore, time to build an infrastructure to implement and flexibility in implementation, emerged as key processes instrumental to success of these programs. Conclusions This review included a wide range of international evidence, and identified key processes for successful implementation of IC programs that should be considered by program planners, leaders and evaluators. PMID:28992156
Sun, Yi; Zhang, Wei; Chen, Yunqin; Ma, Qin; Wei, Jia; Liu, Qi
2016-02-23
Clinical responses to anti-cancer therapies often only benefit a defined subset of patients. Predicting the best treatment strategy hinges on our ability to effectively translate genomic data into actionable information on drug responses. To achieve this goal, we compiled a comprehensive collection of baseline cancer genome data and drug response information derived from a large panel of cancer cell lines. This data set was applied to identify the signature genes relevant to drug sensitivity and resistance by integrating CNVs and the gene expression of cell lines with in vitro drug responses. We present an efficient in silico pipeline for integrating heterogeneous cell line data sources with simultaneous modeling of drug response values across all the drugs and cell lines. Potential signature genes correlated with drug response (sensitive or resistant) in different cancer types were identified. Using signature genes, our collaborative filtering-based drug response prediction model outperformed the 44 algorithms submitted to the DREAM competition on breast cancer cells. The functions of the identified drug response-related signature genes were carefully analyzed at the pathway level and the synthetic lethality level. Furthermore, we validated these signature genes by applying them to the classification of the different subtypes of TCGA tumor samples, and further uncovered their in vivo implications using clinical patient data. Our work may have promise in translating genomic data into customized marker genes relevant to the response of specific drugs for a specific cancer type in individual patients.
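Collaborative filtering over a drugs × cell-lines response matrix is commonly implemented as low-rank matrix factorization: unmeasured responses are predicted from the factor structure learned on measured ones. A bare-bones sketch (not the authors' pipeline; the plain gradient-descent update and all hyperparameters are illustrative):

```python
import numpy as np

def factorize(R, mask, k=1, steps=3000, lr=0.02, reg=0.001, seed=0):
    """Learn drug factors U and cell-line factors V so that U @ V.T
    approximates the observed entries of the response matrix R
    (mask is 1 where a response was measured, 0 where it is missing),
    by full-batch gradient descent with light L2 regularization."""
    rng = np.random.default_rng(seed)
    n_drugs, n_lines = R.shape
    U = 0.1 * rng.standard_normal((n_drugs, k))
    V = 0.1 * rng.standard_normal((n_lines, k))
    for _ in range(steps):
        err = mask * (R - U @ V.T)      # residual on observed entries only
        U += lr * (err @ V - reg * U)
        V += lr * (err.T @ U - reg * V)
    return U @ V.T                       # dense prediction, incl. missing cells
```

On a small rank-1 example with one response held out, the held-out cell is recovered from the row/column structure of the observed responses.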
A Conceptual Model of the Cognitive Processing of Environmental Distance Information
NASA Astrophysics Data System (ADS)
Montello, Daniel R.
I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clayton, James; Shedlock, Daniel; Langeveld, Willem G.J.
In the security and inspection market, there is a push towards highly mobile, reduced-dose active interrogation scanning and imaging systems to allow operation in urban environments. To achieve these goals, the accelerator system design needs to be smaller than existing systems. A smaller radiation exclusion zone may be accomplished through better beam collimation and an integrated x-ray-source/detector-array assembly to allow feedback and control of an intensity-modulated x-ray source. A shaped low-Z target in the x-ray source can be used to generate a more forward-peaked x-ray beam. Electron-beam steering can then be applied to direct the forward-peaked x rays toward areas in the cargo with high attenuation. This paper presents an exploratory study to identify components and upgrades that would be required to meet the desired specifications, as well as the best technical approach to design and build a prototype.
Source Apportionment of Fine Particulate Matter in the North China Plain Based on Air Quality Modeling
NASA Astrophysics Data System (ADS)
Xing, J.; Wu, W.; Chang, X.; Wang, S.; Hao, J.
2016-12-01
Most Chinese cities in the North China Plain are suffering from serious air pollution. To develop regional air pollution control policies, we need to identify the major source contributions to such pollution and to design control policies that are accurate, efficient, and effective. This study used an air quality model with several advanced technologies, including ISAM and ERSM, to assess the source contributions from individual pollutants (incl. SO2, NOx, VOC, NH3, primary PM), sectors (incl. power plants, industry, transportation and domestic), and regions (Beijing, Hebei, Tianjin, and surrounding provinces). The modeling period covers two months in 2012, January and July, representing winter and summer, respectively. The non-linear relationship between air pollutant emissions and air quality will be addressed, and the integrated control of multi-pollutants and multi-regions in China will be suggested.
Ayvaz, M Tamer
2010-09-20
This study proposes a linked simulation-optimization model for solving unknown groundwater pollution source identification problems. In the proposed model, the MODFLOW and MT3DMS packages are used to simulate the flow and transport processes in the groundwater system. These models are then integrated with an optimization model based on the heuristic harmony search (HS) algorithm. In the proposed simulation-optimization model, the locations and release histories of the pollution sources are treated as the explicit decision variables and determined through the optimization model. Also, an implicit solution procedure is proposed to determine the optimum number of pollution sources, which is an advantage of this model. The performance of the proposed model is evaluated on two hypothetical examples for simple and complex aquifer geometries, measurement error conditions, and different HS solution parameter sets. Results indicated that the proposed simulation-optimization model is effective and may be used to solve inverse pollution source identification problems. Copyright (c) 2010 Elsevier B.V. All rights reserved.
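Harmony search itself is compact enough to sketch: candidate solutions (here, any real-valued decision vector, such as encoded source locations and release rates) are improvised from a solution memory and replace the worst member whenever they improve the objective. A generic, illustrative implementation (parameter values are typical defaults from the metaheuristics literature, not those of the paper):

```python
import random

def harmony_search(objective, bounds, memory_size=20, hmcr=0.9, par=0.3,
                   bw=0.05, iterations=2000, seed=1):
    """Minimize objective over box constraints with harmony search:
    each variable of a new candidate is either recalled from the memory
    (rate hmcr), optionally pitch-adjusted within bw of the range (rate
    par), or redrawn uniformly at random; the candidate replaces the
    worst remembered solution if it scores better."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(memory_size)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                x = rng.choice(memory)[d]            # memory consideration
                if rng.random() < par:
                    x += rng.uniform(-bw, bw) * (hi - lo)  # pitch adjustment
            else:
                x = rng.uniform(lo, hi)              # random selection
            new.append(min(hi, max(lo, x)))
        worst = max(range(memory_size), key=scores.__getitem__)
        s = objective(new)
        if s < scores[worst]:
            memory[worst], scores[worst] = new, s
    best = min(range(memory_size), key=scores.__getitem__)
    return memory[best], scores[best]
```

In a real source identification setting, `objective` would run the flow/transport simulation and return the misfit between simulated and observed concentrations; here a simple quadratic bowl stands in for it.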
Multimodal Imaging Using a 11B(d,nγ)12C Source
NASA Astrophysics Data System (ADS)
Nattress, Jason; Rose, Paul; Mayer, Michal; Wonders, Marc; Wilhelm, Kyle; Erickson, Anna; Jovanovic, Igor; Multimodal Imaging; Nuclear Detection (MIND) in Active Interrogation Collaboration
2016-03-01
Detection of shielded special nuclear material (SNM) still remains one of the greatest challenges facing nuclear security, where small signal-to-background ratios result from complex, challenging configurations of practical objects. Passive detection relies on spontaneous radioactive decay, whereas active interrogation (AI) uses external probing radiation to identify and characterize the material. AI provides higher signal intensity, offering a more viable method for SNM detection. New and innovative approaches are needed to overcome specific application constraints, such as limited scanning time. We report on a new AI approach that integrates both neutron and gamma transmission signatures to deduce specific material properties that can be utilized to aid SNM identification. The approach uses a single AI source and a single detector type, an imaging system based on the 11B(d,nγ)12C reaction and an array of eight EJ-309 liquid scintillators, respectively. An integral transmission imaging approach has been employed initially for both neutrons and photons, exploiting the detectors' particle discrimination properties. Representative object images using neutrons and photons will be presented.
Toward a comprehensive, theoretical model of compassion fatigue: An integrative literature review.
Coetzee, Siedine K; Laschinger, Heather K S
2018-03-01
This study was an integrative literature review in relation to compassion fatigue models, appraising these models, and developing a comprehensive theoretical model of compassion fatigue. A systematic search on PubMed, EbscoHost (Academic Search Premier, E-Journals, Medline, PsycINFO, Health Source Nursing/Academic Edition, CINAHL, MasterFILE Premier and Health Source Consumer Edition), gray literature, and manual searches of included reference lists was conducted in 2016. The studies (n = 11) were analyzed, and the strengths and limitations of the compassion fatigue models identified. We further built on these models through the application of the conservation of resources theory and the social neuroscience of empathy. The compassion fatigue model shows that it is not empathy that puts nurses at risk of developing compassion fatigue, but rather a lack of resources, inadequate positive feedback, and the nurse's response to personal distress. By acting on these three aspects, the risk of developing compassion fatigue can be addressed, which could improve the retention of a compassionate and committed nurse workforce. © 2017 John Wiley & Sons Australia, Ltd.
Parallel evolution of Nitric Oxide signaling: Diversity of synthesis & memory pathways
Moroz, Leonid L.; Kohn, Andrea B.
2014-01-01
The origin of NO signaling can be traced back to the origin of life, with large-scale parallel evolution of NO synthases (NOSs). Inducible-like NOSs may be the most basal prototype of all NOSs, and neuronal-like NOSs might have evolved several times from this prototype. Other enzymatic and non-enzymatic pathways for NO synthesis have been discovered using reduction of nitrites, an alternative source of NO. Diverse synthetic mechanisms can co-exist within the same cell, providing a complex NO-oxygen microenvironment tightly coupled with cellular energetics. The dissection of multiple sources of NO formation is crucial in the analysis of complex biological processes such as neuronal integration and learning mechanisms, when NO can act as a volume transmitter within memory-forming circuits. In particular, the molecular analysis of learning mechanisms (most notably in insects and gastropod molluscs) opens conceptually different perspectives to understand the logic of recruiting evolutionarily conserved pathways for novel functions. Giant, uniquely identified cells from Aplysia and related species present unique opportunities for integrative analysis of NO signaling at the single-cell level. PMID:21622160
Hu, Youfan; Yang, Jin; Jing, Qingshen; Niu, Simiao; Wu, Wenzhuo; Wang, Zhong Lin
2013-11-26
An unstable mechanical structure that can self-balance when perturbed is a superior choice for vibration energy harvesting and vibration detection. In this work, a suspended 3D spiral structure is integrated with a triboelectric nanogenerator (TENG) for energy harvesting and sensor applications. The newly designed vertical contact-separation mode TENG has a wide working bandwidth of 30 Hz in the low-frequency range with a maximum output power density of 2.76 W/m(2) on a load of 6 MΩ. The position of an in-plane vibration source was identified by placing TENGs at multiple positions as multichannel, self-powered active sensors, and the location of the vibration source was determined with an error of less than 6%. The magnitude of the vibration is also measured by the output voltage and current signals of the TENG. By integrating the TENG inside a buoy ball, wave energy harvesting at the water surface has been demonstrated and used to power illumination lighting, which shows great potential for applications in marine science and environmental/infrastructure monitoring.
NASA Astrophysics Data System (ADS)
Estrany, Joan; Martinez-Carreras, Nuria
2013-04-01
Tracers have been acknowledged as a useful tool to identify sediment sources, based upon a variety of techniques and chemical and physical sediment properties. Sediment fingerprinting supports the notion that changes in sedimentation rates are not just related to increased/reduced erosion and transport in the same areas, but also to the establishment of different pathways increasing sediment connectivity. The Na Borges is a Mediterranean lowland agricultural river basin (319 km2) where traditional soil and water conservation practices have been applied over millennia to provide effective protection of cultivated land. During the twentieth century, industrialisation and pressure from tourism activities have increased urbanised surfaces, which have impacts on the processes that control streamflow. Within this context, source material sampling in Na Borges focused on obtaining representative samples from potential sediment sources (comprising topsoil; i.e., 0-2 cm) susceptible to mobilisation by water and subsequent routing to the river channel network, while those representing channel bank sources were collected from actively eroding channel margins and ditches. Samples of road dust and of solids from sewage treatment plants were also collected. During two hydrological years (2004-2006), representative suspended sediment samples for use in source fingerprinting studies were collected at four flow gauging stations and at eight secondary sampling points using time-integrating samplers. Likewise, representative bed-channel sediment samples were obtained using the resuspension approach at eight sampling points in the main stem of the Na Borges River. These deposits represent the fine sediment temporarily stored in the bed channel and were also used for tracing source contributions. A total of 102 individual time-integrated sediment samples, 40 bulk samples and 48 bed-sediment samples were collected.
Upon return to the laboratory, source material samples were oven-dried at 40° C, disaggregated using a pestle and mortar, and dry sieved to
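Once tracer signatures are in hand, source apportionment reduces to an unmixing problem: find source proportions whose weighted tracer signatures best reproduce the suspended sediment sample. For two sources this has a closed-form least-squares solution; the sketch below is illustrative only (real fingerprinting studies use many sources, tracer weighting, and uncertainty analysis):

```python
def unmix_two_sources(source_a, source_b, mixture):
    """Least-squares estimate of the proportion w of source A in a
    sediment mixture, assuming each tracer i in the mixture equals
    w * a_i + (1 - w) * b_i; w is clipped to the physical range [0, 1]."""
    num = sum((c - b) * (a - b) for a, b, c in zip(source_a, source_b, mixture))
    den = sum((a - b) ** 2 for a, b in zip(source_a, source_b))
    w = num / den
    return min(1.0, max(0.0, w))
```

For example, a suspended sample whose two tracer concentrations sit 70% of the way from the channel-bank signature to the topsoil signature yields w = 0.7 for topsoil.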
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-28
...--Experimental Aircraft Association ELT--Emergency Locator Transmitter ES--Extended Squitter EUROCAE--European...--Security Certification and Accreditation Procedures SDA--System Design Assurance SIL--Source Integrity.... Surveillance Integrity Level 6. Source Integrity Level (SIL) and System Design Assurance (SDA) 7. Secondary...
Integration of Landsat, Seasat, and other geo-data sources
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Blackwell, R. J.; Stromberg, W. D.
1979-01-01
The paper discusses integration of Landsat, Seasat, and other geographic information sources. Mosaicking of radar data and registration of radar to Landsat digital imagery are described, and six types of geophysical data, including gravity and magnetic measurements, are integrated and analyzed using image processing techniques.
Recent Advances in Fiber Lasers for Nonlinear Microscopy
Xu, C.; Wise, F. W.
2013-01-01
Nonlinear microscopy techniques developed over the past two decades have provided dramatic new capabilities for biological imaging. The initial demonstrations of nonlinear microscopies coincided with the development of solid-state femtosecond lasers, which continue to dominate applications of nonlinear microscopy. Fiber lasers offer attractive features for biological and biomedical imaging, and recent advances are leading to high-performance sources with the potential for robust, inexpensive, integrated instruments. This article discusses recent advances, and identifies challenges and opportunities for fiber lasers in nonlinear bioimaging. PMID:24416074
Identification of everyday objects on the basis of Gaborized outline versions
Sassi, Michaël; Vancleef, Kathleen; Machilsen, Bart; Panis, Sven; Wagemans, Johan
2010-01-01
Using outlines derived from a widely used set of line drawings, we created stimuli geared towards the investigation of contour integration and texture segmentation using shapes of everyday objects. Each stimulus consisted of Gabor elements positioned and oriented curvilinearly along the outline of an object, embedded within a larger Gabor array of homogeneous density. We created six versions of the resulting Gaborized outline stimuli by varying the orientations of elements inside and outside the outline. Data from two experiments, in which participants attempted to identify the objects in the stimuli, provide norms for identifiability and name agreement, and show differences in identifiability between stimulus versions. While there was substantial variability between the individual objects in our stimulus set, further analyses suggest a number of stimulus properties which are generally predictive of identification performance. The stimuli and the accompanying normative data, both available on our website (http://www.gestaltrevision.be/sources/gaboroutlines), provide a useful tool to further investigate contour integration and texture segmentation in both normal and clinical populations, especially when top-down influences on these processes, such as the role of prior knowledge of familiar objects, are of main interest. PMID:23145218
Modrák, Martin; Vohradský, Jiří
2018-04-13
Identifying regulons of sigma factors is a vital subtask of gene network inference. Integrating multiple sources of data is essential for correct identification of regulons and complete gene regulatory networks. Time series of expression data measured with microarrays or RNA-seq, combined with static binding experiments (e.g., ChIP-seq) or literature mining, may be used for inference of sigma factor regulatory networks. We introduce Genexpi: a tool to identify sigma factors by combining candidates obtained from ChIP experiments or literature mining with time-course gene expression data. While Genexpi can be used to infer other types of regulatory interactions, it was designed and validated on real biological data from bacterial regulons. In this paper, we put primary focus on CyGenexpi: a plugin integrating Genexpi with the Cytoscape software for ease of use. As part of this effort, a plugin for handling time series data in Cytoscape called CyDataseries has been developed and made available. Genexpi is also available as a standalone command line tool and an R package. Genexpi is a useful part of the gene network inference toolbox. It provides meaningful information about the composition of regulons and delivers biologically interpretable results.
Identification of everyday objects on the basis of Gaborized outline versions.
Sassi, Michaël; Vancleef, Kathleen; Machilsen, Bart; Panis, Sven; Wagemans, Johan
2010-01-01
Using outlines derived from a widely used set of line drawings, we created stimuli geared towards the investigation of contour integration and texture segmentation using shapes of everyday objects. Each stimulus consisted of Gabor elements positioned and oriented curvilinearly along the outline of an object, embedded within a larger Gabor array of homogeneous density. We created six versions of the resulting Gaborized outline stimuli by varying the orientations of elements inside and outside the outline. Data from two experiments, in which participants attempted to identify the objects in the stimuli, provide norms for identifiability and name agreement, and show differences in identifiability between stimulus versions. While there was substantial variability between the individual objects in our stimulus set, further analyses suggest a number of stimulus properties which are generally predictive of identification performance. The stimuli and the accompanying normative data, both available on our website (http://www.gestaltrevision.be/sources/gaboroutlines), provide a useful tool to further investigate contour integration and texture segmentation in both normal and clinical populations, especially when top-down influences on these processes, such as the role of prior knowledge of familiar objects, are of main interest. PMID:23145218
Milc, Justyna; Sala, Antonio; Bergamaschi, Sonia; Pecchioni, Nicola
2011-01-01
The CEREALAB database aims to store genotypic and phenotypic data obtained by the CEREALAB project and to integrate them with already existing data sources in order to create a tool for plant breeders and geneticists. The database can help them in unravelling the genetics of economically important phenotypic traits; in identifying and choosing molecular markers associated to key traits; and in choosing the desired parentals for breeding programs. The database is divided into three sub-schemas corresponding to the species of interest: wheat, barley and rice; each sub-schema is then divided into two sub-ontologies, regarding genotypic and phenotypic data, respectively. Database URL: http://www.cerealab.unimore.it/jws/cerealab.jnlp PMID:21247929
DASMiner: discovering and integrating data from DAS sources
2009-01-01
Background DAS is a widely adopted protocol for providing syntactic interoperability among biological databases. The popularity of DAS is due to a simplified and elegant mechanism for data exchange that consists of sources exposing their RESTful interfaces for data access. As a growing number of DAS services are available for molecular biology resources, there is an incentive to explore this protocol in order to advance data discovery and integration among these resources. Results We developed DASMiner, a Matlab toolkit for querying DAS data sources that enables creation of integrated biological models using the information available in DAS-compliant repositories. DASMiner is composed of a browser application and an API that work together to facilitate gathering of data from different DAS sources, which can be used for creating enriched datasets from multiple sources. The browser is used to formulate queries and navigate data contained in DAS sources. Users can execute queries against these sources in an intuitive fashion, without needing to know the specific DAS syntax for the particular source. Using the source's metadata provided by the DAS Registry, the browser's layout adapts to expose only the set of commands and coordinate systems supported by the specific source. For this reason, the browser can interrogate any DAS source, independently of the type of data being served. The API component of DASMiner may be used for programmatic access of DAS sources by programs in Matlab. Once the desired data is found during navigation, the query is exported in the format of an API call to be used within any Matlab application. We illustrate the use of DASMiner by creating integrative models of histone modification maps and protein-protein interaction networks. These enriched datasets were built by retrieving and integrating distributed genomic and proteomic DAS sources using the API.
Conclusion The support of the DAS protocol allows hundreds of molecular biology databases to be treated as a federated, online collection of resources. DASMiner enables full exploration of these resources, and can be used to deploy applications and create integrated views of biological systems using the information deposited in DAS repositories. PMID:19919683
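The data-exchange step that DASMiner automates rests on the DAS convention of returning features as a DASGFF XML document. A minimal sketch of consuming such a response is shown below; the element names follow the DAS 1.53 "features" layout, but the sample document itself is fabricated for illustration and is parsed from a string rather than fetched over HTTP.

```python
import xml.etree.ElementTree as ET

# A toy DASGFF response, shaped like DAS 1.53 "features" output (illustrative only).
DAS_XML = """<?xml version="1.0"?>
<DASGFF>
  <GFF href="http://example.org/das/mysource/features">
    <SEGMENT id="chr1" start="1" stop="5000">
      <FEATURE id="f1" label="exon-1">
        <TYPE id="exon">exon</TYPE>
        <START>100</START>
        <END>350</END>
      </FEATURE>
      <FEATURE id="f2" label="exon-2">
        <TYPE id="exon">exon</TYPE>
        <START>900</START>
        <END>1200</END>
      </FEATURE>
    </SEGMENT>
  </GFF>
</DASGFF>"""

def parse_das_features(xml_text):
    """Flatten a DASGFF features document into (segment, id, type, start, end) tuples."""
    rows = []
    root = ET.fromstring(xml_text)
    for segment in root.iter("SEGMENT"):
        for feat in segment.iter("FEATURE"):
            rows.append((segment.get("id"),
                         feat.get("id"),
                         feat.find("TYPE").get("id"),
                         int(feat.find("START").text),
                         int(feat.find("END").text)))
    return rows

for row in parse_das_features(DAS_XML):
    print(row)
```

Because every DAS source answers in this same schema, a client like DASMiner can interrogate any source with one parser, which is exactly the syntactic interoperability the abstract describes.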
NASA Remote Sensing Observations for Water Resource and Infrastructure Management
NASA Astrophysics Data System (ADS)
Granger, S. L.; Armstrong, L.; Farr, T.; Geller, G.; Heath, E.; Hyon, J.; Lavoie, S.; McDonald, K.; Realmuto, V.; Stough, T.; Szana, K.
2008-12-01
Decision support tools employed by water resource and infrastructure managers often utilize data products obtained from local sources or national/regional databases of historic surveys and observations. Incorporation of data from these sources can be laborious and time consuming as new products must be identified, cleaned and archived for each new study site. Adding remote sensing observations to the list of sources holds promise for a timely, consistent, global product to aid decision support at regional and global scales by providing global observations of geophysical parameters including soil moisture, precipitation, atmospheric temperature, derived evapotranspiration, and snow extent needed for hydrologic models and decision support tools. However, issues such as spatial and temporal resolution arise when attempting to integrate remote sensing observations into existing decision support tools. We are working to overcome these and other challenges through partnerships with water resource managers, tool developers and other stakeholders. We are developing a new data processing framework, enabled by a core GIS server, to seamlessly pull together observations from disparate sources for synthesis into information products and visualizations useful to the water resources community. A case study approach is being taken to develop the system by working closely with water infrastructure and resource managers to integrate remote observations into infrastructure, hydrologic and water resource decision tools. We present the results of a case study utilizing observations from the PALS aircraft instrument as a proxy for NASA's upcoming Soil Moisture Active Passive (SMAP) mission and an existing commercial decision support tool.
Neural Decoding of Bistable Sounds Reveals an Effect of Intention on Perceptual Organization
2018-01-01
Auditory signals arrive at the ear as a mixture that the brain must decompose into distinct sources based to a large extent on acoustic properties of the sounds. An important question concerns whether listeners have voluntary control over how many sources they perceive. This has been studied using pure high (H) and low (L) tones presented in the repeating pattern HLH-HLH-, which can form a bistable percept heard either as an integrated whole (HLH-) or as segregated into high (H-H-) and low (-L-) sequences. Although instructing listeners to try to integrate or segregate sounds affects reports of what they hear, this could reflect a response bias rather than a perceptual effect. We had human listeners (15 males, 12 females) continuously report their perception of such sequences and recorded neural activity using MEG. During neutral listening, a classifier trained on patterns of neural activity distinguished between periods of integrated and segregated perception. In other conditions, participants tried to influence their perception by allocating attention either to the whole sequence or to a subset of the sounds. They reported hearing the desired percept for a greater proportion of time than when listening neutrally. Critically, neural activity supported these reports; stimulus-locked brain responses in auditory cortex were more likely to resemble the signature of segregation when participants tried to hear segregation than when attempting to perceive integration. These results indicate that listeners can influence how many sound sources they perceive, as reflected in neural responses that track both the input and its perceptual organization. SIGNIFICANCE STATEMENT Can we consciously influence our perception of the external world? We address this question using sound sequences that can be heard either as coming from a single source or as two distinct auditory streams. 
Listeners reported spontaneous changes in their perception between these two interpretations while we recorded neural activity to identify signatures of such integration and segregation. They also indicated that they could, to some extent, choose between these alternatives. This claim was supported by corresponding changes in responses in auditory cortex. By linking neural and behavioral correlates of perception, we demonstrate that the number of objects that we perceive can depend not only on the physical attributes of our environment, but also on how we intend to experience it. PMID:29440556
Content Integration across Multiple Documents Reduces Memory for Sources
ERIC Educational Resources Information Center
Braasch, Jason L. G.; McCabe, Rebecca M.; Daniel, Frances
2016-01-01
The current experiments systematically examined semantic content integration as a mechanism for explaining source inattention and forgetting when reading-to-remember multiple texts. For all 3 experiments, degree of semantic overlap was manipulated amongst messages provided by various information sources. In Experiment 1, readers' source…
NASA Technical Reports Server (NTRS)
Bicknell, B.; Wilson, S.; Dennis, M.; Lydon, M.
1988-01-01
Commonality and integration of propulsion and fluid systems associated with the Space Station elements are being evaluated. The Space Station elements consist of the core station, which includes habitation and laboratory modules, nodes, airlocks, and trusswork; and associated vehicles, platforms, experiments, and payloads. The program is being performed as two discrete tasks. Task 1 investigated the components of the Space Station architecture to determine the feasibility and practicality of commonality and integration among the various propulsion elements. This task was completed. Task 2 is examining integration and commonality among fluid systems which were identified by the Phase B Space Station contractors as being part of the initial operating capability (IOC) and growth Space Station architectures. Requirements and descriptions for reference fluid systems were compiled from Space Station documentation and other sources. The fluid systems being examined are: an experiment gas supply system, an oxygen/hydrogen supply system, an integrated water system, the integrated nitrogen system, and the integrated waste fluids system. Definitions and descriptions of alternate systems were developed, along with analyses and discussions of their benefits and detriments. This databook includes fluid systems descriptions, requirements, schematic diagrams, component lists, and discussions of the fluid systems. In addition, cost comparisons are used in some cases to determine the optimum system for a specific task.
Useful Life Prediction for Payload Carrier Hardware
NASA Technical Reports Server (NTRS)
Ben-Arieh, David
2002-01-01
The Space Shuttle has been identified for use through 2020. Payload carrier systems will be needed to support missions through the same time frame. To support the future decision making process with reliable systems, it is necessary to analyze design integrity, identify possible sources of undesirable risk and recognize required upgrades for carrier systems. This project analyzed the information available regarding the carriers and developed the probability of becoming obsolete under different scenarios. In addition, this project resulted in a plan for an improved information system that will improve monitoring and control of the various carriers. The information collected throughout this project is presented in this report as process flow, historical records, and statistical analysis.
[Efficiency indicators to contribute to sustainability of health services in Spain].
García, E I; Mira Solves, J J; Guilabert Mora, M
2014-01-01
To identify a minimum set of efficiency indicators that can be calculated from current information sources. Interventions adopted from the analysis of these indicators could contribute to health services sustainability. We applied the discussion group technique. A total of 23 quality coordinators from around the country and the representatives of the regional quality societies in SECA (Spanish Society for Quality in Healthcare) participated. Ten efficiency indicators useful for integrated management areas were identified and accepted, 5 in the area of primary care and 5 for hospital management. The efficiency indicators agreed upon could contribute to the sustainability of the health system without affecting the quality of care. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.
Hybrid Energy: Combining Nuclear and Other Energy Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jong Suk; Garcia, Humberto E.
2015-02-01
The leading cause of global climate change is generally accepted to be growing emissions of greenhouse gas (GHG) as a result of increased use of fossil fuels [1]. Among various sources of GHG, the global electricity supply sector generates the largest share of GHG emissions (37.5% of total CO2 emissions) [2]. Since current electricity production relies heavily on fossil fuels, it is envisioned that bolstering generation technologies based on non-emitting energy sources, i.e., nuclear and/or renewables, could reduce future GHG emissions. Integrated nuclear-renewable hybrid energy systems (HES) are very-low-emitting options, but they are capital-intensive technologies that should operate at full capacities to maximize profits. Hence, electricity generators often pay the grid to take electricity when demand is low, resulting in negative profits for many hours per year. Instead of wasting excess generation capacity at negative profit during off-peak hours when electricity prices are low, nuclear-renewable HES could achieve positive profits by storing and/or utilizing surplus thermal and/or electrical energy to produce useful storable products to meet industrial and transportation demands. Consequently, it is necessary (1) to identify key integrated system options based on specific regions and (2) to propose optimal operating strategies to economically produce products on demand. In prioritizing region-specific HES options, available resources, markets, existing infrastructure, etc., need to be researched to identify attractive system options. For example, the scarcity of water (market) and the availability of abundant solar radiation make solar energy (resource) a suitable option to mitigate the water deficit in the Central-Southern region of the U.S. Thus, a solar energy-driven desalination process would be an attractive option to be integrated into a nuclear power plant to support the production of fresh water in this region.
In this work, we introduce a particular HES option proposed for a specific U.S. region and briefly describe our modeling assumptions and procedure utilized for its analysis. Preliminary simulation results are also included addressing several technical characteristics of the proposed nuclear-renewable HES.
Ryou, Hyoung Gon; Heo, Jongbae; Kim, Sun-Young
2018-09-01
Studies of source apportionment (SA) for particulate matter (PM) air pollution have enhanced understanding of dominant pollution sources and quantification of their contribution. Although there have been many SA studies in South Korea over the last two decades, few studies provided an integrated understanding of PM sources nationwide. The aim of this study was to summarize findings of PM SA studies of South Korea and to explore study characteristics. We selected studies that estimated sources of PM10 and PM2.5 performed for 2000-2017 in South Korea using Positive Matrix Factorization and Chemical Mass Balance. We reclassified the original PM sources identified in each study into seven categories: motor vehicle, secondary aerosol, soil dust, biomass/field burning, combustion/industry, natural source, and others. These seven source categories were summarized by using frequency and contribution across four regions, defined as the northwest, west, southeast, and southwest regions, by PM10 and PM2.5. We also computed the population-weighted mean contribution of each source category. In addition, we compared study features including sampling design, sampling and lab analysis methods, chemical components, and the inclusion of Asian dust days. In the 21 selected studies, all six PM10 studies identified motor vehicle, soil dust, and combustion/industry, while all 15 PM2.5 studies identified motor vehicle and soil dust. Different from the frequency, secondary aerosol produced a large contribution to both PM10 and PM2.5. Motor vehicle contributed highly to both, whereas the contribution of combustion/industry was high for PM10. The population-weighted mean contribution was the highest for the motor vehicle and secondary aerosol sources for both PM10 and PM2.5.
However, these results were based on different subsets of chemical speciation data collected at a single sampling site, commonly in metropolitan areas, with short overlap and measured by different lab analysis methods. We found that motor vehicle and secondary aerosol were the most common and influential sources for PM in South Korea. Our study, however, suggests caution in interpreting SA findings, given the heterogeneity of study designs and input data. Copyright © 2018. Published by Elsevier Ltd.
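The population-weighted mean contribution used above is a plain weighted average of per-region source contributions, with regional populations as weights. A minimal sketch follows; the contribution percentages and populations are hypothetical, not values from the study.

```python
def population_weighted_mean(contributions, populations):
    """Population-weighted mean source contribution across regions.

    contributions: {region: contribution of one source, e.g. percent of PM2.5}
    populations:   {region: population used as the weight}
    """
    total_pop = sum(populations[r] for r in contributions)
    return sum(contributions[r] * populations[r] for r in contributions) / total_pop

# Hypothetical motor-vehicle contributions (percent of PM2.5) for four regions.
contrib = {"northwest": 25.0, "west": 20.0, "southeast": 30.0, "southwest": 18.0}
pop = {"northwest": 25_000_000, "west": 5_000_000,
       "southeast": 13_000_000, "southwest": 7_000_000}
print(round(population_weighted_mean(contrib, pop), 2))  # 24.82
```

Weighting by population shifts the national summary toward sources that dominate in heavily populated regions, which is why it can rank sources differently than raw frequency across studies.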
Integrated spatiotemporal characterization of dust sources and outbreaks in Central and East Asia
NASA Astrophysics Data System (ADS)
Darmenova, Kremena T.
The potential of atmospheric dust aerosols to modify the Earth's environment and climate has been recognized for some time. However, predicting the diverse impact of dust has several significant challenges. One is to quantify the complex spatial and temporal variability of dust burden in the atmosphere. Another is to quantify the fraction of dust originating from human-made sources. This thesis focuses on the spatiotemporal characterization of sources and dust outbreaks in Central and East Asia by integrating ground-based data, satellite multisensor observations, and modeling. A new regional dust modeling system capable of operating over a span of scales was developed. The modeling system consists of a dust module DuMo, which incorporates several dust emission schemes of different complexity, and the PSU/NCAR mesoscale model MM5, which offers a variety of physical parameterizations and flexible nesting capability. The modeling system was used to perform for the first time a comprehensive study of the timing, duration, and intensity of individual dust events in Central and East Asia. Determining the uncertainties caused by the choice of model physics, especially the boundary layer parameterization, and the dust production scheme was the focus of our study. Implications to assessments of the anthropogenic dust fraction in these regions were also addressed. Focusing on Spring 2001, an analysis of routine surface meteorological observations and satellite multi-sensor data was carried out in conjunction with modeling to determine the extent to which an integrated data set can be used to characterize the spatiotemporal distribution of dust plumes at a range of temporal scales, addressing the active dust sources in China and Mongolia, mid-range transport and trans-Pacific, long-range transport of dust outbreaks on a case-by-case basis.
This work demonstrates that adequate and consistent characterization of individual dust events is central to establishing a reliable climatology, ultimately leading to improved assessments of dust impacts on the environment and climate. This will also help to identify the appropriate temporal and spatial scales for adequate intercomparison between model results and observational data as well as for developing an integrated analysis methodology for dust studies.
Gauthier, G Robin; Francisco, Sara C; Khan, Bilal; Dombrowski, Kirk
2018-05-01
Throughout North America, indigenous women experience higher rates of intimate partner violence and sexual violence than any other ethnic group, and so it is of particular importance to understand sources of support for Native American women. In this article, we use social network analysis to study the relationship between social integration and women's access to domestic violence support by examining the recommendations they would give to another woman in need. We ask two main questions: First, are less integrated women more likely to make no recommendation at all when compared with more socially integrated women? Second, are less integrated women more likely than more integrated women to nominate a formal source of support rather than an informal one? We use network data collected from interviews with 158 Canadian women residing in an indigenous community to measure their access to support. We find that, in general, less integrated women are less likely to make a recommendation than more integrated women. However, when they do make a recommendation, less integrated women are more likely to recommend a formal source of support than women who are more integrated. These results add to our understanding of how access to two types of domestic violence support is embedded in the larger set of social relations of an indigenous community.
Wells, James A; Thrush, Carol R; Martinson, Brian C; May, Terry A; Stickler, Michelle; Callahan, Eileen C; Klomparens, Karen L
2014-12-01
The Survey of Organizational Research Climate (SOuRCe) is a new instrument that assesses dimensions of research integrity climate, including ethical leadership, socialization and communication processes, and policies, procedures, structures, and processes to address risks to research integrity. We present a descriptive analysis to characterize differences on the SOuRCe scales across departments, fields of study, and status categories (faculty, postdoctoral scholars, and graduate students) for 11,455 respondents from three research-intensive universities. Among the seven SOuRCe scales, variance explained by status and fields of study ranged from 7.6% (Advisor-Advisee Relations) to 16.2% (Integrity Norms). Department accounted for greater than 50% of the variance explained for each of the SOuRCe scales, ranging from 52.6% (Regulatory Quality) to 80.3% (Integrity Inhibitors). It is feasible to implement this instrument in large university settings across a broad range of fields, department types, and individual roles within academic units. Published baseline results provide initial data for institutions using the SOuRCe who wish to compare their own research integrity climates. © The Author(s) 2014.
Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen
2016-10-01
Integrating the semantic dimension into clinical archetypes is necessary when modeling medical records. It enables semantic interoperability, allows semantic activities to be applied to clinical data, and provides a higher design quality of Electronic Medical Record (EMR) systems. However, to obtain these advantages, designers need to use archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most archetypes filed within open repositories are expressed in the Archetype Definition Language (ADL), which allows defining only the syntactic structure of clinical concepts, weakening semantic activities on the EMR content in the semantic web environment. This paper focuses on the modeling of an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual model approach and integrating semantic web technologies. Such modeling provides better delivery of quality of care and ensures semantic interoperability between the information systems of all involved therapies. First, the data to be documented are identified and collected from the involved therapies. Subsequently, the data are analyzed and arranged into archetypes expressed in accordance with ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, the ADL archetypes are transformed into archetypes expressed in OWL-DL (Web Ontology Language - Description Logic). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and making it possible to exercise semantic activities on such archetypes. The result is the integration of the semantic dimension into an EMR modeled in accordance with the archetype approach. The feasibility of our solution is shown through the development of a prototype, baptized "CP-SMS", which ensures semantic exploitation of the CP EMR.
This prototype provides the following features: (i) creation of CP EMR instances and their checking against a knowledge base constructed through interviews with domain experts; (ii) translation of the initial CP ADL archetypes into CP OWL-DL archetypes; (iii) creation of an ontological source that can be used to annotate the obtained archetypes; and (iv) enrichment of the ontological source with semantic relations, fueling the ontology with new concepts, ensuring consistency and eliminating ambiguity between concepts. The degree of semantic interoperability that could be reached between EMR systems depends strongly on the quality of the archetypes used. Thus, the integration of the semantic dimension in the archetype modeling process is crucial. By creating an ontological source and annotating archetypes, we create a supportive platform ensuring semantic interoperability between archetype-based EMR systems. Copyright © 2016. Published by Elsevier Inc.
Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio
2017-11-25
Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires highly accurate and efficient computational software able to correctly process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis, with the following features: (1) sequence analysis for integration site processing that is fully compliant with paired-end reads and includes a sequence quality filter before and after alignment on the target genome; (2) a heuristic algorithm that reduces false-positive integration sites at the nucleotide level caused by Polymerase Chain Reaction or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface serving as the researcher front-end for performing integration site analyses without computational skills; (5) speedup of all steps through parallelization (Hadoop-free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines.
These performances indicate that VISPA2 is a fast, reliable and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, the web access of VISPA2 ( http://openserver.itb.cnr.it/vispa/ ) ensures accessibility and ease of use of a complex analytical tool for researchers. We released the source code of VISPA2 in a public repository ( https://bitbucket.org/andreacalabria/vispa2 ).
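The nucleotide-level false-positive problem mentioned above arises because PCR and trimming/alignment artifacts smear one true integration site across several adjacent genomic positions. A common filtering idea is to merge sites that fall within a few base pairs of each other on the same chromosome and strand, keeping the position with the most supporting reads. The sketch below illustrates that idea only; it is not VISPA2's actual heuristic, and the coordinates and read counts are hypothetical.

```python
def collapse_sites(sites, window=3):
    """Merge integration sites within `window` bp on the same chromosome/strand,
    keeping the dominant position and pooling read counts.
    sites: iterable of (chrom, strand, position, read_count)."""
    merged = []
    for site in sorted(sites):
        if merged:
            chrom, strand, pos, count = merged[-1]
            c2, s2, p2, n2 = site
            if chrom == c2 and strand == s2 and p2 - pos <= window:
                # Same cluster: keep the better-supported position, pool reads.
                best = p2 if n2 > count else pos
                merged[-1] = (chrom, strand, best, count + n2)
                continue
        merged.append(site)
    return merged

sites = [("chr1", "+", 1000, 50), ("chr1", "+", 1002, 3),  # PCR wobble around 1000
         ("chr1", "+", 5000, 7), ("chr2", "-", 1000, 9)]
print(collapse_sites(sites))
```

Real pipelines add further safeguards (quality filters, chained-cluster handling); the point here is only how a positional window turns noisy read endpoints into a per-site count.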
Integrating an Automatic Judge into an Open Source LMS
ERIC Educational Resources Information Center
Georgouli, Katerina; Guerreiro, Pedro
2011-01-01
This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…
Inverse modeling methods for indoor airborne pollutant tracking: literature review and fundamentals.
Liu, X; Zhai, Z
2007-12-01
Reduction in indoor environment quality calls for effective control and improvement measures. Accurate and prompt identification of contaminant sources ensures that they can be quickly removed and contaminated spaces isolated and cleaned. This paper discusses the use of inverse modeling to identify potential indoor pollutant sources with limited pollutant sensor data. The study reviews various inverse modeling methods for advection-dispersion problems and summarizes the methods into three major categories: forward, backward, and probability inverse modeling methods. The adjoint probability inverse modeling method is indicated as an appropriate model for indoor air pollutant tracking because it can quickly find source location, strength and release time without prior information. The paper introduces the principles of the adjoint probability method and establishes the corresponding adjoint equations for both multi-zone airflow models and computational fluid dynamics (CFD) models. The study proposes a two-stage inverse modeling approach integrating both multi-zone and CFD models, which can provide a rapid estimate of indoor pollution status and history for a whole building. Preliminary case study results indicate that the adjoint probability method is feasible for indoor pollutant inverse modeling. The proposed method can help identify contaminant source characteristics (location and release time) with limited sensor outputs. This will ensure an effective and prompt execution of building management strategies and thus achieve a healthy and safe indoor environment. The method can also help design optimal sensor networks.
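The probability-based inverse idea above can be illustrated with a toy Bayesian sketch: given forward-model predictions of sensor readings for each candidate source location, a posterior over sources follows from an assumed Gaussian sensor-noise model and a uniform prior. This is an illustrative analogue, not the adjoint probability formulation itself; the zone names and all numbers are hypothetical.

```python
from math import exp

def source_posterior(observed, predictions, sigma=1.0):
    """Posterior probability of each candidate source given sensor readings,
    assuming a uniform prior and independent Gaussian sensor noise.
    predictions: {source: [forward-model reading at each sensor]}"""
    likelihoods = {}
    for src, pred in predictions.items():
        sq_err = sum((o - p) ** 2 for o, p in zip(observed, pred))
        likelihoods[src] = exp(-sq_err / (2 * sigma ** 2))
    z = sum(likelihoods.values())
    return {src: lk / z for src, lk in likelihoods.items()}

# Hypothetical forward-model predictions for a release in one of three zones.
predictions = {
    "zone1": [9.0, 1.0, 0.5],
    "zone2": [2.0, 8.0, 1.0],
    "zone3": [0.5, 1.5, 7.0],
}
observed = [8.6, 1.4, 0.7]  # closest to zone1's signature
post = source_posterior(observed, predictions)
print(max(post, key=post.get))
```

The adjoint approach described in the abstract is more powerful than this sketch: it propagates probabilities backward in time, so location, strength, and release time can be recovered without enumerating forward runs for every candidate.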
ACToR Chemical Structure processing using Open Source ...
ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d
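One small, well-defined piece of the identifier-curation work described above is validating CAS Registry Numbers before trying to relate them. The check-digit rule is standard: the last digit must equal the positionally weighted sum of the preceding digits, taken right to left, modulo 10. The snippet below is an illustration of that rule, not code from the ACToR workflow.

```python
def cas_checksum_ok(cas):
    """Validate a CAS Registry Number's check digit (e.g. '7732-18-5', water).

    Weights run 1, 2, 3, ... from the digit immediately left of the check
    digit outward; the weighted sum mod 10 must equal the check digit.
    """
    digits = cas.replace("-", "")
    body, check = digits[:-1], int(digits[-1])
    total = sum(int(d) * i for i, d in enumerate(reversed(body), start=1))
    return total % 10 == check

print(cas_checksum_ok("7732-18-5"))  # water: True
print(cas_checksum_ok("7732-18-4"))  # corrupted check digit: False
```

A checksum pass like this catches transcription errors early, before workflows attempt to map CAS numbers to structures or to each other.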
NASA Astrophysics Data System (ADS)
Jin, L.; Borgeson, S.; Fredman, D.; Hans, L.; Spurlock, A.; Todd, A.
2015-12-01
California's renewable portfolio standard (2012) requires the state to get 33% of its electricity from renewable sources by 2020. An increased share of variable renewable sources such as solar and wind in the California electricity system may require more grid flexibility to ensure reliable power services. Such grid flexibility can potentially be provided by changes in end-use electricity consumption in response to grid conditions (demand response). In the solar case, residential consumption in the late afternoon can be used as reserve capacity to balance the drop in solar generation. This study presents our initial attempt to identify, from a behavioral perspective, residential demand response potential in relation to solar ramp events using a data-driven approach. Based on hourly residential energy consumption data, we derive representative daily load shapes focusing on discretionary consumption with an innovative clustering analysis technique. We aggregate the representative load shapes into behavior groups in terms of the timing and rhythm of energy use in the context of solar ramp events. Households of different behavior groups that are active during hours with high solar ramp rates are identified for capturing demand response potential. Insights into the nature and predictability of response to demand-response programs are provided.
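The load-shape clustering step can be illustrated with a minimal sketch: a toy k-means over normalized 24-hour profiles. The synthetic data, the farthest-point initialization, and all names below are illustrative assumptions, not the authors' actual "innovative clustering analysis technique."

```python
import numpy as np

def cluster_load_shapes(profiles, k=3, iters=50):
    """Cluster daily load shapes (rows = days, columns = 24 hourly readings)
    with a minimal k-means, after normalizing each day to unit total so the
    grouping reflects the timing and rhythm of use rather than magnitude."""
    shapes = profiles / profiles.sum(axis=1, keepdims=True)
    # Farthest-point initialization: deterministic, spreads seeds apart.
    centroids = [shapes[0]]
    while len(centroids) < k:
        d = np.min([((shapes - c) ** 2).sum(axis=1) for c in centroids], axis=0)
        centroids.append(shapes[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        # Assign each day to its nearest centroid, then recompute centroids.
        dist = ((shapes[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = shapes[labels == j].mean(axis=0)
    return labels, centroids

# Toy data: 20 morning-peak and 20 evening-peak households.
rng = np.random.default_rng(1)
hours = np.arange(24)
morning = np.exp(-0.5 * ((hours - 8) / 2.0) ** 2)
evening = np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)
data = np.vstack([morning + 0.05 * rng.random(24) for _ in range(20)]
                 + [evening + 0.05 * rng.random(24) for _ in range(20)])
labels, centroids = cluster_load_shapes(data, k=2)
```

The evening-peak cluster is the one of interest for solar ramps: its members are active in the hours when solar generation drops.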
Energy harvesting concepts for small electric unmanned systems
NASA Astrophysics Data System (ADS)
Qidwai, Muhammad A.; Thomas, James P.; Kellogg, James C.; Baucom, Jared N.
2004-07-01
In this study, we identify and survey energy harvesting technologies for small electrically powered unmanned systems designed for long-term (>1 day) time-on-station missions. An environmental energy harvesting scheme will provide long-term energy additions to the on-board energy source. We have identified four technologies that cover a broad array of available energy sources: solar, kinetic (wind) flow, autophagous structure-power (both combustible and metal air-battery systems) and electromagnetic (EM) energy scavenging. We present existing conceptual designs, critical system components, performance, constraints and state-of-readiness for each technology. We have concluded that the solar and autophagous technologies are relatively mature for small-scale applications and are capable of moderate power output levels (>1 W). We have identified key components and possible multifunctionalities in each technology. The kinetic flow and EM energy scavenging technologies will require more in-depth study before they can be considered for implementation. We have also realized that all of the harvesting systems require design and integration of various electrical, mechanical and chemical components, which will require modeling and optimization using hybrid mechatronics-circuit simulation tools. This study provides a starting point for detailed investigation into the proposed technologies for unmanned system applications under current development.
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
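The decompose-substitute-recombine procedure described above can be sketched directly; the tiny synonym dictionary below is a hypothetical stand-in for the UMLS-derived general synonym dictionary.

```python
from itertools import product

# Toy stand-in for the UMLS-derived general synonym dictionary.
SYNONYMS = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
    "cancer": {"cancer", "carcinoma", "neoplasm"},
}

def candidate_terms(term):
    """Decompose a multi-word source term into its component words, substitute
    every known synonym for each word, and recombine the variants into the
    expanded pool of matching candidates."""
    per_word = [sorted(SYNONYMS.get(word, {word})) for word in term.lower().split()]
    return {" ".join(combo) for combo in product(*per_word)}

pool = candidate_terms("kidney failure")
```

Matching each candidate in `pool` (four strings here, including "renal insufficiency") against UMLS concept names gives simple string matching more chances to find a concept that the literal source term alone would miss.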
Ionospheric threats to the integrity of airborne GPS users
NASA Astrophysics Data System (ADS)
Datta-Barua, Seebany
The Global Positioning System (GPS) has both revolutionized and entwined the worlds of aviation and atmospheric science. As the largest and most unpredictable source of GPS positioning error, the ionospheric layer of the atmosphere, if left unchecked, can endanger the safety, or "integrity," of the single frequency airborne user. An augmentation system is a differential-GPS-based navigation system that provides integrity through independent ionospheric monitoring by reference stations. However, the monitor stations are not in general colocated with the user's GPS receiver. The augmentation system must protect users from possible ionosphere density variations occurring between its measurements and the user's. This study analyzes observations from ionospherically active periods to identify what types of ionospheric disturbances may cause threats to user safety if left unmitigated. This work identifies when such disturbances may occur using a geomagnetic measure of activity and then considers two disturbances as case studies. The first case study indicates the need for a non-trivial threat model for the Federal Aviation Administration's Local Area Augmentation System (LAAS) that was not known prior to the work. The second case study uses ground- and space-based data to model an ionospheric disturbance of interest to the Federal Aviation Administration's Wide Area Augmentation System (WAAS). This work is a step in the justification for, and possible future refinement of, one of the WAAS integrity algorithms. For both WAAS and LAAS, integrity threats are basically caused by events that may be occurring but are unobservable. Prior to the data available in this solar cycle, events of such magnitude were not known to be possible. This work serves as evidence that the ionospheric threat models developed for WAAS and LAAS are warranted and that they are sufficiently conservative to maintain user integrity even under extreme ionospheric behavior.
Emerging Concepts of Data Integration in Pathogen Phylodynamics.
Baele, Guy; Suchard, Marc A; Rambaut, Andrew; Lemey, Philippe
2017-01-01
Phylodynamics has become an increasingly popular statistical framework to extract evolutionary and epidemiological information from pathogen genomes. By harnessing such information, epidemiologists aim to shed light on the spatio-temporal patterns of spread and to test hypotheses about the underlying interaction of evolutionary and ecological dynamics in pathogen populations. Although the field has witnessed a rich development of statistical inference tools with increasing levels of sophistication, these tools initially focused on sequences as their sole primary data source. Integrating various sources of information, however, promises to deliver more precise insights in infectious diseases and to increase opportunities for statistical hypothesis testing. Here, we review how the emerging concept of data integration is stimulating new advances in Bayesian evolutionary inference methodology which formalize a marriage of statistical thinking and evolutionary biology. These approaches include connecting sequence to trait evolution, such as for host, phenotypic and geographic sampling information, but also the incorporation of covariates of evolutionary and epidemic processes in the reconstruction procedures. We highlight how a full Bayesian approach to covariate modeling and testing can generate further insights into sequence evolution, trait evolution, and population dynamics in pathogen populations. Specific examples demonstrate how such approaches can be used to test the impact of host on rabies and HIV evolutionary rates, to identify the drivers of influenza dispersal as well as the determinants of rabies cross-species transmissions, and to quantify the evolutionary dynamics of influenza antigenicity. Finally, we briefly discuss how data integration is now also permeating through the inference of transmission dynamics, leading to novel insights into tree-generative processes and detailed reconstructions of transmission trees. 
Keywords: Bayesian inference; birth–death models; coalescent models; continuous trait evolution; covariates; data integration; discrete trait evolution; pathogen phylodynamics.
Wang, Yongcui; Chen, Shilong; Deng, Naiyang; Wang, Yong
2013-01-01
Computational inference of novel therapeutic values for existing drugs, i.e., drug repositioning, offers the prospect of faster and lower-risk drug development. Previous research has indicated that chemical structures, target proteins, and side-effects could provide rich information for assessing drug similarity and, in turn, disease similarity. However, each single data source is important in its own way, and data integration holds great promise for repositioning drugs more accurately. Here, we propose a new method for drug repositioning, PreDR (Predict Drug Repositioning), to integrate molecular structure, molecular activity, and phenotype data. Specifically, we characterize drugs by profiling in chemical structure, target protein, and side-effects space, and define a kernel function to correlate drugs with diseases. Then we train a support vector machine (SVM) to computationally predict novel drug-disease interactions. PreDR is validated on a well-established drug-disease network with 1,933 interactions among 593 drugs and 313 diseases. By cross-validation, we find that chemical structure, drug target, and side-effects information are all predictive for drug-disease relationships. More experimentally observed drug-disease interactions can be revealed by integrating these three data sources. Comparison with existing methods demonstrates that PreDR is competitive both in accuracy and coverage. Follow-up database search and pathway analysis indicate that our new predictions are worthy of further experimental validation. In particular, several novel predictions are supported by clinical trials databases, demonstrating the promise of PreDR for future drug treatment. In conclusion, our new method, PreDR, can serve as a useful tool in drug discovery to efficiently identify novel drug-disease interactions. In addition, our heterogeneous data integration framework can be applied to other problems. PMID:24244318
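The data-integration step, combining per-source drug-drug similarities into one kernel for the SVM, can be sketched as a convex combination of kernel matrices. The toy matrices and the equal weighting below are assumptions for illustration, not PreDR's actual kernel definition.

```python
import numpy as np

def integrated_kernel(k_chem, k_target, k_side, weights=(1/3, 1/3, 1/3)):
    """Convex combination of per-source drug-drug similarity matrices.
    A weighted sum of valid kernels is itself a valid kernel, which is
    what lets heterogeneous data sources feed a single SVM."""
    return sum(w * k for w, k in zip(weights, (k_chem, k_target, k_side)))

# Toy 3-drug similarity matrices (symmetric, unit diagonal).
k_chem   = np.array([[1.0, 0.8, 0.1], [0.8, 1.0, 0.2], [0.1, 0.2, 1.0]])
k_target = np.array([[1.0, 0.6, 0.0], [0.6, 1.0, 0.1], [0.0, 0.1, 1.0]])
k_side   = np.array([[1.0, 0.7, 0.3], [0.7, 1.0, 0.2], [0.3, 0.2, 1.0]])
K = integrated_kernel(k_chem, k_target, k_side)
```

A drug pair similar in all three spaces (here drugs 0 and 1) stays highly similar in the integrated kernel, while a pair supported by only one source is down-weighted.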
Chavan, Shweta S; Bauer, Michael A; Peterson, Erich A; Heuck, Christoph J; Johann, Donald J
2013-01-01
Transcriptome analysis by microarrays has produced important advances in biomedicine. For instance in multiple myeloma (MM), microarray approaches led to the development of an effective disease subtyping via cluster assignment, and a 70 gene risk score. Both enabled an improved molecular understanding of MM, and have provided prognostic information for the purposes of clinical management. Many researchers are now transitioning to Next Generation Sequencing (NGS) approaches and RNA-seq in particular, due to its discovery-based nature, improved sensitivity, and dynamic range. Additionally, RNA-seq allows for the analysis of gene isoforms, splice variants, and novel gene fusions. Given the voluminous amounts of historical microarray data, there is now a need to associate and integrate microarray and RNA-seq data via advanced bioinformatic approaches. Custom software was developed following a model-view-controller (MVC) approach to integrate Affymetrix probe set IDs and gene annotation information from a variety of sources. The tool/approach employs an assortment of strategies to integrate, cross reference, and associate microarray and RNA-seq datasets. Output from a variety of transcriptome reconstruction and quantitation tools (e.g., Cufflinks) can be directly integrated, and/or associated with Affymetrix probe set data, as well as necessary gene identifiers and/or symbols from a diversity of sources. Strategies are employed to maximize the annotation and cross referencing process. Custom gene sets (e.g., MM 70 risk score (GEP-70)) can be specified, and the tool can be directly assimilated into an RNA-seq pipeline. A novel bioinformatic approach to aid in the facilitation of both annotation and association of historic microarray data, in conjunction with richer RNA-seq data, is now assisting with the study of MM cancer biology.
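The association step can be sketched as a keyed join between a probe-set annotation map and an RNA-seq quantification table; the probe IDs, gene symbols, and FPKM values below are invented for illustration and are not taken from the tool described above.

```python
# Hypothetical annotation map (Affymetrix probe-set ID -> gene symbol) and an
# RNA-seq quantification table keyed by symbol (e.g., Cufflinks-style output).
probe_to_symbol = {"204379_s_at": "FGFR3", "201983_s_at": "EGFR"}
rnaseq_fpkm = {"FGFR3": 12.4, "EGFR": 55.1, "TP53": 8.9}

def associate(probe_ids):
    """Cross-reference microarray probe sets with RNA-seq measurements,
    keeping unmapped probes visible rather than silently dropping them."""
    matched, unmapped = {}, []
    for pid in probe_ids:
        symbol = probe_to_symbol.get(pid)
        if symbol is not None and symbol in rnaseq_fpkm:
            matched[pid] = (symbol, rnaseq_fpkm[symbol])
        else:
            unmapped.append(pid)
    return matched, unmapped

matched, unmapped = associate(["204379_s_at", "201983_s_at", "1007_s_at"])
```

Reporting the unmapped list is the point of the design: annotation gaps between array and sequencing platforms are common, and they should surface rather than vanish in the join.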
Light Microscopy Module Imaging Tested and Demonstrated
NASA Technical Reports Server (NTRS)
Gati, Frank
2004-01-01
The Fluids Integrated Rack (FIR), a facility-class payload, and the Light Microscopy Module (LMM), a subrack payload, are integrated research facilities that will fly in the U.S. Laboratory module, Destiny, aboard the International Space Station. Both facilities are being engineered, designed, and developed at the NASA Glenn Research Center by Northrop Grumman Information Technology. The FIR is a modular, multiuser scientific research facility that is one of two racks that make up the Fluids and Combustion Facility (the other being the Combustion Integrated Rack). The FIR has a large volume dedicated for experimental hardware; easily reconfigurable diagnostics, power, and data systems that allow for unique experiment configurations; and customizable software. The FIR will also provide imagers, light sources, power management and control, command and data handling for facility and experiment hardware, and data processing and storage. The first payload in the FIR will be the LMM. The LMM integrated with the FIR is a remotely controllable, automated, on-orbit microscope subrack facility, with key diagnostic capabilities for meeting science requirements--including video microscopy to observe microscopic phenomena and dynamic interactions, interferometry to make thin-film measurements with nanometer resolution, laser tweezers to manipulate micrometer-sized particles, confocal microscopy to provide enhanced three-dimensional visualization of structures, and spectrophotometry to measure the photonic properties of materials. Vibration disturbances were identified early in the LMM development phase as a high risk for contaminating the microgravity science environment. An integrated FIR-LMM test was conducted in Glenn's Acoustics Test Laboratory to assess mechanical sources of vibration and their impact on microscopic imaging.
The primary purpose of the test was to characterize the LMM response at the sample location, the x-y stage within the microscope, to vibration emissions from the FIR and LMM support structures.
MOVES-Matrix and distributed computing for microscale line source dispersion analysis.
Liu, Haobing; Xu, Xiaodan; Rodgers, Michael O; Xu, Yanzhi Ann; Guensler, Randall L
2017-07-01
MOVES and AERMOD are the U.S. Environmental Protection Agency's recommended models for use in project-level transportation conformity and hot-spot analysis. However, the structure and algorithms involved in running MOVES make analyses cumbersome and time-consuming. Likewise, the modeling setup process in AERMOD, including extensive data requirements and required input formats, leads to a high potential for analysis error in dispersion modeling. This study presents a distributed computing method for line source dispersion modeling that integrates MOVES-Matrix, a high-performance emission modeling tool, with the microscale dispersion models CALINE4 and AERMOD. MOVES-Matrix was prepared by iteratively running MOVES across all possible iterations of vehicle source-type, fuel, operating conditions, and environmental parameters to create a huge multi-dimensional emission rate lookup matrix. AERMOD and CALINE4 are connected with MOVES-Matrix in a distributed computing cluster using a series of Python scripts. This streamlined system built on MOVES-Matrix generates exactly the same emission rates and concentration results as using MOVES with AERMOD and CALINE4, but the approach is more than 200 times faster than using the MOVES graphical user interface. Because AERMOD requires detailed meteorological input, which is difficult to obtain, this study also recommends using CALINE4 as a screening tool for identifying the potential areas that may exceed air quality standards before using AERMOD (and identifying areas that are exceedingly unlikely to exceed air quality standards). The CALINE4 worst-case method yields consistently higher concentration results than AERMOD for all comparisons in this paper, as expected given the nature of the meteorological data employed. The paper demonstrates a distributed computing method for line source dispersion modeling that integrates MOVES-Matrix with CALINE4 and AERMOD.
This streamlined system generates exactly the same emission rates and concentration results as the traditional use of MOVES with AERMOD and CALINE4, which are regulatory models approved by the U.S. EPA for conformity analysis, but the approach is more than 200 times faster than running the MOVES model directly. We highlight the potentially significant benefit of using CALINE4 as a screening tool to identify potential areas that may exceed air quality standards before using AERMOD, which requires much more meteorological input than CALINE4.
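The lookup-matrix idea can be sketched as a precomputed table keyed by operating conditions; the bins and rates below are invented, and the real MOVES-Matrix spans far more dimensions (source type, fuel, operating mode, meteorology).

```python
# Invented toy grid; rates are placeholders, not MOVES output.
RATE_MATRIX = {
    # (vehicle_type, speed_bin_mph, temp_bin_f) -> grams per vehicle-mile
    ("passenger_car", 30, 70): 0.21,
    ("passenger_car", 60, 70): 0.15,
    ("heavy_truck", 30, 70): 1.80,
}
SPEED_BINS = (30, 60)
TEMP_BINS = (70,)

def emission_rate(vehicle_type, speed_mph, temp_f):
    """Replace a full emission-model run with a table lookup: snap the query
    conditions to the nearest precomputed bins and read off the rate."""
    speed_bin = min(SPEED_BINS, key=lambda b: abs(b - speed_mph))
    temp_bin = min(TEMP_BINS, key=lambda b: abs(b - temp_f))
    return RATE_MATRIX[(vehicle_type, speed_bin, temp_bin)]

rate = emission_rate("passenger_car", 42, 68)
```

Because every dispersion run reduces to dictionary lookups, the matrix can be queried by many cluster workers in parallel, which is where the reported speedup over repeated model executions comes from.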
ERIC Educational Resources Information Center
Barzilai, Sarit; Ka'adan, Ibtisam
2017-01-01
Learning to integrate multiple information sources is vital for advancing learners' digital literacy. Previous studies have found that learners' epistemic metacognitive knowledge about the nature of knowledge and knowing is related to their strategic integration performance. The purpose of this study was to understand how these relations come into…
Uncertainty and risk in wildland fire management: a review.
Thompson, Matthew P; Calkin, Dave E
2011-08-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.
Efficient Privacy-Aware Record Integration.
Kuzu, Mehmet; Kantarcioglu, Murat; Inan, Ali; Bertino, Elisa; Durham, Elizabeth; Malin, Bradley
2013-01-01
The integration of information dispersed among multiple repositories is a crucial step for accurate data analysis in various domains. In support of this goal, it is critical to devise procedures for identifying similar records across distinct data sources. At the same time, to adhere to privacy regulations and policies, such procedures should protect the confidentiality of the individuals to whom the information corresponds. Various private record linkage (PRL) protocols have been proposed to achieve this goal, involving secure multi-party computation (SMC) and similarity-preserving data transformation techniques. SMC methods provide secure and accurate solutions to the PRL problem, but are prohibitively expensive in practice, mainly due to excessive computational requirements. Data transformation techniques offer more practical solutions, but incur the cost of information leakage and false matches. In this paper, we introduce a novel model for practical PRL, which 1) affords controlled and limited information leakage and 2) avoids false matches resulting from data transformation. Initially, we partition the data sources into blocks to eliminate comparisons for records that are unlikely to match. Then, to identify matches, we apply an efficient SMC technique between the candidate record pairs. To enable efficiency and privacy, our model leaks a controlled amount of obfuscated data prior to the secure computations. Applied obfuscation relies on differential privacy, which provides strong privacy guarantees against adversaries with arbitrary background knowledge. In addition, we illustrate the practical nature of our approach through an empirical analysis with data derived from public voter records.
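The first two stages, blocking to prune comparisons and releasing differentially private counts, can be sketched as follows. The blocking key, records, and parameters are illustrative assumptions, and the SMC comparison step within candidate blocks is omitted.

```python
import math
import random

def blocks(records, key=lambda r: r[0][0].lower()):
    """Partition records into blocks (here: first letter of surname) so the
    expensive secure comparisons run only within candidate blocks."""
    out = {}
    for rec in records:
        out.setdefault(key(rec), []).append(rec)
    return out

def noisy_count(n, epsilon=1.0, rng=random.Random(42)):
    """Release a block size under Laplace noise (sensitivity 1): the
    differential-privacy mechanism that bounds what the leaked obfuscated
    data can reveal about any one individual."""
    u = rng.random() - 0.5  # inverse-CDF sampling of Laplace(0, 1/epsilon)
    return n - (1.0 / epsilon) * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

party_blocks = blocks([("Smith", 1970), ("Sanchez", 1982), ("Doe", 1990)])
released = {key: noisy_count(len(v)) for key, v in party_blocks.items()}
```

Only the noisy counts in `released` would be exchanged before the secure computation, so the leak is controlled by `epsilon` rather than by trust in the other party.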
Sinden, Kathryn; MacDermid, Joy C
2014-03-01
Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW program development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views and personal reflections recorded throughout the case. Existing knowledge sources including local data, stakeholder experiences, policies and priorities were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and the synthesis of multiple knowledge sources were identified strengths. The KTA Action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and knowledge synthesis. Although the themes of the KTA framework were confirmed in our analysis, modification of the order of the KTA components was required. Despite a complex context with divergent perspectives, successful implementation of a draft PDA was achieved.
The KTA framework facilitated knowledge synthesis and PDA development, but specific standards and modifications to the KTA framework were needed to enhance process structure. Flexibility for modification and integration of PDA practice guidelines were identified as assets of the KTA framework during its application.
SSA Building Blocks - Transforming Your Data and Applications into Operational Capability
NASA Astrophysics Data System (ADS)
Buell, D.; Hawthorne, Shayn L.; Higgins, J.
The Electronic Systems Center's 850th Electronic Systems Group (850 ELSG) is currently using a Service Oriented Architecture (SOA) to rapidly create net-centric experimental prototypes. This SOA has been utilized effectively across diverse mission areas, such as global air operations and rapid sensor tasking for improved space event management. The 850 ELSG has deployed a working, accredited SOA on the SIPRNET and provided real-time space information to five separate distributed operations centers. The 850 ELSG has learned first-hand the power of SOAs for integrating DoD and non-DoD SSA data in a rapid and agile manner, allowing capabilities to be fielded and sensors to be integrated in weeks instead of months. This opens a world of opportunity to integrate university data and experimental or proof-of-concept data with sensitive sensors and sources to support developing an array of SSA products for approved users in and outside of the space community. This paper will identify how new capabilities can be proactively developed to rapidly answer critical needs when SOA methodologies are employed and identifies the operational utility and the far-reaching benefits realized by implementing a service-oriented architecture. We offer a new paradigm for how data and application producers' contributions are presented for the rest of the community to leverage.
Integrative Genomic Analyses Yield Cell Cycle Regulatory Programs with Prognostic Value
Cheng, Chao; Lou, Shaoke; Andrews, Erik H.; Ung, Matthew H.; Varn, Frederick S.
2016-01-01
Liposarcoma is the second most common form of sarcoma and has been categorized into four molecular subtypes, which are associated with differential prognosis of patients. However, the transcriptional regulatory programs associated with distinct histological and molecular subtypes of liposarcoma have not been investigated. This study uses integrative analyses to systematically define the transcriptional regulatory programs associated with liposarcoma. In addition, computational methods are used to identify regulatory programs associated with different liposarcoma subtypes as well as programs that are predictive of prognosis. Further analysis of curated gene sets was used to identify prognostic gene signatures. The integration of data from a variety of sources including gene expression profiles, transcription factor (TF) binding data from ChIP-seq experiments, curated gene sets, and clinical information of patients indicated discrete regulatory programs (e.g., controlled by E2F1 and E2F4) with significantly different regulatory activity in one or multiple subtypes of liposarcoma with respect to normal adipose tissue. These programs were also shown to be prognostic, wherein liposarcoma patients with higher E2F4 or E2F1 activity were associated with unfavorable prognosis. A total of 259 gene sets were significantly associated with patient survival in liposarcoma, among which >50% are involved in cell cycle and proliferation. PMID:26856934
Chiang, Rachelle Johnsson; Meagher, Whitney; Slade, Sean
2015-01-01
BACKGROUND The Whole School, Whole Community, Whole Child (WSCC) model calls for greater collaboration across the community, school, and health sectors to meet the needs and support the full potential of each child. This article reports on how 3 states and 2 local school districts have implemented aspects of the WSCC model through collaboration, leadership and policy creation, alignment, and implementation. METHODS We searched state health and education department websites, local school district websites, state legislative databases, and sources of peer-reviewed and gray literature to identify materials demonstrating adoption and implementation of coordinated school health, the WSCC model, and associated policies and practices in identified states and districts. We conducted informal interviews in each state and district to reinforce the document review. RESULTS States and local school districts have been able to strategically increase collaboration, integration, and alignment of health and education through the adoption and implementation of policy and practice supporting the WSCC model. Successful utilization of the WSCC model has led to substantial positive changes in school health environments, policies, and practices. CONCLUSIONS Collaboration among health and education sectors to integrate and align services may lead to improved efficiencies and better health and education outcomes for students. PMID:26440819
Integration of Lead Discovery Tactics and the Evolution of the Lead Discovery Toolbox.
Leveridge, Melanie; Chung, Chun-Wa; Gross, Jeffrey W; Phelps, Christopher B; Green, Darren
2018-06-01
There has been much debate around the success rates of various screening strategies to identify starting points for drug discovery. Although high-throughput target-based and phenotypic screening has been the focus of this debate, techniques such as fragment screening, virtual screening, and DNA-encoded library screening are also increasingly reported as a source of new chemical equity. Here, we provide examples in which integration of more than one screening approach has improved the campaign outcome and discuss how strengths and weaknesses of various methods can be used to build a complementary toolbox of approaches, giving researchers the greatest probability of successfully identifying leads. Among others, we highlight case studies for receptor-interacting serine/threonine-protein kinase 1 and the bromo- and extra-terminal domain family of bromodomains. In each example, the unique insight or chemistries individual approaches provided are described, emphasizing the synergy of information obtained from the various tactics employed and the particular question each tactic was employed to answer. We conclude with a short prospective discussing how screening strategies are evolving, what this screening toolbox might look like in the future, how to maximize success through integration of multiple tactics, and scenarios that drive selection of one combination of tactics over another.
Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A
2015-10-09
Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. On the other hand, ICPs within the DIP subgroup decreased in collaboration processes and had the lowest overall effectiveness rates. 
ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholders' perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisation and system levels can be aligned by both trust-based and control-based collaboration processes.
Automated Ontology Alignment with Fuselets for Community of Interest (COI) Integration
2008-09-01
Figure 7, Federated Search Example; Figure 8, Federated Search Example Revisited: integrating information from various sources through a single query. This is the traditional federated search problem, where the sources don't … For the data sources in the federated search example, the ontologies align in a fairly straightforward manner.
2012-01-01
Background: Tardigrades are multicellular organisms, resistant to extreme environmental changes such as heat, drought, radiation and freezing. They outlast these conditions in an inactive form (tun) to escape damage to cellular structures and cell death. Tardigrades are apparently able to prevent or repair such damage and are therefore a crucial model organism for stress tolerance. Cultures of the tardigrade Milnesium tardigradum were dehydrated by removing the surrounding water to induce tun formation. During this process and the subsequent rehydration, metabolites were measured in a time series by GC-MS. Additionally, expressed sequence tags are available, especially libraries generated from the active and inactive state. The aim of this integrated analysis is to trace changes in tardigrade metabolism and identify pathways responsible for their extreme resistance against physical stress. Results: In this study we propose a novel integrative approach for the analysis of metabolic networks to identify modules of joint shifts on the transcriptomic and metabolic levels. We derive a tardigrade-specific metabolic network represented as an undirected graph with 3,658 nodes (metabolites) and 4,378 edges (reactions). Time course metabolite profiles are used to score the network nodes showing a significant change over time. The edges are scored according to information on enzymes from the EST data. Using this combined information, we identify a key subnetwork (functional module) of concerted changes in metabolic pathways, specific for de- and rehydration. The module is enriched in reactions showing significant changes in metabolite levels and enzyme abundance during the transition. It resembles the cessation of a measurable metabolism (e.g. glycolysis and amino acid anabolism) during tun formation, the production of storage metabolites and bioprotectants, such as DNA stabilizers, and the generation of amino acids and cellular components from monosaccharides as carbon and energy source during rehydration. Conclusions: The functional module identifies relationships among changed metabolites (e.g. spermidine) and reactions and provides first insights into important altered metabolic pathways. With sparse and diverse data available, the presented integrated metabolite network approach is suitable to integrate all existing data and analyse it in a combined manner. PMID:22713133
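The node- and edge-scoring idea in this record can be illustrated with a small greedy search for a high-scoring connected subnetwork. This is a didactic sketch only: the metabolite names, scores, and the greedy heuristic are invented, while the study itself applies its own module-finding algorithm to the full 3,658-node network.

```python
# Toy sketch of finding a high-scoring "functional module": metabolites
# (nodes) carry scores from time-course significance, reactions (edges)
# carry scores from enzyme evidence, and a connected subnetwork with a
# high summed score is grown greedily. All names and scores are invented.

def find_module(node_score, edge_score, adjacency):
    """Greedily grow a connected subnetwork, adding a neighbouring node
    whenever its node score plus connecting edge scores is positive."""
    seed = max(node_score, key=node_score.get)   # best-scoring metabolite
    module = {seed}
    improved = True
    while improved:
        improved = False
        frontier = {n for m in module for n in adjacency[m]} - module
        for cand in sorted(frontier):
            gain = node_score[cand] + sum(
                edge_score.get(frozenset((cand, m)), 0.0)
                for m in module if cand in adjacency[m])
            if gain > 0:
                module.add(cand)
                improved = True
    return module

node_score = {'trehalose': 2.1, 'glucose': 1.4, 'spermidine': 1.9,
              'citrate': -0.8, 'atp': -1.2}
adjacency = {'trehalose': {'glucose'},
             'glucose': {'trehalose', 'citrate', 'spermidine'},
             'spermidine': {'glucose', 'atp'},
             'citrate': {'glucose'},
             'atp': {'spermidine'}}
edge_score = {frozenset(('trehalose', 'glucose')): 0.5,
              frozenset(('glucose', 'spermidine')): 0.3,
              frozenset(('glucose', 'citrate')): -0.4,
              frozenset(('spermidine', 'atp')): -0.6}
module = find_module(node_score, edge_score, adjacency)
# -> {'trehalose', 'glucose', 'spermidine'}: negatively scored nodes stay out
```

Negative scores on unchanged metabolites and weakly supported reactions are what keep the module compact instead of swallowing the whole network.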
Beisser, Daniela; Grohme, Markus A; Kopka, Joachim; Frohme, Marcus; Schill, Ralph O; Hengherr, Steffen; Dandekar, Thomas; Klau, Gunnar W; Dittrich, Marcus; Müller, Tobias
2012-06-19
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Edward J., Jr.; Henry, Karen Lynne
Sandia National Laboratories develops technologies to: (1) sustain, modernize, and protect our nuclear arsenal; (2) prevent the spread of weapons of mass destruction; (3) provide new capabilities to our armed forces; (4) protect our national infrastructure; (5) ensure the stability of our nation's energy and water supplies; and (6) defend our nation against terrorist threats. We identified the need for a single overarching Integrated Workplace Management System (IWMS) that would enable us to focus on customer missions and improve FMOC processes. Our team selected highly configurable commercial-off-the-shelf (COTS) software with out-of-the-box workflow processes that integrate strategic planning, project management, facility assessments, and space management, and can interface with existing systems, such as Oracle, PeopleSoft, Maximo, Bentley, and FileNet. We selected the Integrated Workplace Management System (IWMS) from Tririga, Inc. Facility Management System (FMS) benefits are: (1) create a single reliable source for facility data; (2) improve transparency with oversight organizations; (3) streamline FMOC business processes with a single, integrated facility-management tool; (4) give customers simple tools and real-time information; (5) reduce indirect costs; (6) replace approximately 30 FMOC systems and 60 homegrown tools (such as Microsoft Access databases); and (7) integrate with FIMS.
Wang, Qingguo; Jia, Peilin; Zhao, Zhongming
2015-01-01
Fueled by the widespread application of high-throughput next generation sequencing (NGS) technologies and the urgent need to counter threats of pathogenic viruses, large-scale studies were conducted recently to investigate virus integration in host genomes (for example, human tumor genomes) that may cause carcinogenesis or other diseases. A limiting factor in these studies, however, is rapid virus evolution and the resulting polymorphisms, which prevent reads from aligning readily to commonly used virus reference genomes and, accordingly, make virus integration sites difficult to detect. Another confounding factor is host genomic instability as a result of virus insertions. To tackle these challenges and improve our capability to identify cryptic virus-host fusions, we present a new approach that detects Virus intEgration sites through iterative Reference SEquence customization (VERSE). To the best of our knowledge, VERSE is the first approach to improve detection through customizing reference genomes. Using 19 human tumors and cancer cell lines as test data, we demonstrated that VERSE substantially enhanced the sensitivity of virus integration site detection. VERSE is implemented in the open source package VirusFinder 2 that is available at http://bioinfo.mc.vanderbilt.edu/VirusFinder/.
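The general principle behind iterative reference-sequence customization, updating the reference toward the sequenced sample so that reads from a diverged strain begin to align, can be sketched with a toy consensus loop. The sequences and parameters below are invented for illustration; this is not the VirusFinder 2 implementation.

```python
# Toy illustration of iterative reference customization: align short reads
# to a reference by minimum Hamming distance, rebuild the reference from a
# per-position majority vote of the aligned reads, and repeat until stable.
from collections import Counter

def best_alignment(read, ref):
    """(mismatches, offset) of the best ungapped placement of `read`."""
    scored = [(sum(a != b for a, b in zip(read, ref[i:i + len(read)])), i)
              for i in range(len(ref) - len(read) + 1)]
    return min(scored)

def customize_reference(ref, reads, max_mismatch=2, rounds=5):
    for _ in range(rounds):
        votes = [Counter() for _ in ref]
        for read in reads:
            mism, off = best_alignment(read, ref)
            if mism <= max_mismatch:             # only confident alignments vote
                for j, base in enumerate(read):
                    votes[off + j][base] += 1
        new_ref = ''.join(v.most_common(1)[0][0] if v else base
                          for v, base in zip(votes, ref))
        if new_ref == ref:                       # converged
            break
        ref = new_ref
    return ref

# Reads drawn from a variant strain (T->A at position 3 of the reference):
reference = "ACGTACGTAC"
reads = ["ACGAAC", "GAACGT", "ACGTAC"]
customized = customize_reference(reference, reads)   # -> "ACGAACGTAC"
```

After one round the majority vote moves the reference to the variant's sequence, so subsequent alignments (and, in the real tool, integration-site calls at the virus-host boundary) no longer lose reads to strain polymorphisms.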
Collin, J; Blais, R; White, D; Demers, A; Desbiens, F
2000-01-01
This paper reports on one aspect of the evaluation of the midwifery pilot projects in Quebec: the identification of the professional and organizational factors, as well as the mode of integrating midwives into the maternity care system, that would promote the best outcomes and the autonomy of midwives. The research strategy involved a multiple-case study, in which each midwifery pilot project represented a case. Based on a qualitative approach, the study employed various sources of data: individual interviews and focus groups with key informants, site observations and analyses of written documents. Results show that midwives were poorly integrated into the health care system during the evaluation. Four main reasons were identified: lack of knowledge about the practice of midwifery on the part of other health care providers; deficiencies in the legal and organizational structure of the pilot projects; competition over professional territories; and gaps between the midwives' and other providers' professional cultures. Recommendations are provided to facilitate the integration of midwives into the health care system.
An integrated fiberoptic-microfluidic device for agglutination detection and blood typing.
Ramasubramanian, Melur K; Alexander, Stewart P
2009-02-01
In this paper, an integrated fiberoptic-microfluidic device for the detection of agglutination for blood type cross-matching is described. The device consists of a straight microfluidic channel through which a reacted RBC suspension is pumped with the help of a syringe pump. The flow intersects an optical path created by an emitter-receiver fiber optic pair integrated into the microfluidic device. A 650 nm laser diode is used as the light source and a silicon photodiode is used to detect the light intensity. The spacing between the tips of the two optic fibers can be adjusted. When the fiber spacing is large and the concentration of the suspension is high, scattering becomes the dominant mechanism for agglutination detection, while at low concentrations and small spacing, optointerruption becomes the dominant mechanism. An agglutination strength factor (ASF) is calculated from the data. Studies with a variety of blood types indicate that the sensing method correctly identifies the agglutination reaction in all cases. A disposable integrated device can be designed for future implementation of the method for near-bedside pre-transfusion checks.
Application Agreement and Integration Services
NASA Technical Reports Server (NTRS)
Driscoll, Kevin R.; Hall, Brendan; Schweiker, Kevin
2013-01-01
Application agreement and integration services are required by distributed, fault-tolerant, safety-critical systems to assure required performance. An analysis of distributed and hierarchical agreement strategies is developed against the backdrop of observed agreement failures in fielded systems. The documented work was performed under NASA Task Order NNL10AB32T, Validation and Verification of Safety-Critical Integrated Distributed Systems, Area 2. This document is intended to satisfy the requirements for deliverable 5.2.11 under Task 4.2.2.3. This report discusses the challenges of maintaining application agreement and integration services. A literature search is presented that documents previous work in the area of replica determinism. Sources of non-deterministic behavior are identified, and examples are presented where system-level agreement failed to be achieved. We then explore how TTEthernet services can be extended to supply some interesting application agreement frameworks. This document assumes that the reader is familiar with the TTEthernet protocol. The reader is advised to read the TTEthernet protocol standard [1] before reading this document. This document does not reiterate the content of the standard.
The Integrated Radiation Mapper Assistant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlton, R.E.; Tripp, L.R.
1995-03-01
The Integrated Radiation Mapper Assistant (IRMA) system combines state-of-the-art radiation sensors and microprocessor-based analysis techniques to perform radiation surveys. Control of the survey function is from a control station located outside the radiation area, thus reducing time spent in radiation areas performing radiation surveys. The system consists of a directional radiation sensor, a laser range finder, two area radiation sensors, and a video camera mounted on a pan-and-tilt platform. This sensor package is deployable on a remotely operated vehicle. The outputs of the system are radiation intensity maps identifying both radiation source intensities and radiation levels throughout the room being surveyed. After completion of the survey, the data can be removed from the control station computer for further analysis or archiving.
NASA Astrophysics Data System (ADS)
Baynes, K.; Gilman, J.; Pilone, D.; Mitchell, A. E.
2015-12-01
The NASA EOSDIS (Earth Observing System Data and Information System) Common Metadata Repository (CMR) is a continuously evolving metadata system that merges all existing capabilities and metadata from the EOS ClearingHouse (ECHO) and the Global Change Master Directory (GCMD) systems. This flagship catalog has been developed with several key requirements: fast search and ingest performance; the ability to integrate heterogeneous external inputs and outputs; high availability and resiliency; scalability; and evolvability and expandability. This talk will focus on the advantages and potential challenges of tackling these requirements using a microservices architecture, which decomposes system functionality into smaller, loosely coupled, individually scalable elements that communicate via well-defined APIs. In addition, time will be spent examining specific elements of the CMR architecture and identifying opportunities for future integrations.
A MoTe2 based light emitting diode and photodetector for silicon photonic integrated circuits
NASA Astrophysics Data System (ADS)
Bie, Ya-Qing; Heuck, M.; Grosso, G.; Furchi, M.; Cao, Y.; Zheng, J.; Navarro-Moratalla, E.; Zhou, L.; Taniguchi, T.; Watanabe, K.; Kong, J.; Englund, D.; Jarillo-Herrero, P.
A key challenge in photonics today is to address the interconnect bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, partly because many components, such as waveguides, interferometers, and modulators, can be integrated on silicon-based processors. However, light sources and photodetectors present continued challenges. Common approaches for light sources include off-chip or wafer-bonded lasers based on III-V materials, but studies show advantages for directly modulated light sources. The most advanced photodetectors in silicon photonics are based on germanium growth, which increases system cost. The emerging two-dimensional transition metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with CMOS processing by back-end-of-the-line processing steps. Here we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared band gap. The state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.
Better Assessment Science Integrating Point and Non-point Sources (BASINS)
Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.
BioPortal: An Open-Source Community-Based Ontology Repository
NASA Astrophysics Data System (ADS)
Noy, N.; NCBO Team
2011-12-01
Advances in computing power and new computational techniques have changed the way researchers approach science. In many fields, one of the most fruitful approaches has been to use semantically aware software to break down the barriers among disparate domains, systems, data sources, and technologies. Such software facilitates data aggregation, improves search, and ultimately allows the detection of new associations that were previously not detectable. Achieving these analyses requires software systems that take advantage of the semantics and that can intelligently negotiate domains and knowledge sources, identifying commonality across systems that use different and conflicting vocabularies, while understanding apparent differences that may be concealed by the use of superficially similar terms. An ontology, a semantically rich vocabulary for a domain of interest, is the cornerstone of software for bridging systems, domains, and resources. However, as ontologies become the foundation of all semantic technologies in e-science, we must develop an infrastructure for sharing ontologies, finding and evaluating them, integrating and mapping among them, and using ontologies in applications that help scientists process their data. BioPortal [1] is an open-source on-line community-based ontology repository that has been used as a critical component of semantic infrastructure in several domains, including biomedicine and bio-geochemical data. BioPortal uses social approaches in the Web 2.0 style to bring structure and order to the collection of biomedical ontologies. It enables users to provide and discuss a wide array of knowledge components, from submitting the ontologies themselves, to commenting on and discussing classes in the ontologies, to reviewing ontologies in the context of their own ontology-based projects, to creating mappings between overlapping ontologies and discussing and critiquing the mappings. 
Critically, it provides web-service access to all its content, enabling its integration in semantically enriched applications. [1] Noy, N.F., Shah, N.H., et al., BioPortal: ontologies and integrated data resources at the click of a mouse. Nucleic Acids Res, 2009. 37(Web Server issue): p. W170-3.
IGR J17329-2731: The birth of a symbiotic X-ray binary
NASA Astrophysics Data System (ADS)
Bozzo, E.; Bahramian, A.; Ferrigno, C.; Sanna, A.; Strader, J.; Lewis, F.; Russell, D. M.; di Salvo, T.; Burderi, L.; Riggio, A.; Papitto, A.; Gandhi, P.; Romano, P.
2018-05-01
We report on the results of the multiwavelength campaign carried out after the discovery of the INTEGRAL transient IGR J17329-2731. The optical data collected with the SOAR telescope allowed us to identify the donor star in this system as a late M giant at a distance of 2.7 (+3.4, −1.2) kpc. The data collected quasi-simultaneously with XMM-Newton and NuSTAR showed the presence of a modulation with a period of 6680 ± 3 s in the X-ray light curves of the source. This unveils that the compact object hosted in this system is a slowly rotating neutron star. The broadband X-ray spectrum showed the presence of a strong absorption (≫10²³ cm⁻²) and prominent emission lines at 6.4 keV and 7.1 keV. These features are usually found in wind-fed systems, in which the emission lines result from the fluorescence of the X-rays from the accreting compact object on the surrounding stellar wind. The presence of a strong absorption line around 21 keV in the spectrum suggests a cyclotron origin, thus allowing us to estimate the neutron star magnetic field as 2.4 × 10¹² G. All evidence thus suggests that IGR J17329-2731 is a symbiotic X-ray binary. As no X-ray emission was ever observed from the location of IGR J17329-2731 by INTEGRAL (or other X-ray facilities) during the past 15 yr in orbit, and considering that symbiotic X-ray binaries are known to be variable but persistent X-ray sources, we conclude that INTEGRAL caught the first detectable X-ray emission from IGR J17329-2731 when the source shone as a symbiotic X-ray binary. The Swift XRT monitoring performed up to 3 months after the discovery of the source showed that it maintained a relatively stable X-ray flux and spectral properties.
Spinning-Up: the Case of the Symbiotic X-Ray Binary 3A 1954+319
NASA Technical Reports Server (NTRS)
Fuerst, F.; Marcu, D. M.; Pottschmidt, K.; Grinberg, V.; Wilms, J.; CadolleBel, M.
2011-01-01
We present a timing and spectral analysis of the variable X-ray source 3A 1954+319. Our analysis is mainly based on an outburst serendipitously observed during INTEGRAL Key Program observations of the Cygnus region in fall 2008 and on the Swift/BAT long-term light curve. Previous observations, though sparse, have identified the source as one of only nine known symbiotic X-ray binaries, i.e., systems composed of an accreting neutron star orbiting in a highly inhomogeneous medium around an M-giant companion. The spectrum of 3A 1954+319 above 20 keV can be best described by a broken power law model. The extremely long pulse period of approximately 5.3 hours is clearly visible in the INTEGRAL/ISGRI light curve and confirmed through an epoch folding period search. Furthermore, the light curve allows us to determine a very strong spin-up of −2 × 10⁻⁴ h h⁻¹ during the outburst. This spin-up is confirmed by the pulse period evolution calculated from Swift/BAT data. The Swift/BAT data also show a long spin-down trend prior to the 2008 outburst, which is confirmed in archival INTEGRAL/ISGRI data. We discuss possible accretion models and geometries allowing for the transfer of such large amounts of angular momentum and investigate the harder spectrum of this outburst compared to previously published results.
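An epoch folding period search of the kind used here to confirm the ~5.3 h pulse period can be sketched in a few lines: fold the light curve on each trial period, bin by phase, and pick the period whose folded profile deviates most from a constant rate. The light curve below is synthetic and the parameters are invented; this is not the INTEGRAL/Swift analysis pipeline.

```python
# Minimal epoch folding period search on a synthetic pulsed light curve.
import math

def folded_chi2(times, rates, period, nbins=8):
    """Chi-square-like statistic: how far the phase-binned profile
    deviates from the overall mean rate (large = strong pulsation)."""
    bins = [[] for _ in range(nbins)]
    for t, r in zip(times, rates):
        phase = (t % period) / period
        bins[min(int(phase * nbins), nbins - 1)].append(r)
    mean = sum(rates) / len(rates)
    return sum(len(b) * (sum(b) / len(b) - mean) ** 2 for b in bins if b)

# Synthetic light curve: 5.3 h sinusoidal pulsations, sampled every 0.1 h
# for 200 h (invented numbers, noiseless for clarity).
times = [0.1 * i for i in range(2000)]
rates = [10.0 + 5.0 * math.sin(2 * math.pi * t / 5.3) for t in times]

trials = [4.0 + 0.05 * i for i in range(61)]          # trial periods, 4.0-7.0 h
best = max(trials, key=lambda p: folded_chi2(times, rates, p))
```

At the true period the folded profile reproduces the pulse shape, so the statistic peaks there; at wrong trial periods the phases smear and the profile flattens toward the mean.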
Spinning-Up: The Case of the Symbiotic X-Ray Binary 3A 1954+319
NASA Technical Reports Server (NTRS)
Fuerst, F.; Marcu, D. M.; Pottschmidt, K.; Grinberg, V.; Wilms, J.; Bel, M. Cadolle
2010-01-01
We present a timing and spectral analysis of the variable X-ray source 3A 1954+319. Our analysis is mainly based on an outburst serendipitously observed during INTEGRAL Key Program observations of the Cygnus region in fall 2008 and on the Swift/BAT long-term light curve. Previous observations, though sparse, have identified the source as one of only nine known symbiotic X-ray binaries, i.e., systems composed of an accreting neutron star orbiting in a highly inhomogeneous medium around an M-giant companion. The spectrum of 3A 1954+319 above 20 keV can be best described by a broken power law model. The extremely long pulse period of approximately 5.3 hours is clearly visible in the INTEGRAL/ISGRI light curve and confirmed through an epoch folding period search. Furthermore, the light curve allows us to determine a very strong spin-up of −2 × 10⁻⁴ h h⁻¹ during the outburst. This spin-up is confirmed by the pulse period evolution calculated from Swift/BAT data. The Swift/BAT data also show a long spin-down trend prior to the 2008 outburst, which is confirmed in archival INTEGRAL/ISGRI data. We discuss possible accretion models and geometries allowing for the transfer of such large amounts of angular momentum and investigate the harder spectrum of this outburst compared to previously published results.
Spectral Radiance of a Large-Area Integrating Sphere Source
Walker, James H.; Thompson, Ambler
1995-01-01
The radiance and irradiance calibration of large field-of-view scanning and imaging radiometers for remote sensing and surveillance applications has resulted in the development of novel calibration techniques. One of these techniques is the employment of large-area integrating sphere sources as radiance or irradiance secondary standards. To assist the National Aeronautics and Space Administration's space-based ozone measurement program, a commercially available large-area internally illuminated integrating sphere source's spectral radiance was characterized in the wavelength region from 230 nm to 400 nm at the National Institute of Standards and Technology. Spectral radiance determinations and spatial mappings of the source indicate that carefully designed large-area integrating sphere sources can be measured with a 1 % to 2 % expanded uncertainty (two-standard-deviation estimate) in the near ultraviolet, with spatial nonuniformities of 0.6 % or smaller across a 20 cm diameter exit aperture. A method is proposed for the calculation of the final radiance uncertainties of the source which includes the field of view of the instrument being calibrated. PMID:29151725
A monolithically integrated polarization entangled photon pair source on a silicon chip
Matsuda, Nobuyuki; Le Jeannic, Hanna; Fukuda, Hiroshi; Tsuchizawa, Tai; Munro, William John; Shimizu, Kaoru; Yamada, Koji; Tokura, Yasuhiro; Takesue, Hiroki
2012-01-01
Integrated photonic circuits are one of the most promising platforms for large-scale photonic quantum information systems due to their small physical size and stable interferometers with near-perfect lateral-mode overlaps. Since many quantum information protocols are based on qubits defined by the polarization of photons, we must develop integrated building blocks to generate, manipulate, and measure the polarization-encoded quantum state on a chip. The generation unit is particularly important. Here we show the first integrated polarization-entangled photon pair source on a chip. We have implemented the source as a simple and stable silicon-on-insulator photonic circuit that generates an entangled state with 91 ± 2% fidelity. The source is equipped with versatile interfaces for silica-on-silicon or other types of waveguide platforms that accommodate the polarization manipulation and projection devices as well as pump light sources. Therefore, we are ready for the full-scale implementation of photonic quantum information systems on a chip. PMID:23150781
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuemer, T.O.; Doan, L.; Su, C.W.
2000-07-01
A Compact Integrated Narcotics Detection Instrument (CINDI) has been developed at NOVA R and D, Inc., in cooperation with the US Coast Guard. This detector utilizes neutrons emitted from ²⁵²Cf. Neutrons emitted from the front face of CINDI penetrate dense compartment barrier materials with little change in energy but are backscattered by hydrogen-rich materials such as drugs. The backscattered neutrons are detected, and the rate is displayed by a microprocessor-controller integrated into CINDI. The operator guides the detector along a suspected area and receives immediate feedback from the state-of-the-art electronics. For user safety, the device incorporates a highly sensitive detection scheme to permit the use of a very weak radioactive source, without compromising detectability. CINDI is capable of detecting narcotics effectively behind panels made of steel, wood, fiberglass, or even lead-lined materials. This makes it useful for inspecting marine vessels, ship bulkheads, automobiles, structure walls, or small sealed containers. Figure 2 shows three views of the CINDI instrument. CINDI responds strongly to hydrogen-rich materials such as narcotics. It has been tested at NOVA, the US Coast Guard, and Brewt Power Systems. The results of the tests show excellent response and specificity to narcotics. CINDI has led to a new technology that shows promise for identifying concealed contraband. The new technique uses a fusion of two independent but complementary signals for detecting and possibly identifying concealed drugs in a variety of carriers such as vehicles, marine vessels, airplanes, containers, cargo, and luggage. The carriers will be scanned using both neutron and gamma-ray sources. The signals from both neutron and gamma-ray backscattering and/or transmission can be used simultaneously to detect and possibly identify the contraband the system has been trained for.
A system that can produce three-dimensional images for both signals may also be developed. The two images may be combined and analyzed by a fast host computer to detect concealed contraband. The two independent signatures, when analyzed simultaneously, may help determine the type of concealed contraband.
VizieR Online Data Catalog: ALMA 106GHz continuum observations in Chamaeleon I (Dunham+, 2016)
NASA Astrophysics Data System (ADS)
Dunham, M. M.; Offner, S. S. R.; Pineda, J. E.; Bourke, T. L.; Tobin, J. J.; Arce, H. G.; Chen, X.; Di Francesco, J.; Johnstone, D.; Lee, K. I.; Myers, P. C.; Price, D.; Sadavoy, S. I.; Schnee, S.
2018-02-01
We obtained ALMA observations of every source in Chamaeleon I detected in the single-dish 870 μm LABOCA survey by Belloche et al. (2011, J/A+A/527/A145), except for those listed as likely artifacts (1 source), residuals from bright sources (7 sources), or detections tentatively associated with YSOs (3 sources). We observed 73 sources from the initial list of 84 objects identified by Belloche et al. (2011, J/A+A/527/A145). We observed the 73 pointings using the ALMA Band 3 receivers during its Cycle 1 campaign between 2013 November 29 and 2014 March 08. Between 25 and 27 antennas were available for our observations, with the array configured in a relatively compact configuration to provide a resolution of approximately 2" FWHM (300 AU at the distance to Chamaeleon I). Each target was observed in a single pointing with approximately 1 minute of on-source integration time. Three out of the four available spectral windows were configured to measure the continuum at 101, 103, and 114 GHz, each with a bandwidth of 2 GHz, for a total continuum bandwidth of 6 GHz (2.8 mm) at a central frequency of 106 GHz. (2 data files).
Torres Astorga, Romina; de Los Santos Villalobos, Sergio; Velasco, Hugo; Domínguez-Quintero, Olgioly; Pereira Cardoso, Renan; Meigikos Dos Anjos, Roberto; Diawara, Yacouba; Dercon, Gerd; Mabit, Lionel
2018-05-15
Identification of hot spots of land degradation is strongly related to the selection of soil tracers for sediment pathways. This research proposes the complementary and integrated application of two analytical techniques to select the most suitable fingerprint tracers for identifying the main sources of sediments in an agricultural catchment located in Central Argentina with erosive loess soils. Diffuse reflectance Fourier transform mid-infrared (DRIFT-MIR) spectroscopy and energy-dispersive X-ray fluorescence (EDXRF) were used for a suitable fingerprint selection. To use DRIFT-MIR spectroscopy as a fingerprinting technique, calibration through quantitative parameters is needed to link and correlate DRIFT-MIR spectra with soil tracers. EDXRF was used in this context to determine the concentrations of geochemical elements in soil samples. The selected tracers were confirmed using two artificial mixtures composed of known proportions of soil collected at different sites with distinctive soil uses. These fingerprint elements were used as parameters to build a predictive model with the whole set of DRIFT-MIR spectra. Fingerprint elements such as phosphorus, iron, calcium, barium, and titanium were identified as suitable for reconstructing the source proportions in the artificial mixtures. Mid-infrared spectra produced successful prediction models (R² = 0.91) for Fe content and moderately useful predictions (R² = 0.72) for Ti content. For Ca, P, and Ba, the R² values were 0.44, 0.58, and 0.59, respectively.
Heekes, Alexa; Tiffin, Nicki; Dane, Pierre; Mutemaringa, Themba; Smith, Mariette; Zinyakatira, Nesbert; Barron, Peter; Seebregts, Chris; Boulle, Andrew
2018-01-01
Information systems designed to support health promotion in pregnancy, such as the MomConnect programme, are potential sources of clinical information which can be used to identify pregnancies prospectively and early on. In this paper we demonstrate the feasibility and value of linking records collected through the MomConnect programme, to an emergent province-wide health information exchange in the Western Cape Province of South Africa, which already enumerates pregnancies from a range of other clinical data sources. MomConnect registrations were linked to pregnant women known to the public health services using the limited identifiers collected by MomConnect. Three-quarters of MomConnect registrations could be linked to existing pregnant women, decreasing over time as recording of the national identifier decreased. The MomConnect records were usually the first evidence of pregnancy in pregnancies which were subsequently confirmed by other sources. Those at lower risk of adverse pregnancy outcomes were more likely to register. In some cases, MomConnect was the only evidence of pregnancy for a patient. In addition, the MomConnect records provided gestational age information and new and more recently updated contact numbers to the existing contact registry. The pilot integration of the data in the Western Cape Province of South Africa demonstrates how a client-facing system can augment clinical information systems, especially in contexts where electronic medical records are not widely available. PMID:29713507
A Semantic Web Management Model for Integrative Biomedical Informatics
Deus, Helena F.; Stanislaus, Romesh; Veiga, Diogo F.; Behrens, Carmen; Wistuba, Ignacio I.; Minna, John D.; Garner, Harold R.; Swisher, Stephen G.; Roth, Jack A.; Correa, Arlene M.; Broom, Bradley; Coombes, Kevin; Chang, Allen; Vogel, Lynn H.; Almeida, Jonas S.
2008-01-01
Background Data, data everywhere. The diversity and magnitude of the data generated in the life sciences defy automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high-throughput molecular data. Methodology/Principal Findings The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical knowledge engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available as open source at www.s3db.org, was developed, and its proposed design has been made publicly available as an open-source instrument for shared, distributed data management. Conclusions/Significance Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development, we can expect that both general-purpose productivity software and domain-specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis. PMID:18698353
Clinical data integration of distributed data sources using Health Level Seven (HL7) v3-RIM mapping
2011-01-01
Background Health information exchange and health information integration have become top priorities for healthcare systems across institutions and hospitals. Most organizations and establishments implement health information exchange and integration in order to support meaningful information retrieval among their disparate healthcare systems. The challenges that prevent efficient health information integration across heterogeneous data sources are the lack of a common standard to support mapping across distributed data sources and the numerous and diverse healthcare domains. Health Level Seven (HL7) is a standards development organization; its Reference Information Model (RIM), developed by HL7's technical committees, is a standardized abstract representation of HL7 data across all the domains of health care. In this article, we present a design and a prototype implementation of HL7 v3-RIM mapping for information integration of distributed clinical data sources. The implementation enables the user to retrieve and search information that has been integrated using HL7 v3-RIM technology from disparate health care systems. Method and results We designed and developed a prototype implementation of an HL7 v3-RIM mapping function to integrate distributed clinical data sources, using R-MIM classes from HL7 v3-RIM as a global view along with a collaborative, centralized web-based mapping tool to tackle the evolution of both global and local schemas. Our prototype was implemented and integrated with a clinical data management system (CDMS) as a plug-in module. We tested the prototype system with use case scenarios for distributed clinical data sources across several legacy CDMSs.
The results have been effective in improving information delivery, completing tasks that would otherwise have been difficult to accomplish, and reducing the time required to finish tasks involved in collaborative information retrieval and sharing with other systems. Conclusions We created a prototype implementation of HL7 v3-RIM mapping for information integration between distributed clinical data sources to promote collaborative healthcare and translational research. The prototype has effectively and efficiently ensured the accuracy of the information and knowledge extraction for the systems that have been integrated. PMID:22104558
Santiago, Jose A; Potashkin, Judith A
2013-01-01
Shared dysregulated pathways may contribute to Parkinson's disease and type 2 diabetes, chronic diseases that afflict millions of people worldwide. Despite the evidence provided by epidemiological and gene profiling studies, the molecular and functional networks implicated in both diseases have not been fully explored. In this study, we used an integrated network approach to investigate the extent to which Parkinson's disease and type 2 diabetes are linked at the molecular level. Using a random walk algorithm within the human functional linkage network, we identified a molecular cluster of 478 neighboring genes closely associated with confirmed Parkinson's disease and type 2 diabetes genes. Biological and functional analysis identified protein serine-threonine kinase activity, the MAPK cascade, activation of the immune response, and insulin receptor and lipid signaling as convergent pathways. Integration of results from microarray studies identified a blood signature comprising seven genes whose expression is dysregulated in Parkinson's disease and type 2 diabetes. Among this group of genes is the amyloid precursor protein (APP), previously associated with neurodegeneration and insulin regulation. Quantification of RNA from whole blood of 192 samples from two independent clinical trials, the Harvard Biomarker Study (HBS) and the Prognostic Biomarker Study (PROBE), revealed that expression of APP is significantly upregulated in Parkinson's disease patients compared to healthy controls. Assessment of biomarker performance revealed that expression of APP could distinguish Parkinson's disease from healthy individuals with a diagnostic accuracy of 80% in both cohorts of patients. These results provide the first evidence that Parkinson's disease and diabetes are strongly linked at the molecular level and that shared molecular networks provide an additional source for identifying highly sensitive biomarkers.
Further, these results suggest for the first time that increased expression of APP in blood may modulate the neurodegenerative phenotype in type 2 diabetes patients.
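The random-walk step this abstract describes can be sketched as a random walk with restart over a toy network (illustrative only: the adjacency matrix, seed genes, and restart probability below are invented, and the study used the full human functional linkage network with its own parameters):

```python
import numpy as np

# Toy adjacency matrix for a 5-gene functional linkage network (hypothetical).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

# Column-normalize so each column sums to 1 (transition probabilities).
W = A / A.sum(axis=0)

# Restart vector: probability mass on the confirmed disease genes (0 and 1).
p0 = np.array([0.5, 0.5, 0.0, 0.0, 0.0])

restart = 0.3          # probability of jumping back to a seed gene
p = p0.copy()
for _ in range(1000):  # iterate the walk to convergence
    p_next = (1 - restart) * W @ p + restart * p0
    if np.abs(p_next - p).sum() < 1e-10:
        p = p_next
        break
    p = p_next

# Genes ranked by steady-state visiting probability; high scores mark
# candidates in the network neighborhood of the seed genes.
ranking = np.argsort(-p)
print(ranking)
```

Genes that sit close to both seeds in the network accumulate probability mass and rank above peripheral genes, which is the intuition behind using such a walk to delineate a disease-associated cluster.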
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
Understanding Potential Exposure Sources of Perfluorinated Carboxylic Acids in the Workplace
Kaiser, Mary A.; Dawson, Barbara J.; Barton, Catherine A.; Botelho, Miguel A.
2010-01-01
This paper integrates perspectives from analytical chemistry, environmental engineering, and industrial hygiene to better understand how workers may be exposed to perfluorinated carboxylic acids when handling them in the workplace in order to identify appropriate exposure controls. Due to the dramatic difference in physical properties of the protonated acid form and the anionic form, this family of chemicals provides unique industrial hygiene challenges. Workplace monitoring, experimental data, and modeling results were used to ascertain the most probable workplace exposure sources and transport mechanisms for perfluorooctanoic acid (PFOA) and its ammonium salt (APFO). PFOA is biopersistent and its measurement in the blood has been used to assess human exposure since it integrates exposure from all routes of entry. Monitoring suggests that inhalation of airborne material may be an important exposure route. Transport studies indicated that, under low pH conditions, PFOA, the undissociated (acid) species, actively partitions from water into air. In addition, solid-phase PFOA and APFO may also sublime into the air. Modeling studies determined that contributions from surface sublimation and loss from low pH aqueous solutions can be significant potential sources of workplace exposure. These findings suggest that keeping surfaces clean, preventing accumulation of material in unventilated areas, removing solids from waste trenches and sumps, and maintaining neutral pH in sumps can lower workplace exposures. PMID:20974675
Traits and types of health data repositories.
Wade, Ted D
2014-01-01
We review traits of reusable clinical data and offer a typology of clinical repositories with a range of known examples. Sources of clinical data suitable for research can be classified into types reflecting the data's institutional origin, original purpose, level of integration, and governance. Primary data nearly always come from research studies and electronic medical records. Registries collect data on focused populations primarily to track outcomes, often using observational research methods. Warehouses are institutional information utilities repackaging clinical care data. Collections organize data from more organizations than a data warehouse, and from more original data sources than a registry. Therefore, even if they are heavily curated, their level of internal integration, and thus ease of use, can be less than that of other types. Federations are like collections except that physical control over data is distributed among donor organizations. Federations sometimes federate, giving a second level of organization. While the size, in number of patients, varies widely within each type of data source, populations over 10,000 are relatively numerous, and much larger populations can be seen in warehouses and federations. One imagined ideal structure for research progress has been called an "Information Commons". It would have longitudinal, multi-leveled (environmental through molecular) data on a large population of identified, consenting individuals. These are qualities whose achievement would require long-term commitment on the part of many data donors, including a willingness to make their data public.
Citizen Sensors for SHM: Towards a Crowdsourcing Platform
Ozer, Ekin; Feng, Maria Q.; Feng, Dongming
2015-01-01
This paper presents an innovative structural health monitoring (SHM) platform in terms of how it integrates smartphone sensors, the web, and crowdsourcing. The ubiquity of smartphones has provided an opportunity to create low-cost sensor networks for SHM. Crowdsourcing has given rise to citizen initiatives becoming a vast source of inexpensive, valuable but heterogeneous data. Previously, the authors have investigated the reliability of smartphone accelerometers for vibration-based SHM. This paper takes a step further to integrate mobile sensing and web-based computing for a prospective crowdsourcing-based SHM platform. An iOS application was developed to enable citizens to measure structural vibration and upload the data to a server with smartphones. A web-based platform was developed to collect and process the data automatically and store the processed data, such as modal properties of the structure, for long-term SHM purposes. Finally, the integrated mobile and web-based platforms were tested to collect the low-amplitude ambient vibration data of a bridge structure. Possible sources of uncertainties related to citizens were investigated, including the phone location, coupling conditions, and sampling duration. The field test results showed that the vibration data acquired by smartphones operated by citizens without expertise are useful for identifying structural modal properties with high accuracy. This platform can be further developed into an automated, smart, sustainable, cost-free system for long-term monitoring of structural integrity of spatially distributed urban infrastructure. Citizen Sensors for SHM will be a novel participatory sensing platform in the way that it offers hybrid solutions to transitional crowdsourcing parameters. PMID:26102490
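The modal-property identification this abstract relies on can be sketched by picking the dominant peak of an acceleration record's amplitude spectrum (the signal below is simulated; the sampling rate, mode frequency, and noise level are invented stand-ins for a real smartphone trace, and the study's processing is more elaborate):

```python
import numpy as np

# Simulated ambient vibration record: a 2.0 Hz structural mode buried in
# noise, standing in for a smartphone accelerometer trace.
fs = 100.0                      # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)    # 60 s record
mode = 0.05 * np.sin(2 * np.pi * 2.0 * t)
rng = np.random.default_rng(0)
record = mode + 0.02 * rng.standard_normal(t.size)

# Peak of the one-sided amplitude spectrum (after removing the DC offset)
# gives an estimate of the dominant modal frequency.
spectrum = np.abs(np.fft.rfft(record - record.mean()))
freqs = np.fft.rfftfreq(record.size, d=1 / fs)
modal_freq = freqs[spectrum.argmax()]
print(f"estimated modal frequency: {modal_freq:.2f} Hz")
```

With a 60 s record the frequency resolution is 1/60 Hz, which is why even short crowdsourced measurements can resolve closely spaced structural modes.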
Hwang, Jee-In; Cimino, James J; Bakken, Suzanne
2003-01-01
The purposes of the study were (1) to evaluate the usefulness of the International Standards Organization (ISO) Reference Terminology Model for Nursing Diagnoses as a terminology model for defining nursing diagnostic concepts in the Medical Entities Dictionary (MED) and (2) to create the additional hierarchical structures required for integration of nursing diagnostic concepts into the MED. The authors dissected nursing diagnostic terms from two source terminologies (Home Health Care Classification and the Omaha System) into the semantic categories of the ISO model. Consistent with the ISO model, they selected Focus and Judgment as required semantic categories for creating intensional definitions of nursing diagnostic concepts in the MED. Because the MED does not include Focus and Judgment hierarchies, the authors developed them to define the nursing diagnostic concepts. The ISO model was sufficient for dissecting the source terminologies into atomic terms. The authors identified 162 unique focus concepts from the 266 nursing diagnosis terms for inclusion in the Focus hierarchy. For the Judgment hierarchy, the authors precoordinated Judgment and Potentiality instead of using Potentiality as a qualifier of Judgment as in the ISO model. Impairment and Alteration were the most frequently occurring judgments. Nursing care represents a large proportion of health care activities; thus, it is vital that terms used by nurses are integrated into concept-oriented terminologies that provide broad coverage for the domain of health care. This study supports the utility of the ISO Reference Terminology Model for Nursing Diagnoses as a facilitator for the integration process.
Learning mechanisms to limit medication administration errors.
Drach-Zahavy, Anat; Pud, Dorit
2010-04-01
This paper is a report of a study conducted to identify and test the effectiveness of learning mechanisms applied by the nursing staff of hospital wards as a means of limiting medication administration errors. Since the influential report 'To Err Is Human', research has emphasized the role of team learning in reducing medication administration errors. Nevertheless, little is known about the mechanisms underlying team learning. Thirty-two hospital wards were randomly recruited. Data were collected during 2006 in Israel by a multi-method (observations, interviews and administrative data), multi-source (head nurses, bedside nurses) approach. Medication administration error was defined as any deviation from procedures, policies and/or best practices for medication administration, and was identified using semi-structured observations of nurses administering medication. Organizational learning was measured using semi-structured interviews with head nurses, and the previous year's reported medication administration errors were assessed using administrative data. The interview data revealed four learning mechanism patterns employed in an attempt to learn from medication administration errors: integrated, non-integrated, supervisory and patchy learning. Regression analysis results demonstrated that whereas the integrated pattern of learning mechanisms was associated with decreased errors, the non-integrated pattern was associated with increased errors. Supervisory and patchy learning mechanisms were not associated with errors. Superior learning mechanisms are those that represent the whole cycle of team learning, are enacted by nurses who administer medications to patients, and emphasize a system approach to data analysis instead of analysis of individual cases.
Simultaneous EEG and MEG source reconstruction in sparse electromagnetic source imaging.
Ding, Lei; Yuan, Han
2013-04-01
Electroencephalography (EEG) and magnetoencephalography (MEG) have different sensitivities to differently configured brain activations, making them complementary in providing independent information for better detection and inverse reconstruction of brain sources. In the present study, we developed an integrative approach that combines a novel sparse electromagnetic source imaging method, i.e., variation-based cortical current density (VB-SCCD), with the combined use of EEG and MEG data in reconstructing complex brain activity. To perform simultaneous analysis of multimodal data, we proposed to normalize EEG and MEG signals according to their individual noise levels to create unit-free measures. Our Monte Carlo simulations demonstrated that this integrative approach is capable of reconstructing complex cortical brain activations (up to 10 simultaneously activated and randomly located sources). Results from experimental data showed that complex brain activations evoked in a face recognition task were successfully reconstructed using the integrative approach, which were consistent with other research findings and validated by independent data from functional magnetic resonance imaging using the same stimulus protocol. Reconstructed cortical brain activations from both simulations and experimental data provided precise source localizations as well as accurate spatial extents of localized sources. In comparison with studies using EEG or MEG alone, the performance of cortical source reconstructions using combined EEG and MEG was significantly improved. We demonstrated that this new sparse ESI methodology with integrated analysis of EEG and MEG data could accurately probe spatiotemporal processes of complex human brain activations. This is promising for noninvasively studying large-scale brain networks of high clinical and scientific significance. Copyright © 2011 Wiley Periodicals, Inc.
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Fan, Jean; Lee, Hae-Ock; Lee, Soohyun; Ryu, Da-Eun; Lee, Semin; Xue, Catherine; Kim, Seok Jin; Kim, Kihyun; Barkas, Nikolas; Park, Peter J; Park, Woong-Yang; Kharchenko, Peter V
2018-06-13
Characterization of intratumoral heterogeneity is critical to cancer therapy, as the presence of phenotypically diverse cell populations commonly fuels relapse and resistance to treatment. Although genetic variation is a well-studied source of intratumoral heterogeneity, the functional impact of most genetic alterations remains unclear. Even less understood is the relative importance of other factors influencing heterogeneity, such as epigenetic state or tumor microenvironment. To investigate the relationship between genetic and transcriptional heterogeneity in a context of cancer progression, we devised a computational approach called HoneyBADGER to identify copy number variation and loss-of-heterozygosity in individual cells from single-cell RNA-sequencing data. By integrating allele and normalized expression information, HoneyBADGER is able to identify and infer the presence of subclone-specific alterations in individual cells and reconstruct the underlying subclonal architecture. Examining several tumor types, we show that HoneyBADGER is effective at identifying deletions, amplifications, and copy-neutral loss-of-heterozygosity events, and is capable of robustly identifying subclonal focal alterations as small as 10 megabases. We further apply HoneyBADGER to analyze single cells from a progressive multiple myeloma patient to identify major genetic subclones that exhibit distinct transcriptional signatures relevant to cancer progression. Surprisingly, other prominent transcriptional subpopulations within these tumors did not line up with the genetic subclonal structure, and were likely driven by alternative, non-clonal mechanisms. These results highlight the need for integrative analysis to understand the molecular and phenotypic heterogeneity in cancer. Published by Cold Spring Harbor Laboratory Press.
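The core intuition of inferring deletions from normalized expression can be illustrated with a sliding-window average over genes ordered by genomic position (a deliberately simplified stand-in for HoneyBADGER's actual probabilistic model; the data, window size, and threshold below are invented):

```python
import numpy as np

# Hypothetical normalized log-expression ratios (cell vs. reference) for
# 100 genes ordered by genomic position; genes 40-59 sit in a deleted
# region, so their expression shifts downward.
rng = np.random.default_rng(1)
expr = rng.normal(0.0, 0.5, 100)
expr[40:60] -= 1.5   # expression shift caused by the simulated deletion

# Average expression over sliding windows of consecutive genes; a sustained
# negative window suggests a candidate deletion.
window = 10
smoothed = np.convolve(expr, np.ones(window) / window, mode="valid")
candidates = np.where(smoothed < -0.8)[0]
print(candidates)   # candidate window start positions
```

Averaging over consecutive genes suppresses per-gene expression noise, which is why a sustained multi-megabase alteration stands out even when individual genes fluctuate; combining this with allele-frequency evidence, as the paper describes, further separates true copy-number events from expression artifacts.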
Chen Peng; Ao Li
2017-01-01
The emergence of multi-dimensional data offers opportunities for more comprehensive analysis of the molecular characteristics of human diseases and therefore improving diagnosis, treatment, and prevention. In this study, we proposed a heterogeneous network based method by integrating multi-dimensional data (HNMD) to identify GBM-related genes. The novelty of the method lies in that the multi-dimensional data of GBM from TCGA dataset that provide comprehensive information of genes, are combined with protein-protein interactions to construct a weighted heterogeneous network, which reflects both the general and disease-specific relationships between genes. In addition, a propagation algorithm with resistance is introduced to precisely score and rank GBM-related genes. The results of comprehensive performance evaluation show that the proposed method significantly outperforms the network based methods with single-dimensional data and other existing approaches. Subsequent analysis of the top ranked genes suggests they may be functionally implicated in GBM, which further corroborates the superiority of the proposed method. The source code and the results of HNMD can be downloaded from the following URL: http://bioinformatics.ustc.edu.cn/hnmd/ .
NASA Astrophysics Data System (ADS)
Hao, Ming; Rohrdantz, Christian; Janetzko, Halldór; Keim, Daniel; Dayal, Umeshwar; Haug, Lars-Erik; Hsu, Mei-Chun
2012-01-01
Twitter currently receives over 190 million tweets (small text-based Web posts) a day, and manufacturing companies receive over 10 thousand web product surveys a day, in which people share their thoughts regarding a wide range of products and their features. A large number of tweets and customer surveys include opinions about products and services. However, with Twitter being a relatively new phenomenon, these tweets are underutilized as a source for determining customer sentiments. To explore high-volume customer feedback streams, we integrate three time series-based visual analysis techniques: (1) feature-based sentiment analysis that extracts, measures, and maps customer feedback; (2) a novel idea of term associations that identify attributes, verbs, and adjectives frequently occurring together; and (3) new pixel cell-based sentiment calendars, geo-temporal map visualizations and self-organizing maps to identify co-occurring and influential opinions. We have combined these techniques into a well-fitted solution for an effective analysis of large customer feedback streams such as for movie reviews (e.g., Kung-Fu Panda) or web surveys (buyers).
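The term-association idea can be sketched by counting which terms co-occur within individual feedback items (the snippets below are invented examples, and the paper's pipeline, which also distinguishes attributes, verbs, and adjectives, is considerably richer):

```python
from collections import Counter
from itertools import combinations

# Tiny corpus of customer feedback snippets (hypothetical examples).
feedback = [
    "battery life too short",
    "great battery life",
    "screen too dim",
    "battery drains fast",
]

# Count how often term pairs co-occur within the same snippet; frequently
# co-occurring pairs approximate term associations.
pair_counts = Counter()
for text in feedback:
    terms = sorted(set(text.split()))       # dedupe, fix pair ordering
    pair_counts.update(combinations(terms, 2))

top = pair_counts.most_common(3)
print(top)
```

Pairs such as ("battery", "life") surface at the top because they recur across snippets, which is the raw signal a visual-analytics layer can then aggregate over time and geography.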
Lal, Aparna
2016-01-01
Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change. PMID:26848669
An integrated multi-source energy harvester based on vibration and magnetic field energy
NASA Astrophysics Data System (ADS)
Hu, Zhengwen; Qiu, Jing; Wang, Xian; Gao, Yuan; Liu, Xin; Chang, Qijie; Long, Yibing; He, Xingduo
2018-05-01
In this paper, an integrated multi-source energy harvester (IMSEH) employing a specially shaped cantilever beam and a piezoelectric transducer to convert vibration and magnetic field energy into electrical energy is presented. The electric output performance of the proposed IMSEH has been investigated. Compared to a traditional multi-source energy harvester (MSEH) or single-source energy harvester (SSEH), the proposed IMSEH can simultaneously harvest vibration and magnetic field energy with an integrated structure, and its electric output is greatly improved. With all other conditions kept identical, the IMSEH can reach a high output voltage of 12.8 V. Remarkably, the proposed IMSEH has great potential for application in wireless sensor networks.
SKYDOSE: A code for gamma skyshine calculations using the integral line-beam method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shultis, J.K.; Faw, R.E.; Brockhoff, R.C.
1994-07-01
SKYDOSE evaluates the skyshine dose from an isotropic, monoenergetic, point photon source collimated by three simple geometries: (1) a source in a silo; (2) a source behind an infinitely long, vertical, black wall; and (3) a source in a rectangular building. In all three geometries, an optional overhead shield may be specified. The source energy must be between 0.02 and 100 MeV (10 MeV for sources with an overhead shield). This is a user's manual. Other references give more detail on the integral line-beam method used by SKYDOSE.
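The integral line-beam method itself is documented in the references the manual cites; purely as a schematic, a silo-collimated source can be treated as a quadrature of a line-beam dose response over the emission directions that escape the opening. The `response` function below is an arbitrary stand-in, not SKYDOSE's fitted response function.

```python
import math

def silo_skyshine(half_angle_deg, response, steps=200):
    """Trapezoidal quadrature of a line-beam response over the cone of
    emission angles phi (measured from vertical) escaping a silo opening."""
    a = math.radians(half_angle_deg)
    h = a / steps
    total = 0.0
    for i in range(steps + 1):
        phi = i * h
        weight = 0.5 if i in (0, steps) else 1.0
        # sin(phi) weights each ring of emission directions by its solid angle.
        total += weight * response(phi) * math.sin(phi)
    return total * h
```

Since every escaping beam contributes positively, widening the silo opening monotonically increases the integrated dose, which matches the intuition that the collimator geometry is the dominant input to a skyshine calculation.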
EPA Facility Registry System (FRS): NCES
This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Center for Education Statistics (NCES). The primary federal database for collecting and analyzing data related to education in the United States and other nations, NCES is located in the U.S. Department of Education, within the Institute of Education Sciences. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NCES school facilities once the NCES data has been integrated into the FRS database. Additional information on FRS is available at the EPA website http://www.epa.gov/enviro/html/fii/index.html.
Xue, Xiaobo; Schoen, Mary E; Ma, Xin Cissy; Hawkins, Troy R; Ashbolt, Nicholas J; Cashdollar, Jennifer; Garland, Jay
2015-06-15
Planning for sustainable community water systems requires a comprehensive understanding and assessment of the integrated source-drinking-wastewater systems over their life-cycles. Although traditional life cycle assessment and similar tools (e.g. footprints and emergy) have been applied to elements of these water services (i.e. water resources, drinking water, stormwater or wastewater treatment alone), we argue for the importance of developing and combining system-based tools and metrics in order to holistically evaluate the complete water service system based on the concept of integrated resource management. We analyzed the strengths and weaknesses of key system-based tools and metrics, and discuss future directions to identify more sustainable municipal water services. Such efforts may include the need for novel metrics that address system adaptability to future changes and infrastructure robustness. Caution is also necessary when coupling fundamentally different tools so as to avoid misunderstanding and consequently misleading decision-making. Published by Elsevier Ltd.
Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration
NASA Astrophysics Data System (ADS)
Schumacher, J. A.; Yetman, G. G.
2007-12-01
The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10-year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.
Integrating single-cell transcriptomic data across different conditions, technologies, and species.
Butler, Andrew; Hoffman, Paul; Smibert, Peter; Papalexi, Efthymia; Satija, Rahul
2018-06-01
Computational single-cell RNA-seq (scRNA-seq) methods have been successfully applied to experiments representing a single condition, technology, or species to discover and define cellular phenotypes. However, identifying subpopulations of cells that are present across multiple data sets remains challenging. Here, we introduce an analytical strategy for integrating scRNA-seq data sets based on common sources of variation, enabling the identification of shared populations across data sets and downstream comparative analysis. We apply this approach, implemented in our R toolkit Seurat (http://satijalab.org/seurat/), to align scRNA-seq data sets of peripheral blood mononuclear cells under resting and stimulated conditions, hematopoietic progenitors sequenced using two profiling technologies, and pancreatic cell 'atlases' generated from human and mouse islets. In each case, we learn distinct or transitional cell states jointly across data sets, while boosting statistical power through integrated analysis. Our approach facilitates general comparisons of scRNA-seq data sets, potentially deepening our understanding of how distinct cell states respond to perturbation, disease, and evolution.
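Seurat's actual alignment uses canonical correlation analysis; as a much-reduced illustration of why matching on shared variation helps, Pearson correlation between expression profiles is unchanged by an additive batch offset, so pairing cells across data sets by correlation ignores that source of technical variation. The cells and profiles below are synthetic, and this sketch is not the Seurat algorithm.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def align_cells(reference, query):
    """For each query cell, the index of the most correlated reference cell."""
    return [max(range(len(reference)), key=lambda i: pearson(reference[i], q))
            for q in query]
```

Because correlation is invariant to shifting one profile by a constant, a uniform batch effect on the query data set does not disturb the matching; the full method generalizes this idea to richer, learned shared structure.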
EPA Facility Registry Service (FRS): CAMDBS
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Clean Air Markets Division Business System (CAMDBS). Administered by the EPA Clean Air Markets Division, within the Office of Air and Radiation, CAMDBS supports the implementation of market-based air pollution control programs, including the Acid Rain Program and regional programs designed to reduce the transport of ozone. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to CAMDBS facilities once the CAMDBS data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
Cheng, Linzhao; Hansen, Nancy F.; Zhao, Ling; Du, Yutao; Zou, Chunlin; Donovan, Frank X.; Chou, Bin-Kuan; Zhou, Guangyu; Li, Shijie; Dowey, Sarah N.; Ye, Zhaohui; Chandrasekharappa, Settara C.; Yang, Huanming; Mullikin, James C.; Liu, P. Paul
2012-01-01
The utility of induced pluripotent stem cells (iPSCs) as models to study diseases and as sources for cell therapy depends on the integrity of their genomes. Despite recent publications of DNA sequence variations in the iPSCs, the true scope of such changes for the entire genome is not clear. Here we report the whole-genome sequencing of three human iPSC lines derived from two cell types of an adult donor by episomal vectors. The vector sequence was undetectable in the deeply sequenced iPSC lines. We identified 1058–1808 heterozygous single nucleotide variants (SNVs), but no copy number variants, in each iPSC line. Six to twelve of these SNVs were within coding regions in each iPSC line, but ~50% of them are synonymous changes and the remainder are not selectively enriched for known genes associated with cancers. Our data thus suggest that episome-mediated reprogramming is not inherently mutagenic during integration-free iPSC induction. PMID:22385660
EPA Facility Registry Service (FRS): OIL
This dataset contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Oil database. The Oil database contains information on Spill Prevention, Control, and Countermeasure (SPCC) and Facility Response Plan (FRP) subject facilities to prevent and respond to oil spills. FRP facilities are referred to as substantial harm facilities due to the quantities of oil stored and facility characteristics. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to Oil facilities once the Oil data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
Saine, M Elle; Carbonari, Dena M; Newcomb, Craig W; Nezamzadeh, Melissa S; Haynes, Kevin; Roy, Jason A; Cardillo, Serena; Hennessy, Sean; Holick, Crystal N; Esposito, Daina B; Gallagher, Arlene M; Bhullar, Harshvinder; Strom, Brian L; Lo Re, Vincent
2015-04-02
The patterns and determinants of saxagliptin use among patients with type 2 diabetes mellitus (T2DM) are unknown in real-world settings. We compared the characteristics of T2DM patients who were new initiators of saxagliptin to those who were new initiators of non-dipeptidyl peptidase-4 (DPP-4) inhibitor oral anti-diabetic drugs (OADs) and identified factors associated with saxagliptin use. We conducted a cross-sectional study within the Clinical Practice Research Datalink (CPRD), The Health Improvement Network (THIN), US Medicare, and the HealthCore Integrated Research Database (HIRD(SM)) across the first 36 months of saxagliptin availability (29 months for US Medicare). Patients were included if they were: 1) ≥18 years old, 2) newly prescribed saxagliptin or a non-DPP-4 inhibitor OAD, and 3) enrolled in their respective database for 180 days. For each saxagliptin initiator, we randomly selected up to ten non-DPP-4 inhibitor OAD initiators matched on age, sex, and geographic region. Conditional logistic regression was used to identify determinants of saxagliptin use. We identified 64,079 saxagliptin initiators (CPRD: 1,962; THIN: 2,084; US Medicare: 51,976; HIRD(SM): 8,057) and 610,660 non-DPP-4 inhibitor OAD initiators (CPRD: 19,484; THIN: 19,936; US Medicare: 493,432; HIRD(SM): 77,808). Across all four data sources, prior OAD use, hypertension, and hyperlipidemia were associated with saxagliptin use. Saxagliptin initiation was also associated with hemoglobin A1c results >8% within the UK data sources, and a greater number of hemoglobin A1c measurements in the US data sources. In these UK and US data sources, initiation of saxagliptin was associated with prior poor glycemic control, prior OAD use, and diagnoses of hypertension and hyperlipidemia. ClinicalTrials.gov identifiers NCT01086280, NCT01086293, NCT01086319, NCT01086306, and NCT01377935.
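The matched sampling step (up to ten non-DPP-4 inhibitor OAD initiators per saxagliptin initiator, matched on age, sex, and geographic region) can be sketched as below. The field names and the fixed seed are illustrative, not taken from the study protocol.

```python
import random

def select_matched_controls(case, pool, k=10,
                            keys=("age_band", "sex", "region"), seed=0):
    """Randomly choose up to k controls sharing the matching factors with the case."""
    rng = random.Random(seed)
    eligible = [p for p in pool if all(p[key] == case[key] for key in keys)]
    rng.shuffle(eligible)
    return eligible[:k]
```

Conditional logistic regression then conditions on each matched set, so the matched factors drop out and the remaining covariates (prior OAD use, comorbidities, A1c history) can be assessed as determinants of initiation.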
Aarabi, A; Grebe, R; Berquin, P; Bourel Ponchel, E; Jalin, C; Fohlen, M; Bulteau, C; Delalande, O; Gondry, C; Héberlé, C; Moullart, V; Wallois, F
2012-06-01
This case study aims to demonstrate that spatiotemporal spike discrimination and source analysis are effective for monitoring the development of sources of epileptic activity in time and space. Therefore, they can provide clinically useful information allowing a better understanding of the pathophysiology of individual seizures with time- and space-resolved characteristics of successive epileptic states, including interictal, preictal, postictal, and ictal states. High spatial resolution scalp EEGs (HR-EEG) were acquired from a 2-year-old girl with refractory central epilepsy and single-focus seizures as confirmed by intracerebral EEG recordings and ictal single-photon emission computed tomography (SPECT). Evaluation of HR-EEG consists of the following three global steps: (1) creation of the initial head model, (2) automatic spike and seizure detection, and finally (3) source localization. During the source localization phase, epileptic states are determined to allow state-based spike detection and localization of underlying sources for each spike. In a final cluster analysis, localization results are integrated to determine the possible sources of epileptic activity. The results were compared with the cerebral locations identified by intracerebral EEG recordings and SPECT. The results obtained with this approach were concordant with those of MRI, SPECT and the distribution of intracerebral potentials. Dipole cluster centres found for spikes in interictal, preictal, ictal and postictal states were situated an average of 6.3 mm from the intracerebral contacts with the highest voltage. Both amplitude and shape of spikes change between states. Dispersion of the dipoles was higher in the preictal state than in the postictal state. Two clusters of spikes were identified. The centres of these clusters changed position periodically during the various epileptic states.
High-resolution surface EEG evaluated by an advanced algorithmic approach can be used to investigate the spatiotemporal characteristics of sources located in the epileptic focus. The results were validated by standard methods, ensuring good spatial resolution by MRI and SPECT and optimal temporal resolution by intracerebral EEG. Surface EEG can be used to identify different spike clusters and sources of the successive epileptic states. The method that was used in this study will provide physicians with a better understanding of the pathophysiological characteristics of epileptic activities. In particular, this method may be useful for more effective positioning of implantable intracerebral electrodes. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
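The reported 6.3 mm figure is an average distance from dipole cluster centres to the highest-voltage intracerebral contacts; the centroid-and-distance computation behind such a summary can be sketched as follows (the coordinates are hypothetical millimetre positions, not the study's data).

```python
def centroid(points):
    """Mean position of a cluster of dipole locations (x, y, z in mm)."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def distance(p, q):
    """Euclidean distance between two 3-D positions."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
```

Computing one centroid per epileptic state and its distance to the reference contact is also how dispersion differences between states (e.g., preictal versus postictal) can be quantified, by looking at the spread of dipoles around each centroid.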
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David
A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases.
This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM.
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
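The local-as-view idea — describe each source in terms of the global model, then answer queries against the global model alone — can be caricatured in a few lines. The field names, source shapes, and `mediate` helper are invented for illustration and bear no relation to CDIM's actual classes or TRANSFoRm's mediator.

```python
def mediate(query_field, value, sources):
    """Answer a global-model query by translating each source's records
    through its mapping into the global model, then filtering."""
    results = []
    for fetch, to_global in sources:
        for record in fetch():
            g = to_global(record)
            if g.get(query_field) == value:
                results.append(g)
    return results

# Two sources with different local schemas, each mapped to the global model.
ehr_records = [{"dob": "1970-01-01", "dx": "T2DM"}]
trial_records = [{"birth": "1970-01-01", "condition": "T2DM"}]

sources = [
    (lambda: ehr_records,
     lambda r: {"birth_date": r["dob"], "diagnosis": r["dx"]}),
    (lambda: trial_records,
     lambda r: {"birth_date": r["birth"], "diagnosis": r["condition"]}),
]
```

The payoff is that a query author only ever sees global-model terms ("diagnosis", "birth_date"); adding a new source means writing one new mapping, not rewriting every query.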
Evidence Integration in Natural Acoustic Textures during Active and Passive Listening
Górska, Urszula; Rupp, Andre; Boubenec, Yves; Celikel, Tansu; Englitz, Bernhard
2018-01-01
Many natural sounds can be well described on a statistical level, for example, wind, rain, or applause. Even though the spectro-temporal profile of these acoustic textures is highly dynamic, changes in their statistics are indicative of relevant changes in the environment. Here, we investigated the neural representation of change detection in natural textures in humans, and specifically addressed whether active task engagement is required for the neural representation of this change in statistics. Subjects listened to natural textures whose spectro-temporal statistics were modified at variable times by a variable amount. Subjects were instructed to either report the detection of changes (active) or to passively listen to the stimuli. A subset of passive subjects had performed the active task before (passive-aware vs passive-naive). Psychophysically, longer exposure to pre-change statistics was correlated with faster reaction times and better discrimination performance. EEG recordings revealed that the build-up rate and size of parieto-occipital (PO) potentials reflected change size and change time. Reduced effects were observed in the passive conditions. While P2 responses were comparable across conditions, slope and height of PO potentials scaled with task involvement. Neural source localization identified a parietal source as the main contributor of change-specific potentials, in addition to more limited contributions from auditory and frontal sources. In summary, the detection of statistical changes in natural acoustic textures is predominantly reflected in parietal locations both on the skull and source level. The scaling in magnitude across different levels of task involvement suggests a context-dependent degree of evidence integration. PMID:29662943
Sources and turnover of organic carbon and methane in fjord and shelf sediments off northern Norway
NASA Astrophysics Data System (ADS)
Sauer, Simone; Hong, Wei-Li; Knies, Jochen; Lepland, Aivo; Forwick, Matthias; Klug, Martin; Eichinger, Florian; Baranwal, Soma; Crémière, Antoine; Chand, Shyam; Schubert, Carsten J.
2016-10-01
To better understand the present and past carbon cycling and transformation processes in methane-influenced fjord and shelf areas of northern Norway, we compared two sediment cores from the Hola trough and from Ullsfjorden. We investigated (1) the organic matter composition and sedimentological characteristics to study the sources of organic carbon (Corg) and the factors influencing Corg burial, (2) pore water geochemistry to determine the contribution of organoclastic sulfate reduction and methanogenesis to total organic carbon turnover, and (3) the carbon isotopic signature of hydrocarbons to identify the carbon transformation processes and gas sources. High sedimentation and Corg accumulation rates in Ullsfjorden support the notion that fjords are important Corg sinks. The depth of the sulfate-methane-transition (SMT) in the fjord is controlled by the supply of predominantly marine organic matter to the sediment. Organoclastic sulfate reduction accounts for 60% of the total depth-integrated sulfate reduction in the fjord. In spite of the presence of ethane, propane, and butane, we suggest a purely microbial origin of light hydrocarbons in the sediments based on their low δ13C values. In the Hola trough, sedimentation and Corg accumulation rates changed during the deglacial-to-post-glacial transition from approximately 80 cm ka-1 to erosion at present. Thus, Corg burial in this part of the shelf is presently absent. Low organic matter content in the sediment and low rates of organoclastic sulfate reduction (only 3% of total depth-integrated sulfate reduction) entail that the shallow depth of the SMT is controlled mostly by ascending thermogenic methane from deeper sources.
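Figures such as "organoclastic sulfate reduction accounts for 60% of the total depth-integrated sulfate reduction" rest on integrating measured rate profiles over sediment depth. A minimal trapezoidal version, with made-up depths and rates rather than the core data, looks like:

```python
def depth_integrated(depths, rates):
    """Trapezoidal integration of a rate profile over depth."""
    total = 0.0
    for i in range(len(depths) - 1):
        total += 0.5 * (rates[i] + rates[i + 1]) * (depths[i + 1] - depths[i])
    return total
```

The organoclastic fraction is then simply the ratio of the organoclastic profile's integral to the integral of the total sulfate reduction profile over the same depth interval.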
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; Graham, William D.
2007-01-01
In the aftermath of Hurricane Katrina and in response to the needs of SSC (Stennis Space Center), NASA required the generation of decision support products with a broad range of geospatial inputs. Applying a systems engineering approach, the NASA ARTPO (Applied Research and Technology Project Office) at SSC evaluated the Center's requirements and source data quality. ARTPO identified data and information products that had the potential to meet decision-making requirements; included were remotely sensed data ranging from high-spatial-resolution aerial images through high-temporal-resolution MODIS (Moderate Resolution Imaging Spectroradiometer) products. Geospatial products, such as FEMA's (Federal Emergency Management Agency's) Advisory Base Flood Elevations, were also relevant. Where possible, ARTPO applied SSC calibration/validation expertise to both clarify the quality of various data source options and to validate that the inputs that were finally chosen met SSC requirements. ARTPO integrated various information sources into multiple decision support products, including two maps: Hurricane Katrina Inundation Effects at Stennis Space Center (highlighting surge risk posture) and Vegetation Change In and Around Stennis Space Center: Katrina and Beyond (highlighting fire risk posture).
NASA Technical Reports Server (NTRS)
Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen;
2012-01-01
For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University, has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.
Evaluation of noise in the neonatal intensive care unit.
Benini, F; Magnavita, V; Lago, P; Arslan, E; Pisan, P
1996-01-01
This study evaluated the noise level inside the incubators in a neonatal intensive care unit and identified its sources in order to attempt to reduce it. Although noise is not a proven risk factor as far as the sensory integrity of newborns is concerned, it is certainly an important cause of stress to them and a source of serious and dangerous changes in their behavioral and physiologic states. Noise recorded inside the incubators had two components. The first was background noise from the incubator motors, which varied from 74.2 to 79.9 dB, and was similar to environmental noise. The second source was impulsive events beyond 80 dB. These events were the result of voluntary and involuntary contact with the incubators' Plexiglas surface or to the abrupt opening and closing of their access ports. Considering its decibel levels and frequency, this latter component is undoubtedly an important source of stress to newborns. Moreover, these data reveal the need to train health care personnel on how to reduce such noise by taking more care in the handling of infants.
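Sound levels in decibels combine on an intensity scale, not arithmetically, which is why the ~74-80 dB motor hum plus impulsive events beyond 80 dB is dominated by the impulses rather than their sum. A small helper (not from the study) makes the arithmetic concrete:

```python
import math

def combine_db(levels):
    """Overall level of simultaneous sources given in dB (intensity sum)."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels))
```

Two equal 80 dB sources combine to about 83 dB, not 160 dB; conversely, eliminating an impulsive 85 dB event against a 75 dB background lowers the combined level by nearly 10 dB, which is why handling practices around the incubator matter so much.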
ROSAT observations of clusters with wide-angle tailed radio sources
NASA Technical Reports Server (NTRS)
Burns, Jack O.
1993-01-01
The goal of these ROSAT PSPC pointed observations was to understand the nature of X-ray emission associated with clusters that contain luminous wide-angle tailed (WAT) radio sources identified with the centrally dominant cluster galaxies. These 500 kpc diameter radio sources are strongly affected by confinement and interaction with the intracluster medium. So, a complete picture of the origin and evolution of these radio sources is not possible without detailed X-ray observations which sample the distribution and temperature of the surrounding hot gas. Two WAT clusters have been observed with the ROSAT PSPC to date. The first is Abell 2634, which contains the WAT 3C 465 and was approved for observations in AO-1. Unfortunately, these observations were broken into two widely separated pieces in time. The first data set, containing about 9000 sec of integration, arrived in mid-March 1992. The second data set, containing about 10,500 sec, arrived just recently in early April (after a first tape was destroyed in the mail). The second cluster is 1919+479, which was approved for observations in AO-2. These ROSAT data arrived in October 1992.
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1991-01-01
The primary objective is to develop a methodology for predicting operational and support parameters and costs of proposed space systems. The first phase consists of: (1) the identification of data sources; (2) the development of a methodology for determining system reliability and maintainability parameters; (3) the implementation of the methodology through the use of prototypes; and (4) support in the development of an integrated computer model. The Phase 1 results are documented, and a direction is identified for proceeding to accomplish the overall objective.
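The abstract does not specify how reliability parameters are computed, but a minimal sketch of the standard approach such a methodology typically builds on — constant-failure-rate components in series, with MTBF derived from the aggregate rate — might look as follows (the rates below are hypothetical, not from the report):

```python
import math

def series_failure_rate(component_rates):
    """For a series system with constant (exponential) failure rates, rates add."""
    return sum(component_rates)

def reliability(rate, t):
    """Exponential reliability model: R(t) = exp(-lambda * t)."""
    return math.exp(-rate * t)

def mtbf(rate):
    """Mean time between failures for a constant failure rate."""
    return 1.0 / rate

# Hypothetical subsystem failure rates, in failures per hour.
lam = series_failure_rate([1e-4, 2e-4, 5e-5])
r_1000h = reliability(lam, 1000.0)   # probability of surviving 1000 hours
system_mtbf = mtbf(lam)
```

This is only the simplest building block; a full operational-and-support cost model would layer maintainability (repair times, spares, manpower) on top of such reliability estimates.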
Optical identification of IGR J13091+1137 as a heavily obscured AGN
NASA Astrophysics Data System (ADS)
Masetti, N.; Palazzi, E.; Malizia, A.; Bird, A. J.; Norci, L.; Bruni, I.; Bazzano, A.; de Rosa, A.
2006-02-01
In order to identify the nature of the active nucleus of spiral galaxy NGC 4992, associated with the Chandra and INTEGRAL high-energy sources CXOU J130905.6+113803 and IGR J13091+1137 (see Halpern, Atel #572 and Sazonov et al., 2005, A&A, 444, L37), we acquired a 15-minute optical spectrum on February 1, 2006 with the instrument BFOSC mounted on the `G.D. Cassini' 1.5m telescope of the Astronomical Observatory of Bologna located in Loiano (Italy).
The Language Demands of Science Reading in Middle School
NASA Astrophysics Data System (ADS)
Fang, Zhihui
2006-04-01
The language used to construct knowledge, beliefs, and worldviews in school science is distinct from the social language that students use in their everyday life. This difference is a major source of reading difficulty for many students, especially struggling readers and English-language learners. This article identifies some of the linguistic challenges involved in reading middle-school science texts and suggests several teaching strategies to help students cope with these challenges. It is argued that explicit attention to the unique language of school science should be an integral part of science literacy pedagogy.
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. Undertaking this approach involves three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS, such as geographical scales and data models, are addressed. In particular, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
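The abstract names multicriteria evaluation (MCE) as the decision-making core; the most common conventional MCE method is a weighted linear combination over normalized criterion layers. As a minimal illustration, assuming normalized criteria and weights that sum to 1 (the bushfire criteria and weights below are hypothetical, not from the paper):

```python
def weighted_risk(criteria, weights):
    """Weighted linear combination: score = sum(w_i * x_i).

    Each criterion value is assumed normalized to [0, 1], and the
    weights are assumed to sum to 1, so the score is also in [0, 1].
    """
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * x for w, x in zip(weights, criteria))

# Hypothetical grid cell: fuel load, slope, proximity to dwellings (normalized).
score = weighted_risk([0.8, 0.5, 0.9], [0.5, 0.2, 0.3])
```

In a GIS this scoring is applied cell by cell across the integrated raster layers, producing a continuous risk surface that supports the spatial decision-making the paper describes.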
Kissling, W Daniel; Ahumada, Jorge A; Bowser, Anne; Fernandez, Miguel; Fernández, Néstor; García, Enrique Alonso; Guralnick, Robert P; Isaac, Nick J B; Kelling, Steve; Los, Wouter; McRae, Louise; Mihoub, Jean-Baptiste; Obst, Matthias; Santamaria, Monica; Skidmore, Andrew K; Williams, Kristen J; Agosti, Donat; Amariles, Daniel; Arvanitidis, Christos; Bastin, Lucy; De Leo, Francesca; Egloff, Willi; Elith, Jane; Hobern, Donald; Martin, David; Pereira, Henrique M; Pesole, Graziano; Peterseil, Johannes; Saarenmaa, Hannu; Schigel, Dmitry; Schmeller, Dirk S; Segata, Nicola; Turak, Eren; Uhlir, Paul F; Wee, Brian; Hardisty, Alex R
2018-02-01
Much biodiversity data is collected worldwide, but it remains challenging to assemble the scattered knowledge for assessing biodiversity status and trends. The concept of Essential Biodiversity Variables (EBVs) was introduced to structure biodiversity monitoring globally, and to harmonize and standardize biodiversity data from disparate sources to capture a minimum set of critical variables required to study, report and manage biodiversity change. Here, we assess the challenges of a 'Big Data' approach to building global EBV data products across taxa and spatiotemporal scales, focusing on species distribution and abundance. The majority of currently available data on species distributions derives from incidentally reported observations or from surveys where presence-only or presence-absence data are sampled repeatedly with standardized protocols. Most abundance data come from opportunistic population counts or from population time series using standardized protocols (e.g. repeated surveys of the same population from single or multiple sites). Enormous complexity exists in integrating these heterogeneous, multi-source data sets across space, time, taxa and different sampling methods. Integration of such data into global EBV data products requires correcting biases introduced by imperfect detection and varying sampling effort, dealing with different spatial resolution and extents, harmonizing measurement units from different data sources or sampling methods, applying statistical tools and models for spatial inter- or extrapolation, and quantifying sources of uncertainty and errors in data and models. To support the development of EBVs by the Group on Earth Observations Biodiversity Observation Network (GEO BON), we identify 11 key workflow steps that will operationalize the process of building EBV data products within and across research infrastructures worldwide. 
These workflow steps take multiple sequential activities into account, including identification and aggregation of various raw data sources, data quality control, taxonomic name matching and statistical modelling of integrated data. We illustrate these steps with concrete examples from existing citizen science and professional monitoring projects, including eBird, the Tropical Ecology Assessment and Monitoring network, the Living Planet Index and the Baltic Sea zooplankton monitoring. The identified workflow steps are applicable to both terrestrial and aquatic systems and a broad range of spatial, temporal and taxonomic scales. They depend on clear, findable and accessible metadata, and we provide an overview of current data and metadata standards. Several challenges remain to be solved for building global EBV data products: (i) developing tools and models for combining heterogeneous, multi-source data sets and filling data gaps in geographic, temporal and taxonomic coverage, (ii) integrating emerging methods and technologies for data collection such as citizen science, sensor networks, DNA-based techniques and satellite remote sensing, (iii) solving major technical issues related to data product structure, data storage, execution of workflows and the production process/cycle as well as approaching technical interoperability among research infrastructures, (iv) allowing semantic interoperability by developing and adopting standards and tools for capturing consistent data and metadata, and (v) ensuring legal interoperability by endorsing open data or data that are free from restrictions on use, modification and sharing. Addressing these challenges is critical for biodiversity research and for assessing progress towards conservation policy targets and sustainable development goals. © 2017 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
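Among the workflow steps listed above, taxonomic name matching followed by aggregation across sources is the most mechanical. A toy sketch of that step, assuming a simple synonym lookup table (the species records and synonym mapping below are hypothetical, not taken from the projects named in the abstract):

```python
def harmonize(records, synonym_map):
    """Aggregate multi-source species counts after taxonomic name matching.

    Each record is a (reported_name, count) pair; names are folded to a
    canonical form via the synonym map before counts are summed.
    """
    totals = {}
    for name, count in records:
        key = name.strip().lower()
        canonical = synonym_map.get(key, key)
        totals[canonical] = totals.get(canonical, 0) + count
    return totals

# Hypothetical observations from two sources using variant name strings.
synonyms = {"parus major l.": "parus major"}
observations = [("Parus major", 3), ("Parus major L.", 2), ("Turdus merula", 1)]
counts = harmonize(observations, synonyms)
```

Real EBV pipelines resolve names against curated taxonomic backbones rather than a hand-built dictionary, but the shape of the step — normalize, match, aggregate — is the same, and it precedes the bias correction and statistical modelling the abstract describes.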