Sample records for source integrated relational

  1. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    PubMed

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  2. Learning to Integrate Divergent Information Sources: The Interplay of Epistemic Cognition and Epistemic Metacognition

    ERIC Educational Resources Information Center

    Barzilai, Sarit; Ka'adan, Ibtisam

    2017-01-01

    Learning to integrate multiple information sources is vital for advancing learners' digital literacy. Previous studies have found that learners' epistemic metacognitive knowledge about the nature of knowledge and knowing is related to their strategic integration performance. The purpose of this study was to understand how these relations come into…

  3. Systems and methods for an integrated electrical sub-system powered by wind energy

    DOEpatents

    Liu, Yan [Ballston Lake, NY]; Garces, Luis Jose [Niskayuna, NY]

    2008-06-24

    Various embodiments relate to systems and methods related to an integrated electrically-powered sub-system and wind power system including a wind power source, an electrically-powered sub-system coupled to and at least partially powered by the wind power source, the electrically-powered sub-system being coupled to the wind power source through power converters, and a supervisory controller coupled to the wind power source and the electrically-powered sub-system to monitor and manage the integrated electrically-powered sub-system and wind power system.

  4. A Formal Integrity Framework with Application to a Secure Information ATM (SIATM)

    DTIC Science & Technology

    2012-10-01

    Work on integrity and resultant implementations seems to have focused more on matters related to source authentication and transmission assurance, for which there is a... However, the quality of data aspect is becoming more...

  5. SDSS IV MaNGA: Dependence of Global and Spatially Resolved SFR–M* Relations on Galaxy Properties

    NASA Astrophysics Data System (ADS)

    Pan, Hsi-An; Lin, Lihwai; Hsieh, Bau-Ching; Sánchez, Sebastián F.; Ibarra-Medel, Héctor; Boquien, Médéric; Lacerna, Ivan; Argudo-Fernández, Maria; Bizyaev, Dmitry; Cano-Díaz, Mariana; Drory, Niv; Gao, Yang; Masters, Karen; Pan, Kaike; Tabor, Martha; Tissera, Patricia; Xiao, Ting

    2018-02-01

    The galaxy integrated Hα star formation rate–stellar mass relation, or SFR(global)–M*(global) relation, is crucial for understanding the star formation history and evolution of galaxies. However, many studies have dealt with SFR using unresolved measurements, which makes it difficult to separate out the contamination from other ionizing sources, such as active galactic nuclei and evolved stars. Using the integral field spectroscopic observations from SDSS-IV MaNGA, we spatially disentangle the contribution from different Hα powering sources for ∼1000 galaxies. We find that, when including regions dominated by all ionizing sources in galaxies, the spatially resolved relation between Hα surface density (ΣHα(all)) and stellar mass surface density (Σ*(all)) progressively turns over at the high Σ*(all) end for increasing M*(global) and/or bulge dominance (bulge-to-total light ratio, B/T). This in turn leads to the flattening of the integrated Hα(global)–M*(global) relation in the literature. By contrast, there is no noticeable flattening in either the integrated Hα(H II)–M*(H II) relation or the spatially resolved ΣHα(H II)–Σ*(H II) relation when only regions where star formation dominates the ionization are considered. In other words, the flattening can be attributed to the growing fraction of regions powered by non-star-formation sources, which generally have lower ionizing ability than star formation. An analysis of the fractional contribution of non-star-formation sources to the total Hα luminosity of a galaxy suggests a decreasing role of star formation as an ionizing source toward high-mass, high-B/T galaxies and bulge regions. This result indicates that the appearance of the galaxy integrated SFR–M* relation critically depends on the galaxies' global properties (M*(global) and B/T) and on the relative abundances of various ionizing sources within the galaxies.

  6. Consumer-led health-related online sources and their impact on consumers: An integrative review of the literature.

    PubMed

    Laukka, Elina; Rantakokko, Piia; Suhonen, Marjo

    2017-04-01

    The aim of the review was to describe consumer-led health-related online sources and their impact on consumers. The review was carried out as an integrative literature review. Quantisation and qualitative content analysis were used as the analysis method. The most common method used by the included studies was qualitative content analysis. This review identified the consumer-led health-related online sources used between 2009 and 2016 as health-related online communities, health-related social networking sites and health-related rating websites. These sources had an impact on peer support; empowerment; health literacy; physical, mental and emotional wellbeing; illness management; and relationships between healthcare organisations and consumers. The knowledge of the existence of the health-related online sources provides healthcare organisations with an opportunity to listen to their consumers' 'voice'. The sources make healthcare consumers more competent actors in relation to healthcare, and the knowledge of them is a valuable resource for healthcare organisations. Additionally, these health-related online sources might create an opportunity to reduce the need for drifting among the healthcare services. Healthcare policymakers and organisations could benefit from having a strategy of increasing their health-related online sources.

  7. Integration of Schemas on the Pre-Design Level Using the KCPM-Approach

    NASA Astrophysics Data System (ADS)

    Vöhringer, Jürgen; Mayr, Heinrich C.

    Integration is a central research and operational issue in information system design and development. It can be conducted on the system, schema, view, or data level. On the system level, integration deals with the progressive linking and testing of system components to merge their functional and technical characteristics and behavior into a comprehensive, interoperable system. Schema integration comprises the comparison and merging of two or more schemas, usually conceptual database schemas. The integration of data deals with merging the contents of multiple sources of related data. View integration is similar to schema integration but focuses on views, and on the queries defined over them, rather than on schemas. All these types of integration have in common that two or more sources are first compared, in order to identify matches and mismatches as well as conflicts and inconsistencies, and then merged. The sources may stem from heterogeneous companies, organizational units or projects. Integration enables the reuse and combined use of source components.

  8. Chinese Localisation of Evergreen: An Open Source Integrated Library System

    ERIC Educational Resources Information Center

    Zou, Qing; Liu, Guoying

    2009-01-01

    Purpose: The purpose of this paper is to investigate various issues related to Chinese language localisation in Evergreen, an open source integrated library system (ILS). Design/methodology/approach: A Simplified Chinese version of Evergreen was implemented and tested and various issues such as encoding, indexing, searching, and sorting…

  9. Large-scale adverse effects related to treatment evidence standardization (LAERTES): an open scalable system for linking pharmacovigilance evidence sources with clinical data.

    PubMed

    2017-03-07

    Integrating multiple sources of pharmacovigilance evidence has the potential to advance the science of safety signal detection and evaluation. In this regard, there is a need for more research on how to integrate multiple disparate evidence sources while making the evidence computable from a knowledge representation perspective (i.e., semantic enrichment). Existing frameworks suggest promising outcomes for such integration but employ a rather limited number of sources. In particular, none have been specifically designed to support both regulatory and clinical use cases, nor have any been designed to add new resources and use cases through an open architecture. This paper discusses the architecture and functionality of a system called Large-scale Adverse Effects Related to Treatment Evidence Standardization (LAERTES) that aims to address these shortcomings. LAERTES provides a standardized, open, and scalable architecture for linking evidence sources relevant to the association of drugs with health outcomes of interest (HOIs). Standard terminologies are used to represent different entities. For example, drugs and HOIs are represented in RxNorm and Systematized Nomenclature of Medicine - Clinical Terms, respectively. At the time of this writing, six evidence sources have been loaded into the LAERTES evidence base and are accessible through a prototype evidence exploration user interface and a set of Web application programming interface services. This system operates within a larger software stack provided by the Observational Health Data Sciences and Informatics clinical research framework, including the relational Common Data Model for observational patient data created by the Observational Medical Outcomes Partnership. Elements of the Linked Data paradigm facilitate the systematic and scalable integration of relevant evidence sources. The prototype LAERTES system provides useful functionality while creating opportunities for further research. Future work will involve improving the method for normalizing drug and HOI concepts across the integrated sources, aggregating evidence at different levels of a hierarchy of HOI concepts, and developing a more advanced user interface for drug-HOI investigations.

  10. Neuronal integration of dynamic sources: Bayesian learning and Bayesian inference.

    PubMed

    Siegelmann, Hava T; Holzman, Lars E

    2010-09-01

    One of the brain's most basic functions is integrating sensory data from diverse sources. This ability causes us to question whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neural inspired circuit with this property. Our implementation suggests the possibility that evidence learning requires a more complex organization of the network than was previously assumed, where neurons have different specialties, whose emergence brings the desired adaptivity seen in human online inference.

  11. SNPConvert: SNP Array Standardization and Integration in Livestock Species.

    PubMed

    Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra

    2016-06-09

    One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources has become a challenge, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
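
    The allele-coding problem can be made concrete with a small, hypothetical sketch: Illumina-style A/B genotype calls only become comparable across arrays once they are recoded to nucleotides via a per-marker lookup built from the array manifest. The file layout, column names, and missing-value codes below are illustrative only and are not SNPConvert's actual implementation.

      # Minimal sketch of SNP allele recoding: map per-marker A/B calls to
      # nucleotides using a manifest lookup. All file and column names are
      # hypothetical, for illustration only.
      import csv

      def load_manifest(path):
          """Build {snp_id: {'A': nucleotide, 'B': nucleotide}} from a manifest CSV."""
          lookup = {}
          with open(path, newline="") as handle:
              for row in csv.DictReader(handle):
                  lookup[row["snp_id"]] = {"A": row["allele_a"], "B": row["allele_b"]}
          return lookup

      def recode(snp_id, ab_call, lookup, missing=("--", "NC")):
          """Convert an A/B-style call (e.g. 'AB', 'BB') to nucleotides (e.g. 'GT')."""
          if ab_call in missing:
              return "00"
          alleles = lookup[snp_id]
          return "".join(alleles[allele] for allele in ab_call)

      # Tiny worked example: a marker whose A and B alleles are G and T.
      manifest = {"rs0001": {"A": "G", "B": "T"}}
      print(recode("rs0001", "AB", manifest))  # -> "GT"
      print(recode("rs0001", "--", manifest))  # -> "00" (missing call)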

  12. WATERSHED CLASSIFICATION AS A TOOL FOR MONITORING, ASSESSMENT, AND MANAGEMENT

    EPA Science Inventory

    Most sources of stream impairment are related to nonpoint source pollution. To more efficiently deal with TMDL-related issues, an integrated approach to small watershed assessment, diagnosis, and restoration planning is needed that is based on differences in sensitivity and prob...

  13. Commentary: Advances in Research on Sourcing-Source Credibility and Reliable Processes for Producing Knowledge Claims

    ERIC Educational Resources Information Center

    Chinn, Clark A.; Rinehart, Ronald W.

    2016-01-01

    In our commentary on this excellent set of articles on "Sourcing in the Reading Process," we endeavor to synthesize the findings from the seven articles and discuss future research. We discuss significant contributions related to source memory, source evaluation, use of sources in action and belief, integration of information from…

  14. Survey of organizational research climates in three research intensive, doctoral granting universities.

    PubMed

    Wells, James A; Thrush, Carol R; Martinson, Brian C; May, Terry A; Stickler, Michelle; Callahan, Eileen C; Klomparens, Karen L

    2014-12-01

    The Survey of Organizational Research Climate (SOuRCe) is a new instrument that assesses dimensions of research integrity climate, including ethical leadership, socialization and communication processes, and policies, procedures, structures, and processes to address risks to research integrity. We present a descriptive analysis to characterize differences on the SOuRCe scales across departments, fields of study, and status categories (faculty, postdoctoral scholars, and graduate students) for 11,455 respondents from three research-intensive universities. Among the seven SOuRCe scales, variance explained by status and fields of study ranged from 7.6% (Advisor-Advisee Relations) to 16.2% (Integrity Norms). Department accounted for greater than 50% of the variance explained for each of the SOuRCe scales, ranging from 52.6% (Regulatory Quality) to 80.3% (Integrity Inhibitors). It is feasible to implement this instrument in large university settings across a broad range of fields, department types, and individual roles within academic units. Published baseline results provide initial data for institutions using the SOuRCe who wish to compare their own research integrity climates. © The Author(s) 2014.

  15. DESIGN OF A HIGH COMPRESSION, DIRECT INJECTION, SPARK-IGNITION, METHANOL FUELED RESEARCH ENGINE WITH AN INTEGRAL INJECTOR-IGNITION SOURCE INSERT, SAE PAPER 2001-01-3651

    EPA Science Inventory

    A stratified charge research engine and test stand were designed and built for this work. The primary goal of this project was to evaluate the feasibility of using a removable integral injector-ignition source insert which allows a convenient method of changing the relative locat...

  16. Integrating information from disparate sources: the Walter Reed National Surgical Quality Improvement Program Data Transfer Project.

    PubMed

    Nelson, Victoria; Nelson, Victoria Ruth; Li, Fiona; Green, Susan; Tamura, Tomoyoshi; Liu, Jun-Min; Class, Margaret

    2008-11-06

    The Walter Reed National Surgical Quality Improvement Program Data Transfer web module integrates with medical and surgical information systems, and leverages outside standards, such as the National Library of Medicine's RxNorm, to process surgical and risk assessment data. Key components of the project included a needs assessment with nurse reviewers and a data analysis for federated (standards were locally controlled) data sources. The resulting interface streamlines nurse reviewer workflow by integrating related tasks and data.

  17. On the power output of some idealized source configurations with one or more characteristic dimensions

    NASA Technical Reports Server (NTRS)

    Levine, H.

    1982-01-01

    The calculation of power output from a (finite) linear array of equidistant point sources is investigated with allowance for a relative phase shift and particular focus on the circumstances of small/large individual source separation. A key role is played by the estimates found for a twin-parameter definite integral that involves the Fejér kernel functions F_N, where N denotes a (positive) integer; these results also permit a quantitative accounting of the energy partition between the principal and secondary lobes of the array pattern. Continuously distributed sources along a finite line segment or an open-ended circular cylindrical shell are considered, and estimates for the relatively lower output in the latter configuration are made explicit when the shell radius is small compared to the wavelength. A systematic reduction of diverse integrals which characterize the energy output from specific line and strip sources is investigated.
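
    For reference, the Fejér kernel F_N that appears in such array-power estimates is the standard object below; this definition is general background, not a formula quoted from the report.

      \[
        F_N(\theta) \;=\; \frac{1}{N}\,\frac{\sin^{2}(N\theta/2)}{\sin^{2}(\theta/2)}
        \;=\; \sum_{n=-(N-1)}^{N-1}\Bigl(1-\frac{|n|}{N}\Bigr)e^{\,i n\theta}.
      \]

    The squared magnitude of the array factor of an N-element, equally spaced, uniformly excited array equals N F_N(ψ), with ψ the inter-element phase variable, so the total radiated power involves a definite integral of F_N in which the element spacing and the progressive phase shift enter as the two parameters, consistent with the twin-parameter integral mentioned above.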

  18. Automating data acquisition into ontologies from pharmacogenetics relational data sources using declarative object definitions and XML.

    PubMed

    Rubin, Daniel L; Hewett, Micheal; Oliver, Diane E; Klein, Teri E; Altman, Russ B

    2002-01-01

    Ontologies are useful for organizing large numbers of concepts having complex relationships, such as the breadth of genetic and clinical knowledge in pharmacogenomics. But because ontologies change and knowledge evolves, it is time-consuming to maintain stable mappings to external data sources that are in relational format. We propose a method for interfacing ontology models with data acquisition from external relational data sources. This method uses a declarative interface between the ontology and the data source, and this interface is modeled in the ontology and implemented using XML schema. Data is imported from the relational source into the ontology using XML, and data integrity is checked by validating the XML submission with an XML schema. We have implemented this approach in PharmGKB (http://www.pharmgkb.org/), a pharmacogenetics knowledge base. Our goals were to (1) import genetic sequence data, collected in relational format, into the pharmacogenetics ontology, and (2) automate the process of updating the links between the ontology and data acquisition when the ontology changes. We tested our approach by linking PharmGKB with data acquisition from a relational model of genetic sequence information. The ontology subsequently evolved, and we were able to rapidly update our interface with the external data and continue acquiring the data. Similar approaches may be helpful for integrating other heterogeneous information sources in order to make the diversity of pharmacogenetics data amenable to computational analysis.
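
    The validation step described above, checking an XML submission against an XML Schema before loading it, can be sketched as follows. The file and element names are hypothetical, and this is not PharmGKB's actual code.

      # Sketch: validate an XML data submission against an XML Schema (XSD)
      # before importing it into the knowledge base. File and element names
      # are hypothetical.
      from lxml import etree

      schema = etree.XMLSchema(etree.parse("sequence_submission.xsd"))
      document = etree.parse("sequence_submission.xml")

      if schema.validate(document):
          # The submission satisfies the declared interface; hand it to the loader.
          for variant in document.iter("variant"):      # element name is illustrative
              print(variant.get("gene"), variant.get("position"))
      else:
          # Reject the submission and report which constraints failed.
          for error in schema.error_log:
              print(error.line, error.message)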

  19. Artemis: Integrating Scientific Data on the Grid (Preprint)

    DTIC Science & Technology

    2004-07-01

    Theseus execution engine [Barish and Knoblock 03] to efficiently execute the generated datalog program. The Theseus execution engine has a wide variety of operations to query databases, web sources, and web services. Theseus also contains a wide variety of relational operations, such as selection, union, or projection. Furthermore, Theseus optimizes the execution of an integration plan by querying several data sources in parallel and

  20. Data Foundry: Data Warehousing and Integration for Scientific Data Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musick, R.; Critchlow, T.; Ganesh, M.

    2000-02-29

    Data warehousing is an approach for managing data from multiple sources by representing them with a single, coherent point of view. Commercial data warehousing products have been produced by companies such as Red Brick, IBM, Brio, Andyne, Ardent, NCR, Information Advantage, Informatica, and others. Other companies have chosen to develop their own in-house data warehousing solution using relational databases, such as those sold by Oracle, IBM, Informix and Sybase. The typical approaches include federated systems and mediated data warehouses, each of which, to some extent, makes use of a series of source-specific wrapper and mediator layers to integrate the data into a consistent format which is then presented to users as a single virtual data store. These approaches are successful when applied to traditional business data because the data format used by the individual data sources tends to be rather static. Therefore, once a data source has been integrated into a data warehouse, there is relatively little work required to maintain that connection. However, that is not the case for all data sources. Data sources from scientific domains tend to regularly change their data model, format and interface. This is problematic because each change requires the warehouse administrator to update the wrapper, mediator, and warehouse interfaces to properly read, interpret, and represent the modified data source. Furthermore, the data that scientists require to carry out research is continuously changing as their understanding of a research question develops, or as their research objectives evolve. The difficulty and cost of these updates effectively limits the number of sources that can be integrated into a single data warehouse, or makes an approach based on warehousing too expensive to consider.

  1. Social Integration and Domestic Violence Support in an Indigenous Community: Women's Recommendations of Formal Versus Informal Sources of Support.

    PubMed

    Gauthier, G Robin; Francisco, Sara C; Khan, Bilal; Dombrowski, Kirk

    2018-05-01

    Throughout North America, indigenous women experience higher rates of intimate partner violence and sexual violence than any other ethnic group, and so it is of particular importance to understand sources of support for Native American women. In this article, we use social network analysis to study the relationship between social integration and women's access to domestic violence support by examining the recommendations they would give to another woman in need. We ask two main questions: First, are less integrated women more likely to make no recommendation at all when compared with more socially integrated women? Second, are less integrated women more likely than more integrated women to nominate a formal source of support rather than an informal one? We use network data collected from interviews with 158 Canadian women residing in an indigenous community to measure their access to support. We find that, in general, less integrated women are less likely to make a recommendation than more integrated women. However, when they do make a recommendation, less integrated women are more likely to recommend a formal source of support than women who are more integrated. These results add to our understanding of how access to two types of domestic violence support is embedded in the larger set of social relations of an indigenous community.

  2. Wilber's Integral Theory and Dossey's Theory of Integral Nursing: An Examination of Two Integral Approaches in Nursing Scholarship.

    PubMed

    Shea, Linda; Frisch, Noreen

    2016-09-01

    The purpose of this article is to examine Dossey's theory of integral nursing in relation to its major theoretical source, Wilber's integral theory. Although several nursing scholars have written about integral theory in relation to nursing scholarship and practice, Dossey's theory of integral nursing may be influencing how nurses take up integral theory in a significant way due to an extensive outreach in the holistic nursing community. Despite this wide circulation, the theory of integral nursing has yet to be reviewed in the nursing literature. This article (a) compares Dossey's theory of integral nursing with Wilber's integral theory and (b) contrasts Dossey's integral approach with another integral approach used by other scholars of integral theory. © The Author(s) 2015.

  3. Fluorescence errors in integrating sphere measurements of remote phosphor type LED light sources

    NASA Astrophysics Data System (ADS)

    Keppens, A.; Zong, Y.; Podobedov, V. B.; Nadal, M. E.; Hanselaer, P.; Ohno, Y.

    2011-05-01

    The relative spectral radiant flux error caused by phosphor fluorescence during integrating sphere measurements is investigated both theoretically and experimentally. Integrating sphere and goniophotometer measurements are compared and used for model validation, while a case study provides additional clarification. Criteria for reducing fluorescence errors to a degree of negligibility as well as a fluorescence error correction method based on simple matrix algebra are presented. Only remote phosphor type LED light sources are studied because of their large phosphor surfaces and high application potential in general lighting.

  4. Audiovisual Speech Integration in Pervasive Developmental Disorder: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Magnee, Maurice J. C. M.; de Gelder, Beatrice; van Engeland, Herman; Kemner, Chantal

    2008-01-01

    Background: Integration of information from multiple sensory sources is an important prerequisite for successful social behavior, especially during face-to-face conversation. It has been suggested that communicative impairments among individuals with pervasive developmental disorders (PDD) might be caused by an inability to integrate synchronously…

  5. Wave field synthesis of a virtual source located in proximity to a loudspeaker array.

    PubMed

    Lee, Jung-Min; Choi, Jung-Woo; Kim, Yang-Hann

    2013-09-01

    For the derivation of the 2.5-dimensional operator in wave field synthesis, a virtual source is assumed to be positioned far from a loudspeaker array. However, such a far-field approximation inevitably results in a reproduction error when the virtual source is placed adjacent to an array. In this paper, a method is proposed to generate a virtual source close to and behind a continuous line array of loudspeakers. A driving function is derived by reducing a surface integral (Rayleigh integral) to a line integral based on the near-field assumption. The solution is then combined with the far-field formula of wave field synthesis by introducing a weighting function that can adjust the near- and far-field contributions of each driving function. This enables production of a virtual source anywhere in relation to the array. Simulations show the proposed method can reduce the reproduction error to below -18 dB, regardless of the virtual source position.

  6. COHeRE: Cross-Ontology Hierarchical Relation Examination for Ontology Quality Assurance.

    PubMed

    Cui, Licong

    Biomedical ontologies play a vital role in healthcare information management, data integration, and decision support. Ontology quality assurance (OQA) is an indispensable part of the ontology engineering cycle. Most existing OQA methods are based on the knowledge provided within the targeted ontology. This paper proposes a novel cross-ontology analysis method, Cross-Ontology Hierarchical Relation Examination (COHeRE), to detect inconsistencies and possible errors in hierarchical relations across multiple ontologies. COHeRE leverages the Unified Medical Language System (UMLS) knowledge source and the MapReduce cloud computing technique for systematic, large-scale ontology quality assurance work. COHeRE consists of three main steps with the UMLS concepts and relations as the input. First, the relations claimed in source vocabularies are filtered and aggregated for each pair of concepts. Second, inconsistent relations are detected if a concept pair is related by different types of relations in different source vocabularies. Finally, the uncovered inconsistent relations receive votes according to their number of occurrences across different source vocabularies. The voting result, together with the inconsistent relations, serves as the output of COHeRE for possible ontological change. The highest votes provide an initial suggestion of how such inconsistencies might be fixed. In UMLS, 138,987 concept pairs were found to have inconsistent relationships across multiple source vocabularies. Forty inconsistent concept pairs involving hierarchical relationships were randomly selected and manually reviewed by a human expert. 95.8% of the inconsistent relations involved in these concept pairs indeed exist in their source vocabularies rather than being introduced by mistake in the UMLS integration process. For 73.7% of the concept pairs, the human expert agreed with the suggested relationship. The effectiveness of COHeRE indicates that UMLS provides a promising environment to enhance the quality of biomedical ontologies by performing cross-ontology examination.
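
    The three steps can be illustrated with a small, purely sequential sketch on toy data (the real system runs over the UMLS relation data with MapReduce; the vocabularies, concepts, and relation types below are invented):

      # Toy illustration of COHeRE's three steps on invented data.
      from collections import Counter, defaultdict

      # (concept_1, concept_2, relation_type, source_vocabulary)
      relations = [
          ("C1", "C2", "is_a",    "VOCAB_A"),
          ("C1", "C2", "is_a",    "VOCAB_B"),
          ("C1", "C2", "part_of", "VOCAB_C"),   # disagrees with the two rows above
          ("C3", "C4", "is_a",    "VOCAB_A"),
      ]

      # Step 1: filter/aggregate the relations claimed for each concept pair.
      claims = defaultdict(lambda: defaultdict(set))
      for c1, c2, rel, vocab in relations:
          claims[(c1, c2)][rel].add(vocab)

      # Step 2: a pair is inconsistent if different source vocabularies claim
      # different relation types for it.
      inconsistent = {pair: rels for pair, rels in claims.items() if len(rels) > 1}

      # Step 3: vote each competing relation by the number of vocabularies that
      # assert it; the highest vote suggests how the inconsistency might be fixed.
      for pair, rels in inconsistent.items():
          votes = Counter({rel: len(vocabs) for rel, vocabs in rels.items()})
          print(pair, votes.most_common())  # ('C1', 'C2') [('is_a', 2), ('part_of', 1)]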

  7. How Sources of Sexual Information Relate to Adolescents' Beliefs about Sex

    ERIC Educational Resources Information Center

    Bleakley, Amy; Hennessy, Michael; Fishbein, Martin; Jordan, Amy

    2009-01-01

    Objectives: To examine how sources of sexual information are associated with adolescents' behavioral, normative, and control beliefs about having sexual intercourse using the integrative model of behavior change. Methods: Survey data from a quota sample of 459 youth. Results: The most frequently reported sources were friends, teachers, mothers,…

  8. Integration and Beyond

    PubMed Central

    Stead, William W.; Miller, Randolph A.; Musen, Mark A.; Hersh, William R.

    2000-01-01

    The vision of integrating information—from a variety of sources, into the way people work, to improve decisions and process—is one of the cornerstones of biomedical informatics. Thoughts on how this vision might be realized have evolved as improvements in information and communication technologies, together with discoveries in biomedical informatics, have changed the art of the possible. This review identified three distinct generations of “integration” projects. First-generation projects create a database and use it for multiple purposes. Second-generation projects integrate by bringing information from various sources together through enterprise information architecture. Third-generation projects inter-relate disparate but accessible information sources to provide the appearance of integration. The review suggests that the ideas developed in the earlier generations have not been supplanted by ideas from subsequent generations. Instead, the ideas represent a continuum of progress along the three dimensions of workflow, structure, and extraction. PMID:10730596

  9. Integration of Reference Frames Using VLBI

    NASA Technical Reports Server (NTRS)

    Ma, Chopo; Smith, David E. (Technical Monitor)

    2001-01-01

    Very Long Baseline Interferometry (VLBI) has the unique potential to integrate the terrestrial and celestial reference frames through simultaneous estimation of positions and velocities of approx. 40 active VLBI stations and a similar number of stations/sites with sufficient historical data, the position and position stability of approx. 150 well-observed extragalactic radio sources and another approx. 500 sources distributed fairly uniformly on the sky, and the time series of the five parameters that specify the relative orientation of the two frames. The full realization of this potential is limited by a number of factors including the temporal and spatial distribution of the stations, uneven distribution of observations over the sources and the sky, variations in source structure, modeling of the solid/fluid Earth and troposphere, logistical restrictions on the daily observing network size, and differing strategies for optimizing analysis for TRF, for CRF and for EOP. The current status of separately optimized and integrated VLBI analysis will be discussed.

  10. NoSQL data model for semi-automatic integration of ethnomedicinal plant data from multiple sources.

    PubMed

    Ningthoujam, Sanjoy Singh; Choudhury, Manabendra Dutta; Potsangbam, Kumar Singh; Chetia, Pankaj; Nahar, Lutfun; Sarker, Satyajit D; Basar, Norazah; Das Talukdar, Anupam

    2014-01-01

    Sharing traditional knowledge with the scientific community could refine scientific approaches to phytochemical investigation and conservation of ethnomedicinal plants. As such, integration of traditional knowledge with scientific data using a single platform for sharing is greatly needed. However, ethnomedicinal data are available in heterogeneous formats, which depend on cultural aspects, survey methodology and focus of the study. Phytochemical and bioassay data are also available from many open sources in various standards and customised formats. The aim of this work was to design a flexible data model that could integrate both primary and curated ethnomedicinal plant data from multiple sources. The current model is based on MongoDB, one of the Not only Structured Query Language (NoSQL) databases. Although MongoDB does not enforce a schema, modifications were made so that the model could incorporate both standard and customised ethnomedicinal plant data formats from different sources. The model presented can integrate both primary and secondary data related to ethnomedicinal plants. Accommodation of disparate data was accomplished by a feature of this database that supported a different set of fields for each document. It also allowed storage of similar data having different properties. The model presented is scalable to a highly complex level with continuing maturation of the database, and is applicable for storing, retrieving and sharing ethnomedicinal plant data. It can also serve as a flexible alternative to a relational and normalised database. Copyright © 2014 John Wiley & Sons, Ltd.
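
    The schema flexibility that the model relies on can be illustrated with a short, hypothetical sketch: two documents with different field sets live in the same collection and are returned by the same query. The database, collection, and field names below are invented for illustration and are not the paper's actual model.

      # Sketch: documents with different field sets stored in one collection,
      # which is what makes a document store suitable for heterogeneous
      # survey and curated formats. Names are hypothetical.
      from pymongo import MongoClient

      db = MongoClient("mongodb://localhost:27017")["ethnomed"]

      # A primary field-survey record.
      db.plants.insert_one({
          "species": "Ocimum tenuiflorum",
          "local_name": "Tulsi",
          "ailments": ["cough", "fever"],
          "informant_count": 12,
      })

      # A curated record from a phytochemical source, with a different field set.
      db.plants.insert_one({
          "species": "Ocimum tenuiflorum",
          "compounds": [{"name": "eugenol", "activity": "antimicrobial"}],
          "curated_source": "example phytochemical database",
      })

      # Both documents come back from the same species query despite their
      # different structures; no shared schema is required.
      for doc in db.plants.find({"species": "Ocimum tenuiflorum"}):
          print(sorted(doc.keys()))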

  11. Atlas - a data warehouse for integrative bioinformatics.

    PubMed

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis

    2005-02-21

    We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/
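
    As a rough, hypothetical illustration of the warehouse pattern described here (a common relational model populated by loader applications and queried through a toolbox API), the sketch below uses SQLite and invented tables; Atlas's actual models are far richer and its APIs are provided in C++, Java, and Perl.

      # Sketch of the warehouse pattern: load two sources into one relational
      # model, then answer an integrated query with a join. Schema and data
      # are hypothetical.
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
          CREATE TABLE gene (id INTEGER PRIMARY KEY, symbol TEXT, source TEXT);
          CREATE TABLE interaction (gene_a INTEGER REFERENCES gene(id),
                                    gene_b INTEGER REFERENCES gene(id),
                                    source TEXT);
      """)

      # "Loader" step: records parsed from two different source datasets.
      db.executemany("INSERT INTO gene VALUES (?, ?, ?)",
                     [(1, "TP53", "RefSeq"), (2, "MDM2", "RefSeq")])
      db.execute("INSERT INTO interaction VALUES (1, 2, 'BIND')")

      # "Toolbox" step: one integrated query spanning both sources.
      query = """SELECT a.symbol, b.symbol, i.source
                 FROM interaction i
                 JOIN gene a ON a.id = i.gene_a
                 JOIN gene b ON b.id = i.gene_b"""
      for row in db.execute(query):
          print(row)  # ('TP53', 'MDM2', 'BIND')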

  12. Atlas – a data warehouse for integrative bioinformatics

    PubMed Central

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis

    2005-01-01

    Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: PMID:15723693

  13. Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali

    2007-01-01

    We discuss here the relative merits of these numbers as possible random sequence sources. The quality of these sequences is not judged directly based on the outcome of all known tests for the randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive for using a random sequence is to solve real world problems, it is more desirable if we compare the quality of the sequences based on their performances for these problems in terms of quality/accuracy of the output. We also compare these sources against those generated by a popular pseudo-random generator, viz., the Matlab rand, and the quasi-random generator halton, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. It is observed that randomly chosen blocks of digits do not have any remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. Also, it reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
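
    A minimal version of the experiment reads: take consecutive digit blocks of pi (or of the golden ratio), map each block to a number in [0, 1), and use those numbers as the sample points of a Monte Carlo estimate. The sketch below does this for a simple test integrand; the block length and the integrand are arbitrary illustrative choices, not those of the paper.

      # Sketch: consecutive digit blocks of pi as a "random" source for Monte
      # Carlo integration of f(x) = x**2 on [0, 1] (true value 1/3).
      from mpmath import mp

      def pi_digit_samples(n_samples, block=5):
          """Yield n_samples points in [0, 1) from consecutive digit blocks of pi."""
          n_digits = n_samples * block + 5
          mp.dps = n_digits + 10                                  # enough precision
          digits = mp.nstr(+mp.pi, n_digits).replace(".", "")[1:]  # fractional digits
          for i in range(n_samples):
              yield int(digits[i * block:(i + 1) * block]) / 10 ** block

      f = lambda x: x * x            # test integrand on [0, 1]
      n = 2000
      estimate = sum(f(x) for x in pi_digit_samples(n)) / n
      print(estimate)                # close to 1/3 if the digit blocks behave "randomly"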

  14. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    PubMed

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease and lab tests relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledgebase. Disease and lab test concepts are identified using MetaMap and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40%, when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease and lab test relation knowledge base.

  15. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    PubMed

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated information and, therefore, more reliable estimated breeding values were obtained. The proposed unified method integrated and blended several sources of information well into a genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. The unified method can also be extended to other types of situations such as single-step genomic or multi-trait evaluations, combining information across different traits.
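
    If the external sources had been computed from mutually independent data and pedigrees, blending would reduce to the textbook precision-weighted (inverse-variance) average shown below; this is background rather than the method of the paper, and it makes the double-counting problem visible, since simply summing the precisions of overlapping sources overstates the information actually available.

      \[
        \hat{u}_{\mathrm{blend}} \;=\; \Bigl(\sum_{i} P_{i}\Bigr)^{-1} \sum_{i} P_{i}\,\hat{u}_{i},
      \]

    where \hat{u}_i is the vector of external estimated breeding values from source i and P_i the corresponding prediction-error precision implied by its published reliabilities. Roughly speaking, the equations developed in the paper perform this kind of combination inside the internal evaluation while removing the contributions that shared records and relationships would otherwise count twice.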

  16. Semantic integration of data on transcriptional regulation

    PubMed Central

    Baitaluk, Michael; Ponomarenko, Julia

    2010-01-01

    Motivation: Experimental and predicted data concerning gene transcriptional regulation are distributed among many heterogeneous sources. However, there are no resources to integrate these data automatically or to provide a ‘one-stop shop’ experience for users seeking information essential for deciphering and modeling gene regulatory networks. Results: IntegromeDB, a semantic graph-based ‘deep-web’ data integration system that automatically captures, integrates and manages publicly available data concerning transcriptional regulation, as well as other relevant biological information, is proposed in this article. The problems associated with data integration are addressed by ontology-driven data mapping, multiple data annotation and heterogeneous data querying, also enabling integration of the user's data. IntegromeDB integrates over 100 experimental and computational data sources relating to genomics, transcriptomics, genetics, and functional and interaction data concerning gene transcriptional regulation in eukaryotes and prokaryotes. Availability: IntegromeDB is accessible through the integrated research environment BiologicalNetworks at http://www.BiologicalNetworks.org Contact: baitaluk@sdsc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20427517

  17. Semantic integration of data on transcriptional regulation.

    PubMed

    Baitaluk, Michael; Ponomarenko, Julia

    2010-07-01

    Experimental and predicted data concerning gene transcriptional regulation are distributed among many heterogeneous sources. However, there are no resources to integrate these data automatically or to provide a 'one-stop shop' experience for users seeking information essential for deciphering and modeling gene regulatory networks. IntegromeDB, a semantic graph-based 'deep-web' data integration system that automatically captures, integrates and manages publicly available data concerning transcriptional regulation, as well as other relevant biological information, is proposed in this article. The problems associated with data integration are addressed by ontology-driven data mapping, multiple data annotation and heterogeneous data querying, also enabling integration of the user's data. IntegromeDB integrates over 100 experimental and computational data sources relating to genomics, transcriptomics, genetics, and functional and interaction data concerning gene transcriptional regulation in eukaryotes and prokaryotes. IntegromeDB is accessible through the integrated research environment BiologicalNetworks at http://www.BiologicalNetworks.org baitaluk@sdsc.edu Supplementary data are available at Bioinformatics online.

  18. Empowering Provenance in Data Integration

    NASA Astrophysics Data System (ADS)

    Kondylakis, Haridimos; Doerr, Martin; Plexousakis, Dimitris

    The provenance of data has recently been recognized as central to the trust one places in data. This paper presents a novel framework to empower provenance in a mediator-based data integration system. We use a simple mapping language for mapping schema constructs between an ontology and relational sources, capable of carrying provenance information. This language extends the traditional data exchange setting by translating our mapping specifications into source-to-target tuple-generating dependencies (s-t tgds). We then formally define the provenance information we want to retrieve, i.e. annotation, source and tuple provenance. We provide three algorithms to retrieve provenance information using information stored in the mappings and the sources. We show the feasibility of our solution and the advantages of our framework.
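
    For readers unfamiliar with the formalism, a source-to-target tuple-generating dependency (s-t tgd) has the general shape below, shown with an invented relational-to-ontology instance (the relation and property names are illustrative, not taken from the paper); the mapping language above compiles schema correspondences into constraints of this form and carries the provenance annotation alongside each mapping.

      \[
        \forall\bar{x}\;\bigl(\varphi_S(\bar{x}) \;\rightarrow\; \exists\bar{y}\;\psi_T(\bar{x},\bar{y})\bigr),
        \qquad\text{e.g.}\quad
        \forall p\,\forall n\;\bigl(\mathit{patient}(p,n) \;\rightarrow\; \exists c\;\mathit{Person}(c)\wedge\mathit{hasName}(c,n)\bigr),
      \]

    where \varphi_S is a conjunction of atoms over the relational source schema and \psi_T a conjunction of atoms over the target ontology.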

  19. [Arabian food pyramid: unified framework for nutritional health messages].

    PubMed

    Shokr, Adel M

    2008-01-01

    There are several ways to present nutritional health messages, particularly pyramidic indices, but they have many deficiencies such as lack of agreement on a unified or clear methodology for food grouping and ignoring nutritional group inter-relation and integration. This causes confusion for health educators and target individuals. This paper presents an Arabian food pyramid that aims to unify the bases of nutritional health messages, bringing together the function, contents, source and nutritional group servings and indicating the inter-relation and integration of nutritional groups. This provides comprehensive, integrated, simple and flexible health messages.

  20. The Digital Ageing Atlas: integrating the diversity of age-related changes into a unified resource.

    PubMed

    Craig, Thomas; Smelick, Chris; Tacutu, Robi; Wuttke, Daniel; Wood, Shona H; Stanley, Henry; Janssens, Georges; Savitskaya, Ekaterina; Moskalev, Alexey; Arking, Robert; de Magalhães, João Pedro

    2015-01-01

    Multiple studies characterizing the human ageing phenotype have been conducted for decades. However, there is no centralized resource in which data on multiple age-related changes are collated. Currently, researchers must consult several sources, including primary publications, in order to obtain age-related data at various levels. To address this and facilitate integrative, system-level studies of ageing we developed the Digital Ageing Atlas (DAA). The DAA is a one-stop collection of human age-related data covering different biological levels (molecular, cellular, physiological, psychological and pathological) that is freely available online (http://ageing-map.org/). Each of the >3000 age-related changes is associated with a specific tissue and has its own page displaying a variety of information, including at least one reference. Age-related changes can also be linked to each other in hierarchical trees to represent different types of relationships. In addition, we developed an intuitive and user-friendly interface that allows searching, browsing and retrieving information in an integrated and interactive fashion. Overall, the DAA offers a new approach to systemizing ageing resources, providing a manually-curated and readily accessible source of age-related changes. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Graph-Based Weakly-Supervised Methods for Information Extraction & Integration

    ERIC Educational Resources Information Center

    Talukdar, Partha Pratim

    2010-01-01

    The variety and complexity of potentially-related data resources available for querying--webpages, databases, data warehouses--has been growing ever more rapidly. There is a growing need to pose integrative queries "across" multiple such sources, exploiting foreign keys and other means of interlinking data to merge information from diverse…

  2. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  3. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.

  4. Research on effects of baffle position in an integrating sphere on the luminous flux measurement

    NASA Astrophysics Data System (ADS)

    Lin, Fangsheng; Li, Tiecheng; Yin, Dejin; Lai, Lei; Xia, Ming

    2016-09-01

    In the field of optical metrology, luminous flux is an important index for characterizing the quality of an electric light source. Currently, most luminous flux measurements are based on the integrating sphere method, so the measurement accuracy of the integrating sphere is the key factor. Many factors affect the measurement accuracy, such as the coating, the power, and the position of the light source; the baffle, a key part of the integrating sphere, also has important effects on the measurement results. The paper analyzes in detail the principle of an ideal integrating sphere. We use a moving rail to change the relative position of the baffle and the light source inside the sphere. Through experiments, we obtain measured luminous flux values at different distances between the light source and the baffle, and use them to analyze the effects of baffle position on the measurement. By theoretical calculation, computer simulation, and experiment, we obtain the optimum baffle position for luminous flux measurements. Based on an analysis of the overall luminous flux measurement error, we develop methods and apparatus to improve the accuracy and reliability of the luminous flux measurement. This makes our luminous flux unification and transfer work in East China more accurate and provides effective support for our traceability system.
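
    For orientation, the ideal-sphere relation that underlies this measurement can be sketched as follows (a standard textbook form with assumed symbols, not taken from the paper): a source of total flux \Phi inside a sphere of interior surface area A and uniform wall reflectance \rho produces, after multiple diffuse reflections, a wall irradiance

        E_\mathrm{wall} \;=\; \frac{\Phi}{A}\,\frac{\rho}{1-\rho}

    so the detector signal is ideally proportional to the total flux alone; the baffle and its position matter precisely because they perturb this ideal multiple-reflection geometry by blocking part of the first-bounce light.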

  5. Efficient Approaches for Evaluating the Planar Microstrip Green's Function and its Applications to the Analysis of Microstrip Antennas.

    NASA Astrophysics Data System (ADS)

    Barkeshli, Sina

    A relatively simple and efficient closed form asymptotic representation of the microstrip dyadic surface Green's function is developed. The large parameter in this asymptotic development is proportional to the lateral separation between the source and field points along the planar microstrip configuration. Surprisingly, this asymptotic solution remains accurate even for very small (almost two tenths of a wavelength) lateral separation of the source and field points. The present asymptotic Green's function will thus allow a very efficient calculation of the currents excited on microstrip antenna patches/feed lines and monolithic millimeter and microwave integrated circuit (MIMIC) elements based on a moment method (MM) solution of an integral equation for these currents. The kernel of the latter integral equation is the present asymptotic form of the microstrip Green's function. It is noted that the conventional Sommerfeld integral representation of the microstrip surface Green's function is very poorly convergent when used in this MM formulation. In addition, an efficient exact steepest descent path integral form employing a radially propagating representation of the microstrip dyadic Green's function is also derived which exhibits a relatively faster convergence when compared to the conventional Sommerfeld integral representation. The same steepest descent form could also be obtained by deforming the integration contour of the conventional Sommerfeld representation; however, the radially propagating integral representation exhibits better convergence properties for laterally separated source and field points even before the steepest descent path of integration is used. Numerical results based on the efficient closed form asymptotic solution for the microstrip surface Green's function developed in this work are presented for the mutual coupling between a pair of dipoles on a single layer grounded dielectric slab. The accuracy of the latter calculations is confirmed by comparison with results based on an exact integral representation for that Green's function.
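
    For context, the radially propagating (Sommerfeld-type) representation referred to above has the generic schematic form (the layered-medium spectral kernel \tilde{G} and the notation are assumptions, not reproduced from the paper)

        G(\rho) \;=\; \frac{1}{2\pi}\int_{0}^{\infty} \tilde{G}(k_\rho)\, J_0(k_\rho\,\rho)\, k_\rho\, dk_\rho

    whose integrand oscillates and decays slowly as the lateral separation \rho grows, which is why replacing the integral by a closed-form large-\rho asymptotic expansion makes the moment-method matrix fill far cheaper.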

  6. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.

  7. Integrating watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability

    USDA-ARS?s Scientific Manuscript database

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source (NPS) pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals...

  8. Situational Leadership, Perception, and the Impact of Power.

    ERIC Educational Resources Information Center

    Hersey, Paul; And Others

    1979-01-01

    Integrates the concept of power with situational leadership by relating the perception of a leader's power bases with leadership styles. Sources of power are identified; situational leadership is reviewed; and the Power Perception Profile is discussed. Maturity levels and their relationships to power sources and leadership styles are discussed.…

  9. An annular superposition integral for axisymmetric radiators.

    PubMed

    Kelly, James F; McGough, Robert J

    2007-02-01

    A fast integral expression for computing the nearfield pressure is derived for axisymmetric radiators. This method replaces the sum of contributions from concentric annuli with an exact double integral that converges much faster than methods that evaluate the Rayleigh-Sommerfeld integral or the generalized King integral. Expressions are derived for plane circular pistons using both continuous wave and pulsed excitations. Several commonly used apodization schemes for the surface velocity distribution are considered, including polynomial functions and a "smooth piston" function. The effect of different apodization functions on the spectral content of the wave field is explored. Quantitative error and time comparisons between the new method, the Rayleigh-Sommerfeld integral, and the generalized King integral are discussed. At all error levels considered, the annular superposition method achieves a speed-up of at least a factor of 4 relative to the point-source method and a factor of 3 relative to the generalized King integral without increasing the computational complexity.
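
    For reference, the continuous-wave Rayleigh-Sommerfeld integral against which the annular method is benchmarked can be written in the standard form (symbols and the e^{j\omega t} time convention are assumed, not quoted from the abstract): for a planar source with normal velocity distribution u over the surface S,

        p(\mathbf{r}) \;=\; \frac{j\rho_0 c k}{2\pi}\int_{S} u(\mathbf{r}')\,\frac{e^{-jk|\mathbf{r}-\mathbf{r}'|}}{|\mathbf{r}-\mathbf{r}'|}\, dS'

    and the cost of evaluating it directly scales with the number of surface (or point-source) samples, which is the cost the exact annular double integral is designed to avoid.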

  10. Probabilistic data integration and computational complexity

    NASA Astrophysics Data System (ADS)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all available information (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that solve the data integration problem either through an analytical description of the combined probability function or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Because one type of information is too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem, and can lead to biased results and under-estimation of uncertainty. However, in both examples, one can also analyze the performance of the sampling methods used to solve the data integration problem to indicate the existence of biased information. This can be used actively to avoid biases in the available information and, subsequently, in the final uncertainty evaluation.
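
    A minimal formal sketch of the probabilistic formulation described here (the standard conjunction-of-information form; symbols are assumptions, not the authors' notation): if each available source of information i is quantified as a probability density \rho_i(\mathbf{m}) over the model parameters \mathbf{m}, the integrated state of information is, up to normalization,

        \sigma(\mathbf{m}) \;\propto\; \prod_i \rho_i(\mathbf{m})

    and data integration amounts to characterizing or sampling \sigma. An overly informative or biased \rho_i concentrates \sigma in an unrealistically small region of model space, which is exactly the situation that makes the sampling hard in the first example above.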

  11. Identifications of Four Integral Sources in the Galactic Plane via CHANDRA Localizations

    NASA Technical Reports Server (NTRS)

    Tomsick, John A.; Chaty, Sylvain; Rodriquez, Jerome; Foschini, Luigi; Walter, Roland; Kaaret, Philip

    2006-01-01

    Hard X-ray imaging of the Galactic plane by the INTEGRAL satellite is uncovering large numbers of 20-100 keV "IGR" sources. We present results from Chandra, INTEGRAL, optical, and IR observations of four IGR sources: three sources in the Norma region of the Galaxy (IGR J16195-4945, IGR J16207-5129, and IGR J16167-4957) and one that is closer to the Galactic center (IGR J17195-4100). In all four cases, one relatively bright Chandra source is seen in the INTEGRAL error circle, and these are likely to be the soft X-ray counterparts of the IGR sources. They have hard 0.3-10 keV spectra with power-law photon indices of Gamma = 0.5-1.1. While many previously studied IGR sources show high column densities (N_H approx. 10^23-10^24 cm^-2), only IGR J16195-4945 has a column density that could be as high as 10^23 cm^-2. Using optical and IR sky survey catalogs and our own photometry, we have obtained identifications for all four sources. The J-band magnitudes are in the range 14.9-10.4, and we have used the optical/IR spectral energy distributions (SEDs) to constrain the nature of the sources. Blackbody components with temperature lower limits of >9400 K for IGR J16195-4945 and >18,000 K for IGR J16207-5129 indicate that these are very likely high-mass X-ray binaries (HMXBs). However, for IGR J16167-4957 and IGR J17195-4100, low extinction and the SEDs indicate later spectral types for the putative companions, suggesting that these are not HMXBs.
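
    For reference, the quoted photon indices and column densities correspond to the standard absorbed power-law spectral model (generic form, not reproduced from the paper)

        N(E) \;=\; K\,E^{-\Gamma}\,e^{-N_\mathrm{H}\,\sigma(E)} \qquad [\mathrm{photons\;cm^{-2}\,s^{-1}\,keV^{-1}}]

    where \Gamma is the photon index, N_H the equivalent hydrogen column density, and \sigma(E) the photoelectric absorption cross-section per hydrogen atom.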

  12. INTEGRAL/SPI data segmentation to retrieve source intensity variations

    NASA Astrophysics Data System (ADS)

    Bouchet, L.; Amestoy, P. R.; Buttari, A.; Rouet, F.-H.; Chauvin, M.

    2013-07-01

    Context. The INTEGRAL/SPI, X/γ-ray spectrometer (20 keV-8 MeV) is an instrument for which recovering source intensity variations is not straightforward and can constitute a difficulty for data analysis. In most cases, determining the source intensity changes between exposures is largely based on a priori information. Aims: We propose techniques that help to overcome the difficulty related to source intensity variations, which make this step more rational. In addition, the constructed "synthetic" light curves should permit us to obtain a sky model that describes the data better and optimizes the source signal-to-noise ratios. Methods: For this purpose, the time intensity variation of each source was modeled as a combination of piecewise segments of time during which a given source exhibits a constant intensity. To optimize the signal-to-noise ratios, the number of segments was minimized. We present a first method that takes advantage of previous time series that can be obtained from another instrument on-board the INTEGRAL observatory. A data segmentation algorithm was then used to synthesize the time series into segments. The second method no longer needs external light curves, but solely SPI raw data. For this, we developed a specific algorithm that involves the SPI transfer function. Results: The time segmentation algorithms that were developed solve a difficulty inherent to the SPI instrument, which is the intensity variations of sources between exposures, and it allows us to obtain more information about the sources' behavior. Based on observations with INTEGRAL, an ESA project with instruments and science data centre funded by ESA member states (especially the PI countries: Denmark, France, Germany, Italy, Spain, and Switzerland), Czech Republic and Poland with participation of Russia and the USA.
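
    The piecewise-constant modelling of a source light curve can be illustrated with a small toy sketch (an assumption-laden stand-in, not the authors' SPI algorithm): an optimal-partitioning change-point search that groups exposures into the fewest constant-intensity segments consistent with the data, trading fit quality against a per-segment penalty.

        import numpy as np

        def segment_constant(y, penalty):
            """Partition y into piecewise-constant segments by optimal partitioning.

            Minimizes the total within-segment squared error plus `penalty` per
            segment, trading goodness of fit against the number of segments (a toy
            stand-in for the signal-to-noise-driven criterion described above).
            Returns segment boundary indices, e.g. [0, 30, 50, 75].
            """
            y = np.asarray(y, dtype=float)
            n = len(y)
            s1 = np.concatenate(([0.0], np.cumsum(y)))       # prefix sums
            s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))  # prefix sums of squares

            def sse(i, j):
                # Squared error of modelling y[i:j] with its own mean.
                m = j - i
                return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m

            best = np.full(n + 1, np.inf)
            prev = np.zeros(n + 1, dtype=int)
            best[0] = 0.0
            for j in range(1, n + 1):
                for i in range(j):
                    c = best[i] + sse(i, j) + penalty
                    if c < best[j]:
                        best[j], prev[j] = c, i

            bounds = [n]
            while bounds[-1] > 0:
                bounds.append(prev[bounds[-1]])
            return bounds[::-1]

        # Synthetic "light curve" with two intensity changes (exposure-averaged rates).
        rng = np.random.default_rng(0)
        flux = np.concatenate([rng.normal(5, 1, 30), rng.normal(9, 1, 20), rng.normal(4, 1, 25)])
        print(segment_constant(flux, penalty=10.0))   # expected: boundaries near [0, 30, 50, 75]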

  13. Integrating Stomach Content and Stable Isotope Analyses to Quantify the Diets of Pygoscelid Penguins

    PubMed Central

    Polito, Michael J.; Trivelpiece, Wayne Z.; Karnovsky, Nina J.; Ng, Elizabeth; Patterson, William P.; Emslie, Steven D.

    2011-01-01

    Stomach content analysis (SCA) and more recently stable isotope analysis (SIA) integrated with isotopic mixing models have become common methods for dietary studies and provide insight into the foraging ecology of seabirds. However, both methods have drawbacks and biases that may result in difficulties in quantifying inter-annual and species-specific differences in diets. We used these two methods to simultaneously quantify the chick-rearing diet of Chinstrap (Pygoscelis antarctica) and Gentoo (P. papua) penguins and highlight methods of integrating SCA data to increase accuracy of diet composition estimates using SIA. SCA biomass estimates were highly variable and underestimated the importance of soft-bodied prey such as fish. Two-source, isotopic mixing model predictions were less variable and identified inter-annual and species-specific differences in the relative amounts of fish and krill in penguin diets not readily apparent using SCA. In contrast, multi-source isotopic mixing models had difficulty estimating the dietary contribution of fish species occupying similar trophic levels without refinement using SCA-derived otolith data. Overall, our ability to track inter-annual and species-specific differences in penguin diets using SIA was enhanced by integrating SCA data into isotopic mixing models in three ways: 1) selecting appropriate prey sources, 2) weighting combinations of isotopically similar prey in two-source mixing models and 3) refining predicted contributions of isotopically similar prey in multi-source models. PMID:22053199
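
    The two-source mixing model mentioned above takes the standard linear form (symbols assumed, not quoted from the paper): with consumer signature \delta_\mathrm{mix} and discrimination-corrected prey end-member signatures \delta_1 and \delta_2, the dietary proportions satisfy

        \delta_\mathrm{mix} = f_1\,\delta_1 + f_2\,\delta_2, \qquad f_1 + f_2 = 1 \quad\Rightarrow\quad f_1 = \frac{\delta_\mathrm{mix} - \delta_2}{\delta_1 - \delta_2}

    applied per isotope (e.g. \delta^{13}C, \delta^{15}N); the SCA-derived refinements described above enter through the choice and weighting of the prey end-members.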

  14. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  15. Integration of SAR and DEM data: Geometrical considerations

    NASA Technical Reports Server (NTRS)

    Kropatsch, Walter G.

    1991-01-01

    General principles for integrating data from different sources are derived from the experience of registering SAR images with digital elevation model (DEM) data. The integration consists of establishing geometrical relations between the data sets that allow us to accumulate information from both data sets for any given object point (e.g., elevation, slope, backscatter of ground cover, etc.). Since the geometries of the two data sets are completely different, they cannot be compared on a pixel-by-pixel basis. The presented approach detects instances of higher-level features in both data sets independently and performs the matching at the high level. Besides the efficiency of this general strategy, it further allows the integration of additional knowledge sources: world knowledge and sensor characteristics are also useful sources of information. The SAR features layover and shadow can be detected easily in SAR images. An analytical method to find such regions in a DEM additionally needs the parameters of the flight path of the SAR sensor and the range projection model. The generation of the SAR layover and shadow maps is summarized and new extensions to this method are proposed.
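
    A minimal sketch of how shadow and layover can be flagged along a single range line of a DEM (assumed flat-Earth, side-looking geometry with the sensor directly above the near-range origin; an illustration, not the paper's analytical method):

        import numpy as np

        def sar_shadow_layover(x, z, sensor_height):
            """Flag radar shadow and layover along one range line of a DEM.

            Flat-Earth sketch: the sensor is assumed directly above x = 0 at height
            `sensor_height` (metres), looking towards increasing ground range x, with
            terrain heights z on the same grid. Illustrative only.
            """
            x = np.asarray(x, dtype=float)
            z = np.asarray(z, dtype=float)

            # Depression angle from the sensor down to each ground sample.
            depression = np.arctan2(sensor_height - z, x)

            # Shadow: a sample is hidden if some nearer sample subtends a smaller
            # depression angle, so compare against the running minimum (near -> far).
            shadow = depression > np.minimum.accumulate(depression)

            # Layover: slant range fails to grow with ground range, so near and far
            # terrain fold into the same (or inverted) range cells.
            slant_range = np.hypot(x, sensor_height - z)
            layover = np.concatenate(([False], np.diff(slant_range) <= 0))

            return shadow, layover

        # A narrow, steep hill produces layover on its sensor-facing slope and
        # radar shadow behind it.
        x = np.arange(5000.0, 15000.0, 10.0)
        z = 500.0 * np.exp(-((x - 9000.0) / 100.0) ** 2)
        shadow, layover = sar_shadow_layover(x, z, sensor_height=3000.0)
        print(shadow.sum(), layover.sum())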

  16. The Unified Medical Language System (UMLS): integrating biomedical terminology

    PubMed Central

    Bodenreider, Olivier

    2004-01-01

    The Unified Medical Language System (http://umlsks.nlm.nih.gov) is a repository of biomedical vocabularies developed by the US National Library of Medicine. The UMLS integrates over 2 million names for some 900 000 concepts from more than 60 families of biomedical vocabularies, as well as 12 million relations among these concepts. Vocabularies integrated in the UMLS Metathesaurus include the NCBI taxonomy, Gene Ontology, the Medical Subject Headings (MeSH), OMIM and the Digital Anatomist Symbolic Knowledge Base. UMLS concepts are not only inter-related, but may also be linked to external resources such as GenBank. In addition to data, the UMLS includes tools for customizing the Metathesaurus (MetamorphoSys), for generating lexical variants of concept names (lvg) and for extracting UMLS concepts from text (MetaMap). The UMLS knowledge sources are updated quarterly. All vocabularies are available at no fee for research purposes within an institution, but UMLS users are required to sign a license agreement. The UMLS knowledge sources are distributed on CD-ROM and by FTP. PMID:14681409

  17. The Unified Medical Language System (UMLS): integrating biomedical terminology.

    PubMed

    Bodenreider, Olivier

    2004-01-01

    The Unified Medical Language System (http://umlsks.nlm.nih.gov) is a repository of biomedical vocabularies developed by the US National Library of Medicine. The UMLS integrates over 2 million names for some 900,000 concepts from more than 60 families of biomedical vocabularies, as well as 12 million relations among these concepts. Vocabularies integrated in the UMLS Metathesaurus include the NCBI taxonomy, Gene Ontology, the Medical Subject Headings (MeSH), OMIM and the Digital Anatomist Symbolic Knowledge Base. UMLS concepts are not only inter-related, but may also be linked to external resources such as GenBank. In addition to data, the UMLS includes tools for customizing the Metathesaurus (MetamorphoSys), for generating lexical variants of concept names (lvg) and for extracting UMLS concepts from text (MetaMap). The UMLS knowledge sources are updated quarterly. All vocabularies are available at no fee for research purposes within an institution, but UMLS users are required to sign a license agreement. The UMLS knowledge sources are distributed on CD-ROM and by FTP.

  18. Data registration and integration requirements for severe storms research

    NASA Technical Reports Server (NTRS)

    Dalton, J. T.

    1982-01-01

    Severe storms research is characterized by temporal scales ranging from minutes (for thunderstorms and tornadoes) to hours (for hurricanes and extra-tropical cyclones). Spatial scales range from tens to hundreds of kilometers. Sources of observational data include a variety of ground based and satellite systems. Requirements for registration and intercomparison of data from these various sources are examined and the potential for operational forecasting application of techniques resulting from the research is discussed. The sensor characteristics and processing procedures relating to the overlay and integrated analysis of satellite and surface observations for severe storms research are reviewed.

  19. A method for reducing the largest relative errors in Monte Carlo iterated-fission-source calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, J. L.; Sutton, T. M.

    2013-07-01

    In Monte Carlo iterated-fission-source calculations relative uncertainties on local tallies tend to be larger in lower-power regions and smaller in higher-power regions. Reducing the largest uncertainties to an acceptable level simply by running a larger number of neutron histories is often prohibitively expensive. The uniform fission site method has been developed to yield a more spatially-uniform distribution of relative uncertainties. This is accomplished by biasing the density of fission neutron source sites while not biasing the solution. The method is integrated into the source iteration process, and does not require any auxiliary forward or adjoint calculations. For a given amount of computational effort, the use of the method results in a reduction of the largest uncertainties relative to the standard algorithm. Two variants of the method have been implemented and tested. Both have been shown to be effective. (authors)
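
    The biasing idea can be illustrated with a toy sketch (not the production algorithm described in the paper): fission sites are drawn with a spatially uniform density instead of a power-proportional one, and each site carries a compensating statistical weight so that the expected source distribution is unchanged.

        import numpy as np

        def sample_fission_sites(region_power, region_volume, n_sites, rng):
            """Toy importance-biased fission-site sampling (illustrative only).

            Analog sampling would place sites in proportion to region power, so
            low-power regions receive few sites and noisy tallies. Here sites are
            placed in proportion to region volume (spatially uniform density) and
            each site carries the weight w = p_analog / p_biased, which leaves the
            expected source distribution unbiased.
            """
            power = np.asarray(region_power, dtype=float)
            volume = np.asarray(region_volume, dtype=float)
            p_analog = power / power.sum()     # unbiased source distribution
            p_biased = volume / volume.sum()   # spatially uniform site density
            regions = rng.choice(len(power), size=n_sites, p=p_biased)
            weights = p_analog[regions] / p_biased[regions]
            return regions, weights

        rng = np.random.default_rng(1)
        regions, weights = sample_fission_sites(
            region_power=[100.0, 10.0, 1.0],   # strongly non-uniform power shape
            region_volume=[1.0, 1.0, 1.0],
            n_sites=9000,
            rng=rng,
        )
        # Each region now receives ~3000 sites, while the weighted totals still
        # track the true power shape (about 8100 : 810 : 81).
        print(np.bincount(regions), [weights[regions == r].sum() for r in range(3)])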

  20. Your perspective and my benefit: multiple lesion models of self-other integration strategies during social bargaining.

    PubMed

    Melloni, Margherita; Billeke, Pablo; Baez, Sandra; Hesse, Eugenia; de la Fuente, Laura; Forno, Gonzalo; Birba, Agustina; García-Cordero, Indira; Serrano, Cecilia; Plastino, Angelo; Slachevsky, Andrea; Huepe, David; Sigman, Mariano; Manes, Facundo; García, Adolfo M; Sedeño, Lucas; Ibáñez, Agustín

    2016-11-01

    Recursive social decision-making requires the use of flexible, context-sensitive long-term strategies for negotiation. To succeed in social bargaining, participants' own perspectives must be dynamically integrated with those of interactors to maximize self-benefits and adapt to the other's preferences, respectively. This is a prerequisite to develop a successful long-term self-other integration strategy. While such form of strategic interaction is critical to social decision-making, little is known about its neurocognitive correlates. To bridge this gap, we analysed social bargaining behaviour in relation to its structural neural correlates, ongoing brain dynamics (oscillations and related source space), and functional connectivity signatures in healthy subjects and patients offering contrastive lesion models of neurodegeneration and focal stroke: behavioural variant frontotemporal dementia, Alzheimer's disease, and frontal lesions. All groups showed preserved basic bargaining indexes. However, impaired self-other integration strategy was found in patients with behavioural variant frontotemporal dementia and frontal lesions, suggesting that social bargaining critically depends on the integrity of prefrontal regions. Also, associations between behavioural performance and data from voxel-based morphometry and voxel-based lesion-symptom mapping revealed a critical role of prefrontal regions in value integration and strategic decisions for self-other integration strategy. Furthermore, as shown by measures of brain dynamics and related sources during the task, the self-other integration strategy was predicted by brain anticipatory activity (alpha/beta oscillations with sources in frontotemporal regions) associated with expectations about others' decisions. This pattern was reduced in all clinical groups, with greater impairments in behavioural variant frontotemporal dementia and frontal lesions than Alzheimer's disease. Finally, connectivity analysis from functional magnetic resonance imaging evidenced a fronto-temporo-parietal network involved in successful self-other integration strategy, with selective compromise of long-distance connections in frontal disorders. In sum, this work provides unprecedented evidence of convergent behavioural and neurocognitive signatures of strategic social bargaining in different lesion models. Our findings offer new insights into the critical roles of prefrontal hubs and associated temporo-parietal networks for strategic social negotiation. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Solution of the wave equation for open surfaces involving a line integral over the edge. [for supersonic propeller noise prediction

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1984-01-01

    A simple mathematical model of a stationary source distribution for the supersonic-propeller noise-prediction formula of Farassat (1983) is developed to test the validity of the formula solutions. The conventional thickness source term is used in place of the Isom thickness formula; the relative importance of the line and surface integrals in the solutions is evaluated; and the numerical results are compared with those obtained with a conventional retarded-time solution in tables. Good agreement is obtained over elevation angles from 10 to 90 deg, and the line-integral contribution is found to be significant at all elevation angles and of the same order of magnitude as the surface-integral contribution at angles less than 30 deg. The amplitude-normalized directivity patterns for the four cases computed (x = 1.5 or 10; k = 5.0 or 50) are presented graphically.

  2. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  3. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  4. The dynamics of multimodal integration: The averaging diffusion model.

    PubMed

    Turner, Brandon M; Gao, Juan; Koenig, Scott; Palfy, Dylan; McClelland, James L

    2017-12-01

    We combine extant theories of evidence accumulation and multi-modal integration to develop an integrated framework for modeling multimodal integration as a process that unfolds in real time. Many studies have formulated sensory processing as a dynamic process where noisy samples of evidence are accumulated until a decision is made. However, these studies are often limited to a single sensory modality. Studies of multimodal stimulus integration have focused on how best to combine different sources of information to elicit a judgment. These studies are often limited to a single time point, typically after the integration process has occurred. We address these limitations by combining the two approaches. Experimentally, we present data that allow us to study the time course of evidence accumulation within each of the visual and auditory domains as well as in a bimodal condition. Theoretically, we develop a new Averaging Diffusion Model in which the decision variable is the mean rather than the sum of evidence samples and use it as a base for comparing three alternative models of multimodal integration, allowing us to assess the optimality of this integration. The outcome reveals rich individual differences in multimodal integration: while some subjects' data are consistent with adaptive optimal integration, reweighting sources of evidence as their relative reliability changes during evidence integration, others exhibit patterns inconsistent with optimality.
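
    A toy contrast between the classic summing accumulator and the averaging decision variable described here (an illustrative sketch with assumed parameter names, not the fitted model from the paper):

        import numpy as np

        def simulate_trial(drift, noise_sd, n_steps, rng):
            """Contrast a summing accumulator with an averaging one (toy sketch).

            Each time step delivers a noisy evidence sample. The classic diffusion
            decision variable is the running SUM of samples; the averaging model
            uses the running MEAN, which settles near the true drift as evidence
            accumulates instead of growing without bound.
            """
            samples = rng.normal(drift, noise_sd, n_steps)
            running_sum = np.cumsum(samples)
            running_mean = running_sum / np.arange(1, n_steps + 1)
            return running_sum, running_mean

        rng = np.random.default_rng(42)
        s, m = simulate_trial(drift=0.2, noise_sd=1.0, n_steps=500, rng=rng)
        # The sum keeps drifting (its variance grows with time); the mean stabilizes,
        # so late responses are driven by an increasingly reliable estimate.
        print(f"final sum = {s[-1]:.2f}, final mean = {m[-1]:.3f}")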

  5. An annular superposition integral for axisymmetric radiators

    PubMed Central

    Kelly, James F.; McGough, Robert J.

    2007-01-01

    A fast integral expression for computing the nearfield pressure is derived for axisymmetric radiators. This method replaces the sum of contributions from concentric annuli with an exact double integral that converges much faster than methods that evaluate the Rayleigh-Sommerfeld integral or the generalized King integral. Expressions are derived for plane circular pistons using both continuous wave and pulsed excitations. Several commonly used apodization schemes for the surface velocity distribution are considered, including polynomial functions and a “smooth piston” function. The effect of different apodization functions on the spectral content of the wave field is explored. Quantitative error and time comparisons between the new method, the Rayleigh-Sommerfeld integral, and the generalized King integral are discussed. At all error levels considered, the annular superposition method achieves a speed-up of at least a factor of 4 relative to the point-source method and a factor of 3 relative to the generalized King integral without increasing the computational complexity. PMID:17348500

  6. Foreign-born physicians' perceptions of discrimination and stress in Finland: a cross-sectional questionnaire study.

    PubMed

    Heponiemi, Tarja; Hietapakka, Laura; Lehtoaro, Salla; Aalto, Anna-Mari

    2018-06-07

    Foreign-born physicians fill in the shortage of physicians in many developed countries. Labour market theory and previous studies suggest that foreign-born physicians may be a disadvantaged group with a higher likelihood of discrimination and less prestigious jobs. The present study examines foreign-born physicians' experiences of discrimination (coming from management, colleagues and patients separately) and patient-related stress and integration-related stress, and it examines how gender, age, employment sector, country of birth, years from getting a practicing license in Finland, language problems, cross-cultural training, cross-cultural empathy, team climate and skill discretion were associated with these factors. The present study was a cross-sectional questionnaire study among 371 foreign-born physicians in Finland, aged between 26 and 65 (65% women). Analyses of covariance and logistic regression analyses were conducted to examine the associations. A good team climate and high cross-cultural empathy were associated with lower likelihoods of discrimination from all sources, patient-related stress and integration-related stress. Skill discretion was associated with lower levels of integration-related stress and discrimination from management and colleagues. Language problems were associated with higher levels of integration-related stress. The biggest sources of discrimination were patients and their relatives. The present study showed the importance of a good team climate, cross-cultural empathy and patience, skill discretion and language skills in regard to the proper integration of foreign-born health care employees into the workplace. Good job resources, such as a good team climate and the possibility to use one's skills, may help foreign-born employees, for instance by giving them support when needed and offering flexibility. Health care organizations should invest in continuous language training for foreign-born employees and also offer support when there are language problems. Moreover, it seems that training increasing cross-cultural empathy and patience might be beneficial.

  7. The Management of Cognitive Load During Complex Cognitive Skill Acquisition by Means of Computer-Simulated Problem Solving

    ERIC Educational Resources Information Center

    Kester, Liesbeth; Kirschner, Paul A.; van Merrienboer, Jeroen J.G.

    2005-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition…

  8. Nonscience Majors' Perceptions on the Use of YouTube Video to Support Learning in an Integrated Science Lecture

    ERIC Educational Resources Information Center

    Eick, Charles Joseph; King, David T., Jr.

    2012-01-01

    The instructor of an integrated science course for nonscience majors embedded content-related video segments from YouTube and other similar internet sources into lecture. Through this study, the instructor wanted to know students' perceptions of how video use engaged them and increased their interest and understanding of science. Written survey…

  9. Design Science Methodology Applied to a Chemical Surveillance Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.

    Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency, a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  10. Prefrontal Engagement during Source Memory Retrieval Depends on the Prior Encoding Task

    PubMed Central

    Kuo, Trudy Y.; Van Petten, Cyma

    2008-01-01

    The prefrontal cortex is strongly engaged by some, but not all, episodic memory tests. Prior work has shown that source recognition tests—those that require memory for conjunctions of studied attributes—yield deficient performance in patients with prefrontal damage and greater prefrontal activity in healthy subjects, as compared to simple recognition tests. Here, we tested the hypothesis that there is no intrinsic relationship between the prefrontal cortex and source memory, but that the prefrontal cortex is engaged by the demand to retrieve weakly encoded relationships. Subjects attempted to remember object/color conjunctions after an encoding task that focused on object identity alone, and an integrative encoding task that encouraged attention to object/color relationships. After the integrative encoding task, the late prefrontal brain electrical activity that typically occurs in source memory tests was eliminated. Earlier brain electrical activity related to successful recognition of the objects was unaffected by the nature of prior encoding. PMID:16839287

  11. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  12. Sources of Disconnection in Neurocognitive Aging: Cerebral White Matter Integrity, Resting-state Functional Connectivity, and White Matter Hyperintensity Volume

    PubMed Central

    Madden, David J.; Parks, Emily L.; Tallman, Catherine W.; Boylan, Maria A.; Hoagey, David A.; Cocjin, Sally B.; Packard, Lauren E.; Johnson, Micah A.; Chou, Ying-hui; Potter, Guy G.; Chen, Nan-kuei; Siciliano, Rachel E.; Monge, Zachary A.; Honig, Jesse A.; Diaz, Michele T.

    2017-01-01

    Age-related decline in fluid cognition can be characterized as a disconnection among specific brain structures, leading to a decline in functional efficiency. The potential sources of disconnection, however, are unclear. We investigated imaging measures of cerebral white matter integrity, resting-state functional connectivity, and white matter hyperintensity (WMH) volume as mediators of the relation between age and fluid cognition, in 145 healthy, community-dwelling adults 19–79 years of age. At a general level of analysis, with a single composite measure of fluid cognition and single measures of each of the three imaging modalities, age exhibited an independent influence on the cognitive and imaging measures, and the imaging variables did not mediate the age-cognition relation. At a more specific level of analysis, resting-state functional connectivity of sensorimotor networks was a significant mediator of the age-related decline in executive function. These findings suggest that different levels of analysis lead to different models of neurocognitive disconnection, and that resting-state functional connectivity, in particular, may contribute to age-related decline in executive function. PMID:28389085
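
    The mediation logic referred to above can be sketched with a minimal single-mediator example (illustrative only; the study's models involve several imaging mediators and covariates): path a regresses the mediator on age, path b regresses cognition on age and the mediator, and the indirect effect is the product a*b with a percentile bootstrap interval.

        import numpy as np

        def indirect_effect(x, m, y, n_boot=2000, seed=0):
            """Single-mediator a*b effect with a percentile bootstrap interval.

            x: predictor (age), m: mediator (imaging measure), y: outcome (cognition).
            Illustrative sketch only; covariates and multiple mediators are ignored.
            """
            rng = np.random.default_rng(seed)
            x, m, y = (np.asarray(v, dtype=float) for v in (x, m, y))

            def ab(idx):
                xi, mi, yi = x[idx], m[idx], y[idx]
                a = np.linalg.lstsq(np.column_stack([np.ones_like(xi), xi]), mi, rcond=None)[0][1]
                b = np.linalg.lstsq(np.column_stack([np.ones_like(xi), xi, mi]), yi, rcond=None)[0][2]
                return a * b

            point = ab(np.arange(len(x)))
            boots = [ab(rng.integers(0, len(x), len(x))) for _ in range(n_boot)]
            return point, np.percentile(boots, [2.5, 97.5])

        # Synthetic data in which connectivity partially mediates the age effect.
        rng = np.random.default_rng(1)
        age = rng.uniform(19, 79, 145)
        connectivity = 1.0 - 0.01 * age + rng.normal(0, 0.1, 145)
        cognition = 0.5 * connectivity - 0.005 * age + rng.normal(0, 0.1, 145)
        print(indirect_effect(age, connectivity, cognition))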

  13. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  14. Methods for radiation detection and characterization using a multiple detector probe

    DOEpatents

    Akers, Douglas William; Roybal, Lyle Gene

    2014-11-04

    Apparatuses, methods, and systems relating to radiological characterization of environments are disclosed. Multi-detector probes with a plurality of detectors in a common housing may be used to substantially concurrently detect a plurality of different radiation activities and types. Multiple multi-detector probes may be used in a down-hole environment to substantially concurrently detect radioactive activity and contents of a buried waste container. Software may process, analyze, and integrate the data from the different multi-detector probes and the different detector types therein to provide source location and integrated analysis as to the source types and activity in the measured environment. Further, the integrated data may be used to compensate for differential density effects and the effects of radiation shielding materials within the volume being measured.

  15. Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.

    PubMed

    Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul

    2015-01-01

    As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.

  16. Source recognition by stimulus content in the MTL.

    PubMed

    Park, Heekyeong; Abellanoza, Cheryl; Schaeffer, James; Gandy, Kellen

    2014-03-17

    Source memory is considered to be the cornerstone of episodic memory that enables us to discriminate similar but different events. In the present fMRI study, we investigated whether neural correlates of source retrieval differed by stimulus content in the medial temporal lobe (MTL) when the item and context had been integrated as a perceptually unitized entity. Participants were presented with a list of items either in verbal or pictorial form overlaid on a colored square and instructed to integrate both the item and context into a single image. At test, participants judged the study status of test items and the color in which studied items were presented. Source recognition invariant of stimulus content elicited retrieval activity in both the left anterior hippocampus extending to the perirhinal cortex and the right posterior hippocampus. Word-selective source recognition was related to activity in the left perirhinal cortex, whereas picture-selective source recognition was identified in the left posterior hippocampus. Neural activity sensitive to novelty detection common to both words and pictures was found in the left anterior and right posterior hippocampus. Novelty detection selective to words was associated with the left perirhinal cortex, while activity sensitive to new pictures was identified in the bilateral hippocampus and adjacent MTL cortices, including the parahippocampal, entorhinal, and perirhinal cortices. These findings provide further support for the integral role of the hippocampus both in source recognition and in detection of new stimuli across stimulus content. Additionally, novelty effects in the MTL reveal the integral role of the MTL cortex as the interface for processing new information. Collectively, the present findings demonstrate the importance of the MTL for both previously experienced and novel events. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Bayesian Integration and Characterization of Composition C-4 Plastic Explosives Based on Time-of-Flight Secondary Ion Mass Spectrometry and Laser Ablation-Inductively Coupled Plasma Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, Christine M.; Kelly, Ryan T.; Alexander, M. L.

    Key elements regarding the use of non-radioactive ionization sources will be presented as related to explosives detection by mass spectrometry and ion mobility spectrometry. Various non-radioactive ionization sources will be discussed along with associated ionization mechanisms pertaining to specific sample types.

  18. The Riemann-Lanczos equations in general relativity and their integrability

    NASA Astrophysics Data System (ADS)

    Dolan, P.; Gerber, A.

    2008-06-01

    The aim of this paper is to examine the Riemann-Lanczos equations and how they can be made integrable. They consist of a system of linear first-order partial differential equations that arise in general relativity, whereby the Riemann curvature tensor is generated by an unknown third-order tensor potential field called the Lanczos tensor. Our approach is based on the theory of jet bundles, where all field variables and all their partial derivatives of all relevant orders are treated as independent variables alongside the local manifold coordinates (x^a) on the given space-time manifold M. This approach is adopted in (a) Cartan's method of exterior differential systems, (b) Vessiot's dual method using vector field systems, and (c) the Janet-Riquier theory of systems of partial differential equations. All three methods allow for the most general situations under which integrability conditions can be found. They give equivalent results, namely, that involutivity is always achieved at all generic points of the jet manifold M after a finite number of prolongations. Two alternative methods that appear in the general relativity literature to find integrability conditions for the Riemann-Lanczos equations generate new partial differential equations for the Lanczos potential that introduce a source term, which is nonlinear in the components of the Riemann tensor. We show that such sources do not occur when any of methods (a), (b), or (c) is used.
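
    For reference, the Riemann-Lanczos equations discussed here are commonly written in the form (index conventions assumed; semicolons denote covariant derivatives and the Lanczos potential satisfies L_{abc} = -L_{bac})

        R_{abcd} \;=\; L_{abc;d} - L_{abd;c} + L_{cda;b} - L_{cdb;a}

    i.e. a linear first-order system for L_{abc} given the Riemann tensor, which is why its integrability conditions are the natural object of study.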

  19. Integrals and integral equations in linearized wing theory

    NASA Technical Reports Server (NTRS)

    Lomax, Harvard; Heaslet, Max A; Fuller, Franklyn B

    1951-01-01

    The formulas of subsonic and supersonic wing theory for source, doublet, and vortex distributions are reviewed and a systematic presentation is provided which relates these distributions to the pressure and to the vertical induced velocity in the plane of the wing. It is shown that care must be used in treating the singularities involved in the analysis and that the order of integration is not always reversible. Concepts suggested by the irreversibility of order of integration are shown to be useful in the inversion of singular integral equations when operational techniques are used. A number of examples are given to illustrate the methods presented, attention being directed to supersonic flight speed.

  20. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  1. The study of middle school mathematics and science teachers' practices, perceptions, and attitudes related to mathematics and science integration

    NASA Astrophysics Data System (ADS)

    Leszczynski, Eliza

    The purpose of this qualitative study was to investigate the nature of mathematics and science connections made by sixth and seventh grade mathematics and science teachers in their classrooms. This study also examined the extent to which these connections represented mathematics and science integration and described the teachers' perceptions of and attitudes about mathematics and science integration. The primary data sources included classroom observations and teacher interviews. Findings suggested that teacher practices in making mathematics and science connections in the classroom incorporated many of the characteristics of integrated instruction presented in the literature. Teacher attitudes toward integration were found to be generally positive and supportive of integrated instruction. Mathematics teachers shared a common perception of integration being two separate lessons taught together in one lesson. In contrast, science teachers perceived integration to be a seamless blend of the two disciplines. The researcher related these perceptions and attitudes to the teachers' past experiences with mathematics and science connections and integration, and also to their practices of mathematics and science connections in the study.

  2. Warehousing Structured and Unstructured Data for Data Mining.

    ERIC Educational Resources Information Center

    Miller, L. L.; Honavar, Vasant; Barta, Tom

    1997-01-01

    Describes an extensible object-oriented view system that supports the integration of both structured and unstructured data sources in either the multidatabase or data warehouse environment. Discusses related work and data mining issues. (AEF)

  3. Special Medicare reimbursement and fraud and abuse considerations for management services organizations, medical foundations, and integrated delivery systems.

    PubMed

    DeMuro, P R; Owens, J F

    1994-01-01

    This chapter discusses certain Medicare reimbursement and fraud and abuse considerations for management services organizations (MSOs), medical foundations, and integrated delivery systems. It stresses the necessity of a business plan, the sources of capitalization that might be used in creating an integrated delivery system, and their effect on Medicare reimbursement. It also discusses related party principles and considerations and the Medicare "incident to" regulations. Furthermore, it discusses the application of certain Medicare safe harbor regulations on MSOs' structures and services, and those of medical foundations and integrated delivery systems.

  4. Comparison of Two Methodologies for Calibrating Satellite Instruments in the Visible and Near-Infrared

    NASA Technical Reports Server (NTRS)

    Barnes, Robert A.; Brown, Steven W.; Lykke, Keith R.; Guenther, Bruce; Butler, James J.; Schwarting, Thomas; Turpie, Kevin; Moyer, David; DeLuccia, Frank; Moeller, Christopher

    2015-01-01

    Traditionally, satellite instruments that measure Earth-reflected solar radiation in the visible and near infrared wavelength regions have been calibrated for radiance responsivity in a two-step method. In the first step, the relative spectral response (RSR) of the instrument is determined using a nearly monochromatic light source such as a lamp-illuminated monochromator. These sources do not typically fill the field-of-view of the instrument nor act as calibrated sources of light. Consequently, they only provide a relative (not absolute) spectral response for the instrument. In the second step, the instrument views a calibrated source of broadband light, such as a lamp-illuminated integrating sphere. The RSR and the sphere absolute spectral radiance are combined to determine the absolute spectral radiance responsivity (ASR) of the instrument. More recently, a full-aperture absolute calibration approach using widely tunable monochromatic lasers has been developed. Using these sources, the ASR of an instrument can be determined in a single step on a wavelength-by-wavelength basis. From these monochromatic ASRs, the responses of the instrument bands to broadband radiance sources can be calculated directly, eliminating the need for calibrated broadband light sources such as lamp-illuminated integrating spheres. In this work, the traditional broadband source-based calibration of the Suomi National Polar-orbiting Partnership (SNPP) Visible Infrared Imaging Radiometer Suite (VIIRS) sensor is compared with the laser-based calibration of the sensor. Finally, the impact of the new full-aperture laser-based calibration approach on the on-orbit performance of the sensor is considered.

  5. Comparison of two methodologies for calibrating satellite instruments in the visible and near infrared

    PubMed Central

    Barnes, Robert A.; Brown, Steven W.; Lykke, Keith R.; Guenther, Bruce; Butler, James J.; Schwarting, Thomas; Moyer, David; Turpie, Kevin; DeLuccia, Frank; Moeller, Christopher

    2016-01-01

    Traditionally, satellite instruments that measure Earth-reflected solar radiation in the visible and near infrared wavelength regions have been calibrated for radiance responsivity in a two-step method. In the first step, the relative spectral response (RSR) of the instrument is determined using a nearly monochromatic light source such as a lamp-illuminated monochromator. These sources do not typically fill the field-of-view of the instrument nor act as calibrated sources of light. Consequently, they only provide a relative (not absolute) spectral response for the instrument. In the second step, the instrument views a calibrated source of broadband light, such as a lamp-illuminated integrating sphere. The RSR and the sphere absolute spectral radiance are combined to determine the absolute spectral radiance responsivity (ASR) of the instrument. More recently, a full-aperture absolute calibration approach using widely tunable monochromatic lasers has been developed. Using these sources, the ASR of an instrument can be determined in a single step on a wavelength-by-wavelength basis. From these monochromatic ASRs, the responses of the instrument bands to broadband radiance sources can be calculated directly, eliminating the need for calibrated broadband light sources such as integrating spheres. In this work, the traditional broadband source-based calibration of the Suomi National Polar-orbiting Partnership (SNPP) Visible Infrared Imaging Radiometer Suite (VIIRS) sensor is compared with the laser-based calibration of the sensor. Finally, the impact of the new full-aperture laser-based calibration approach on the on-orbit performance of the sensor is considered. PMID:26836861

  6. SYNTHESIS OF NOVEL ALL-DIELECTRIC GRATING FILTERS USING GENETIC ALGORITHMS

    NASA Technical Reports Server (NTRS)

    Zuffada, Cinzia; Cwik, Tom; Ditchman, Christopher

    1997-01-01

    We are concerned with the design of inhomogeneous, all-dielectric (lossless) periodic structures which act as filters. Dielectric filters made as stacks of inhomogeneous gratings and layers of materials are being used in optical technology, but are not common at microwave frequencies. The problem is then finding the periodic cell's geometric configuration and permittivity values which correspond to a specified reflectivity/transmittivity response as a function of frequency/illumination angle. This type of design can be thought of as an inverse-source problem, since it entails finding a distribution of sources which produce fields (or quantities derived from them) of given characteristics. Electromagnetic sources (electric and magnetic current densities) in a volume are related to the outside fields by a well-known linear integral equation. Additionally, the sources are related to the fields inside the volume by a constitutive equation, involving the material properties. Then, the relationship linking the fields outside the source region to those inside is non-linear, in terms of material properties such as permittivity, permeability and conductivity. The solution of the non-linear inverse problem is cast here as a combination of two linear steps, by explicitly introducing the electromagnetic sources in the computational volume as a set of unknowns in addition to the material unknowns. This allows us to solve for material parameters and related electric fields in the source volume which are consistent with Maxwell's equations. Solutions are obtained iteratively by decoupling the two steps. First, we invert for the permittivity only in the minimization of a cost function and second, given the materials, we find the corresponding electric fields through direct solution of the integral equation in the source volume. The sources thus computed are used to generate the far fields and the synthesized filter response. The cost function is obtained by calculating the deviation between the synthesized value of reflectivity/transmittivity and the desired one. Solution geometries for the periodic cell are sought as gratings (ensembles of columns of different heights and widths), or combinations of homogeneous layers of different dielectric materials and gratings. Hence the explicit unknowns of the inversion step are the material permittivities and the relative boundaries separating homogeneous parcels of the periodic cell.
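
    As an illustration of the outer search loop described above, here is a minimal, hedged sketch in Python of a genetic algorithm that evolves a vector of cell permittivities to minimize the deviation between a synthesized and a desired reflectivity spectrum. The forward electromagnetic solve is replaced by a toy surrogate (solve_reflectivity), and all names, frequency values, and GA settings are illustrative assumptions, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        freqs = np.linspace(8e9, 12e9, 41)                       # example frequency grid (assumed)
        target_R = np.where((freqs > 9.5e9) & (freqs < 10.5e9), 0.9, 0.05)   # desired response

        def solve_reflectivity(eps, freqs):
            # Placeholder for the direct integral-equation solve of the fields;
            # returns a reflectivity spectrum in [0, 1] that depends on the profile.
            phase = np.outer(freqs / 1e10, eps)                  # toy "optical thickness"
            return (np.sin(phase).sum(axis=1) / len(eps)) ** 2

        def cost(eps):
            # Deviation between synthesized and desired reflectivity.
            return np.sum((solve_reflectivity(eps, freqs) - target_R) ** 2)

        n_pop, n_cells, n_gen = 40, 8, 100
        pop = rng.uniform(1.0, 12.0, size=(n_pop, n_cells))      # relative permittivities

        for _ in range(n_gen):
            costs = np.array([cost(ind) for ind in pop])
            parents = pop[np.argsort(costs)[: n_pop // 2]]       # truncation selection
            children = []
            for _ in range(n_pop - len(parents)):
                a, b = parents[rng.integers(len(parents), size=2)]
                cut = rng.integers(1, n_cells)                   # one-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                child += rng.normal(0.0, 0.2, n_cells) * (rng.random(n_cells) < 0.1)  # mutation
                children.append(np.clip(child, 1.0, 12.0))
            pop = np.vstack([parents, children])

        best = pop[np.argmin([cost(ind) for ind in pop])]
        print("best permittivity profile:", np.round(best, 2))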

  7. On the Assessment of Acoustic Scattering and Shielding by Time Domain Boundary Integral Equation Solutions

    NASA Technical Reports Server (NTRS)

    Hu, Fang Q.; Pizzo, Michelle E.; Nark, Douglas M.

    2016-01-01

    Based on the time domain boundary integral equation formulation of the linear convective wave equation, a computational tool dubbed Time Domain Fast Acoustic Scattering Toolkit (TD-FAST) has recently been under development. The time domain approach has a distinct advantage that the solutions at all frequencies are obtained in a single computation. In this paper, the formulation of the integral equation, as well as its stabilization by the Burton-Miller type reformulation, is extended to cases of a constant mean flow in an arbitrary direction. In addition, a "Source Surface" is also introduced in the formulation that can be employed to encapsulate regions of noise sources and to facilitate coupling with CFD simulations. This is particularly useful for applications where the noise sources are not easily described by analytical source terms. Numerical examples are presented to assess the accuracy of the formulation, including a computation of noise shielding by a thin barrier motivated by recent Historical Baseline F31A31 open rotor noise shielding experiments. Furthermore, spatial resolution requirements of the time domain boundary element method are also assessed using point per wavelength metrics. It is found that, using only constant basis functions and high-order quadrature for surface integration, relative errors of less than 2% may be obtained when the surface spatial resolution is 5 points-per-wavelength (PPW) or 25 points-per-wavelength squared (PPW2).

  8. Emotion impairs extrinsic source memory--An ERP study.

    PubMed

    Mao, Xinrui; You, Yuqi; Li, Wen; Guo, Chunyan

    2015-09-01

    Substantial advancements in understanding emotional modulation of item memory notwithstanding, controversies remain as to how emotion influences source memory. Using an emotional extrinsic source memory paradigm combined with remember/know judgments and two key event-related potentials (ERPs)-the FN400 (a frontal potential at 300-500 ms related to familiarity) and the LPC (a later parietal potential at 500-700 ms related to recollection), our research investigated the impact of emotion on extrinsic source memory and the underlying processes. We varied a semantic prompt (either "people" or "scene") preceding a study item to manipulate the extrinsic source. Behavioral data indicated a significant effect of emotion on "remember" responses to extrinsic source details, suggesting impaired recollection-based source memory in emotional (both positive and negative) relative to neutral conditions. In parallel, differential FN400 and LPC amplitudes (correctly remembered - incorrectly remembered sources) revealed emotion-related interference, suggesting impaired familiarity and recollection memory of extrinsic sources associated with positive or negative items. These findings thus lend support to the notion of emotion-induced memory trade off: while enhancing memory of central items and intrinsic/integral source details, emotion nevertheless disrupts memory of peripheral contextual details, potentially impairing both familiarity and recollection. Importantly, that positive and negative items result in comparable memory impairment suggests that arousal (vs. affective valence) plays a critical role in modulating dynamic interactions among automatic and elaborate processes involved in memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Auditing the multiply-related concepts within the UMLS.

    PubMed

    Mougin, Fleur; Grabar, Natalia

    2014-10-01

    This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  10. Data Integration for Dynamic and Sustainable Systems Biology Resources: Challenges and Lessons Learned

    PubMed Central

    Gabbard, Joseph L.; Shukla, Maulik; Sobral, Bruno

    2010-01-01

    Systems biology and infectious disease (host-pathogen-environment) research and development is becoming increasingly dependent on integrating data from diverse and dynamic sources. Maintaining integrated resources over long periods of time presents distinct challenges. This paper describes experiences and lessons learned from integrating data in two five-year projects focused on pathosystems biology: the Pathosystems Resource Integration Center (PATRIC, http://patric.vbi.vt.edu/), with a goal of developing bioinformatics resources for the research and countermeasures development communities based on genomics data, and the Resource Center for Biodefense Proteomics Research (RCBPR, http://www.proteomicsresource.org/), with a goal of developing resources based on experiment data such as microarray and proteomics data from diverse sources and technologies. Some challenges include integrating genomic sequence and experiment data, data synchronization, data quality control, and usability engineering. We present examples of a variety of data integration problems drawn from our experiences with PATRIC and RCBPR, as well as open research questions related to long-term sustainability, and describe the next steps to meeting these challenges. Novel contributions of this work include (1) an approach for addressing discrepancies between experiment results and interpreted results and (2) expanding the range of data integration techniques to include usability engineering at the presentation level. PMID:20491070

  11. Industrial pollution and the management of river water quality: a model of Kelani River, Sri Lanka.

    PubMed

    Gunawardena, Asha; Wijeratne, E M S; White, Ben; Hailu, Atakelty; Pandit, Ram

    2017-08-19

    Water quality of the Kelani River has become a critical issue in Sri Lanka due to the high cost of maintaining drinking water standards and the market and non-market costs of deteriorating river ecosystem services. By integrating a catchment model with a river model of water quality, we developed a method to estimate the effect of pollution sources on ambient water quality. Using integrated model simulations, we estimate (1) the relative contribution from point (industrial and domestic) and non-point sources (river catchment) to river water quality and (2) pollutant transfer coefficients for zones along the lower section of the river. Transfer coefficients provide the basis for policy analyses in relation to the location of new industries and the setting of priorities for industrial pollution control. They also offer valuable information to design socially optimal economic policy to manage industrialized river catchments.

  12. On the periodic Toda lattice hierarchy with an integral source

    NASA Astrophysics Data System (ADS)

    Babajanov, Bazar; Fečkan, Michal; Urazboev, Gayrat

    2017-11-01

    This work is devoted to the application of the inverse spectral problem to the integration of the periodic Toda lattice hierarchy with an integral-type source. An effective method is presented for constructing the periodic Toda lattice hierarchy with an integral source.

  13. Method to manage integration error in the Green-Kubo method.

    PubMed

    Oliveira, Laura de Sousa; Greaney, P Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.
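
    The following is a minimal, hedged sketch (synthetic data, assumed parameters; not the authors' code) of the calculation the abstract describes: accumulate the running Green-Kubo integral of a heat-flux autocorrelation and estimate the random-walk-like uncertainty envelope contributed by integrating the noisy tail. The physical prefactor V/(k_B T^2) that converts the integral to a thermal conductivity is omitted.

        import numpy as np

        rng = np.random.default_rng(1)
        dt = 1e-3                  # sampling interval (arbitrary units)
        n = 50_000
        tau_relax = 0.2            # assumed relaxation time of the flux

        # Synthetic heat-flux signal: an exponentially correlated (Ornstein-Uhlenbeck) series.
        flux = np.empty(n)
        flux[0] = 0.0
        a = np.exp(-dt / tau_relax)
        for i in range(1, n):
            flux[i] = a * flux[i - 1] + np.sqrt(1 - a * a) * rng.normal()

        def autocorrelation(x, max_lag):
            x = x - x.mean()
            return np.array([np.dot(x[: len(x) - k], x[k:]) / (len(x) - k) for k in range(max_lag)])

        max_lag = 2000
        acf = autocorrelation(flux, max_lag)
        running_integral = np.cumsum(acf) * dt        # Green-Kubo integral vs. truncation time

        # Beyond a few relaxation times the ACF is mostly noise; its cumulative sum behaves
        # like a random walk, so the uncertainty envelope grows with the number of noisy lags.
        noise_start = int(5 * tau_relax / dt)
        noise_std = acf[noise_start:].std()
        lags = np.arange(max_lag)
        envelope = noise_std * dt * np.sqrt(np.clip(lags - noise_start, 0, None))

        print("running integral at 5 tau:", running_integral[noise_start])
        print("noise envelope at the final lag:", envelope[-1])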

  14. Method to manage integration error in the Green-Kubo method

    NASA Astrophysics Data System (ADS)

    Oliveira, Laura de Sousa; Greaney, P. Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.

  15. Acquisition Program Lead Systems Integration/Lead Capabilities Integration Decision Support Methodology and Tool

    DTIC Science & Technology

    2015-04-30

    from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...

  16. Enhancing DInSAR capabilities for landslide monitoring by applying GIS-based multicriteria filtering analysis

    NASA Astrophysics Data System (ADS)

    Beyene, F.; Knospe, S.; Busch, W.

    2015-04-01

    Landslide detection and monitoring remain difficult with conventional differential radar interferometry (DInSAR) because most pixels of radar interferograms around landslides are affected by different error sources. These are mainly related to the nature of high radar viewing angles and related spatial distortions (such as overlays and shadows), temporal decorrelations owing to vegetation cover, and the speed and direction of target sliding masses. On the other hand, GIS can be used to integrate spatial datasets obtained from many sources (including radar and non-radar sources). In this paper, a GRID data model is proposed to integrate deformation data derived from DInSAR processing with other radar-origin data (coherence, layover and shadow, slope and aspect, local incidence angle) and external datasets collected from field study of landslide sites and other sources (geology, geomorphology, hydrology). After coordinate transformation and merging of data, candidate landslide-representing pixels with high-quality radar signals were selected by applying a GIS-based multicriteria filtering analysis (GIS-MCFA), which excludes grid points in areas of shadow and overlay, low coherence, non-detectable and non-landslide deformations, and other possible sources of errors from the DInSAR data processing. Finally, the results obtained from GIS-MCFA have been verified by using the external datasets (existing landslide sites collected from fieldwork, geological and geomorphologic maps, rainfall data, etc.).
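
    A minimal sketch of the multicriteria filtering idea, assuming hypothetical grid layers and thresholds (coherence, a layover/shadow mask, slope range, deformation magnitude); the actual GIS-MCFA criteria and values are defined in the paper, not here.

        import numpy as np

        rng = np.random.default_rng(2)
        shape = (200, 200)                                  # hypothetical grid

        coherence   = rng.random(shape)                     # interferometric coherence [0..1]
        layover     = rng.random(shape) < 0.05              # True where layover/shadow occurs
        slope_deg   = rng.uniform(0, 60, shape)             # terrain slope
        los_defo_mm = rng.normal(0, 5, shape)               # DInSAR line-of-sight deformation

        keep = (
            (coherence > 0.4)                               # reliable phase only
            & ~layover                                      # exclude layover/shadow areas
            & (slope_deg > 5) & (slope_deg < 45)            # plausible landslide terrain
            & (np.abs(los_defo_mm) > 2)                     # deformation above noise level
        )

        rows, cols = np.nonzero(keep)
        print(f"{keep.sum()} of {keep.size} grid cells pass all criteria")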

  17. Distribution of immunodeficiency fact files with XML--from Web to WAP.

    PubMed

    Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno

    2005-06-26

    Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables creation of open source databases for storage, maintenance and delivery for different platforms. Here we present a new data model called the fact file and an XML-based specification, the Inherited Disease Markup Language (IDML), that were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. IDML and fact files were used to build a comprehensive Web- and WAP-accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at http://bioinf.uta.fi/idr/. A fact file is a user-oriented interface, which serves as a starting point to explore information on hereditary diseases. The IDML enables the seamless integration and presentation of genetic and disease information resources on the Internet. IDML can be used to build information services for all kinds of inherited diseases. The open source specification and related programs are available at http://bioinf.uta.fi/idml/.
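
    To illustrate what a fact-file-like XML record might look like when assembled programmatically, here is a short sketch using Python's standard library. The element and attribute names are invented for illustration only; the published IDML specification at the URL above defines the actual schema.

        import xml.etree.ElementTree as ET

        # Invented structure, loosely inspired by the description above.
        fact_file = ET.Element("factFile", disease="X-linked agammaglobulinemia")
        gene = ET.SubElement(fact_file, "gene", symbol="BTK")
        ET.SubElement(gene, "location").text = "Xq22.1"
        clinical = ET.SubElement(fact_file, "clinicalFeatures")
        ET.SubElement(clinical, "feature").text = "recurrent bacterial infections"
        links = ET.SubElement(fact_file, "links")
        ET.SubElement(links, "resource", type="knowledge_base").text = "http://bioinf.uta.fi/idr/"

        ET.indent(fact_file)                  # pretty-print (Python 3.9+)
        print(ET.tostring(fact_file, encoding="unicode"))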

  18. Revisions to the JDL data fusion model

    NASA Astrophysics Data System (ADS)

    Steinberg, Alan N.; Bowman, Christopher L.; White, Franklin E.

    1999-03-01

    The Data Fusion Model maintained by the Joint Directors of Laboratories (JDL) Data Fusion Group is the most widely-used method for categorizing data fusion-related functions. This paper discusses the current effort to revise and expand this model to facilitate the cost-effective development, acquisition, integration and operation of multi-sensor/multi-source systems. Data fusion involves combining information - in the broadest sense - to estimate or predict the state of some aspect of the universe. These may be represented in terms of attributive and relational states. If the job is to estimate the state of people, it can be useful to include consideration of informational and perceptual states in addition to the physical state. Developing cost-effective multi-source information systems requires a method for specifying data fusion processing and control functions, interfaces, and associated databases. The lack of common engineering standards for data fusion systems has been a major impediment to integration and re-use of available technology: current developments do not lend themselves to objective evaluation, comparison or re-use. This paper reports on proposed revisions and expansions of the JDL Data Fusion model to remedy some of these deficiencies. This involves broadening the functional model and related taxonomy beyond the original military focus, and integrating the Data Fusion Tree Architecture model for system description, design and development.

  19. The timing and sources of information for the adoption and implementation of production innovations

    NASA Technical Reports Server (NTRS)

    Ettlie, J. E.

    1976-01-01

    Two dimensions (personal-impersonal and internal-external) are used to characterize information sources as they become important during the interorganizational transfer of production innovations. The results of three studies are reviewed for the purpose of deriving a model of the timing and importance of different information sources and the utilization of new technology. Based on the findings of two retrospective studies, it was concluded that the pattern of information seeking behavior in user organizations during the awareness stage of adoption is not a reliable predictor of the eventual utilization rate. Using the additional findings of a real-time study, an empirical model of the relative importance of information sources for successful user organizations is presented. These results are extended and integrated into a theoretical model consisting of a time-profile of successful implementations and the relative importance of four types of information sources during seven stages of the adoption-implementation process.

  20. Report of the National Heart, Lung, and Blood Institute Working Group: An Integrated Network for Congenital Heart Disease Research

    PubMed Central

    Pasquali, Sara K.; Jacobs, Jeffrey P.; Farber, Gregory K.; Bertoch, David; Blume, Elizabeth D.; Burns, Kristin M.; Campbell, Robert; Chang, Anthony C.; Chung, Wendy K.; Riehle-Colarusso, Tiffany; Curtis, Lesley H.; Forrest, Christopher B.; Gaynor, William J.; Gaies, Michael G.; Go, Alan S.; Henchey, Paul; Martin, Gerard R.; Pearson, Gail; Pemberton, Victoria L.; Schwartz, Steven M.; Vincent, Robert; Kaltman, Jonathan R.

    2016-01-01

    The National Heart, Lung, and Blood Institute convened a Working Group in January 2015 to explore issues related to an integrated data network for congenital heart disease (CHD) research. The overall goal was to develop a common vision for how the rapidly increasing volumes of data captured across numerous sources can be managed, integrated, and analyzed to improve care and outcomes. This report summarizes the current landscape of CHD data, data integration methodologies used across other fields, key considerations for data integration models in CHD, and the short- and long-term vision and recommendations made by the Working Group. PMID:27045129

  1. Searching Across the International Space Station Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; McDermott, William J.; Smith, Ernest E.; Bell, David G.; Gurram, Mohana

    2007-01-01

    Data access in the enterprise generally requires us to combine data from different sources and different formats. It is thus advantageous to focus on the intersection of the knowledge across sources and domains; keeping irrelevant knowledge around only serves to make the integration more unwieldy and more complicated than necessary. A context search over multiple domains is proposed in this paper, using context-sensitive queries to support disciplined manipulation of domain knowledge resources. The objective of a context search is to provide the capability for interrogating many domain knowledge resources, which are largely semantically disjoint. The search formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper demonstrates a new paradigm in composition of information for enterprise applications. In particular, it discusses an approach to achieving data integration across multiple sources, in a manner that does not require heavy investment in database and middleware maintenance. This lean approach to integration leads to cost-effectiveness and scalability of data integration with an underlying schemaless object-relational database management system. This highly scalable, information-on-demand framework, called NX-Search, is an implementation of an information system built on NETMARK. NETMARK is a flexible, high-throughput open database integration framework for managing, storing, and searching unstructured or semi-structured arbitrary XML and HTML used widely at the National Aeronautics and Space Administration (NASA) and in industry.

  2. Linear diffusion into a Faraday cage.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warne, Larry Kevin; Lin, Yau Tang; Merewether, Kimball O.

    2011-11-01

    Linear lightning diffusion into a Faraday cage is studied. An early-time integral valid for large ratios of enclosure size to enclosure thickness and small relative permeability (μ/μ₀ ≤ 10) is used for this study. Existing solutions for nearby lightning impulse responses of electrically thick-wall enclosures are refined and extended to calculate the nearby lightning magnetic field (H) and time-derivative magnetic field (HDOT) inside enclosures of varying thickness caused by a decaying exponential excitation. For a direct strike scenario, the early-time integral for a worst-case line source outside the enclosure caused by an impulse is simplified and numerically integrated to give the interior H and HDOT at the location closest to the source as well as a function of distance from the source. H and HDOT enclosure response functions for decaying exponentials are considered for an enclosure wall of any thickness. Simple formulas are derived to provide a description of enclosure interior H and HDOT as well. Direct strike voltage and current bounds for a single-turn optimally-coupled loop for all three waveforms are also given.

  3. Quality of Information Approach to Improving Source Selection in Tactical Networks

    DTIC Science & Technology

    2017-02-01

    consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These...that are indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the...utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a

  4. The efficiency of the heat pump water heater, during DHW tapping cycle

    NASA Astrophysics Data System (ADS)

    Gużda, Arkadiusz; Szmolke, Norbert

    2017-10-01

    This paper discusses one of the most effective systems for domestic hot water (DHW) production, based on an air-source heat pump with an integrated tank. The operating principle of the heat pump is described in detail. Moreover, there is an account of the experimental set-up and the results of the measurements. In the experimental part, measurements were conducted with the aim of determining the energy parameters and the measures of economic efficiency related to the presented solution. The measurements are based on a tapping cycle similar to the one recommended in the EN 16147 standard. The efficiency of the air-source heat pump over the duration of the experiment was 2.43. At the end of the paper, the authors present a simplified ecological analysis in order to determine the influence of operating an air-source heat pump with an integrated tank on the environment. Moreover, a comparison with different energy sources (a gas boiler with a closed combustion chamber and a coal-fired boiler) was conducted. The heat pump is an ecologically friendly energy source.

  5. 19 CFR 10.532 - Integrated Sourcing Initiative.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Integrated Sourcing Initiative. 10.532 Section 10... Trade Agreement Rules of Origin § 10.532 Integrated Sourcing Initiative. (a) For purposes of General... Sourcing Initiative if: (1) The good, in its condition as imported, is both classified in a tariff...

  6. Ontology-Based Retrieval of Spatially Related Objects for Location Based Services

    NASA Astrophysics Data System (ADS)

    Haav, Hele-Mai; Kaljuvee, Aivi; Luts, Martin; Vajakas, Toivo

    Advanced Location Based Service (LBS) applications have to integrate information stored in GIS, information about users' preferences (profile) as well as contextual information and information about the application itself. Ontology engineering provides methods to semantically integrate several data sources. We propose an ontology-driven LBS development framework: the paper describes the architecture of ontologies and their usage for retrieval of spatially related objects relevant to the user. Our main contribution is to enable personalised, ontology-driven LBS by providing a novel approach for defining personalised semantic spatial relationships by means of ontologies. The approach is illustrated by an industrial case study.

  7. DBGC: A Database of Human Gastric Cancer

    PubMed Central

    Wang, Chao; Zhang, Jun; Cai, Mingdeng; Zhu, Zhenggang; Gu, Wenjie; Yu, Yingyan; Zhang, Xiaoyan

    2015-01-01

    The Database of Human Gastric Cancer (DBGC) is a comprehensive database that integrates various human gastric cancer-related data resources. Human gastric cancer-related transcriptomics projects, proteomics projects, mutations, biomarkers and drug-sensitive genes from different sources were collected and unified in this database. Moreover, epidemiological statistics of gastric cancer patients in China and clinicopathological information annotated with gastric cancer cases were also integrated into the DBGC. We believe that this database will greatly facilitate research regarding human gastric cancer in many fields. DBGC is freely available at http://bminfor.tongji.edu.cn/dbgc/index.do PMID:26566288

  8. Distributed XQuery-Based Integration and Visualization of Multimodality Brain Mapping Data

    PubMed Central

    Detwiler, Landon T.; Suciu, Dan; Franklin, Joshua D.; Moore, Eider B.; Poliakov, Andrew V.; Lee, Eunjung S.; Corina, David P.; Ojemann, George A.; Brinkley, James F.

    2008-01-01

    This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too “heavyweight” for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a “lightweight” distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP) accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts. PMID:19198662
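
    The fan-out-and-merge pattern described above can be sketched as follows; the two source endpoints are stubbed with local functions returning canned XML so the example runs stand-alone, whereas in the real system each call would be a web-service request carrying an XQuery subquery (names and element tags are illustrative assumptions, not the DXQP API).

        import xml.etree.ElementTree as ET
        from concurrent.futures import ThreadPoolExecutor

        def query_source_a(subquery):
            # Stand-in for a web-service call that would execute `subquery` remotely.
            return "<sites><site id='1' label='Broca'/></sites>"

        def query_source_b(subquery):
            return "<images><image id='7' modality='fMRI'/></images>"

        sources = {"sourceA": query_source_a, "sourceB": query_source_b}
        subqueries = {"sourceA": "//site", "sourceB": "//image"}

        merged = ET.Element("result")
        with ThreadPoolExecutor() as pool:
            futures = {name: pool.submit(fn, subqueries[name]) for name, fn in sources.items()}
            for name, fut in futures.items():
                fragment = ET.fromstring(fut.result())            # each source returns XML
                wrapper = ET.SubElement(merged, "source", name=name)
                wrapper.append(fragment)                          # combine into one document

        print(ET.tostring(merged, encoding="unicode"))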

  9. Nonlinear characterization of a silicon integrated Bragg waveguide filter.

    PubMed

    Massara, Micol Previde; Menotti, Matteo; Bergamasco, Nicola; Harris, Nicholas C; Baehr-Jones, Tom; Hochberg, Michael; Galland, Christophe; Liscidini, Marco; Galli, Matteo; Bajoni, Daniele

    2018-03-01

    Bragg waveguides are promising optical filters for pump suppression in spontaneous four-wave mixing (FWM) photon sources. In this work, we investigate the generation of unwanted photon pairs in the filter itself. We do this by taking advantage of the relation between spontaneous and classical FWM, which allows for the precise characterization of the nonlinear response of the device. The pair generation rate estimated from the classical measurement is compared with the theoretical value calculated by means of a full quantum model of the filter, which also allows investigation of the spectral properties of the generated pairs. We find a good agreement between theory and experiment, confirming that stimulated FWM is a valuable approach to characterize the nonlinear response of an integrated filter, and that the pairs generated in a Bragg waveguide are not a serious issue for the operation of a fully integrated nonclassical source.

  10. Integrated species distribution models: combining presence-background data and site-occupancy data with imperfect detection

    USGS Publications Warehouse

    Koshkina, Vira; Wang, Yang; Gordon, Ascelin; Dorazio, Robert; White, Matthew; Stone, Lewi

    2017-01-01

    Two main sources of data for species distribution models (SDMs) are site-occupancy (SO) data from planned surveys, and presence-background (PB) data from opportunistic surveys and other sources. SO surveys give high quality data about presences and absences of the species in a particular area. However, due to their high cost, they often cover a smaller area relative to PB data, and are usually not representative of the geographic range of a species. In contrast, PB data is plentiful, covers a larger area, but is less reliable due to the lack of information on species absences, and is usually characterised by biased sampling. Here we present a new approach for species distribution modelling that integrates these two data types. We have used an inhomogeneous Poisson point process as the basis for constructing an integrated SDM that fits both PB and SO data simultaneously. It is the first implementation of an Integrated SO–PB Model which uses repeated survey occupancy data and also incorporates detection probability. The Integrated Model's performance was evaluated, using simulated data and compared to approaches using PB or SO data alone. It was found to be superior, improving the predictions of species spatial distributions, even when SO data is sparse and collected in a limited area. The Integrated Model was also found effective when environmental covariates were significantly correlated. Our method was demonstrated with real SO and PB data for the Yellow-bellied glider (Petaurus australis) in south-eastern Australia, with the predictive performance of the Integrated Model again found to be superior. PB models are known to produce biased estimates of species occupancy or abundance. The small sample size of SO datasets often results in poor out-of-sample predictions. Integrated models combine data from these two sources, providing superior predictions of species abundance compared to using either data source alone. Unlike conventional SDMs which have restrictive scale-dependence in their predictions, our Integrated Model is based on a point process model and has no such scale-dependency. It may be used for predictions of abundance at any spatial-scale while still maintaining the underlying relationship between abundance and area.

  11. Low-loss integrated electrical surface plasmon source with ultra-smooth metal film fabricated by polymethyl methacrylate 'bond and peel' method.

    PubMed

    Liu, Wenjie; Hu, Xiaolong; Zou, Qiushun; Wu, Shaoying; Jin, Chongjun

    2018-06-15

    External light sources are mostly employed to functionalize the plasmonic components, resulting in a bulky footprint. Electrically driven integrated plasmonic devices, combining ultra-compact critical feature sizes with extremely high transmission speeds and low power consumption, can link plasmonics with the present-day electronic world. In an effort to achieve this prospect, suppressing the losses in the plasmonic devices becomes a pressing issue. In this work, we developed a novel polymethyl methacrylate 'bond and peel' method to fabricate metal films with sub-nanometer smooth surfaces on semiconductor wafers. Based on this method, we further fabricated a compact plasmonic source containing a metal-insulator-metal (MIM) waveguide with an ultra-smooth metal surface on a GaAs-based light-emitting diode wafer. An increase in propagation length of the SPP mode by a factor of 2.95 was achieved as compared with the conventional device containing a relatively rough metal surface. Numerical calculations further confirmed that the propagation length is comparable to the theoretical prediction on the MIM waveguide with perfectly smooth metal surfaces. This method facilitates low-loss and high-integration of electrically driven plasmonic devices, thus provides an immediate opportunity for the practical application of on-chip integrated plasmonic circuits.

  12. Low-loss integrated electrical surface plasmon source with ultra-smooth metal film fabricated by polymethyl methacrylate ‘bond and peel’ method

    NASA Astrophysics Data System (ADS)

    Liu, Wenjie; Hu, Xiaolong; Zou, Qiushun; Wu, Shaoying; Jin, Chongjun

    2018-06-01

    External light sources are mostly employed to functionalize the plasmonic components, resulting in a bulky footprint. Electrically driven integrated plasmonic devices, combining ultra-compact critical feature sizes with extremely high transmission speeds and low power consumption, can link plasmonics with the present-day electronic world. In an effort to achieve this prospect, suppressing the losses in the plasmonic devices becomes a pressing issue. In this work, we developed a novel polymethyl methacrylate ‘bond and peel’ method to fabricate metal films with sub-nanometer smooth surfaces on semiconductor wafers. Based on this method, we further fabricated a compact plasmonic source containing a metal-insulator-metal (MIM) waveguide with an ultra-smooth metal surface on a GaAs-based light-emitting diode wafer. An increase in propagation length of the SPP mode by a factor of 2.95 was achieved as compared with the conventional device containing a relatively rough metal surface. Numerical calculations further confirmed that the propagation length is comparable to the theoretical prediction on the MIM waveguide with perfectly smooth metal surfaces. This method facilitates low-loss and high-integration of electrically driven plasmonic devices, thus provides an immediate opportunity for the practical application of on-chip integrated plasmonic circuits.

  13. Bi-photon spectral correlation measurements from a silicon nanowire in the quantum and classical regimes

    PubMed Central

    Jizan, Iman; Helt, L. G.; Xiong, Chunle; Collins, Matthew J.; Choi, Duk-Yong; Joon Chae, Chang; Liscidini, Marco; Steel, M. J.; Eggleton, Benjamin J.; Clark, Alex S.

    2015-01-01

    The growing requirement for photon pairs with specific spectral correlations in quantum optics experiments has created a demand for fast, high resolution and accurate source characterisation. A promising tool for such characterisation uses classical stimulated processes, in which an additional seed laser stimulates photon generation yielding much higher count rates, as recently demonstrated for a χ(2) integrated source in A. Eckstein et al. Laser Photon. Rev. 8, L76 (2014). In this work we extend these results to χ(3) integrated sources, directly measuring for the first time the relation between spectral correlation measurements via stimulated and spontaneous four wave mixing in an integrated optical waveguide, a silicon nanowire. We directly confirm the speed-up due to higher count rates and demonstrate that this allows additional resolution to be gained when compared to traditional coincidence measurements without any increase in measurement time. As the pump pulse duration can influence the degree of spectral correlation, all of our measurements are taken for two different pump pulse widths. This allows us to confirm that the classical stimulated process correctly captures the degree of spectral correlation regardless of pump pulse duration, and cements its place as an essential characterisation method for the development of future quantum integrated devices. PMID:26218609

  14. Privacy preserving integration of health care data.

    PubMed

    Adam, Nabil; White, Tom; Shafiq, Basit; Vaidya, Jaideep; He, Xiaoyun

    2007-10-11

    For health care-related research studies, the medical records of patients may need to be retrieved from multiple sites with different regulations on the disclosure of health information. Given the sensitive nature of health care information, privacy is a major concern when patients' health care data is used for research purposes. In this paper, we propose an approach for integration and querying of health care data from multiple sources in a secure and privacy-preserving manner.

  15. Response Functions for Neutron Skyshine Analyses

    NASA Astrophysics Data System (ADS)

    Gui, Ah Auu

    Neutron and associated secondary photon line-beam response functions (LBRFs) for point monodirectional neutron sources and related conical line-beam response functions (CBRFs) for azimuthally symmetric neutron sources are generated using the MCNP Monte Carlo code for use in neutron skyshine analyses employing the integral line-beam and integral conical-beam methods. The LBRFs are evaluated at 14 neutron source energies ranging from 0.01 to 14 MeV and at 18 emission angles from 1 to 170 degrees. The CBRFs are evaluated at 13 neutron source energies in the same energy range and at 13 source polar angles (1 to 89 degrees). The response functions are approximated by a three-parameter formula that is continuous in source energy and angle using a double linear interpolation scheme. These response function approximations are available for a source-to-detector range up to 2450 m and, for the first time, give dose-equivalent responses which are required for modern radiological assessments. For the CBRF, ground correction factors for neutrons and photons are calculated and approximated by empirical formulas for use in air-over-ground neutron skyshine problems with azimuthal symmetry. In addition, a simple correction procedure for humidity effects on the neutron skyshine dose is also proposed. The approximate LBRFs are used with the integral line-beam method to analyze four neutron skyshine problems with simple geometries: (1) an open silo, (2) an infinite wall, (3) a roofless rectangular building, and (4) an infinite air medium. In addition, two simple neutron skyshine problems involving an open source silo are analyzed using the integral conical-beam method. The results obtained using the LBRFs and the CBRFs are then compared with MCNP results and results of previous studies.

  16. Integrating multiple satellite data for crop monitoring

    USDA-ARS?s Scientific Manuscript database

    Remote sensing provides a valuable data source for detecting crop types, monitoring crop condition and predicting crop yields from space. Routine and continuous remote sensing data are critical for agricultural research and operational applications. Since crop field dimensions tend to be relatively ...

  17. Kinematic cross-correlation induces sensory integration across separate objects.

    PubMed

    Debats, Nienke B; Ernst, Marc O; Heuer, Herbert

    2017-12-01

    In a basic cursor-control task, the perceived positions of the hand and the cursor are biased towards each other. We recently found that this phenomenon conforms to the reliability-based weighting mechanism of optimal multisensory integration. This indicates that optimal integration is not restricted to sensory signals originating from a single source, as is the prevailing view, but that it also applies to separate objects that are connected by a kinematic relation (i.e. hand and cursor). In the current study, we examined which aspects of the kinematic relation are crucial for eliciting the sensory integration: (i) the cross-correlation between kinematic variables of the hand and cursor trajectories, and/or (ii) an internal model of the hand-cursor kinematic transformation. Participants made out-and-back movements from the centre of a semicircular workspace to its boundary, after which they judged the position where either their hand or the cursor hit the boundary. We analysed the position biases and found that the integration was strong in a condition with high kinematic correlations (a straight hand trajectory was mapped to a straight cursor trajectory), that it was significantly reduced for reduced kinematic correlations (a straight hand trajectory was transformed into a curved cursor trajectory) and that it was not affected by the inability to acquire an internal model of the kinematic transformation (i.e. by the trial-to-trial variability of the cursor curvature). These findings support the idea that correlations play a crucial role in multisensory integration irrespective of the number of sensory sources involved. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
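
    The reliability-based weighting rule invoked above can be written down in a few lines: each cue is weighted by its inverse variance, and the fused estimate is biased toward the more reliable cue. The numbers below are illustrative assumptions, not values from the study.

        import numpy as np

        hand_pos, hand_sigma = 0.0, 1.2        # proprioceptive estimate and its noise (assumed)
        cursor_pos, cursor_sigma = 2.0, 0.6    # visual estimate of the cursor and its noise

        # Inverse-variance (reliability) weights.
        w_hand = (1 / hand_sigma**2) / (1 / hand_sigma**2 + 1 / cursor_sigma**2)
        w_cursor = 1 - w_hand
        fused = w_hand * hand_pos + w_cursor * cursor_pos
        fused_sigma = np.sqrt(1 / (1 / hand_sigma**2 + 1 / cursor_sigma**2))

        print(f"weights: hand={w_hand:.2f}, cursor={w_cursor:.2f}")
        print(f"fused estimate {fused:.2f} with sd {fused_sigma:.2f} "
              f"(bias of the hand judgment toward the cursor: {fused - hand_pos:.2f})")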

  18. Integrating Source Apportionment Tracers into a Bottom-up Inventory of Methane Emissions in the Barnett Shale Hydraulic Fracturing Region.

    PubMed

    Townsend-Small, Amy; Marrero, Josette E; Lyon, David R; Simpson, Isobel J; Meinardi, Simone; Blake, Donald R

    2015-07-07

    A growing dependence on natural gas for energy may exacerbate emissions of the greenhouse gas methane (CH4). Identifying fingerprints of these emissions is critical to our understanding of potential impacts. Here, we compare stable isotopic and alkane ratio tracers of natural gas, agricultural, and urban CH4 sources in the Barnett Shale hydraulic fracturing region near Fort Worth, Texas. Thermogenic and biogenic sources were compositionally distinct, and emissions from oil wells were enriched in alkanes and isotopically depleted relative to natural gas wells. Emissions from natural gas production varied in δ(13)C and alkane ratio composition, with δD-CH4 representing the most consistent tracer of natural gas sources. We integrated our data into a bottom-up inventory of CH4 for the region, resulting in an inventory of ethane (C2H6) sources for comparison to top-down estimates of CH4 and C2H6 emissions. Methane emissions in the Barnett are a complex mixture of urban, agricultural, and fossil fuel sources, which makes source apportionment challenging. For example, spatial heterogeneity in gas composition and high C2H6/CH4 ratios in emissions from conventional oil production add uncertainty to top-down models of source apportionment. Future top-down studies may benefit from the addition of δD-CH4 to distinguish thermogenic and biogenic sources.

  19. Geologic map of Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hults, Chad P.; Mull, Charles G.; Karl, Susan M.

    2015-12-31

    This Alaska compilation is unique in that it is integrated with a rich database of information provided in the spatial datasets and standalone attribute databases. Within the spatial files every line and polygon is attributed to its original source; the references to these sources are contained in related tables, as well as in stand-alone tables. Additional attributes include typical lithology, geologic setting, and age range for the map units. Also included are tables of radiometric ages.

  20. Source Region Modeling of Explosions 2 and 3 from the Source Physics Experiment Using the Rayleigh Integral Method

    NASA Astrophysics Data System (ADS)

    Jones, K. R.; Arrowsmith, S.; Whitaker, R. W.

    2012-12-01

    The overall mission of the National Center for Nuclear Security (NCNS) Source Physics Experiment (SPE-N) at the Nevada National Security Site near Las Vegas, Nevada, is to improve upon and develop new physics-based models for underground nuclear explosions using scaled, underground chemical explosions as proxies. To this end, we use the Rayleigh integral as an approximation to the Helmholtz-Kirchhoff integral, [Whitaker, 2007 and Arrowsmith et al., 2011], to model infrasound generation in the far-field. Single-point explosive sources above ground can typically be treated as monopole point sources. While the source is relatively simple, the research needed to model above-ground point sources is complicated by path effects related to the propagation of the acoustic signal, which are outside the scope of this study. In contrast, for explosions that occur below ground, including the SPE explosions, the source region is more complicated but the observation distances are much closer (< 5 km), thus greatly reducing the complication of path effects. In this case, elastic energy from the explosions radiates upward and spreads out, depending on depth, to a more distributed region at the surface. Due to this broad surface perturbation of the atmosphere we cannot model the source as a simple monopole point source. Instead, we use the analogy of a piston mounted in a rigid, infinite baffle, where the surface area that moves as a result of the explosion is the piston and the surrounding region is the baffle. The area of the "piston" is determined by the depth and explosive yield of the event. In this study we look at data from SPE-N-2 and SPE-N-3. Both shots had an explosive yield of 1 ton at a depth of 45 m. We collected infrasound data with up to eight stations and 32 sensors within a 5 km radius of ground zero. To determine the area of the surface acceleration, we used data from twelve surface accelerometers installed within 100 m radially about ground zero. With the accelerometer data defining the vertical motion of the surface, we use the Rayleigh Integral Method, [Whitaker, 2007 and Arrowsmith et al., 2011], to generate a synthetic infrasound pulse to compare to the observed data. Because the phase across the "piston" is not necessarily uniform, constructive and destructive interference will change the shape of the acoustic pulse if observed directly above the source (on-axis) or perpendicular to the source (off-axis). Comparing the observed data to the synthetic data we note that the overall structure of the pulse agrees well and that the differences can be attributed to a number of possibilities, including the sensors used, topography, meteorological conditions, etc. One other potential source of error between the observed and calculated data is that we use a flat, symmetric source region for the "piston" where in reality the source region is not flat and not perfectly symmetric. A primary goal of this work is to better understand and model the relationships between surface area, depth, and yield of underground explosions.
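
    A minimal numerical sketch of the Rayleigh integral for a baffled piston follows: the moving surface is discretized into elements, each element's acceleration history is delayed by its travel time to the receiver, and the weighted contributions are summed as p(x, t) ≈ (ρ/2π) Σ a(t - R/c)/R ΔS. The geometry, the uniform acceleration pulse, and the numbers are illustrative assumptions and are not the SPE data or processing code, in which the surface motion is taken from the accelerometer records.

        import numpy as np

        rho, c = 1.2, 340.0                       # air density (kg/m^3) and sound speed (m/s)
        piston_radius = 50.0                      # effective radius of the moving surface (m), assumed
        receiver = np.array([1000.0, 0.0, 2.0])   # observation point (m)

        # Discretize the piston surface on a Cartesian grid in the z = 0 plane.
        dx = 2.0
        xs = np.arange(-piston_radius, piston_radius + dx, dx)
        X, Y = np.meshgrid(xs, xs)
        mask = X**2 + Y**2 <= piston_radius**2
        elems = np.stack([X[mask], Y[mask], np.zeros(mask.sum())], axis=1)
        dS = dx * dx

        # Assumed surface acceleration history: a damped one-sided pulse, uniform over the piston.
        t = np.arange(0.0, 6.0, 1e-3)
        def accel(tau):
            return np.where(tau > 0, tau * np.exp(-8.0 * tau), 0.0)

        # Rayleigh sum: delay each element's contribution by its distance to the receiver.
        R = np.linalg.norm(elems - receiver, axis=1)
        p = np.zeros_like(t)
        for Ri in R:
            p += accel(t - Ri / c) / Ri
        p *= rho / (2 * np.pi) * dS

        print("peak overpressure (Pa):", p.max())
        print("arrival time of peak (s):", t[np.argmax(p)])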

  1. Peak fitting and integration uncertainties for the Aerodyne Aerosol Mass Spectrometer

    NASA Astrophysics Data System (ADS)

    Corbin, J. C.; Othman, A.; Haskins, J. D.; Allan, J. D.; Sierau, B.; Worsnop, D. R.; Lohmann, U.; Mensah, A. A.

    2015-04-01

    The errors inherent in the fitting and integration of the pseudo-Gaussian ion peaks in Aerodyne High-Resolution Aerosol Mass Spectrometers (HR-AMS's) have not been previously addressed as a source of imprecision for these instruments. This manuscript evaluates the significance of these uncertainties and proposes a method for their estimation in routine data analysis. Peak-fitting uncertainties, the most complex source of integration uncertainties, are found to be dominated by errors in m/z calibration. These calibration errors comprise significant amounts of both imprecision and bias, and vary in magnitude from ion to ion. The magnitude of these m/z calibration errors is estimated for an exemplary data set, and used to construct a Monte Carlo model which reproduced well the observed trends in fits to the real data. The empirically-constrained model is used to show that the imprecision in the fitted height of isolated peaks scales linearly with the peak height (i.e., as n^1), thus contributing a constant-relative-imprecision term to the overall uncertainty. This constant relative imprecision term dominates the Poisson counting imprecision term (which scales as n^0.5) at high signals. The previous HR-AMS uncertainty model therefore underestimates the overall fitting imprecision. The constant relative imprecision in fitted peak height for isolated peaks in the exemplary data set was estimated as ~4% and the overall peak-integration imprecision was approximately 5%. We illustrate the importance of this constant relative imprecision term by performing Positive Matrix Factorization (PMF) on a synthetic HR-AMS data set with and without its inclusion. Finally, the ability of an empirically-constrained Monte Carlo approach to estimate the fitting imprecision for an arbitrary number of known overlapping peaks is demonstrated. Software is available upon request to estimate these error terms in new data sets.
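
    The combined uncertainty model argued for above can be illustrated with a short sketch: a Poisson counting term scaling as n^0.5 plus a constant-relative fitting term scaling as n^1, so the fitting term dominates at high signal. The 4% coefficient is taken from the abstract; everything else is an illustrative assumption.

        import numpy as np

        alpha = 0.04                      # constant relative fitting imprecision (~4% in the text)
        n = np.logspace(0, 6, 7)          # ion counts per peak

        sigma_poisson = np.sqrt(n)        # counting term, scales as n^0.5
        sigma_fit = alpha * n             # fitting term, scales as n^1
        sigma_total = np.sqrt(sigma_poisson**2 + sigma_fit**2)

        for ni, sp, sf, st in zip(n, sigma_poisson, sigma_fit, sigma_total):
            print(f"n={ni:>9.0f}  poisson={sp:10.1f}  fit={sf:10.1f}  total={st:10.1f}  "
                  f"relative={st / ni:6.3f}")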

  2. Mining relational paths in integrated biomedical data.

    PubMed

    He, Bing; Tang, Jie; Ding, Ying; Wang, Huijun; Sun, Yuyin; Shin, Jae Hong; Chen, Bin; Moorthy, Ganesh; Qiu, Judy; Desai, Pankaj; Wild, David J

    2011-01-01

    Much life science and biology research requires an understanding of complex relationships between biological entities (genes, compounds, pathways, diseases, and so on). There is a wealth of data on such relationships in publicly available datasets and publications, but these sources are overlapping and distributed, so that finding pertinent relational data is increasingly difficult. Whilst most public datasets have associated tools for searching, there is a lack of searching methods that can cross data sources and that in particular search not only based on the biological entities themselves but also on the relationships between them. In this paper, we demonstrate how graph-theoretic algorithms for mining relational paths can be used together with a previous integrative data resource we developed called Chem2Bio2RDF to extract new biological insights about the relationships between such entities. In particular, we use these methods to investigate the genetic basis of side-effects of thiazolidinedione drugs, and in particular make a hypothesis for the recently discovered cardiac side-effects of Rosiglitazone (Avandia) and a prediction for Pioglitazone which is backed up by recent clinical studies.
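
    As a toy illustration of relational path mining (not the Chem2Bio2RDF algorithm or data), the sketch below enumerates compound-to-phenotype paths in a small labeled graph with networkx; all entities and relations, including the EGFR link, are invented for the example.

```python
# Illustrative only: enumerating relational paths between two biomedical
# entities in a small heterogeneous graph with networkx. The entities,
# relations, and edges are toy examples, not the Chem2Bio2RDF algorithm
# or data used in the paper.
import networkx as nx

g = nx.MultiDiGraph()
g.add_edge("rosiglitazone", "PPARG", relation="binds")
g.add_edge("PPARG", "adipogenesis", relation="regulates")
g.add_edge("rosiglitazone", "EGFR", relation="binds")          # hypothetical edge
g.add_edge("EGFR", "cardiac_hypertrophy", relation="associated_with")
g.add_edge("adipogenesis", "cardiac_hypertrophy", relation="linked_to")

source, target = "rosiglitazone", "cardiac_hypertrophy"
for path in nx.all_simple_paths(g, source, target, cutoff=4):
    # Collect the relation labels along each compound -> ... -> phenotype path.
    labels = [
        list(g.get_edge_data(u, v).values())[0]["relation"]
        for u, v in zip(path, path[1:])
    ]
    print(" -> ".join(path), "|", ", ".join(labels))
```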

  3. On volume-source representations based on the representation theorem

    NASA Astrophysics Data System (ADS)

    Ichihara, Mie; Kusakabe, Tetsuya; Kame, Nobuki; Kumagai, Hiroyuki

    2016-01-01

    We discuss different ways to characterize a moment tensor associated with an actual volume change of ΔV_C, which has been represented in terms of either the stress glut or the corresponding stress-free volume change ΔV_T. Eshelby's virtual operation provides a conceptual model relating ΔV_C to ΔV_T and the stress glut, where non-elastic processes such as phase transitions allow ΔV_T to be introduced and subsequent elastic deformation of -ΔV_T is assumed to produce the stress glut. While it is true that ΔV_T correctly represents the moment tensor of an actual volume source with volume change ΔV_C, an explanation as to why such an operation relating ΔV_C to ΔV_T exists has not previously been given. This study presents a comprehensive explanation of the relationship between ΔV_C and ΔV_T based on the representation theorem. The displacement field is represented using Green's function, which consists of two integrals over the source surface: one for displacement and the other for traction. Both integrals are necessary for representing volumetric sources, whereas the representation of seismic faults includes only the first term, as the second integral over the two adjacent fault surfaces, across which the traction balances, always vanishes. Therefore, in a seismological framework, the contribution from the second term should be included as an additional surface displacement. We show that the seismic moment tensor of a volume source is directly obtained from the actual state of the displacement and stress at the source without considering any virtual non-elastic operations. A purely mathematical procedure based on the representation theorem enables us to specify the additional imaginary displacement necessary for representing a volume source only by the displacement term, which links ΔV_C to ΔV_T. It also specifies the additional imaginary stress necessary for representing a moment tensor solely by the traction term, which gives the "stress glut." The imaginary displacement-stress approach clarifies the mathematical background to the classical theory.
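
    For reference, a schematic LaTeX rendering of the two surface integrals mentioned above is given below. Sign conventions and notation vary between texts, so this is a sketch of the general form, not the paper's exact expression.

```latex
% Schematic form of the representation theorem referred to above (Aki &
% Richards-style notation; signs and conventions vary, so this is a sketch,
% not the paper's exact equation). The two surface integrals are the
% displacement term and the traction term discussed in the abstract.
\begin{equation}
  u_n(\mathbf{x}) =
    \underbrace{\int_S u_i(\boldsymbol{\xi})\,
      C_{ijkl}\, n_j\, \partial_l G_{nk}(\mathbf{x},\boldsymbol{\xi})\, dS}_{\text{displacement term}}
    \;-\;
    \underbrace{\int_S T_i\bigl(u(\boldsymbol{\xi})\bigr)\,
      G_{ni}(\mathbf{x},\boldsymbol{\xi})\, dS}_{\text{traction term}}
\end{equation}
```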

  4. A Close Investigation into Source Use in Integrated Second Language Writing Tasks

    ERIC Educational Resources Information Center

    Plakans, Lia; Gebril, Atta

    2012-01-01

    An increasing number of writing programs and assessments are employing writing-from-sources tasks in which reading and writing are integrated. The integration of reading and writing in such contexts raises a number of questions with regard to writers' use of sources in their writing, the functions these sources serve, and how proficiency affects…

  5. Earthquake source nucleation process in the zone of a permanently creeping deep fault

    NASA Astrophysics Data System (ADS)

    Lykov, V. I.; Mostryukov, A. O.

    2008-10-01

    The worldwide practice of earthquake prediction, whose beginning relates to the 1970s, shows that spatial manifestations of various precursors under real seismotectonic conditions are very irregular. As noted in [Kurbanov et al., 1980], zones of bending, intersection, and branching of deep faults, where conditions are favorable for increasing tangential tectonic stresses, serve as “natural amplifiers” of precursory effects. The earthquake of September 28, 2004, occurred on the Parkfield segment of the San Andreas deep fault in the area of a local bending of its plane. The fault segment about 60 km long and its vicinities are the oldest prognostic area in California. Results of observations before and after the earthquake were promptly analyzed and published in a special issue of Seismological Research Letters (2005, Vol. 76, no. 1). We have an original method enabling the monitoring of the integral rigidity of seismically active rock massifs. The integral rigidity is determined from the relative numbers of brittle and viscous failure acts during the formation of source ruptures of background earthquakes in a given massif. Fracture mechanisms are diagnosed from the steepness of the first arrival of the direct P wave. Principles underlying our method are described in [Lykov and Mostryukov, 1996, 2001, 2003]. Results of monitoring have been directly displayed at the site of the Laboratory ( http://wwwbrk.adm.yar.ru/russian/1_512/index.html ) since the mid-1990s. It seems that this information has not attracted the attention of American seismologists. This paper assesses the informativeness of the rigidity monitoring at the stage of formation of a strong earthquake source in relation to other methods.

  6. Integrated and Translational Nonclinical In Vivo Cardiovascular Risk Assessment: Gaps and Opportunities

    EPA Science Inventory

    Cardiovascular (CV) safety concerns are a significant source of drug development attrition in the pharmaceutical industry today. Though current nonclinical testing paradigms have largely prevented catastrophic CV events in Phase I studies, many challenges relating to the inabil...

  7. Integration of NASA-sponsored studies on aluminum welding

    NASA Technical Reports Server (NTRS)

    Masubuchi, K.

    1972-01-01

    The results are presented of numerous studies relating to aluminum alloy welding. The subjects covered include: (1) effects of porosity on weld joint performance, (2) sources of porosity, (3) weld thermal effects, (4) residual stresses and distortion, and (5) manufacturing process system control.

  8. Drug2Gene: an exhaustive resource to explore effectively the drug-target relation network.

    PubMed

    Roider, Helge G; Pavlova, Nadia; Kirov, Ivaylo; Slavov, Stoyan; Slavov, Todor; Uzunov, Zlatyo; Weiss, Bertram

    2014-03-11

    Information about drug-target relations is at the heart of drug discovery. There are now dozens of databases providing drug-target interaction data with varying scope, and focus. Therefore, and due to the large chemical space, the overlap of the different data sets is surprisingly small. As searching through these sources manually is cumbersome, time-consuming and error-prone, integrating all the data is highly desirable. Despite a few attempts, integration has been hampered by the diversity of descriptions of compounds, and by the fact that the reported activity values, coming from different data sets, are not always directly comparable due to usage of different metrics or data formats. We have built Drug2Gene, a knowledge base, which combines the compound/drug-gene/protein information from 19 publicly available databases. A key feature is our rigorous unification and standardization process which makes the data truly comparable on a large scale, allowing for the first time effective data mining in such a large knowledge corpus. As of version 3.2, Drug2Gene contains 4,372,290 unified relations between compounds and their targets most of which include reported bioactivity data. We extend this set with putative (i.e. homology-inferred) relations where sufficient sequence homology between proteins suggests they may bind to similar compounds. Drug2Gene provides powerful search functionalities, very flexible export procedures, and a user-friendly web interface. Drug2Gene v3.2 has become a mature and comprehensive knowledge base providing unified, standardized drug-target related information gathered from publicly available data sources. It can be used to integrate proprietary data sets with publicly available data sets. Its main goal is to be a 'one-stop shop' to identify tool compounds targeting a given gene product or for finding all known targets of a drug. Drug2Gene with its integrated data set of public compound-target relations is freely accessible without restrictions at http://www.drug2gene.com.

  9. Predicting dense nonaqueous phase liquid dissolution using a simplified source depletion model parameterized with partitioning tracers

    NASA Astrophysics Data System (ADS)

    Basu, Nandita B.; Fure, Adrian D.; Jawitz, James W.

    2008-07-01

    Simulations of nonpartitioning and partitioning tracer tests were used to parameterize the equilibrium stream tube model (ESM) that predicts the dissolution dynamics of dense nonaqueous phase liquids (DNAPLs) as a function of the Lagrangian properties of DNAPL source zones. Lagrangian, or stream-tube-based, approaches characterize source zones with as few as two trajectory-integrated parameters, in contrast to the potentially thousands of parameters required to describe the point-by-point variability in permeability and DNAPL in traditional Eulerian modeling approaches. The spill and subsequent dissolution of DNAPLs were simulated in two-dimensional domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1, and 3) using the multiphase flow and transport simulator UTCHEM. Nonpartitioning and partitioning tracers were used to characterize the Lagrangian properties (travel time and trajectory-integrated DNAPL content statistics) of DNAPL source zones, which were in turn shown to be sufficient for accurate prediction of source dissolution behavior using the ESM throughout the relatively broad range of hydraulic conductivity variances tested here. The results were found to be relatively insensitive to travel time variability, suggesting that dissolution could be accurately predicted even if the travel time variance was only coarsely estimated. Estimation of the ESM parameters was also demonstrated using an approximate technique based on Eulerian data in the absence of tracer data; however, determining the minimum amount of such data required remains for future work. Finally, the stream tube model was shown to be a more unique predictor of dissolution behavior than approaches based on the ganglia-to-pool model for source zone characterization.
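
    A heavily simplified sketch of the stream-tube picture is given below: each tube delivers water at solubility until the DNAPL along its trajectory is exhausted, so the flux-averaged relative concentration equals the fraction of undepleted tubes. The lognormal DNAPL contents and the removal rate are assumptions for illustration, and the travel-time statistics of the full ESM are omitted; this is not the authors' parameterization.

```python
# Heavily simplified stream-tube sketch (not the authors' ESM): each tube
# delivers water at solubility until the DNAPL mass along its trajectory is
# exhausted, so the flux-averaged relative concentration at time t is the
# fraction of tubes that still contain DNAPL. The lognormal trajectory-
# integrated DNAPL contents and the removal rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_tubes = 10_000
napl_mass = rng.lognormal(mean=np.log(5.0), sigma=1.0, size=n_tubes)  # kg per tube
removal_rate = 0.01   # kg/day dissolved per tube while DNAPL remains, assumption

depletion_time = napl_mass / removal_rate                  # days until each tube is clean

t = np.linspace(0.0, np.quantile(depletion_time, 0.99), 400)
rel_conc = (depletion_time[None, :] > t[:, None]).mean(axis=1)   # flux-averaged C/Cs

for ti, ci in zip(t[::80], rel_conc[::80]):
    print(f"t = {ti:8.1f} d   C/Cs = {ci:.3f}")
```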

  10. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    DOEpatents

    Brettin, Thomas S.; Cottingham, Robert W.; Griffith, Shelton D.; Quest, Daniel J.

    2015-09-08

    A system and method of integrating diverse sources of data and data streams is presented. The method can include selecting a scenario based on a topic, creating a multi-relational directed graph based on the scenario, identifying and converting resources in accordance with the scenario and updating the multi-directed graph based on the resources, identifying data feeds in accordance with the scenario and updating the multi-directed graph based on the data feeds, identifying analytical routines in accordance with the scenario and updating the multi-directed graph using the analytical routines and identifying data outputs in accordance with the scenario and defining queries to produce the data outputs from the multi-directed graph.
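
    The sketch below shows, purely for illustration, what a small multi-relational directed graph of scenario resources, feeds, and outputs might look like when built with networkx, together with a simple upstream query; the node names, relations, and query are invented and this is not the patented system.

```python
# Illustrative sketch of a multi-relational directed graph of the kind the
# abstract describes, built with networkx. Node names, relations, and the
# query are invented for illustration; this is not the patented system.
import networkx as nx

g = nx.MultiDiGraph()
# Resources and data feeds become nodes; scenario-specific relations label edges.
g.add_edge("outbreak_scenario", "genome_db", relation="uses_resource")
g.add_edge("outbreak_scenario", "hospital_feed", relation="uses_feed")
g.add_edge("hospital_feed", "case_counts", relation="produces")
g.add_edge("genome_db", "strain_profile", relation="produces")
g.add_edge("case_counts", "weekly_report", relation="feeds_output")
g.add_edge("strain_profile", "weekly_report", relation="feeds_output")

# A simple "query": which upstream nodes ultimately feed a given data output?
output = "weekly_report"
upstream = nx.ancestors(g, output)
print(f"nodes feeding {output}:", sorted(upstream))
```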

  11. New opportunities of real-world data from clinical routine settings in life-cycle management of drugs: example of an integrative approach in multiple sclerosis.

    PubMed

    Rothenbacher, Dietrich; Capkun, Gorana; Uenal, Hatice; Tumani, Hayrettin; Geissbühler, Yvonne; Tilson, Hugh

    2015-05-01

    The assessment and demonstration of a positive benefit-risk balance of a drug is a life-long process and includes specific data from preclinical, clinical development and post-launch experience. However, new integrative approaches are needed to enrich evidence from clinical trials and sponsor-initiated observational studies with information from multiple additional sources, including registry information and other existing observational data and, more recently, health-related administrative claims and medical records databases. To illustrate the value of this approach, this paper exemplifies such a cross-package approach to the area of multiple sclerosis, exploring also possible analytic strategies when using these multiple sources of information.

  12. PRECISION INTEGRATOR FOR MINUTE ELECTRIC CURRENTS

    DOEpatents

    Hemmendinger, A.; Helmer, R.J.

    1961-10-24

    An integrator is described for measuring the value of integrated minute electrical currents. The device consists of a source capacitor connected in series with the source of such electrical currents, a second capacitor of accurately known capacitance and a source of accurately known and constant potential, means responsive to the potentials developed across the source capacitor for reversibly connecting the second capacitor in series with the source of known potential and with the source capacitor and at a rate proportional to the potential across the source capacitor to maintain the magnitude of the potential across the source capacitor at approximately zero. (AEC)
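
    The following sketch illustrates the charge-balance idea in the abstract: an input current charges a source capacitor, and a reference capacitor at a known potential repeatedly removes a fixed charge packet to hold that voltage near zero, so the packet count measures the integrated charge. Component values, the threshold, and the input current are illustrative assumptions, not values from the patent.

```python
# Conceptual sketch of charge-balance integration in the spirit of the device
# described above: an input current charges a "source" capacitor, and each time
# its voltage exceeds a small threshold a reference capacitor charged to a known
# potential removes a fixed charge packet. Counting packets gives the integrated
# charge. All component values and the threshold are illustrative assumptions.
C_src = 100e-12        # source capacitor (F)
C_ref = 10e-12         # reference capacitor (F)
V_ref = 1.0            # known, constant reference potential (V)
threshold = 0.05       # comparator threshold on the source capacitor (V)

dt = 1e-3              # time step (s)
i_in = 2e-12           # minute input current to integrate (A)
t_total = 100.0        # total integration time (s)

v_src = 0.0
packets = 0
for _ in range(int(t_total / dt)):
    v_src += i_in * dt / C_src                 # input current charges C_src
    if v_src >= threshold:                     # balance with one reference packet
        v_src -= C_ref * V_ref / C_src
        packets += 1

q_est = packets * C_ref * V_ref               # integrated charge estimate (C)
q_true = i_in * t_total
print(f"estimated charge: {q_est:.3e} C   true charge: {q_true:.3e} C")
```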

  13. The association between family and friend integration and physical activity: results from the NHIS.

    PubMed

    Larsen, Britta A; Strong, David; Linke, Sarah E

    2014-06-01

    Social integration predicts morbidity and mortality, but its relationships with specific health behaviors that could explain this relationship, such as physical activity, have not been established. Additionally, studies associating social integration with health have not distinguished between sources of social contact (family vs. friends), which could be differentially related to health. The purpose of this study was to examine the association between social integration and physical activity and to explore differences in family and friend social integration. Data came from the 2001 wave of the National Health Interview Survey. Adult participants (N = 33,326) indicated levels of social integration by reporting whether they had seen and/or called friends and/or family in the past 2 weeks and also reported their weekly minutes of physical activity. Logistic regression was used to determine odds of meeting physical activity (PA) guidelines (≥ 150 min/week) and odds of inactivity (0 min/week) based on levels of social integration. Greater integration predicted higher odds of meeting PA guidelines and lower odds of inactivity after controlling for sociodemographic variables. This association was stronger and dose-dependent for integration with friends, whereas moderate family contact predicted greater activity than high levels of family contact. Those who are more socially integrated, particularly with friends rather than family, are also more physically active, which could partially explain the link between social integration and morbidity and mortality. Future studies examining this association should distinguish between sources of integration and explore why and how contact with friends vs. family is differentially associated with health behaviors.
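
    A minimal sketch of the kind of logistic model described above is given below, fitted with statsmodels on synthetic data; the variable names, coding of integration levels, and covariates are invented and this is not the NHIS analysis itself.

```python
# Sketch of the kind of logistic model described above: odds of meeting the
# physical-activity guideline as a function of social-integration level,
# adjusting for a sociodemographic covariate. The data frame is synthetic
# and the variable names are invented; this is not the NHIS analysis itself.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
integration = rng.integers(0, 4, size=n)          # 0 = none ... 3 = high contact
age = rng.normal(45, 15, size=n)
logit = -1.0 + 0.35 * integration - 0.02 * (age - 45)
meets_pa = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"meets_pa": meets_pa, "integration": integration, "age": age})
model = smf.logit("meets_pa ~ C(integration) + age", data=df).fit(disp=False)
print(np.exp(model.params))                        # odds ratios vs. the lowest-contact group
```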

  14. SU-G-201-16: Thermal Imaging in Source Visualization and Radioactivity Measurement for High Dose Rate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X; Lei, Y; Zheng, D

    2016-06-15

    Purpose: High Dose Rate (HDR) brachytherapy poses a special challenge to radiation safety and quality assurance (QA) due to its high radioactivity, and it is thus critical to verify the HDR source location and its radioactive strength. This study demonstrates a new method for measuring HDR source location and radioactivity utilizing thermal imaging. A potential application would relate to HDR QA and safety improvement. Methods: Heating effects by an HDR source were studied using Finite Element Analysis (FEA). Thermal cameras were used to visualize an HDR source inside a plastic applicator made of polyvinylidene difluoride (PVDF). Using different source dwell times, correlations between the HDR source strength and heating effects were studied, thus establishing potential daily QA criteria using thermal imaging. Results: For an Ir-192 source with a radioactivity of 10 Ci, the decay-induced heating power inside the source is ∼13.3 mW. After the HDR source was extended into the PVDF applicator and reached thermal equilibrium, thermal imaging visualized the temperature gradient of 10 K/cm along the PVDF applicator surface, which agreed with FEA modeling. For Ir-192 source activities ranging from 4.20 to 10.20 Ci, thermal imaging could verify source activity with an accuracy of 6.3% with a dwell time of 10 sec, and an accuracy of 2.5% with 100 sec. Conclusion: Thermal imaging is a feasible tool to visualize HDR source dwell positions and verify source integrity. Patient safety and treatment quality will be improved by integrating thermal measurements into HDR QA procedures.
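
    A back-of-envelope check of the quoted self-heating figure is sketched below; the ~0.22 MeV assumed to be deposited inside the source per decay (largely beta energy, since most gamma energy escapes the capsule) is an illustrative assumption chosen to reproduce the quoted ~13.3 mW, not a measured constant.

```python
# Back-of-envelope check of the quoted ~13.3 mW self-heating for a 10 Ci
# Ir-192 source. The ~0.22 MeV assumed to be absorbed locally per decay is an
# illustrative assumption chosen to reproduce the quoted figure, not a
# measured constant.
CI_TO_BQ = 3.7e10          # decays per second per curie
MEV_TO_J = 1.602e-13       # joules per MeV

activity_ci = 10.0
energy_absorbed_mev = 0.22  # assumed energy deposited inside the source per decay

power_w = activity_ci * CI_TO_BQ * energy_absorbed_mev * MEV_TO_J
print(f"estimated self-heating: {power_w * 1e3:.1f} mW")   # ~13 mW
```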

  15. Integrated Budget Office Toolbox

    NASA Technical Reports Server (NTRS)

    Rushing, Douglas A.; Blakeley, Chris; Chapman, Gerry; Robertson, Bill; Horton, Allison; Besser, Thomas; McCarthy, Debbie

    2010-01-01

    The Integrated Budget Office Toolbox (IBOT) combines budgeting, resource allocation, organizational funding, and reporting features in an automated, integrated tool that provides data from a single source for Johnson Space Center (JSC) personnel. Using a common interface, concurrent users can utilize the data without compromising its integrity. IBOT tracks planning changes and updates throughout the year using both phasing and POP-related (program-operating-plan-related) budget information for the current year, and up to six years out. Separating lump-sum funds received from HQ (Headquarters) into separate labor, travel, procurement, Center G&A (general & administrative), and servicepool categories, IBOT creates a script that significantly reduces manual input time. IBOT also manages the movement of travel and procurement funds down to the organizational level and, using its integrated funds management feature, helps better track funding at lower levels. Third-party software is used to create integrated reports in IBOT that can be generated for plans, actuals, funds received, and other combinations of data that are currently maintained in the centralized format. Based on Microsoft SQL, IBOT incorporates generic budget processes, is transportable, and is economical to deploy and support.

  16. Time-integrated passive sampling as a complement to conventional point-in-time sampling for investigating drinking-water quality, McKenzie River Basin, Oregon, 2007 and 2010-11

    USGS Publications Warehouse

    McCarthy, Kathleen A.; Alvarez, David A.

    2014-01-01

    The Eugene Water & Electric Board (EWEB) supplies drinking water to approximately 200,000 people in Eugene, Oregon. The sole source of this water is the McKenzie River, which has consistently excellent water quality relative to established drinking-water standards. To ensure that this quality is maintained as land use in the source basin changes and water demands increase, EWEB has developed a proactive management strategy that includes a combination of conventional point-in-time discrete water sampling and time‑integrated passive sampling with a combination of chemical analyses and bioassays to explore water quality and identify where vulnerabilities may lie. In this report, we present the results from six passive‑sampling deployments at six sites in the basin, including the intake and outflow from the EWEB drinking‑water treatment plant (DWTP). This is the first known use of passive samplers to investigate both the source and finished water of a municipal DWTP. Results indicate that low concentrations of several polycyclic aromatic hydrocarbons and organohalogen compounds are consistently present in source waters, and that many of these compounds are also present in finished drinking water. The nature and patterns of compounds detected suggest that land-surface runoff and atmospheric deposition act as ongoing sources of polycyclic aromatic hydrocarbons, some currently used pesticides, and several legacy organochlorine pesticides. Comparison of results from point-in-time and time-integrated sampling indicate that these two methods are complementary and, when used together, provide a clearer understanding of contaminant sources than either method alone.

  17. Using image mapping towards biomedical and biological data sharing

    PubMed Central

    2013-01-01

    Image-based data integration in eHealth and life sciences is typically concerned with the method used for anatomical space mapping, needed to retrieve, compare and analyse large volumes of biomedical data. In mapping one image onto another image, a mechanism is used to match and find the corresponding spatial regions which have the same meaning between the source and the matching image. Image-based data integration is useful for integrating data of various information structures. Here we discuss a broad range of issues related to data integration of various information structures, review exemplary work on image representation and mapping, and discuss the challenges that these techniques may bring. PMID:24059352

  18. Integration of Geodata in Documenting Castle Ruins

    NASA Astrophysics Data System (ADS)

    Delis, P.; Wojtkowska, M.; Nerc, P.; Ewiak, I.; Lada, A.

    2016-06-01

    Textured three-dimensional models are currently one of the standard methods of representing the results of photogrammetric work. A realistic 3D model combines the geometrical relations between the structure's elements with realistic textures of each of its elements. Data used to create 3D models of structures can be derived from many different sources. The most commonly used tools for documentation purposes are the digital camera and, increasingly, terrestrial laser scanning (TLS). Integration of data acquired from different sources allows modelling and visualization of 3D models of historical structures. An additional benefit of data integration is the possibility of filling in missing points, for example in point clouds. The paper shows the possibility of integrating data from terrestrial laser scanning with digital imagery and presents an analysis of the accuracy of the presented methods. The paper describes results obtained from raw data consisting of a point cloud measured using terrestrial laser scanning acquired from a Leica ScanStation2 and digital imagery taken using a Kodak DCS Pro 14N camera. The studied structure is the ruins of the Ilza castle in Poland.

  19. Semantic web data warehousing for caGrid.

    PubMed

    McCusker, James P; Phillips, Joshua A; González Beltrán, Alejandra; Finkelstein, Anthony; Krauthammer, Michael

    2009-10-01

    The National Cancer Institute (NCI) is developing caGrid as a means for sharing cancer-related data and services. As more data sets become available on caGrid, we need effective ways of accessing and integrating this information. Although the data models exposed on caGrid are semantically well annotated, it is currently up to the caGrid client to infer relationships between the different models and their classes. In this paper, we present a Semantic Web-based data warehouse (Corvus) for creating relationships among caGrid models. This is accomplished through the transformation of semantically-annotated caBIG Unified Modeling Language (UML) information models into Web Ontology Language (OWL) ontologies that preserve those semantics. We demonstrate the validity of the approach by Semantic Extraction, Transformation and Loading (SETL) of data from two caGrid data sources, caTissue and caArray, as well as alignment and query of those sources in Corvus. We argue that semantic integration is necessary for integration of data from distributed web services and that Corvus is a useful way of accomplishing this. Our approach is generalizable and of broad utility to researchers facing similar integration challenges.

  20. All-source Information Management and Integration for Improved Collective Intelligence Production

    DTIC Science & Technology

    2011-06-01

    Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT). These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... (All-Source Information Integration and Management) R&D Project 3. All-Source Intelligence

  1. 75 FR 67277 - Process for Review of Swaps for Mandatory Clearing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ... enacted to reduce risk, increase transparency, and promote market integrity within the financial system by... ability to manage the risks associated with clearing the swap, especially if the Commission determines... relating to product specifications; participant eligibility standards; pricing sources, models, and...

  2. Mars Mission Specialist

    ERIC Educational Resources Information Center

    Burton, Bill; Ogden, Kate; Walker, Becky; Bledsoe, Leslie; Hardage, Lauren

    2018-01-01

    For the last several years, the authors have implemented an integrated Mars Colony project for their third-grade classes. Students explored several considerations related to colonizing and inhabiting a new world, including food sources, types of citizens, transportation, and housing design. Nearly everything about the project was open-ended, full…

  3. Conflicting but close: Readers' integration of information sources as a function of their disagreement.

    PubMed

    Saux, Gaston; Britt, Anne; Le Bigot, Ludovic; Vibert, Nicolas; Burin, Debora; Rouet, Jean-François

    2017-01-01

    According to the documents model framework (Britt, Perfetti, Sandak, & Rouet, 1999), readers' detection of contradictions within texts increases their integration of source-content links (i.e., who says what). This study examines whether conflict may also strengthen the relationship between the respective sources. In two experiments, participants read brief news reports containing two critical statements attributed to different sources. In half of the reports, the statements were consistent with each other, whereas in the other half they were discrepant. Participants were tested for source memory and source integration in an immediate item-recognition task (Experiment 1) and a cued recall task (Experiments 1 and 2). In both experiments, discrepancies increased readers' memory for sources. We found that discrepant sources enhanced retrieval of the other source compared to consistent sources (using a delayed recall measure; Experiments 1 and 2). However, discrepant sources failed to prime the other source as evidenced in an online recognition measure (Experiment 1). We argue that discrepancies promoted the construction of links between sources, but that integration did not take place during reading.

  4. EnRICH: Extraction and Ranking using Integration and Criteria Heuristics.

    PubMed

    Zhang, Xia; Greenlee, M Heather West; Serb, Jeanne M

    2013-01-15

    High-throughput screening technologies enable biologists to generate candidate genes faster than they can be studied by experimental approaches in the laboratory, given time and cost constraints. Thus, it has become increasingly important to prioritize candidate genes for experiments. To accomplish this, researchers need to apply selection requirements based on their knowledge, which necessitates qualitative integration of heterogeneous data sources and filtration using multiple criteria. A similar approach can also be applied to putative candidate gene relationships. While automation can assist in this routine and imperative procedure, flexibility of data sources and criteria must not be sacrificed. A tool that can optimize the trade-off between automation and flexibility to simultaneously filter and qualitatively integrate data is needed to prioritize candidate genes and generate composite networks from heterogeneous data sources. We developed the Java application EnRICH (Extraction and Ranking using Integration and Criteria Heuristics) to address this need. Here we present a case study in which we used EnRICH to integrate and filter multiple candidate gene lists in order to identify potential retinal disease genes. As a result of this procedure, a candidate pool of several hundred genes was narrowed down to five candidate genes, of which four are confirmed retinal disease genes and one is associated with a retinal disease state. We developed a platform-independent tool that is able to qualitatively integrate multiple heterogeneous datasets and use different selection criteria to filter each of them, provided the datasets are tables that have distinct identifiers (required) and attributes (optional). With the flexibility to specify data sources and filtering criteria, EnRICH automatically prioritizes candidate genes or gene relationships for biologists based on their specific requirements. Here, we also demonstrate that this tool can be effectively and easily used to apply highly specific user-defined criteria and can efficiently identify high-quality candidate genes from relatively sparse datasets.
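
    The sketch below illustrates, in Python/pandas rather than the Java tool itself, the kind of per-source filtering followed by qualitative integration that EnRICH automates; the tables, column names, thresholds, and gene symbols are invented for the example.

```python
# Illustrative sketch (in Python/pandas, not the Java EnRICH tool itself) of
# qualitatively integrating two candidate-gene tables and filtering each with
# its own criterion before intersecting them. Column names, thresholds, and
# gene symbols are invented for illustration.
import pandas as pd

expression = pd.DataFrame({
    "gene": ["RHO", "ABCA4", "CRX", "GAPDH"],
    "log2_fold_change": [3.1, 2.4, 1.9, 0.1],
})
association = pd.DataFrame({
    "gene": ["RHO", "ABCA4", "NRL", "ACTB"],
    "p_value": [1e-6, 5e-4, 2e-3, 0.4],
})

# Source-specific filter criteria, applied before integration.
expr_hits = expression[expression["log2_fold_change"] >= 2.0]
assoc_hits = association[association["p_value"] <= 1e-3]

# Qualitative integration: candidates supported by both filtered sources.
candidates = pd.merge(expr_hits, assoc_hits, on="gene")
print(candidates)
```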

  5. Unprecedented long-term frequency stability with a microwave resonator oscillator.

    PubMed

    Grop, Serge; Schafer, Wolfgang; Bourgeois, Pierre-Yves; Kersale, Yann; Oxborrow, Mark; Rubiola, Enrico; Giordano, Vincent

    2011-08-01

    This article reports on the long-term frequency stability characterization of a new type of cryogenic sapphire oscillator using an autonomous pulse-tube cryocooler as its cold source. This new design enables a relative frequency stability of better than 4.5 × 10^-15 over one day of integration. To the best of our knowledge, this represents the best long-term frequency stability ever obtained with a signal source based on a macroscopic resonator.

  6. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    PubMed

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.

  7. Queries over Unstructured Data: Probabilistic Methods to the Rescue

    NASA Astrophysics Data System (ADS)

    Sarawagi, Sunita

    Unstructured data like emails, addresses, invoices, call transcripts, reviews, and press releases are now an integral part of any large enterprise. A challenge of modern business intelligence applications is analyzing and querying data seamlessly across structured and unstructured sources. This requires the development of automated techniques for extracting structured records from text sources and resolving entity mentions in data from various sources. The success of any automated method for extraction and integration depends on how effectively it unifies diverse clues in the unstructured source and in existing structured databases. We argue that statistical learning techniques like Conditional Random Fields (CRFs) provide an accurate, elegant, and principled framework for tackling these tasks. Given the inherent noise in real-world sources, it is important to capture the uncertainty of the above operations via imprecise data models. CRFs provide a sound probability distribution over extractions but are not easy to represent and query in a relational framework. We present methods of approximating this distribution to query-friendly row and column uncertainty models. Finally, we present models for representing the uncertainty of de-duplication and algorithms for various Top-K count queries on imprecise duplicates.

  8. Brightness measurement of an electron impact gas ion source for proton beam writing applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, N.; Santhana Raman, P.; Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583

    We are developing a high brightness nano-aperture electron impact gas ion source, which can create ion beams from a miniature ionization chamber with relatively small virtual source sizes, typically around 100 nm. A prototype source of this kind was designed and successively micro-fabricated using integrated circuit technology. Experiments to measure source brightness were performed inside a field emission scanning electron microscope. The total output current was measured to be between 200 and 300 pA. The highest estimated reduced brightness was found to be comparable to the injecting focused electron beam reduced brightness. This translates into an ion reduced brightness that is significantly better than that of conventional radio frequency ion sources, currently used in single-ended MeV accelerators.

  9. Brightness measurement of an electron impact gas ion source for proton beam writing applications.

    PubMed

    Liu, N; Xu, X; Pang, R; Raman, P Santhana; Khursheed, A; van Kan, J A

    2016-02-01

    We are developing a high brightness nano-aperture electron impact gas ion source, which can create ion beams from a miniature ionization chamber with relatively small virtual source sizes, typically around 100 nm. A prototype source of this kind was designed and successively micro-fabricated using integrated circuit technology. Experiments to measure source brightness were performed inside a field emission scanning electron microscope. The total output current was measured to be between 200 and 300 pA. The highest estimated reduced brightness was found to be comparable to the injecting focused electron beam reduced brightness. This translates into an ion reduced brightness that is significantly better than that of conventional radio frequency ion sources, currently used in single-ended MeV accelerators.

  10. The supercontinuum laser as a flexible source for quasi-steady state and time resolved fluorescence studies

    NASA Astrophysics Data System (ADS)

    Fenske, Roger; Näther, Dirk U.; Dennis, Richard B.; Smith, S. Desmond

    2010-02-01

    Commercial Fluorescence Lifetime Spectrometers have long suffered from the lack of a simple, compact and relatively inexpensive broad spectral band light source that can be flexibly employed for both quasi-steady state and time resolved measurements (using Time Correlated Single Photon Counting [TCSPC]). This paper reports the integration of an optically pumped photonic crystal fibre supercontinuum source (Fianium model SC400PP) as a light source in Fluorescence Lifetime Spectrometers (Edinburgh Instruments FLS920 and Lifespec II), with single photon counting detectors (micro-channel plate photomultiplier and a near-infrared photomultiplier) covering the UV to NIR range. An innovative method of spectral selection of the supercontinuum source involving wedge interference filters is also discussed.

  11. Path-integral method for the source apportionment of photochemical pollutants

    NASA Astrophysics Data System (ADS)

    Dunker, A. M.

    2015-06-01

    A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions (CAMx) is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using three or four points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.
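
    The sketch below demonstrates the line-integral identity on a toy nonlinear concentration function with analytic sensitivities, evaluated along a proportional-control path with a four-point Gauss-Legendre rule; the function and emission values are invented, and in the actual method the sensitivities come from the photochemical model via the decoupled direct method.

```python
# Sketch of the path-integral apportionment idea on a toy nonlinear "model":
# the concentration increment between a background run and a full-emissions
# run is written as a line integral of first-order sensitivities along an
# emissions path, evaluated here with a few-point Gauss-Legendre rule. The
# toy concentration function and emission numbers are invented; in the paper
# the sensitivities come from a photochemical model via the decoupled direct
# method.
import numpy as np

def conc(e):
    """Toy nonlinear concentration as a function of two emission rates."""
    e1, e2 = e
    return 40.0 * e1 * e2 / (1.0 + 0.5 * e1 + 0.2 * e2)

def sens(e):
    """Analytic first-order sensitivities d(conc)/d(e_k) of the toy model."""
    e1, e2 = e
    d = 1.0 + 0.5 * e1 + 0.2 * e2
    return np.array([40.0 * e2 / d - 40.0 * e1 * e2 * 0.5 / d**2,
                     40.0 * e1 / d - 40.0 * e1 * e2 * 0.2 / d**2])

e_background = np.array([0.2, 0.1])     # "biogenic-only" emissions, illustrative
e_full = np.array([1.0, 0.8])           # all emissions present, illustrative
delta_e = e_full - e_background

# Proportional-control path E(lam) = E_background + lam * delta_e, lam in [0, 1].
nodes, weights = np.polynomial.legendre.leggauss(4)
lam = 0.5 * (nodes + 1.0)               # map [-1, 1] -> [0, 1]
w = 0.5 * weights

contrib = np.zeros(2)
for li, wi in zip(lam, w):
    contrib += wi * sens(e_background + li * delta_e) * delta_e

increment = conc(e_full) - conc(e_background)
print("source contributions:", contrib)
print("sum of contributions:", contrib.sum(), " anthropogenic increment:", increment)
```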

  12. Path-integral method for the source apportionment of photochemical pollutants

    NASA Astrophysics Data System (ADS)

    Dunker, A. M.

    2014-12-01

    A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using 3 or 4 points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.

  13. JBioWH: an open-source Java framework for bioinformatics data integration

    PubMed Central

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595

  14. JBioWH: an open-source Java framework for bioinformatics data integration.

    PubMed

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.

  15. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performance, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performance was evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched the observations well. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve simulation performance as more model functionalities are added, and to provide a scientific basis for the implementation of integrated river basin management.
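
    For reference, a minimal sketch of the two skill scores quoted above (correlation coefficient and Nash-Sutcliffe efficiency) is given below, computed on synthetic placeholder series rather than the Shaying River data.

```python
# Small sketch of the two skill scores quoted above: the correlation
# coefficient and the Nash-Sutcliffe efficiency (NSE) between simulated and
# observed daily runoff. The example series are synthetic placeholders.
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(7)
observed = np.abs(rng.normal(50, 20, size=365))          # synthetic daily runoff (m^3/s)
simulated = observed * 0.9 + rng.normal(0, 5, size=365)  # synthetic model output

r = np.corrcoef(observed, simulated)[0, 1]
print(f"correlation = {r:.2f}, NSE = {nash_sutcliffe(observed, simulated):.2f}")
```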

  16. A Kirchhoff Approach to Seismic Modeling and Prestack Depth Migration

    DTIC Science & Technology

    1993-05-01

    continuation of sources and geophones by finite difference (S-G finite-difference migration), are relatively slow and dip-limited. Compared to S-G... finite-difference migration, the Kirchhoff integral implements prestack migration relatively efficiently and has no dip limitation. Liu... Modeling and... for modeling and migration. In this paper, a finite-difference algorithm is used to calculate traveltimes and amplitudes. With the help of

  17. Common source cascode amplifiers for integrating IR-FPA applications

    NASA Technical Reports Server (NTRS)

    Woolaway, James T.; Young, Erick T.

    1989-01-01

    Space based astronomical infrared measurements present stringent performance requirements on the infrared detector arrays and their associated readout circuitry. To evaluate the usefulness of commercial CMOS technology for astronomical readout applications, a theoretical and experimental evaluation was performed on source follower and common-source cascode integrating amplifiers. Theoretical analysis indicates that for conditions where the input amplifier integration capacitance is limited by the detector's capacitance, the input-referred rms noise electrons of each amplifier should be equivalent. For conditions of input gate limited capacitance, the source follower should provide lower noise. Measurements of test circuits containing both source follower and common-source cascode circuits showed substantially lower input-referred noise for the common-source cascode input circuits. Noise measurements yielded 4.8 input-referred rms noise electrons for an 8.5 minute integration. The signal and noise gain of the common-source cascode amplifier appears to offer substantial advantages in achieving predicted noise levels.

  18. Development and Performance of a Filter Radiometer Monitor System for Integrating Sphere Sources

    NASA Technical Reports Server (NTRS)

    Ding, Leibo; Kowalewski, Matthew G.; Cooper, John W.; Smith, GIlbert R.; Barnes, Robert A.; Waluschka, Eugene; Butler, James J.

    2011-01-01

    The NASA Goddard Space Flight Center (GSFC) Radiometric Calibration Laboratory (RCL) maintains several large integrating sphere sources covering the visible to the shortwave infrared wavelength range. Two critical functional requirements of an integrating sphere source are short- and long-term operational stability and repeatability. Monitoring the source is essential in determining the origin of systematic errors, thus increasing confidence in source performance and quantifying repeatability. If monitor data falls outside the established parameters, this could be an indication that the source requires maintenance or re-calibration against the National Institute of Standards and Technology (NIST) irradiance standard. The GSFC RCL has developed a Filter Radiometer Monitoring System (FRMS) to continuously monitor the performance of its integrating sphere calibration sources in the 400 to 2400 nm region. Sphere output change mechanisms include lamp aging, coating (e.g., BaSO4) deterioration, and ambient water vapor level. The Filter Radiometer Monitor System (FRMS) wavelength bands are selected to quantify changes caused by these mechanisms. The FRMS design and operation are presented, as well as data from monitoring four of the RCL's integrating sphere sources.

  19. Experimental investigation on AC unit integrated with sensible heat storage (SHS)

    NASA Astrophysics Data System (ADS)

    Aziz, N. A.; Amin, N. A. M.; Majid, M. S. A.; Hussin, A.; Zhubir, S.

    2017-10-01

    The growth in population and the economy has increased energy demand and raised concerns over sustainable energy sources. Towards sustainable development, energy efficiency in buildings has become a prime objective. In this paper, the integration of thermal energy storage was studied. This paper presents an experimental investigation on the performance of an air conditioning (AC) unit integrated with a sensible heat storage (SHS) system. The results were compared to a conventional AC system in terms of average electricity usage, indoor temperature, and relative humidity inside the experimental room (a cabin container). Results show that the integration of a water tank as an SHS reduces electricity usage by 5%, while the integration of a well-insulated water tank saves up to 8% of the electricity consumption.

  20. Toward generalized human factors taxonomy for classifying ASAP incident reports, AQP performance ratings, and FOQA output

    DOT National Transportation Integrated Search

    2003-01-01

    Over the years, the FAA has partnered with industry to develop a number of programs for reporting, classifying, and analyzing safety-related data. Despite their successes, none of these programs has been able to integrate data from multiple sources. ...

  1. Integrative psychotherapy.

    PubMed

    Kozarić-Kovacić, Dragica

    2008-09-01

    The main purposes of the article are to present the history of integration in psychotherapy, the reasons for the development of integrative approaches, and the approaches to integration in psychotherapy. Three approaches to integration in psychotherapy exist: theoretical integration, theoretical eclecticism, and common factors in different psychotherapeutic trends. In integrative psychotherapy, the basic epistemology, theory, and clinical practice are based on phenomenology, field theory, holism, dialogue, and co-creation of dialogue in the therapeutic relationship. The main criticism is that integrative psychotherapy suffers from confusion and many unresolved controversies. It is difficult to theoretically and methodologically define a clinically applied model that is based on such different epistemological and theoretical presumptions. Integrative psychotherapy is a synthesis of humanistic psychotherapy, object relations theory, and psychoanalytical self psychology. It focuses on the dynamics and potentials of human relationships, with a goal of changing the relations and understanding internal and external resistances. The process of integrative psychotherapy is primarily focused on the developmental-relational model and co-creation of the psychotherapeutic relationship as a single interactive event, which is not unilateral, but rather a joint endeavor by both the therapist and the patient/client. The need for a relationship is an important human need and represents a process of attunement that occurs as a response to the need for a relationship, a unique interpersonal contact between two people. If this need is not met, it manifests in different feelings and various defenses. To meet this need, we need another person with whom we can establish a sensitive, attuned relationship. Thus, the therapist becomes the person who tries to supplement what the person did not receive. Neuroscience can be a source of integration across different therapies. We may say that both neuroscience and neurobiology offer yet another bridge for the integration of different schools of thought and support the importance of the developmental-relational model during the developmental phases and the relational process in psychotherapy, in which the quality of the therapeutic relationship is the primary healing process. Finally, the development of integrative psychotherapy in Croatia and the organization of the Croatian program, which is identical to the program of the European Association for Integrative Psychotherapy, are briefly described.

  2. Multisensory processing of naturalistic objects in motion: a high-density electrical mapping and source estimation study.

    PubMed

    Senkowski, Daniel; Saint-Amour, Dave; Kelly, Simon P; Foxe, John J

    2007-07-01

    In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. The visual clip onset preceded the "splash" onset by 100 ms for multisensory stimuli. For naturalistic objects early multisensory integration effects beginning 120-140 ms after sound onset were observed over posterior scalp, with distributed sources localized to occipital cortex, temporal lobule, insular, and medial frontal gyrus (MFG). These effects, together with longer latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli. Unlike naturalistic objects, no early interactions were found for non-naturalistic objects. The earliest integration effects for non-naturalistic stimuli were observed 210-250 ms after sound onset including large portions of the inferior parietal cortex (IPC). As such, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.

  3. The Integration of the Competition in Contracting Act of 1984 in Systems Acquisition.

    DTIC Science & Technology

    1986-04-01

    Thomas L. "Mechanized Contract Document Preparation and Abstract System." §3 FMS. 7-9 Dec 83. p. 18-22. 5. Brechtel, Donald L., Capt, USAF, Brost... and Momentum." Government Executive. Mar 85. p. 16+. 22. Roeder, George L. "Computer Aided Source Selection." 8__R. 7-9 Dec 83. p. 214-216. 23... Competition Advocacy Office. Andrews AFB. telecon. 12 Nov 85. B. RELATED SOURCES. Articles and Periodicals. Coburn, George M. "The New Bid Protest Remedies

  4. PathJam: a new service for integrating biological pathway information.

    PubMed

    Glez-Peña, Daniel; Reboiro-Jato, Miguel; Domínguez, Rubén; Gómez-López, Gonzalo; Pisano, David G; Fdez-Riverola, Florentino

    2010-10-28

    Biological pathways are crucial to much of today's scientific research, including the study of specific biological processes related to human diseases. PathJam is a new, comprehensive and freely accessible web-server application integrating scattered human pathway annotation from several public sources. The tool has been designed both (i) to be intuitive for wet-lab users, providing statistical enrichment analysis of pathway annotations, and (ii) to support the development of new integrative pathway applications. PathJam’s unique features and advantages include interactive graphs linking pathways and genes of interest, downloadable results in fully compatible formats, GSEA-compatible output files and a standardized RESTful API.

  5. Sound source localization on an axial fan at different operating points

    NASA Astrophysics Data System (ADS)

    Zenger, Florian J.; Herold, Gert; Becker, Stefan; Sarradj, Ennes

    2016-08-01

    A generic fan with unskewed fan blades is investigated using a microphone array method. The relative motion of the fan with respect to the stationary microphone array is compensated by interpolating the microphone data to a virtual rotating array with the same rotational speed as the fan. Hence, beamforming algorithms with deconvolution, in this case CLEAN-SC, could be applied. Sound maps and integrated spectra of sub-components are evaluated for five operating points. At selected frequency bands, the presented method yields sound maps featuring a clear circular source pattern corresponding to the nine fan blades. Depending on the adjusted operating point, sound sources are located on the leading or trailing edges of the fan blades. Integrated spectra show that in most cases leading edge noise is dominant for the low-frequency part and trailing edge noise for the high-frequency part. The shift from leading to trailing edge noise is strongly dependent on the operating point and frequency range considered.
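
    The virtual rotating array compensation described above can be approximated by re-interpolating each stationary microphone signal onto the instantaneous angular position of a co-rotating sensor. The following is a minimal sketch under simplifying assumptions (a uniform circular array, linear interpolation between neighbouring microphones, and made-up sampling rate and rotational speed); it is not the CLEAN-SC processing chain used in the study.

```python
import numpy as np

def virtual_rotating_array(signals, fs, rpm):
    """Interpolate signals from a static circular microphone array onto a
    virtual array co-rotating with the fan (linear interpolation between
    neighbouring microphones). `signals` has shape (n_mics, n_samples)."""
    n_mics, n_samples = signals.shape
    t = np.arange(n_samples) / fs
    # Angular advance of the virtual array, expressed as a fractional
    # microphone-index offset at each time sample.
    shift = (rpm / 60.0) * t * n_mics
    lower = np.floor(shift).astype(int)
    frac = shift - lower
    cols = np.arange(n_samples)
    out = np.empty_like(signals)
    for m in range(n_mics):
        idx0 = (m + lower) % n_mics          # trailing static microphone
        idx1 = (m + lower + 1) % n_mics      # leading static microphone
        out[m] = (1 - frac) * signals[idx0, cols] + frac * signals[idx1, cols]
    return out

# Example with synthetic data: 64 microphones, 1 s at 48 kHz, fan at 1500 rpm.
demo = virtual_rotating_array(np.random.randn(64, 48000), fs=48000, rpm=1500)
```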

  6. INTEGRATING SOURCE WATER PROTECTION AND DRINKING WATER TREATMENT: U.S. ENVIRONMENTAL PROTECTION AGENCY'S WATER SUPPLY AND WATER RESOURCES DIVISION

    EPA Science Inventory

    The U.S. Environmental Protection Agency's (EPA) Water Supply and Water Resources Division (WSWRD) is an internationally recognized water research organization established to assist in responding to public health concerns related to drinking water supplies. WSWRD has evolved from...

  7. Designing a Virtual-Reality-Based, Gamelike Math Learning Environment

    ERIC Educational Resources Information Center

    Xu, Xinhao; Ke, Fengfeng

    2016-01-01

    This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…

  8. Relations among Functional Systems in Behavior Analysis

    ERIC Educational Resources Information Center

    Thompson, Travis

    2007-01-01

    This paper proposes that an organism's integrated repertoire of operant behavior has the status of a biological system, similar to other biological systems, like the nervous, cardiovascular, or immune systems. Evidence from a number of sources indicates that the distinctions between biological and behavioral events is often misleading, engendering…

  9. EUTROPHICATION MODELING CAPABILITIES FOR WATER QUALITY AND INTEGRATION TOWARDS ECOLOGICAL ENDPOINTS

    EPA Science Inventory

    A primary environmental focus for the use of mathematical models is for characterization of sources of nutrients and sediments and their relative loadings from large river basins, and the impact of land uses from smaller sub-basins on water quality in rivers, lakes, and estuaries...

  10. 45 CFR 1610.9 - Accounting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Accounting. 1610.9 Section 1610.9 Public Welfare Regulations Relating to Public Welfare (Continued) LEGAL SERVICES CORPORATION USE OF NON-LSC FUNDS, TRANSFERS OF LSC FUNDS, PROGRAM INTEGRITY § 1610.9 Accounting. Funds received by a recipient from a source...

  11. The MMWR: A Resource for Teaching Medical Geography.

    ERIC Educational Resources Information Center

    Pyle, Gerald F.

    1984-01-01

    Accounts from the Morbidity and Mortality Weekly Report, published by the Centers for Disease Control in Atlanta, Georgia can be integrated with materials from related scientific sources to help college students develop an understanding of the emerging geography of recently discovered diseases. Legionnaires' disease is used as an example. (RM)

  12. Integrating Mercury Science and Policy in the Marine Context: Challenges and Opportunities

    PubMed Central

    Lambert, Kathleen F.; Evers, David C.; Warner, Kimberly A.; King, Susannah L.; Selin, Noelle E.

    2014-01-01

    Mercury is a global pollutant and presents policy challenges at local, regional, and global scales. Mercury poses risks to the health of people, fish, and wildlife exposed to elevated levels of mercury, most commonly from the consumption of methylmercury in marine and estuarine fish. The patchwork of current mercury abatement efforts limits the effectiveness of national and multi-national policies. This paper provides an overview of the major policy challenges and opportunities related to mercury in coastal and marine environments, and highlights science and policy linkages of the past several decades. The U.S. policy examples explored here point to the need for a full life cycle approach to mercury policy with a focus on source reduction and increased attention to: (1) the transboundary movement of mercury in air, water, and biota; (2) the coordination of policy efforts across multiple environmental media; (3) the cross-cutting issues related to pollutant interactions, mitigation of legacy sources, and adaptation to elevated mercury via improved communication efforts; and (4) the integration of recent research on human and ecological health effects into benefits analyses for regulatory purposes. Stronger science and policy integration will benefit national and international efforts to prevent, control, and minimize exposure to methylmercury. PMID:22901766

  13. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  14. Long-term effectiveness of the integrated schistosomiasis control strategy with emphasis on infectious source control in China: a 10-year evaluation from 2005 to 2014.

    PubMed

    Wang, Xiaoli; Wang, Wei; Wang, Peng

    2017-02-01

    Schistosomiasis is a neglected tropical parasitic disease of great public health significance worldwide. Currently, mass drug administration with praziquantel remains the major strategy for global schistosomiasis control programs. Since 2005, an integrated strategy with emphasis on infectious source control has been implemented for the control of schistosomiasis japonica, a major public health concern in China, and pilot studies have demonstrated that such a strategy is effective in reducing the prevalence of Schistosoma japonicum infection in both humans and bovines. However, there is little knowledge of the long-term effectiveness of this integrated strategy for controlling schistosomiasis japonica. The aim of this study was to evaluate the long-term effectiveness of the integrated strategy for schistosomiasis control following its 10-year implementation, based on data from the national schistosomiasis control program released by the Ministry of Health, People's Republic of China. In 2014, there were 5 counties in which the transmission of schistosomiasis japonica had not been interrupted, a reduction of 95.2% compared with 2005 (105 counties). The number of schistosomiasis patients and acute cases was reduced by 85.5 and 99.7% in 2014 (115,614 cases and 2 cases) compared with 2005 (798,762 cases and 564 cases), and the number of bovines and S. japonicum-infected bovines was reduced by 47.9 and 98% in 2014 (919,579 bovines and 666 infected bovines) compared with 2005 (1,764,472 bovines and 33,736 infected bovines), respectively. During the 10-year implementation of the integrated strategy, however, there was only a minor fluctuation in the area of Oncomelania hupensis snail habitats, with just a 5.6% reduction in the area of snail habitats in 2014 relative to 2005. The results of the current study demonstrate that the 10-year implementation of the integrated strategy with emphasis on infectious source control has greatly reduced schistosomiasis-related morbidity in humans and bovines. It is concluded that the new integrated strategy has remarkable long-term effectiveness on the transmission of schistosomiasis japonica in China, which facilitates the shift of the national schistosomiasis control program from transmission control to transmission interruption and elimination. However, such a strategy seems to have little effect on shrinking the area of snail habitats.

  15. Sensorimotor integration: basic concepts, abnormalities related to movement disorders and sensorimotor training-induced cortical reorganization.

    PubMed

    Machado, Sergio; Cunha, Marlo; Velasques, Bruna; Minc, Daniel; Teixeira, Silmar; Domingues, Clayton A; Silva, Julio G; Bastos, Victor H; Budde, Henning; Cagy, Mauricio; Basile, Luis; Piedade, Roberto; Ribeiro, Pedro

    2010-10-01

    Sensorimotor integration is defined as the capability of the central nervous system to integrate different sources of stimuli and, in parallel, to transform such inputs into motor actions. The aim is to review the basic principles of sensorimotor integration, such as its neural bases and the elementary mechanisms involved in specific goal-directed tasks performed by healthy subjects, and the abnormalities reported in the most common movement disorders, such as Parkinson's disease, dystonia and stroke, including the cortical reorganization-related mechanisms. Whether these disorders are associated with an abnormal peripheral sensory input or defective central processing is still unclear, but most of the data support a central mechanism. We found that the sensorimotor integration process plays a potential role in the elementary mechanisms involved in specific goal-directed tasks performed by healthy subjects and in the occurrence of abnormalities in the most common movement disorders; moreover, it plays a potential role in the acquisition of abilities for which a critical factor is the coupling of different sensory data, which constitutes the basis for the elaboration of consciously goal-directed motor outputs.

  16. Integrated Analysis of Mutation Data from Various Sources Identifies Key Genes and Signaling Pathways in Hepatocellular Carcinoma

    PubMed Central

    Wei, Lin; Tang, Ruqi; Lian, Baofeng; Zhao, Yingjun; He, Xianghuo; Xie, Lu

    2014-01-01

    Background Recently, a number of studies have performed genome or exome sequencing of hepatocellular carcinoma (HCC) and identified hundreds or even thousands of mutations in protein-coding genes. However, these studies have only focused on a limited number of candidate genes, and many important mutation resources remain to be explored. Principal Findings In this study, we integrated mutation data obtained from various sources and performed pathway and network analysis. We identified 113 pathways that were significantly mutated in HCC samples and found that the mutated genes included in these pathways contained high percentages of known cancer genes, and damaging genes and also demonstrated high conservation scores, indicating their important roles in liver tumorigenesis. Five classes of pathways that were mutated most frequently included (a) proliferation and apoptosis related pathways, (b) tumor microenvironment related pathways, (c) neural signaling related pathways, (d) metabolic related pathways, and (e) circadian related pathways. Network analysis further revealed that the mutated genes with the highest betweenness coefficients, such as the well-known cancer genes TP53, CTNNB1 and recently identified novel mutated genes GNAL and the ADCY family, may play key roles in these significantly mutated pathways. Finally, we highlight several key genes (e.g., RPS6KA3 and PCLO) and pathways (e.g., axon guidance) in which the mutations were associated with clinical features. Conclusions Our workflow illustrates the increased statistical power of integrating multiple studies of the same subject, which can provide biological insights that would otherwise be masked under individual sample sets. This type of bioinformatics approach is consistent with the necessity of making the best use of the ever increasing data provided in valuable databases, such as TCGA, to enhance the speed of deciphering human cancers. PMID:24988079

  17. Integrated analysis of mutation data from various sources identifies key genes and signaling pathways in hepatocellular carcinoma.

    PubMed

    Zhang, Yuannv; Qiu, Zhaoping; Wei, Lin; Tang, Ruqi; Lian, Baofeng; Zhao, Yingjun; He, Xianghuo; Xie, Lu

    2014-01-01

    Recently, a number of studies have performed genome or exome sequencing of hepatocellular carcinoma (HCC) and identified hundreds or even thousands of mutations in protein-coding genes. However, these studies have only focused on a limited number of candidate genes, and many important mutation resources remain to be explored. In this study, we integrated mutation data obtained from various sources and performed pathway and network analysis. We identified 113 pathways that were significantly mutated in HCC samples and found that the mutated genes included in these pathways contained high percentages of known cancer genes, and damaging genes and also demonstrated high conservation scores, indicating their important roles in liver tumorigenesis. Five classes of pathways that were mutated most frequently included (a) proliferation and apoptosis related pathways, (b) tumor microenvironment related pathways, (c) neural signaling related pathways, (d) metabolic related pathways, and (e) circadian related pathways. Network analysis further revealed that the mutated genes with the highest betweenness coefficients, such as the well-known cancer genes TP53, CTNNB1 and recently identified novel mutated genes GNAL and the ADCY family, may play key roles in these significantly mutated pathways. Finally, we highlight several key genes (e.g., RPS6KA3 and PCLO) and pathways (e.g., axon guidance) in which the mutations were associated with clinical features. Our workflow illustrates the increased statistical power of integrating multiple studies of the same subject, which can provide biological insights that would otherwise be masked under individual sample sets. This type of bioinformatics approach is consistent with the necessity of making the best use of the ever increasing data provided in valuable databases, such as TCGA, to enhance the speed of deciphering human cancers.
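
    Betweenness coefficients of the kind used above to rank genes such as TP53 and CTNNB1 can be computed with standard graph libraries. The sketch below runs on a toy interaction network whose edges are purely illustrative; it is not the network or scoring used in the study.

```python
import networkx as nx

# Toy gene-interaction network; the edges are illustrative only.
edges = [("TP53", "CTNNB1"), ("TP53", "GNAL"), ("CTNNB1", "ADCY5"),
         ("GNAL", "ADCY5"), ("ADCY5", "RPS6KA3"), ("TP53", "RPS6KA3")]
g = nx.Graph(edges)

# Genes lying on many shortest paths receive high betweenness centrality
# and are candidate "hub" genes within the mutated pathways.
ranking = sorted(nx.betweenness_centrality(g).items(),
                 key=lambda kv: kv[1], reverse=True)
for gene, score in ranking:
    print(f"{gene}\t{score:.3f}")
```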

  18. Summary of Research 2000, Department of Systems Management

    DTIC Science & Technology

    2001-12-01

    Postgraduate School, June 2000. Fryzlewicz, J., "Analysis of Measures of Performance and Continuous Improvement at the Naval Dental Center Pearl Harbor," Masters...mart driven relational system. Fourth, using the prototype relational data mart as a source system, a contemporary OLAP application is used to prove the...warehouse solution to integrating legacy systems are discussed. DoD KEY TECHNOLOGY AREA: Computing and Software KEYWORDS: OLAP , Data Warehouse

  19. MalaCards: an integrated compendium for diseases and their annotation

    PubMed Central

    Rappaport, Noa; Nativ, Noam; Stelzer, Gil; Twik, Michal; Guan-Golan, Yaron; Iny Stein, Tsippi; Bahir, Iris; Belinky, Frida; Morrey, C. Paul; Safran, Marilyn; Lancet, Doron

    2013-01-01

    Comprehensive disease classification, integration and annotation are crucial for biomedical discovery. At present, disease compilation is incomplete, heterogeneous and often lacking systematic inquiry mechanisms. We introduce MalaCards, an integrated database of human maladies and their annotations, modeled on the architecture and strategy of the GeneCards database of human genes. MalaCards mines and merges 44 data sources to generate a computerized card for each of 16 919 human diseases. Each MalaCard contains disease-specific prioritized annotations, as well as inter-disease connections, empowered by the GeneCards relational database, its searches and GeneDecks set analyses. First, we generate a disease list from 15 ranked sources, using disease-name unification heuristics. Next, we use four schemes to populate MalaCards sections: (i) directly interrogating disease resources, to establish integrated disease names, synonyms, summaries, drugs/therapeutics, clinical features, genetic tests and anatomical context; (ii) searching GeneCards for related publications, and for associated genes with corresponding relevance scores; (iii) analyzing disease-associated gene sets in GeneDecks to yield affiliated pathways, phenotypes, compounds and GO terms, sorted by a composite relevance score and presented with GeneCards links; and (iv) searching within MalaCards itself, e.g. for additional related diseases and anatomical context. The latter forms the basis for the construction of a disease network, based on shared MalaCards annotations, embodying associations based on etiology, clinical features and clinical conditions. This broadly disposed network has a power-law degree distribution, suggesting that this might be an inherent property of such networks. Work in progress includes hierarchical malady classification, ontological mapping and disease set analyses, striving to make MalaCards an even more effective tool for biomedical research. Database URL: http://www.malacards.org/ PMID:23584832

  20. Integrating semantic dimension into openEHR archetypes for the management of cerebral palsy electronic medical records.

    PubMed

    Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen

    2016-10-01

    Integrating a semantic dimension into clinical archetypes is necessary when modeling medical records: it enables semantic interoperability, allows semantic activities to be applied to clinical data, and provides a higher design quality of Electronic Medical Record (EMR) systems. However, to obtain these advantages, designers need to use archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most archetypes filed within open repositories are expressed in the Archetype Definition Language (ADL), which defines only the syntactic structure of clinical concepts, weakening semantic activities on EMR content in the semantic web environment. This paper focuses on the modeling of an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual model approach and integrating semantic web technologies. Such modeling provides better delivery of quality of care and ensures semantic interoperability between all involved therapies' information systems. First, the data to be documented are identified and collected from the involved therapies. Subsequently, the data are analyzed and arranged into archetypes expressed in accordance with ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, ADL archetypes are transformed into archetypes expressed in OWL-DL (Web Ontology Language - Description Logic). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and providing the possibility to exercise semantic activities on such archetypes. The result is the integration of a semantic dimension into EMRs modeled in accordance with the archetype approach. The feasibility of our solution is shown through the development of a prototype, baptized "CP-SMS", which ensures semantic exploitation of the CP EMR. This prototype provides the following features: (i) creation of CP EMR instances and their checking against a knowledge base that we constructed through interviews with domain experts, (ii) translation of the initial CP ADL archetypes into CP OWL-DL archetypes, (iii) creation of an ontological source that we can use to annotate the obtained archetypes, and (iv) enrichment and supply of the ontological source and integration of semantic relations, hence fueling the ontology with new concepts, ensuring consistency and eliminating ambiguity between concepts. The degree of semantic interoperability that can be reached between EMR systems depends strongly on the quality of the archetypes used. Thus, the integration of the semantic dimension into the archetype modeling process is crucial. By creating an ontological source and annotating archetypes, we create a supportive platform ensuring semantic interoperability between archetype-based EMR systems. Copyright © 2016. Published by Elsevier Inc.

  1. A Boltzmann constant determination based on Johnson noise thermometry

    NASA Astrophysics Data System (ADS)

    Flowers-Jacobs, N. E.; Pollarolo, A.; Coakley, K. J.; Fox, A. E.; Rogalla, H.; Tew, W. L.; Benz, S. P.

    2017-10-01

    A value for the Boltzmann constant was measured electronically using an improved version of the Johnson Noise Thermometry (JNT) system at the National Institute of Standards and Technology (NIST), USA. This system is different from prior ones, including those from the 2011 determination at NIST and both 2015 and 2017 determinations at the National Institute of Metrology (NIM), China. As in all three previous determinations, the main contribution to the combined uncertainty is the statistical uncertainty in the noise measurement, which is mitigated by accumulating and integrating many weeks of cross-correlated measured data. The second major uncertainty contribution also still results from variations in the frequency response of the ratio of the measured spectral noise of the two noise sources, the sense resistor at the triple point of water and the superconducting quantum voltage noise source. In this paper, we briefly describe the major differences between our JNT system and previous systems, in particular the input circuit and approach we used to match the frequency responses of the two noise sources. After analyzing and integrating 50 d of accumulated data, we determined a value k = 1.380 642 9(69) × 10⁻²³ J K⁻¹, with a relative standard uncertainty of 5.0 × 10⁻⁶ and a relative offset of -4.05 × 10⁻⁶ from the CODATA 2014 recommended value.
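
    As background, the textbook Johnson–Nyquist relation below is what links measured thermal noise to the Boltzmann constant; the actual NIST analysis compares the resistor noise against a quantum-accurate voltage noise source rather than applying this formula directly, so this is only an orienting sketch.

```latex
% Johnson--Nyquist noise of a resistor R at temperature T over bandwidth \Delta f:
\langle V^2 \rangle = 4\, k_B\, T\, R\, \Delta f
\quad\Longrightarrow\quad
k_B = \frac{\langle V^2 \rangle}{4\, T\, R\, \Delta f}
```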

  2. Comparison of Two Methodologies for Calibrating Satellite Instruments in the Visible and Near Infrared

    NASA Technical Reports Server (NTRS)

    Barnes, Robert A.; Brown, Steven W.; Lykke, Keith R.; Guenther, Bruce; Xiong, Xiaoxiong (Jack); Butler, James J.

    2010-01-01

    Traditionally, satellite instruments that measure Earth-reflected solar radiation in the visible and near-infrared wavelength regions have been calibrated for radiance response in a two-step method. In the first step, the spectral response of the instrument is determined using a nearly monochromatic light source, such as a lamp-illuminated monochromator. Such sources only provide a relative spectral response (RSR) for the instrument, since they do not act as calibrated sources of light, nor do they typically fill the field-of-view of the instrument. In the second step, the instrument views a calibrated source of broadband light, such as a lamp-illuminated integrating sphere. In the traditional method, the RSR and the sphere spectral radiance are combined and, with the instrument's response, determine the absolute spectral radiance responsivity of the instrument. More recently, an absolute calibration system using widely tunable monochromatic laser systems has been developed. Using these sources, the absolute spectral responsivity (ASR) of an instrument can be determined on a wavelength-by-wavelength basis. From these monochromatic ASRs, the responses of the instrument bands to broadband radiance sources can be calculated directly, eliminating the need for calibrated broadband light sources such as integrating spheres. Here we describe the laser-based calibration and the traditional broadband source-based calibration of the NPP VIIRS sensor, and compare the derived calibration coefficients for the instrument. Finally, we evaluate the impact of the new calibration approach on the on-orbit performance of the sensor.
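
    The two routes to a band-level response can be written side by side as simple spectral integrals. The sketch below is illustrative only: the wavelength grid, RSR shape, sphere radiance, and absolute scale are all made up, and real processing involves many additional corrections.

```python
import numpy as np

wl = np.linspace(400, 500, 101)                 # nm, illustrative band
rsr = np.exp(-0.5 * ((wl - 450) / 15) ** 2)     # relative spectral response (made up)
L_sphere = 50 + 0.02 * (wl - 400)               # sphere spectral radiance (made up units)

# Traditional two-step method: relative spectral response combined with a
# calibrated broadband sphere radiance to obtain the band-averaged radiance.
band_radiance_traditional = np.trapz(rsr * L_sphere, wl) / np.trapz(rsr, wl)

# Laser-based method: absolute spectral responsivity (ASR) measured
# wavelength by wavelength; the band response to any broadband source
# then follows directly by integration.
asr = 1e-3 * rsr                                # absolute scale, made up
band_signal_laser = np.trapz(asr * L_sphere, wl)

print(band_radiance_traditional, band_signal_laser)
```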

  3. The Arithmetic of Emotion: Integration of Incidental and Integral Affect in Judgments and Decisions

    PubMed Central

    Västfjäll, Daniel; Slovic, Paul; Burns, William J.; Erlandsson, Arvid; Koppel, Lina; Asutay, Erkin; Tinghög, Gustav

    2016-01-01

    Research has demonstrated that two types of affect have an influence on judgment and decision making: incidental affect (affect unrelated to a judgment or decision such as a mood) and integral affect (affect that is part of the perceiver’s internal representation of the option or target under consideration). So far, these two lines of research have seldom crossed so that knowledge concerning their combined effects is largely missing. To fill this gap, the present review highlights differences and similarities between integral and incidental affect. Further, common and unique mechanisms that enable these two types of affect to influence judgment and choices are identified. Finally, some basic principles for affect integration when the two sources co-occur are outlined. These mechanisms are discussed in relation to existing work that has focused on incidental or integral affect but not both. PMID:27014136

  4. Quantum dash based single section mode locked lasers for photonic integrated circuits.

    PubMed

    Joshi, Siddharth; Calò, Cosimo; Chimot, Nicolas; Radziunas, Mindaugas; Arkhipov, Rostislav; Barbet, Sophie; Accard, Alain; Ramdane, Abderrahim; Lelarge, Francois

    2014-05-05

    We present the first demonstration of an InAs/InP Quantum Dash based single-section frequency comb generator designed for use in photonic integrated circuits (PICs). The laser cavity is closed using a specifically designed Bragg reflector without compromising the mode-locking performance of the self pulsating laser. This enables the integration of single-section mode-locked laser in photonic integrated circuits as on-chip frequency comb generators. We also investigate the relations between cavity modes in such a device and demonstrate how the dispersion of the complex mode frequencies induced by the Bragg grating implies a violation of the equi-distance between the adjacent mode frequencies and, therefore, forbids the locking of the modes in a classical Bragg Device. Finally we integrate such a Bragg Mirror based laser with Semiconductor Optical Amplifier (SOA) to demonstrate the monolithic integration of QDash based low phase noise sources in PICs.

  5. XML-based approaches for the integration of heterogeneous bio-molecular data.

    PubMed

    Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David

    2009-10-15

    Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, biomedical and bioinformatics research, but also raising new problems for their integration and computational processing. In this paper we survey the most interesting and novel approaches for the representation, integration and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, making effective integration of bioinformatics data schemes difficult. The adoption of a few semantically rich standard formats is urgent to achieve a seamless integration of the current biological resources.

  6. Space charge dosimeters for extremely low power measurements of radiation in shipping containers

    DOEpatents

    Britton, Jr., Charles L.; Buckner, Mark A [Oak Ridge, TN; Hanson, Gregory R [Clinton, TN; Bryan, William L [Knoxville, TN

    2011-05-03

    Methods and apparatus are described for space charge dosimeters for extremely low power measurements of radiation in shipping containers. A method includes in situ polling a suite of passive integrating ionizing radiation sensors including reading-out dosimetric data from a first passive integrating ionizing radiation sensor and a second passive integrating ionizing radiation sensor, where the first passive integrating ionizing radiation sensor and the second passive integrating ionizing radiation sensor remain situated where the dosimetric data was integrated while reading-out. Another method includes arranging a plurality of ionizing radiation sensors in a spatially dispersed array; determining a relative position of each of the plurality of ionizing radiation sensors to define a volume of interest; collecting ionizing radiation data from at least a subset of the plurality of ionizing radiation sensors; and triggering an alarm condition when a dose level of an ionizing radiation source is calculated to exceed a threshold.

  7. Space charge dosimeters for extremely low power measurements of radiation in shipping containers

    DOEpatents

    Britton, Jr., Charles L. [Alcoa, TN; Buckner, Mark A [Oak Ridge, TN; Hanson, Gregory R [Clinton, TN; Bryan, William L [Knoxville, TN

    2011-04-26

    Methods and apparatus are described for space charge dosimeters for extremely low power measurements of radiation in shipping containers. A method includes in situ polling a suite of passive integrating ionizing radiation sensors including reading-out dosimetric data from a first passive integrating ionizing radiation sensor and a second passive integrating ionizing radiation sensor, where the first passive integrating ionizing radiation sensor and the second passive integrating ionizing radiation sensor remain situated where the dosimetric data was integrated while reading-out. Another method includes arranging a plurality of ionizing radiation sensors in a spatially dispersed array; determining a relative position of each of the plurality of ionizing radiation sensors to define a volume of interest; collecting ionizing radiation data from at least a subset of the plurality of ionizing radiation sensors; and triggering an alarm condition when a dose level of an ionizing radiation source is calculated to exceed a threshold.

  8. Sources of nitrogen and phosphorus emissions to Irish rivers: estimates from the Source Load Apportionment Model (SLAM)

    NASA Astrophysics Data System (ADS)

    Mockler, Eva; Deakin, Jenny; Archbold, Marie; Daly, Donal; Bruen, Michael

    2017-04-01

    More than half of the river and lake water bodies in Europe are at less than good ecological status or potential, and diffuse pollution from agriculture remains a major, but not the only, cause of this poor performance. In Ireland, it is evident that agri-environmental policy and land management practices have, in many areas, reduced nutrient emissions to water, mitigating the potential impact on water quality. However, additional measures may be required in order to further decouple the relationship between agricultural productivity and emissions to water, which is of vital importance given the on-going agricultural intensification in Ireland. Catchment management can be greatly supported by modelling, which can reduce the resources required to analyse large amounts of information and can enable investigations and measures to be targeted. The Source Load Apportionment Model (SLAM) framework was developed to support catchment management in Ireland by characterising the contributions from various sources of phosphorus (P) and nitrogen (N) emissions to water. The SLAM integrates multiple national spatial datasets relating to nutrient emissions to surface water, including land use and physical characteristics of the sub-catchments to predict emissions from point (wastewater, industry discharges and septic tank systems) and diffuse sources (agriculture, forestry, peatlands, etc.). The annual nutrient emissions predicted by the SLAM were assessed against nutrient monitoring data for 16 major river catchments covering 50% of the area of Ireland. At national scale, results indicate that the total average annual emissions to surface water in Ireland are over 2,700 t yr-1 of P and 80,000 t yr-1 of N. The SLAM results include the proportional contributions from individual sources at a range of scales from sub-catchment to national, and show that the main sources of P are from wastewater and agriculture, with wide variations across the country related to local anthropogenic pressures and the hydrogeological setting. Agriculture is the main source of N emissions to water across all regions of Ireland. The SLAM results have been incorporated into an Integrated Catchment Management process and used in conjunction with monitoring data and local knowledge during the characterisation of all Irish water bodies by the Environmental Protection Agency. This demonstrates the successful integration of research into catchment management to inform the identification of (i) the sources of nutrients at regional and local scales and (ii) the potential significant pressures and appropriate mitigation measures.
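
    At its simplest, source load apportionment of this kind reduces to summing export-coefficient-based diffuse loads and point-source discharges per sub-catchment. The sketch below uses invented coefficients, areas, and discharges purely for illustration; it is not the SLAM parameterisation.

```python
# Minimal source-load-apportionment sketch (all numbers are invented; the
# SLAM framework itself is driven by national spatial datasets).
export_coeff_P = {"agriculture": 0.8, "forestry": 0.05, "peatland": 0.1}   # kg P / ha / yr
land_use_ha = {"agriculture": 12000, "forestry": 3000, "peatland": 1500}

diffuse_P = {src: export_coeff_P[src] * land_use_ha[src] for src in export_coeff_P}
point_P = {"wastewater": 4200.0, "industry": 800.0, "septic_tanks": 650.0}  # kg P / yr

total_P = sum(diffuse_P.values()) + sum(point_P.values())
for src, load in {**diffuse_P, **point_P}.items():
    print(f"{src:>13s}: {load:8.0f} kg P/yr ({100 * load / total_P:4.1f} %)")
```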

  9. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
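
    Querying an RDF knowledge base of this kind in terms of biomedical concepts rather than source-specific schemas can be sketched with rdflib. The file name and the SPARQL pattern below are hypothetical (the OBO relation RO_0000056, 'participates in', is used only as an example); this is not the KaBOB build or query interface itself.

```python
from rdflib import Graph

g = Graph()
g.parse("kabob_subset.ttl", format="turtle")   # hypothetical local extract

# Hypothetical concept-level query: entities participating in some process.
query = """
PREFIX obo: <http://purl.obolibrary.org/obo/>
SELECT ?entity ?process
WHERE {
  ?entity obo:RO_0000056 ?process .   # 'participates in' (OBO Relations Ontology)
}
LIMIT 10
"""
for row in g.query(query):
    print(row.entity, row.process)
```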

  10. Improving the interoperability of biomedical ontologies with compound alignments.

    PubMed

    Oliveira, Daniela; Pesquita, Catia

    2018-01-09

    Ontologies are commonly used to annotate and help process life sciences data. Although their original goal is to facilitate integration and interoperability among heterogeneous data sources, when these sources are annotated with distinct ontologies, bridging this gap can be challenging. In the last decade, ontology matching systems have been evolving and are now capable of producing high-quality mappings for life sciences ontologies, usually limited to the equivalence between two ontologies. However, life sciences research is becoming increasingly transdisciplinary and integrative, fostering the need to develop matching strategies that are able to handle multiple ontologies and more complex relations between their concepts. We have developed ontology matching algorithms that are able to find compound mappings between multiple biomedical ontologies, in the form of ternary mappings, finding for instance that "aortic valve stenosis"(HP:0001650) is equivalent to the intersection between "aortic valve"(FMA:7236) and "constricted" (PATO:0001847). The algorithms take advantage of search space filtering based on partial mappings between ontology pairs, to be able to handle the increased computational demands. The evaluation of the algorithms has shown that they are able to produce meaningful results, with precision in the range of 60-92% for new mappings. The algorithms were also applied to the potential extension of logical definitions of the OBO and the matching of several plant-related ontologies. This work is a first step towards finding more complex relations between multiple ontologies. The evaluation shows that the results produced are significant and that the algorithms could satisfy specific integration needs.
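
    A compound (ternary) mapping of the kind described above can be represented as a small data structure. The sketch below reuses the example from the abstract; the confidence value is illustrative and the class is not part of the authors' system.

```python
from dataclasses import dataclass

@dataclass
class TernaryMapping:
    """source is equivalent to the intersection of target_a and target_b."""
    source: str
    target_a: str
    target_b: str
    confidence: float   # illustrative score, not the authors' metric

m = TernaryMapping(
    source="HP:0001650",      # aortic valve stenosis
    target_a="FMA:7236",      # aortic valve
    target_b="PATO:0001847",  # constricted
    confidence=0.9,
)
print(f"{m.source} = {m.target_a} AND {m.target_b} (conf {m.confidence})")
```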

  11. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
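
    Late data integration as described above - one model per data source, with the per-source predictions combined by a meta-learner - can be sketched in a few lines of scikit-learn. The feature matrices and the single binary code below are synthetic placeholders, not the hospital data or the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic placeholders: one block of structured features, one block of
# text-derived features, and one binary clinical code to predict.
rng = np.random.default_rng(0)
X_struct = rng.normal(size=(400, 20))
X_text = rng.normal(size=(400, 50))
y = (X_struct[:, 0] + X_text[:, 0] + rng.normal(size=400) > 0).astype(int)

idx_train, idx_test = train_test_split(np.arange(400), random_state=0)

# Late integration step 1: a separate model per data source.
m_struct = LogisticRegression(max_iter=1000).fit(X_struct[idx_train], y[idx_train])
m_text = LogisticRegression(max_iter=1000).fit(X_text[idx_train], y[idx_train])

# Late integration step 2: a meta-learner over the per-source predictions.
# (A real pipeline would use out-of-fold predictions here to avoid leakage.)
meta_train = np.column_stack([m_struct.predict_proba(X_struct[idx_train])[:, 1],
                              m_text.predict_proba(X_text[idx_train])[:, 1]])
meta_test = np.column_stack([m_struct.predict_proba(X_struct[idx_test])[:, 1],
                             m_text.predict_proba(X_text[idx_test])[:, 1]])
meta = LogisticRegression().fit(meta_train, y[idx_train])
print("late-integration accuracy:", meta.score(meta_test, y[idx_test]))
```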

  12. Independent evaluation of point source fossil fuel CO2 emissions to better than 10%

    PubMed Central

    Turnbull, Jocelyn Christine; Keller, Elizabeth D.; Norris, Margaret W.; Wiltshire, Rachael M.

    2016-01-01

    Independent estimates of fossil fuel CO2 (CO2ff) emissions are key to ensuring that emission reductions and regulations are effective and provide needed transparency and trust. Point source emissions are a key target because a small number of power plants represent a large portion of total global emissions. Currently, emission rates are known only from self-reported data. Atmospheric observations have the potential to meet the need for independent evaluation, but useful results from this method have been elusive, due to challenges in distinguishing CO2ff emissions from the large and varying CO2 background and in relating atmospheric observations to emission flux rates with high accuracy. Here we use time-integrated observations of the radiocarbon content of CO2 (14CO2) to quantify the recently added CO2ff mole fraction at surface sites surrounding a point source. We demonstrate that both fast-growing plant material (grass) and CO2 collected by absorption into sodium hydroxide solution provide excellent time-integrated records of atmospheric 14CO2. These time-integrated samples allow us to evaluate emissions over a period of days to weeks with only a modest number of measurements. Applying the same time integration in an atmospheric transport model eliminates the need to resolve highly variable short-term turbulence. Together these techniques allow us to independently evaluate point source CO2ff emission rates from atmospheric observations with uncertainties of better than 10%. This uncertainty represents an improvement by a factor of 2 over current bottom-up inventory estimates and previous atmospheric observation estimates and allows reliable independent evaluation of emissions. PMID:27573818

  13. Independent evaluation of point source fossil fuel CO2 emissions to better than 10%.

    PubMed

    Turnbull, Jocelyn Christine; Keller, Elizabeth D; Norris, Margaret W; Wiltshire, Rachael M

    2016-09-13

    Independent estimates of fossil fuel CO2 (CO2ff) emissions are key to ensuring that emission reductions and regulations are effective and provide needed transparency and trust. Point source emissions are a key target because a small number of power plants represent a large portion of total global emissions. Currently, emission rates are known only from self-reported data. Atmospheric observations have the potential to meet the need for independent evaluation, but useful results from this method have been elusive, due to challenges in distinguishing CO2ff emissions from the large and varying CO2 background and in relating atmospheric observations to emission flux rates with high accuracy. Here we use time-integrated observations of the radiocarbon content of CO2 ((14)CO2) to quantify the recently added CO2ff mole fraction at surface sites surrounding a point source. We demonstrate that both fast-growing plant material (grass) and CO2 collected by absorption into sodium hydroxide solution provide excellent time-integrated records of atmospheric (14)CO2. These time-integrated samples allow us to evaluate emissions over a period of days to weeks with only a modest number of measurements. Applying the same time integration in an atmospheric transport model eliminates the need to resolve highly variable short-term turbulence. Together these techniques allow us to independently evaluate point source CO2ff emission rates from atmospheric observations with uncertainties of better than 10%. This uncertainty represents an improvement by a factor of 2 over current bottom-up inventory estimates and previous atmospheric observation estimates and allows reliable independent evaluation of emissions.

  14. Impact analysis of traffic-related air pollution based on real-time traffic and basic meteorological information.

    PubMed

    Pan, Long; Yao, Enjian; Yang, Yang

    2016-12-01

    With the rapid development of urbanization and motorization in China, traffic-related air pollution has become a major component of air pollution which constantly jeopardizes public health. This study proposes an integrated framework for estimating the concentration of traffic-related air pollution with real-time traffic and basic meteorological information and also for further evaluating the impact of traffic-related air pollution. First, based on the vehicle emission factor models sensitive to traffic status, traffic emissions are calculated according to the real-time link-based average traffic speed, traffic volume, and vehicular fleet composition. Then, based on differences in meteorological conditions, traffic pollution sources are divided into line sources and point sources, and the corresponding methods to determine the dynamic affecting areas are also proposed. Subsequently, with basic meteorological data, Gaussian dispersion model and puff integration model are applied respectively to estimate the concentration of traffic-related air pollution. Finally, the proposed estimating framework is applied to calculate the distribution of CO concentration in the main area of Beijing, and the population exposure is also calculated to evaluate the impact of traffic-related air pollution on public health. Results show that there is a certain correlation between traffic indicators (i.e., traffic speed and traffic intensity) of the affecting area and traffic-related CO concentration of the target grid, which indicates the methods to determine the affecting areas are reliable. Furthermore, the reliability of the proposed estimating framework is verified by comparing the predicted and the observed ambient CO concentration. In addition, results also show that the traffic-related CO concentration is higher in morning and evening peak hours, and has a heavier impact on public health within the Fourth Ring Road of Beijing due to higher population density and higher CO concentration under calm wind condition in this area. Copyright © 2016 Elsevier Ltd. All rights reserved.
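
    The point-source dispersion step can be illustrated with the standard Gaussian plume formula for a continuous release with ground reflection. The emission rate, wind speed, release height, and dispersion parameters below are placeholders, not the values or the puff-integration scheme used in the study.

```python
import numpy as np

def gaussian_plume(y, z, Q, u, H, sigma_y, sigma_z):
    """Concentration (g/m^3) of a continuous point source at crosswind
    offset y and height z, with ground reflection.  Q is the emission rate
    (g/s), u the wind speed (m/s), H the effective release height (m), and
    sigma_y / sigma_z the dispersion parameters (m) at the downwind
    distance of interest."""
    term_y = np.exp(-y**2 / (2 * sigma_y**2))
    term_z = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
              + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * term_y * term_z

# Illustrative numbers only: 10 g/s of CO from a 2 m high source, 3 m/s wind,
# dispersion parameters chosen as fixed placeholders for one receptor distance.
print(gaussian_plume(y=0.0, z=1.5, Q=10.0, u=3.0, H=2.0,
                     sigma_y=35.0, sigma_z=18.0))
```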

  15. Physical and Economic Integration of Carbon Capture Methods with Sequestration Sinks

    NASA Astrophysics Data System (ADS)

    Murrell, G. R.; Thyne, G. D.

    2007-12-01

    Currently there are several different carbon capture technologies either available or in active development for coal-fired power plants. Each approach has different advantages, limitations and costs that must be integrated with the method of sequestration and the physiochemical properties of carbon dioxide to evaluate which approach is most cost effective. For large volume point sources such as coal-fired power stations, the only viable sequestration sinks are either oceanic or geological in nature. However, the carbon capture processes and systems under consideration produce carbon dioxide at a variety of pressure and temperature conditions that must be made compatible with the sinks. Integration of all these factors provides a basis for meaningful economic comparisons between the alternatives. The high degree of compatibility between carbon dioxide produced by integrated gasification combined cycle technology and geological sequestration conditions makes it apparent that this coupling currently holds the advantage. Using a basis that includes complete source-to-sink sequestration costs, the relative cost benefit of pre-combustion IGCC compared to other post-combustion methods is on the order of 30%. Additional economic benefits arising from enhanced oil recovery revenues and potential sequestration credits further improve this coupling.

  16. Integrating geo web services for a user driven exploratory analysis

    NASA Astrophysics Data System (ADS)

    Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate

    2016-04-01

    In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, processing technique and visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate - a dynamic index that requires integrating existing interoperable web services for demographic data in conjunction with standalone non-spatial secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services; we therefore believe the approach is generic.
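
    Direct age standardisation of the sort behind an Age Standardised Rate is a short weighted sum of age-specific rates. The sketch below uses invented age-band counts and standard-population weights purely to make the arithmetic concrete.

```python
# Direct age standardisation (all counts and weights are invented).
age_bands = ["0-14", "15-44", "45-64", "65+"]
cases = [2, 30, 120, 400]                  # observed cases per band
population = [30000, 80000, 50000, 20000]  # local population per band
std_weights = [0.25, 0.40, 0.22, 0.13]     # standard population proportions

asr_per_100k = 100000 * sum(
    w * c / p for w, c, p in zip(std_weights, cases, population)
)
print(f"Age standardised rate: {asr_per_100k:.1f} per 100,000")
```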

  17. Semantic web data warehousing for caGrid

    PubMed Central

    McCusker, James P; Phillips, Joshua A; Beltrán, Alejandra González; Finkelstein, Anthony; Krauthammer, Michael

    2009-01-01

    The National Cancer Institute (NCI) is developing caGrid as a means for sharing cancer-related data and services. As more data sets become available on caGrid, we need effective ways of accessing and integrating this information. Although the data models exposed on caGrid are semantically well annotated, it is currently up to the caGrid client to infer relationships between the different models and their classes. In this paper, we present a Semantic Web-based data warehouse (Corvus) for creating relationships among caGrid models. This is accomplished through the transformation of semantically-annotated caBIG® Unified Modeling Language (UML) information models into Web Ontology Language (OWL) ontologies that preserve those semantics. We demonstrate the validity of the approach by Semantic Extraction, Transformation and Loading (SETL) of data from two caGrid data sources, caTissue and caArray, as well as alignment and query of those sources in Corvus. We argue that semantic integration is necessary for integration of data from distributed web services and that Corvus is a useful way of accomplishing this. Our approach is generalizable and of broad utility to researchers facing similar integration challenges. PMID:19796399

  18. Modelling future impacts of air pollution using the multi-scale UK Integrated Assessment Model (UKIAM).

    PubMed

    Oxley, Tim; Dore, Anthony J; ApSimon, Helen; Hall, Jane; Kryza, Maciej

    2013-11-01

    Integrated assessment modelling has evolved to support policy development in relation to air pollutants and greenhouse gases by providing integrated simulation tools able to produce quick and realistic representations of emission scenarios and their environmental impacts without the need to re-run complex atmospheric dispersion models. The UK Integrated Assessment Model (UKIAM) has been developed to investigate strategies for reducing UK emissions by bringing together information on projected UK emissions of SO2, NOx, NH3, PM10 and PM2.5, atmospheric dispersion, criteria for protection of ecosystems, urban air quality and human health, and data on potential abatement measures to reduce emissions, which may subsequently be linked to associated analyses of costs and benefits. We describe the multi-scale model structure ranging from continental to roadside, UK emission sources, atmospheric dispersion of emissions, implementation of abatement measures, integration with European-scale modelling, and environmental impacts. The model generates outputs from a national perspective which are used to evaluate alternative strategies in relation to emissions, deposition patterns, air quality metrics and ecosystem critical load exceedance. We present a selection of scenarios in relation to the 2020 Business-As-Usual projections and identify potential further reductions beyond those currently being planned. © 2013.

  19. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive, or entirely absent, in modern relational DBMS. Recognizing this, we extended SQL with a new SQL/MDA part that seamlessly integrates multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.

  20. Robust averaging protects decisions from noise in neural computations

    PubMed Central

    Herce Castañón, Santiago; Solomon, Joshua A.; Vandormael, Hildward

    2017-01-01

    An ideal observer will give equivalent weight to sources of information that are equally reliable. However, when averaging visual information, human observers tend to downweight or discount features that are relatively outlying or deviant (‘robust averaging’). Why humans adopt an integration policy that discards important decision information remains unknown. Here, observers were asked to judge the average tilt in a circular array of high-contrast gratings, relative to an orientation boundary defined by a central reference grating. Observers showed robust averaging of orientation, and the extent to which they did so was a positive predictor of their overall performance. Using computational simulations, we show that although robust averaging is suboptimal for a perfect integrator, it paradoxically enhances performance in the presence of “late” noise, i.e., noise that corrupts decisions during integration. In other words, robust decision strategies increase the brain’s resilience to noise arising in neural computations during decision-making. PMID:28841644
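
    The central claim - that compressing or down-weighting outlying samples can help once noise corrupts the integration stage itself - can be reproduced with a few lines of simulation. The sketch below uses a bounded tanh transfer function per item, where a high gain saturates (and thereby effectively down-weights) deviant tilts; the tilt distribution, gains, and noise levels are arbitrary choices, not the paper's model or parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_items = 100000, 8
mu, sigma = 4.0, 10.0                      # arbitrary tilt distribution (degrees)
tilts = rng.normal(mu, sigma, size=(n_trials, n_items))

def accuracy(gain, late_sd):
    """Each item passes through a bounded transfer function tanh(gain * x),
    the responses are summed, fixed 'late' noise is added at the
    integration stage, and the decision is the sign of the result."""
    dv = np.tanh(gain * tilts).sum(axis=1)
    dv = dv + rng.normal(0.0, late_sd, size=n_trials)
    return np.mean(dv > 0)   # correct answer is 'positive' since mu > 0

for late_sd in (0.0, 2.0):
    print(f"late noise sd={late_sd}: "
          f"near-linear gain={accuracy(0.01, late_sd):.3f}, "
          f"compressive gain={accuracy(1.0, late_sd):.3f}")
```

    With no late noise the near-linear readout wins, as expected for a perfect integrator; once late noise is added, the compressive (robust) readout overtakes it, illustrating the paper's qualitative point.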

  1. Integrated assessment of exposure to PM2.5 in South India and its relation with cardiovascular risk: Design of the CHAI observational cohort study.

    PubMed

    Tonne, Cathryn; Salmon, Maëlle; Sanchez, Margaux; Sreekanth, V; Bhogadi, Santhi; Sambandam, Sankar; Balakrishnan, Kalpana; Kinra, Sanjay; Marshall, Julian D

    2017-08-01

    While there is convincing evidence that fine particulate matter causes cardiovascular mortality and morbidity, little of the evidence is based on populations outside of high income countries, leaving large uncertainties at high exposures. India is an attractive setting for investigating the cardiovascular risk of particles across a wide concentration range, including concentrations for which there is the largest uncertainty in the exposure-response relationship. CHAI is a European Research Council funded project that investigates the relationship between particulate air pollution from outdoor and household sources and markers of atherosclerosis, an important cardiovascular pathology. The project aims to (1) characterize the exposure of a cohort of adults to particulate air pollution from household and outdoor sources, (2) integrate information from GPS, wearable cameras, and continuous measurements of personal exposure to particles to understand where and through which activities people are most exposed, and (3) quantify the association between particles and markers of atherosclerosis. CHAI has the potential to make important methodological contributions to modeling air pollution exposure that integrates outdoor and household sources, as well as to the application of wearable camera data in environmental exposure assessment. Copyright © 2017 Elsevier GmbH. All rights reserved.

  2. Integrating data and mashup concepts in Hydro-Meteorological Research: the torrential rainfall event in Genoa (4th November 2011) case study.

    NASA Astrophysics Data System (ADS)

    Bedrina, T.; Parodi, A.; Quarati, A.; Clematis, A.; Rebora, N.; Laiosa, D.

    2012-04-01

    One of the critical issues in Hydro-Meteorological Research (HMR) is better exploitation of data archives from a multidisciplinary perspective. Different Earth science databases offer a huge amount of observational data, which often need to be assembled, processed and combined according to HM scientists' needs. Cooperation between scientists active in HMR and in Information and Communication Technologies (ICT) is essential for the development of innovative tools and applications for manipulating, aggregating and re-arranging heterogeneous information in a flexible way. This paper describes an application devoted to the collection and integration of HM datasets, originated by public or private sources and freely exposed via Web service APIs. The application uses mashup concepts, a technology that has recently become very popular in many fields (Chow S.-W., 2007). A mashup combines data and/or programs published by external online sources into an integrated experience. Mashups appear to be a promising methodology for responding to the multiple data-related activities in which HM researchers are involved daily (e.g., finding and retrieving high-volume data; learning formats and developing readers; extracting parameters; performing filtering and masking; developing analysis and visualization tools). The specific case study of the recent extreme rainfall event that occurred over Genoa, Italy, on 4 November 2011 is shown through the integration of semi-professional weather observation networks as a freely available data source in addition to official weather networks.
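
    A minimal Python sketch of the mashup idea described above: two heterogeneous observation payloads are normalized into a common record structure and combined. The payloads are inlined and the station names and rainfall values are invented; in a real mashup they would be fetched from the networks' Web service APIs.

      # Toy mashup: merge official and semi-professional rainfall observations with
      # differing schemas into one record set. All identifiers and values are invented.
      import json

      official = json.loads("""[
        {"station": "GENOA-01", "time": "2011-11-04T10:00Z", "rain_mm": 82.0},
        {"station": "GENOA-02", "time": "2011-11-04T10:00Z", "rain_mm": 74.5}
      ]""")
      amateur = json.loads("""[
        {"id": "wu-4711", "obs_time": "2011-11-04T10:00Z", "precip_mm": 91.2}
      ]""")

      # Normalize the heterogeneous schemas into a common structure.
      merged = (
          [{"source": "official", "station": o["station"], "time": o["time"], "rain_mm": o["rain_mm"]}
           for o in official] +
          [{"source": "semi-professional", "station": a["id"], "time": a["obs_time"], "rain_mm": a["precip_mm"]}
           for a in amateur]
      )
      print(max(merged, key=lambda r: r["rain_mm"]))   # heaviest reported rainfall at 10:00 UTC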

  3. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Mehl, S.; Velasco Mansilla, V.

    2015-12-01

    FREEWAT is a HORIZON 2020 EU project. Its main result will be an open source and public domain, GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and related Directives. Specific objectives of the project are: to coordinate previous EU and nationally funded research by integrating existing software modules for water management into the single GIS-based FREEWAT environment, and to support the application of FREEWAT through an innovative participatory approach gathering technical staff and relevant stakeholders (policy and decision makers) in designing scenarios for the application of water policies. The open source character of the platform allows this to be considered an initiative "ad includendum", as further institutions or developers may contribute to the development. The core of the platform is the SID&GRID framework (a GIS-integrated, physically-based, distributed numerical hydrological model based on a modified version of MODFLOW 2005; Rossetto et al. 2013) in its version ported to QGIS desktop. Activities are carried out along two lines: (i) integration of modules to fulfill end-user requirements, including tools for producing feasibility and management plans; (ii) a set of activities to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: a module for water management and planning; calibration, uncertainty and sensitivity analysis; a module for solute transport in the unsaturated zone; a module for crop growth and water requirements in agriculture; and tools for groundwater quality issues and for the analysis, interpretation and visualization of hydrogeological data. By creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be to enhance science-based, participatory and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. Broad stakeholder involvement is expected to guarantee dissemination and exploitation of the results.

  4. Cheminformatics and the Semantic Web: adding value with linked data and enhanced provenance

    PubMed Central

    Frey, Jeremy G; Bird, Colin L

    2013-01-01

    Cheminformatics is evolving from being a field of study associated primarily with drug discovery into a discipline that embraces the distribution, management, access, and sharing of chemical data. The relationship with the related subject of bioinformatics is becoming stronger and better defined, owing to the influence of Semantic Web technologies, which enable researchers to integrate heterogeneous sources of chemical, biochemical, biological, and medical information. These developments depend on a range of factors: the principles of chemical identifiers and their role in relationships between chemical and biological entities; the importance of preserving provenance and properly curated metadata; and an understanding of the contribution that the Semantic Web can make at all stages of the research lifecycle. The movements toward open access, open source, and open collaboration all contribute to progress toward the goals of integration. PMID:24432050
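
    A small, hedged sketch of the linked-data pattern described above, using the rdflib library; the compound, identifier and target below are purely illustrative, and only the general triple structure is the point.

      # Linked-data sketch: a chemical entity described by an identifier and linked
      # to a biological entity. Identifiers and relations are illustrative only.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDFS

      EX = Namespace("http://example.org/chem/")
      g = Graph()

      aspirin = URIRef(EX["compound/aspirin"])
      target = URIRef(EX["protein/COX1"])

      g.add((aspirin, RDFS.label, Literal("acetylsalicylic acid")))
      g.add((aspirin, EX.inchiKey, Literal("BSYNRYMUTXBXSQ-UHFFFAOYSA-N")))  # illustrative value
      g.add((aspirin, EX.inhibits, target))
      g.add((target, RDFS.label, Literal("prostaglandin G/H synthase 1")))

      print(g.serialize(format="turtle"))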

  5. SPATIALLY RESOLVED STAR FORMATION MAIN SEQUENCE OF GALAXIES IN THE CALIFA SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cano-Díaz, M.; Sánchez, S. F.; Zibetti, S.

    2016-04-20

    The “main sequence of galaxies”, defined in terms of the total star formation rate ψ versus the total stellar mass M*, is a well-studied tight relation that has been observed at several wavelengths and at different redshifts. All earlier studies have derived this relation from integrated properties of galaxies. We recover the same relation from an analysis of spatially resolved properties, with integral field spectroscopic (IFS) observations of 306 galaxies from the CALIFA survey. We consider the SFR surface density in units of log(M⊙ yr^-1 kpc^-2) and the stellar mass surface density in units of log(M⊙ kpc^-2) in individual spaxels that probe spatial scales of 0.5-1.5 kpc. This local relation exhibits a high degree of correlation with small scatter (σ = 0.23 dex), irrespective of the dominant ionization source of the host galaxy or its integrated stellar mass. We highlight (i) the integrated star formation main sequence formed by galaxies whose dominant ionization process is related to star formation, for which we find a slope of 0.81 ± 0.02; (ii) for the spatially resolved relation obtained with the spaxel analysis, we find a slope of 0.72 ± 0.04; and (iii) for the integrated main sequence, we also identified a sequence formed by galaxies that are dominated by an old stellar population, which we have called the retired galaxies sequence.
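
    The two slopes quoted above can be written compactly as follows; the zero-points are not given in the abstract and are therefore left symbolic.

      % Main-sequence relations quoted in the abstract (zero-points b not given there):
      \log \Sigma_{\mathrm{SFR}} = (0.72 \pm 0.04)\,\log \Sigma_{*} + b_{\mathrm{local}}
      \qquad
      \log \psi = (0.81 \pm 0.02)\,\log M_{*} + b_{\mathrm{int}}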

  6. Flat conductor cable for electrical packaging

    NASA Technical Reports Server (NTRS)

    Angele, W.

    1972-01-01

    Flat conductor cable (FCC) is a relatively new, highly promising means for electrical packaging and system integration. FCC offers numerous desirable traits (weight, volume and cost savings, flexibility, high reliability, predictable and repeatable electrical characteristics) which make it extremely attractive as a packaging medium. FCC, today, finds wide application in everything from integration of lunar equipment to the packaging of electronics in nuclear submarines. Described are cable construction and means of termination, applicable specifications and standards, and total FCC systems. A list of additional sources of data is also included for more intensive study.

  7. Demonstratives in Motion: The Grammaticalization of Demonstratives as a Window into Synchronic Phenomena

    ERIC Educational Resources Information Center

    Ferrazzano, Lisa Reisig

    2013-01-01

    There is significant variation in the literature on how demonstratives are characterized semantically, leading to divergent syntactic analyses of demonstratives. A major source of this disagreement concerns how distance specifications relate to the demonstrative: whether [+/- speaker] is an integral property of the demonstrative or not. I argue…

  8. The Influence of Textbooks on Teachers' Knowledge of Chemical Bonding Representations Relative to Students' Difficulties Understanding

    ERIC Educational Resources Information Center

    Bergqvist, Anna; Chang Rundgren, Shu-Nu

    2017-01-01

    Background: Textbooks are integral tools for teachers' lessons. Several researchers observed that school teachers rely heavily on textbooks as informational sources when planning lessons. Moreover, textbooks are an important resource for developing students' knowledge as they contain various representations that influence students' learning.…

  9. 78 FR 15953 - Cooperative Agreement To Support Regulatory Research Related to Food and Drug Administration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-13

    ... and expert feedback on approaches to standardizing of REMS and integrating them into the health care... following organization is eligible to apply: ECHCR. Within the Brookings Institution, the mission of the... source application for award of a cooperative agreement to the Brookings Institution's Engelberg Center...

  10. Inverse scattering for an exterior Dirichlet program

    NASA Technical Reports Server (NTRS)

    Hariharan, S. I.

    1981-01-01

    Scattering due to a metallic cylinder which is in the field of a wire carrying a periodic current is considered. The location and shape of the cylinder are obtained from a far-field measurement made between the wire and the cylinder. The same analysis applies in acoustics when the cylinder is a soft-walled body and the wire is a line source. The associated direct problem in this situation is an exterior Dirichlet problem for the Helmholtz equation in two dimensions. An improved low-frequency estimate for the solution of this problem using integral equation methods is presented. The far-field measurements are related to the solutions of boundary integral equations in the low-frequency situation. These solutions are expressed in terms of a mapping function which maps the exterior of the unknown curve onto the exterior of a unit disk. The coefficients of the Laurent expansion of the conformal transformation are related to the far-field coefficients. The first far-field coefficient leads to the calculation of the distance between the source and the cylinder.
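
    For orientation, the generic Laurent form assumed for such an exterior conformal map (the abstract does not give its normalization) is:

      % Standard Laurent expansion, at large |z|, of a conformal map Psi from the
      % exterior of the unknown curve onto the exterior of the unit disk; the
      % coefficients a_k are the quantities tied to the far-field data.
      \Psi(z) = a_{1}\,z + a_{0} + \frac{a_{-1}}{z} + \frac{a_{-2}}{z^{2}} + \cdots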

  11. Translating Knowledge: The role of Shared Learning in Bridging the Science-Application Divide

    NASA Astrophysics Data System (ADS)

    Moench, M.

    2014-12-01

    As the organizers of this session state: "Understanding and managing our future relation with the Earth requires research and knowledge spanning diverse fields, and integrated, societally-relevant science that is geared toward solutions." In most cases, however, integration is weak and scientific outputs do not match decision maker requirements. As a result, while scientific results may be highly relevant to society, that relevance is operationally far from clear. This paper explores the use of shared learning processes to bridge the gap between the evolving body of scientific information on climate change and its relevance for resilience planning in cities across Asia. Examples related to understanding uncertainty, the evolution of scientific knowledge from different sources, and data extraction and presentation are given using experiences generated over five years of work as part of the Rockefeller Foundation-supported Asian Cities Climate Change Resilience Network and other programs. Results suggest that processes supporting effective translation of knowledge between different sources and different applications are essential for the identification of solutions that respond to the dynamics and uncertainties inherent in global change processes.

  12. SchizConnect: Mediating neuroimaging databases on schizophrenia and related disorders for large-scale integration.

    PubMed

    Wang, Lei; Alpert, Kathryn I; Calhoun, Vince D; Cobia, Derin J; Keator, David B; King, Margaret D; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D; Potkin, Steven G; Turner, Jessica A; Ambite, Jose Luis

    2016-01-01

    SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation (translating across data sources) so that the user can place one query, e.g., for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many datasets there are, as well as download the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload-based data sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information is being updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. A two-channel, spectrally degenerate polarization entangled source on chip

    NASA Astrophysics Data System (ADS)

    Sansoni, Linda; Luo, Kai Hong; Eigner, Christof; Ricken, Raimund; Quiring, Viktor; Herrmann, Harald; Silberhorn, Christine

    2017-12-01

    Integrated optics provides the platform for the experimental implementation of highly complex and compact circuits for quantum information applications. In this context, integrated waveguide sources represent a powerful resource for the generation of quantum states of light due to their high brightness and stability. However, the confinement of the light in a single spatial mode limits the realization of multi-channel sources. Due to this challenge, one of the most widely adopted sources in quantum information processing, i.e. a source which generates spectrally indistinguishable polarization entangled photons in two different spatial modes, has not yet been realized in a fully integrated platform. Here we overcome this limitation by suitably engineering two periodically poled waveguides and an integrated polarization splitter in lithium niobate. This source produces polarization entangled states with a fidelity of F = 0.973 ± 0.003, and a test of Bell's inequality results in a violation larger than 14 standard deviations. It can work in both pulsed and continuous-wave regimes. This device represents a new step toward the implementation of fully integrated circuits for quantum information applications.

  14. Investigating the Use of Integrated Instructions to Reduce the Cognitive Load Associated with Doing Practical Work in Secondary School Science

    NASA Astrophysics Data System (ADS)

    Haslam, Carolyn Yvonne; Hamilton, Richard Joseph

    2010-09-01

    This study investigated the effects of integrated illustrations on understanding instructions for practical work in science. Ninety-six secondary school students who were unfamiliar with the target content knowledge and practical equipment took part. The students were divided into two conditions: (1) modified instructions containing integrated text and illustrations, and (2) conventional instructions containing text only. Modified instructions produced significantly higher levels of performance on task, shorter time to completion, lower perceived cognitive load and task difficulty, a higher relative efficiency score, and higher post-test scores than the conventional instructions. When learners are inexperienced and the information is complex, the results suggest that physically integrating mutually referring sources of information reduces cognitive load, and therefore makes practical work instructions easier to understand.

  15. Integrated watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability.

    PubMed

    Ghebremichael, Lula T; Veith, Tamie L; Hamlett, James M

    2013-01-15

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals, such as the total maximum daily load (TMDL) requirements. Targeting critical source areas (CSAs) that generate disproportionately high pollutant loads within a watershed is a crucial step in successfully controlling nonpoint source pollution. The importance of watershed simulation models in assisting with the quantitative assessments of CSAs of pollution (relative to their magnitudes and extents) and of the effectiveness of associated BMPs has been well recognized. However, due to the distinct disconnect between the hydrological scale at which these models conduct their evaluation and the farm scale at which feasible BMPs are actually selected and implemented, and due to the difficulty and uncertainty involved in transferring watershed model data to farm fields, there are limited practical applications of these tools in current nonpoint source pollution control efforts by conservation specialists for delineating CSAs and planning targeting measures. There are also limited approaches that can assess the impacts of CSA-targeted BMPs on farm productivity and profitability together with the water quality improvements expected from applying these measures. This study developed a modeling framework that integrates farm economics and environmental aspects (such as identification and mitigation of CSAs) through joint use of watershed- and farm-scale models in a closed feedback loop. The integration of models in a closed feedback loop provides a way for environmental changes to be evaluated with regard to their impact on the practical aspects of farm management and economics, adjusted or reformulated as necessary, and re-evaluated with respect to the effectiveness of environmental mitigation at the farm and watershed levels. This paper also outlines the steps needed to extract important CSA-related information from a watershed model to help inform targeting decisions at the farm scale. The modeling framework is demonstrated with two unique case studies in the northeastern United States (New York and Vermont), with supporting data from numerous published, location-specific studies at both the watershed and farm scales. Using the integrated modeling framework, it becomes possible to compare the costs (in terms of changes required in farm system components or financial compensation for retiring crop land) and benefits (in terms of measurable water quality improvement goals) of implementing targeted BMPs. This multi-scale modeling approach can be used in the multi-objective task of mitigating CSAs of pollution to meet water quality goals while maintaining farm-level economic viability. Copyright © 2012 Elsevier Ltd. All rights reserved.
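
    A schematic Python sketch of the closed feedback loop described above: a watershed-scale step flags critical source areas, a farm-scale step adds a best management practice subject to a profitability floor, and the loop repeats until the water-quality target is met or no CSAs remain. All functions, loads and thresholds are hypothetical placeholders, not the authors' watershed/farm models.

      # Schematic closed feedback loop between a watershed-scale and a farm-scale step.
      # Every number and function here is a toy stand-in for the real simulation models.

      def watershed_step(bmps):
          """Return (pollutant_load, csa_fields) for the current BMP configuration."""
          load = max(0.0, 100.0 - 8.0 * len(bmps))          # toy response of load to BMPs
          csa_fields = [] if load <= 60.0 else ["field_7", "field_12"]
          return load, csa_fields

      def farm_step(bmps, csa_fields):
          """Add one BMP targeting a CSA; return updated BMPs and net farm income."""
          if csa_fields:
              bmps = bmps + [("cover_crop", csa_fields[0])]
          income = 100.0 - 3.0 * len(bmps)                  # toy cost of each practice
          return bmps, income

      bmps, target_load, min_income = [], 60.0, 85.0
      for iteration in range(10):
          load, csas = watershed_step(bmps)
          if load <= target_load or not csas:
              break
          new_bmps, income = farm_step(bmps, csas)
          if income < min_income:                           # farm no longer viable: stop adding BMPs
              break
          bmps = new_bmps
      print(bmps, load)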

  16. Integrating Space Communication Network Capabilities via Web Portal Technologies

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.; Lee, Carlyn-Ann; Lau, Chi-Wung; Cheung, Kar-Ming; Levesque, Michael; Carruth, Butch; Coffman, Adam; Wallace, Mike

    2014-01-01

    We have developed a service portal prototype as part of an investigation into the feasibility of using Java portlet technology as a means of providing integrated access to NASA communications network services. Portal servers provide an attractive platform for this role due to the various built-in collaboration applications they can provide, combined with the possibility of developing custom inter-operating portlets to extend their functionality while preserving common presentation and behavior. This paper describes various options for integration of network services related to planning and scheduling, and results based on use of a popular open-source portal framework. Plans are underway to develop an operational SCaN Service Portal, building on the experiences reported here.

  17. JWST Pathfinder Telescope Integration

    NASA Technical Reports Server (NTRS)

    Matthews, Gary W.; Kennard, Scott H.; Broccolo, Ronald T.; Ellis, James M.; Daly, Elizabeth A.; Hahn, Walter G.; Amon, John N.; Mt. Pleasant, Stephen M.; Texter, Scott; Atkinson, Charles B.; hide

    2015-01-01

    The James Webb Space Telescope (JWST) is a 6.5 m, segmented, IR telescope that will explore the first light of the universe after the big bang. In 2014, a major risk reduction effort related to the Alignment, Integration, and Test (AI&T) of the segmented telescope was completed. The Pathfinder telescope integrates two Primary Mirror Segment Assemblies (PMSAs) and the Secondary Mirror Assembly (SMA) onto a flight-like composite telescope backplane. This pathfinder allowed the JWST team to assess the alignment process and to better understand the various error sources that need to be accommodated in the flight build. The successful completion of the Pathfinder Telescope provides a final integration roadmap for the flight operations that will start in August 2015.

  18. Energy Systems Integration Facility (ESIF): Golden, CO - Energy Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheppy, Michael; VanGeet, Otto; Pless, Shanti

    2015-03-01

    At NREL's Energy Systems Integration Facility (ESIF) in Golden, Colo., scientists and engineers work to overcome challenges related to how the nation generates, delivers and uses energy by modernizing the interplay between energy sources, infrastructure, and data. Test facilities include a megawatt-scale ac electric grid, photovoltaic simulators and a load bank. Additionally, a high performance computing data center (HPCDC) is dedicated to advancing renewable energy and energy efficient technologies. A key design strategy is to use waste heat from the HPCDC to heat parts of the building. The ESIF boasts an annual EUI of 168.3 kBtu/ft2. This article describes the building's procurement, design and first year of performance.

  19. Thermophotovoltaic Energy Conversion for Space Applications

    NASA Astrophysics Data System (ADS)

    Teofilo, V. L.; Choong, P.; Chen, W.; Chang, J.; Tseng, Y.-L.

    2006-01-01

    Thermophotovoltaic (TPV) energy conversion cells have made steady and, over the years, considerable progress since they were first evaluated by Lockheed Martin for direct conversion using nuclear power sources in the mid 1980s. The design trades and evaluations for application to the early defensive missile satellites of the Strategic Defense Initiative found the cell technology to be immature, with unacceptably low cell efficiencies (<10%) comparable to thermoelectrics. Rapid advances in epitaxial growth technology for ternary compound semiconductors, novel double hetero-structure junctions, innovative monolithic integrated cell architectures, and bandpass tandem filters have, in concert, significantly improved cell efficiencies to 25%, with the promise of 35% in the near future using a solar-cell-like multi-junction approach. Recent NASA sponsored design and feasibility testing programs have demonstrated the potential for 19% system efficiency for 100 We radioisotopic power sources at an integrated specific power of ~14 We/kg. The current state of TPV cell technology, however, limits the operating temperature of the converter cells to <400 K due to radiator mass considerations. This limitation imposes no system mass penalty for low-power applications with radioisotope power sources because of the high specific power of the TPV cell converters. However, the application of TPV energy conversion to high power sources has been perceived as having a major impediment above 1 kWe due to the relatively low waste heat rejection temperature. We explore this limitation and compare the integrated specific power of TPV converters, with current and projected TPV cells, against other advanced space power conversion technologies. We find that when the redundancy required for extended space exploration missions is considered, TPV converters have a much wider range of applicability than previously understood. Furthermore, we believe that with relatively modest modifications of the current epitaxial growth in MOCVD, an optimal cell architecture for elevated-temperature TPV operation can be found to out-perform the state-of-the-art TPV at an elevated temperature.

  20. Sequencing the Cortical Processing of Pitch-Evoking Stimuli using EEG Analysis and Source Estimation

    PubMed Central

    Butler, Blake E.; Trainor, Laurel J.

    2012-01-01

    Cues to pitch include spectral cues that arise from tonotopic organization and temporal cues that arise from firing patterns of auditory neurons. fMRI studies suggest a common pitch center is located just beyond primary auditory cortex along the lateral aspect of Heschl’s gyrus, but little work has examined the stages of processing for the integration of pitch cues. Using electroencephalography, we recorded cortical responses to high-pass filtered iterated rippled noise (IRN) and high-pass filtered complex harmonic stimuli, which differ in temporal and spectral content. The two stimulus types were matched for pitch saliency, and a mismatch negativity (MMN) response was elicited by infrequent pitch changes. The P1 and N1 components of event-related potentials (ERPs) are thought to arise from primary and secondary auditory areas, respectively, and to result from simple feature extraction. MMN is generated in secondary auditory cortex and is thought to act on feature-integrated auditory objects. We found that peak latencies of both P1 and N1 occur later in response to IRN stimuli than to complex harmonic stimuli, but found no latency differences between stimulus types for MMN. The location of each ERP component was estimated based on iterative fitting of regional sources in the auditory cortices. The sources of both the P1 and N1 components elicited by IRN stimuli were located dorsal to those elicited by complex harmonic stimuli, whereas no differences were observed for MMN sources across stimuli. Furthermore, the MMN component was located between the P1 and N1 components, consistent with fMRI studies indicating a common pitch region in lateral Heschl’s gyrus. These results suggest that while the spectral and temporal processing of different pitch-evoking stimuli involves different cortical areas during early processing, by the time the object-related MMN response is formed, these cues have been integrated into a common representation of pitch. PMID:22740836

  1. [Applications of GIS in biomass energy source research].

    PubMed

    Su, Xian-Ming; Wang, Wu-Kui; Li, Yi-Wei; Sun, Wen-Xiang; Shi, Hai; Zhang, Da-Hong

    2010-03-01

    Biomass resources are widespread but dispersed, and their distribution is closely related to environment, climate, soil, and land use. Geographic information systems (GIS) provide spatial analysis functions and the flexibility of integrating with other application models and algorithms, making them well suited to biomass energy research. This paper summarizes research on GIS applications in biomass energy studies, with a focus on the feasibility of bioenergy development, assessment of biomass resource amounts and distribution, layout of biomass exploitation and utilization, evaluation of gaseous emissions from biomass burning, and biomass energy information systems. Three perspectives for GIS applications in biomass energy research are proposed: to enrich the data sources, to improve the capacity for data processing and decision support, and to generate online proposals.

  2. Design structure for in-system redundant array repair in integrated circuits

    DOEpatents

    Bright, Arthur A.; Crumley, Paul G.; Dombrowa, Marc; Douskey, Steven M.; Haring, Rudolf A.; Oakland, Steven F.; Quellette, Michael R.; Strissel, Scott A.

    2008-11-25

    A design structure for repairing an integrated circuit during operation of the integrated circuit. The integrated circuit comprises a multitude of memory arrays and a fuse box holding control data for controlling the redundancy logic of the arrays. The design structure provides the integrated circuit with a control data selector for passing the control data from the fuse box to the memory arrays; provides a source of alternate control data, external to the integrated circuit; and connects the source of alternate control data to the control data selector. The design structure further passes the alternate control data from its source, through the control data selector, and to the memory arrays to control the redundancy logic of the memory arrays.

  3. Comparison of a new integrated current source with the modified Howland circuit for EIT applications.

    PubMed

    Hong, Hongwei; Rahal, Mohamad; Demosthenous, Andreas; Bayford, Richard H

    2009-10-01

    Multi-frequency electrical impedance tomography (MF-EIT) systems require current sources that are accurate over a wide frequency range (up to 1 MHz) and with large load impedance variations. The most commonly employed current source design in EIT systems is the modified Howland circuit (MHC). The MHC requires tight matching of resistors to achieve high output impedance and may suffer from instability over a wide frequency range in an integrated solution. In this paper, we introduce a new integrated current source design in CMOS technology and compare its performance with the MHC. The new integrated design has advantages over the MHC in terms of power consumption and area. The output current and the output impedance of both circuits were determined through simulations and measurements over the frequency range of 10 kHz to 1 MHz. For frequencies up to 1 MHz, the measured maximum variation of the output current for the integrated current source is 0.8%, whereas for the MHC the corresponding value is 1.5%. Although the integrated current source has an output impedance greater than 1 MΩ up to 1 MHz in simulations, in practice the impedance is greater than 160 kΩ up to 1 MHz due to the presence of stray capacitance.
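
    A back-of-the-envelope Python check of why output impedance matters here: modelling the source as a Norton equivalent, the delivered current is I_set · Zout/(Zout + Zload), so a finite output impedance converts load variation into current variation. The drive current and load range below are illustrative; the two output impedances reuse the values quoted above.

      # Norton-source view of an EIT current source: how much the delivered current
      # changes over a load swing, for the two output impedances quoted above.
      def delivered_current(i_set_uA, z_out_ohm, z_load_ohm):
          return i_set_uA * z_out_ohm / (z_out_ohm + z_load_ohm)

      i_set = 100.0                      # uA, illustrative drive current
      loads = (100.0, 1000.0)            # ohm, illustrative electrode/tissue load range

      for z_out in (1e6, 160e3):         # 1 MOhm (simulated) vs 160 kOhm (measured) at 1 MHz
          i_lo = delivered_current(i_set, z_out, loads[1])
          i_hi = delivered_current(i_set, z_out, loads[0])
          print(f"Zout = {z_out/1e3:.0f} kOhm: current varies by {100*(i_hi-i_lo)/i_hi:.2f}% "
                f"over a {loads[0]:.0f}-{loads[1]:.0f} ohm load swing")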

  4. Slow Temporal Integration Enables Robust Neural Coding and Perception of a Cue to Sound Source Location.

    PubMed

    Brown, Andrew D; Tollin, Daniel J

    2016-09-21

    In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise, degrading the fidelity of both ITD and ILD cues. Here we demonstrate that behavioral ILD sensitivity (in humans) and neural ILD sensitivity (in single neurons of the chinchilla auditory midbrain) remain robust under stimulus conditions that render ITD cues undetectable. This result can be explained by "slow" temporal integration arising from several-millisecond-long windows of excitatory-inhibitory interaction evident in midbrain, but not brainstem, neurons. Such integrative coding can account for the preservation of ILD sensitivity despite even extreme temporal degradations in ecological acoustic stimuli. Copyright © 2016 the authors 0270-6474/16/369908-14$15.00/0.
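
    A toy Python sketch of the single-free-parameter idea described above: each ear's drive is smoothed over a window of duration w before an excitation-minus-inhibition comparison, and with a longer window the ILD-related difference stands out from fluctuations even when the two ears are fully decorrelated. The stimulus statistics, units and numbers are invented for illustration; this is not the authors' model.

      # Toy excitation-inhibition model with a single integration-window parameter.
      # Longer windows keep the ILD signal reliable under full interaural decorrelation.
      import numpy as np

      rng = np.random.default_rng(1)
      fs = 10000                                   # samples per second
      t = np.arange(int(0.2 * fs))                 # 200 ms of "activity"

      def ear_drive(level, shared, mix):
          noise = rng.normal(0, 1, t.size)
          fine = mix * shared + (1 - mix) * noise  # mix=0 -> fully decorrelated ears
          return level * (1 + 0.5 * fine)

      def ild_snr(window_ms, mix, ild_db=6.0):
          shared = rng.normal(0, 1, t.size)
          exc = ear_drive(10 ** (ild_db / 20), shared, mix)   # contralateral (louder) ear
          inh = ear_drive(1.0, shared, mix)                   # ipsilateral ear
          w = max(1, int(window_ms / 1000 * fs))
          kernel = np.ones(w) / w
          diff = np.convolve(exc, kernel, "valid") - np.convolve(inh, kernel, "valid")
          return diff.mean() / diff.std()          # how reliably the ILD stands out

      for window_ms in (0.5, 3.0):                 # short vs. ~3 ms integration window
          print(f"window {window_ms} ms:",
                f"correlated SNR={ild_snr(window_ms, mix=1.0):.1f},",
                f"decorrelated SNR={ild_snr(window_ms, mix=0.0):.1f}")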

  5. Semantic Web meets Integrative Biology: a survey.

    PubMed

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargon, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.

  6. Pedagogy and Primary Sources: Outcomes of the Library of Congress' Professional Development Program, Teaching with Primary Sources at Loyola

    ERIC Educational Resources Information Center

    Fry, Michelle L.

    2010-01-01

    Until recently, few K-12 teachers outside of social studies have integrated primary sources in classroom instruction. Integrating primary sources in educational practice does require an uncommon pedagogical understanding. Addressing this K-12 educator need is the Library of Congress. Recently, the Library implemented a national educator…

  7. Anatomy education for the YouTube generation.

    PubMed

    Barry, Denis S; Marzouk, Fadi; Chulak-Oglu, Kyrylo; Bennett, Deirdre; Tierney, Paul; O'Keeffe, Gerard W

    2016-01-01

    Anatomy remains a cornerstone of medical education despite challenges that have seen a significant reduction in contact hours over recent decades; however, the rise of the "YouTube Generation" or "Generation Connected" (Gen C), offers new possibilities for anatomy education. Gen C, which consists of 80% Millennials, actively interact with social media and integrate it into their education experience. Most are willing to merge their online presence with their degree programs by engaging with course materials and sharing their knowledge freely using these platforms. This integration of social media into undergraduate learning, and the attitudes and mindset of Gen C, who routinely creates and publishes blogs, podcasts, and videos online, has changed traditional learning approaches and the student/teacher relationship. To gauge this, second year undergraduate medical and radiation therapy students (n = 73) were surveyed regarding their use of online social media in relation to anatomy learning. The vast majority of students had employed web-based platforms to source information with 78% using YouTube as their primary source of anatomy-related video clips. These findings suggest that the academic anatomy community may find value in the integration of social media into blended learning approaches in anatomy programs. This will ensure continued connection with the YouTube generation of students while also allowing for academic and ethical oversight regarding the use of online video clips whose provenance may not otherwise be known. © 2015 American Association of Anatomists.

  8. Design, Fabrication, and Characterization of Carbon Nanotube Field Emission Devices for Advanced Applications

    NASA Astrophysics Data System (ADS)

    Radauscher, Erich Justin

    Carbon nanotubes (CNTs) have recently emerged as promising candidates for electron field emission (FE) cathodes in integrated FE devices. These nanostructured carbon materials possess exceptional properties and their synthesis can be thoroughly controlled. Their integration into advanced electronic devices, including not only FE cathodes, but sensors, energy storage devices, and circuit components, has seen rapid growth in recent years. The results of the studies presented here demonstrate that the CNT field emitter is an excellent candidate for next generation vacuum microelectronics and related electron emission devices in several advanced applications. The work presented in this study addresses determining factors that currently confine the performance and application of CNT-FE devices. Characterization studies and improvements to the FE properties of CNTs, along with Micro-Electro-Mechanical Systems (MEMS) design and fabrication, were utilized in achieving these goals. Important performance limiting parameters, including emitter lifetime and failure from poor substrate adhesion, are examined. The compatibility and integration of CNT emitters with the governing MEMS substrate (i.e., polycrystalline silicon), and its impact on these performance limiting parameters, are reported. CNT growth mechanisms and kinetics were investigated and compared to silicon (100) to improve the design of CNT emitter integrated MEMS based electronic devices, specifically in vacuum microelectronic device (VMD) applications. Improved growth allowed for design and development of novel cold-cathode FE devices utilizing CNT field emitters. A chemical ionization (CI) source based on a CNT-FE electron source was developed and evaluated in a commercial desktop mass spectrometer for explosives trace detection. This work demonstrated the first reported use of a CNT-based ion source capable of collecting CI mass spectra. The CNT-FE source demonstrated low power requirements, pulsing capabilities, and average lifetimes of over 320 hours when operated in constant emission mode under elevated pressures, without sacrificing performance. Additionally, a novel packaged ion source for miniature mass spectrometer applications using CNT emitters, a MEMS based Nier-type geometry, and a Low Temperature Cofired Ceramic (LTCC) 3D scaffold with integrated ion optics were developed and characterized. While previous research has shown other devices capable of collecting ion currents on chip, this LTCC packaged MEMS micro-ion source demonstrated improvements in energy and angular dispersion as well as the ability to direct the ions out of the packaged source and towards a mass analyzer. Simulations and experimental design, fabrication, and characterization were used to make these improvements. Finally, novel CNT-FE devices were developed to investigate their potential to perform as active circuit elements in VMD circuits. Difficulty integrating devices at micron-scales has hindered the use of vacuum electronic devices in integrated circuits, despite the unique advantages they offer in select applications. Using a combination of particle trajectory simulation and experimental characterization, device performance in an integrated platform was investigated. Solutions to the difficulties in operating multiple devices in close proximity and enhancing electron transmission (i.e., reducing grid loss) are explored in detail. 
A systematic and iterative process was used to develop isolation structures that reduced crosstalk between neighboring devices from 15% on average, to nearly zero. Innovative geometries and a new operational mode reduced grid loss by nearly threefold, thereby improving transmission of the emitted cathode current to the anode from 25% in initial designs to 70% on average. These performance enhancements are important enablers for larger scale integration and for the realization of complex vacuum microelectronic circuits.

  9. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives by developing an open source and public domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas where several distributed and physically-based simulation codes are virtually integrated. The choice of such codes was supported by the results of a survey performed by means of questionnaires distributed to 14 case-study FREEWAT project partners and several stakeholders. This was performed in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of all the invited entities and institutions from several EU and non-EU Countries expressed their interest in contributing to the survey. Most of them were research institutions, government and geoenvironmental companies, and river basin authorities. The result of the questionnaire provided a spectrum of needs and priorities of partners/stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified were related to ground- and surface-water quality, sustainable water management, interaction between groundwater and surface-water bodies, and the design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial/open-source software tools to address needs and priorities, and regarding the needs to address specific water-related processes/problems.

  10. An integrated water system model considering hydrological and biogeochemical processes at basin scale: model construction and application

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.

    2014-08-01

    Integrated water system modeling is a reasonable approach to provide scientific understanding of, and possible solutions to, the severe water crisis faced over the world and to promote the implementation of integrated river basin management. Such a modeling practice is becoming more feasible nowadays due to better computing facilities and available data sources. In this study, the process-oriented water system model (HEXM) is developed by integrating multiple water-related processes including hydrology, biogeochemistry, environment and ecology, as well as the interference of human activities. The model was tested in the Shaying River Catchment, the largest, highly regulated and heavily polluted tributary of the Huai River Basin in China. The results show that HEXM is well integrated, with good performance on the key water-related components of this complex catchment. The simulated daily runoff series at all the regulated and less-regulated stations match observations, especially for the high and low flow events. The average values of the correlation coefficient and coefficient of efficiency are 0.81 and 0.63, respectively. The dynamics of observed daily ammonia-nitrogen (NH4-N) concentration, an important index for assessing water environmental quality in China, are well captured, with an average correlation coefficient of 0.66. Furthermore, the spatial patterns of nonpoint source pollutant load and grain yield are also simulated properly, and the outputs agree well with statistics at the city scale. Our model shows clearly superior performance in both calibration and validation in comparison with the widely used SWAT model. This model is expected to provide a strong reference for water system modeling in complex basins, a scientific foundation for the implementation of integrated river basin management worldwide, and technical guidance for the reasonable regulation of dams and sluices and for environmental improvement in river basins.
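
    For reference, the two skill scores quoted above, computed in Python for a toy pair of daily series; "coefficient of efficiency" is assumed here to mean the Nash-Sutcliffe efficiency, and the numbers are made up.

      # Correlation coefficient and Nash-Sutcliffe efficiency for a toy observed/simulated pair.
      import numpy as np

      observed = np.array([12.0, 30.0, 55.0, 41.0, 18.0, 9.0, 7.5, 22.0])
      simulated = np.array([10.5, 34.0, 49.0, 44.0, 20.0, 11.0, 6.0, 19.5])

      r = np.corrcoef(observed, simulated)[0, 1]
      nse = 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      print(f"correlation coefficient = {r:.2f}, Nash-Sutcliffe efficiency = {nse:.2f}")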

  11. A functional magnetic resonance imaging investigation of short-term source and item memory for negative pictures.

    PubMed

    Mitchell, Karen J; Mather, Mara; Johnson, Marcia K; Raye, Carol L; Greene, Erich J

    2006-10-02

    We investigated the hypothesis that arousal recruits attention to item information, thereby disrupting working memory processes that help bind items to context. Using functional magnetic resonance imaging, we compared brain activity when participants remembered negative or neutral picture-location conjunctions (source memory) versus pictures only. Behaviorally, negative trials showed disruption of short-term source, but not picture, memory; long-term picture recognition memory was better for negative than for neutral pictures. Activity in areas involved in working memory and feature integration (precentral gyrus and its intersect with superior temporal gyrus) was attenuated on negative compared with neutral source trials relative to picture-only trials. Visual processing areas (middle occipital and lingual gyri) showed greater activity for negative than for neutral trials, especially on picture-only trials.

  12. Radiometric calibration of an airborne multispectral scanner. [of Thematic Mapper Simulator

    NASA Technical Reports Server (NTRS)

    Markham, Brian L.; Ahmad, Suraiya P.; Jackson, Ray D.; Moran, M. S.; Biggar, Stuart F.; Gellman, David I.; Slater, Philip N.

    1991-01-01

    The absolute radiometric calibration of the NS001 Thematic Mapper Simulator reflective channels was examined based on laboratory tests and in-flight comparisons to ground measurements. The NS001 data are calibrated in flight by reference to the NS001 internal integrating sphere source. This source's power supply or monitoring circuitry exhibited greater instability in flight during 1988-1989 than in the laboratory. Extrapolating laboratory behavior to in-flight data resulted in 7-20 percent radiance errors relative to ground measurements and atmospheric modeling. Assuming constancy of the source's output between the laboratory and flight resulted in generally smaller errors. Upgrades to the source's power supply and monitoring circuitry in 1990 improved its in-flight stability, though in-flight ground-reflectance-based calibration tests have not yet been performed.
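
    A minimal sketch of the linear calibration implied above: digital numbers (DN) are converted to radiance with a gain and offset tied to the internal integrating-sphere reference, so an untracked drift in the source's output propagates directly into the reported radiance. Coefficients, units and DN values are assumptions for illustration.

      # Linear DN-to-radiance conversion and the effect of an untracked 10% gain drift.
      def dn_to_radiance(dn, gain, offset):
          """Radiance (units assumed, e.g. W m^-2 sr^-1 um^-1) from a raw digital number."""
          return gain * (dn - offset)

      gain_lab, offset_lab = 0.045, 3.0          # laboratory-derived coefficients (hypothetical)
      dn = 812

      l_lab = dn_to_radiance(dn, gain_lab, offset_lab)
      l_drifted = dn_to_radiance(dn, gain_lab * 1.10, offset_lab)   # source output drifts 10% in flight
      # Applying the stale laboratory gain when the source has drifted 10% gives a ~10%
      # radiance error, of the same order as the 7-20% errors reported above.
      print(l_lab, l_drifted, f"{100 * (l_drifted - l_lab) / l_lab:.1f}% difference")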

  13. Integrative Literature Review: Ascertaining Discharge Readiness for Pediatrics After Anesthesia.

    PubMed

    Whitley, Deborah R

    2016-02-01

    Unplanned hospital readmissions after the administration of general anesthesia for ambulatory procedures may contribute to loss of reimbursement and assessment of financial penalties. Pediatric patients represent a unique anesthetic risk. The purpose of this integrative literature review was to ascertain specific criteria used to evaluate discharge readiness for pediatric patients after anesthesia. This study is an integrative review of literature. An integrative literature search was conducted and included literature sources dated January 2008 to November 2013. Key words included pediatric, anesthesia, discharge, criteria, standards, assessment, recovery, postoperative, postanesthesia, scale, score, outpatient, and ambulatory. Eleven literature sources that contributed significantly to the research question were identified. Levels of evidence included three systematic reviews, one randomized controlled trial, three cohort studies, two case series, and two expert opinions. This integrative literature review revealed evidence-based discharge criteria endorsing home readiness for postanesthesia pediatric patients should incorporate consideration for physiological baselines, professional judgment with regard to infant consciousness, and professional practice standards/guidelines. Additionally, identifying and ensuring discharge to a competent adult was considered imperative. Nurses should be aware that frequently used anesthesia scoring systems originated in the 1970s, and this review was unable to locate current literature examining the reliability and validity of their use in conjunction with modern anesthesia-related health care practices. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  14. Proceedings of the 24th Seismic Research Review: Nuclear Explosion Monitoring: Innovation and Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, N. Jill

    2002-09-17

    These proceedings contain papers prepared for the 24th Seismic Research Review: Nuclear Explosion Monitoring: Innovation and Integration, held 17-19 September, 2002 in Ponte Vedra Beach, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  15. Integrated genome browser: visual analytics platform for genomics.

    PubMed

    Freese, Nowlan H; Norris, David C; Loraine, Ann E

    2016-07-15

    Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and is freely available from http://bioviz.org/igb. Contact: aloraine@uncc.edu. © The Author 2016. Published by Oxford University Press.

  16. A compact, all-optical, THz wave generator based on self-modulation in a slab photonic crystal waveguide with a single sub-nanometer graphene layer.

    PubMed

    Asadi, R; Ouyang, Z; Mohammd, M M

    2015-07-14

    We design a compact, all-optical THz wave generator based on self-modulation in a 1-D slab photonic crystal (PhC) waveguide with a single sub-nanometer graphene layer, using the enhanced nonlinearity of graphene. It is shown that, at the bandgap edge of the higher bands of a 1-D slab PhC, a single sub-nanometer graphene layer suffices to obtain a compact self-intensity modulator with a high modulation factor (about 0.98 percent) at a high frequency (about 0.6 THz) and low threshold intensity (about 15 MW per square centimeter), and further a compact, all-optical THz wave generator by integrating the self-modulator with a THz photodiode or photonic mixer. Such a THz source is expected to have a relatively high efficiency compared with conventional sources based on optical methods. The proposed THz source can find wide applications in THz science and technology, e.g., in THz imaging, THz sensors and detectors, THz communication systems, and THz optical integrated logic circuits.

  17. Complex within complex: integrative taxonomy reveals hidden diversity in Cicadetta brevipennis (Hemiptera: Cicadidae) and unexpected relationships with a song divergent relative

    USDA-ARS?s Scientific Manuscript database

    Multiple sources of data in combination are essential for species delimitation and classification of difficult taxonomic groups. Here we investigate a cicada taxon with unusual cryptic diversity and we attempt to resolve seemingly contradictory data sets. Cicada songs act as species-specific premati...

  18. On the sources of vegetation activity variation, and their relation with water balance in Mexico

    Treesearch

    F. Mora; L.R. Iverson

    1998-01-01

    Natural landscape surface processes are largely controlled by the relationship between climate and vegetation. Water balance integrates the effects of climate on patterns of vegetation distribution and productivity, and for that reason, functional relationships can be established using water balance variables as predictors of vegetation response. In this study, we...

  19. Prospects for phenological monitoring in an arid southwestern U.S. rangeland using field observations with hyperspatial and moderate resolution imagery

    USDA-ARS?s Scientific Manuscript database

    Relating field observations of plant phenological events to remotely sensed depictions of land surface phenology remains a challenge to the vertical integration of data from disparate sources. This research conducted at the Jornada Basin Long-Term Ecological Research site in southern New Mexico cap...

  20. Prospects for phenological monitoring in an arid southwestern U.S. rangeland using field observations with hyperspatial and moderate resolution imagery

    USDA-ARS?s Scientific Manuscript database

    Relating field observations of plant phenological events to remotely sensed depictions of land surface phenology remains a challenge to the vertical integration of data from disparate sources. This research conducted at the Jornada Basin Long-Term Ecological Research site in southern New Mexico capit...

  1. Epistemic Beliefs and Their Relation to Multiple-Text Comprehension: A Norwegian Program of Research

    ERIC Educational Resources Information Center

    Ferguson, Leila E.

    2015-01-01

    Nowadays, students are required to use multiple information sources to complete tasks, both in and out of school. The beliefs that students hold about knowledge and knowing--their epistemic beliefs-- have been linked to successful integration of information across multiple texts. Framed by literature on epistemic belief research from an…

  2. Alternative forms of the Spencer-Fano equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inokuti, M.; Kowari, K.

    We point out a relation between the electron degradation spectra determined by two differing cross-section sets but subject to the same source. The relation takes the form of a Fredholm integral equation of the second kind and may be viewed as an alternative form of the Spencer-Fano equation. The relation leads to a precise definition of the partial degradation spectra of electrons of successive generations. It also provides a basis for the perturbation theory by which one calculates the effects of small changes in cross-section data upon the electron degradation spectrum.
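
    For orientation, a Fredholm integral equation of the second kind for an unknown function y(E), with source term s(E) and kernel K(E, E'), has the generic form shown below (illustrative notation only, not the specific relation derived in the paper):

        y(E) \;=\; s(E) + \int_{a}^{b} K(E, E')\, y(E')\, \mathrm{d}E'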

  3. Darwin Core: An Evolving Community-Developed Biodiversity Data Standard

    PubMed Central

    Wieczorek, John; Bloom, David; Guralnick, Robert; Blum, Stan; Döring, Markus; Giovanni, Renato; Robertson, Tim; Vieglais, David

    2012-01-01

    Biodiversity data derive from myriad sources stored in various formats on many distinct hardware and software platforms. An essential step towards understanding global patterns of biodiversity is to provide a standardized view of these heterogeneous data sources to improve interoperability. Fundamental to this advance are definitions of common terms. This paper describes the evolution and development of Darwin Core, a data standard for publishing and integrating biodiversity information. We focus on the categories of terms that define the standard, differences between simple and relational Darwin Core, how the standard has been implemented, and the community processes that are essential for maintenance and growth of the standard. We present case-study extensions of the Darwin Core into new research communities, including metagenomics and genetic resources. We close by showing how Darwin Core records are integrated to create new knowledge products documenting species distributions and changes due to environmental perturbations. PMID:22238640

  4. The GALAXIE all-optical FEL project

    NASA Astrophysics Data System (ADS)

    Rosenzweig, J. B.; Arab, E.; Andonian, G.; Cahill, A.; Fitzmorris, K.; Fukusawa, A.; Hoang, P.; Jovanovic, I.; Marcus, G.; Marinelli, A.; Murokh, A.; Musumeci, P.; Naranjo, B.; O'Shea, B.; O'Shea, F.; Ovodenko, A.; Pogorelsky, I.; Putterman, S.; Roberts, K.; Shumail, M.; Tantawi, S.; Valloni, A.; Yakimenko, V.; Xu, G.

    2012-12-01

    We describe a comprehensive project, funded under the DARPA AXiS program, to develop an all-optical table-top X-ray FEL based on dielectric acceleration and electromagnetic undulators, yielding a compact source of coherent X-rays for medical and related applications. The compactness of this source demands that high field (>GV/m) acceleration and undulation-inducing fields be employed, thus giving rise to the project's acronym: GV/m AcceLerator And X-ray Integrated Experiment (GALAXIE). There are numerous physics and technical hurdles to surmount in this ambitious scenario, and the integrated solutions include: a biharmonic photonic TW structure, 200 micron wavelength electromagnetic undulators, 5 μm laser development, ultra-high brightness magnetized/asymmetric emittance electron beam generation, and SASE FEL operation. We describe the overall design philosophy of the project, the innovative approaches to addressing the challenges presented by the design, and the significant progress towards realization of these approaches in the nine months since project initialization.

  5. The integration of claims to health-care: a programming approach.

    PubMed

    Anand, Paul

    2003-09-01

    The paper contributes to the use of social choice and welfare theory in health economics by developing and applying the integration of claims framework to health-care rationing. Related to Sen's critique of neo-classical welfare economics, the integration of claims framework recognises three primitive sources of claim: consequences, deontology and procedures. A taxonomy is presented with the aid of which it is shown that social welfare functions reflecting these claims individually or together, can be specified. Some of the resulting social choice rules can be regarded as generalisations of health-maximisation and all have normative justifications, though the justifications may not be universally acceptable. The paper shows how non-linear programming can be used to operationalise such choice rules and illustrates their differential impacts on the optimal provision of health-care. Following discussion of relations to the capabilities framework and the context in which rationing occurs, the paper concludes that the integration of claims provides a viable framework for modelling health-care rationing that is technically rigorous, general and tractable, as well as being consistent with relevant moral considerations and citizen preferences.

  6. 75 FR 30159 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ...--Experimental Aircraft Association; ELT--Emergency Locator Transmitter; ES--Extended Squitter; EUROCAE--European...; ...--Security Certification and Accreditation Procedures; SDA--System Design Assurance; SIL--Source Integrity Level... Surveillance Integrity Level; 6. Source Integrity Level (SIL) and System Design Assurance (SDA); 7. Secondary...

  7. Integration of Landsat, Seasat, and other geo-data sources

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Blackwell, R. J.; Stromberg, W. D.

    1979-01-01

    The paper discusses integration of Landsat, Seasat, and other geographic information sources. Mosaicking of radar data and registration of radar to Landsat digital imagery are described, and six types of geophysical data, including gravity and magnetic measurements, are integrated and analyzed using image processing techniques.

  8. Organic aerosol components derived from 25 AMS data sets across Europe using a consistent ME-2 based source apportionment approach

    NASA Astrophysics Data System (ADS)

    Crippa, M.; Canonaco, F.; Lanz, V. A.; Äijälä, M.; Allan, J. D.; Carbone, S.; Capes, G.; Ceburnis, D.; Dall'Osto, M.; Day, D. A.; DeCarlo, P. F.; Ehn, M.; Eriksson, A.; Freney, E.; Hildebrandt Ruiz, L.; Hillamo, R.; Jimenez, J. L.; Junninen, H.; Kiendler-Scharr, A.; Kortelainen, A.-M.; Kulmala, M.; Laaksonen, A.; Mensah, A. A.; Mohr, C.; Nemitz, E.; O'Dowd, C.; Ovadnevaite, J.; Pandis, S. N.; Petäjä, T.; Poulain, L.; Saarikoski, S.; Sellegri, K.; Swietlicki, E.; Tiitta, P.; Worsnop, D. R.; Baltensperger, U.; Prévôt, A. S. H.

    2014-06-01

    Organic aerosols (OA) represent one of the major constituents of submicron particulate matter (PM1) and comprise a huge variety of compounds emitted by different sources. Three intensive measurement field campaigns to investigate the aerosol chemical composition all over Europe were carried out within the framework of the European Integrated Project on Aerosol Cloud Climate and Air Quality Interactions (EUCAARI) and the intensive campaigns of European Monitoring and Evaluation Programme (EMEP) during 2008 (May-June and September-October) and 2009 (February-March). In this paper we focus on the identification of the main organic aerosol sources and we define a standardized methodology to perform source apportionment using positive matrix factorization (PMF) with the multilinear engine (ME-2) on Aerodyne aerosol mass spectrometer (AMS) data. Our source apportionment procedure is tested and applied on 25 data sets accounting for two urban, several rural and remote and two high altitude sites; therefore it is likely suitable for the treatment of AMS-related ambient data sets. For most of the sites, four organic components are retrieved, improving significantly previous source apportionment results where only a separation in primary and secondary OA sources was possible. Generally, our solutions include two primary OA sources, i.e. hydrocarbon-like OA (HOA) and biomass burning OA (BBOA) and two secondary OA components, i.e. semi-volatile oxygenated OA (SV-OOA) and low-volatility oxygenated OA (LV-OOA). For specific sites cooking-related (COA) and marine-related sources (MSA) are also separated. Finally, our work provides a large overview of organic aerosol sources in Europe and an interesting set of highly time resolved data for modeling purposes.
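
    As a rough illustration of the factor-analytic idea behind this kind of source apportionment, the Python sketch below uses non-negative matrix factorization from scikit-learn as a simplified stand-in for PMF/ME-2; the real procedure additionally weights the data by measurement uncertainties and can constrain factor profiles a priori, which plain NMF does not. The synthetic matrix stands in for a time series of organic mass spectra and is not real AMS data.

        # Illustrative sketch only: plain non-negative matrix factorization (scikit-learn)
        # as a stand-in for the factor-analytic idea behind PMF/ME-2.
        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical stand-in for a time series of organic mass spectra
        # (rows = time steps, columns = m/z channels); not real AMS data.
        rng = np.random.default_rng(0)
        X = np.abs(rng.normal(size=(500, 120)))

        model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
        G = model.fit_transform(X)   # factor time series (cf. HOA, BBOA, SV-OOA, LV-OOA)
        F = model.components_        # factor mass-spectral profiles

        # Fractional contribution of each factor to the reconstructed organic mass
        contrib = G.sum(axis=0) * F.sum(axis=1)
        print(contrib / contrib.sum())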

  9. DASMiner: discovering and integrating data from DAS sources

    PubMed Central

    2009-01-01

    Background DAS is a widely adopted protocol for providing syntactic interoperability among biological databases. The popularity of DAS is due to a simplified and elegant mechanism for data exchange that consists of sources exposing their RESTful interfaces for data access. As a growing number of DAS services are available for molecular biology resources, there is an incentive to explore this protocol in order to advance data discovery and integration among these resources. Results We developed DASMiner, a Matlab toolkit for querying DAS data sources that enables creation of integrated biological models using the information available in DAS-compliant repositories. DASMiner is composed of a browser application and an API that work together to facilitate gathering of data from different DAS sources, which can be used for creating enriched datasets from multiple sources. The browser is used to formulate queries and navigate data contained in DAS sources. Users can execute queries against these sources in an intuitive fashion, without needing to know the specific DAS syntax for the particular source. Using the source's metadata provided by the DAS Registry, the browser's layout adapts to expose only the set of commands and coordinate systems supported by the specific source. For this reason, the browser can interrogate any DAS source, independently of the type of data being served. The API component of DASMiner may be used for programmatic access of DAS sources by programs in Matlab. Once the desired data is found during navigation, the query is exported in the format of an API call to be used within any Matlab application. We illustrate the use of DASMiner by creating integrative models of histone modification maps and protein-protein interaction networks. These enriched datasets were built by retrieving and integrating distributed genomic and proteomic DAS sources using the API. Conclusion Support for the DAS protocol allows hundreds of molecular biology databases to be treated as a federated, online collection of resources. DASMiner enables full exploration of these resources, and can be used to deploy applications and create integrated views of biological systems using the information deposited in DAS repositories. PMID:19919683
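
    DASMiner itself is a Matlab toolkit, but the underlying DAS calls are plain HTTP requests. The Python sketch below shows the general shape of a raw DAS 1.x "features" query; the server URL and segment are placeholders, not endpoints mentioned in the article.

        # Sketch of a raw DAS 1.x "features" request; the server URL and segment are
        # placeholders, and only two attributes of each returned feature are printed.
        import requests
        import xml.etree.ElementTree as ET

        base = "http://example.org/das/hg19"          # hypothetical DAS data source
        params = {"segment": "chr1:1000000,1005000"}  # reference-sequence region

        resp = requests.get(f"{base}/features", params=params, timeout=30)
        resp.raise_for_status()

        root = ET.fromstring(resp.content)            # DAS GFF XML document
        for feature in root.iter("FEATURE"):
            print(feature.get("id"), feature.get("label"))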

  10. Flat-Spectrum Radio Sources as Likely Counterparts of Unidentified INTEGRAL Sources (Research Note)

    NASA Technical Reports Server (NTRS)

    Molina, M.; Landi, R.; Bassani, L.; Malizia, A.; Stephen, J. B.; Bazzano, A.; Bird, A. J.; Gehrels, N.

    2012-01-01

    Many sources in the fourth INTEGRAL/IBIS catalogue are still unidentified since they lack an optical counterpart. An important tool that can help in identifying and classifying these sources is the cross-correlation with radio catalogues, which are very sensitive and positionally accurate. Moreover, the radio properties of a source, such as the spectrum or morphology, could provide further insight into its nature. In particular, flat-spectrum radio sources at high Galactic latitudes are likely to be AGN, possibly associated to a blazar or to the compact core of a radio galaxy. Here we present a small sample of 6 sources extracted from the fourth INTEGRAL/IBIS catalogue that are still unidentified or unclassified, but which are very likely associated with a bright, flat-spectrum radio object. To confirm the association and to study the source X-ray spectral parameters, we performed X-ray follow-up observations with Swift/XRT of all objects. We report in this note the overall results obtained from this search and discuss the nature of each individual INTEGRAL source. We find that 5 of the 6 radio associations are also detected in X-rays; furthermore, in 3 cases they are the only counterpart found. More specifically, IGR J06073-0024 is a flat-spectrum radio quasar at z = 1.08, IGR J14488-4008 is a newly discovered radio galaxy, while IGR J18129-0649 is an AGN of a still unknown type. The nature of two sources (IGR J07225-3810 and IGR J19386-4653) is less well defined, since in both cases we find another X-ray source in the INTEGRAL error circle; nevertheless, the flat-spectrum radio source, likely to be a radio loud AGN, remains a viable and, in fact, a more convincing association in both cases. Only for the last object (IGR J11544-7618) could we not find any convincing counterpart since the radio association is not an X-ray emitter, while the only X-ray source seen in the field is a G star and therefore unlikely to produce the persistent emission seen by INTEGRAL.

  11. The use of the virtual source technique in computing scattering from periodic ocean surfaces.

    PubMed

    Abawi, Ahmad T

    2011-08-01

    In this paper the virtual source technique is used to compute scattering of a plane wave from a periodic ocean surface. The virtual source technique is a method of imposing boundary conditions using virtual sources, with initially unknown complex amplitudes. These amplitudes are then determined by applying the boundary conditions. The fields due to these virtual sources are given by the environment Green's function. In principle, satisfying boundary conditions on an infinite surface requires an infinite number of sources. In this paper, the periodic nature of the surface is employed to populate a single period of the surface with virtual sources, and m surface periods are added to obtain scattering from the entire surface. The use of an accelerated sum formula makes it possible to obtain a convergent sum with a relatively small number of terms (∼40). The accuracy of the technique is verified by comparing its results with those obtained using the integral equation technique.
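
    One way to picture the periodic summation described above (schematic notation chosen here for illustration, not the paper's): the field scattered toward a receiver at r is approximated by summing the virtual-source contributions from one period, replicated over 2M + 1 periods,

        p_{s}(\mathbf{r}) \;\approx\; \sum_{m=-M}^{M} \sum_{j} a_{j}\, G\!\left(\mathbf{r},\, \mathbf{r}_{j} + m\,\Lambda\,\hat{\mathbf{x}}\right)

    where a_j are the amplitudes of the virtual sources populating a single surface period, Λ is the surface period, and G is the environment Green's function; the accelerated sum formula is what keeps the number of retained terms small (about 40).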

  12. Content Integration across Multiple Documents Reduces Memory for Sources

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; McCabe, Rebecca M.; Daniel, Frances

    2016-01-01

    The current experiments systematically examined semantic content integration as a mechanism for explaining source inattention and forgetting when reading-to-remember multiple texts. For all 3 experiments, degree of semantic overlap was manipulated amongst messages provided by various information sources. In Experiment 1, readers' source…

  13. Immigrant community integration in world cities

    PubMed Central

    Lamanna, Fabio; Lenormand, Maxime; Salas-Olmedo, María Henar; Romanillos, Gustavo; Gonçalves, Bruno

    2018-01-01

    As a consequence of the accelerated globalization process, today major cities all over the world are characterized by an increasing multiculturalism. The integration of immigrant communities may be affected by social polarization and spatial segregation. How are these dynamics evolving over time? To what extent the different policies launched to tackle these problems are working? These are critical questions traditionally addressed by studies based on surveys and census data. Such sources are safe to avoid spurious biases, but the data collection becomes an intensive and rather expensive work. Here, we conduct a comprehensive study on immigrant integration in 53 world cities by introducing an innovative approach: an analysis of the spatio-temporal communication patterns of immigrant and local communities based on language detection in Twitter and on novel metrics of spatial integration. We quantify the Power of Integration of cities –their capacity to spatially integrate diverse cultures– and characterize the relations between different cultures when acting as hosts or immigrants. PMID:29538383

  14. Integrating the human element into the systems engineering process and MBSE methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tadros, Michael Samir

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources are organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and Models-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  15. Warehousing re-annotated cancer genes for biomarker meta-analysis.

    PubMed

    Orsini, M; Travaglione, A; Capobianco, E

    2013-07-01

    Translational research in cancer genomics assigns a fundamental role to bioinformatics in support of candidate gene prioritization with regard to both biomarker discovery and target identification for drug development. Efforts in both such directions rely on the existence and constant update of large repositories of gene expression data and omics records obtained from a variety of experiments. Users who interactively interrogate such repositories may have problems in retrieving sample fields that present limited associated information, due for instance to incomplete entries or sometimes unusable files. Cancer-specific data sources present similar problems. Given that source integration usually improves data quality, one of the objectives is keeping the computational complexity sufficiently low to allow an optimal assimilation and mining of all the information. In particular, the scope of integrating intraomics data can be to improve the exploration of gene co-expression landscapes, while the scope of integrating interomics sources can be that of establishing genotype-phenotype associations. Both integrations are relevant to cancer biomarker meta-analysis, as the proposed study demonstrates. Our approach is based on re-annotating cancer-specific data available at the EBI's ArrayExpress repository and building a data warehouse aimed to biomarker discovery and validation studies. Cancer genes are organized by tissue with biomedical and clinical evidences combined to increase reproducibility and consistency of results. For better comparative evaluation, multiple queries have been designed to efficiently address all types of experiments and platforms, and allow for retrieval of sample-related information, such as cell line, disease state and clinical aspects. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

    In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data.

  17. Discourse comprehension in L2: Making sense of what is not explicitly said.

    PubMed

    Foucart, Alice; Romero-Rivas, Carlos; Gort, Bernharda Lottie; Costa, Albert

    2016-12-01

    Using ERPs, we tested whether L2 speakers can integrate multiple sources of information (e.g., semantic, pragmatic information) during discourse comprehension. We presented native speakers and L2 speakers with three-sentence scenarios in which the final sentence was highly causally related, intermediately related, or causally unrelated to its context; its interpretation therefore required simple or complex inferences. Native speakers revealed a gradual N400-like effect, larger in the causally unrelated condition than in the highly related condition, and falling in-between in the intermediately related condition, replicating previous results. In the crucial intermediately related condition, L2 speakers behaved like native speakers, however, showing extra processing in a later time-window. Overall, the results show that, when reading, L2 speakers are able to process information from the local context and prior information (e.g., world knowledge) to build global coherence, suggesting that they process different sources of information to make inferences online during discourse comprehension, like native speakers. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Social network extraction based on Web: 3. the integrated superficial method

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although it relies only on the limited information disclosed by search engines (hit counts, snippets, and URL addresses of web pages), the integrated extraction method produces a social network that is not only trustworthy but also enriched. Unintegrated extraction methods may produce social networks without explanation, with poor supplemental information, or laden with surmise and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network that reaches the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.

  19. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. although it was originally designed for programming competitions, Mooshak has also…

  20. What it takes to get proactive: An integrative multilevel model of the antecedents of personal initiative.

    PubMed

    Hong, Ying; Liao, Hui; Raub, Steffen; Han, Joo Hun

    2016-05-01

    Building upon and extending Parker, Bindl, and Strauss's (2010) theory of proactive motivation, we develop an integrated, multilevel model to examine how contextual factors shape employees' proactive motivational states and, through these proactive motivational states, influence their personal initiative behavior. Using data from a sample of hotels collected from 3 sources and over 2 time periods, we show that establishment-level initiative-enhancing human resource management (HRM) systems were positively related to departmental initiative climate, which was positively related to employee personal initiative through employee role-breadth self-efficacy. Further, department-level empowering leadership was positively related to initiative climate only when initiative-enhancing HRM systems were low. These findings offer interesting implications for research on personal initiative and for the management of employee proactivity in organizations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. The excitation of long period seismic waves by a source spanning a structural discontinuity

    NASA Astrophysics Data System (ADS)

    Woodhouse, J. H.

    Simple theoretical results are obtained for the excitation of seismic waves by an indigenous seismic source in the case that the source volume is intersected by a structural discontinuity. In the long wavelength approximation the seismic radiation is identical to that of a point source placed on one side of the discontinuity or of a different point source placed on the other side. The moment tensors of these two equivalent sources are related by a specific linear transformation and may differ appreciably both in magnitude and geometry. Either of these sources could be obtained by linear inversion of seismic data but the physical interpretation is more complicated than in the usual case. A source which involved no volume change would, for example, yield an isotropic component if, during inversion, it were assumed to lie on the wrong side of the discontinuity. The problem of determining the true moment tensor of the source is indeterminate unless further assumptions are made about the stress glut distribution; one way to resolve this indeterminacy is to assume proportionality between the integrated stress glut on each side of the discontinuity.
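
    Schematically, the relation between the two equivalent point-source moment tensors can be written with a fourth-rank linear operator (index notation with summation over repeated indices; the explicit form of T, which depends on the elastic contrast across the discontinuity, is derived in the paper and not reproduced here):

        M^{(2)}_{ij} \;=\; T_{ijkl}\, M^{(1)}_{kl}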

  2. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
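
    A minimal Python sketch of the recombination step described above: each word of a multi-word source term is replaced by its synonyms from a general synonym dictionary, and the per-word alternatives are recombined into an expanded pool of candidate matches. The tiny dictionary and the example term are hypothetical, not taken from the UMLS-derived dictionary used in the paper.

        # Minimal sketch of piecewise-synonym expansion: split a multi-word source term,
        # substitute per-word synonyms, and recombine the pieces into candidate matches.
        from itertools import product

        synonyms = {
            "kidney": ["kidney", "renal"],
            "failure": ["failure", "insufficiency"],
        }

        def piecewise_candidates(term: str) -> set[str]:
            words = term.lower().split()
            options = [synonyms.get(w, [w]) for w in words]   # fall back to the word itself
            return {" ".join(combo) for combo in product(*options)}

        print(piecewise_candidates("acute kidney failure"))
        # e.g. {'acute kidney failure', 'acute renal failure', 'acute renal insufficiency', ...}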

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The open source Project Haystack initiative defines metadata and communication standards related to data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.
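
    For context, the HaystackRuby gem wraps HTTP calls of the kind sketched below. The sketch uses Python only to show the shape of a raw Project Haystack "read" operation; the base URL is a placeholder, the filter is deliberately simple, and the JSON grid handling is simplified.

        # Sketch of the raw HTTP call that a Haystack client wraps: the "read" op with a
        # filter expression. The base URL is a placeholder.
        import requests

        base = "http://example.org/api/demo"      # hypothetical Haystack-compliant server
        headers = {"Accept": "application/json"}  # ask for the JSON grid encoding

        resp = requests.get(f"{base}/read", params={"filter": "site"},
                            headers=headers, timeout=30)
        resp.raise_for_status()

        grid = resp.json()
        for row in grid.get("rows", []):          # each row describes one matching entity
            print(row.get("id"), row.get("dis"))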

  4. Multisource geological data mining and its utilization of uranium resources exploration

    NASA Astrophysics Data System (ADS)

    Zhang, Jie-lin

    2009-10-01

    Nuclear energy, as one of the clean energy sources, plays an important role in economic development in China, and according to the national long-term development strategy, many more nuclear power plants will be built in the next few years, so uranium resources exploration faces a great challenge. Research and practice on mineral exploration demonstrate that utilizing modern Earth Observe System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting, integrated with field remote sensing geological survey. Multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high spatial resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data, and related data mining methods have been developed, such as data fusion of optical data and Radarsat imagery, and information integration of remote sensing and geophysical data. Based on the above approaches, the multi-geoscience information of uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and some predicting areas have been located.

  5. Multiscale Metabolic Modeling: Dynamic Flux Balance Analysis on a Whole-Plant Scale

    PubMed Central

    Grafahrend-Belau, Eva; Junker, Astrid; Eschenröder, André; Müller, Johannes; Schreiber, Falk; Junker, Björn H.

    2013-01-01

    Plant metabolism is characterized by a unique complexity on the cellular, tissue, and organ levels. On a whole-plant scale, changing source and sink relations accompanying plant development add another level of complexity to metabolism. With the aim of achieving a spatiotemporal resolution of source-sink interactions in crop plant metabolism, a multiscale metabolic modeling (MMM) approach was applied that integrates static organ-specific models with a whole-plant dynamic model. Allowing for a dynamic flux balance analysis on a whole-plant scale, the MMM approach was used to decipher the metabolic behavior of source and sink organs during the generative phase of the barley (Hordeum vulgare) plant. It reveals a sink-to-source shift of the barley stem caused by the senescence-related decrease in leaf source capacity, which is not sufficient to meet the nutrient requirements of sink organs such as the growing seed. The MMM platform represents a novel approach for the in silico analysis of metabolism on a whole-plant level, allowing for a systemic, spatiotemporally resolved understanding of metabolic processes involved in carbon partitioning, thus providing a novel tool for studying yield stability and crop improvement. PMID:23926077
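
    As background, each static organ-specific model in a flux balance analysis framework is typically solved as a linear program of the standard form below (schematic notation, not taken from the paper): S is the stoichiometric matrix, v the flux vector, c the objective coefficients, and the bounds encode reaction reversibility and exchange limits.

        \max_{v}\; c^{\top} v \quad \text{subject to} \quad S\,v = 0, \qquad v_{\min} \le v \le v_{\max}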

  6. Target/error overlap in jargonaphasia: The case for a one-source model, lexical and non-lexical summation, and the special status of correct responses.

    PubMed

    Olson, Andrew; Halloran, Elizabeth; Romani, Cristina

    2015-12-01

    We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Measurement of the Boltzmann constant by Johnson noise thermometry using a superconducting integrated circuit

    NASA Astrophysics Data System (ADS)

    Urano, C.; Yamazawa, K.; Kaneko, N.-H.

    2017-12-01

    We report on our measurement of the Boltzmann constant by Johnson noise thermometry (JNT) using an integrated quantum voltage noise source (IQVNS) that is fully implemented with superconducting integrated circuit technology. The IQVNS generates calculable pseudo white noise voltages to calibrate the JNT system. The thermal noise of a sensing resistor placed at the temperature of the triple point of water was measured precisely by the IQVNS-based JNT. We accumulated data of more than 429 200 s in total (over 6 d) and used the Akaike information criterion to estimate the fitting frequency range for the quadratic model to calculate the Boltzmann constant. Upon detailed evaluation of the uncertainty components, the experimentally obtained Boltzmann constant was k = 1.380 6436 × 10^-23 J K^-1 with a relative combined uncertainty of 10.22 × 10^-6. The value of k is relatively lower than the CODATA 2014 value (Mohr et al 2016 Rev. Mod. Phys. 88 035009) by 3.56 × 10^-6.
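
    The quoted relative offset can be checked directly against the CODATA 2014 value k_CODATA = 1.380 648 52 × 10^-23 J K^-1:

        \frac{k_{\mathrm{JNT}} - k_{\mathrm{CODATA}}}{k_{\mathrm{CODATA}}}
          = \frac{(1.380\,643\,6 - 1.380\,648\,52)\times 10^{-23}}{1.380\,648\,52\times 10^{-23}}
          \approx -3.6\times 10^{-6}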

  8. An Architecture for Standardized Terminology Services by Wrapping and Integration of Existing Applications

    PubMed Central

    Cornet, Ronald; Prins, Antoon K.

    2003-01-01

    Research on terminology services has resulted in development of applications and definition of standards, but has not yet led to widespread use of (standardized) terminology services in practice. Current terminology services offer functionality both for concept representation and lexical knowledge representation, hampering the possibility of combining the strengths of dedicated (concept and lexical) services. We therefore propose an extensible architecture in which concept-related and lexicon-related components are integrated and made available through a uniform interface. This interface can be extended in order to conform to existing standards, making it possible to use dedicated (third-party) components in a standardized way. As a proof of concept and a reference implementation, a SOAP-based Java implementation of the terminology service is being developed, providing wrappers for Protégé and UMLS Knowledge Source Server. Other systems, such as the Description Logic-based reasoner RACER can be easily integrated by implementation of an appropriate wrapper. PMID:14728158

  9. Platform-dependent optimization considerations for mHealth applications

    NASA Astrophysics Data System (ADS)

    Kaghyan, Sahak; Akopian, David; Sarukhanyan, Hakob

    2015-03-01

    Modern mobile devices contain integrated sensors that enable a multitude of applications in fields such as mobile health (mHealth), entertainment, and sports. Human physical activity monitoring is one such emerging application. A range of challenges relates to activity monitoring tasks, particularly the choice of optimal solutions and architectures for the corresponding mobile software application development. This work addresses mobile computations related to integrated sensors that can be used for activity monitoring, such as accelerometers, gyroscopes, integrated global positioning system (GPS) and WLAN-based positioning; some of these aspects are discussed in this paper. Each of the sensing data sources has its own characteristics, such as specific data formats, data rates and signal acquisition durations, and these specifications affect energy consumption. Energy consumption varies significantly as sensor data acquisition is followed by data analysis including various transformations and signal processing algorithms. This paper will address several aspects of more optimal activity monitoring implementations exploiting state-of-the-art capabilities of modern platforms.

  10. Leveraging Web Services in Providing Efficient Discovery, Retrieval, and Integration of NASA-Sponsored Observations and Predictions

    NASA Astrophysics Data System (ADS)

    Bambacus, M.; Alameh, N.; Cole, M.

    2006-12-01

    The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.

  11. Biomarkers in Transit Reveal the Nature of Fluvial Integration

    NASA Astrophysics Data System (ADS)

    Ponton, C.; West, A.; Feakins, S. J.; Galy, V.

    2013-12-01

    The carbon and hydrogen isotopic compositions of vascular plant leaf waxes are common proxies for hydrologic and vegetation change. Sedimentary archives off major river systems are prime targets for continental paleoclimate studies under the assumption that rivers integrate changes in terrestrial organic carbon (OC) composition over their drainage basin. However, the proportional contribution of sources within the basin (e.g. head waters vs. floodplain) and the transit times of OC through the fluvial system remain largely unknown. This lack of quantifiable information about the proportions and timescales of integration within large catchments poses a challenge for paleoclimate reconstructions. To examine the sources of terrestrial OC eroded and supplied to a river system and the spatial distribution of these sources, we use compound specific isotope analysis (i.e. δ13C, Δ14C, and δD) on plant-derived leaf waxes, filtered from large volumes of river water (20-200L) along a major river system. We selected the Kosñipata River that drains the western flank of the Andes in Peru, joins the Madre de Dios River across the Amazonian floodplain, and ultimately contributes to the Amazon River. Our study encompassed an elevation gradient of >4 km, in an almost entirely forested catchment. Precipitation δD values vary by >50‰ due to the isotopic effect of elevation, a feature we exploit to identify the sources of plant wax n-alkanoic acids transported by the river. We used the δD plant wax values from tributary rivers as source constraints and the main stem values as the integrated signal. In addition, compound specific radiocarbon on individual chain length n-alkanoic acids provides unprecedented detail on the integrated age of these compounds. Preliminary results have established that 1) most of the OC transport occurs in the wet season; 2) total carbon transport in the Madre de Dios is dominated by lowland sources because of the large floodplain area, but initial data suggest that OC from high elevations may be proportionally overrepresented relative to areal extent, with possibly important implications for biomarker isotope composition; 3) timescales of different biomarkers vary considerably; 4) the composition of OC varies downstream and with depth stratification within large rivers. We filtered >1000L of river water in this remote location during the wet season, and are presently replicating that study during the dry season, providing a seasonal comparison of OC transport in this major river system.

  12. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas are facing many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have a high risk value in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have a high risk value in terms of agricultural pollution. Overall, the risk values of north regions close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The results of risk levels indicated that five sources were in the lower risk level (i.e., level II), two in the moderate risk level (i.e., level III), one in the higher risk level (i.e., level IV) and three in the highest risk level (i.e., level V). Also, risks of industrial discharge are higher than those of the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
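
    A minimal Python sketch of the entropy weight step mentioned above (the k-means clustering and set pair analysis stages are not shown); the indicator matrix is hypothetical, with rows as pollution sources and columns as risk indicators.

        # Minimal sketch of the entropy weight method for weighting risk indicators.
        import numpy as np

        X = np.array([          # hypothetical indicator matrix (sources x indicators)
            [0.8, 120.0, 3.0],
            [0.3,  40.0, 1.0],
            [0.5,  90.0, 2.0],
            [0.9, 150.0, 4.0],
        ])

        P = X / X.sum(axis=0)                      # normalize each indicator column
        n = X.shape[0]
        logP = np.where(P > 0, np.log(P), 0.0)     # no zeros here; real data needs a log(0) guard
        E = -(P * logP).sum(axis=0) / np.log(n)    # entropy of each indicator
        w = (1.0 - E) / (1.0 - E).sum()            # higher dispersion -> larger weight
        print(w)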

  13. Fine Particulate Pollution and Source Apportionment in the Urban Centers for Africa, Asia and Latin America

    NASA Astrophysics Data System (ADS)

    Guttikunda, S. K.; Johnson, T. M.; Procee, P.

    2004-12-01

    Fossil fuel combustion for domestic cooking and heating, power generation, industrial processes, and motor vehicles are the primary sources of air pollution in the developing country cities. Over the past twenty years, major advances have been made in understanding the social and economic consequences of air pollution. In both industrialized and developing countries, it has been shown that air pollution from energy combustion has detrimental impacts on human health and the environment. Lack of information on the sectoral contributions to air pollution - especially fine particulates, is one of the typical constraints for an effective integrated urban air quality management program. Without such information, it is difficult, if not impossible, for decision makers to provide policy advice and make informed investment decisions related to air quality improvements in developing countries. This also raises the need for low-cost ways of determining the principal sources of fine PM for a proper planning and decision making. The project objective is to develop and verify a methodology to assess and monitor the sources of PM, using a combination of ground-based monitoring and source apportionment techniques. This presentation will focus on four general tasks: (1) Review of the science and current activities in the combined use of monitoring data and modeling for better understanding of PM pollution. (2) Review of recent advances in atmospheric source apportionment techniques (e.g., principal component analysis, organic markers, source-receptor modeling techniques). (3) Develop a general methodology to use integrated top-down and bottom-up datasets. (4) Review of a series of current case studies from Africa, Asia and Latin America and the methodologies applied to assess the air pollution and its sources.

  14. Research considerations when studying disasters.

    PubMed

    Cox, Catherine Wilson

    2008-03-01

    Nurses play an integral role during disasters because they are called upon more than any other health care professional during disaster response efforts; consequently, nurse researchers are interested in studying the issues that impact nurses in the aftermath of a disaster. This article offers research considerations for nurse scientists when developing proposals related to disaster research and identifies resources and possible funding sources for their projects.

  15. Social Studies Pre-Service Teachers' Views on the EU Membership Process: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Gençtürk, Ebru

    2015-01-01

    One of the general purposes of Social Studies is to integrate individuals with the social life by providing accurate knowledge and skills about their environment and society. As well as the role of Social Studies in raising consciousness on EU relations, Social Studies teachers' views about EU membership and the sources of these views are…

  16. SCALEUS: Semantic Web Services Integration for Biomedical Applications.

    PubMed

    Sernadela, Pedro; González-Castro, Lorena; Oliveira, José Luís

    2017-04-01

    In recent years, we have witnessed an explosion of biological data resulting largely from the demands of life science research. The vast majority of these data are freely available via diverse bioinformatics platforms, including relational databases and conventional keyword search applications. This type of approach has achieved great results in the last few years, but proved to be infeasible when information needs to be combined or shared among different and scattered sources. During recent years, many of these data distribution challenges have been solved with the adoption of the semantic web. Despite the evident benefits of this technology, its adoption introduced new challenges related to the migration process from existing systems to the semantic level. To facilitate this transition, we have developed Scaleus, a semantic web migration tool that can be deployed on top of traditional systems in order to bring knowledge, inference rules, and query federation to the existing data. Targeted at the biomedical domain, this web-based platform offers, in a single package, straightforward data integration and semantic web services that help developers and researchers in creating new semantically enhanced information systems. SCALEUS is available as open source at http://bioinformatics-ua.github.io/scaleus/ .

  17. Automated Ontology Alignment with Fuselets for Community of Interest (COI) Integration

    DTIC Science & Technology

    2008-09-01

    Figure 7 - Federated Search Example. Figure 8 - Federated Search Example Revisited. ...integrating information from various sources through a single query. This is the traditional federated search problem, where the sources don't... For the data sources in the graphic above, the ontologies align in a fairly straightforward manner.

  18. A review on automated sorting of source-separated municipal solid waste for recycling.

    PubMed

    Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul

    2017-02-01

    A crucial prerequisite for recycling forming an integral part of municipal solid waste (MSW) management is sorting of useful materials from source-separated MSW. Researchers have been exploring automated sorting techniques to improve the overall efficiency of recycling process. This paper reviews recent advances in physical processes, sensors, and actuators used as well as control and autonomy related issues in the area of automated sorting and recycling of source-separated MSW. We believe that this paper will provide a comprehensive overview of the state of the art and will help future system designers in the area. In this paper, we also present research challenges in the field of automated waste sorting and recycling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. The use of composite fingerprints to quantify sediment sources in a wildfire impacted landscape, Alberta, Canada.

    PubMed

    Stone, M; Collins, A L; Silins, U; Emelko, M B; Zhang, Y S

    2014-03-01

    There is increasing global concern regarding the impacts of large scale land disturbance by wildfire on a wide range of water and related ecological services. This study explores the impact of the 2003 Lost Creek wildfire in the Crowsnest River basin, Alberta, Canada on regional scale sediment sources using a tracing approach. A composite geochemical fingerprinting procedure was used to apportion the sediment efflux among three key spatial sediment sources: 1) unburned (reference) 2) burned and 3) burned sub-basins that were subsequently salvage logged. Spatial sediment sources were characterized by collecting time-integrated suspended sediment samples using passive devices during the entire ice free periods in 2009 and 2010. The tracing procedure combines the Kruskal-Wallis H-test, principal component analysis and genetic-algorithm driven discriminant function analysis for source discrimination. Source apportionment was based on a numerical mass balance model deployed within a Monte Carlo framework incorporating both local optimization and global (genetic algorithm) optimization. The mean relative frequency-weighted average median inputs from the three spatial source units were estimated to be 17% (inter-quartile uncertainty range 0-32%) from the reference areas, 45% (inter-quartile uncertainty range 25-65%) from the burned areas and 38% (inter-quartile uncertainty range 14-59%) from the burned-salvage logged areas. High sediment inputs from burned and the burned-salvage logged areas, representing spatial source units 2 and 3, reflect the lasting effects of forest canopy and forest floor organic matter disturbance during the 2003 wildfire including increased runoff and sediment availability related to high terrestrial erosion, streamside mass wasting and river bank collapse. The results demonstrate the impact of wildfire and incremental pressures associated with salvage logging on catchment spatial sediment sources in higher elevation Montane regions where forest growth and vegetation recovery are relatively slow. Copyright © 2013 Elsevier B.V. All rights reserved.
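
    A highly simplified Python sketch of the core un-mixing step: estimate the proportions of the three spatial source groups that minimize the relative mass-balance error between the tracer signature of the sediment mixture and a weighted combination of source-group means. The tracer values are hypothetical, and the published procedure adds tracer weightings, Monte Carlo sampling of the source distributions, and genetic-algorithm global optimization on top of this local optimization.

        # Simplified sketch of a sediment source un-mixing step (hypothetical tracer values).
        import numpy as np
        from scipy.optimize import minimize

        sources = np.array([    # rows: reference, burned, burned + salvage logged
            [12.0, 0.80, 310.0],
            [18.0, 1.40, 450.0],
            [16.0, 1.10, 420.0],
        ])
        mixture = np.array([16.2, 1.18, 421.0])   # suspended-sediment tracer signature

        def objective(p):
            predicted = p @ sources                # weighted mix of source-group means
            return np.sum(((mixture - predicted) / mixture) ** 2)

        constraints = {"type": "eq", "fun": lambda p: p.sum() - 1.0}
        bounds = [(0.0, 1.0)] * 3
        result = minimize(objective, x0=np.full(3, 1.0 / 3.0), bounds=bounds,
                          constraints=constraints, method="SLSQP")
        print(result.x)   # estimated contribution of each spatial source group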

  20. A MoTe2 based light emitting diode and photodetector for silicon photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Bie, Ya-Qing; Heuck, M.; Grosso, G.; Furchi, M.; Cao, Y.; Zheng, J.; Navarro-Moratalla, E.; Zhou, L.; Taniguchi, T.; Watanabe, K.; Kong, J.; Englund, D.; Jarillo-Herrero, P.

    A key challenge in photonics today is to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, partly because many components, such as waveguides, interferometers and modulators, can be integrated on silicon-based processors. However, light sources and photodetectors present continued challenges. Common approaches for light sources include off-chip or wafer-bonded lasers based on III-V materials, but studies show advantages for directly modulated light sources. The most advanced photodetectors in silicon photonics are based on germanium growth, which increases system cost. The emerging two dimensional transition metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with CMOS processing via back-end-of-the-line processing steps. Here we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared band gap.

  1. Better Assessment Science Integrating Point and Non-point Sources (BASINS)

    EPA Pesticide Factsheets

    Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.

  2. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the database integration problem for bioinformatics. PMID:16556315

  3. BioWarehouse: a bioinformatics database warehouse toolkit.

    PubMed

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.
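
    The application example in the two records above (finding enzyme activities with no known sequence) is essentially a relational join across the integrated databases. The sketch below illustrates such a warehouse-style query using an in-memory SQLite database; the table and column names are simplified placeholders and do not reflect the actual BioWarehouse schema.

      # Minimal sketch of a warehouse-style SQL query in the spirit of the example
      # in the abstract: find enzyme activities (EC numbers) with no associated
      # sequence. Table/column names are hypothetical, not the BioWarehouse schema.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY, name TEXT);
      CREATE TABLE protein_sequence (id INTEGER PRIMARY KEY, ec_number TEXT, sequence TEXT);

      INSERT INTO enzyme_activity VALUES ('1.1.1.1', 'alcohol dehydrogenase'),
                                         ('4.2.1.20', 'tryptophan synthase'),
                                         ('9.9.9.9', 'hypothetical orphan activity');
      INSERT INTO protein_sequence VALUES (1, '1.1.1.1', 'MSTA...'),
                                          (2, '4.2.1.20', 'MERY...');
      """)

      # EC numbers for which no sequence exists anywhere in the integrated warehouse
      orphans = con.execute("""
          SELECT a.ec_number, a.name
          FROM enzyme_activity a
          LEFT JOIN protein_sequence s ON s.ec_number = a.ec_number
          WHERE s.id IS NULL
      """).fetchall()

      total = con.execute("SELECT COUNT(*) FROM enzyme_activity").fetchone()[0]
      print(f"{len(orphans)} of {total} activities have no sequence:", orphans)

    Against a production warehouse the same LEFT JOIN / IS NULL pattern would run over the loader-populated MySQL or Oracle schema rather than this toy in-memory database.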

  4. Integrating multiple immunogenetic data sources for feature extraction and mining somatic hypermutation patterns: the case of "towards analysis" in chronic lymphocytic leukaemia.

    PubMed

    Kavakiotis, Ioannis; Xochelli, Aliki; Agathangelidis, Andreas; Tsoumakas, Grigorios; Maglaveras, Nicos; Stamatopoulos, Kostas; Hadzidimitriou, Anastasia; Vlahavas, Ioannis; Chouvarda, Ioanna

    2016-06-06

    Somatic Hypermutation (SHM) refers to the introduction of mutations within rearranged V(D)J genes, a process that increases the diversity of Immunoglobulins (IGs). The analysis of SHM has offered critical insight into the physiology and pathology of B cells, leading to strong prognostication markers for clinical outcome in chronic lymphocytic leukaemia (CLL), the most frequent adult B-cell malignancy. In this paper we present a methodology for integrating multiple immunogenetic and clinicobiological data sources in order to extract features and create high quality datasets for SHM analysis in IG receptors of CLL patients. This dataset is used as the basis for a higher level integration procedure, inspired by social choice theory. This is applied in the Towards Analysis, our attempt to investigate the potential ontogenetic transformation of genes belonging to specific stereotyped CLL subsets towards other genes or gene families, through SHM. The data integration process, followed by feature extraction, resulted in the generation of a dataset containing information about mutations occurring through SHM. The Towards analysis, performed on the integrated dataset using voting techniques, revealed the distinct behaviour of subset #201 compared to other subsets as regards SHM-related movements among gene clans, both in allele-conserved and non-conserved gene areas. With respect to movement between genes, a high percentage of movements towards pseudogenes was found in all CLL subsets. This data integration and feature extraction process can set the basis for exploratory analysis or a fully automated computational data mining approach on many as yet unanswered, clinically relevant biological questions.

  5. Spectral Radiance of a Large-Area Integrating Sphere Source

    PubMed Central

    Walker, James H.; Thompson, Ambler

    1995-01-01

    The radiance and irradiance calibration of large field-of-view scanning and imaging radiometers for remote sensing and surveillance applications has resulted in the development of novel calibration techniques. One of these techniques is the employment of large-area integrating sphere sources as radiance or irradiance secondary standards. To assist the National Aeronautics and Space Administration’s space-based ozone measurement program, a commercially available large-area internally illuminated integrating sphere source’s spectral radiance was characterized in the wavelength region from 230 nm to 400 nm at the National Institute of Standards and Technology. Spectral radiance determinations and spatial mappings of the source indicate that carefully designed large-area integrating sphere sources can be measured with a 1 % to 2 % expanded uncertainty (two standard deviation estimate) in the near ultraviolet with spatial nonuniformities of 0.6 % or smaller across a 20 cm diameter exit aperture. A method is proposed for the calculation of the final radiance uncertainties of the source which includes the field of view of the instrument being calibrated. PMID:29151725

  6. A monolithically integrated polarization entangled photon pair source on a silicon chip

    PubMed Central

    Matsuda, Nobuyuki; Le Jeannic, Hanna; Fukuda, Hiroshi; Tsuchizawa, Tai; Munro, William John; Shimizu, Kaoru; Yamada, Koji; Tokura, Yasuhiro; Takesue, Hiroki

    2012-01-01

    Integrated photonic circuits are one of the most promising platforms for large-scale photonic quantum information systems due to their small physical size and stable interferometers with near-perfect lateral-mode overlaps. Since many quantum information protocols are based on qubits defined by the polarization of photons, we must develop integrated building blocks to generate, manipulate, and measure the polarization-encoded quantum state on a chip. The generation unit is particularly important. Here we show the first integrated polarization-entangled photon pair source on a chip. We have implemented the source as a simple and stable silicon-on-insulator photonic circuit that generates an entangled state with 91 ± 2% fidelity. The source is equipped with versatile interfaces for silica-on-silicon or other types of waveguide platforms that accommodate the polarization manipulation and projection devices as well as pump light sources. Therefore, we are ready for the full-scale implementation of photonic quantum information systems on a chip. PMID:23150781

  7. Integration and Utilization of Nuclear Systems on the Moon and Mars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houts, Michael G.; Schmidt, George R.; Bragg-Sitton, Shannon

    2006-01-20

    Over the past five decades numerous studies have identified nuclear energy as an enhancing or enabling technology for planetary surface exploration missions. This includes both radioisotope and fission sources for providing both heat and electricity. Nuclear energy sources were used to provide electricity on Apollo missions 12, 14, 15, 16, and 17, and on the Mars Viking landers. Very small nuclear energy sources were used to provide heat on the Mars Pathfinder, Spirit, and Opportunity rovers. Research has been performed at NASA MSFC to help assess potential issues associated with surface nuclear energy sources, and to generate data that could be useful to a future program. Research areas include System Integration, use of Regolith as Radiation Shielding, Waste Heat Rejection, Surface Environmental Effects on the Integrated System, Thermal Simulators, Surface System Integration / Interface / Interaction Testing, End-to-End Breadboard Development, Advanced Materials Development, Surface Energy Source Coolants, and Planetary Surface System Thermal Management and Control. This paper provides a status update on several of these research areas.

  8. Miniaturized integration of a fluorescence microscope

    PubMed Central

    Ghosh, Kunal K.; Burns, Laurie D.; Cocker, Eric D.; Nimmerjahn, Axel; Ziv, Yaniv; Gamal, Abbas El; Schnitzer, Mark J.

    2013-01-01

    The light microscope is traditionally an instrument of substantial size and expense. Its miniaturized integration would enable many new applications based on mass-producible, tiny microscopes. Key prospective usages include brain imaging in behaving animals towards relating cellular dynamics to animal behavior. Here we introduce a miniature (1.9 g) integrated fluorescence microscope made from mass-producible parts, including semiconductor light source and sensor. This device enables high-speed cellular-level imaging across ∼0.5 mm2 areas in active mice. This capability allowed concurrent tracking of Ca2+ spiking in >200 Purkinje neurons across nine cerebellar microzones. During mouse locomotion, individual microzones exhibited large-scale, synchronized Ca2+ spiking. This is a mesoscopic neural dynamic missed by prior techniques for studying the brain at other length scales. Overall, the integrated microscope is a potentially transformative technology that permits distribution to many animals and enables diverse usages, such as portable diagnostics or microscope arrays for large-scale screens. PMID:21909102

  9. Stepwise Connectivity of the Modal Cortex Reveals the Multimodal Organization of the Human Brain

    PubMed Central

    Sepulcre, Jorge; Sabuncu, Mert R.; Yeo, Thomas B.; Liu, Hesheng; Johnson, Keith A.

    2012-01-01

    How human beings integrate information from external sources and internal cognition to produce a coherent experience is still not well understood. During the past decades, anatomical, neurophysiological and neuroimaging research in multimodal integration has stood out in the effort to understand the perceptual binding properties of the brain. Areas in the human lateral occipito-temporal, prefrontal and posterior parietal cortices have been associated with sensory multimodal processing. Even though this rather patchy organization of brain regions gives us a glimpse of the perceptual convergence, the articulation of the flow of information from modality-related to the more parallel cognitive processing systems remains elusive. Using a method called Stepwise Functional Connectivity analysis, the present study analyzes the functional connectome and transitions from primary sensory cortices to higher-order brain systems. We identify the large-scale multimodal integration network and essential connectivity axes for perceptual integration in the human brain. PMID:22855814

  10. Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform

    NASA Astrophysics Data System (ADS)

    Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George

    2016-08-01

    In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed based on the requirements of relevant stakeholders. The pilot focuses on integrating data from remote and social sensing in a Big Data platform. The information on land changes coming from the Copernicus Sentinel-1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed social media and news agency monitoring as well as suitable mechanisms to detect and geo-annotate the related events.

  11. Template for preparation of papers for IEEE sponsored conferences & symposia.

    PubMed

    Sacchi, L; Dagliati, A; Tibollo, V; Leporati, P; De Cata, P; Cerra, C; Chiovato, L; Bellazzi, R

    2015-01-01

    To improve access to medical information, it is necessary to design and implement integrated informatics techniques aimed at gathering data from different and heterogeneous sources. This paper describes the technologies used to integrate data coming from the electronic medical record of the IRCCS Fondazione Maugeri (FSM) hospital of Pavia, Italy, and to combine them with administrative and pharmacy drug-purchase data from the local healthcare agency (ASL) of the Pavia area and environmental open data of the same region. The integration process is focused on data coming from a cohort of one thousand patients diagnosed with Type 2 Diabetes Mellitus (T2DM). Data analysis and temporal data mining techniques have been integrated to enhance the initial dataset, allowing patients to be stratified using further information derived from the mined data, such as behavioral patterns of prescription-related drug purchases and other frequent clinical temporal patterns, through an intuitive dashboard-controlled system.

  12. Miniaturized integration of a fluorescence microscope.

    PubMed

    Ghosh, Kunal K; Burns, Laurie D; Cocker, Eric D; Nimmerjahn, Axel; Ziv, Yaniv; Gamal, Abbas El; Schnitzer, Mark J

    2011-09-11

    The light microscope is traditionally an instrument of substantial size and expense. Its miniaturized integration would enable many new applications based on mass-producible, tiny microscopes. Key prospective usages include brain imaging in behaving animals for relating cellular dynamics to animal behavior. Here we introduce a miniature (1.9 g) integrated fluorescence microscope made from mass-producible parts, including a semiconductor light source and sensor. This device enables high-speed cellular imaging across ∼0.5 mm2 areas in active mice. This capability allowed concurrent tracking of Ca2+ spiking in >200 Purkinje neurons across nine cerebellar microzones. During mouse locomotion, individual microzones exhibited large-scale, synchronized Ca2+ spiking. This is a mesoscopic neural dynamic missed by prior techniques for studying the brain at other length scales. Overall, the integrated microscope is a potentially transformative technology that permits distribution to many animals and enables diverse usages, such as portable diagnostics or microscope arrays for large-scale screens.

  13. Implementation and evaluation of PM2.5 source contribution ...

    EPA Pesticide Factsheets

    Source culpability assessments are useful for developing effective emissions control programs. The Integrated Source Apportionment Method (ISAM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to track contributions from source groups and regions to ambient levels and deposited amounts of primary and secondary inorganic PM2.5. Confidence in this approach is established by comparing ISAM source contribution estimates to emissions zero-out simulations, recognizing that these approaches are not always expected to provide the same answer. The comparisons are expected to be most similar for more linear processes such as those involving primary emissions of PM2.5 and most different for non-linear systems like ammonium nitrate formation. Primarily emitted PM2.5 (e.g. elemental carbon), sulfur dioxide, ammonia, and nitrogen oxide contribution estimates compare well to zero-out estimates for ambient concentration and deposition. PM2.5 sulfate ion relationships are strong, but nonlinearity is evident and shown to be related to aqueous phase oxidation reactions in the host model. ISAM and zero-out contribution estimates are less strongly related for PM2.5 ammonium nitrate, resulting from instances of non-linear chemistry and negative responses (increases in PM2.5 due to decreases in emissions). ISAM is demonstrated in the context of an annual simulation tracking well-characterized emissions source sectors and boundary conditions, showing source contributions.

  14. Clinical data integration of distributed data sources using Health Level Seven (HL7) v3-RIM mapping

    PubMed Central

    2011-01-01

    Background Health information exchange and health information integration have become top priorities for healthcare systems across institutions and hospitals. Most organizations and establishments implement health information exchange and integration in order to support meaningful information retrieval among their disparate healthcare systems. The challenges that prevent efficient health information integration for heterogeneous data sources are the lack of a common standard to support mapping across distributed data sources and the numerous and diverse healthcare domains. Health Level Seven (HL7) is a standards development organization; among the standards it creates is the Reference Information Model (RIM), developed by HL7's technical committees as a standardized abstract representation of HL7 data across all domains of health care. In this article, we aim to present a design and a prototype implementation of HL7 v3-RIM mapping for information integration of distributed clinical data sources. The implementation enables the user to retrieve and search information that has been integrated using HL7 v3-RIM technology from disparate health care systems. Method and results We designed and developed a prototype implementation of the HL7 v3-RIM mapping function to integrate distributed clinical data sources, using R-MIM classes from HL7 v3-RIM as a global view along with a collaborative, centralized, web-based mapping tool to tackle the evolution of both global and local schemas. Our prototype was implemented and integrated with a clinical data management system (CDMS) as a plug-in module. We tested the prototype system with use case scenarios for distributed clinical data sources across several legacy CDMS. The results have been effective in improving information delivery, completing tasks that would otherwise have been difficult to accomplish, and reducing the time required to finish tasks involved in collaborative information retrieval and sharing with other systems. Conclusions We created a prototype implementation of HL7 v3-RIM mapping for information integration between distributed clinical data sources to promote collaborative healthcare and translational research. The prototype has effectively and efficiently ensured the accuracy of the information and knowledge extractions for systems that have been integrated. PMID:22104558
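
    As a rough illustration of the global-view idea described above, the toy sketch below maps rows from two hypothetical local clinical schemas onto a single shared class used for querying. The class and field names are simplified placeholders, not actual HL7 v3 R-MIM artifacts and not the authors' mapping tool.

      # Toy sketch of mapping records from two local clinical schemas onto a shared,
      # RIM-inspired "Observation" view used as the global schema for querying.
      # Class/field names are simplified placeholders, not real HL7 v3 R-MIM classes.
      from dataclasses import dataclass

      @dataclass
      class Observation:            # simplified global-view class
          patient_id: str
          code: str                 # what was observed (e.g. a lab test code)
          value: float
          unit: str

      # Per-source mapping: local column name -> global attribute
      MAPPINGS = {
          "hospital_a": {"pid": "patient_id", "test": "code", "result": "value", "uom": "unit"},
          "hospital_b": {"patientNumber": "patient_id", "labCode": "code",
                         "labValue": "value", "labUnit": "unit"},
      }

      def to_global(source: str, row: dict) -> Observation:
          """Translate a local row into the global-view class using the source's mapping."""
          mapped = {MAPPINGS[source][k]: v for k, v in row.items() if k in MAPPINGS[source]}
          return Observation(**mapped)

      rows_a = [{"pid": "P001", "test": "GLU", "result": 5.4, "uom": "mmol/L"}]
      rows_b = [{"patientNumber": "X-17", "labCode": "GLU", "labValue": 98.0, "labUnit": "mg/dL"}]

      integrated = [to_global("hospital_a", r) for r in rows_a] + \
                   [to_global("hospital_b", r) for r in rows_b]
      print(integrated)

    In the prototype described above the mappings themselves are maintained in a collaborative web-based tool so that both the global and the local schemas can evolve; the dictionary here simply stands in for that mapping store.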

  15. SchizConnect: Mediating Neuroimaging Databases on Schizophrenia and Related Disorders for Large-Scale Integration

    PubMed Central

    Wang, Lei; Alpert, Kathryn I.; Calhoun, Vince D.; Cobia, Derin J.; Keator, David B.; King, Margaret D.; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D.; Potkin, Steven G.; Turner, Jessica A.; Ambite, Jose Luis

    2015-01-01

    SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation—translating across data sources—so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many datasets there are, as well as downloading the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload data sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information are being updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. PMID:26142271

  16. Simultaneous EEG and MEG source reconstruction in sparse electromagnetic source imaging.

    PubMed

    Ding, Lei; Yuan, Han

    2013-04-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) have different sensitivities to differently configured brain activations, making them complementary in providing independent information for better detection and inverse reconstruction of brain sources. In the present study, we developed an integrative approach, which integrates a novel sparse electromagnetic source imaging method, i.e., variation-based cortical current density (VB-SCCD), together with the combined use of EEG and MEG data in reconstructing complex brain activity. To perform simultaneous analysis of multimodal data, we proposed to normalize EEG and MEG signals according to their individual noise levels to create unit-free measures. Our Monte Carlo simulations demonstrated that this integrative approach is capable of reconstructing complex cortical brain activations (up to 10 simultaneously activated and randomly located sources). Results from experimental data showed that complex brain activations evoked in a face recognition task were successfully reconstructed using the integrative approach, which were consistent with other research findings and validated by independent data from functional magnetic resonance imaging using the same stimulus protocol. Reconstructed cortical brain activations from both simulations and experimental data provided precise source localizations as well as accurate spatial extents of localized sources. In comparison with studies using EEG or MEG alone, the performance of cortical source reconstructions using combined EEG and MEG was significantly improved. We demonstrated that this new sparse ESI methodology with integrated analysis of EEG and MEG data could accurately probe spatiotemporal processes of complex human brain activations. This is promising for noninvasively studying large-scale brain networks of high clinical and scientific significance. Copyright © 2011 Wiley Periodicals, Inc.
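
    A minimal sketch of the noise-level normalization step described above is shown below: each modality's data and forward operator are divided by that modality's noise level so the stacked measurements become unit-free before a joint inverse solve. The Tikhonov-regularized least-squares step is only a generic stand-in for the VB-SCCD method, and all array shapes and values are hypothetical.

      # Minimal sketch of combining EEG and MEG by normalizing each modality by its
      # noise level before a joint inverse step. The simple regularized solve below
      # is a generic stand-in for VB-SCCD; all shapes and values are hypothetical.
      import numpy as np

      rng = np.random.default_rng(0)
      n_src = 200
      L_eeg = rng.standard_normal((64, n_src)) * 1e-6    # EEG gain matrix (toy units)
      L_meg = rng.standard_normal((306, n_src)) * 1e-13  # MEG gain matrix (toy units)
      x_true = np.zeros(n_src)
      x_true[[20, 85]] = 1.0                             # two active sources

      sigma_eeg, sigma_meg = 2e-7, 3e-14                 # per-modality sensor noise levels
      y_eeg = L_eeg @ x_true + rng.normal(0.0, sigma_eeg, 64)
      y_meg = L_meg @ x_true + rng.normal(0.0, sigma_meg, 306)

      # Divide data and forward operators by each modality's noise level so that the
      # stacked measurements are unit-free and comparable across EEG and MEG.
      A = np.vstack([L_eeg / sigma_eeg, L_meg / sigma_meg])
      b = np.concatenate([y_eeg / sigma_eeg, y_meg / sigma_meg])

      # Generic Tikhonov-regularized least squares (a stand-in for VB-SCCD).
      lam = 1.0
      x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_src), A.T @ b)
      print("strongest estimated sources:", np.argsort(np.abs(x_hat))[-5:][::-1])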

  17. Correlating ion energies and CF₂ surface production during fluorocarbon plasma processing of silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Ina T.; Zhou Jie; Fisher, Ellen R.

    2006-07-01

    Ion energy distribution (IED) measurements are reported for ions in the plasma molecular beam source of the imaging of radicals interacting with surfaces (IRIS) apparatus. The IEDs and relative intensities of nascent ions in C₃F₈ and C₄F₈ plasma molecular beams were measured using a Hiden PSM003 mass spectrometer mounted on the IRIS main chamber. The IEDs are complex and multimodal, with mean ion energies ranging from 29 to 92 eV. Integrated IEDs provided relative ion intensities as a function of applied rf power and source pressure. Generally, higher applied rf powers and lower source pressures resulted in increased ion intensities and mean ion energies. Most significantly, a comparison to CF₂ surface interaction measurements previously made in our laboratories reveals that mean ion energies are directly and linearly correlated to CF₂ surface production in these systems.

  18. Sources and sinks of plastic debris in estuaries: A conceptual model integrating biological, physical and chemical distribution mechanisms.

    PubMed

    Vermeiren, Peter; Muñoz, Cynthia C; Ikejima, Kou

    2016-12-15

    Micro- and macroplastic accumulation threatens estuaries worldwide because of the often dense human populations, diverse plastic inputs and high potential for plastic degradation and storage in these ecosystems. Nonetheless, our understanding of plastic sources and sinks remains limited. We designed conceptual models of the local and estuary-wide transport of plastics. We identify processes affecting the position of plastics in the water column; processes related to the mixing of fresh and salt water; and processes resulting from the influences of wind, topography, and organism-plastic interactions. The models identify gaps in the spatial context of plastic-organism interactions, the chemical behavior of plastics in estuaries, effects of wind on plastic suspension-deposition cycles, and the relative importance of processes affecting the position in the water column. When interpreted in the context of current understanding, sinks with high management potential can be identified. However, source-sink patterns vary among estuary types and with local scale processes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. EEG Oscillations Are Modulated in Different Behavior-Related Networks during Rhythmic Finger Movements.

    PubMed

    Seeber, Martin; Scherer, Reinhold; Müller-Putz, Gernot R

    2016-11-16

    Sequencing and timing of body movements are essential to perform motor tasks. In this study, we investigate the temporal relation between cortical oscillations and human motor behavior (i.e., rhythmic finger movements). High-density EEG recordings were used for source imaging based on individual anatomy. We separated sustained and movement phase-related EEG source amplitudes based on the actual finger movements recorded by a data glove. Sustained amplitude modulations in the contralateral hand area show decrease for α (10-12 Hz) and β (18-24 Hz), but increase for high γ (60-80 Hz) frequencies during the entire movement period. Additionally, we found movement phase-related amplitudes, which resembled the flexion and extension sequence of the fingers. Especially for faster movement cadences, movement phase-related amplitudes included high β (24-30 Hz) frequencies in prefrontal areas. Interestingly, the spectral profiles and source patterns of movement phase-related amplitudes differed from sustained activities, suggesting that they represent different frequency-specific large-scale networks. The first type comprises networks signified by the sustained element, which statically modulate their synchrony levels during continuous movements. These networks may upregulate neuronal excitability in brain regions specific to the limb, in this study the right hand area. The second type comprises movement phase-related networks, which modulate their synchrony in relation to the movement sequence. We suggest that these frequency-specific networks are associated with distinct functions, including top-down control, sensorimotor prediction, and integration. The separation of different large-scale networks, as applied in this work, improves the interpretation of EEG sources in relation to human motor behavior. EEG recordings provide high temporal resolution suitable to relate cortical oscillations to actual movements. Investigating EEG sources during rhythmic finger movements, we distinguish sustained from movement phase-related amplitude modulations. We separate these two EEG source elements motivated by our previous findings in gait. Here, we found two types of large-scale networks, representing the right fingers in distinction from the time sequence of the movements. These findings suggest that EEG source amplitudes reconstructed in a cortical patch are the superposition of these simultaneously present network activities. Separating these frequency-specific networks is relevant for studying function and possible dysfunction of the cortical sensorimotor system in humans as well as to provide more advanced features for brain-computer interfaces. Copyright © 2016 the authors 0270-6474/16/3611671-11$15.00/0.

  20. Signals from the ventrolateral thalamus to the motor cortex during locomotion

    PubMed Central

    Marlinski, Vladimir; Nilaweera, Wijitha U.; Zelenin, Pavel V.; Sirota, Mikhail G.

    2012-01-01

    The activity of the motor cortex during locomotion is profoundly modulated in the rhythm of strides. The source of modulation is not known. In this study we examined the activity of one of the major sources of afferent input to the motor cortex, the ventrolateral thalamus (VL). Experiments were conducted in chronically implanted cats with an extracellular single-neuron recording technique. VL neurons projecting to the motor cortex were identified by antidromic responses. During locomotion, the activity of 92% of neurons was modulated in the rhythm of strides; 67% of cells discharged one activity burst per stride, a pattern typical for the motor cortex. The characteristics of these discharges in most VL neurons appeared to be well suited to contribute to the locomotion-related activity of the motor cortex. In addition to simple locomotion, we examined VL activity during walking on a horizontal ladder, a task that requires vision for correct foot placement. Upon transition from simple to ladder locomotion, the activity of most VL neurons exhibited the same changes that have been reported for the motor cortex, i.e., an increase in the strength of stride-related modulation and shortening of the discharge duration. Five modes of integration of simple and ladder locomotion-related information were recognized in the VL. We suggest that, in addition to contributing to the locomotion-related activity in the motor cortex during simple locomotion, the VL integrates and transmits signals needed for correct foot placement on a complex terrain to the motor cortex. PMID:21994259

  1. Patterns and age distribution of ground-water flow to streams

    USGS Publications Warehouse

    Modica, E.; Reilly, T.E.; Pollock, D.W.

    1997-01-01

    Simulations of ground-water flow in a generic aquifer system were made to characterize the topology of ground-water flow in the stream subsystem and to evaluate its relation to deeper ground-water flow. The flow models are patterned after hydraulic characteristics of aquifers of the Atlantic Coastal Plain and are based on numerical solutions to three-dimensional, steady-state, unconfined flow. The models were used to evaluate the effects of aquifer horizontal-to-vertical hydraulic conductivity ratios, aquifer thickness, and areal recharge rates on flow in the stream subsystem. A particle tracker was used to determine flow paths in a stream subsystem, to establish the relation between ground-water seepage to points along a simulated stream and its source area of flow, and to determine ground-water residence time in stream subsystems. In a geometrically simple aquifer system with accretion, the source area of flow to streams resembles an elongated ellipse that tapers in the downgradient direction. Increased recharge causes an expansion of the stream subsystem. The source area of flow to the stream expands predominantly toward the stream headwaters. Baseflow gain is also increased along the reach of the stream. A thin aquifer restricts ground-water flow and causes the source area of flow to expand near stream headwaters and also shifts the start-of-flow to the drainage basin divide. Increased aquifer anisotropy causes a lateral expansion of the source area of flow to streams. Ground-water seepage to the stream channel originates both from near- and far-recharge locations. The range in the lengths of flow paths that terminate at a point on a stream increases in the downstream direction. Consequently, the age distribution of ground water that seeps into the stream is skewed progressively older with distance downstream. Base flow is an integration of ground water with varying age and potentially different water quality, depending on the source within the drainage basin. The quantitative results presented indicate that this integration can have a wide and complex residence time range and source distribution.

  2. On the effects of multimodal information integration in multitasking.

    PubMed

    Stock, Ann-Kathrin; Gohil, Krutika; Huster, René J; Beste, Christian

    2017-07-07

    There have recently been considerable advances in our understanding of the neuronal mechanisms underlying multitasking, but the role of multimodal integration for this faculty has remained rather unclear. We examined this issue by comparing different modality combinations in a multitasking (stop-change) paradigm. In-depth neurophysiological analyses of event-related potentials (ERPs) were conducted to complement the obtained behavioral data. Specifically, we applied signal decomposition using second-order blind identification (SOBI) to the multi-subject ERP data, followed by source localization. We found that both general multimodal information integration and modality-specific aspects (potentially related to task difficulty) modulate behavioral performance and associated neurophysiological correlates. Simultaneous multimodal input generally increased early attentional processing of visual stimuli (i.e. P1 and N1 amplitudes) as well as measures of cognitive effort and conflict (i.e. central P3 amplitudes). Yet, tactile-visual input caused larger impairments in multitasking than audio-visual input. General aspects of multimodal information integration modulated the activity in the premotor cortex (BA 6) as well as different visual association areas concerned with the integration of visual information with input from other modalities (BA 19, BA 21, BA 37). On top of this, differences in the specific combination of modalities also affected performance and measures of conflict/effort originating in prefrontal regions (BA 6).

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, Brian M.; Larson, Vincent E.

    Microphysical processes, such as the formation, growth, and evaporation of precipitation, interact with variability and covariances (e.g., fluxes) in moisture and heat content. For instance, evaporation of rain may produce cold pools, which in turn may trigger fresh convection and precipitation. These effects are usually omitted or else crudely parameterized at subgrid scales in weather and climate models. A more formal approach is pursued here, based on predictive, horizontally averaged equations for the variances, covariances, and fluxes of moisture and heat content. These higher-order moment equations contain microphysical source terms. The microphysics terms can be integrated analytically, given a suitably simple warm-rain microphysics scheme and an approximate assumption about the multivariate distribution of cloud-related and precipitation-related variables. Performing the integrations provides exact expressions within an idealized context. A large-eddy simulation (LES) of a shallow precipitating cumulus case is performed here, and it indicates that the microphysical effects on (co)variances and fluxes can be large. In some budgets and altitude ranges, they are dominant terms. The analytic expressions for the integrals are implemented in a single-column, higher-order closure model. Interactive single-column simulations agree qualitatively with the LES. The analytic integrations form a parameterization of microphysical effects in their own right, and they also serve as benchmark solutions that can be compared to non-analytic integration methods.

  4. Organ S values and effective doses for family members exposed to adult patients following I-131 treatment: A Monte Carlo simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Eun Young; Lee, Choonsik; Mcguire, Lynn

    Purpose: To calculate organ S values (mGy/Bq-s) and effective doses per time-integrated activity (mSv/Bq-s) for pediatric and adult family members exposed to an adult male or female patient treated with I-131 using a series of hybrid computational phantoms coupled with a Monte Carlo radiation transport technique. Methods: A series of pediatric and adult hybrid computational phantoms were employed in the study. Three different exposure scenarios were considered: (1) standing face-to-face exposures between an adult patient and pediatric or adult family phantoms at five different separation distances; (2) an adult female patient holding her newborn child, and (3) a 1-yr-old child standing on the lap of an adult female patient. For the adult patient model, two different thyroid-related diseases were considered: hyperthyroidism and differentiated thyroid cancer (DTC) with corresponding internal distributions of ¹³¹I. A general purpose Monte Carlo code, MCNPX v2.7, was used to perform the Monte Carlo radiation transport. Results: The S values show a strong dependency on age and organ location within the family phantoms at short distances. The S values and effective dose per time-integrated activity from the adult female patient phantom are relatively high at shorter distances and to younger family phantoms. At a distance of 1 m, effective doses per time-integrated activity are lower than those values based on the NRC (Nuclear Regulatory Commission) by a factor of 2 for both adult male and female patient phantoms. The S values to target organs from the hyperthyroid-patient source distribution strongly depend on the height of the exposed family phantom, so that their values rapidly decrease with decreasing height of the family phantom. Active marrow of the 10-yr-old phantom shows the highest S values among family phantoms for the DTC-patient source distribution. In the exposure scenario of mother and baby, S values and effective doses per time-integrated activity to the newborn and 1-yr-old phantoms for a hyperthyroid-patient source are higher than values for a DTC-patient source. Conclusions: The authors performed realistic assessments of ¹³¹I organ S values and effective dose per time-integrated activity from adult patients treated for hyperthyroidism and DTC to family members. In addition, the authors’ studies consider Monte Carlo simulated “mother and baby/child” exposure scenarios for the first time. Based on these results, the authors reconfirm the strong conservatism underlying the point source method recommended by the US NRC. The authors recommend that various factors such as the type of the patient's disease, the age of family members, and the distance/posture between the patient and family members must be carefully considered to provide realistic dose estimates for patient-to-family exposures.
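
    For comparison, the simple point-source estimate referred to above reduces, in its most basic form, to an inverse-square relation between dose, gamma-ray constant, time-integrated activity and distance. The sketch below implements only that generic relation, ignoring decay, shielding and body geometry; the gamma-ray constant and the exposure numbers are placeholders, not quantities from the study and not the full NRC procedure.

      # Minimal inverse-square point-source sketch of the kind of simple, conservative
      # estimate that phantom-based Monte Carlo S values are compared against.
      # The gamma-ray constant and exposure values are placeholders; decay,
      # shielding and body geometry are ignored.

      def point_source_dose_mSv(gamma_mSv_m2_per_MBq_h, activity_MBq, hours, distance_m):
          """Dose at distance_m from an unshielded point source:
          dose = Gamma * activity * exposure time / distance^2."""
          return gamma_mSv_m2_per_MBq_h * activity_MBq * hours / distance_m ** 2

      # Placeholder numbers: 400 MBq of retained activity, 8 h of contact per day
      # at 1 m over 3 days, with a user-supplied gamma-ray constant.
      GAMMA = 5.0e-5   # mSv*m^2/(MBq*h); placeholder, substitute the nuclide's value
      print(point_source_dose_mSv(GAMMA, activity_MBq=400.0, hours=8 * 3, distance_m=1.0))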

  5. Challenges and trends in magnetic sensor integration with microfluidics for biomedical applications

    NASA Astrophysics Data System (ADS)

    Cardoso, S.; Leitao, D. C.; Dias, T. M.; Valadeiro, J.; Silva, M. D.; Chicharo, A.; Silverio, V.; Gaspar, J.; Freitas, P. P.

    2017-06-01

    Magnetoresistive (MR) sensors have been successfully applied in many technologies, in particular readout electronics and smart systems for multiple signal addressing and readout. When single sensors are used, the requirements relate to spatial resolution and localized field sources. The integration of MR sensors in adaptable media (e.g. flexible, stretchable substrates) offers the possibility to merge the magnetic detection with mechanical functionalities. In addition, the precision of a micrometric needle can benefit greatly from the integration of MR sensors with submicrometric resolution. In this paper, we demonstrate through several detailed examples how advanced MR sensors can be integrated with the systems described above, and also with microfluidic technologies. Here, the challenges of handling liquids over a chip combine with those for miniaturization of microelectronics for MR readout. However, when these are overcome, the result is an integrated system with added functionalities, capable of answering the demand in biomedicine and biochemistry for lab-on-a-chip devices.

  6. The Evolution of Arthropod Body Plans: Integrating Phylogeny, Fossils, and Development-An Introduction to the Symposium.

    PubMed

    Chipman, Ariel D; Erwin, Douglas H

    2017-09-01

    The last few years have seen a significant increase in the amount of data we have about the evolution of the arthropod body plan. This has come mainly from three separate sources: a new consensus and improved resolution of arthropod phylogeny, based largely on new phylogenomic analyses; a wealth of new early arthropod fossils from a number of Cambrian localities with excellent preservation, as well as a renewed analysis of some older fossils; and developmental data from a range of model and non-model pan-arthropod species that shed light on the developmental origins and homologies of key arthropod traits. However, there has been relatively little synthesis among these different data sources, and the three communities studying them have little overlap. The symposium "The Evolution of Arthropod Body Plans-Integrating Phylogeny, Fossils and Development" brought together leading researchers in these three disciplines and made a significant contribution to the emerging synthesis of arthropod evolution, which will help advance the field and will be useful for years to come. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  7. Using a Vertically Integrated Model to Determine the Effects of Seasonal Forcing on the Basal Topography of Ice Shelves

    NASA Astrophysics Data System (ADS)

    MacMackin, C. T.; Wells, A.

    2017-12-01

    While relatively small in mass, ice shelves play an important role in buttressing ice sheets, slowing their flow into the ocean. As such, an understanding of ice shelf stability is needed for predictions of future sea level rise. Networks of channels have been observed underneath Antarctic ice shelves and are thought to affect their stability. While the origins of channels running parallel to ice flow are thought to be well understood, transverse channels have also been observed and the mechanism for their formation is less clear. It has been suggested that seasonal variations in ice and ocean properties could be a source and we run nonlinear, vertically integrated 1-D simulations of a coupled ice shelf and plume to test this hypothesis. We also examine how these variations might alter the shape of internal radar reflectors within the ice, suggesting a new technique to model their distribution using a vertically integrated model of ice flow. We examine a range of sources for seasonal forcing which might lead to channel formation, finding that variability in subglacial discharge results in small variations of ice thickness. Additional mechanisms would be required to expand these into large transverse channels.

  8. Algorithms and physical parameters involved in the calculation of model stellar atmospheres

    NASA Astrophysics Data System (ADS)

    Merlo, D. C.

    This contribution summarizes the Doctoral Thesis presented at Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba for the degree of PhD in Astronomy. We analyze some algorithms and physical parameters involved in the calculation of model stellar atmospheres, such as atomic partition functions, functional relations connecting gaseous and electronic pressure, molecular formation, temperature distribution, chemical compositions, Gaunt factors, atomic cross-sections and scattering sources, as well as computational codes for calculating models. Special attention is paid to the integration of the hydrostatic equation. We compare our results with those obtained by other authors, finding reasonable agreement. We implement methods that modify the originally adopted temperature distribution in the atmosphere in order to obtain a constant energy flux throughout. We identify limitations and correct numerical instabilities. We integrate the transfer equation by directly solving the integral equation involving the source function. As a by-product, we calculate updated atomic partition functions of the light elements. Also, we discuss and enumerate carefully selected formulae for the monochromatic absorption and dispersion of some atomic and molecular species. Finally, we obtain a flexible code to calculate model stellar atmospheres.

  9. Combining multiple sources of data to inform conservation of Lesser Prairie-Chicken populations

    USGS Publications Warehouse

    Ross, Beth; Haukos, David A.; Hagen, Christian A.; Pitman, James

    2018-01-01

    Conservation of small populations is often based on limited data from spatially and temporally restricted studies, resulting in management actions based on an incomplete assessment of the population drivers. If fluctuations in abundance are related to changes in weather, proper management is especially important, because extreme weather events could disproportionately affect population abundance. Conservation assessments, especially for vulnerable populations, are aided by a knowledge of how extreme events influence population status and trends. Although important for conservation efforts, data may be limited for small or vulnerable populations. Integrated population models maximize information from various sources of data to yield population estimates that fully incorporate uncertainty from multiple data sources while allowing for the explicit incorporation of environmental covariates of interest. Our goal was to assess the relative influence of population drivers for the Lesser Prairie-Chicken (Tympanuchus pallidicinctus) in the core of its range, western and southern Kansas, USA. We used data from roadside lek count surveys, nest monitoring surveys, and survival data from telemetry monitoring combined with climate (Palmer drought severity index) data in an integrated population model. Our results indicate that variability in population growth rate was most influenced by variability in juvenile survival. The Palmer drought severity index had no measurable direct effects on adult survival or mean number of offspring per female; however, there were declines in population growth rate following severe drought. Because declines in population growth rate occurred at a broad spatial scale, declines in response to drought were likely due to decreases in chick and juvenile survival rather than emigration outside of the study area. Overall, our model highlights the importance of accounting for environmental and demographic sources of variability, and provides a thorough method for simultaneously evaluating population demography in response to long-term climate effects.

  10. A systematic examination of a random sampling strategy for source apportionment calculations.

    PubMed

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability is also included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
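
    A minimal numpy sketch of the random-sampling idea for the common N+1 sources / N markers case is given below: source marker values are repeatedly drawn from their distributions, the exactly determined linear system (including the fractions-sum-to-one constraint) is solved for each draw, and the spread of the resulting fractions is summarized. All marker values are synthetic illustrations, not data from the paper.

      # Random-sampling (RS) sketch for an N+1 sources / N markers apportionment:
      # draw source marker values from their distributions, solve the linear system
      # with the fractions-sum-to-one constraint, and summarize the spread.
      import numpy as np

      rng = np.random.default_rng(1)

      # Three sources, two markers (e.g. two diagnostic ratios): mean and spread
      src_mean = np.array([[0.2, 10.0],
                           [0.6, 25.0],
                           [0.9, 40.0]])
      src_sd = 0.05 * src_mean
      mix = np.array([0.55, 24.0])          # observed mixture markers

      def solve_fractions(sources, mixture):
          """Solve  sum_j f_j * s_jk = m_k  for each marker k, plus  sum_j f_j = 1."""
          A = np.vstack([sources.T, np.ones(sources.shape[0])])   # (N+1) x (N+1) system
          b = np.concatenate([mixture, [1.0]])
          return np.linalg.solve(A, b)

      draws = np.array([solve_fractions(rng.normal(src_mean, src_sd), mix)
                        for _ in range(10000)])
      draws = draws[(draws >= 0).all(axis=1)]   # keep physically meaningful solutions
      print("median fractions:", np.round(np.median(draws, axis=0), 3))
      print("2.5-97.5% range:", np.round(np.percentile(draws, [2.5, 97.5], axis=0), 3))

    The spread of the sampled solutions illustrates the central point of the abstract: source-profile variability shifts not only the estimated uncertainty but also the central (mean/median) contribution estimates.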

  11. A Review of Theoretical Frameworks for Supply Chain Integration

    NASA Astrophysics Data System (ADS)

    Thoo, AC; Tan, LC; Sulaiman, Z.; Zakuan, N.

    2017-06-01

    In a world of fierce competition, where business is driven by speed to market, good quality and low costs, firms need a source of competitive advantage that is inimitable and non-substitutable. For a supply chain integration (SCI) strategy to achieve sustainable competitive advantage, it must be non-substitutable, inimitable, path-dependent and developed over time. Also, an integrated supply chain framework is needed to tie the whole network together in order to reduce perennial supply chain challenges such as functional silos, poor transparency of knowledge and information and the inadequate formation of appropriate customer and supplier relationships. Therefore, this paper aims to evaluate the competitive impact of an SCI strategy on firm performance using the Resource-Based View (RBV) and the relational view.

  12. BIAS: Bioinformatics Integrated Application Software.

    PubMed

    Finak, G; Godin, N; Hallett, M; Pepin, F; Rajabi, Z; Srivastava, V; Tang, Z

    2005-04-15

    We introduce a development platform especially tailored to Bioinformatics research and software development. BIAS (Bioinformatics Integrated Application Software) provides the tools necessary for carrying out integrative Bioinformatics research requiring multiple datasets and analysis tools. It follows an object-relational strategy for providing persistent objects, allows third-party tools to be easily incorporated within the system and supports standards and data-exchange protocols common to Bioinformatics. BIAS is an open source project and is freely available to all interested users at http://www.mcb.mcgill.ca/~bias/. This website also contains a paper with a more detailed description of BIAS and a sample implementation of a Bayesian network approach for the simultaneous prediction of gene regulation events and of mRNA expression from combinations of gene regulation events. Contact: hallett@mcb.mcgill.ca.

  13. Integration services to enable regional shared electronic health records.

    PubMed

    Oliveira, Ilídio C; Cunha, João P S

    2011-01-01

    eHealth is expected to integrate a comprehensive set of patient data sources into a coherent continuum, but implementations vary and Portugal still lags behind in electronic patient data sharing. In this work, we present a clinical information hub to aggregate multi-institution patient data and bridge the information silos. This integration platform enables a coherent object model, services-oriented application development and a trust framework. It has been instantiated in the Rede Telemática de Saúde (www.RTSaude.org) to support a regional Electronic Health Record approach, fed dynamically from production systems at eight partner institutions, providing access to more than 11,000,000 care episodes relating to over 350,000 citizens. The network has obtained the necessary clearance from the Portuguese data protection agency.

  14. An integrated multi-source energy harvester based on vibration and magnetic field energy

    NASA Astrophysics Data System (ADS)

    Hu, Zhengwen; Qiu, Jing; Wang, Xian; Gao, Yuan; Liu, Xin; Chang, Qijie; Long, Yibing; He, Xingduo

    2018-05-01

    In this paper, an integrated multi-source energy harvester (IMSEH) employing a specially shaped cantilever beam and a piezoelectric transducer to convert vibration and magnetic field energy into electrical energy is presented. The electric output performance of the proposed IMSEH has been investigated. Compared to a traditional multi-source energy harvester (MSEH) or single-source energy harvester (SSEH), the proposed IMSEH can simultaneously harvest vibration and magnetic field energy with an integrated structure, and the electric output is greatly improved. When other conditions are kept identical, the IMSEH can reach an output voltage of 12.8 V. Remarkably, the proposed IMSEH has great potential for application in wireless sensor networks.

  15. Image-guided feedback for ophthalmic microsurgery using multimodal intraoperative swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Li, Jianwei D.; Malone, Joseph D.; El-Haddad, Mohamed T.; Arquitola, Amber M.; Joos, Karen M.; Patel, Shriji N.; Tao, Yuankai K.

    2017-02-01

    Surgical interventions for ocular diseases involve manipulations of semi-transparent structures in the eye, but limited visualization of these tissue layers remains a critical barrier to developing novel surgical techniques and improving clinical outcomes. We addressed limitations in image-guided ophthalmic microsurgery by using microscope-integrated multimodal intraoperative swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography (iSS-SESLO-OCT). We previously demonstrated in vivo human ophthalmic imaging using SS-SESLO-OCT, which enabled simultaneous acquisition of en face SESLO images with every OCT cross-section. Here, we integrated our new 400 kHz iSS-SESLO-OCT, which used a buffered Axsun 1060 nm swept-source, with a surgical microscope and TrueVision stereoscopic viewing system to provide image-based feedback. In vivo human imaging performance was demonstrated on a healthy volunteer, and simulated surgical maneuvers were performed in ex vivo porcine eyes. Densely sampled static volumes and volumes subsampled at 10 volumes per second were used to visualize tissue deformations and surgical dynamics during corneal sweeps, compressions, and dissections, and retinal sweeps, compressions, and elevations. En face SESLO images enabled orientation and co-registration with the widefield surgical microscope view while OCT imaging enabled depth-resolved visualization of surgical instrument positions relative to anatomic structures of interest. TrueVision heads-up display allowed for side-by-side viewing of the surgical field with SESLO and OCT previews for real-time feedback, and we demonstrated novel integrated segmentation overlays for augmented-reality surgical guidance. Integration of these complementary imaging modalities may benefit surgical outcomes by enabling real-time intraoperative visualization of surgical plans, instrument positions, tissue deformations, and image-based surrogate biomarkers correlated with completion of surgical goals.

  16. SKYDOSE: A code for gamma skyshine calculations using the integral line-beam method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Brockhoff, R.C.

    1994-07-01

    SKYDOSE evaluates the skyshine dose from an isotropic, monoenergetic, point photon source collimated by three simple geometries: (1) a source in a silo; (2) a source behind an infinitely long, vertical, black wall; and (3) a source in a rectangular building. In all three geometries, an optional overhead shield may be specified. The source energy must be between 0.02 and 100 MeV (10 MeV for sources with an overhead shield). This report is a user's manual. Other references give more detail on the integral line-beam method used by SKYDOSE.
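
    The integral line-beam method evaluates skyshine dose by integrating a line-beam response function over all emission directions that escape the collimating geometry. The Python sketch below only illustrates the structure of such a calculation for the silo geometry; the response function used here is a purely illustrative placeholder, not the fitted response functions used by SKYDOSE.

      import numpy as np

      def skyshine_dose(response_fn, source_detector_distance, silo_half_angle,
                        n_polar=200, n_azimuth=200):
          """Integrate a line-beam response over the cone of directions escaping a silo.

          response_fn(phi, x): dose at distance x per photon emitted into a beam whose
          axis makes angle phi with the source-detector axis (placeholder function).
          """
          # Polar angle theta measured from the vertical silo axis (0 = straight up),
          # limited by the silo collimation half-angle.
          thetas = np.linspace(0.0, silo_half_angle, n_polar)
          psis = np.linspace(0.0, 2.0 * np.pi, n_azimuth)   # azimuth about the vertical
          dtheta = thetas[1] - thetas[0]
          dpsi = psis[1] - psis[0]

          dose = 0.0
          for theta in thetas:
              for psi in psis:
                  # Unit emission direction (z vertical, x towards the detector).
                  d = np.array([np.sin(theta) * np.cos(psi),
                                np.sin(theta) * np.sin(psi),
                                np.cos(theta)])
                  # Angle between the emission direction and the horizontal
                  # source-detector axis.
                  phi = np.arccos(np.clip(d[0], -1.0, 1.0))
                  # Isotropic source: emission probability per steradian is 1/(4*pi).
                  dOmega = np.sin(theta) * dtheta * dpsi
                  dose += response_fn(phi, source_detector_distance) * dOmega / (4.0 * np.pi)
          return dose

      # Placeholder response function for demonstration only (NOT the SKYDOSE fit).
      demo_response = lambda phi, x: np.exp(-3.0 * phi) / (x * x)
      print(skyshine_dose(demo_response, source_detector_distance=100.0,
                          silo_half_angle=np.radians(30.0)))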

  17. A Design of a Modular GPHS-Stirling Power System for a Lunar Habitation Module

    NASA Technical Reports Server (NTRS)

    Schmitz, Paul C.; Penswick, L. Barry; Shaltens, Richard K.

    2005-01-01

    Lunar habitation modules need electricity and potentially heat to operate. Because of the low amounts of radiation emitted by General Purpose Heat Source (GPHS) modules, power plants incorporating them as heat sources could be placed in close proximity to habitation modules. A design concept is discussed for a high-efficiency power plant based on a GPHS assembly integrated with a Stirling convertor. This system could provide both electrical power and heat, if required, for a lunar habitation module. The conceptual GPHS/Stirling system is modular in nature and made up of a basic 5.5 kWe Stirling convertor/GPHS module assembly, convertor controller/PMAD electronics, waste heat radiators, and associated thermal insulation. For the specific lunar application under investigation, eight modules are employed to deliver 40 kWe to the habitation module. This design considers three levels of Stirling convertor technology and addresses the issues of integrating the Stirling convertors with the GPHS heat source assembly using proven technology whenever possible. In addition, issues related to the high-temperature heat transport system, power management, convertor control, vibration isolation, and potential system packaging configurations to ensure safe operation during all phases of deployment are discussed.

  18. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    PubMed Central

    2011-01-01

    Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
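
    To make the idea of an RDF multi-relational directed graph concrete, the following minimal sketch builds a tiny graph linking an NDM-1 gene node to a media report, in the spirit of an RDFizer. All URIs and predicates are invented for illustration and are not the vocabulary used in the SDDM study.

      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import RDF, RDFS

      EX = Namespace("http://example.org/sddm/")   # hypothetical namespace
      g = Graph()
      g.bind("ex", EX)

      gene = EX["gene/NDM-1"]
      report = EX["report/2010-08-11-media-coverage"]

      # Multi-relational edges: the same nodes can be linked by many predicate types.
      g.add((gene, RDF.type, EX.ResistanceGene))
      g.add((gene, RDFS.label, Literal("New Delhi metallo-beta-lactamase 1")))
      g.add((report, RDF.type, EX.MediaReport))
      g.add((report, EX.mentionsGene, gene))
      g.add((report, EX.reportedCountry, Literal("India")))

      print(g.serialize(format="turtle"))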

  19. Integrating in silico models to enhance predictivity for developmental toxicity.

    PubMed

    Marzo, Marco; Kulkarni, Sunil; Manganaro, Alberto; Roncaglioni, Alessandra; Wu, Shengde; Barton-Maclaren, Tara S; Lester, Cathy; Benfenati, Emilio

    2016-08-31

    Application of in silico models to predict developmental toxicity has demonstrated limited success, particularly when employed as a single source of information. It is acknowledged that modelling the complex outcomes related to this endpoint is a challenge; however, such models have been developed and reported in the literature. The current study explored the possibility of integrating the selected public domain models (CAESAR, SARpy and P&G model) with the selected commercial modelling suites (Multicase, Leadscope and Derek Nexus) to assess if there is an increase in overall predictive performance. The results varied according to the data sets used to assess performance; overall, predictive performance improved upon model integration relative to the individual models. Moreover, because different models are based on different specific developmental toxicity effects, integration of these models increased the applicable chemical and biological spaces. It is suggested that this approach reduces uncertainty associated with in silico predictions by achieving a consensus among a battery of models. The use of tools to assess the applicability domain also improves the interpretation of the predictions. This has been verified in the case of the software VEGA, which makes freely available QSAR models with a measurement of the applicability domain. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
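
    A minimal sketch of the consensus idea described above: several model predictions are combined by majority vote, and predictions falling outside a model's applicability domain are ignored. The model names and the voting rule are illustrative only and do not reproduce the integration scheme evaluated in the paper.

      from collections import Counter
      from typing import Dict, Optional

      def consensus_prediction(predictions: Dict[str, Optional[str]]) -> Optional[str]:
          """Majority vote over per-model calls ('toxic' / 'non-toxic').

          A value of None means the chemical is outside that model's applicability
          domain, so the model abstains from the vote.
          """
          votes = [p for p in predictions.values() if p is not None]
          if not votes:
              return None                    # no model is applicable: no consensus call
          winner, count = Counter(votes).most_common(1)[0]
          return winner if count > len(votes) / 2 else None   # require a strict majority

      # Hypothetical calls from three public and two commercial models:
      calls = {"CAESAR": "toxic", "SARpy": "toxic", "P&G": None,
               "Leadscope": "non-toxic", "DerekNexus": "toxic"}
      print(consensus_prediction(calls))     # -> 'toxic'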

  20. Innovative financing for health: what is truly innovative?

    PubMed

    Atun, Rifat; Knaul, Felicia Marie; Akachi, Yoko; Frenk, Julio

    2012-12-08

    Development assistance for health has increased every year between 2000 and 2010, particularly for HIV/AIDS, tuberculosis, and malaria, to reach US$26·66 billion in 2010. The continued global economic crisis means that increased external financing from traditional donors is unlikely in the near term. Hence, new funding has to be sought from innovative financing sources to sustain the gains made in global health, to achieve the health Millennium Development Goals, and to address the emerging burden from non-communicable diseases. We use the value chain approach to conceptualise innovative financing. With this framework, we identify three integrated innovative financing mechanisms-GAVI, Global Fund, and UNITAID-that have reached a global scale. These three financing mechanisms have innovated along each step of the innovative finance value chain-namely resource mobilisation, pooling, channelling, resource allocation, and implementation-and integrated these steps to channel large amounts of funding rapidly to low-income and middle-income countries to address HIV/AIDS, malaria, tuberculosis, and vaccine-preventable diseases. However, resources mobilised from international innovative financing sources are relatively modest compared with donor assistance from traditional sources. Instead, the real innovation has been establishment of new organisational forms as integrated financing mechanisms that link elements of the financing value chain to more effectively and efficiently mobilise, pool, allocate, and channel financial resources to low-income and middle-income countries and to create incentives to improve implementation and performance of national programmes. These mechanisms provide platforms for health funding in the future, especially as efforts to grow innovative financing have faltered. The lessons learnt from these mechanisms can be used to develop and expand innovative financing from international sources to address health needs in low-income and middle-income countries. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. A fungal transcription factor essential for starch degradation affects integration of carbon and nitrogen metabolism

    DOE PAGES

    Xiong, Yi; Wu, Vincent W.; Lubbe, Andrea; ...

    2017-05-03

    In Neurospora crassa, the transcription factor COL-26 functions as a regulator of glucose signaling and metabolism. Its loss leads to resistance to carbon catabolite repression. Here, we report that COL-26 is necessary for the expression of amylolytic genes in N. crassa and is required for the utilization of maltose and starch. Additionally, the Δcol-26 mutant shows growth defects on preferred carbon sources, such as glucose, an effect that was alleviated if glutamine replaced ammonium as the primary nitrogen source. This rescue did not occur when maltose was used as a sole carbon source. Transcriptome and metabolic analyses of the Δcol-26 mutant relative to its wild type parental strain revealed that amino acid and nitrogen metabolism, the TCA cycle and GABA shunt were adversely affected. Phylogenetic analysis showed a single col-26 homolog in Sordariales, Ophilostomatales, and the Magnaporthales, but an expanded number of col-26 homologs in other filamentous fungal species. Deletion of the closest homolog of col-26 in Trichoderma reesei, bglR, resulted in a mutant with similar preferred carbon source growth deficiency, and which was alleviated if glutamine was the sole nitrogen source, suggesting conservation of COL-26 and BglR function. Our finding provides novel insight into the role of COL-26 for utilization of starch and in integrating carbon and nitrogen metabolism for balanced metabolic activities for optimal carbon and nitrogen distribution.

  2. A fungal transcription factor essential for starch degradation affects integration of carbon and nitrogen metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Yi; Wu, Vincent W.; Lubbe, Andrea

    In Neurospora crassa, the transcription factor COL-26 functions as a regulator of glucose signaling and metabolism. Its loss leads to resistance to carbon catabolite repression. Here, we report that COL-26 is necessary for the expression of amylolytic genes in N. crassa and is required for the utilization of maltose and starch. Additionally, the Δcol-26 mutant shows growth defects on preferred carbon sources, such as glucose, an effect that was alleviated if glutamine replaced ammonium as the primary nitrogen source. This rescue did not occur when maltose was used as a sole carbon source. Transcriptome and metabolic analyses of the Δcol-26 mutant relative to its wild type parental strain revealed that amino acid and nitrogen metabolism, the TCA cycle and GABA shunt were adversely affected. Phylogenetic analysis showed a single col-26 homolog in Sordariales, Ophilostomatales, and the Magnaporthales, but an expanded number of col-26 homologs in other filamentous fungal species. Deletion of the closest homolog of col-26 in Trichoderma reesei, bglR, resulted in a mutant with similar preferred carbon source growth deficiency, and which was alleviated if glutamine was the sole nitrogen source, suggesting conservation of COL-26 and BglR function. Our finding provides novel insight into the role of COL-26 for utilization of starch and in integrating carbon and nitrogen metabolism for balanced metabolic activities for optimal carbon and nitrogen distribution.

  3. A fungal transcription factor essential for starch degradation affects integration of carbon and nitrogen metabolism

    PubMed Central

    Xiong, Yi; Qin, Lina; Kennedy, Megan; Bauer, Diane; Barry, Kerrie; Northen, Trent R.; Grigoriev, Igor V.

    2017-01-01

    In Neurospora crassa, the transcription factor COL-26 functions as a regulator of glucose signaling and metabolism. Its loss leads to resistance to carbon catabolite repression. Here, we report that COL-26 is necessary for the expression of amylolytic genes in N. crassa and is required for the utilization of maltose and starch. Additionally, the Δcol-26 mutant shows growth defects on preferred carbon sources, such as glucose, an effect that was alleviated if glutamine replaced ammonium as the primary nitrogen source. This rescue did not occur when maltose was used as a sole carbon source. Transcriptome and metabolic analyses of the Δcol-26 mutant relative to its wild type parental strain revealed that amino acid and nitrogen metabolism, the TCA cycle and GABA shunt were adversely affected. Phylogenetic analysis showed a single col-26 homolog in Sordariales, Ophilostomatales, and the Magnaporthales, but an expanded number of col-26 homologs in other filamentous fungal species. Deletion of the closest homolog of col-26 in Trichoderma reesei, bglR, resulted in a mutant with similar preferred carbon source growth deficiency, and which was alleviated if glutamine was the sole nitrogen source, suggesting conservation of COL-26 and BglR function. Our finding provides novel insight into the role of COL-26 for utilization of starch and in integrating carbon and nitrogen metabolism for balanced metabolic activities for optimal carbon and nitrogen distribution. PMID:28467421

  4. Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.

    PubMed

    Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A

    2002-10-01

    Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.
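
    Agreement between two data sources for a binary quality measure is commonly summarized with Cohen's kappa. The sketch below, using made-up patient-level indicators rather than the study data, shows how such agreement and per-source success rates might be computed.

      from sklearn.metrics import cohen_kappa_score

      # Hypothetical per-patient indicators (1 = measure met) from two data sources.
      medical_record = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
      automated_data = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]

      kappa = cohen_kappa_score(medical_record, automated_data)
      success_record = sum(medical_record) / len(medical_record)
      success_automated = sum(automated_data) / len(automated_data)

      print(f"success rate (record)    = {success_record:.2f}")
      print(f"success rate (automated) = {success_automated:.2f}")
      print(f"Cohen's kappa            = {kappa:.2f}")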

  5. Integrated Computational and Experimental Protocol for Understanding Rh(III) Speciation in Hydrochloric and Nitric Acid Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuels, Alex C.; Boele, Cherilynn A.; Bennett, Kevin T.

    2014-12-01

    A combined experimental and theoretical approach has investigated the complex speciation of Rh(III) in hydrochloric and nitric acid media, as a function of acid concentration. This has relevance to the separation and isolation of Rh(III) from dissolved spent nuclear fuel, which is an emergent and attractive alternative source of platinum group metals, relative to traditional mining efforts.

  6. Inversion of Atmospheric Tracer Measurements, Localization of Sources

    NASA Astrophysics Data System (ADS)

    Issartel, J.-P.; Cabrit, B.; Hourdin, F.; Idelkadi, A.

    When abnormal concentrations of a pollutant are observed in the atmosphere, the question of its origin arises immediately. The radioactivity from Chernobyl was detected in Sweden before the accident was announced. This situation emphasizes the psychological, political and medical stakes of a rapid identification of sources. In technical terms, most industrial sources can be modeled as a fixed point at ground level with undetermined duration. The classical method of identification involves the calculation of a backtrajectory departing from the detector with an upstream integration of the wind field. We were first involved in such questions as we evaluated the efficiency of the international monitoring network planned in the frame of the Comprehensive Test Ban Treaty. We propose a new approach to backtracking based upon the use of retroplumes associated with the available measurements. Firstly, the retroplume is related to inverse transport processes, describing quantitatively how the air in a sample originates from regions that are all the more extended and diffuse as we go back far in the past. Secondly, it clarifies the sensitivity of the measurement with respect to all potential sources. It is therefore calculated by adjoint equations including, of course, diffusive processes. Thirdly, the statistical interpretation, valid as far as single particles are concerned, should not be used to investigate the position and date of a macroscopic source. In that case, the retroplume rather induces a straightforward constraint between the intensity of the source and its position. When more than one measurement is available, including zero-valued measurements, the source satisfies the same number of linear relations tightly related to the retroplumes. This system of linear relations can be handled through the simplex algorithm in order to make the above intensity-position correlation more restrictive. This method makes it possible to manage, in a quantitative manner, the unavoidable ambiguity of atmospheric phenomena. When several measurements are available, the ambiguity about the identification of a source is reduced significantly.
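
    The statement that measurements impose linear relations between source position and intensity can be illustrated with a small linear program: each candidate grid cell carries an unknown non-negative emission rate, retroplume-like sensitivities link those rates to the observed concentrations, and a simplex-type LP solver returns a source field consistent with the data. The sensitivities and observations below are synthetic toy numbers, not the output of an adjoint transport model.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(0)
      n_cells, n_obs = 50, 4

      # A[i, j]: sensitivity of measurement i to a unit source in cell j
      # (in a real application this would come from adjoint/retroplume calculations).
      A = rng.uniform(0.0, 1.0, size=(n_obs, n_cells))
      true_field = np.zeros(n_cells)
      true_field[17] = 5.0                   # one hidden point source
      c_obs = A @ true_field                 # synthetic "measurements"

      # Find a non-negative source field reproducing the observations while
      # minimizing total emitted mass (solved by the HiGHS simplex/dual methods).
      res = linprog(c=np.ones(n_cells), A_eq=A, b_eq=c_obs,
                    bounds=[(0, None)] * n_cells, method="highs")
      candidates = np.flatnonzero(res.x > 1e-6) if res.success else []
      print("feasible:", res.success, "candidate source cells:", candidates)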

  7. Integrating multiple data sources in species distribution modeling: A framework for data fusion

    USGS Publications Warehouse

    Pacifici, Krishna; Reich, Brian J.; Miller, David A.W.; Gardner, Beth; Stauffer, Glenn E.; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A.

    2017-01-01

    The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species’ occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model and develop three models that share information in a less direct manner resulting in more robust performance when the auxiliary data is of lesser quality. We describe these three new approaches (“Shared,” “Correlation,” “Covariates”) for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the Southeastern U.S. and through simulations. All three of the approaches which used the second data source improved out-of-sample predictions relative to a single data source (“Single”). When information in the second data source is of high quality, the Shared model performs the best, but the Correlation and Covariates model also perform well. When the information quality in the second data source is of lesser quality, the Correlation and Covariates model performed better suggesting they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow for both data types to be used will maximize the useful information available for estimating species distributions.
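
    One of the simpler fusion strategies mentioned above, the "Covariates" approach, can be caricatured as follows: a model fitted to the lower-quality auxiliary data supplies a predicted occurrence score that enters the high-quality model as an extra covariate. The data below are simulated, and the plain logistic-regression formulation is a stand-in for the spatially explicit MVCAR models actually used in the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n_aux, n_std = 2000, 200               # many opportunistic sites, few survey sites

      def simulate(n):
          X = rng.normal(size=(n, 2))        # two environmental covariates
          p = 1 / (1 + np.exp(-(0.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1])))
          return X, rng.binomial(1, p)

      X_aux, y_aux = simulate(n_aux)         # lesser-quality data (noisier in practice)
      X_std, y_std = simulate(n_std)         # high-quality standardized survey

      # Step 1: fit a model to the auxiliary data alone.
      aux_model = LogisticRegression().fit(X_aux, y_aux)

      # Step 2: use its predicted occurrence probability as an extra covariate
      # in the model for the standardized data ("Covariates"-style fusion).
      aux_score = aux_model.predict_proba(X_std)[:, [1]]
      fused_model = LogisticRegression().fit(np.hstack([X_std, aux_score]), y_std)
      print("fused-model coefficients:", fused_model.coef_.round(2))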

  8. The global Filipino nurse: An integrative review of Filipino nurses' work experiences.

    PubMed

    Montayre, Jed; Montayre, Jasmine; Holroyd, Eleanor

    2018-05-01

    To understand the work-related experiences of Philippine-trained nurses working globally. The Philippines is a major source country of foreign-trained nurses located globally. However, there is a paucity of research on professional factors and career-related issues affecting foreign-trained nurses' work experiences. An integrative review based on a comprehensive literature search was undertaken in November 2015 and repeated in August 2016. Seven articles satisfied the selection criteria. Filipino nurses experienced differences in the practice of nursing in terms of work process, roles and autonomy. Moreover, they encountered challenges such as work-related discrimination and technical difficulties within the organisation. A clear understanding of Filipino nurses' work experiences and the challenges they have encountered points to important constructs influencing effective translation of nursing practice across cultures and health systems, which then form the basis for support strategies. It is critical to recognize foreign-trained nurses' experience of work-related differences and challenges, as these foster favorable conditions for the management team to plan and continually evaluate policies around recruitment, retention and support offered to these nurses. Furthermore, the findings suggest an internationalized nursing framework and standards that integrate a transcultural paradigm among staff members within a work organisation. © 2017 John Wiley & Sons Ltd.

  9. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koulouri, Alexandra, E-mail: koulouri@uni-muenster.de; Department of Electrical and Electronic Engineering, Imperial College London, Exhibition Road, London SW7 2BT; Brookes, Mike

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field. - Highlights: • Vector tomography is used to reconstruct electric fields generated by dipole sources. • Inverse solutions are based on longitudinal and transverse line integral measurements. • Transverse line integral measurements are used as a sparsity constraint. • Numerical procedure to approximate the line integrals is described in detail. • Patterns of the studied electric fields are correctly estimated.
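
    A longitudinal measurement is the integral of the component of the field parallel to the integration line. The sketch below approximates one such measurement numerically for an analytic 2-D field; the point-source field and the straight-line geometry are illustrative and much simpler than the bounded-domain EEG setting of the paper.

      import numpy as np

      def field(p, source=np.array([0.3, 0.2])):
          """Toy 2-D electric-like field of a point 'charge' at `source` (illustrative)."""
          r = p - source
          return r / (np.linalg.norm(r) ** 3 + 1e-9)

      def longitudinal_measurement(a, b, n=1000):
          """Approximate the line integral of F . t ds along the segment a -> b."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          t_hat = (b - a) / np.linalg.norm(b - a)        # unit tangent of the line
          s = np.linspace(0.0, 1.0, n)
          points = a[None, :] + s[:, None] * (b - a)[None, :]
          integrand = np.array([field(p) @ t_hat for p in points])
          ds = np.linalg.norm(b - a) / (n - 1)
          # Trapezoid rule along the integration line.
          return np.sum(0.5 * (integrand[1:] + integrand[:-1])) * ds

      print(longitudinal_measurement([-1.0, -1.0], [1.0, 1.0]))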

  10. Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model

    NASA Astrophysics Data System (ADS)

    Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.

    2000-02-01

    The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational time efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow beam attenuation coefficients which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other 192Ir elongated source designs, are in good agreement with corresponding accurate Monte Carlo results which have been reported by our group and other authors.
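
    For reference, the classical (unmodified) Sievert integral for a filtered line source of active length L, evaluated at perpendicular distance h from the source axis, has the form D ∝ (1/(L·h)) ∫ exp(-μ t secθ) dθ taken between the angles subtended by the two ends of the source. A minimal numerical version is sketched below; the primary/scatter-separation refinement proposed in the paper, and any source-specific constants, are not reproduced, and the input numbers are illustrative only.

      import numpy as np
      from scipy.integrate import quad

      def sievert_dose_rate(active_length_cm, h_cm, y_cm, mu_filter_cm, t_filter_cm,
                            strength=1.0):
          """Classical Sievert integral for a filtered line source.

          The source lies on the y-axis, centred at the origin; the calculation point
          is at (h_cm, y_cm). `strength` lumps together source activity and the
          dose-rate constant (set to 1 here, so the result is in relative units).
          """
          theta1 = np.arctan((y_cm - active_length_cm / 2.0) / h_cm)
          theta2 = np.arctan((y_cm + active_length_cm / 2.0) / h_cm)
          integrand = lambda theta: np.exp(-mu_filter_cm * t_filter_cm / np.cos(theta))
          integral, _ = quad(integrand, theta1, theta2)
          return strength * integral / (active_length_cm * h_cm)

      # Relative dose rate 2 cm from the axis of a 0.35 cm active-length source behind
      # 0.05 cm of encapsulation (mu = 4.0 cm^-1); all values are placeholders.
      print(sievert_dose_rate(0.35, h_cm=2.0, y_cm=0.0, mu_filter_cm=4.0, t_filter_cm=0.05))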

  11. The contribution of different information sources for adverse effects data.

    PubMed

    Golder, Su; Loke, Yoon K

    2012-04-01

    The aim of this study is to determine the relative value and contribution of searching different sources to identify adverse effects data. The process of updating a systematic review and meta-analysis of thiazolidinedione-related fractures in patients with type 2 diabetes mellitus was used as a case study. For each source searched, a record was made for each relevant reference included in the review, noting whether it was retrieved with the search strategy used and whether it was available but not retrieved. The sensitivity, precision, and number needed to read from searching each source and from different combinations of sources were also calculated. There were 58 relevant references which presented sufficient numerical data to be included in a meta-analysis of fractures and bone mineral density. The largest number of relevant references was retrieved from Science Citation Index (SCI) (35), followed by BIOSIS Previews (27) and EMBASE (24). The precision of the searches varied from 0.88% (Scirus) to 41.67% (CENTRAL). With the search strategies used, the minimum combination of sources required to retrieve all the relevant references was: the GlaxoSmithKline (GSK) website, Science Citation Index (SCI), EMBASE, BIOSIS Previews, British Library Direct, Medscape DrugInfo, handsearching and reference checking, AHFS First, and Thomson Reuters Integrity or Conference Papers Index (CPI). In order to identify all the relevant references for this case study, a number of different sources needed to be searched. The minimum combination of sources required to identify all the relevant references did not include MEDLINE.
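
    The retrieval metrics used in this kind of study are simple ratios: sensitivity is the fraction of all relevant references that a source retrieved, precision is the fraction of retrieved records that were relevant, and the number needed to read is the reciprocal of precision. A small sketch follows; the per-source relevant counts for SCI and EMBASE come from the abstract, but the total-retrieved counts are invented for illustration.

      def search_metrics(relevant_retrieved: int, total_retrieved: int, total_relevant: int):
          """Sensitivity, precision and number needed to read (NNR) for one source."""
          sensitivity = relevant_retrieved / total_relevant
          precision = relevant_retrieved / total_retrieved
          number_needed_to_read = 1.0 / precision if precision else float("inf")
          return sensitivity, precision, number_needed_to_read

      # (relevant retrieved, total retrieved); totals are illustrative placeholders.
      example_counts = {"Science Citation Index": (35, 4000),
                        "EMBASE": (24, 1500),
                        "CENTRAL": (10, 24)}
      for source, (hits, retrieved) in example_counts.items():
          sens, prec, nnr = search_metrics(hits, retrieved, total_relevant=58)
          print(f"{source:24s} sensitivity={sens:.0%} precision={prec:.2%} NNR={nnr:.0f}")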

  12. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502
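
    The core loop of most truth-discovery methods alternates between (a) estimating each object's truth as a reliability-weighted vote over source claims and (b) re-estimating each source's reliability from how often it agrees with the current truths. The batch sketch below illustrates that loop on toy categorical claims; the incremental, temporally evolving formulation proposed in the paper is not reproduced here.

      from collections import defaultdict

      # claims[source][object] = claimed value (toy data)
      claims = {
          "s1": {"capital_of_FR": "Paris", "capital_of_AU": "Canberra"},
          "s2": {"capital_of_FR": "Paris", "capital_of_AU": "Sydney"},
          "s3": {"capital_of_FR": "Lyon",  "capital_of_AU": "Canberra"},
      }

      weights = {s: 1.0 for s in claims}     # start with equal source reliability

      for _ in range(10):                    # alternate truth / reliability updates
          # (a) weighted vote per object
          truths = {}
          for obj in {o for c in claims.values() for o in c}:
              scores = defaultdict(float)
              for s, c in claims.items():
                  if obj in c:
                      scores[c[obj]] += weights[s]
              truths[obj] = max(scores, key=scores.get)
          # (b) reliability = fraction of a source's claims matching current truths
          for s, c in claims.items():
              agree = sum(truths[o] == v for o, v in c.items())
              weights[s] = (agree + 0.5) / (len(c) + 1.0)   # smoothed to avoid zeros

      print("truths:", truths)
      print("source weights:", {s: round(w, 2) for s, w in weights.items()})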

  13. The New Landscape of Ethics and Integrity in Scholarly Publishing

    NASA Astrophysics Data System (ADS)

    Hanson, B.

    2016-12-01

    Scholarly peer-reviewed publications serve five major functions: They (i) have served as the primary, useful archive of scientific progress for hundreds of years; (ii) have been one principal way that scientists, and more recently departments and institutions, are evaluated; (iii) trigger and are the source of much communication about science to the public; (iv) have been primary revenue sources for scientific societies and companies; and (v) more recently play a critical and codified role in legal and regulatory decisions and advice to governments. Recent dynamics in science as well as in society, including the growth of online communication and new revenue sources, are influencing and altering particularly the first four core functions greatly. The changes in turn are posing important new challenges to the ethics and integrity of scholarly publishing and thus science in ways that are not widely or fully appreciated. For example, the expansion of electronic publishing has raised a number of new challenges for publishers with respect to their responsibility for curating scientific knowledge and even preserving the basic integrity of a manuscript. Many challenges are related to new or expanded financial conflicts of interest related to the use of metrics such as the Journal Impact Factor, the expansion of alternate business models such as open access and advertising, and the fact that publishers are increasingly involved in framing communication around papers they are publishing. Solutions pose new responsibilities for scientists, publishers, and scientific societies, especially around transparency in their operations.

  14. How prior expectations shape multisensory perception.

    PubMed

    Gau, Remi; Noppeney, Uta

    2016-01-01

    The brain generates a representation of our environment by integrating signals from a common source, but segregating signals from different sources. This fMRI study investigated how the brain arbitrates between perceptual integration and segregation based on top-down congruency expectations and bottom-up stimulus-bound congruency cues. Participants were presented audiovisual movies of phonologically congruent, incongruent or McGurk syllables that can be integrated into an illusory percept (e.g. "ti" percept for visual «ki» with auditory /pi/). They reported the syllable they perceived. Critically, we manipulated participants' top-down congruency expectations by presenting McGurk stimuli embedded in blocks of congruent or incongruent syllables. Behaviorally, participants were more likely to fuse audiovisual signals into an illusory McGurk percept in congruent than incongruent contexts. At the neural level, the left inferior frontal sulcus (lIFS) showed increased activations for bottom-up incongruent relative to congruent inputs. Moreover, lIFS activations were increased for physically identical McGurk stimuli, when participants segregated the audiovisual signals and reported their auditory percept. Critically, this activation increase for perceptual segregation was amplified when participants expected audiovisually incongruent signals based on prior sensory experience. Collectively, our results demonstrate that the lIFS combines top-down prior (in)congruency expectations with bottom-up (in)congruency cues to arbitrate between multisensory integration and segregation. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  16. AN INTEGRATED APPROACH TO CHARACTERIZING BYPASSED OIL IN HETEROGENEOUS AND FRACTURED RESERVOIRS USING PARTITIONING TRACERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhil Datta-Gupta

    2003-08-01

    We explore the use of efficient streamline-based simulation approaches for modeling partitioning interwell tracer tests in hydrocarbon reservoirs. Specifically, we utilize the unique features of streamline models to develop an efficient approach for interpretation and history matching of field tracer response. A critical aspect here is the underdetermined and highly ill-posed nature of the associated inverse problems. We have adopted an integrated approach whereby we combine data from multiple sources to minimize the uncertainty and non-uniqueness in the interpreted results. For partitioning interwell tracer tests, these are primarily the distribution of reservoir permeability and oil saturation distribution. A novel approach to multiscale data integration using Markov Random Fields (MRF) has been developed to integrate static data sources from the reservoir such as core, well log and 3-D seismic data. We have also explored the use of a finite difference reservoir simulator, UTCHEM, for field-scale design and optimization of partitioning interwell tracer tests. The finite-difference model allows us to include detailed physics associated with reactive tracer transport, particularly those related with transverse and cross-streamline mechanisms. We have investigated the potential use of downhole tracer samplers and also the use of natural tracers for the design of partitioning tracer tests. Finally, the behavior of partitioning tracer tests in fractured reservoirs is investigated using a dual-porosity finite-difference model.

  17. Going Beyond the Facts: Young Children Extend Knowledge by Integrating Episodes

    PubMed Central

    Bauer, Patricia J.; Souci, Priscilla San

    2010-01-01

    The major question posed in this research was whether 4- and 6-year-old children productively extend their knowledge by integrating information acquired in separate episodes. The vehicle was a read-aloud activity during which children were presented with a novel fact in each of two passages. In Experiment 1, both age groups showed evidence of integration between the passages. For 6-year-olds, the evidence came in the form of responses to open-ended questions. Four-year-olds recognized the correct answers, but did not generate them in the open-ended question format. The 6-year-olds who generated the correct answers also were likely to recall both of the individual facts presented in the passages. In Experiment 2, we tested whether 4-year-olds’ integration performance would improve if their memory for the individual facts improved. Extra exposure to the individual facts resulted in higher levels of integration performance in both recall and recognition testing. The roles of memory and other potential sources of age-related differences in integration performance are discussed. PMID:20663513

  18. OpenFlyData: an exemplar data web integrating gene expression data on the fruit fly Drosophila melanogaster.

    PubMed

    Miles, Alistair; Zhao, Jun; Klyne, Graham; White-Cooper, Helen; Shotton, David

    2010-10-01

    Integrating heterogeneous data across distributed sources is a major requirement for in silico bioinformatics supporting translational research. For example, genome-scale data on patterns of gene expression in the fruit fly Drosophila melanogaster are widely used in functional genomic studies in many organisms to inform candidate gene selection and validate experimental results. However, current data integration solutions tend to be heavy weight, and require significant initial and ongoing investment of effort. Development of a common Web-based data integration infrastructure (a.k.a. data web), using Semantic Web standards, promises to alleviate these difficulties, but little is known about the feasibility, costs, risks or practical means of migrating to such an infrastructure. We describe the development of OpenFlyData, a proof-of-concept system integrating gene expression data on D. melanogaster, combining Semantic Web standards with light-weight approaches to Web programming based on Web 2.0 design patterns. To support researchers designing and validating functional genomic studies, OpenFlyData includes user-facing search applications providing intuitive access to and comparison of gene expression data from FlyAtlas, the BDGP in situ database, and FlyTED, using data from FlyBase to expand and disambiguate gene names. OpenFlyData's services are also openly accessible, and are available for reuse by other bioinformaticians and application developers. Semi-automated methods and tools were developed to support labour- and knowledge-intensive tasks involved in deploying SPARQL services. These include methods for generating ontologies and relational-to-RDF mappings for relational databases, which we illustrate using the FlyBase Chado database schema; and methods for mapping gene identifiers between databases. The advantages of using Semantic Web standards for biomedical data integration are discussed, as are open issues. In particular, although the performance of open source SPARQL implementations is sufficient to query gene expression data directly from user-facing applications such as Web-based data fusions (a.k.a. mashups), we found open SPARQL endpoints to be vulnerable to denial-of-service-type problems, which must be mitigated to ensure reliability of services based on this standard. These results are relevant to data integration activities in translational bioinformatics. The gene expression search applications and SPARQL endpoints developed for OpenFlyData are deployed at http://openflydata.org. FlyUI, a library of JavaScript widgets providing re-usable user-interface components for Drosophila gene expression data, is available at http://flyui.googlecode.com. Software and ontologies to support transformation of data from FlyBase, FlyAtlas, BDGP and FlyTED to RDF are available at http://openflydata.googlecode.com. SPARQLite, an implementation of the SPARQL protocol, is available at http://sparqlite.googlecode.com. All software is provided under the GPL version 3 open source license.
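
    Querying such a SPARQL endpoint from application code is straightforward; the sketch below uses SPARQLWrapper against a hypothetical endpoint URL and an invented vocabulary, since the original openflydata.org services and graph layout may no longer be available in the form described.

      from SPARQLWrapper import SPARQLWrapper, JSON

      # Hypothetical endpoint and vocabulary, for illustration only.
      endpoint = SPARQLWrapper("http://example.org/sparql")
      endpoint.setReturnFormat(JSON)
      endpoint.setQuery("""
          PREFIX ex: <http://example.org/flyexpr/>
          SELECT ?tissue ?level WHERE {
              ?obs ex:gene   ex:some_gene ;
                   ex:tissue ?tissue ;
                   ex:level  ?level .
          } LIMIT 10
      """)

      try:
          results = endpoint.query().convert()
          for row in results["results"]["bindings"]:
              print(row["tissue"]["value"], row["level"]["value"])
      except Exception as exc:               # network or endpoint failures
          print("query failed:", exc)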

  19. A novel methodology for radiative transfer in a planetary atmosphere. I - The functions a^m and b^m of anisotropic scattering

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.; Kalaba, R. E.

    1977-01-01

    The original problem of anisotropic scattering in an atmosphere illuminated by a unidirectional source is replaced by an analogous formulation where the incident light is omnidirectional. A radiative-transfer equation for the omnidirectional case is obtained in which the direction of illumination plays no role and the source-function analog, Sobolev's (1972) source function Phi^m, contains only a single integral term. For radiation incident on the top or the bottom of the atmosphere, this equation involves the functions b^m and h^m, respectively, with m corresponding to the order of the harmonic component of the scattered radiation field; these two functions are shown to reduce to a single function through simple reciprocity relations. The transfer problem is then reformulated for the function a^m, in which case the source-function analog (Sobolev's function D^m) involves the incident direction.

  20. Source apportion of atmospheric particulate matter: a joint Eulerian/Lagrangian approach.

    PubMed

    Riccio, A; Chianese, E; Agrillo, G; Esposito, C; Ferrara, L; Tirimberio, G

    2014-12-01

    PM2.5 samples were collected during an annual monitoring campaign (January 2012-January 2013) in the urban area of Naples, one of the major cities in Southern Italy. Samples were collected by means of a standard gravimetric sampler (Tecora Echo model) and characterized chemically by ion chromatography. In total, 143 samples were collected and their ionic composition determined. We extend traditional source apportionment techniques, usually based on multivariate factor analysis, by interpreting the chemical analysis results within a Lagrangian framework. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model was used, providing linkages to the source regions in the upwind areas. Results were analyzed in order to quantify the relative weight of different source types/areas. Model results suggested that PM concentrations are strongly affected not only by local emissions but also by transboundary emissions, especially from Eastern and Northern European countries and African Saharan dust episodes.

  1. Rechargeable Li/Li(x)CoO(2) 100 Ah/600 Ah Battery With Integral Smart Charge Control

    DTIC Science & Technology

    1999-03-01

    Rechargeable Li/LixCoO2 100 Ah/600 Ah Battery with Integral Smart Charge Control. By Charles J. Kelly (Alliant Techsystems, Inc., Alliant Power Sources Co.).

  2. Heterogeneous database integration in a physician workstation.

    PubMed

    Annevelink, J; Young, C Y; Tang, P C

    1991-01-01

    We discuss the integration of a variety of data and information sources in a Physician Workstation (PWS), focusing on the integration of data from DHCP, the Veterans Administration's Distributed Hospital Computer Program. We designed a logically centralized, object-oriented data schema, used by end users and applications to explore the data accessible through an object-oriented database using a declarative query language. We emphasize the use of procedural abstraction to transparently integrate a variety of information sources into the data schema.

  3. Heterogeneous database integration in a physician workstation.

    PubMed Central

    Annevelink, J.; Young, C. Y.; Tang, P. C.

    1991-01-01

    We discuss the integration of a variety of data and information sources in a Physician Workstation (PWS), focusing on the integration of data from DHCP, the Veterans Administration's Distributed Hospital Computer Program. We designed a logically centralized, object-oriented data schema, used by end users and applications to explore the data accessible through an object-oriented database using a declarative query language. We emphasize the use of procedural abstraction to transparently integrate a variety of information sources into the data schema. PMID:1807624

  4. ESA's Integral solves thirty-year old gamma-ray mystery

    NASA Astrophysics Data System (ADS)

    Image caption: The central regions of our galaxy, the Milky Way, as seen by Integral in gamma rays. With its superior ability to see faint details, Integral correctly reveals the individual sources that comprised the foggy gamma-ray background seen by previous observatories. The brightest 91 objects seen in this image were classified by Integral as individual sources, while the others appear too faint to be properly characterized at this stage. (Credit: ESA, F. Lebrun, CEA-Saclay.)
    During the spring and autumn of 2003, Integral observed the central regions of our Galaxy, collecting some of the perpetual glow of diffuse low-energy gamma rays that bathe the entire Galaxy. These gamma rays were first discovered in the mid-1970s by high-flying balloon-borne experiments. Astronomers refer to them as the 'soft' Galactic gamma-ray background, with energies similar to those used in medical X-ray equipment. Initially, astronomers believed that the glow was caused by interactions involving the atoms of the gas that pervades the Galaxy. Whilst this theory could explain the diffuse nature of the emission, since the gas is ubiquitous, it failed to match the observed power of the gamma rays. The gamma rays produced by the proposed mechanisms would be much weaker than those observed. The mystery has remained unanswered for decades. Now Integral's superb gamma-ray telescope IBIS, built for ESA by an international consortium led by Principal Investigator Pietro Ubertini (IAS/CNR, Rome, Italy), has seen clearly that, instead of a fog produced by the interstellar medium, most of the gamma-rays are coming from individual celestial objects. In the view of previous, less sensitive instruments, these objects appeared to merge together. In a paper published today in "Nature", Francois Lebrun (CEA Saclay, Gif sur Yvette, France) and his collaborators report the discovery of 91 gamma-ray sources towards the direction of the Galactic centre. Lebrun's team includes Ubertini and seventeen other European scientists with long-standing experience in high-energy astrophysics. Much to the team's surprise, almost half of these sources do not fall in any class of known gamma-ray objects. They probably represent a new population of gamma-ray emitters. The first clues about a new class of gamma-ray objects came last October, when Integral discovered an intriguing gamma-ray source, known as IGRJ16318-4848. The data from Integral and ESA's other high-energy observatory XMM-Newton suggested that this object is a binary system, probably including a black hole or neutron star, embedded in a thick cocoon of cold gas and dust. When gas from the companion star is accelerated and swallowed by the black hole, energy is released at all wavelengths, mostly in the gamma rays. However, Lebrun is cautious to draw premature conclusions about the sources detected in the Galactic centre. Other interpretations are also possible that do not involve black holes. For instance, these objects could be the remains of exploded stars that are being energised by rapidly rotating celestial 'powerhouses', known as pulsars. Observations with another Integral instrument (SPI, the Spectrometer on Integral) could provide Lebrun and his team with more information on the nature of these sources.
SPI measures the energy of incoming gamma rays with extraordinary accuracy and allows scientist to gain a better understanding of the physical mechanisms that generate them. However, regardless of the precise nature of these gamma-ray sources, Integral's observations have convincingly shown that the energy output from these new objects accounts for almost ninety per cent of the soft gamma-ray background coming from the centre of the Galaxy. This result raises the tantalising possibility that objects of this type hide everywhere in the Galaxy, not just in its centre. Again, Lebrun is cautious, saying, "It is tempting to think that we can simply extrapolate our results to the entire Galaxy. However, we have only looked towards its centre and that is a peculiar place compared to the rest." Next on Integral's list of things to do is to extend this work to the rest of the Galaxy. Christoph Winkler, ESA's Integral Project Scientist, says, "We now have to work on the whole disc region of the Galaxy. This will be a tough and long job for Integral. But at the end, the reward will be an exhaustive inventory of the most energetic celestial objects in the Galaxy." Note to editors The paper explaining these results will appear on the 18 March 2004 issue of "Nature". The author list includes F. Lebrun, R. Terrier, A. Bazzano, G. Belanger, A. Bird, L. Bouchet, A. Dean, M. Del Santo, A. Goldwurm, N. Lund, H. Morand, A. Parmar, J. Paul, J.-P. Roques, V. Schoenfelder, A. Strong, P. Ubertini, R. Walter and C. Winkler. For information about the related INTEGRAL and XMM-Newton discovery of IGRJ16318-4848, see: http://www.esa.int/esaSC/Pr_21_2003_s_en.html Integral The International Gamma Ray Astrophysics Laboratory (Integral) is the first space observatory that can simultaneously observe celestial objects in gamma rays, X-rays and visible light. Integral was launched on a Russian Proton rocket on 17 October 2002 into a highly elliptical orbit around Earth. Its principal targets include regions of the galaxy where chemical elements are being produced and compact objects, such as black holes. IBIS, Imager on Board the Integral Satellite - IBIS provides sharper gamma-ray images than any previous gamma-ray instrument. It can locate sources to a precision of 30 arcseconds, the equivalent of measuring the height of a person standing in a crowd, 1.3 kilometres away. The Principal Investigators that built the instrument are P. Ubertini (IAS/CNR, Rome, Italy), F. Lebrun (CEA Saclay, Gif sur Yvette, France), G. Di Cocco (ITESRE, Bologna, Italy). IBIS is equipped with the first un-cooled semiconductor gamma-ray camera, called ISGRI, which is responsible for its outstanding sensitivity. ISGRI was developed and built for ESA by CEA Saclay, France. SPI, Spectrometer on Integral - SPI measures the energy of incoming gamma rays with extraordinary accuracy. It is more sensitive to faint radiation than any previous gamma ray instrument and allows the precise nature of gamma ray sources to be determined. The Principal Investigators that developed SPI are J.-P. Roques, (CESR, Toulouse, France) and V. Schoenfelder (MPE, Garching, Germany). XMM-Newton XMM-Newton can detect more X-ray sources than any previous observatory and is helping to solve many cosmic mysteries of the violent Universe, from black holes to the formation of galaxies. It was launched on 10 December 1999, using an Ariane-5 rocket from French Guiana. 
Its orbit takes it almost a third of the way to the Moon, so that astronomers can enjoy long, uninterrupted views of celestial objects.

  5. Sediment source fingerprinting to quantify fine sediment sources in forested catchments, Chile.

    NASA Astrophysics Data System (ADS)

    Schuller, P.; Walling, D. E.; Iroume, A.; Castillo, A.; Quilodran, C.

    2012-04-01

    A study to improve the understanding of the primary sediment sources and transfer pathways in catchments disturbed following forest plantation harvesting is being undertaken in South-Central Chile. The study focuses on two sets of paired experimental catchments (treatment and control), located about 400 km apart, with similar soil type but contrasting mean annual rainfall: Nacimiento (1,200 mm year-1) and Los Ulmos (2,500 mm year-1). Sediment source fingerprinting techniques are being used to document the primary fine sediment sources. In each catchment, three potential sediment sources were defined: clearcut slopes (Z1), forest roads (Z2) and the stream channel (Z3). In each catchment, multiple representative composite samples of the different potential source materials were collected before harvest operations from the upper 1 cm layer in Z1, Z2, and from the channel bank and bed for Z3. A time-integrating trap sampler installed in the discharge monitoring station constructed at the outlet of each catchment has been used to collect samples of the suspended sediment and these have been supplemented by sediment collected from the weir pools. Total suspended sediment load is being quantified in the monitoring stations using discharge records and integrated water sampling. Caesium-137 (137Cs), excess lead-210 (210Pbex) and other sediment properties are being used as fingerprints. After air-drying, oven-drying at 40°C and disaggregation, both the source material samples and the sediment samples collected in the discharge monitoring stations were sieved through a 63-μm sieve and the <63-μm fractions were used for subsequent analyses. For radionuclide assay, the samples were sealed in Petri dishes and after 4 weeks the mass activity density (activity concentration) of 137Cs and 210Pbex was determined by gamma analysis, using an ORTEC extended range Ge detector of 53% relative efficiency. The 137Cs and 210Pbex activity and organic carbon (Corg) concentration associated with potential source materials and the target sediment show that the two radionuclides used in combination with the Corg property provide effective source fingerprints. Additional work using a mixing model taking account of particle size effects is required to establish the relative contributions of the three sources to the fine sediment loads of the study catchments. This research is supported by the Chilean Government through FONDECYT Project 1090574 and by the IAEA through CRP D1.20.11 (Contract CHI-15531 and Technical Contract 15478) and the RLA 05/051 Project.
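
    The mixing-model step referred to above can be pictured as a constrained least-squares un-mixing: given mean tracer values (137Cs, 210Pbex, Corg) for the three potential sources and for the collected sediment, solve for non-negative source proportions that sum to one. The numbers below are invented placeholders, not measurements from the Chilean catchments, and the simple formulation omits the particle-size corrections the authors mention.

      import numpy as np
      from scipy.optimize import minimize

      # Rows: tracers (137Cs, 210Pbex, Corg); columns: sources (clearcut Z1, roads Z2, channel Z3).
      # All values are illustrative placeholders only.
      source_signatures = np.array([[4.0, 0.5, 1.0],    # 137Cs   (Bq kg-1)
                                    [30.0, 5.0, 8.0],   # 210Pbex (Bq kg-1)
                                    [5.0, 1.0, 2.0]])   # Corg    (%)
      sediment = np.array([2.2, 15.0, 3.0])

      def misfit(p):
          return np.sum((source_signatures @ p - sediment) ** 2)

      result = minimize(misfit, x0=np.full(3, 1 / 3), method="SLSQP",
                        bounds=[(0.0, 1.0)] * 3,
                        constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
      for name, share in zip(["clearcut (Z1)", "roads (Z2)", "channel (Z3)"], result.x):
          print(f"{name:14s} {share:.1%}")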

  6. The GALAXIE all-optical FEL project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenzweig, J. B.; Arab, E.; Andonian, G.

    2012-12-21

    We describe a comprehensive project, funded under the DARPA AXiS program, to develop an all-optical table-top X-ray FEL based on dielectric acceleration and electromagnetic undulators, yielding a compact source of coherent X-rays for medical and related applications. The compactness of this source demands that high field (>GV/m) acceleration and undulation-inducing fields be employed, thus giving rise to the project's acronym: GV/m AcceLerator And X-ray Integrated Experiment (GALAXIE). There are numerous physics and technical hurdles to surmount in this ambitious scenario, and the integrated solutions include: a biharmonic photonic TW structure, 200 micron wavelength electromagnetic undulators, 5 μm laser development, ultra-high brightness magnetized/asymmetric emittance electron beam generation, and SASE FEL operation. We describe the overall design philosophy of the project, the innovative approaches to addressing the challenges presented by the design, and the significant progress towards realization of these approaches in the nine months since project initialization.

  7. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    PubMed

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype is successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation is performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query service.
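
    A minimal sketch of the kind of XML-to-RDF mapping performed by a transformation module such as the one above, using rdflib; the XML layout, namespace URI, and property names are illustrative assumptions, not the actual ADEpedia schema.

    import xml.etree.ElementTree as ET
    from rdflib import Graph, Literal, Namespace

    EX = Namespace("http://example.org/ade#")  # hypothetical namespace

    xml_doc = """
    <adverseEvents>
      <event id="e1">
        <drug>metformin</drug>
        <reaction>lactic acidosis</reaction>
      </event>
    </adverseEvents>
    """

    g = Graph()
    g.bind("ex", EX)
    for event in ET.fromstring(xml_doc).findall("event"):
        subject = EX["event/" + event.get("id")]        # mint a URI per event
        g.add((subject, EX.drug, Literal(event.findtext("drug"))))
        g.add((subject, EX.reaction, Literal(event.findtext("reaction"))))

    print(g.serialize(format="turtle"))                 # RDF ready for the triple store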

  8. Mining and Integration of Environmental Data

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.; Habala, O.; Ciglan, M.

    2009-04-01

    The project ADMIRE (Advanced Data Mining and Integration Research for Europe) is a 7th FP EU ICT project that aims to deliver a consistent and easy-to-use technology for extracting information and knowledge. The project is motivated by the difficulty of extracting meaningful information by data mining combinations of data from multiple heterogeneous and distributed resources. It will also provide an abstract view of data mining and integration, which will give users and developers the power to cope with the complexity and heterogeneity of services, data and processes. The data sets describing phenomena from domains like business, society, and environment often contain spatial and temporal dimensions. Integration of spatio-temporal data from different sources is a challenging task due to those dimensions. Different spatio-temporal data sets contain data at different resolutions (e.g. size of the spatial grid) and frequencies. This heterogeneity is the principal challenge of integrating geo-spatial and temporal data sets - the integrated data set should hold homogeneous data of the same resolution and frequency. Thus, to integrate heterogeneous spatio-temporal data from distinct sources, transformation of one or more data sets is necessary. The following transformation operations are required:
    • transformation to a common spatial and temporal representation (e.g. transformation to a common coordinate system),
    • spatial and/or temporal aggregation - data from the more detailed data source are aggregated to match the resolution of the other resources involved in the integration process,
    • spatial and/or temporal record decomposition - records from the source with lower resolution are decomposed to match the granularity of the other data source. This operation decreases data quality (e.g. transformation of data from a 50 km grid to a 10 km grid) - the lower-resolution data in the integrated schema are imprecise, but this allows the higher-resolution data to be preserved.
    The spatio-temporal data integration can be decomposed into the following phases:
    • pre-integration data processing - different data sets can be physically stored in different formats (e.g. relational databases, text files); it might be necessary to pre-process the data sets to be integrated,
    • identification of the transformation operations necessary to integrate data in the spatio-temporal dimensions,
    • identification of the transformation operations to be performed on non-spatio-temporal attributes, and
    • output data schema and set generation - given the prepared data and the set of transformation operations, the final integrated schema is produced.
    The spatio-temporal dimension also brings its own specifics to the problem of mining spatio-temporal data sets. Spatio-temporal relationships exist among records in (s-t) data sets, and those relationships should be considered in mining operations. This means that when analyzing a record in a spatio-temporal data set, the records in its spatial and/or temporal proximity should be taken into account. In addition, the relationships discovered in spatio-temporal data can be different when mining the same data at different scales (e.g. mining the same data sets on a 50 km grid with daily data vs. a 10 km grid with hourly data). To be able to do effective data mining, we first needed to gather a sufficient amount of environmental data covering a similar area and time span.
For this purpose we have engaged in cooperation with several organizations working in the environmental domain in Slovakia, some of which are also our partners from previous research efforts. The organizations which volunteered some of their data are the Slovak Hydro-meteorological Institute (SHMU), the Slovak Water Enterprise (SVP), the Soil Science and Conservation Institute (VUPOP), and the Institute of Hydrology of the Slovak Academy of Sciences (UHSAV). We have prepared scenarios from general meteorology, as well as specialized in hydrology and soil protection.
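
    A toy illustration of the aggregation step described above, in which hourly point records on a fine grid are averaged to daily values on a coarser grid; the column names, grid size, and values are assumptions for this sketch, not ADMIRE data.

    import pandas as pd

    records = pd.DataFrame({
        "time": pd.to_datetime(["2009-01-01 00:00", "2009-01-01 06:00",
                                "2009-01-01 12:00", "2009-01-02 00:00"]),
        "x_km": [12, 14, 55, 13],      # fine-grid coordinates
        "y_km": [23, 21, 60, 22],
        "temp": [1.5, 2.0, 3.5, 0.5],
    })

    cell = 50  # target (coarser) grid cell size in km
    records["cell_x"] = records["x_km"] // cell
    records["cell_y"] = records["y_km"] // cell
    records["day"] = records["time"].dt.floor("D")

    # spatial + temporal aggregation: mean value per coarse cell per day
    aggregated = (records
                  .groupby(["day", "cell_x", "cell_y"])["temp"]
                  .mean()
                  .reset_index())
    print(aggregated)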

  9. An all-silicon optical PC-to-PC link utilizing USB

    NASA Astrophysics Data System (ADS)

    Goosen, Marius E.; Alberts, Antonie C.; Venter, Petrus J.; du Plessis, Monuko; Rademeyer, Pieter

    2013-02-01

    An integrated silicon light source still remains the Holy Grail for integrated optical communication systems. Hot carrier luminescent light sources provide a way to create light in a standard CMOS process, potentially enabling cost-effective optical communication between CMOS integrated circuits. In this paper we present a 1 Mb/s integrated silicon optical link for information transfer, targeting a real-world integrated solution by connecting two PCs via a USB port while transferring data optically between the devices. This realization represents the first optical communication product prototype utilizing a CMOS light emitter. The silicon light sources, which are implemented in a standard 0.35 μm CMOS technology, are electrically modulated and detected using a commercial silicon avalanche photodiode. Data rates exceeding 10 Mb/s using silicon light sources have previously been demonstrated using raw bit streams. In this work, data are sent in two half-duplex streams accompanied by the separate transmission of a clock. Such an optical communication system could find application in high noise environments where data fidelity, range and cost are determining factors.

  10. Screening the Emission Sources of Volatile Organic Compounds (VOCs) in China Based on Multi-effect Evaluation

    NASA Astrophysics Data System (ADS)

    Niu, H., Jr.

    2015-12-01

    Volatile organic compounds (VOCs) in the atmosphere have adverse impacts via three main pathways: photochemical ozone formation, secondary organic aerosol production, and direct toxicity to humans. Few studies have integrated these effects to prioritize control measures for VOC sources. In this study, we developed a multi-effect evaluation methodology based on updated emission inventories and source profiles, which was combined with ozone formation potential (OFP), secondary organic aerosol potential (SOAP), and VOC toxicity data to identify important emission sources and key species. We derived species-specific emission inventories for 152 sources. The OFPs, SOAPs, and toxicity of each source were determined, and the contribution and share of each source to each of these adverse effects were calculated. Weightings were given to the three adverse effects by expert scoring, and the integrated impact was determined. Using 2012 as the base year, solvent usage and industrial processes were found to be the most important anthropogenic sources, accounting for 24.2 and 23.1% of the integrated environmental effect, respectively. These were followed by biomass burning, transportation, and fossil fuel combustion, all of which had a similar contribution ranging from 16.7 to 18.6%. The top five industrial sources, including plastic products, rubber products, chemical fiber products, the chemical industry, and oil refining, accounted for nearly 70.0% of industrial emissions. In China, emissions reductions are required for styrene, toluene, ethylene, benzene, and m/p-xylene. The 10 most abundant chemical species contributed 76.5% of the integrated impact. Beijing, Chongqing, Shanghai, Jiangsu, and Guangdong were the five leading provinces when considering the integrated effects. In addition, the chemical mass balance (CMB) model was used to verify the VOC inventories of 47 cities in China and to refine our evaluation results. We suggest that multi-effect evaluation is necessary to identify the need for abatement at the source type and substance levels.
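
    A small sketch of the multi-effect scoring idea: each source's shares of the OFP, SOAP, and toxicity totals are combined with expert weights into one integrated impact score. The shares and weights below are invented for illustration and are not the study's values.

    sources = {
        # source: (OFP share, SOAP share, toxicity share) as fractions of the totals
        "solvent usage":      (0.28, 0.25, 0.20),
        "industrial process": (0.22, 0.26, 0.21),
        "biomass burning":    (0.18, 0.17, 0.19),
    }
    weights = (0.4, 0.35, 0.25)  # assumed expert weights for the three effects (sum to 1)

    integrated = {
        name: sum(w * s for w, s in zip(weights, shares))
        for name, shares in sources.items()
    }
    for name, score in sorted(integrated.items(), key=lambda kv: -kv[1]):
        print(f"{name}: integrated impact share {score:.3f}")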

  11. Multisource Data Integration in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1991-01-01

    Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer-) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.

  12. The nature of the embedded population in the Rho Ophiuchi dark cloud - Mid-infrared observations

    NASA Technical Reports Server (NTRS)

    Lada, C. J.; Wilking, B. A.

    1984-01-01

    In combination with previous IR and optical data, the present 10-20 micron observations of previously identified members of the embedded population of the Rho Ophiuchi dark cloud allow determinations to be made of the broadband energy distributions for 32 of the 44 sources. The majority of the sources are found to emit the bulk of their luminosity in the 1-20 micron range, and to be surrounded by dust shells. Because they are, in light of these characteristics, probably pre-main-sequence in nature, relatively accurate bolometric luminosities for these objects can be obtained through integration of their energy distributions. It is found that 44 percent of the sources are less luminous than the sun, and are among the lowest luminosity pre-main-sequence/protostellar objects observed to date.

  13. 3D Sound Techniques for Sound Source Elevation in a Loudspeaker Listening Environment

    NASA Astrophysics Data System (ADS)

    Kim, Yong Guk; Jo, Sungdong; Kim, Hong Kook; Jang, Sei-Jin; Lee, Seok-Pil

    In this paper, we propose several 3D sound techniques for sound source elevation in stereo loudspeaker listening environments. The proposed method integrates a head-related transfer function (HRTF) for sound positioning and early reflections for adding a reverberant ambience. In addition, spectral notch filtering and directional band boosting techniques are also included for increasing elevation perception capability. In order to evaluate the elevation performance of the proposed method, subjective listening tests are conducted using several kinds of sound sources such as white noise, sound effects, speech, and music samples. It is shown from the tests that the degrees of perceived elevation achieved by the proposed method are around 17° to 21° when the stereo loudspeakers are located on the horizontal plane.
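
    One of the listed ingredients, spectral notch filtering, can be sketched as a narrow band-stop filter applied to the source signal; the notch frequency, bandwidth, and filter order below are assumed values, not those used in the paper.

    import numpy as np
    from scipy.signal import butter, sosfilt

    fs = 44100
    t = np.arange(0, 1.0, 1 / fs)
    signal = np.random.default_rng(0).normal(size=t.size)   # white-noise test source

    low, high = 7000, 9000                                   # assumed notch band (Hz)
    sos = butter(4, [low, high], btype="bandstop", fs=fs, output="sos")
    elevated = sosfilt(sos, signal)                          # signal with the elevation notch applied
    print(elevated[:5])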

  14. TIGRESS highly-segmented high-purity germanium clover detector

    NASA Astrophysics Data System (ADS)

    Scraggs, H. C.; Pearson, C. J.; Hackman, G.; Smith, M. B.; Austin, R. A. E.; Ball, G. C.; Boston, A. J.; Bricault, P.; Chakrawarthy, R. S.; Churchman, R.; Cowan, N.; Cronkhite, G.; Cunningham, E. S.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hyland, B.; Jones, B.; Leslie, J. R.; Martin, J.-P.; Morris, D.; Morton, A. C.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Svensson, C. E.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.; Zimmerman, L.

    2005-05-01

    The TRIUMF-ISAC Gamma-Ray Escape-Suppressed Spectrometer (TIGRESS) will consist of twelve units of four high-purity germanium (HPGe) crystals in a common cryostat. The outer contacts of each crystal will be divided into four quadrants and two lateral segments for a total of eight outer contacts. The performance of a prototype HPGe four-crystal unit has been investigated. Integrated noise spectra for all contacts were measured. Energy resolutions, relative efficiencies for both individual crystals and for the entire unit, and peak-to-total ratios were measured with point-like sources. Position-dependent performance was measured by moving a collimated source across the face of the detector.

  15. Programmable LED-based integrating sphere light source for wide-field fluorescence microscopy.

    PubMed

    Rehman, Aziz Ul; Anwer, Ayad G; Goldys, Ewa M

    2017-12-01

    Wide-field fluorescence microscopy commonly uses a mercury lamp, which has limited spectral capabilities. We designed and built a programmable integrating sphere light (PISL) source which consists of nine LEDs, light-collecting optics, a commercially available integrating sphere and a baffle. The PISL source is tuneable in the range 365-490 nm with a uniform spatial profile and sufficient power at the objective to carry out spectral imaging. We retrofitted a standard fluorescence inverted microscope DM IRB (Leica) with a PISL source by mounting it together with a highly sensitive low-noise CMOS camera. The capabilities of the setup have been demonstrated by carrying out multispectral autofluorescence imaging of live BV2 cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Emissions & Measurements - Black Carbon | Science ...

    EPA Pesticide Factsheets

    Emissions and Measurement (EM) research activities performed within the National Risk Management Research Lab (NRMRL) of EPA's Office of Research and Development (ORD) support measurement and laboratory analysis approaches to accurately characterize source emissions and near-source concentrations of air pollutants. They also support integrated Agency research programs (e.g., source to health outcomes) and the development of databases and inventories that assist Federal, state, and local air quality managers and industry in implementing and complying with air pollution standards. EM research underway in NRMRL supports the Agency's efforts to accurately characterize, analyze, measure and manage sources of air pollution. This pamphlet focuses on the EM research that NRMRL researchers conduct related to black carbon (BC). Black carbon is a pollutant of concern to EPA due to its potential impact on human health and climate change. There are extensive uncertainties in emissions of BC from stationary and mobile sources.

  17. Using Commercially available Tools for multi-faceted health assessment: Data Integration Lessons Learned

    PubMed Central

    Wilamowska, Katarzyna; Le, Thai; Demiris, George; Thompson, Hilaire

    2013-01-01

    Health monitoring data collected from multiple available intake devices provide a rich resource to support older adult health and wellness. Though large amounts of data can be collected, there is currently a lack of understanding on integration of these various data sources using commercially available products. This article describes an inexpensive approach to integrating data from multiple sources from a recently completed pilot project that assessed older adult wellness, and demonstrates challenges and benefits in pursuing data integration using commercially available products. The data in this project were sourced from a) electronically captured participant intake surveys, and existing commercial software output for b) vital signs and c) cognitive function. All the software used for data integration in this project was freeware and was chosen because of its ease of comprehension by novice database users. The methods and results of this approach provide a model for researchers with similar data integration needs to easily replicate this effort at a low cost. PMID:23728444

  18. Relative Throughput of the Near-IR Science Instruments for the James Webb Space Telescope as Measured During Ground Testing the Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Malumuth, Eliot; Birkmann, Stephan; Kelly, Douglas M.; Kimble, Randy A.; Lindler, Don; Martel, Andre; Ohl, Raymond G.; Rieke, Marcia J.; Rowlands, Neil; Te Plate, Maurice

    2016-01-01

    Data were obtained for the purpose of measuring the relative throughput of the Near-IR Science Instruments (SIs) of the James Webb Space Telescope (JWST) as part of the second and third cryogenic-vacuum tests (CV2, CV3) of the Integrated Science Instrument Module (ISIM) conducted at the Goddard Space Flight Center (GSFC) in 2014 and 2015-2016, at the beginning and end of the environmental test program, respectively. This poster focuses on data obtained as part of the Initial Optical Baseline and as part of the Final Performance test -- two epochs that roughly bracket the CV3 test. The purpose of the test is to trend relative throughput to monitor for any potential changes from gross problems such as contamination or degradation of an optical element. Point source data were taken at a variety of wavelengths for NIRCam Module A and Module B, NIRSpec, NIRISS, Guider 1 and Guider 2 using the Laser Diode (LD) 1.06 micron, LD 1.55 micron, 2.1 micron LED and 3.5 micron LED, as well as for NIRCam Mod A and B and NIRISS using a tungsten source and the F277W and F480M filters. Spectra were taken using the G140M, G235M, and G395M gratings for NIRSpec, the GRISMR grism for NIRCam Mod A and B and the GR150C grism for NIRISS. The results of these measurements are compared to what would be expected given the efficiency of each of the optical elements in each SI. Although these data were taken as a check against gross problems, they can also be used to provide the first relative throughput estimate for each SI through the various filters and source wavelengths measured in their flight-like configurations.

  19. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy-applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  20. Multiple-source spatial data fusion and integration research in the region unified planning management information system

    NASA Astrophysics Data System (ADS)

    Liu, Zhijun; Zhang, Liangpei; Liu, Zhenmin; Jiao, Hongbo; Chen, Liqun

    2008-12-01

    In order to manage the internal resources of the Gulf of Tonkin and to integrate multiple-source spatial data, the establishment of a region unified planning management system is needed. Data fusion and integration research must be carried out because several difficulties arise in the course of the system's establishment. For example, the formats of the various planning and project data differ, the data standards are not unified, the temporal properties are strong, and the spatial references are inconsistent. In this article ArcGIS Engine is introduced as the development platform, and key technologies are researched, such as multiple-source data transformation and fusion, fusion and integration of remote sensing data and DEM, and integration of planning and project data. Practice shows that the system significantly improves the working efficiency of the Guangxi Gulf of Tonkin Economic Zone Management Committee and remarkably promotes the planning and construction work of the economic zone.

  1. Deep Kalman Filter: Simultaneous Multi-Sensor Integration and Modelling; A GNSS/IMU Case Study

    PubMed Central

    Hosseinyalamdary, Siavash

    2018-01-01

    Bayes filters, such as the Kalman and particle filters, have been used in sensor fusion to integrate two sources of information and obtain the best estimate of unknowns. The efficient integration of multiple sensors requires deep knowledge of their error sources. Some sensors, such as the Inertial Measurement Unit (IMU), have complicated error sources. Therefore, IMU error modelling and the efficient integration of IMU and Global Navigation Satellite System (GNSS) observations have remained a challenge. In this paper, we developed a deep Kalman filter to model and remove IMU errors and, consequently, improve the accuracy of IMU positioning. To achieve this, we added a modelling step to the prediction and update steps of the Kalman filter, so that the IMU error model is learned during integration. The results showed our deep Kalman filter outperformed the conventional Kalman filter and reached a higher level of accuracy. PMID:29695119
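
    A heavily simplified one-dimensional sketch of the idea of adding a modelling step to the Kalman prediction/update loop; the running bias estimate below stands in for the error model that the paper learns during integration, and all matrices, gains, and measurements are invented for illustration.

    import numpy as np

    def kalman_with_bias_model(accel_meas, gps_pos, dt=1.0):
        """Toy 1-D position/velocity filter with an extra IMU-bias modelling step."""
        x = np.zeros(2)                    # state: [position, velocity]
        P = np.eye(2)
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])    # how acceleration enters the state
        H = np.array([[1.0, 0.0]])         # GNSS observes position only
        Q, R = 0.01 * np.eye(2), np.array([[1.0]])
        bias = 0.0                         # assumed simple additive IMU bias

        for a, z in zip(accel_meas, gps_pos):
            a_corr = a - bias              # modelling step: remove the estimated bias
            x = F @ x + B * a_corr         # prediction with the corrected IMU input
            P = F @ P @ F.T + Q
            y = z - (H @ x)[0]             # innovation from the GNSS position fix
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K[:, 0] * y            # update
            P = (np.eye(2) - K @ H) @ P
            bias -= 0.05 * y               # crude bias correction from the innovation (illustrative only)
        return x

    print(kalman_with_bias_model([1.1, 1.1, 1.1], [0.5, 2.0, 4.5]))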

  2. Deep Kalman Filter: Simultaneous Multi-Sensor Integration and Modelling; A GNSS/IMU Case Study.

    PubMed

    Hosseinyalamdary, Siavash

    2018-04-24

    Bayes filters, such as the Kalman and particle filters, have been used in sensor fusion to integrate two sources of information and obtain the best estimate of unknowns. The efficient integration of multiple sensors requires deep knowledge of their error sources. Some sensors, such as the Inertial Measurement Unit (IMU), have complicated error sources. Therefore, IMU error modelling and the efficient integration of IMU and Global Navigation Satellite System (GNSS) observations have remained a challenge. In this paper, we developed a deep Kalman filter to model and remove IMU errors and, consequently, improve the accuracy of IMU positioning. To achieve this, we added a modelling step to the prediction and update steps of the Kalman filter, so that the IMU error model is learned during integration. The results showed our deep Kalman filter outperformed the conventional Kalman filter and reached a higher level of accuracy.

  3. Ambulatory position and orientation tracking fusing magnetic and inertial sensing.

    PubMed

    Roetenberg, Daniel; Slycke, Per J; Veltink, Peter H

    2007-05-01

    This paper presents the design and testing of a portable magnetic system combined with miniature inertial sensors for ambulatory 6 degrees of freedom (DOF) human motion tracking. The magnetic system consists of three orthogonal coils, the source, fixed to the body and 3-D magnetic sensors, fixed to remote body segments, which measure the fields generated by the source. Based on the measured signals, a processor calculates the relative positions and orientations between source and sensor. Magnetic actuation requires a substantial amount of energy, which limits the update rate with a set of batteries. Moreover, the magnetic field can easily be disturbed by ferromagnetic materials or other sources. Inertial sensors can be sampled at high rates, require only little energy and do not suffer from magnetic interferences. However, accelerometers and gyroscopes can only measure changes in position and orientation and suffer from integration drift. By combining measurements from both systems in a complementary Kalman filter structure, an optimal solution for position and orientation estimates is obtained. The magnetic system provides 6 DOF measurements at a relatively low update rate while the inertial sensors track the changes in position and orientation between the magnetic updates. The implemented system is tested against a lab-bound camera tracking system for several functional body movements. The accuracy was about 5 mm for position and 3 degrees for orientation measurements. Errors were higher during movements with high velocities due to relative movement between source and sensor within one cycle of magnetic actuation.
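
    The high-rate inertial / low-rate absolute-reference idea can be illustrated with a scalar complementary filter (the paper itself uses a complementary Kalman filter); the rates, gains, and noise levels below are assumptions made for the sketch.

    import numpy as np

    rng = np.random.default_rng(0)
    dt = 0.01                                          # 100 Hz inertial sampling
    t = np.arange(0, 5, dt)
    true_angle = 30 * np.sin(0.5 * t)                  # degrees
    gyro = np.gradient(true_angle, dt) + 0.5           # angular rate with a constant bias
    mag_period = 50                                    # magnetic update every 0.5 s
    alpha = 0.98                                       # weight on the inertial path

    est = 0.0
    for k in range(len(t)):
        est += gyro[k] * dt                            # integrate the angular rate
        if k % mag_period == 0:
            mag_meas = true_angle[k] + rng.normal(0, 1.0)   # noisy absolute orientation
            est = alpha * est + (1 - alpha) * mag_meas      # drift correction at the slow rate

    print(f"final error: {est - true_angle[-1]:.2f} deg")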

  4. Summer precipitation prediction in the source region of the Yellow River using climate indices

    NASA Astrophysics Data System (ADS)

    Yuan, F.

    2016-12-01

    The source region of the Yellow River contributes about 35% of the total water yield in the Yellow River basin, playing an important role in meeting downstream water resources requirements. The summer precipitation from June to September in the source region of the Yellow River accounts for about 70% of the annual total, and its decrease would cause further water shortage problems. Consequently, the objectives of this study are to improve the understanding of the linkages between the precipitation in the source region of the Yellow River and global teleconnection patterns, and to predict the summer precipitation based on the revealed teleconnections. Spatial variability of precipitation was investigated based on three homogeneous sub-regions. Principal component analysis and singular value decomposition were used to find significant relations between the precipitation in the source region of the Yellow River and global teleconnection patterns using climate indices. A back-propagation neural network was developed to predict the summer precipitation using significantly correlated climate indices. It was found that precipitation in the study area is positively related to the North Atlantic Oscillation, the West Pacific pattern and the El Niño-Southern Oscillation, and inversely related to the Polar Eurasian pattern. Summer precipitation was overall well predicted using these significantly correlated climate indices, and the Pearson correlation coefficient between predicted and observed summer precipitation was in general larger than 0.6. The results are useful for integrated water resources management in the Yellow River basin.
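
    A sketch of the prediction step: a small back-propagation network maps climate indices to summer precipitation. The data below are synthetic placeholders; only the index names follow the abstract, and the fitted coefficients have no relation to the study's results.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_years = 40
    X = rng.normal(size=(n_years, 4))                  # NAO, WP, ENSO, Polar Eurasian indices
    # assumed relationship: positive on the first three indices, negative on the last
    y = (400 + 30 * X[:, 0] + 25 * X[:, 1] + 20 * X[:, 2] - 15 * X[:, 3]
         + rng.normal(0, 10, n_years))                 # summer precipitation (mm)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X_train, y_train)
    r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
    print(f"Pearson r on held-out years: {r:.2f}")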

  5. Measuring temporal summation in visual detection with a single-photon source.

    PubMed

    Holmes, Rebecca; Victora, Michelle; Wang, Ranxiao Frances; Kwiat, Paul G

    2017-11-01

    Temporal summation is an important feature of the visual system which combines visual signals that arrive at different times. Previous research estimated complete summation to last for 100 ms for stimuli judged "just detectable." We measured the full range of temporal summation for much weaker stimuli using a new paradigm and a novel light source, developed in the field of quantum optics for generating small numbers of photons with precise timing characteristics and reduced variance in photon number. Dark-adapted participants judged whether a light was presented to the left or right of their fixation in each trial. In Experiment 1, stimuli contained a stream of photons delivered at a constant rate while the duration was systematically varied. Accuracy should increase with duration as long as the later photons can be integrated with the preceding ones into a single signal. The temporal integration window was estimated as the point at which performance no longer improved, and was found to be 650 ms on average. In Experiment 2, the duration of the visual stimuli was kept short (100 ms or <30 ms) while the number of photons was varied to explore the efficiency of summation over the integration window compared to Experiment 1. There was some indication that temporal summation remains efficient over the integration window, although there is variation between individuals. The relatively long integration window measured in this study may be relevant to studies of the absolute visual threshold, i.e., tests of single-photon vision, where "single" photons should be separated by greater than the integration window to avoid summation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. A web-enabled system for integrated assessment of watershed development

    USGS Publications Warehouse

    Dymond, R.; Lohani, V.; Regmi, B.; Dietz, R.

    2004-01-01

    Researchers at Virginia Tech have put together the primary structure of a web enabled integrated modeling system that has potential to be a planning tool to help decision makers and stakeholders in making appropriate watershed management decisions. This paper describes the integrated system, including data sources, collection, analysis methods, system software and design, and issues of integrating the various component models. The integrated system has three modeling components, namely hydrology, economics, and fish health, and is accompanied by descriptive 'help files.' Since all three components have a related spatial aspect, GIS technology provides the integration platform. When completed, a user will access the integrated system over the web to choose pre-selected land development patterns to create a 'what if' scenario using an easy-to-follow interface. The hydrologic model simulates effects of the scenario on annual runoff volume, flood peaks of various return periods, and ground water recharge. The economics model evaluates tax revenue and fiscal costs as a result of a new land development scenario. The fish health model evaluates effects of new land uses in zones of influence to the health of fish populations in those areas. Copyright ASCE 2004.

  7. All Source Sensor Integration Using an Extended Kalman Filter

    DTIC Science & Technology

    2012-03-22

    Front-matter excerpt: ASPN, All Source Positioning Navigation; DARPA, Defense Advanced... Navigation equations are developed for sensor preprocessed measurements, and these navigation equations are not dependent upon the integrating filter.

  8. The impact of relative intensity noise on the signal in multiple reference optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Neuhaus, Kai; Subhash, Hrebesh; Alexandrov, Sergey; Dsouza, Roshan; Hogan, Josh; Wilson, Carol; Leahy, Martin; Slepneva, Svetlana; Huyet, Guillaume

    2016-03-01

    Multiple reference optical coherence tomography (MR-OCT) applies a unique low-cost solution to enhance the scanning depth of standard time domain OCT by inserting a partial mirror into the reference arm of the interferometric system. This novel approach achieves multiple reflections for different layers and depths of a sample with minimal engineering effort and provides an excellent platform for low-cost OCT systems based on well understood production methods for micro-mechanical systems such as CD/DVD pick-up systems. The direct integration of a superluminescent light-emitting diode (SLED) is a preferable solution to reduce the form factor of an MR-OCT system. Such direct integration exposes the light source to environmental conditions that can increase fluctuations in heat dissipation and vibrations and affect the noise characteristics of the output spectrum. This work describes the impact of relative intensity noise (RIN) on the quality of the interference signal of MR-OCT under a variety of environmental conditions, such as temperature.

  9. Integrated spatial multiplexing of heralded single-photon sources

    PubMed Central

    Collins, M.J.; Xiong, C.; Rey, I.H.; Vo, T.D.; He, J.; Shahnia, S.; Reardon, C.; Krauss, T.F.; Steel, M.J.; Clark, A.S.; Eggleton, B.J.

    2013-01-01

    The non-deterministic nature of photon sources is a key limitation for single-photon quantum processors. Spatial multiplexing overcomes this by enhancing the heralded single-photon yield without enhancing the output noise. Here the intrinsic statistical limit of an individual source is surpassed by spatially multiplexing two monolithic silicon-based correlated photon pair sources in the telecommunications band, demonstrating a 62.4% increase in the heralded single-photon output without an increase in unwanted multipair generation. We further demonstrate the scalability of this scheme by multiplexing photons generated in two waveguides pumped via an integrated coupler with a 63.1% increase in the heralded photon rate. This demonstration paves the way for a scalable architecture for multiplexing many photon sources in a compact integrated platform and achieving efficient two-photon interference, required at the core of optical quantum computing and quantum communication protocols. PMID:24107840
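
    A back-of-the-envelope illustration of why spatial multiplexing helps: with N identical heralded sources each firing with probability p per clock cycle, the chance that at least one source heralds grows as 1 - (1 - p)^N, while the multi-pair noise of each individual source is unchanged. The numbers are illustrative only, not the experiment's rates.

    p = 0.05                      # assumed single-source heralding probability per cycle
    for n_sources in (1, 2, 4, 8):
        p_any = 1 - (1 - p) ** n_sources
        print(f"{n_sources} multiplexed source(s): heralding probability {p_any:.3f}")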

  10. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources.

    PubMed

    Gao, Xiang; Acar, Levent

    2016-07-04

    This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To address both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors and combines sensor data from different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.
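
    A toy version of the idea of combining odor and anemometer data from several positions: each detection votes for the grid cells lying upwind of the sensor, and the cell with the most votes is taken as the probable source region. The geometry and thresholds are invented for this sketch and do not reproduce the paper's algorithm.

    import numpy as np

    grid = np.zeros((20, 20))                      # 20 x 20 search grid
    # (x, y, odor_detected, wind_dx, wind_dy): wind blows from the source towards the sensor
    readings = [
        (12, 5, True,  1.0, 0.0),
        (14, 9, True,  0.8, 0.6),
        (3, 15, False, 1.0, 0.0),
    ]

    for x, y, detected, wdx, wdy in readings:
        if not detected:
            continue
        # march upwind from the sensor and vote for the cells traversed
        for step in range(1, 15):
            cx, cy = int(round(x - step * wdx)), int(round(y - step * wdy))
            if 0 <= cx < 20 and 0 <= cy < 20:
                grid[cy, cx] += 1

    iy, ix = np.unravel_index(np.argmax(grid), grid.shape)
    print(f"most probable source cell: x={ix}, y={iy}")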

  11. A Feature-Reinforcement-Based Approach for Supporting Poly-Lingual Category Integration

    NASA Astrophysics Data System (ADS)

    Wei, Chih-Ping; Chen, Chao-Chi; Cheng, Tsang-Hsiang; Yang, Christopher C.

    Document-category integration (or category integration for short) is fundamental to many e-commerce applications, including information integration along supply chains and information aggregation by intermediaries. Because of the trend of globalization, the requirement for category integration has been extended from monolingual to poly-lingual settings. Poly-lingual category integration (PLCI) aims to integrate two document catalogs, each of which consists of documents written in a mix of languages. Several category integration techniques have been proposed in the literature, but these techniques focus only on monolingual category integration rather than PLCI. In this study, we propose a feature-reinforcement-based PLCI (namely, FR-PLCI) technique that takes into account the master documents of all languages when integrating source documents (in the source catalog) written in a specific language into the master catalog. Using the monolingual category integration (MnCI) technique as a performance benchmark, our empirical evaluation results show that our proposed FR-PLCI technique achieves better integration accuracy than MnCI does in both English and Chinese category integration tasks.

  12. Systems and methods for supplemental weather information presentation on a display

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor)

    2010-01-01

    An embodiment of the supplemental weather display system presents supplemental weather information on a display in a craft. An exemplary embodiment receives the supplemental weather information from a remote source, determines a location of the supplemental weather information relative to the craft, receives weather information from an on-board radar system, and integrates the supplemental weather information with the weather information received from the on-board radar system.

  13. Model fitting data from syllogistic reasoning experiments.

    PubMed

    Hattori, Masasi

    2016-12-01

    The data presented in this article are related to the research article entitled "Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics" (M. Hattori, 2016) [1]. This article presents data predicted by three signature probabilistic models of syllogistic reasoning and model fitting results for each of a total of 12 experiments (N = 404) in the literature. Models are implemented in R, and their source code is also provided.

  14. Flood Control, Roseau River, Roseau and Kittson Counties, Minnesota. Final Environmental Impact Statement.

    DTIC Science & Technology

    1976-12-01

    ...(Lemna minor, Spirodela polyrhiza, Wolffia columbiana) and high chlorophyll concentrations were observed. Additional major sources of nitrogen... although some flooding is experienced in the city of Roseau during spring flood events due to minor... were then integrated into this "optimum" biological and archaeological plan. Relatively few conflicts developed. These included minor changes, which...

  15. Shaping biological knowledge: applications in proteomics.

    PubMed

    Lisacek, F; Chichester, C; Gonnet, P; Jaillet, O; Kappus, S; Nikitin, F; Roland, P; Rossier, G; Truong, L; Appel, R

    2004-01-01

    The central dogma of molecular biology has provided a meaningful principle for data integration in the field of genomics. In this context, integration reflects the known transitions from a chromosome to a protein sequence: transcription, intron splicing, exon assembly and translation. There is no such clear principle for integrating proteomics data, since the laws governing protein folding and interactivity are not quite understood. In our effort to bring together independent pieces of information relative to proteins in a biologically meaningful way, we assess the bias of bioinformatics resources and consequent approximations in the framework of small-scale studies. We analyse proteomics data while following both a data-driven (focus on proteins smaller than 10 kDa) and a hypothesis-driven (focus on whole bacterial proteomes) approach. These applications are potentially the source of specialized complements to classical biological ontologies.

  16. Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.

    PubMed

    Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter

    2005-03-01

    The objective of this study was to design and implement prototype software for capturing field data and automating the process for reporting and analyzing the distribution of mercury. The four phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate transfer into user databases, followed by (2) a re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. Results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.

  17. A finite element formulation for supersonic flows around complex configurations

    NASA Technical Reports Server (NTRS)

    Morino, L.

    1974-01-01

    The problem of small perturbation potential supersonic flow around complex configurations is considered. This problem requires the solution of an integral equation relating the values of the potential on the surface of the body to the values of the normal derivative, which is known from the small perturbation boundary conditions. The surface of the body is divided into small (hyperboloidal quadrilateral) surface elements which are described in terms of the Cartesian components of the four corner points. The values of the potential (and its normal derivative) within each element are assumed to be constant and equal to its value at the centroid of the element. This yields a set of linear algebraic equations whose coefficients are given by source and doublet integrals over the surface elements. Closed form evaluations of the integrals are presented.
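
    With constant potential per element, the integral equation reduces to a linear system of the schematic form (I - C) phi = B dphi/dn, where B and C collect the source and doublet influence integrals; the sketch below solves such a system with placeholder matrices standing in for those integrals (the notation is ours, not the paper's).

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6                                    # number of surface elements
    B = 0.1 * rng.random((n, n))             # assumed source influence coefficients
    C = 0.05 * rng.random((n, n))            # assumed doublet influence coefficients
    dphi_dn = np.ones(n)                     # normal derivative from the boundary conditions

    # solve (I - C) phi = B dphi/dn for the potential at the element centroids
    phi = np.linalg.solve(np.eye(n) - C, B @ dphi_dn)
    print(phi.round(3))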

  18. Representation and Integration of Scientific Information

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The objective of this Joint Research Interchange with NASA-Ames was to investigate how the Tsimmis technology could be used to represent and integrate scientific information. The funding allowed us to work with researchers within NAS at the NASA Ames Research Center, to understand their information needs, and to work with them on integration strategies. Most organizations have a need to access and integrate information from multiple, disparate information sources that may include both structured as well as semi-structured information. At Stanford we have been working on an information integration project called Tsimmis, supported by DARPA. The main goal of the Tsimmis project is to allow a decision maker to find information of interest from such sources, fuse it, and process it (e.g., summarize it, visualize it, discover trends). Another important goal is the easy incorporation of new sources, as well as the ability to deal with sources whose structure or services evolve. During the Interchange we had research meetings approximately every month or two. The participants from NASA included Michael Cox and Peter Vanderbilt. The Stanford PI and various students and Stanford staff members also participated. NASA researchers also participated in some of our regular Tsimmis meetings. As planned, our meetings discussed problems and solutions to various information integration problems.

  19. A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits.

    PubMed

    Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo

    2017-12-01

    One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.

  20. A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M.; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K.; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo

    2017-12-01

    One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.

  1. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial key safety issue to be addressed in the future reactors' safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for dust source term production prediction for future devices is presented.

  2. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.
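
    A dilution-of-precision figure of merit of this general kind can be computed from satellite geometry as in the sketch below; note that the thesis defines ADOP specifically for interferometric attitude determination, so this is only an analogous GDOP-style illustration with assumed line-of-sight vectors.

    import numpy as np

    los = np.array([                     # unit line-of-sight vectors to four satellites (assumed)
        [0.6,  0.3,  0.74],
        [-0.5, 0.4,  0.77],
        [0.1, -0.7,  0.71],
        [0.7, -0.1,  0.71],
    ])
    los /= np.linalg.norm(los, axis=1, keepdims=True)

    G = np.hstack([los, np.ones((len(los), 1))])   # design matrix with a clock/bias column
    dop = np.sqrt(np.trace(np.linalg.inv(G.T @ G)))
    print(f"geometry figure of merit (GDOP-like): {dop:.2f}")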

  3. Method and apparatus for in-system redundant array repair on integrated circuits

    DOEpatents

    Bright, Arthur A [Croton-on-Hudson, NY; Crumley, Paul G [Yorktown Heights, NY; Dombrowa, Marc B [Bronx, NY; Douskey, Steven M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Oakland, Steven F [Colchester, VT; Ouellette, Michael R [Westford, VT; Strissel, Scott A [Byron, MN

    2008-07-29

    Disclosed is a method of repairing an integrated circuit of the type comprising of a multitude of memory arrays and a fuse box holding control data for controlling redundancy logic of the arrays. The method comprises the steps of providing the integrated circuit with a control data selector for passing the control data from the fuse box to the memory arrays; providing a source of alternate control data, external of the integrated circuit; and connecting the source of alternate control data to the control data selector. The method comprises the further step of, at a given time, passing the alternate control data from the source thereof, through the control data selector and to the memory arrays to control the redundancy logic of the memory arrays.

  4. Integrated front-end electronics in a detector compatible process: source-follower and charge-sensitive preamplifier configurations

    NASA Astrophysics Data System (ADS)

    Ratti, Lodovico; Manghisoni, Massimo; Re, Valerio; Speziali, Valeria

    2001-12-01

    This study is concerned with the simulation and design of low-noise front-end electronics monolithically integrated on the same high-resistivity substrate as multielectrode silicon detectors, in a process made available by the Istituto per la Ricerca Scientifica e Tecnologica (ITC-IRST) of Trento, Italy. The integrated front-end solutions described in this paper use N-channel JFETs as basic elements. The first one is based upon an all-NJFET charge preamplifier designed to match detector capacitances of a few picofarads and available in both a resistive and a non-resistive feedback configuration. In the second solution, a single NJFET in the source-follower configuration is connected to the detector, while its source is wired to an external readout channel through an integrated capacitor.

  5. Method and apparatus for in-system redundant array repair on integrated circuits

    DOEpatents

    Bright, Arthur A [Croton-on-Hudson, NY; Crumley, Paul G [Yorktown Heights, NY; Dombrowa, Marc B [Bronx, NY; Douskey, Steven M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Oakland, Steven F [Colchester, VT; Ouellette, Michael R [Westford, VT; Strissel, Scott A [Byron, MN

    2008-07-08

    Disclosed is a method of repairing an integrated circuit of the type comprising of a multitude of memory arrays and a fuse box holding control data for controlling redundancy logic of the arrays. The method comprises the steps of providing the integrated circuit with a control data selector for passing the control data from the fuse box to the memory arrays; providing a source of alternate control data, external of the integrated circuit; and connecting the source of alternate control data to the control data selector. The method comprises the further step of, at a given time, passing the alternate control data from the source thereof, through the control data selector and to the memory arrays to control the redundancy logic of the memory arrays.

  6. Method and apparatus for in-system redundant array repair on integrated circuits

    DOEpatents

    Bright, Arthur A.; Crumley, Paul G.; Dombrowa, Marc B.; Douskey, Steven M.; Haring, Rudolf A.; Oakland, Steven F.; Ouellette, Michael R.; Strissel, Scott A.

    2007-12-18

    Disclosed is a method of repairing an integrated circuit of the type comprising of a multitude of memory arrays and a fuse box holding control data for controlling redundancy logic of the arrays. The method comprises the steps of providing the integrated circuit with a control data selector for passing the control data from the fuse box to the memory arrays; providing a source of alternate control data, external of the integrated circuit; and connecting the source of alternate control data to the control data selector. The method comprises the further step of, at a given time, passing the alternate control data from the source thereof, through the control data selector and to the memory arrays to control the redundancy logic of the memory arrays.

  7. Deficient multisensory integration in schizophrenia: an event-related potential study.

    PubMed

    Stekelenburg, Jeroen J; Maes, Jan Pieter; Van Gool, Arthur R; Sitskoorn, Margriet; Vroomen, Jean

    2013-07-01

    In many natural audiovisual events (e.g., the sight of a face articulating the syllable /ba/), the visual signal precedes the sound and thus allows observers to predict the onset and the content of the sound. In healthy adults, the N1 component of the event-related brain potential (ERP), reflecting neural activity associated with basic sound processing, is suppressed if a sound is accompanied by a video that reliably predicts sound onset. If the sound does not match the content of the video (e.g., hearing /ba/ while lipreading /fu/), the later occurring P2 component is affected. Here, we examined whether these visual information sources affect auditory processing in patients with schizophrenia. The electroencephalography (EEG) was recorded in 18 patients with schizophrenia and compared with that of 18 healthy volunteers. As stimuli we used video recordings of natural actions in which visual information preceded and predicted the onset of the sound that was either congruent or incongruent with the video. For the healthy control group, visual information reduced the auditory-evoked N1 if compared to a sound-only condition, and stimulus-congruency affected the P2. This reduction in N1 was absent in patients with schizophrenia, and the congruency effect on the P2 was diminished. Distributed source estimations revealed deficits in the network subserving audiovisual integration in patients with schizophrenia. The results show a deficit in multisensory processing in patients with schizophrenia and suggest that multisensory integration dysfunction may be an important and, to date, under-researched aspect of schizophrenia. Copyright © 2013. Published by Elsevier B.V.

  8. 76 FR 31507 - Domestic Licensing of Source Material-Amendments/Integrated Safety Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-01

    ... Licensing of Source Material--Amendments/Integrated Safety Analysis AGENCY: Nuclear Regulatory Commission... rule announced the availability of a draft regulatory analysis for public comment. This document... in Section XI, ``Regulatory Analysis.'' The correct ADAMS accession number is ML102380243. DATES: The...

  9. Better Assessment Science Integrating Point and Nonpoint Sources

    EPA Science Inventory

    Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is not a model per se, but is a multipurpose environmental decision support system for use by regional, state, and local agencies in performing watershed- and water-quality-based studies. BASI...

  10. Single Quantum Dot with Microlens and 3D-Printed Micro-objective as Integrated Bright Single-Photon Source

    PubMed Central

    2017-01-01

    Integrated single-photon sources with high photon-extraction efficiency are key building blocks for applications in the field of quantum communications. We report on a bright single-photon source realized by on-chip integration of a deterministic quantum dot microlens with a 3D-printed multilens micro-objective. The device concept benefits from a sophisticated combination of in situ 3D electron-beam lithography to realize the quantum dot microlens and 3D femtosecond direct laser writing for creation of the micro-objective. In this way, we obtain a high-quality quantum device with broadband photon-extraction efficiency of (40 ± 4)% and high suppression of multiphoton emission events with g^(2)(τ = 0) < 0.02. Our results highlight the opportunities that arise from tailoring the optical properties of quantum emitters using integrated optics with high potential for the further development of plug-and-play fiber-coupled single-photon sources. PMID:28670600
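
    The quoted g^(2)(τ = 0) < 0.02 is the kind of figure obtained from a Hanbury Brown-Twiss coincidence measurement. Below is a minimal sketch, with made-up peak areas, of how such a value is commonly estimated for a pulsed source: the zero-delay coincidence peak area is divided by the mean of the side peaks.

```python
import numpy as np

# Toy coincidence histogram from a pulsed HBT measurement: one peak per pulse
# period; the zero-delay peak is strongly suppressed for a single-photon source.
peak_areas = np.array([1020, 985, 1003, 12, 997, 1011, 990])  # counts per peak (made up)
zero_delay_index = 3

side_peaks = np.delete(peak_areas, zero_delay_index)
g2_zero = peak_areas[zero_delay_index] / side_peaks.mean()
print(f"g2(0) ~ {g2_zero:.3f}")   # values well below 0.5 indicate single-photon emission
```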

  11. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data has become so crucial for many organizations that they have begun to produce spatial data themselves. In some circumstances, the need to obtain real-time integrated data requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the main challenges in such situations is the high degree of heterogeneity between different organizations' data. To address this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. The approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled and, subsequently, the ontology of each database is created. In the second step, the resulting ontology is inserted into the database, and the relationships of each ontology class are stored in newly created columns in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of the data through ontology mapping. The proposed approach, in addition to being fast and low cost, makes data integration straightforward while leaving the source data unchanged, and thus takes advantage of the existing legacy applications.
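
    The first two steps of the workflow described above (turning relational tables into ontology classes and columns into properties) can be illustrated with a small rdflib sketch; the schema and namespace below are hypothetical and not taken from the paper.

```python
from rdflib import Graph, Namespace, RDF, RDFS

# Hypothetical relational schema: table name -> column names.
schema = {"Road": ["id", "name", "length_km"], "Hospital": ["id", "name", "capacity"]}

EX = Namespace("http://example.org/sdi#")
g = Graph()
g.bind("ex", EX)

for table, columns in schema.items():
    cls = EX[table]
    g.add((cls, RDF.type, RDFS.Class))          # each table becomes an ontology class
    for col in columns:
        prop = EX[f"{table}_{col}"]
        g.add((prop, RDF.type, RDF.Property))   # each column becomes a property
        g.add((prop, RDFS.domain, cls))

print(g.serialize(format="turtle"))
```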

  12. Source apportionments of ambient fine particulate matter in Israeli, Jordanian, and Palestinian cities.

    PubMed

    Heo, Jongbae; Wu, Bo; Abdeen, Ziad; Qasrawi, Radwan; Sarnat, Jeremy A; Sharf, Geula; Shpund, Kobby; Schauer, James J

    2017-06-01

    This manuscript evaluates spatial and temporal variations of source contributions to ambient fine particulate matter (PM2.5) in Israeli, Jordanian, and Palestinian cities. Twenty-four hour integrated PM2.5 samples were collected every six days over a 1-year period (January to December 2007) in four cities in Israel (West Jerusalem, Eilat, Tel Aviv, and Haifa), four cities in Jordan (Amman, Aqaba, Rahma, and Zarka), and three cities in Palestine (Nablus, East Jerusalem, and Hebron). The PM2.5 samples were analyzed for major chemical components, including organic carbon and elemental carbon, ions, and metals, and the results were used in a positive matrix factorization (PMF) model to estimate source contributions to PM2.5 mass. Nine sources, including secondary sulfate, secondary nitrate, mobile, industrial lead sources, dust, construction dust, biomass burning, fuel oil combustion and sea salt, were identified across the sampling sites. Secondary sulfate was the dominant source, contributing 35% of the total PM2.5 mass, and it showed relatively homogeneous temporal trends of daily source contribution in the study area. Mobile sources were found to be the second greatest contributor to PM2.5 mass in the large metropolitan cities, such as Tel Aviv, Hebron, and West and East Jerusalem. Other sources (i.e. industrial lead sources, construction dust, and fuel oil combustion) were closely related to local emissions within individual cities. This study demonstrates how international cooperation can facilitate air pollution studies that address regional air pollution issues and the incremental differences across cities in a common airshed. It also provides a model to study air pollution in regions with limited air quality monitoring capacity that have persistent and emerging air quality problems, such as Africa, South Asia and Central America. Copyright © 2017 Elsevier Ltd. All rights reserved.
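
    PMF itself weights the factorization by per-value measurement uncertainties; as a rough, hedged illustration of the underlying decomposition of a species-by-sample matrix into non-negative source profiles and contributions, plain NMF from scikit-learn can stand in. All data below are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy receptor-model factorization: rows = daily PM2.5 samples, columns = chemical
# species concentrations. PMF additionally weights residuals by per-value
# uncertainties; plain NMF is used here only to illustrate the decomposition.
rng = np.random.default_rng(0)
X = rng.random((120, 15))            # 120 samples x 15 species (synthetic)

model = NMF(n_components=9, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)           # source contributions per sample (120 x 9)
F = model.components_                # source profiles (9 x 15)

# Fractional contribution of each factor to the reconstructed mass
mass_by_factor = (G * F.sum(axis=1)).sum(axis=0)
print(mass_by_factor / mass_by_factor.sum())
```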

  13. The Papillomavirus Episteme: a central resource for papillomavirus sequence data and analysis.

    PubMed

    Van Doorslaer, Koenraad; Tan, Qina; Xirasagar, Sandhya; Bandaru, Sandya; Gopalan, Vivek; Mohamoud, Yasmin; Huyen, Yentram; McBride, Alison A

    2013-01-01

    The goal of the Papillomavirus Episteme (PaVE) is to provide an integrated resource for the analysis of papillomavirus (PV) genome sequences and related information. The PaVE is a freely accessible, web-based tool (http://pave.niaid.nih.gov) created around a relational database, which enables storage, analysis and exchange of sequence information. From a design perspective, the PaVE adopts an Open Source software approach and stresses the integration and reuse of existing tools. Reference PV genome sequences have been extracted from publicly available databases and reannotated using a custom-created tool. To date, the PaVE contains 241 annotated PV genomes, 2245 genes and regions, 2004 protein sequences and 47 protein structures, which users can explore, analyze or download. The PaVE provides scientists with the data and tools needed to accelerate scientific progress for the study and treatment of diseases caused by PVs.

  14. Communicating Microbiology Concepts from Multiple Contexts through Poster Presentations.

    PubMed

    Gruss, Amy Borello

    2018-01-01

    Accredited environmental engineering degrees require graduates to be able to apply their scholarship to concepts of professional practice and design. This transferable skill of relating what you learn in one setting to another situation is vital for all professions, not just engineering. A course project involving designing and presenting a professional poster was implemented to enhance student mastery in Environmental Engineering Microbiology while also developing communication and transferable skills vital for all majors. Students were asked to read a contemporary non-fiction book relating to microbiology and expand upon the book's thesis by integrating course content, news articles, and peer-reviewed journal articles. They then were required to present this information in class using a professional poster. Students felt the project allowed them to synthesize and organize information, analyze ideas, and integrate ideas from various sources. These transferable skills are vital for students and professionals alike to be able to communicate advanced information and master a topic.

  15. Hierarchical semantic structures for medical NLP.

    PubMed

    Taira, Ricky K; Arnold, Corey W

    2013-01-01

    We present a framework for building a medical natural language processing (NLP) system capable of deep understanding of clinical text reports. The framework helps developers understand how various NLP-related efforts and knowledge sources can be integrated. The aspects considered include: 1) computational issues dealing with defining layers of intermediate semantic structures to reduce the dimensionality of the NLP problem; 2) algorithmic issues, in which we survey the NLP literature and discuss state-of-the-art procedures used to map between various levels of the hierarchy; and 3) implementation issues, pointing software developers to available resources. The objective of this poster is to educate readers about the various levels of semantic representation (e.g., word-level concepts, ontological concepts, logical relations, logical frames, discourse structures, etc.). The poster presents an architecture in which diverse efforts and resources in medical NLP can be integrated in a principled way.

  16. Miniaturized radioisotope solid state power sources

    NASA Astrophysics Data System (ADS)

    Fleurial, J.-P.; Snyder, G. J.; Patel, J.; Herman, J. A.; Caillat, T.; Nesmith, B.; Kolawa, E. A.

    2000-01-01

    Electrical power requirements for the next generation of deep space missions cover a wide range from the kilowatt to the milliwatt. Several of these missions call for the development of compact, low weight, long life, rugged power sources capable of delivering a few milliwatts up to a couple of watts while operating in harsh environments. Advanced solid state thermoelectric microdevices combined with radioisotope heat sources and energy storage devices such as capacitors are ideally suited for these applications. By making use of macroscopic film technology, microgenerators operating across relatively small temperature differences can be conceptualized for a variety of high heat flux or low heat flux heat source configurations. Moreover, by shrinking the size of the thermoelements and increasing their number to several thousands in a single structure, these devices can generate high voltages even at low power outputs that are more compatible with electronic components. Because the miniaturization of state-of-the-art thermoelectric module technology based on Bi2Te3 alloys is limited due to mechanical and manufacturing constraints, we are developing novel microdevices using integrated-circuit type fabrication processes, electrochemical deposition techniques and high thermal conductivity substrate materials. One power source concept is based on several thermoelectric microgenerator modules that are tightly integrated with a 1.1 W Radioisotope Heater Unit. Such a system could deliver up to 50 mW of electrical power in a small lightweight package of approximately 50 to 60 g and 30 cm^3. An even higher degree of miniaturization and high specific power values (mW/mm^3) can be obtained when considering the potential use of radioisotope materials for an alpha-voltaic or a hybrid thermoelectric/alpha-voltaic power source. Some of the technical challenges associated with these concepts are discussed in this paper.

  17. BiologicalNetworks 2.0 - an integrative view of genome biology data

    PubMed Central

    2010-01-01

    Background A significant problem in the study of mechanisms of an organism's development is the elucidation of interrelated factors which are making an impact on the different levels of the organism, such as genes, biological molecules, cells, and cell systems. Numerous sources of heterogeneous data which exist for these subsystems are still not integrated sufficiently enough to give researchers a straightforward opportunity to analyze them together in the same frame of study. Systematic application of data integration methods is also hampered by a multitude of such factors as the orthogonal nature of the integrated data and naming problems. Results Here we report on a new version of BiologicalNetworks, a research environment for the integral visualization and analysis of heterogeneous biological data. BiologicalNetworks can be queried for properties of thousands of different types of biological entities (genes/proteins, promoters, COGs, pathways, binding sites, and other) and their relations (interactions, co-expression, co-citations, and other). The system includes the build-pathways infrastructure for molecular interactions/relations and module discovery in high-throughput experiments. Also implemented in BiologicalNetworks are the Integrated Genome Viewer and Comparative Genomics Browser applications, which allow for the search and analysis of gene regulatory regions and their conservation in multiple species in conjunction with molecular pathways/networks, experimental data and functional annotations. Conclusions The new release of BiologicalNetworks together with its back-end database introduces extensive functionality for a more efficient integrated multi-level analysis of microarray, sequence, regulatory, and other data. BiologicalNetworks is freely available at http://www.biologicalnetworks.org. PMID:21190573

  18. Renewable Electricity Futures Study Executive Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mai, Trieu; Sandor, Debra; Wiser, Ryan

    2012-12-01

    The Renewable Electricity Futures Study (RE Futures) provides an analysis of the grid integration opportunities, challenges, and implications of high levels of renewable electricity generation for the U.S. electric system. The study is not a market or policy assessment. Rather, RE Futures examines renewable energy resources and many technical issues related to the operability of the U.S. electricity grid, and provides initial answers to important questions about the integration of high penetrations of renewable electricity technologies from a national perspective. RE Futures results indicate that a future U.S. electricity system that is largely powered by renewable sources is possible and that further work is warranted to investigate this clean generation pathway.

  19. EPOS: Integrating seismological Research Infrastructures within Europe

    NASA Astrophysics Data System (ADS)

    Eck Van, Torild; Clinton, John; Haslinger, Florian; Michelini, Alberto

    2013-04-01

    Seismological data, products and models are currently produced in Europe within individual countries or research organizations, and with the contribution of coordinating organizations like ORFEUS and EMSC. In spite of these partly scattered resources, significant scientific results are obtained, excellent monitoring and information systems are operational and a huge amount of research quality data is being archived and disseminated. The seismological community, however, realizes that an effective European-scale integration of seismological and related geophysical data, products and models, combined with broad and easy access, is needed to facilitate future top level geoscience, for example, to appropriately harness the technological advancements enabling large scale and near-real time data processing. Here we present the technical concepts and developments within European seismology that will build the next generation of integrated services. Within the EPOS initiative and a number of related projects, where seismology infrastructure and IT developments are merging, in depth discussions are on-going on how to realize an effective integration. Concepts and visions addressing the obviously complex challenges resulting from the current highly distributed facilities and resources in Europe are emerging and are already partly being implemented. We will provide an overview of developments within key EU projects (NERA, VERCE, COOPEUS, EUDAT, REAKT, COMMIT, etc) and demonstrate how these are in coherence with EPOS and other on-going global initiatives. Within seismology current focus is on addressing IT related challenges to a) organize distributed data archives, develop metadata attributes for improved data searching, specifically including quality indicators, and define products from data and/or models, and b) define and create (on-line) monitoring, data access and processing tools. While developments to meet those challenges originate partly from within the community itself, it is important to harvest relevant ideas and tools from other scientific communities dealing with similar issues. We will present a short summary of those developments and how they fit within the proposed visions and concepts. These integration developments address a wide framework of seismological services that include: basic seismological data services (waveform data from velocity and acceleration sensors from land and underwater sites); seismological data products (source mechanism and process estimates, earthquake catalogues, structural and tomography model estimations); seismological models (synthetic waveforms, earth and earthquake source models, hazard models). Our aim is to build significantly improved seismological services and valuable products for multidisciplinary earth science research.

  20. Memory for performed and observed activities following traumatic brain injury

    PubMed Central

    Wright, Matthew J.; Wong, Andrew L.; Obermeit, Lisa C.; Woo, Ellen; Schmitter-Edgecombe, Maureen; Fuster, Joaquín M.

    2014-01-01

    Traumatic brain injury (TBI) is associated with deficits in memory for the content of completed activities. However, TBI groups have shown variable memory for the temporal order of activities. We sought to clarify the conditions under which temporal order memory for activities is intact following TBI. Additionally, we evaluated activity source memory and the relationship between activity memory and functional outcome in TBI participants. Thus, we completed a study of activity memory with 18 severe TBI survivors and 18 healthy age- and education-matched comparison participants. Both groups performed eight activities and observed eight activities that were fashioned after routine daily tasks. Incidental encoding conditions for activities were utilized. The activities were drawn from two counterbalanced lists, and both performance and observation were randomly determined and interspersed. After all of the activities were completed, content memory (recall and recognition), source memory (conditional source identification), and temporal order memory (correlation between order reconstruction and actual order) for the activities were assessed. Functional ability was assessed via the Community Integration Questionnaire (CIQ). In terms of content memory, TBI participants recalled and recognized fewer activities than comparison participants. Recognition of performed and observed activities was strongly associated with social integration on the CIQ. There were no between- or within-group differences in temporal order or source memory, although source memory performances were near ceiling. The findings were interpreted as suggesting that temporal order memory following TBI is intact under conditions of both purposeful activity completion and incidental encoding, and that activity memory is related to functional outcomes following TBI. PMID:24524393

  1. The effect of toxic carbon source on the reaction of activated sludge in the batch reactor.

    PubMed

    Wu, Changyong; Zhou, Yuexi; Zhang, Siyu; Xu, Min; Song, Jiamei

    2018-03-01

    A toxic carbon source can cause higher residual effluent dissolved organic carbon than an easily biodegraded carbon source in the activated sludge process. In this study, an integrated activated sludge model is developed as a tool to understand the mechanism by which a toxic carbon source (phenol) affects the reaction, in terms of the carbon flows during the aeration period in the batch reactor. To represent the toxic effect of phenol, a microbial cell death rate (k_death) is introduced into the model. The integrated model was calibrated and validated against the experimental data, and the model simulations matched all the experimental measurements. In the steady state, the toxicity of phenol resulted in a higher microbial cell death rate (0.1637 h^-1 vs 0.0028 h^-1) and biomass decay rate coefficient (0.0115 h^-1 vs 0.0107 h^-1) than acetate. In addition, the utilization-associated products (UAP) and extracellular polymeric substances (EPS) formation coefficients of phenol are higher than those of acetate, indicating that more carbon flows into extracellular components, such as soluble microbial products (SMP), when degrading toxic organics. In the non-steady state when feeding phenol, the yield coefficient for growth and the maximum specific growth rate are very low in the first few days (1-10 d), while the biomass decay rate coefficient and microbial cell death rate are relatively high. The model provides insights into the differences in the dynamic reaction with different carbon sources in the batch reactor. Copyright © 2017 Elsevier Ltd. All rights reserved.
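
    A hedged sketch of the kind of kinetics such an integrated model couples: Monod growth on the substrate, endogenous decay, and an explicit cell-death term k_death for the toxic carbon source. The parameter values below are placeholders, not the calibrated values reported in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Simplified substrate/biomass kinetics with an explicit cell-death term k_death.
# All parameter values are illustrative, not the calibrated values from the study.
mu_max, K_s, Y = 0.3, 50.0, 0.4      # 1/h, mg/L, mg biomass per mg substrate
b_decay, k_death = 0.011, 0.16       # 1/h (endogenous decay and toxicity-driven death)

def rhs(t, y):
    S, X = y                          # substrate and active biomass, mg/L
    mu = mu_max * S / (K_s + S)       # Monod growth rate
    dS = -(mu / Y) * X                # substrate consumed for growth
    dX = (mu - b_decay - k_death) * X # growth minus decay minus toxic death
    return [dS, dX]

sol = solve_ivp(rhs, (0.0, 24.0), [300.0, 1500.0], dense_output=True)
print(sol.y[:, -1])                   # substrate and biomass after 24 h of aeration
```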

  2. Integrated Information Increases with Fitness in the Evolution of Animats

    PubMed Central

    Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph

    2011-01-01

    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639

  3. Plan–Provider Integration, Premiums, and Quality in the Medicare Advantage Market

    PubMed Central

    Frakt, Austin B; Pizer, Steven D; Feldman, Roger

    2013-01-01

    Objective. To investigate how integration between Medicare Advantage plans and health care providers is related to plan premiums and quality ratings. Data Source. We used public data from the Centers for Medicare and Medicaid Services (CMS) and the Area Resource File and private data from one large insurer. Premiums and quality ratings are from 2009 CMS administrative files and some control variables are historical. Study Design. We estimated ordinary least-squares models for premiums and plan quality ratings, with state fixed effects and firm random effects. The key independent variable was an indicator of plan–provider integration. Data Collection. With the exception of Medigap premium data, all data were publicly available. We ascertained plan–provider integration through examination of plans’ websites and governance documents. Principal Findings. We found that integrated plan–providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We found no evidence that integration is associated with more generous benefits. Conclusions. Current policy encourages plan–provider integration, although potential effects on health insurance products and markets are uncertain. Policy makers and regulators may want to closely monitor changes in premiums and quality after integration and consider whether quality improvement (if any) justifies premium increases (if they occur). PMID:23800017
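
    One plausible way to set up a regression with state fixed effects and firm random effects, as described in the Study Design, is a mixed model in statsmodels; the variable names and synthetic data below are illustrative, not the CMS data used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic plan-level data; column names are hypothetical stand-ins.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "integrated": rng.integers(0, 2, n),                    # plan-provider integration flag
    "quality":    rng.normal(3.5, 0.5, n),                  # star rating (synthetic)
    "state":      rng.choice(["MA", "TX", "CA", "FL"], n),
    "firm":       rng.choice([f"firm{i}" for i in range(8)], n),
})
firm_effect = df["firm"].map({f"firm{i}": rng.normal(0, 5) for i in range(8)})
df["premium"] = 40 + 8 * df["integrated"] + 3 * df["quality"] + firm_effect + rng.normal(0, 2, n)

# State fixed effects enter as dummy variables; firm is a random intercept.
model = smf.mixedlm("premium ~ integrated + quality + C(state)", data=df, groups=df["firm"])
print(model.fit().summary())
```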

  4. An Assessment of Behavioral Dynamic Information Processing Measures in Audiovisual Speech Perception

    PubMed Central

    Altieri, Nicholas; Townsend, James T.

    2011-01-01

    Research has shown that visual speech perception can assist accuracy in identification of spoken words. However, little is known about the dynamics of the processing mechanisms involved in audiovisual integration. In particular, architecture and capacity, measured using response time methodologies, have not been investigated. An issue related to architecture concerns whether the auditory and visual sources of the speech signal are integrated “early” or “late.” We propose that “early” integration most naturally corresponds to coactive processing whereas “late” integration corresponds to separate decisions parallel processing. We implemented the double factorial paradigm in two studies. First, we carried out a pilot study using a two-alternative forced-choice discrimination task to assess architecture, decision rule, and provide a preliminary assessment of capacity (integration efficiency). Next, Experiment 1 was designed to specifically assess audiovisual integration efficiency in an ecologically valid way by including lower auditory S/N ratios and a larger response set size. Results from the pilot study support a separate decisions parallel, late integration model. Results from both studies showed that capacity was severely limited for high auditory signal-to-noise ratios. However, Experiment 1 demonstrated that capacity improved as the auditory signal became more degraded. This evidence strongly suggests that integration efficiency is vitally affected by the S/N ratio. PMID:21980314

  5. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  6. PopHR: a knowledge-based platform to support integration, analysis, and visualization of population health data.

    PubMed

    Shaban-Nejad, Arash; Lavigne, Maxime; Okhmatovskaia, Anya; Buckeridge, David L

    2017-01-01

    Population health decision makers must consider complex relationships between multiple concepts measured with differential accuracy from heterogeneous data sources. Population health information systems are currently limited in their ability to integrate data and present a coherent portrait of population health. Consequentially, these systems can provide only basic support for decision makers. The Population Health Record (PopHR) is a semantic web application that automates the integration and extraction of massive amounts of heterogeneous data from multiple distributed sources (e.g., administrative data, clinical records, and survey responses) to support the measurement and monitoring of population health and health system performance for a defined population. The design of the PopHR draws on the theories of the determinants of health and evidence-based public health to harmonize and explicitly link information about a population with evidence about the epidemiology and control of chronic diseases. Organizing information in this manner and linking it explicitly to evidence is expected to improve decision making related to the planning, implementation, and evaluation of population health and health system interventions. In this paper, we describe the PopHR platform and discuss the architecture, design, key modules, and its implementation and use. © 2016 New York Academy of Sciences.

  7. Development of flood routing simulation system of digital Qingjiang based on integrated spatial information technology

    NASA Astrophysics Data System (ADS)

    Yuan, Yanbin; Zhou, You; Zhu, Yaqiong; Yuan, Xiaohui; Sælthun, N. R.

    2007-11-01

    The development of a flood routing simulation system based on digital technology is an important component of the "digital catchment". Taking the Qingjiang catchment as a pilot case, and building on an in-depth analysis of the informatization of Qingjiang catchment management, the study addresses the multi-source, multi-dimension, multi-element, multi-subject, multi-layer and multi-class nature of catchment data by applying the design concept of a "subject-point-source database" (SPSD) to the system architecture, in order to achieve unified management of large volumes of catchment data. Drawing on integrated spatial information technology, a hierarchical development model of the digital catchment is established; this model provides the general framework for the analysis, design and realization of the flood routing simulation system. To satisfy the demands of three-dimensional flood routing simulation, an object-oriented spatial data model is designed. The space-time adaptive relation between flood routing and catchment topography is analysed, the terrain grid data are expressed as an undirected graph, and a breadth-first search algorithm is applied to dynamically trace the stream channel on the simulated three-dimensional terrain. A system prototype was implemented, and simulation results demonstrate that the proposed approach is feasible and effective in this application.
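
    The channel-tracing step, expressing the terrain grid as an undirected graph and applying breadth-first search, can be illustrated with a small sketch; the toy DEM, outlet location and stage threshold below are made up, and the real system's rules for expanding the search are certainly richer.

```python
from collections import deque

# Toy DEM (elevations in metres); the real system works on 3-D terrain data.
dem = [
    [12, 11, 10,  9, 12],
    [12,  8,  7,  8, 12],
    [12,  9,  6,  9, 12],
    [12, 10,  5, 10, 12],
    [12, 11,  4, 11, 12],
]

def channel_cells(dem, outlet, stage):
    """BFS over the grid graph: collect cells connected to the outlet whose
    elevation is below the current water stage (a crude channel/flood mask)."""
    rows, cols = len(dem), len(dem[0])
    seen, queue = {outlet}, deque([outlet])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen \
                    and dem[nr][nc] <= stage:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

print(sorted(channel_cells(dem, outlet=(4, 2), stage=9)))
```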

  8. Geomorphic evidence of Quaternary tectonics within an underlap fault zone of southern Apennines, Italy

    NASA Astrophysics Data System (ADS)

    Giano, Salvatore Ivo; Pescatore, Eva; Agosta, Fabrizio; Prosser, Giacomo

    2018-02-01

    A composite seismic source, the Irpinia - Agri Valley Fault zone, located in the axial sector of the fold-and-thrust belt of southern Apennines, Italy, is investigated. This composite source is made up of a series of nearly parallel, NW-striking normal fault segments which caused many historical earthquakes. Two of these fault segments, known as the San Gregorio Magno and Pergola-Melandro, and the fault-related mountain fronts, form a wedge-shaped, right-stepping, underlap fault zone. This work is aimed at documenting tectonic geomorphology and geology of this underlap fault zone. The goal is to decipher the evidence of surface topographic interaction between two bounding fault segments and their related mountain fronts. In particular, computation of geomorphic indices such as mountain front sinuosity (Smf), water divide sinuosity (Swd), asymmetry factor (AF), drainage basin elongation (Bs), relief ratio (Rh), Hypsometry (HI), normalized steepness (Ksn), and concavity (θ) is integrated with geomorphological analysis, the geological mapping, and structural analysis in order to assess the recent activity of the fault scarp sets recognized within the underlap zone. Results are consistent with the NW-striking faults as those showing the most recent tectonic activity, as also suggested by presence of related slope deposits younger than 38 ka. The results of this work therefore show how the integration of a multidisciplinary approach that combines geomorphology, morphometry, and structural analyses may be key to solving tectonic geomorphology issues in a complex, fold-and-thrust belt configuration.

  9. Adaptable Information Models in the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Buddenberg, A.; Aulenbach, S.; Wolfe, R.; Goldstein, J.

    2014-12-01

    The US Global Change Research Program has sponsored the creation of the Global Change Information System (GCIS) to provide a web-based source of accessible, usable, and timely information about climate and global change for use by scientists, decision makers, and the public. The GCIS played multiple roles during the assembly and release of the Third National Climate Assessment. It provided human and programmable interfaces, relational and semantic representations of information, and discrete identifiers for various types of resources, which could then be manipulated by a distributed team with a wide range of specialties. The GCIS also served as a scalable backend for the web-based version of the report. In this talk, we discuss the infrastructure decisions made during the design and deployment of the GCIS, as well as ongoing work to adapt to new types of information. Both a constrained relational database and an open-ended triple store are used to ensure data integrity while maintaining fluidity. Using natural primary keys allows identifiers to propagate through both models. Changing identifiers are accommodated through fine-grained auditing and explicit mappings to external lexicons. A practical RESTful API is used whose endpoints are also URIs in an ontology. Both the relational schema and the ontology are malleable, and stability is ensured through test-driven development and continuous integration testing using modern open source techniques. Content is also validated through continuous testing techniques. A high degree of scalability is achieved through caching.

  10. Aspiring to Spectral Ignorance in Earth Observation

    NASA Astrophysics Data System (ADS)

    Oliver, S. A.

    2016-12-01

    Enabling robust, defensible and integrated decision making in the Era of Big Earth Data requires the fusion of data from multiple and diverse sensor platforms and networks. While the application of standardised global grid systems provides a common spatial analytics framework that facilitates the computationally efficient and statistically valid integration and analysis of these various data sources across multiple scales, there remains the challenge of sensor equivalency, particularly when combining data from different earth observation satellite sensors (e.g. combining Landsat and Sentinel-2 observations). To realise the vision of a sensor-ignorant analytics platform for earth observation we require automation of spectral matching across the available sensors. Ultimately, the aim is to remove the requirement for the user to possess any sensor knowledge in order to undertake analysis. This paper introduces the concept of spectral equivalence and proposes a methodology through which equivalent bands may be sourced from a set of potential target sensors through application of equivalence metrics and thresholds. A number of parameters can be used to determine whether a pair of spectra are equivalent for the purposes of analysis. A baseline set of thresholds for these parameters is proposed, together with a systematic way of applying them to relate spectral bands amongst numerous different sensors. The base unit for comparison in this work is the relative spectral response. From this input, what constitutes equivalence can be specified by a user, based on their own conceptualisation of equivalence.
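
    A hedged sketch of the proposed equivalence idea: compare two bands' relative spectral responses on a common wavelength grid using simple metrics (centre-wavelength shift and curve overlap) against thresholds. The metrics and threshold values below are illustrative; the paper's actual parameter set is not reproduced here.

```python
import numpy as np

def band_equivalence(wl, rsr_a, rsr_b, max_centre_shift=10.0, min_overlap=0.85):
    """Crude equivalence test for two bands sampled on a common, uniform
    wavelength grid: compare effective centre wavelengths and the overlap of the
    area-normalized relative spectral responses. Thresholds are illustrative."""
    dwl = wl[1] - wl[0]
    a = rsr_a / (rsr_a.sum() * dwl)                 # area-normalize both RSRs
    b = rsr_b / (rsr_b.sum() * dwl)
    centre_a = (wl * a).sum() * dwl                 # effective centre wavelengths
    centre_b = (wl * b).sum() * dwl
    overlap = np.minimum(a, b).sum() * dwl          # 1.0 for identical shapes
    shift = abs(centre_a - centre_b)
    return (shift <= max_centre_shift and overlap >= min_overlap), shift, overlap

# Two synthetic Gaussian-like red bands on a 1 nm grid (not real sensor RSRs).
wl = np.arange(600.0, 701.0)
rsr_1 = np.exp(-0.5 * ((wl - 650.0) / 15.0) ** 2)
rsr_2 = np.exp(-0.5 * ((wl - 655.0) / 17.0) ** 2)
print(band_equivalence(wl, rsr_1, rsr_2))
```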

  11. Determining the sources of suspended sediment in a Mediterranean groundwater-dominated river: the Na Borges basin (Mallorca, Spain).

    NASA Astrophysics Data System (ADS)

    Estrany, Joan; Martinez-Carreras, Nuria

    2013-04-01

    Tracers have been acknowledged as a useful tool to identify sediment sources, based upon a variety of techniques and chemical and physical sediment properties. Sediment fingerprinting supports the notion that changes in sedimentation rates are not just related to increased/reduced erosion and transport in the same areas, but also to the establishment of different pathways increasing sediment connectivity. The Na Borges is a Mediterranean lowland agricultural river basin (319 km^2) where traditional soil and water conservation practices have been applied over millennia to provide effective protection of cultivated land. During the twentieth century, industrialisation and pressure from tourism activities have increased urbanised surfaces, which have impacts on the processes that control streamflow. Within this context, source material sampling in Na Borges was focused on obtaining representative samples from potential sediment sources (comprising topsoil, i.e., 0-2 cm) susceptible to mobilisation by water and subsequent routing to the river channel network, while those representing channel bank sources were collected from actively eroding channel margins and ditches. Samples of road dust and of solids from sewage treatment plants were also collected. During two hydrological years (2004-2006), representative suspended sediment samples for use in source fingerprinting studies were collected at four flow gauging stations and at eight secondary sampling points using time-integrating samplers. Likewise, representative bed-channel sediment samples were obtained using the resuspension approach at eight sampling points in the main stem of the Na Borges River. These deposits represent the fine sediment temporarily stored in the bed-channel and were also used for tracing source contributions. A total of 102 individual time-integrated sediment samples, 40 bulk samples and 48 bed-sediment samples were collected. Upon return to the laboratory, source material samples were oven-dried at 40 °C, disaggregated using a pestle and mortar, and dry sieved to

  12. Assessing and improving health in the workplace: an integration of subjective and objective measures with the STress Assessment and Research Toolkit (St.A.R.T.) method.

    PubMed

    Panari, Chiara; Guglielmi, Dina; Ricci, Aurora; Tabanelli, Maria Carla; Violante, Francesco Saverio

    2012-09-20

    The aim of this work was to introduce a new combined method of subjective and objective measures to assess psychosocial risk factors at work and improve workers' health and well-being. In the literature most of the research on work-related stress focuses on self-report measures and this work represents the first methodology capable of integrating different sources of data. An integrated method entitled St.A.R.T. (STress Assessment and Research Toolkit) was used in order to assess psychosocial risk factors and two health outcomes. In particular, a self-report questionnaire combined with an observational structured checklist was administered to 113 workers from an Italian retail company. The data showed a correlation between subjective data and the rating data of the observational checklist for the psychosocial risk factors related to work contexts such as customer relationship management and customer queue. Conversely, the factors related to work content (workload and boredom) measured with different methods (subjective vs. objective) showed a discrepancy. Furthermore, subjective measures of psychosocial risk factors were more predictive of workers' psychological health and exhaustion than rating data. The different objective measures played different roles, however, in terms of their influence on the two health outcomes considered. It is important to integrate self-related assessment of stressors with objective measures for a better understanding of workers' conditions in the workplace. The method presented could be considered a useful methodology for combining the two measures and differentiating the impact of different psychological risk factors related to work content and context on workers' health.

  13. Assessing and improving health in the workplace: an integration of subjective and objective measures with the STress Assessment and Research Toolkit (St.A.R.T.) method

    PubMed Central

    2012-01-01

    Background The aim of this work was to introduce a new combined method of subjective and objective measures to assess psychosocial risk factors at work and improve workers’ health and well-being. In the literature most of the research on work-related stress focuses on self-report measures and this work represents the first methodology capable of integrating different sources of data. Method An integrated method entitled St.A.R.T. (STress Assessment and Research Toolkit) was used in order to assess psychosocial risk factors and two health outcomes. In particular, a self-report questionnaire combined with an observational structured checklist was administered to 113 workers from an Italian retail company. Results The data showed a correlation between subjective data and the rating data of the observational checklist for the psychosocial risk factors related to work contexts such as customer relationship management and customer queue. Conversely, the factors related to work content (workload and boredom) measured with different methods (subjective vs. objective) showed a discrepancy. Furthermore, subjective measures of psychosocial risk factors were more predictive of workers’ psychological health and exhaustion than rating data. The different objective measures played different roles, however, in terms of their influence on the two health outcomes considered. Conclusions It is important to integrate self-related assessment of stressors with objective measures for a better understanding of workers’ conditions in the workplace. The method presented could be considered a useful methodology for combining the two measures and differentiating the impact of different psychological risk factors related to work content and context on workers’ health. PMID:22995286

  14. Querying clinical data in HL7 RIM based relational model with morph-RDB.

    PubMed

    Priyatna, Freddy; Alonso-Calvo, Raul; Paraiso-Medina, Sergio; Corcho, Oscar

    2017-10-05

    Semantic interoperability is essential when carrying out post-genomic clinical trials where several institutions collaborate, since researchers and developers need to have an integrated view and access to heterogeneous data sources. One possible approach to accommodate this need is to use RDB2RDF systems that provide RDF datasets as the unified view. These RDF datasets may be materialized and stored in a triple store, or transformed into RDF in real time, as virtual RDF data sources. Our previous efforts involved materialized RDF datasets, hence losing data freshness. In this paper we present a solution that uses an ontology based on the HL7 v3 Reference Information Model and a set of R2RML mappings that relate this ontology to an underlying relational database implementation, and where morph-RDB is used to expose a virtual, non-materialized SPARQL endpoint over the data. By applying a set of optimization techniques on the SPARQL-to-SQL query translation algorithm, we can now issue SPARQL queries to the underlying relational data with generally acceptable performance.
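
    Client-side use of such a virtual SPARQL endpoint could look like the sketch below, written with SPARQLWrapper; the endpoint URL and the rim: vocabulary terms are placeholders rather than the project's actual IRIs.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint and vocabulary IRIs; the real deployment exposes an
# HL7 v3 RIM-based ontology through R2RML mappings handled by morph-RDB.
sparql = SPARQLWrapper("http://example.org/morph-rdb/sparql")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX rim: <http://example.org/hl7-rim#>
    SELECT ?patient ?observation ?value
    WHERE {
        ?observation a rim:Observation ;
                     rim:recordTarget ?patient ;
                     rim:value ?value .
    }
    LIMIT 10
""")

# Each solution binding maps variable names to RDF terms.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["patient"]["value"], row["value"]["value"])
```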

  15. Habitat Hydrology and Geomorphology Control the Distribution of Malaria Vector Larvae in Rural Africa

    PubMed Central

    Hardy, Andrew J.; Gamarra, Javier G. P.; Cross, Dónall E.; Macklin, Mark G.; Smith, Mark W.; Kihonda, Japhet; Killeen, Gerry F.; Ling’ala, George N.; Thomas, Chris J.

    2013-01-01

    Background Larval source management is a promising component of integrated malaria control and elimination. This requires development of a framework to target productive locations through process-based understanding of habitat hydrology and geomorphology. Methods We conducted the first catchment scale study of fine resolution spatial and temporal variation in Anopheles habitat and productivity in relation to rainfall, hydrology and geomorphology for a high malaria transmission area of Tanzania. Results Monthly aggregates of rainfall, river stage and water table were not significantly related to the abundance of vector larvae. However, these metrics showed strong explanatory power to predict mosquito larval abundances after stratification by water body type, with a clear seasonal trend for each, defined on the basis of its geomorphological setting and origin. Conclusion Hydrological and geomorphological processes governing the availability and productivity of Anopheles breeding habitat need to be understood at the local scale for which larval source management is implemented in order to effectively target larval source interventions. Mapping and monitoring these processes is a well-established practice providing a tractable way forward for developing important malaria management tools. PMID:24312606

  16. Habitat hydrology and geomorphology control the distribution of malaria vector larvae in rural Africa.

    PubMed

    Hardy, Andrew J; Gamarra, Javier G P; Cross, Dónall E; Macklin, Mark G; Smith, Mark W; Kihonda, Japhet; Killeen, Gerry F; Ling'ala, George N; Thomas, Chris J

    2013-01-01

    Larval source management is a promising component of integrated malaria control and elimination. This requires development of a framework to target productive locations through process-based understanding of habitat hydrology and geomorphology. We conducted the first catchment scale study of fine resolution spatial and temporal variation in Anopheles habitat and productivity in relation to rainfall, hydrology and geomorphology for a high malaria transmission area of Tanzania. Monthly aggregates of rainfall, river stage and water table were not significantly related to the abundance of vector larvae. However, these metrics showed strong explanatory power to predict mosquito larval abundances after stratification by water body type, with a clear seasonal trend for each, defined on the basis of its geomorphological setting and origin. Hydrological and geomorphological processes governing the availability and productivity of Anopheles breeding habitat need to be understood at the local scale for which larval source management is implemented in order to effectively target larval source interventions. Mapping and monitoring these processes is a well-established practice providing a tractable way forward for developing important malaria management tools.

  17. Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation

    NASA Astrophysics Data System (ADS)

    Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.

    2017-12-01

    The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy and installation & operational costs. Having the right hardware design at the edge of this infrastructure, embedded software running inside the instruments is the heart of pre-processing and communication services implementation and their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources in order to realize a smart seismic data logger, achieving higher reliability, faster integration and less development and testing costs of critical functionality that is in turn responsible for the cost and power efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling planes, besides the widely used de-facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower level GPL components of the seiscomp suite with higher level processing earthworm components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network controlled, event triggered data transmission in parallel with edge archiving and on demand, short term historical data retrieval.
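
    As a hedged illustration of the IoT signaling plane mentioned above, the sketch below publishes an event-trigger notification over MQTT with paho-mqtt; the broker host, topic layout and payload fields are hypothetical, not the instrument's actual interface.

```python
import json
import time
import paho.mqtt.publish as publish

# Hypothetical trigger notification; in the instrument, this signaling plane
# runs alongside the SeedLink data plane.
event = {
    "station": "ST01",
    "trigger_time": time.time(),
    "ratio": 4.7,                      # e.g. an STA/LTA trigger ratio (illustrative)
    "channels": ["HHZ", "HHN", "HHE"],
}

# Notify the acquisition centre that event-triggered waveform segments are ready.
# Broker host and topic are placeholders.
publish.single("network/ST01/triggers", json.dumps(event),
               hostname="broker.example.org", port=1883, qos=1)
```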

  18. A fast algorithm for forward-modeling of gravitational fields in spherical coordinates with 3D Gauss-Legendre quadrature

    NASA Astrophysics Data System (ADS)

    Zhao, G.; Liu, J.; Chen, B.; Guo, R.; Chen, L.

    2017-12-01

    Forward modeling of gravitational fields at large scale requires considering the curvature of the Earth and evaluating Newton's volume integral in spherical coordinates. To acquire fast and accurate gravitational effects for subsurface structures, the subsurface mass distribution is usually discretized into small spherical prisms (called tesseroids). The gravity fields of tesseroids are generally calculated numerically. One of the commonly used numerical methods is 3D Gauss-Legendre quadrature (GLQ). However, traditional GLQ integration suffers from low computational efficiency and relatively poor accuracy when the observation surface is close to the source region. We developed a fast and high-accuracy 3D GLQ integration based on the equivalence of kernel matrices, adaptive discretization and parallelization using OpenMP. The kernel-matrix equivalence strategy increases efficiency and reduces memory consumption by calculating and storing identical matrix elements in each kernel matrix only once. In this method, the adaptive discretization strategy is used to improve the accuracy. The numerical investigations show that the execution time of the proposed method is reduced by two orders of magnitude compared with the traditional method without these optimization strategies. High-accuracy results can also be guaranteed no matter how close the computation points are to the source region. In addition, the algorithm dramatically reduces the memory requirement by a factor of N compared with the traditional method, where N is the number of discretizations of the source region in the longitudinal direction. This makes large-scale gravity forward modeling and inversion with a fine discretization possible.
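
    A minimal sketch of plain 3D GLQ for the Newtonian potential of a single tesseroid, without the kernel-equivalence, adaptive-discretization or OpenMP optimizations the abstract describes; the geometry and density values are illustrative.

```python
import numpy as np

G = 6.674e-11                                     # gravitational constant, m^3 kg^-1 s^-2

def tesseroid_potential(point, bounds, density, n=8):
    """Newtonian potential of one tesseroid via 3-D Gauss-Legendre quadrature.
    point  = (r, lat, lon) of the computation point (m, rad, rad)
    bounds = (r1, r2, lat1, lat2, lon1, lon2) of the tesseroid."""
    r, lat, lon = point
    r1, r2, lat1, lat2, lon1, lon2 = bounds
    x, w = np.polynomial.legendre.leggauss(n)     # nodes/weights on [-1, 1]

    def scale(a, b):                              # map nodes/weights to [a, b]
        return 0.5 * (b - a) * x + 0.5 * (b + a), 0.5 * (b - a) * w

    rr, wr = scale(r1, r2)
    pp, wp = scale(lat1, lat2)
    ll, wl = scale(lon1, lon2)

    V = 0.0
    for ri, wi in zip(rr, wr):
        for pj, wj in zip(pp, wp):
            for lk, wk in zip(ll, wl):
                cos_psi = (np.sin(lat) * np.sin(pj)
                           + np.cos(lat) * np.cos(pj) * np.cos(lon - lk))
                dist = np.sqrt(r**2 + ri**2 - 2.0 * r * ri * cos_psi)
                V += wi * wj * wk * ri**2 * np.cos(pj) / dist   # r'^2 cos(phi') Jacobian
    return G * density * V

# 1 deg x 1 deg x 10 km crustal tesseroid, observed 10 km above its top surface.
d2r = np.pi / 180.0
bounds = (6361e3, 6371e3, 30 * d2r, 31 * d2r, 45 * d2r, 46 * d2r)
print(tesseroid_potential((6381e3, 30.5 * d2r, 45.5 * d2r), bounds, density=2670.0))
```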

  19. Time encoded radiation imaging

    DOEpatents

    Marleau, Peter; Brubaker, Erik; Kiff, Scott

    2014-10-21

    The various technologies presented herein relate to detecting nuclear material at a large stand-off distance. An imaging system is presented which can detect nuclear material by utilizing time encoded imaging relating to maximum and minimum radiation particle count rates. The imaging system is integrated with a data acquisition system that can utilize variations in photon pulse shape to discriminate between neutron and gamma-ray interactions. Modulation in the detected neutron count rates as a function of the angular orientation of the detector due to attenuation of neighboring detectors is utilized to reconstruct the neutron source distribution over 360 degrees around the imaging system. Neutrons (e.g., fast neutrons) and/or gamma-rays are incident upon scintillation material in the imager; the photons generated by the scintillation material are converted to electrical energy, from which the respective neutrons/gamma rays can be determined and, accordingly, a direction to, and the location of, a radiation source identified.
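
    The pulse-shape discrimination mentioned above is commonly implemented as a charge-comparison (tail-to-total) ratio; the sketch below applies that generic method to a made-up digitized pulse, and the discrimination threshold is illustrative rather than calibrated.

```python
import numpy as np

def tail_to_total(pulse, baseline_samples=10, tail_start=25):
    """Charge-comparison pulse-shape discrimination: neutron (proton-recoil)
    pulses in organic scintillators carry a larger fraction of their charge in
    the slow tail than gamma-ray (electron-recoil) pulses."""
    p = pulse - pulse[:baseline_samples].mean()     # baseline subtraction
    return p[tail_start:].sum() / p.sum()

# Made-up digitized waveform (samples); the threshold value is illustrative only.
t = np.arange(100)
pulse = 100.0 * np.exp(-t / 7.0) + 12.0 * np.exp(-t / 60.0)
psd = tail_to_total(np.concatenate([np.zeros(10), pulse]))
print("neutron-like" if psd > 0.12 else "gamma-like", round(psd, 3))
```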

  20. Measurement of Quantum Interference in a Silicon Ring Resonator Photon Source.

    PubMed

    Steidle, Jeffrey A; Fanto, Michael L; Preble, Stefan F; Tison, Christopher C; Howland, Gregory A; Wang, Zihao; Alsing, Paul M

    2017-04-04

    Silicon photonic chips have the potential to realize complex integrated quantum information processing circuits, including photon sources, qubit manipulation, and integrated single-photon detectors. Here, we present the key aspects of preparing and testing a silicon photonic quantum chip with an integrated photon source and two-photon interferometer. The most important aspect of an integrated quantum circuit is minimizing loss so that all of the generated photons are detected with the highest possible fidelity. Here, we describe how to perform low-loss edge coupling by using an ultra-high numerical aperture fiber to closely match the mode of the silicon waveguides. By using an optimized fusion splicing recipe, the UHNA fiber is seamlessly interfaced with a standard single-mode fiber. This low-loss coupling allows the measurement of high-fidelity photon production in an integrated silicon ring resonator and the subsequent two-photon interference of the produced photons in a closely integrated Mach-Zehnder interferometer. This paper describes the essential procedures for the preparation and characterization of high-performance and scalable silicon quantum photonic circuits.

  1. SIMULATION STUDY FOR GASEOUS FLUXES FROM AN AREA SOURCE USING COMPUTED TOMOGRAPHY AND OPTICAL REMOTE SENSING

    EPA Science Inventory

    The paper presents a new approach to quantifying emissions from fugitive gaseous air pollution sources. Computed tomography (CT) and path-integrated optical remote sensing (PI-ORS) concentration data are combined in a new field beam geometry. Path-integrated concentrations are ...

  2. FIELD EVALUATION OF A METHOD FOR ESTIMATING GASEOUS FLUXES FROM AREA SOURCES USING OPEN-PATH FTIR

    EPA Science Inventory


    The paper gives preliminary results from a field evaluation of a new approach for quantifying gaseous fugitive emissions of area air pollution sources. The approach combines path-integrated concentration data acquired with any path-integrated optical remote sensing (PI-ORS) ...

  3. FIELD EVALUATION OF A METHOD FOR ESTIMATING GASEOUS FLUXES FROM AREA SOURCES USING OPEN-PATH FOURIER TRANSFORM INFRARED

    EPA Science Inventory

    The paper describes preliminary results from a field experiment designed to evaluate a new approach to quantifying gaseous fugitive emissions from area air pollution sources. The new approach combines path-integrated concentration data acquired with any path-integrated optical re...

  4. ToxPi GUI: An interactive visualization tool for transparent integration of data from diverse sources of evidence

    EPA Science Inventory

    Motivation: Scientists and regulators are often faced with complex decisions, where use of scarce resources must be prioritized using collections of diverse information. The Toxicological Prioritization Index (ToxPi™) was developed to enable integration of multiple sources of evi...

  5. 76 FR 44865 - Domestic Licensing of Source Material-Amendments/Integrated Safety Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... NUCLEAR REGULATORY COMMISSION 10 CFR Part 40 RIN 3150-AI50 [NRC-2009-0079 and NRC-2011-0080] Domestic Licensing of Source Material--Amendments/Integrated Safety Analysis AGENCY: Nuclear Regulatory Commission. ACTION: Extension of public comment period and public meeting. SUMMARY: The U.S. Nuclear...

  6. Global Solar Magnetology and Reference Points of the Solar Cycle

    NASA Astrophysics Data System (ADS)

    Obridko, V. N.; Shelting, B. D.

    2003-11-01

    The solar cycle can be described as a complex interaction of large-scale/global and local magnetic fields. In general, this approach agrees with the traditional dynamo scheme, although there are numerous discrepancies in the details. Integrated magnetic indices introduced earlier are studied over long time intervals, and the epochs of the main reference points of the solar cycles are refined. A hypothesis proposed earlier concerning global magnetometry and the natural scale of the cycles is verified. Variations of the heliospheric magnetic field are determined by both the integrated photospheric i(B_r)_ph and source-surface i(B_r)_ss indices; however, their roles are different. Local fields contribute significantly to the photospheric index determining the total increase in the heliospheric magnetic field. The i(B_r)_ss index (especially the partial index ZO, which is related to the quasi-dipolar field) determines narrow extrema. These integrated indices supply us with a “passport” for reference points, making it possible to identify them precisely. A prominent dip in the integrated indices is clearly visible at the cycle maximum, resulting in the typical double-peak form (the Gnevyshev dip), with the succeeding maximum always being higher than the preceding maximum. At the source surface, this secondary maximum significantly exceeds the primary maximum. Using these index data, we can estimate the progression expected for the 23rd cycle and predict the dates of the ends of the 23rd and 24th cycles (the middle of 2007 and December 2018, respectively).

  7. Human umbilical cord mesenchymal stromal cells in a sandwich approach for osteochondral tissue engineering

    PubMed Central

    Wang, Limin; Zhao, Liang; Detamore, Michael S.

    2013-01-01

    Cell sources and tissue integration between cartilage and bone regions are critical to successful osteochondral regeneration. In this study, human umbilical cord mesenchymal stromal cells (hUCMSCs), derived from Wharton’s jelly, were introduced to the field of osteochondral tissue engineering and a new strategy for osteochondral integration was developed by sandwiching a layer of cells between chondrogenic and osteogenic constructs before suturing them together. Specifically, hUCMSCs were cultured in biodegradable poly-l-lactic acid scaffolds for 3 weeks in either chondrogenic or osteogenic medium to differentiate cells toward cartilage or bone lineages, respectively. A highly concentrated cell solution containing undifferentiated hUCMSCs was pasted onto the surface of the bone layer at week 3 and the two layers were then sutured together to form an osteochondral composite for another 3 week culture period. Chondrogenic and osteogenic differentiation was initiated during the first 3 weeks, as evidenced by the expression of type II collagen and runt-related transcription factor 2 genes, respectively, and continued with the increase of extracellular matrix during the last 3 weeks. Histological and immunohistochemical staining, such as for glycosaminoglycans, type I collagen and calcium, revealed better integration and transition of these matrices between two layers in the composite group containing sandwiched cells compared to other control composites. These results suggest that hUCMSCs may be a suitable cell source for osteochondral regeneration, and the strategy of sandwiching cells between two layers may facilitate scaffold and tissue integration. PMID:21953869

  8. The New York Brain Bank of Columbia University: practical highlights of 35 years of experience.

    PubMed

    Ramirez, Etty Paola Cortes; Keller, Christian Ernst; Vonsattel, Jean Paul

    2018-01-01

    The New York Brain Bank processes brains and organs of clinically well-characterized patients with age-related neurodegenerative diseases, and for comparison, from individuals without neurologic or psychiatric impairments. The donors, either patients or individuals, were evaluated at healthcare facilities of the Columbia University of New York. Each source brain yields four categories of samples: fresh frozen blocks and crushed parenchyma, and formalin-fixed wet blocks and histology sections. A source brain is thoroughly evaluated to determine qualitatively and quantitatively any changes it might harbor using conventional neuropathologic techniques. The clinical and pathologic diagnoses are integrated to determine the distributive diagnosis assigned to the samples obtained from a source brain. The protocol was first developed in 1981 and has been gradually standardized in response to the evolving requirements of basic investigations on neurodegeneration. The methods assimilate long-standing experience from multiple centers. The resulting and current protocol includes a constant central core applied to all brains with conditional flexibility around it. The New York Brain Bank is an integral part of the department of pathology, where the expertise, teaching duties, and hardware are shared. Since details of the protocols are available online, this chapter focuses on practical issues in professionalizing brain banking. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Physics-based investigation of negative ion behavior in a negative-ion-rich plasma using integrated diagnostics

    NASA Astrophysics Data System (ADS)

    Tsumori, K.; Takeiri, Y.; Ikeda, K.; Nakano, H.; Geng, S.; Kisaki, M.; Nagaoka, K.; Tokuzawa, T.; Wada, M.; Sasaki, K.; Nishiyama, S.; Goto, M.; Osakabe, M.

    2017-08-01

    A total power of 16 MW has been successfully delivered to the plasma confined in the Large Helical Device (LHD) from three Neutral Beam Injectors (NBIs) equipped with negative hydrogen (H-) ion sources. However, the detailed mechanisms from production through extraction of H- ions are yet to be clarified, and a similar-size ion source on an independent acceleration test bench, the Research and development Negative Ion Source (RNIS), serves as the facility to study physics related to H- production and transport for further improvement of NBI. The production of negative-ion-rich plasma and the behavior of H- ions in the beam extraction region of RNIS are being investigated by employing an integrated diagnostic system. Flow patterns of electrons, positive ions and H- ions in the extraction region are described in a two-dimensional map. The measured flow patterns indicate the existence of a stagnation region, where the H- flow changes direction at a distance of about 20 mm from the plasma grid. The patterns also suggest that H- flow originating from the plasma grid (PG) surface turns back toward the extraction apertures. The turning region appears to be formed by a layer of combined magnetic field produced by the magnetic filter field and the Electron-Deflection Magnetic (EDM) field created by magnets installed in the extraction electrode.

  10. A conceptual framework for the sustainable management of wastewater in Harare, Zimbabwe.

    PubMed

    Nhapi, I; Gijzen, H J; Siebel, M A

    2003-01-01

    The aim of this study was to formulate an integrated wastewater management model for Harare, Zimbabwe, based on current thinking. This implies that wastewater is treated/disposed of as close to the source of generation as possible. Resource recovery and reuse in a local thriving urban agriculture are integrated into this model. Intervention strategies were considered for controlling water, nitrogen and phosphorus flows to the lake. In the formulation of strategies, Harare was divided into five major operational areas of high-, medium-, and low-density residential areas, and also commercial and industrial areas. Specific options were then considered to suit landuse, development constraints and socio-economic status for each area, within the overall criteria of limiting nutrient inflows into the downstream Lake Chivero. Flexible and differential solutions were developed in relation to built environment, population density, composition of users, ownership, future environmental demands, and technical, environmental, hygienic, social and organisational factors. Options considered include source control by the users (residents, industries, etc.), using various strategies like implementation of toilets with source separation, and natural methods of wastewater treatment. Other possible strategies are invoking better behaviour through fees and information, incentives for cleaner production, and user responsibility through education, legislative changes and stricter controls over industry.

  11. Initial Results from the Survey of Organizational Research Climates (SOuRCe) in the U.S. Department of Veterans Affairs Healthcare System.

    PubMed

    Martinson, Brian C; Nelson, David; Hagel-Campbell, Emily; Mohr, David; Charns, Martin P; Bangerter, Ann; Thrush, Carol R; Ghilardi, Joseph R; Bloomfield, Hanna; Owen, Richard; Wells, James A

    2016-01-01

    In service to its core mission of improving the health and well-being of veterans, Veterans Affairs (VA) leadership is committed to supporting research best practices in the VA. Recognizing that the behavior of researchers is influenced by the organizational climates in which they work, efforts to assess the integrity of research climates and share such information with research leadership in VA may be one way to support research best practices. The Survey of Organizational Research Climate (SOuRCe) is the first validated survey instrument specifically designed to assess the organizational climate of research integrity in academic research organizations. The current study reports on an initiative to use the SOuRCe in VA facilities to characterize the organizational research climates and pilot test the effectiveness of using SOuRCe data as a reporting and feedback intervention tool. We administered the SOuRCe using a cross-sectional, online survey, with mailed follow-up to non-responders, of research-engaged employees in the research services of a random selection of 42 VA facilities (e.g., Hospitals/Stations) believed to employ 20 or more research staff. We attained a 51% participation rate, yielding more than 5,200 usable surveys. We found a general consistency in organizational research climates across a variety of sub-groups in this random sample of research services in the VA. We also observed similar SOuRCe scale score means, relative rankings of these scales and their internal reliability, in this VA-based sample as we have previously documented in more traditional academic research settings. Results also showed more substantial variability in research climate scores within than between facilities in the VA research service as reflected in meaningful subgroup differences. These findings suggest that the SOuRCe is suitable as an instrument for assessing the research integrity climates in VA and that the tool has similar patterns of results that have been observed in more traditional academic research settings. The local and specific nature of organizational climates in VA research services, as reflected in variability across sub-groups within individual facilities, has important policy implications. Global, "one-size-fits-all" type initiatives are not likely to yield as much benefit as efforts targeted to specific organizational units or sub-groups and tailored to the specific strengths and weaknesses documented in those locations.

  12. Fabrication and assembly of a superconducting undulator for the advanced photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasse, Quentin; Fuerst, J. D.; Ivanyushenkov, Y.

    2014-01-29

    A prototype superconducting undulator magnet (SCU0) has been built at the Advanced Photon Source (APS) of Argonne National Laboratory (ANL) and has successfully completed both cryogenic performance and magnetic measurement test programs. The SCU0 closed loop, zero-boil-off cryogenic system incorporates high temperature superconducting (HTS) current leads, cryocoolers, a LHe reservoir supplying dual magnetic cores, and an integrated cooled beam chamber. This system presented numerous challenges in the design, fabrication, and assembly of the device. Aspects of this R and D relating to both the cryogenic and overall assembly of the device are presented here. The SCU0 magnet has been installed in the APS storage ring.

  13. Highly parallel demagnetization field calculation using the fast multipole method on tetrahedral meshes with continuous sources

    NASA Astrophysics Data System (ADS)

    Palmesi, P.; Exl, L.; Bruckner, F.; Abert, C.; Suess, D.

    2017-11-01

    The long-range magnetic field is the most time-consuming part of micromagnetic simulations. Computational improvements can relieve problems related to this bottleneck. This work presents an efficient implementation of the Fast Multipole Method [FMM] for the magnetic scalar potential as used in micromagnetics. The novelty lies in extending FMM to linearly magnetized tetrahedral sources, making it interesting also for other areas of computational physics. We treat the near field directly and use (exact) numerical integration for the multipole expansion in the far field. This approach tackles important issues like the vectorial and continuous nature of the magnetic field. By using FMM the calculations scale linearly in time and memory.

  14. The faint X-ray sources in and out of omega Centauri: X-ray observations and optical identifications

    NASA Technical Reports Server (NTRS)

    Cool, Adrienne M.; Grindlay, Jonathan E.; Bailyn, Charles D.; Callanan, Paul J.; Hertz, Paul

    1995-01-01

    We present the results of an observation of the globular cluster omega Cen (NGC 5139) with the Einstein high-resolution imager (HRI). Of the five low-luminosity X-ray sources toward omega Cen which were first identified with the Einstein imaging proportional counter (IPC) (Hertz and Grindlay 1983a, b), two are detected in the Einstein HRI observation: IPC sources A and D. These detections provide source positions accurate to 3-4 arcsec; the positions are confirmed in a ROSAT HRI observation reported here. Using CCD photometry and spectroscopy, we have identified both sources as foreground dwarf M stars with emission lines (dMe). The chance projection of two dMe stars within approximately 13 arcmin of the center of omega Cen is not extraordinary, given the space density of these stellar coronal X-ray sources. We discuss the possible nature of the three as yet unidentified IPC sources toward omega Cen, and consider the constraints that the Einstein observations place on the total population of X-ray sources in this cluster. The integrated luminosity from faint X-ray sources in omega Cen appears to be low relative to both the old open cluster M67 and the post-core-collapse globular, NGC 6397.

  15. Using natural archives to track sources and long-term trends of pollution: an introduction

    USGS Publications Warehouse

    Jules Blais,; Rosen, Michael R.; John Smol,

    2015-01-01

    This book explores the myriad ways that environmental archives can be used to study the distribution and long-term trajectories of contaminants. The volume first focuses on reviews that examine the integrity of the historic record, including factors related to hydrology, post-depositional diffusion, and mixing processes. This is followed by a series of chapters dealing with the diverse archives available for long-term studies of environmental pollution.

  16. An integrated WRF/HYSPLIT modeling approach for the assessment of PM(2.5) source regions over the Mississippi Gulf Coast region.

    PubMed

    Yerramilli, Anjaneyulu; Dodla, Venkata Bhaskar Rao; Challa, Venkata Srinivas; Myles, Latoya; Pendergrass, William R; Vogel, Christoph A; Dasari, Hari Prasad; Tuluri, Francis; Baham, Julius M; Hughes, Robert L; Patrick, Chuck; Young, John H; Swanier, Shelton J; Hardy, Mark G

    2012-12-01

    Fine particulate matter (PM(2.5)) is formed mainly from precursor gases, such as sulfur dioxide (SO(2)) and nitrogen oxides (NO(x)), which are emitted largely from intense industrial operations and transportation activities. PM(2.5) has been shown to affect respiratory health in humans. Evaluation of source regions and assessment of emission source contributions in the Gulf Coast region of the USA will be useful for the development of PM(2.5) regulatory and mitigation strategies. In the present study, the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model driven by the Weather Research & Forecasting (WRF) model is used to identify the emission source locations and transportation trends. Meteorological observations as well as PM(2.5) sulfate and nitric acid concentrations were collected at two sites during the Mississippi Coastal Atmospheric Dispersion Study, a summer 2009 field experiment along the Mississippi Gulf Coast. Meteorological fields during the campaign were simulated using WRF with three nested domains of 36, 12, and 4 km horizontal resolutions and 43 vertical levels and validated with North American Mesoscale Analysis. The HYSPLIT model was integrated with meteorological fields derived from the WRF model to identify the source locations using backward trajectory analysis. The backward trajectories for a 24-h period were plotted at 1-h intervals starting from two observation locations to identify probable sources. The back trajectories distinctly indicated the sources to be in the direction between south and west, thus to have origin from local Mississippi, neighboring Louisiana state, and Gulf of Mexico. Out of the eight power plants located within the radius of 300 km of the two monitoring sites examined as sources, only Watson, Cajun, and Morrow power plants fall in the path of the derived back trajectories. Forward dispersion patterns computed using HYSPLIT were plotted from each of these source locations using the hourly mean emission concentrations as computed from past annual emission strength data to assess the extent of their contribution. An assessment of the relative contributions from the eight sources reveals that only Cajun and Morrow power plants contribute to the observations at the Wiggins Airport to a certain extent, while none of the eight power plants contribute to the observations at Harrison Central High School. As these observations represent a moderate event with daily average values of 5-8 μg m(-3) for sulfate and 1-3 μg m(-3) for HNO(3) with differences between the two spatially varied sites, the local sources may also be significant contributors to the observed values of PM(2.5).

  17. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    PubMed

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to Web Ontology Language. The proposed service makes use of an algorithm that can transform several data models from different domains, mainly by deploying inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
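
    A minimal sketch of the general table-to-ontology mapping idea (not the authors' algorithm; the schema, names, and rdflib-based mapping below are illustrative assumptions): each table becomes an owl:Class, each ordinary column a datatype property, and each foreign key an object property linking two classes.

```python
# Minimal sketch (not the authors' algorithm): mapping a relational schema to OWL
# with rdflib. Table -> owl:Class, column -> owl:DatatypeProperty,
# foreign key -> owl:ObjectProperty. Table/column names are hypothetical.
from rdflib import Graph, Namespace, RDF, RDFS, OWL, XSD

EX = Namespace("http://example.org/clinical#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Hypothetical schema: Patient(id, name), Visit(id, date, patient_id -> Patient.id)
tables = {
    "Patient": {"columns": {"name": XSD.string}, "fks": {}},
    "Visit":   {"columns": {"date": XSD.date},  "fks": {"patient_id": "Patient"}},
}

for table, spec in tables.items():
    cls = EX[table]
    g.add((cls, RDF.type, OWL.Class))
    for col, dtype in spec["columns"].items():
        prop = EX[f"{table}_{col}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, dtype))
    for fk, target in spec["fks"].items():
        prop = EX[f"{table}_{fk}"]
        g.add((prop, RDF.type, OWL.ObjectProperty))
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, EX[target]))

print(g.serialize(format="turtle"))
```

    Running the sketch prints a small Turtle document; the paper's service additionally applies inheritance rules and embeds the result in an interoperability framework, which this toy mapping does not attempt.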

  18. MCNP-based computational model for the Leksell gamma knife.

    PubMed

    Trnka, Jiri; Novotny, Josef; Kluson, Jaroslav

    2007-01-01

    We have focused on the use of the MCNP code for calculation of Gamma Knife radiation field parameters with a homogeneous polystyrene phantom. We have investigated several parameters of the Leksell Gamma Knife radiation field and compared the results with other studies based on EGS4 and PENELOPE code as well as the Leksell Gamma Knife treatment planning system Leksell GammaPlan (LGP). The current model describes all 201 radiation beams together and simulates all the sources at the same time. Within each beam, it considers the technical construction of the source, the source holder, collimator system, the spherical phantom, and surrounding material. We have calculated output factors for various sizes of scoring volumes, relative dose distributions along basic planes including linear dose profiles, integral doses in various volumes, and differential dose volume histograms. All the parameters have been calculated for each collimator size and for the isocentric configuration of the phantom. We have found the calculated output factors to be in agreement with other authors' works except in the case of the 4 mm collimator size, where averaging over the scoring volume and statistical uncertainties strongly influences the calculated results. In general, all the results are dependent on the choice of the scoring volume. The calculated linear dose profiles and relative dose distributions also match independent studies and the Leksell GammaPlan, but care must be taken about the fluctuations within the plateau, which can influence the normalization, and accuracy in determining the isocenter position, which is important for comparing different dose profiles. The calculated differential dose volume histograms and integral doses have been compared with data provided by the Leksell GammaPlan. The dose volume histograms are in good agreement as well as integral doses calculated in small calculation matrix volumes. However, deviations in integral doses of up to 50% can be observed for large volumes such as for the total skull volume. The differences observed in the treatment of scattered radiation between the MC method and the LGP may be important in this case. We have also studied the influence of differential direction sampling of primary photons and have found that, due to the anisotropic sampling, doses around the isocenter deviate from each other by up to 6%. With caution about the details of the calculation settings, it is possible to employ the MCNP Monte Carlo code for independent verification of the Leksell Gamma Knife radiation field properties.

  19. Integration of the EventIndex with other ATLAS systems

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Gallas, E. J.; Prokoshin, F.

    2015-12-01

    The ATLAS EventIndex System, developed for use in LHC Run 2, is designed to index every processed event in ATLAS, replacing the TAG System used in Run 1. Its storage infrastructure, based on the Hadoop open-source software framework, necessitates revamping how information in this system relates to other ATLAS systems. It will store more indexes since the fundamental mechanisms for retrieving these indexes will be better integrated into all stages of data processing, allowing more events from later stages of processing to be indexed than was possible with the previous system. Connections with other systems (conditions database, monitoring) are fundamentally critical to assess dataset completeness, identify data duplication, and check data integrity, and also enhance access to information in the EventIndex through user and system interfaces. This paper gives an overview of the ATLAS systems involved, the relevant metadata, and describes the technologies we are deploying to complete these connections.

  20. A finite-element analysis for steady and oscillatory subsonic flow around complex configurations

    NASA Technical Reports Server (NTRS)

    Chen, L. T.; Suciu, E. O.; Morino, L.

    1974-01-01

    The problem of potential subsonic flow around complex configurations is considered. The solution is obtained from an integral equation relating the values of the potential on the surface of the body to the values of the normal derivative, which is known from the boundary conditions. The surface of the body is divided into small (hyperboloidal quadrilateral) surface elements, which are described in terms of the Cartesian components of the four corner points. The values of the potential (and its normal derivative) within each element are assumed to be constant and equal to their values at the centroid of the element. The coefficients of the equation are given by source and doublet integrals over the surface elements. Closed form evaluations of the integrals are presented. The results obtained with the above formulation are compared with existing analytical and experimental results.
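
    The integral equation referred to above is, in essence, Green's third identity applied to the perturbation potential. One common way of writing it (sign and normal conventions vary between authors, so this is an illustrative statement rather than a quotation from the paper) is:

```latex
% For a point P on a smooth part of the body surface S, with n the normal into
% the flow and r the distance from P to the integration point Q on S:
2\pi\,\varphi(P)
  \;=\; \oint_{S}\!\left[\frac{1}{r}\,\frac{\partial \varphi}{\partial n}
  \;-\; \varphi\,\frac{\partial}{\partial n}\!\left(\frac{1}{r}\right)\right]\mathrm{d}S ,
\qquad r = \lVert P - Q \rVert .
```

    The first kernel is a source distribution whose strength is known from the boundary condition on the normal derivative; the second is a doublet distribution in the unknown surface potential. Assuming piecewise-constant values on N quadrilateral elements, as described above, reduces this relation to an N-by-N linear system for the surface potentials.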

  1. Real-time realizations of the Bayesian Infrasonic Source Localization Method

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.

    2015-12-01

    The Bayesian Infrasonic Source Localization method (BISL), introduced by Mordak et al. (2010) and upgraded by Marcillo et al. (2014), is intended for accurate estimation of atmospheric event origins at local, regional and global scales by seismic and infrasonic networks and arrays. The BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time and azimuth estimate merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions multiplied by a prior probability density function of celerity over the multivariate parameter space. The present BISL realization is a generally time-consuming procedure based on numerical integration. The computational scheme proposed simplifies the target function so that integrals are taken exactly and are represented via standard functions. This makes the procedure much faster and realizable in real-time without practical loss of accuracy. The procedure executed as PYTHON-FORTRAN code demonstrates high performance on a set of model and real data.
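
    As an illustration of the underlying computation (not the paper's closed-form scheme, and with made-up station geometry, picks, and priors), a brute-force grid version of the Bayesian localization can be written in a few lines: for each candidate source node, the Gaussian arrival-time likelihood is integrated numerically over a discretized celerity prior, and the posterior is maximized over the grid. For brevity the sketch assumes a known origin time, whereas BISL also marginalizes over it.

```python
# Minimal sketch of grid-based Bayesian localization in the spirit of BISL
# (not the closed-form scheme of the paper). Station coordinates, picks, and
# prior parameters below are made up for illustration; a uniform spatial prior
# is assumed, so the log-posterior equals the log-likelihood up to a constant.
import numpy as np

stations = np.array([[0.0, 40.0], [35.0, 5.0], [-30.0, -20.0]])   # km
picks    = np.array([130.0, 118.0, 125.0])                        # s after origin
sigma_t  = 2.0                                                     # pick std (s)

celerities = np.linspace(0.28, 0.34, 61)                           # km/s grid
cel_prior  = np.exp(-0.5 * ((celerities - 0.31) / 0.01) ** 2)
cel_prior /= cel_prior.sum()

xs = np.linspace(-60.0, 60.0, 241)
ys = np.linspace(-60.0, 60.0, 241)
X, Y = np.meshgrid(xs, ys)

log_post = np.zeros_like(X)
for (sx, sy), t_obs in zip(stations, picks):
    r = np.hypot(X - sx, Y - sy)                                   # source-station range
    # Marginalize celerity: sum_k p(c_k) * N(t_obs | r/c_k, sigma_t^2)
    lik = np.zeros_like(X)
    for c, pc in zip(celerities, cel_prior):
        lik += pc * np.exp(-0.5 * ((t_obs - r / c) / sigma_t) ** 2)
    log_post += np.log(lik + 1e-300)                               # independent stations

i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
print(f"MAP source location: x = {X[i, j]:.1f} km, y = {Y[i, j]:.1f} km")
```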

  2. Integration and Optimization of Alternative Sources of Energy in a Remote Region

    NASA Astrophysics Data System (ADS)

    Berberi, Pellumb; Inodnorjani, Spiro; Aleti, Riza

    2010-01-01

    In a remote coastal region, the supply of energy from the national grid is insufficient for sustainable development. Integration and optimization of local alternative renewable energy sources is one possible solution to the problem. In this paper we have studied the energetic potential of local sources of renewable energy (water, solar, wind and biomass). A bottom-up energy system optimization model is proposed in order to support planning policies for promoting the use of renewable energy sources. Software based on multi-factor and constraint analysis for optimizing energy flow is proposed, which provides detailed information on the exploitation of each energy source, power and heat generation, GHG emissions and end-use sectors. Economic analysis shows that with existing technologies both stand-alone and regional facilities may be feasible. Improving specific legislation will foster investments from Central or Local Governments and also from individuals, private companies or small families. The study is carried out in the framework of the FP6 project "Integrated Renewable Energy System."

  3. On-Chip Waveguide Coupling of a Layered Semiconductor Single-Photon Source.

    PubMed

    Tonndorf, Philipp; Del Pozo-Zamudio, Osvaldo; Gruhler, Nico; Kern, Johannes; Schmidt, Robert; Dmitriev, Alexander I; Bakhtinov, Anatoly P; Tartakovskii, Alexander I; Pernice, Wolfram; Michaelis de Vasconcellos, Steffen; Bratschitsch, Rudolf

    2017-09-13

    Fully integrated quantum technology based on photons is a focus of current research, because of its immense potential concerning performance and scalability. Ideally, the single-photon sources, the processing units, and the photon detectors are all combined on a single chip. Impressive progress has been made for on-chip quantum circuits and on-chip single-photon detection. In contrast, nonclassical light is commonly coupled onto the photonic chip from the outside, because presently only a few integrated single-photon sources exist. Here, we present waveguide-coupled single-photon emitters in the layered semiconductor gallium selenide as promising on-chip sources. GaSe crystals with a thickness below 100 nm are placed on Si3N4 rib or slot waveguides, resulting in a modified mode structure efficient for light coupling. Using optical excitation from within the Si3N4 waveguide, we find nonclassicality of generated photons routed on the photonic chip. Thus, our work provides an easy-to-implement and robust light source for integrated quantum technology.

  4. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates and camera position. However, it is very expensive, and users cannot use the result immediately because the position information is not embedded into the image. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positioning with the open source software OpenCV. In the end, we use the open source panorama browser Panini and integrate all of these into the open source GIS software, Quantum GIS. In this way a complete data collection and processing system can be constructed.
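
    A minimal sketch of the signal-integration step (hypothetical data, not the authors' code): parsing an NMEA $GPGGA sentence, as logged from the GPS through the Arduino, into decimal-degree coordinates that can then be attached to each exposure.

```python
# Minimal sketch (hypothetical sample sentence, not the authors' code): parse NMEA
# $GPGGA sentences logged from the GPS/Arduino into decimal-degree coordinates
# so each exposure can be tagged with a position.
def dm_to_deg(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to signed decimal degrees."""
    dot = value.index(".")
    degrees = float(value[:dot - 2])
    minutes = float(value[dot - 2:])
    deg = degrees + minutes / 60.0
    return -deg if hemisphere in ("S", "W") else deg

def parse_gga(line: str):
    fields = line.strip().split(",")
    if not fields[0].endswith("GGA") or fields[2] == "":
        return None                      # not a GGA sentence, or no fix
    utc = fields[1]
    lat = dm_to_deg(fields[2], fields[3])
    lon = dm_to_deg(fields[4], fields[5])
    return utc, lat, lon

if __name__ == "__main__":
    sample = "$GPGGA,064951.000,2307.1256,N,12016.4438,E,1,8,0.95,39.9,M,17.8,M,,*65"
    print(parse_gga(sample))             # ('064951.000', 23.11876, 120.2740633...)
```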

  5. Detecting misinformation and knowledge conflicts in relational data

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian

    2014-06-01

    Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commander's information requests, integrate facts into summaries about current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open source data processing, where there is much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against the data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modifications to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In the experimental study with the SYNCOIN dataset, our algorithms achieved perfect conflict detection in the ideal situation (no missing data) while producing 82% recall and 90% precision in a realistic noise situation (15% of missing attributes).
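
    A minimal sketch of the core pattern-matching idea (not the paper's algorithm: soft co-reference, uncertainty propagation, and the SYNCOIN data are not reproduced, and the entities below are invented): a conflict pattern expressed as a small typed graph is searched for in a multi-source data graph with exact subgraph matching.

```python
# Minimal sketch (not the paper's algorithm): express a knowledge-conflict pattern
# as a small typed graph and search for it in a multi-source data graph using
# exact subgraph matching. Entities and relations below are made up.
import networkx as nx
from networkx.algorithms import isomorphism

# Data graph: entities/events extracted from two hypothetical sources.
data = nx.DiGraph()
data.add_node("personA", type="person")
data.add_node("cityX", type="location")
data.add_node("cityY", type="location")
data.add_edge("personA", "cityX", rel="located_in", source="report_1")
data.add_edge("personA", "cityY", rel="located_in", source="report_2")

# Pattern: the same person reported in two different locations -> potential conflict.
pattern = nx.DiGraph()
pattern.add_node("p", type="person")
pattern.add_node("l1", type="location")
pattern.add_node("l2", type="location")
pattern.add_edge("p", "l1", rel="located_in")
pattern.add_edge("p", "l2", rel="located_in")

nm = isomorphism.categorical_node_match("type", None)
em = isomorphism.categorical_edge_match("rel", None)
gm = isomorphism.DiGraphMatcher(data, pattern, node_match=nm, edge_match=em)

# Mappings are data-node -> pattern-node; symmetric matches (l1/l2 swapped) both appear.
for mapping in gm.subgraph_isomorphisms_iter():
    inv = {v: k for k, v in mapping.items()}
    if inv["l1"] != inv["l2"]:
        print("Potential conflict:", inv["p"], "located in both", inv["l1"], "and", inv["l2"])
```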

  6. Progress in Open-World, Integrative, Collaborative Science Data Platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Fox, P. A.

    2013-12-01

    As collaborative, or network science spreads into more Earth and space science fields, both the participants and their funders have expressed a very strong desire for highly functional data and information capabilities that are a) easy to use, b) integrated in a variety of ways, c) leverage prior investments and keep pace with rapid technical change, and d) are not expensive or time-consuming to build or maintain. In response, and based on our accumulated experience over the last decade and a maturing of several key technical approaches, we have adapted, extended, and integrated several open source applications and frameworks that handle major portions of functionality for these platforms. At minimum, these functions include: an object-type repository, collaboration tools, an ability to identify and manage all key entities in the platform, and an integrated portal to manage diverse content and applications, with varied access levels and privacy options. At a conceptual level, science networks (even small ones) deal with people, and many intellectual artifacts produced or consumed in research, organizational and/or outreach activities, as well as the relations among them. Increasingly these networks are modeled as knowledge networks, i.e. graphs with named and typed relations among the 'nodes'. Nodes can be people, organizations, datasets, events, presentations, publications, videos, meetings, reports, groups, and more. In this heterogeneous ecosystem, it is also important to use a set of common informatics approaches to co-design and co-evolve the needed science data platforms based on what real people want to use them for. In this contribution, we present our methods and results for information modeling, adapting, integrating and evolving a networked data science and information architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS)). In particular we present both the instantiation of this data platform for the Deep Carbon Observatory, including key functional and non-functional attributes, how the smart mediation among the components is modeled and managed, and discuss its general applicability.

  7. Identification of Major Risk Sources for Surface Water Pollution by Risk Indexes (RI) in the Multi-Provincial Boundary Region of the Taihu Basin, China

    PubMed Central

    Yao, Hong; Li, Weixin; Qian, Xin

    2015-01-01

    Environmental safety in multi-district boundary regions has been one of the focuses in China and is mentioned many times in the Environmental Protection Act of 2014. Five types were categorized concerning the risk sources for surface water pollution in the multi-provincial boundary region of the Taihu basin: production enterprises, waste disposal sites, chemical storage sites, agricultural non-point sources and waterway transportations. Considering the hazard of risk sources, the purification property of environmental medium and the vulnerability of risk receptors, 52 specific attributes on the risk levels of each type of risk source were screened out. Continuous piecewise linear function model, expert consultation method and fuzzy integral model were used to calculate the integrated risk indexes (RI) to characterize the risk levels of pollution sources. In the studied area, 2716 pollution sources were characterized by RI values. There were 56 high-risk sources screened out as major risk sources, accounting for about 2% of the total. The numbers of sources with high-moderate, moderate, moderate-low and low pollution risk were 376, 1059, 101 and 1124, respectively, accounting for 14%, 38%, 5% and 41% of the total. The procedure proposed could be included in the integrated risk management systems of the multi-district boundary region of the Taihu basin. It could help decision makers to identify major risk sources in the risk prevention and reduction of surface water pollution. PMID:26308032

  8. Identification of Major Risk Sources for Surface Water Pollution by Risk Indexes (RI) in the Multi-Provincial Boundary Region of the Taihu Basin, China.

    PubMed

    Yao, Hong; Li, Weixin; Qian, Xin

    2015-08-21

    Environmental safety in multi-district boundary regions has been one of the focuses in China and is mentioned many times in the Environmental Protection Act of 2014. Five types were categorized concerning the risk sources for surface water pollution in the multi-provincial boundary region of the Taihu basin: production enterprises, waste disposal sites, chemical storage sites, agricultural non-point sources and waterway transportations. Considering the hazard of risk sources, the purification property of environmental medium and the vulnerability of risk receptors, 52 specific attributes on the risk levels of each type of risk source were screened out. Continuous piecewise linear function model, expert consultation method and fuzzy integral model were used to calculate the integrated risk indexes (RI) to characterize the risk levels of pollution sources. In the studied area, 2716 pollution sources were characterized by RI values. There were 56 high-risk sources screened out as major risk sources, accounting for about 2% of the total. The numbers of sources with high-moderate, moderate, moderate-low and low pollution risk were 376, 1059, 101 and 1124, respectively, accounting for 14%, 38%, 5% and 41% of the total. The procedure proposed could be included in the integrated risk management systems of the multi-district boundary region of the Taihu basin. It could help decision makers to identify major risk sources in the risk prevention and reduction of surface water pollution.
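
    The abstract does not state which fuzzy integral was used; as an illustration only, a discrete Choquet integral over an explicitly enumerated fuzzy measure is a common choice for this kind of risk aggregation. The attributes, scores, and measure values below are invented.

```python
# Minimal sketch of a discrete Choquet (fuzzy) integral for aggregating risk
# attribute scores into a single index. The paper's exact integral and measure
# are not specified; everything below is illustrative only.
def choquet(scores: dict, mu: dict) -> float:
    """Discrete Choquet integral of `scores` w.r.t. fuzzy measure `mu`
    (mu maps frozensets of attribute names to values in [0, 1])."""
    ordered = sorted(scores, key=scores.get, reverse=True)   # x(1) >= x(2) >= ...
    total = 0.0
    for i in range(len(ordered)):
        x_i = scores[ordered[i]]
        x_next = scores[ordered[i + 1]] if i + 1 < len(ordered) else 0.0
        subset = frozenset(ordered[:i + 1])
        total += (x_i - x_next) * mu[subset]
    return total

# Illustrative attributes of one pollution source (normalized to [0, 1]).
scores = {"hazard": 0.8, "purification": 0.4, "vulnerability": 0.6}

# Illustrative monotone fuzzy measure on all subsets (mu(empty)=0, mu(all)=1).
mu = {
    frozenset(): 0.0,
    frozenset({"hazard"}): 0.5,
    frozenset({"purification"}): 0.2,
    frozenset({"vulnerability"}): 0.3,
    frozenset({"hazard", "purification"}): 0.6,
    frozenset({"hazard", "vulnerability"}): 0.8,
    frozenset({"purification", "vulnerability"}): 0.45,
    frozenset({"hazard", "purification", "vulnerability"}): 1.0,
}

print(f"Integrated risk index RI = {choquet(scores, mu):.3f}")   # 0.660 for these values
```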

  9. Source-Based Tasks in Writing Independent and Integrated Essays

    ERIC Educational Resources Information Center

    Gholami, Javad; Alinasab, Mahsa

    2017-01-01

    Integrated writing tasks have gained considerable attention in ESL and EFL writing assessment and are frequently needed and used in academic settings and daily life. However, they are very rarely practiced and promoted in writing classes. This paper explored the effects of source-based writing practice on EFL learners' composing abilities and…

  10. A new high quality X-ray source for Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Walter, Philippe; Variola, Alessandro; Zomer, Fabian; Jaquet, Marie; Loulergue, Alexandre

    2009-09-01

    Compton-based photon sources have generated much interest since the rapid advance in laser and accelerator technologies has allowed envisaging their utilisation for ultra-compact radiation sources. These should provide short X-ray pulses with a relatively high average flux. Moreover, the univocal dependence between the scattered photon energy and its angle gives the possibility of obtaining a quasi-monochromatic beam with a simple diaphragm system. For the most ambitious projects the expected performance takes into account a rate of about 10^10 photons/s, with an angular divergence of a few mrad, an X-ray energy cut-off of a few tens of keV and a bandwidth ΔE/E ~ 1-10%. Even if the integrated rate cannot compete with synchrotron radiation sources, the cost and the compactness of these Compton-based machines make them attractive for a wide spectrum of applications. We explore here the interest of these systems for Cultural Heritage preservation. To cite this article: P. Walter et al., C. R. Physique 10 (2009).
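
    The "univocal dependence between the scattered photon energy and its angle" is usually expressed through the standard inverse-Compton relation (a textbook result, not quoted from this paper): for a head-on collision between a laser photon of energy E_L and an electron of Lorentz factor gamma, the scattered X-ray energy observed at angle theta from the electron direction is approximately

```latex
E_X(\theta) \;\simeq\;
  \frac{4\gamma^{2} E_L}{\,1 + \gamma^{2}\theta^{2} + 4\gamma E_L/(m_e c^{2})\,} ,
```

    so restricting the observation angle with a simple diaphragm selects a narrow slice of energies, which is the quasi-monochromatisation mentioned above.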

  11. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  12. Biomedical informatics: development of a comprehensive data warehouse for clinical and genomic breast cancer research.

    PubMed

    Hu, Hai; Brzeski, Henry; Hutchins, Joe; Ramaraj, Mohan; Qu, Long; Xiong, Richard; Kalathil, Surendran; Kato, Rand; Tenkillaya, Santhosh; Carney, Jerry; Redd, Rosann; Arkalgudvenkata, Sheshkumar; Shahzad, Kashif; Scott, Richard; Cheng, Hui; Meadow, Stephen; McMichael, John; Sheu, Shwu-Lin; Rosendale, David; Kvecher, Leonid; Ahern, Stephen; Yang, Song; Zhang, Yonghong; Jordan, Rick; Somiari, Stella B; Hooke, Jeffrey; Shriver, Craig D; Somiari, Richard I; Liebman, Michael N

    2004-10-01

    The Windber Research Institute is an integrated high-throughput research center employing clinical, genomic and proteomic platforms to produce terabyte levels of data. We use biomedical informatics technologies to integrate all of these operations. This report includes information on a multi-year, multi-phase hybrid data warehouse project currently under development in the Institute. The purpose of the warehouse is to host the terabyte-level of internal experimentally generated data as well as data from public sources. We have previously reported on the phase I development, which integrated limited internal data sources and selected public databases. Currently, we are completing phase II development, which integrates our internal automated data sources and develops visualization tools to query across these data types. This paper summarizes our clinical and experimental operations, the data warehouse development, and the challenges we have faced. In phase III we plan to federate additional manual internal and public data sources and then to develop and adapt more data analysis and mining tools. We expect that the final implementation of the data warehouse will greatly facilitate biomedical informatics research.

  13. Psychosocial Determinants of Cancer-Related Information Seeking among Cancer Patients

    PubMed Central

    SMITH-McLALLEN, AARON; FISHBEIN, MARTIN; HORNIK, ROBERT C.

    2011-01-01

    This study explores the utility of using the Integrative Model of Behavioral Prediction as a framework for predicting cancer patients’ intentions to seek information about their cancer from sources other than a physician, and to examine the relation between patient’s baseline intentions to seek information and their actual seeking behavior at follow-up. Within one year of their diagnosis with colon, breast, or prostate cancer, 1641 patients responded to a mailed questionnaire assessing intentions to seek cancer-related information from a source other than their doctor, as well as their attitudes, perceived normative pressure, and perceived behavioral control with respect to this behavior. In addition, the survey assessed their cancer-related information seeking. One year later, 1049 of these patients responded to a follow-up survey assessing cancer-related information seeking during the previous year. Attitudes, perceived normative pressure, and perceived behavioral control were predictive of information seeking intentions, though attitudes emerged as the primary predictor. Intentions to seek information, perceived normative pressure regarding information seeking, baseline information seeking behavior, and being diagnosed with stage 4 cancer were predictive of actual information seeking behavior at follow-up. Practical implications are discussed. PMID:21207310

  14. Unregulated private wells in the Republic of Ireland: consumer awareness, source susceptibility and protective actions.

    PubMed

    Hynds, Paul D; Misstear, Bruce D; Gill, Laurence W

    2013-09-30

    While the safety of public drinking water supplies in the Republic of Ireland is governed and monitored at both local and national levels, there are currently no legislative tools in place relating to private supplies. It is therefore paramount that private well owners (and users) be aware of source specifications and potential contamination risks, to ensure adequate water quality. The objective of this study was to investigate the level of awareness among private well owners in the Republic of Ireland, relating to source characterisation and groundwater contamination issues. This was undertaken through interviews with 245 private well owners. Statistical analysis indicates that respondents' source type significantly influences owner awareness, particularly regarding well construction and design parameters. Water treatment, source maintenance and regular water quality testing are considered the three primary "protective actions" (or "stewardship activities") against consumption of contaminated groundwater and were reported as being absent in 64%, 72% and 40% of cases, respectively. Results indicate that the level of awareness exhibited by well users did not significantly affect the likelihood of their source being contaminated (source susceptibility); increased awareness on the part of well users was associated with increased levels of protective action, particularly among borehole owners. Hence, lower levels of awareness may result in increased contraction of waterborne illnesses where contaminants have entered the well. Accordingly, focused educational strategies to increase awareness among private groundwater users are advocated in the short-term; the development and introduction of formal legislation is recommended in the long-term, including an integrated programme of well inspections and risk assessments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Economic dispatch optimization for system integrating renewable energy sources

    NASA Astrophysics Data System (ADS)

    Jihane, Kartite; Mohamed, Cherkaoui

    2018-05-01

    Nowadays, the use of energy is growing, especially in the transportation and electricity industries. However, this energy is largely based on conventional sources, which pollute the environment. A multi-source system is seen as the best solution for sustainable development. This paper proposes the Economic Dispatch (ED) of a hybrid renewable power system. The hybrid system is composed of ten thermal generators, a photovoltaic (PV) generator and a wind turbine generator. To show the importance of renewable energy sources (RES) in the energy mix, we have run the simulation for the system integrating PV only and PV plus wind. The result shows that the system with renewable energy sources (RES) is more promising than the system without RES in terms of fuel cost.
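
    A minimal sketch of such an economic-dispatch formulation (illustrative three-unit data, not the paper's ten-generator system): thermal units with quadratic fuel-cost curves are dispatched to cover the demand left after subtracting the zero-marginal-cost PV and wind output.

```python
# Minimal economic-dispatch sketch (illustrative data, not the paper's system):
# minimize total quadratic fuel cost of thermal units subject to the power
# balance, with PV and wind treated as zero-cost generation that reduces the
# residual demand.
import numpy as np
from scipy.optimize import minimize

# Thermal unit cost coefficients: cost_i(P) = a_i + b_i*P + c_i*P^2  ($/h, P in MW)
a = np.array([240.0, 200.0, 220.0])
b = np.array([7.0, 10.0, 8.5])
c = np.array([0.0070, 0.0095, 0.0090])
p_min = np.array([50.0, 20.0, 30.0])
p_max = np.array([300.0, 130.0, 160.0])

demand = 450.0          # MW
pv, wind = 40.0, 60.0   # MW forecast renewable output (zero marginal cost)
residual = demand - pv - wind

def total_cost(p):
    return float(np.sum(a + b * p + c * p ** 2))

cons = [{"type": "eq", "fun": lambda p: np.sum(p) - residual}]   # power balance
bounds = list(zip(p_min, p_max))
p0 = np.clip(np.full(3, residual / 3), p_min, p_max)

res = minimize(total_cost, p0, bounds=bounds, constraints=cons, method="SLSQP")
print("Thermal dispatch (MW):", np.round(res.x, 1), " cost ($/h):", round(res.fun, 1))
```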

  16. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach, which is one of the main obstacles in GISs to using map products of photogrammetric workstations. Also, by means of these integrated systems, structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, can be provided at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.

  17. VISUALIZATION FROM INTRAOPERATIVE SWEPT-SOURCE MICROSCOPE-INTEGRATED OPTICAL COHERENCE TOMOGRAPHY IN VITRECTOMY FOR COMPLICATIONS OF PROLIFERATIVE DIABETIC RETINOPATHY.

    PubMed

    Gabr, Hesham; Chen, Xi; Zevallos-Carrasco, Oscar M; Viehland, Christian; Dandrige, Alexandria; Sarin, Neeru; Mahmoud, Tamer H; Vajzovic, Lejla; Izatt, Joseph A; Toth, Cynthia A

    2018-01-10

    To evaluate the use of live volumetric (4D) intraoperative swept-source microscope-integrated optical coherence tomography in vitrectomy for proliferative diabetic retinopathy complications. In this prospective study, we analyzed a subgroup of patients with proliferative diabetic retinopathy complications who required vitrectomy and who were imaged by the research swept-source microscope-integrated optical coherence tomography system. In near real time, images were displayed in stereo heads-up display facilitating intraoperative surgeon feedback. Postoperative review included scoring image quality, identifying different diabetic retinopathy-associated pathologies and reviewing the intraoperatively documented surgeon feedback. Twenty eyes were included. Indications for vitrectomy were tractional retinal detachment (16 eyes), combined tractional-rhegmatogenous retinal detachment (2 eyes), and vitreous hemorrhage (2 eyes). Useful, good-quality 2D (B-scans) and 4D images were obtained in 16/20 eyes (80%). In these eyes, multiple diabetic retinopathy complications could be imaged. Swept-source microscope-integrated optical coherence tomography provided surgical guidance, e.g., in identifying dissection planes under fibrovascular membranes, and in determining residual membranes and traction that would benefit from additional peeling. In 4/20 eyes (20%), acceptable images were captured, but they were not useful due to high tractional retinal detachment elevation which was challenging for imaging. Swept-source microscope-integrated optical coherence tomography can provide important guidance during surgery for proliferative diabetic retinopathy complications through intraoperative identification of different complications and facilitation of intraoperative decision making.

  18. Simulation of the alpha particle heating and the helium ash source in an International Thermonuclear Experimental Reactor-like tokamak with an internal transport barrier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Lei, E-mail: lye@ipp.ac.cn; Guo, Wenfeng; Xiao, Xiaotao

    2014-12-15

    A guiding center orbit following code, which incorporates a set of non-singular coordinates for orbit integration, was developed and applied to investigate the alpha particle heating in an ITER-like tokamak with an internal transport barrier. It is found that a relatively large q (safety factor) value can significantly broaden the alpha heating profile in comparison with the local heating approximation; this broadening is due to the finite orbit width effects; when the orbit width is much smaller than the scale length of the alpha particle source profile, the heating profile agrees with the source profile, otherwise, the heating profile can be significantly broadened. It is also found that the stagnation particles move to the magnetic axis during the slowing-down process, thus the effect of stagnation orbits is not beneficial to the helium ash removal. The source profile of helium ash is broadened in comparison with the alpha source profile, which is similar to the heating profile.

  19. ERP correlates of source memory: unitized source information increases familiarity-based retrieval.

    PubMed

    Diana, Rachel A; Van den Boom, Wijnand; Yonelinas, Andrew P; Ranganath, Charan

    2011-01-07

    Source memory tests typically require subjects to make decisions about the context in which an item was encoded and are thought to depend on recollection of details from the study episode. Although it is generally believed that familiarity does not contribute to source memory, recent behavioral studies have suggested that familiarity may also support source recognition when item and source information are integrated, or "unitized," during study (Diana, Yonelinas, and Ranganath, 2008). However, an alternative explanation of these behavioral findings is that unitization affects the manner in which recollection contributes to performance, rather than increasing familiarity-based source memory. To discriminate between these possibilities, we conducted an event-related potential (ERP) study testing the hypothesis that unitization increases the contribution of familiarity to source recognition. Participants studied associations between words and background colors using tasks that either encouraged or discouraged unitization. ERPs were recorded during a source memory test for background color. The results revealed two distinct neural correlates of source recognition: a frontally distributed positivity that was associated with familiarity-based source memory in the high-unitization condition only and a parietally distributed positivity that was associated with recollection-based source memory in both the high- and low-unitization conditions. The ERP and behavioral findings provide converging evidence for the idea that familiarity can contribute to source recognition, particularly when source information is encoded as an item detail. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. Illustrative national scale scenarios of environmental and human health impacts of Carbon Capture and Storage.

    PubMed

    Tzanidakis, Konstantinos; Oxley, Tim; Cockerill, Tim; ApSimon, Helen

    2013-06-01

    Integrated Assessment, and the development of strategies to reduce the impacts of air pollution, has tended to focus only upon the direct emissions from different sources, with the indirect emissions associated with the full life-cycle of a technology often overlooked. Carbon Capture and Storage (CCS) reflects a number of new technologies designed to reduce CO2 emissions, but which may have much broader environmental implications than greenhouse gas emissions. This paper considers a wider range of pollutants from a full life-cycle perspective, illustrating a methodology for assessing environmental impacts using source-apportioned effects based impact factors calculated by the national scale UK Integrated Assessment Model (UKIAM). Contrasting illustrative scenarios for the deployment of CCS towards 2050 are presented which compare the life-cycle effects of air pollutant emissions upon human health and ecosystems of business-as-usual, deployment of CCS and widespread uptake of IGCC for power generation. Together with estimation of the transboundary impacts we discuss the benefits of an effects based approach to such assessments in relation to emissions based techniques. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Experimental Verification of Bayesian Planet Detection Algorithms with a Shaped Pupil Coronagraph

    NASA Astrophysics Data System (ADS)

    Savransky, D.; Groff, T. D.; Kasdin, N. J.

    2010-10-01

    We evaluate the feasibility of applying Bayesian detection techniques to discovering exoplanets using high contrast laboratory data with simulated planetary signals. Background images are generated at the Princeton High Contrast Imaging Lab (HCIL), with a coronagraphic system utilizing a shaped pupil and two deformable mirrors (DMs) in series. Estimates of the electric field at the science camera are used to correct for quasi-static speckle and produce symmetric high contrast dark regions in the image plane. Planetary signals are added in software, or via a physical star-planet simulator which adds a second off-axis point source before the coronagraph with a beam recombiner, calibrated to a fixed contrast level relative to the source. We produce a variety of images, with varying integration times and simulated planetary brightness. We then apply automated detection algorithms such as matched filtering to attempt to extract the planetary signals. This allows us to evaluate the efficiency of these techniques in detecting planets in a high noise regime and eliminating false positives, as well as to test existing algorithms for calculating the required integration times for these techniques to be applicable.
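
    A minimal sketch of the matched-filtering step on synthetic data (not the HCIL pipeline; the image size, PSF model, and injected contrast are arbitrary): the image is correlated with the known off-axis PSF template and the normalized response map is searched for peaks.

```python
# Minimal matched-filter sketch (synthetic data, not the HCIL pipeline):
# correlate a noisy image with the known off-axis PSF template and look for
# peaks in the normalized detection map.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
n = 128
yy, xx = np.mgrid[:n, :n]

def gaussian_psf(cx, cy, sigma=2.0):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

template = gaussian_psf(n // 2, n // 2)
template /= np.sqrt(np.sum(template ** 2))          # unit-energy template

# Synthetic "dark hole": white noise plus one faint injected planet at (x=80, y=45).
image = rng.normal(0.0, 1.0, (n, n)) + 4.0 * gaussian_psf(80, 45)

# Matched filter = correlation with the template (convolution with its flip).
score = fftconvolve(image - image.mean(), template[::-1, ::-1], mode="same")
snr_map = (score - score.mean()) / score.std()

peak = np.unravel_index(np.argmax(snr_map), snr_map.shape)
print(f"Peak detection at (y, x) = {peak}, SNR ~ {snr_map[peak]:.1f}")
```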

  2. Toward a comprehensive, theoretical model of compassion fatigue: An integrative literature review.

    PubMed

    Coetzee, Siedine K; Laschinger, Heather K S

    2018-03-01

    This study was an integrative literature review in relation to compassion fatigue models, appraising these models, and developing a comprehensive theoretical model of compassion fatigue. A systematic search on PubMed, EbscoHost (Academic Search Premier, E-Journals, Medline, PsycINFO, Health Source Nursing/Academic Edition, CINAHL, MasterFILE Premier and Health Source Consumer Edition), gray literature, and manual searches of included reference lists was conducted in 2016. The studies (n = 11) were analyzed, and the strengths and limitations of the compassion fatigue models identified. We further built on these models through the application of the conservation of resources theory and the social neuroscience of empathy. The compassion fatigue model shows that it is not empathy that puts nurses at risk of developing compassion fatigue, but rather a lack of resources, inadequate positive feedback, and the nurse's response to personal distress. By acting on these three aspects, the risk of developing compassion fatigue can be addressed, which could improve the retention of a compassionate and committed nurse workforce. © 2017 John Wiley & Sons Australia, Ltd.

  3. Recent Advances and Field Trial Results Integrating Cosmic Ray Muon Tomography with Other Data Sources for Mineral Exploration

    NASA Astrophysics Data System (ADS)

    Schouten, D.

    2015-12-01

    CRM GeoTomography Technologies, Inc. is leading the way in applying muon tomography to discovery and definition of dense ore bodies for mineral exploration and resource estimation. We have successfully imaged volcanogenic massive sulfide (VMS) deposits at mines in North America using our suite of field-proven muon tracking detectors, and are at various stages of development for other applications. Recently we developed in-house inversion software that integrates data from assays, surface and borehole gravity, and underground muon flux measurements. We have found that the differing geophysical data sources provide complementary information and that dramatic improvements in inversion results are attained using various inversion performance metrics related to the excess tonnage of the mineral deposits, as well as their spatial extents and locations. This presentation will outline field tests of muon tomography performed by CRM Geotomography in some real world examples, and will demonstrate the effectiveness of joint muon tomography, assay and gravity inversion techniques in field tests (where data are available) and in simulations.

  4. Parallel evolution of Nitric Oxide signaling: Diversity of synthesis & memory pathways

    PubMed Central

    Moroz, Leonid L.; Kohn, Andrea B.

    2014-01-01

    The origin of NO signaling can be traced back to the origin of life, with large-scale parallel evolution of NO synthases (NOSs). Inducible-like NOSs may be the most basal prototype of all NOSs, and neuronal-like NOSs might have evolved several times from this prototype. Other enzymatic and non-enzymatic pathways for NO synthesis have been discovered using reduction of nitrites, an alternative source of NO. Diverse synthetic mechanisms can co-exist within the same cell, providing a complex NO-oxygen microenvironment tightly coupled with cellular energetics. The dissection of multiple sources of NO formation is crucial in the analysis of complex biological processes such as neuronal integration and learning mechanisms, where NO can act as a volume transmitter within memory-forming circuits. In particular, the molecular analysis of learning mechanisms (most notably in insects and gastropod molluscs) opens conceptually different perspectives to understand the logic of recruiting evolutionarily conserved pathways for novel functions. Giant, uniquely identified cells from Aplysia and related species present unique opportunities for integrative analysis of NO signaling at the single-cell level. PMID:21622160

  5. An approach to source characterization of tremor signals associated with eruptions and lahars

    NASA Astrophysics Data System (ADS)

    Kumagai, Hiroyuki; Mothes, Patricia; Ruiz, Mario; Maeda, Yuta

    2015-11-01

    Tremor signals are observed in association with eruption activity and lahar descents. Reduced displacement (D_R) derived from tremor signals has been used to quantify tremor sources. However, tremor duration is not considered in D_R, which makes it difficult to compare D_R values estimated for different tremor episodes. We propose application of the amplitude source location (ASL) method to characterize the sources of tremor signals. We used this method to estimate the tremor source location and source amplitude from high-frequency (5-10 Hz) seismic amplitudes under the assumption of isotropic S-wave radiation. We considered the source amplitude to be the maximum value during tremor. We estimated the cumulative source amplitude (I_s) as the offset value of the time-integrated envelope of the vertical seismogram of tremor corrected for geometrical spreading and medium attenuation in the 5-10-Hz band. For eruption tremor signals, we also estimated the cumulative source pressure (I_p) from an infrasonic envelope waveform corrected for geometrical spreading. We studied these parameters of tremor signals associated with eruptions and lahars and explosion events at Tungurahua volcano, Ecuador. We identified two types of eruption tremor at Tungurahua: noise-like inharmonic waveforms and harmonic oscillatory signals. We found that I_s increased linearly with increasing source amplitude for lahar tremor signals and explosion events, but I_s increased exponentially with increasing source amplitude for inharmonic eruption tremor signals. The source characteristics of harmonic eruption tremor signals differed from those of inharmonic tremor signals. We found a linear relation between I_s and I_p for both explosion events and eruption tremor. Because I_p may be proportional to the total mass involved during an eruption episode, this linear relation suggests that I_s may be useful to quantify eruption size. The I_s values we estimated for inharmonic eruption tremor were consistent with previous estimates of volumes of tephra fallout. The scaling relations among source parameters that we identified will contribute to our understanding of the dynamic processes associated with eruptions and lahars. This new approach is applicable in analyzing tremor sources in real time and may contribute to early assessment of the size of eruptions and lahars.
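
    A minimal sketch of the cumulative source amplitude described above, under assumed correction models: a 1/r body-wave geometrical-spreading factor and an exponential attenuation term with an assumed quality factor, shear-wave speed, and centre frequency. None of these values comes from the paper; the sketch only illustrates the band-pass, envelope, correction, and time-integration steps.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def cumulative_source_amplitude(trace, fs, r_m, q=50.0, beta=2000.0, f0=7.5):
            """Integrate the corrected 5-10 Hz envelope of a vertical-component trace."""
            b, a = butter(4, [5.0, 10.0], btype="band", fs=fs)   # 5-10 Hz band-pass
            envelope = np.abs(hilbert(filtfilt(b, a, trace)))    # signal envelope
            spreading = r_m                                      # assumed 1/r spreading
            attenuation = np.exp(np.pi * f0 * r_m / (q * beta))  # assumed attenuation
            corrected = envelope * spreading * attenuation
            return np.trapz(corrected, dx=1.0 / fs)              # time-integrated envelope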

  6. HydroDesktop as a Community Designed and Developed Resource for Hydrologic Data Discovery and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.

    2013-12-01

    As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology resulting in the development of many useful community software components including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface to allow 3rd party developers to create extensions and add new functionality without requiring recompiling of the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities and an OpenMI plugin provides the ability to link models. Current revision and updates to HydroDesktop include migration of core business logic to cross platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.
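
    The plugin idea described above can be sketched generically as follows; the class and method names are hypothetical and are not HydroDesktop's actual extension API, which is defined in C#. The sketch only shows how third-party data-source extensions can register against a stable interface without recompiling the host application.

        from abc import ABC, abstractmethod

        class DataSourcePlugin(ABC):
            """Contract a third-party extension implements (hypothetical interface)."""
            name: str

            @abstractmethod
            def search(self, bbox, start, end):
                """Return time-series metadata for a bounding box and date range."""

        class PluginRegistry:
            def __init__(self):
                self._plugins = {}

            def register(self, plugin):
                self._plugins[plugin.name] = plugin

            def search_all(self, bbox, start, end):
                # Fan a discovery query out to every registered data-source plugin.
                return {n: p.search(bbox, start, end) for n, p in self._plugins.items()}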

  7. Issues in Humanoid Audition and Sound Source Localization by Active Audition

    NASA Astrophysics Data System (ADS)

    Nakadai, Kazuhiro; Okuno, Hiroshi G.; Kitano, Hiroaki

    In this paper, we present an active audition system which is implemented on the humanoid robot "SIG the humanoid". The audition system for highly intelligent humanoids localizes sound sources and recognizes auditory events in the auditory scene. Active audition reported in this paper enables SIG to track sources by integrating audition, vision, and motor movements. Given the multiple sound sources in the auditory scene, SIG actively moves its head to improve localization by aligning microphones orthogonal to the sound source and by capturing the possible sound sources by vision. However, such active head movement inevitably creates motor noise. The system adaptively cancels motor noise using motor control signals and the cover acoustics. The experimental results demonstrate that active audition by integration of audition, vision, and motor control attains sound source tracking in a variety of conditions.

  8. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  9. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  10. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  11. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  12. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  13. Integrating open-source software applications to build molecular dynamics systems.

    PubMed

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-a and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  14. Indirect (source-free) integration method. I. Wave-forms from geodesic generic orbits of EMRIs

    NASA Astrophysics Data System (ADS)

    Ritter, Patxi; Aoudia, Sofiane; Spallicci, Alessandro D. A. M.; Cordier, Stéphane

    2016-12-01

    The Regge-Wheeler-Zerilli (RWZ) wave equation describes Schwarzschild-Droste black hole perturbations. The source term contains a Dirac distribution and its derivative. We have previously designed a method of integration in the time domain. It consists of a finite difference scheme where analytic expressions, dealing with the wave-function discontinuity through the jump conditions, replace the direct integration of the source and the potential. Herein, we successfully apply the same method to generic geodesic orbits of EMRI (Extreme Mass Ratio Inspiral) sources, at second order. An EMRI is a Compact Star (CS) captured by a Super-Massive Black Hole (SMBH). These are considered the best probes for testing gravitation in the strong-field regime. The gravitational waveforms and the radiated energy and angular momentum at infinity are computed and extensively compared with other methods, for different orbits (circular, elliptic, parabolic, including zoom-whirl).
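
    For orientation only, the sketch below shows a generic second-order, time-domain finite-difference update for a 1+1D wave equation with a potential and a smoothed source term. It is not the authors' jump-condition scheme (which avoids integrating the Dirac-distribution source directly); the grid sizes, potential, and source are arbitrary illustrative choices.

        import numpy as np

        nx, nt, dx = 400, 800, 0.5
        dt = 0.5 * dx                                   # respects the CFL condition
        x = np.arange(nx) * dx
        V = 0.05 / (1.0 + (x - 100.0) ** 2)             # toy potential
        source = lambda t: np.exp(-((x - 120.0) / 2.0) ** 2) * np.sin(0.2 * t)

        psi_prev = np.zeros(nx)
        psi = np.zeros(nx)
        for n in range(nt):
            lap = np.zeros(nx)
            lap[1:-1] = (psi[2:] - 2.0 * psi[1:-1] + psi[:-2]) / dx ** 2
            psi_next = 2.0 * psi - psi_prev + dt ** 2 * (lap - V * psi + source(n * dt))
            psi_prev, psi = psi, psi_next               # toy loop; boundaries left crude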

  15. Spherical-earth gravity and magnetic anomaly modeling by Gauss-Legendre quadrature integration

    NASA Technical Reports Server (NTRS)

    Von Frese, R. R. B.; Hinze, W. J.; Braile, L. W.; Luca, A. J.

    1981-01-01

    Gauss-Legendre quadrature integration is used to calculate the anomalous potential of gravity and magnetic fields and their spatial derivatives on a spherical earth. The procedure involves representation of the anomalous source as a distribution of equivalent point gravity poles or point magnetic dipoles. The distribution of equivalent point sources is determined directly from the volume limits of the anomalous body. The variable limits of integration for an arbitrarily shaped body are obtained from interpolations performed on a set of body points which approximate the body's surface envelope. The versatility of the method is shown by its ability to treat physical property variations within the source volume as well as variable magnetic fields over the source and observation surface. Examples are provided which illustrate the capabilities of the technique, including a preliminary modeling of potential field signatures for the Mississippi embayment crustal structure at 450 km.
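
    A hedged sketch of the quadrature idea described above: the body is represented by Gauss-Legendre nodes acting as equivalent point masses, and their weighted point-mass contributions are summed at the observation point. The prism geometry, density, and node count are arbitrary examples, and a flat-earth rectangular prism is used here instead of the paper's spherical-earth formulation.

        import numpy as np

        G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

        def gz_prism_gl(obs, bounds, rho, n=8):
            """Vertical gravity at obs from a rectangular prism via GL quadrature."""
            (x1, x2), (y1, y2), (z1, z2) = bounds
            t, w = np.polynomial.legendre.leggauss(n)     # nodes/weights on [-1, 1]
            sx, sy, sz = (x2 - x1) / 2, (y2 - y1) / 2, (z2 - z1) / 2
            xs, ys, zs = x1 + sx * (t + 1), y1 + sy * (t + 1), z1 + sz * (t + 1)
            gz = 0.0
            for i, xi in enumerate(xs):
                for j, yj in enumerate(ys):
                    for k, zk in enumerate(zs):
                        dx, dy, dz = xi - obs[0], yj - obs[1], zk - obs[2]
                        r3 = (dx * dx + dy * dy + dz * dz) ** 1.5
                        gz += w[i] * w[j] * w[k] * dz / r3   # point-mass kernel
            return G * rho * sx * sy * sz * gz

        # Example: gz_prism_gl((0.0, 0.0, -1.0), ((-50, 50), (-50, 50), (10, 60)), 500.0)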

  16. Integrated water assessment and modelling: A bibliometric analysis of trends in the water resource sector

    NASA Astrophysics Data System (ADS)

    Zare, Fateme; Elsawah, Sondoss; Iwanaga, Takuya; Jakeman, Anthony J.; Pierce, Suzanne A.

    2017-09-01

    There are substantial challenges facing humanity in the water and related sectors and purposeful integration of the disciplines, connected sectors and interest groups is now perceived as essential to address them. This article describes and uses bibliometric analysis techniques to provide quantitative insights into the general landscape of Integrated Water Resource Assessment and Modelling (IWAM) research over the last 45 years. Keywords, terms in titles, abstracts and the full texts are used to distinguish the 13,239 IWAM articles in journals and other non-grey literature. We identify the major journals publishing IWAM research, influential authors through citation counts, as well as the distribution and strength of source countries. Fruitfully, we find that the growth in numbers of such publications has continued to accelerate, and attention to both the biophysical and socioeconomic aspects has also been growing. On the other hand, our analysis strongly indicates that the former continue to dominate, partly by embracing integration with other biophysical sectors related to water - environment, groundwater, ecology, climate change and agriculture. In the social sciences the integration is occurring predominantly through economics, with the others, including law, policy and stakeholder participation, much diminished in comparison. We find there has been increasing attention to management and decision support systems, but a much weaker focus on uncertainty, a pervasive concern whose criticalities must be identified and managed for improving decision making. It would seem that interdisciplinary science still has a long way to go before crucial integration with the non-economic social sciences and uncertainty considerations are achieved more routinely.

  17. SIDD: A Semantically Integrated Database towards a Global View of Human Disease

    PubMed Central

    Cheng, Liang; Wang, Guohua; Li, Jie; Zhang, Tianjiao; Xu, Peigang; Wang, Yadong

    2013-01-01

    Background A number of databases have been developed to collect disease-related molecular, phenotypic and environmental features (DR-MPEs), such as genes, non-coding RNAs, genetic variations, drugs, phenotypes and environmental factors. However, each of the current databases focuses on only one or two DR-MPEs. There is an urgent demand to develop an integrated database, which can establish semantic associations among disease-related databases and link them to provide a global view of human disease at the biological level. This database, once developed, will enable researchers to query various DR-MPEs through disease, and investigate disease mechanisms from different types of data. Methodology To establish an integrated disease-associated database, disease vocabularies used in different databases are mapped to Disease Ontology (DO) through semantic matching. 4,284 and 4,186 disease terms from Medical Subject Headings (MeSH) and Online Mendelian Inheritance in Man (OMIM), respectively, are mapped to DO. Then, the relationships between DR-MPEs and diseases are extracted and merged from different source databases to reduce data redundancy. Conclusions A semantically integrated disease-associated database (SIDD) is developed, which integrates 18 disease-associated databases, for researchers to browse multiple types of DR-MPEs in a single view. A web interface allows easy navigation for querying information through browsing a disease ontology tree or searching a disease term. Furthermore, a network visualization tool using the Cytoscape Web plugin has been implemented in SIDD. It enhances the usability of SIDD when viewing the relationships between diseases and DR-MPEs. The current version of SIDD (Jul 2013) documents 4,465,131 entries relating to 139,365 DR-MPEs, and to 3,824 human diseases. The database can be freely accessed from: http://mlg.hit.edu.cn/SIDD. PMID:24146757

  18. Ground Penetrating Radar as a Contextual Sensor for Multi-Sensor Radiological Characterisation

    PubMed Central

    Ukaegbu, Ikechukwu K.; Gamage, Kelum A. A.

    2017-01-01

    Radioactive sources exist in environments or contexts that influence how they are detected and localised. For instance, the context of a moving source is different from a stationary source because of the effects of motion. The need to incorporate this contextual information in the radiation detection and localisation process has necessitated the integration of radiological and contextual sensors. The benefits of the successful integration of both types of sensors is well known and widely reported in fields such as medical imaging. However, the integration of both types of sensors has also led to innovative solutions to challenges in characterising radioactive sources in non-medical applications. This paper presents a review of such recent applications. It also identifies that these applications mostly use visual sensors as contextual sensors for characterising radiation sources. However, visual sensors cannot retrieve contextual information about radioactive wastes located in opaque environments encountered at nuclear sites, e.g., underground contamination. Consequently, this paper also examines ground-penetrating radar (GPR) as a contextual sensor for characterising this category of wastes and proposes several ways of integrating data from GPR and radiological sensors. Finally, it demonstrates combined GPR and radiation imaging for three-dimensional localisation of contamination in underground pipes using radiation transport and GPR simulations. PMID:28387706

  19. Modular Heat Exchanger With Integral Heat Pipe

    NASA Technical Reports Server (NTRS)

    Schreiber, Jeffrey G.

    1992-01-01

    A modular heat exchanger with an integral heat pipe transports heat from a source to a Stirling engine. As an alternative to heat exchangers that depend on the integrity of thousands of brazed joints, it contains only 40 brazed tubes.

  20. Integration of Multiple Genomic and Phenotype Data to Infer Novel miRNA-Disease Associations

    PubMed Central

    Zhou, Meng; Cheng, Liang; Yang, Haixiu; Wang, Jing; Sun, Jie; Wang, Zhenzhen

    2016-01-01

    MicroRNAs (miRNAs) play an important role in the development and progression of human diseases. The identification of disease-associated miRNAs will be helpful for understanding the molecular mechanisms of diseases at the post-transcriptional level. Based on different types of genomic data sources, computational methods for miRNA-disease association prediction have been proposed. However, individual sources of genomic data tend to be incomplete and noisy; therefore, the integration of various types of genomic data for inferring reliable miRNA-disease associations is urgently needed. In this study, we present a computational framework, CHNmiRD, for identifying miRNA-disease associations by integrating multiple genomic and phenotype data, including protein-protein interaction data, gene ontology data, experimentally verified miRNA-target relationships, disease phenotype information and known miRNA-disease connections. The performance of CHNmiRD was evaluated by experimentally verified miRNA-disease associations, which achieved an area under the ROC curve (AUC) of 0.834 for 5-fold cross-validation. In particular, CHNmiRD displayed excellent performance for diseases without any known related miRNAs. The results of case studies for three human diseases (glioblastoma, myocardial infarction and type 1 diabetes) showed that all of the top 10 ranked miRNAs having no known associations with these three diseases in existing miRNA-disease databases were directly or indirectly confirmed by our latest literature mining. All these results demonstrated the reliability and efficiency of CHNmiRD, and it is anticipated that CHNmiRD will serve as a powerful bioinformatics method for mining novel disease-related miRNAs and providing a new perspective into molecular mechanisms underlying human diseases at the post-transcriptional level. CHNmiRD is freely available at http://www.bio-bigdata.com/CHNmiRD. PMID:26849207

  1. Integration of Multiple Genomic and Phenotype Data to Infer Novel miRNA-Disease Associations.

    PubMed

    Shi, Hongbo; Zhang, Guangde; Zhou, Meng; Cheng, Liang; Yang, Haixiu; Wang, Jing; Sun, Jie; Wang, Zhenzhen

    2016-01-01

    MicroRNAs (miRNAs) play an important role in the development and progression of human diseases. The identification of disease-associated miRNAs will be helpful for understanding the molecular mechanisms of diseases at the post-transcriptional level. Based on different types of genomic data sources, computational methods for miRNA-disease association prediction have been proposed. However, individual sources of genomic data tend to be incomplete and noisy; therefore, the integration of various types of genomic data for inferring reliable miRNA-disease associations is urgently needed. In this study, we present a computational framework, CHNmiRD, for identifying miRNA-disease associations by integrating multiple genomic and phenotype data, including protein-protein interaction data, gene ontology data, experimentally verified miRNA-target relationships, disease phenotype information and known miRNA-disease connections. The performance of CHNmiRD was evaluated by experimentally verified miRNA-disease associations, which achieved an area under the ROC curve (AUC) of 0.834 for 5-fold cross-validation. In particular, CHNmiRD displayed excellent performance for diseases without any known related miRNAs. The results of case studies for three human diseases (glioblastoma, myocardial infarction and type 1 diabetes) showed that all of the top 10 ranked miRNAs having no known associations with these three diseases in existing miRNA-disease databases were directly or indirectly confirmed by our latest literature mining. All these results demonstrated the reliability and efficiency of CHNmiRD, and it is anticipated that CHNmiRD will serve as a powerful bioinformatics method for mining novel disease-related miRNAs and providing a new perspective into molecular mechanisms underlying human diseases at the post-transcriptional level. CHNmiRD is freely available at http://www.bio-bigdata.com/CHNmiRD.

  2. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    NASA Astrophysics Data System (ADS)

    Koulouri, Alexandra; Brookes, Mike; Rimpiläinen, Ville

    2017-01-01

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
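
    For reference, the two measurement types referred to above can be written, in generic notation (the symbols here are not taken from the paper), as line integrals of the field's tangential and normal components:

        % Longitudinal and transverse line-integral data for a 2-D vector field f
        \[
          d_{\parallel}(L) = \int_{L} \mathbf{f}(\mathbf{r}) \cdot \hat{\boldsymbol{\tau}}\, \mathrm{d}s ,
          \qquad
          d_{\perp}(L) = \int_{L} \mathbf{f}(\mathbf{r}) \cdot \hat{\mathbf{n}}\, \mathrm{d}s ,
        \]
        % where \hat{\tau} and \hat{n} are the unit tangent and normal of the integration
        % line L; the first are the longitudinal and the second the transverse measurements.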

  3. Leveraging Open Standard Interfaces in Providing Efficient Discovery, Retrieval, and Information of NASA-Sponsored Observations and Predictions

    NASA Astrophysics Data System (ADS)

    Cole, M.; Alameh, N.; Bambacus, M.

    2006-05-01

    The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.

  4. Sensory processes modulate differences in multi-component behavior and cognitive control between childhood and adulthood.

    PubMed

    Gohil, Krutika; Bluschke, Annet; Roessner, Veit; Stock, Ann-Kathrin; Beste, Christian

    2017-10-01

    Many everyday tasks require executive functions to achieve a certain goal. Quite often, this requires the integration of information derived from different sensory modalities. Children are less likely to integrate information from different modalities and, at the same time, also do not command fully developed executive functions, as compared to adults. Yet still, the role of developmental age-related effects on multisensory integration processes has not been examined within the context of multicomponent behavior (i.e., the concatenation of different executive subprocesses) until now. This is problematic because differences in multisensory integration might actually explain a significant amount of the developmental effects that have traditionally been attributed to changes in executive functioning. Using a systems neurophysiological approach combining electroencephalogram (EEG) recordings and source localization analyses, we therefore examined this question. The results show that differences in how children and adults accomplish multicomponent behavior do not solely depend on developmental differences in executive functioning. Instead, the observed developmental differences in response selection processes (reflected by the P3 ERP) were largely dependent on the complexity of integrating temporally separated stimuli from different modalities. This effect was related to activation differences in medial frontal and inferior parietal cortices. Primary perceptual gating or attentional selection processes (P1 and N1 ERPs) were not affected. The results show that differences in multisensory integration explain parts of transformations in cognitive processes between childhood and adulthood that have traditionally been attributed to changes in executive functioning, especially when these require the integration of multiple modalities during response selection. Hum Brain Mapp 38:4933-4945, 2017. © 2017 Wiley Periodicals, Inc.

  5. Segregation and Integration of Auditory Streams when Listening to Multi-Part Music

    PubMed Central

    Ragert, Marie; Fairhurst, Merle T.; Keller, Peter E.

    2014-01-01

    In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative role and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to make an assessment of the leader-follower relationship. We show that perceptually the relationship between parts is biased towards the conventional structural hierarchy in western music in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of both cognitive load, as shown through difficulty ratings and the interaction of the temporal and the structural relationship factors. Neurally, we see that the temporal relationship between parts, as one important cue for stream segregation, revealed distinct neural activity in the planum temporale. By contrast, integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus. These results support the hypothesis that the planum temporale and IPS are key structures underlying the mechanisms of segregation and integration of auditory streams, respectively. PMID:24475030

  6. Segregation and integration of auditory streams when listening to multi-part music.

    PubMed

    Ragert, Marie; Fairhurst, Merle T; Keller, Peter E

    2014-01-01

    In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative role and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to make an assessment of the leader-follower relationship. We show that perceptually the relationship between parts is biased towards the conventional structural hierarchy in western music in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of both cognitive load, as shown through difficulty ratings and the interaction of the temporal and the structural relationship factors. Neurally, we see that the temporal relationship between parts, as one important cue for stream segregation, revealed distinct neural activity in the planum temporale. By contrast, integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus. These results support the hypothesis that the planum temporale and IPS are key structures underlying the mechanisms of segregation and integration of auditory streams, respectively.

  7. Dynamic data analysis of climate and recharge conditions over time in the Edwards Aquifer, Texas

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Collins, J.; Banner, J.

    2017-12-01

    Understanding the temporal patterns in datasets related to climate, recharge, and water resource conditions is important for informing water management and policy decisions. Data analysis and pipelines for evaluating these disparate sources of information are challenging to set up and rely on emerging informatics tools to complete. This project gathers data from both historical and recent sources for the Edwards Aquifer of central Texas. The Edwards faces a unique array of challenges, as it is composed of karst limestone, is susceptible to contaminants and climate change, and is expected to supply water for a rapidly growing population. Given these challenges, new approaches to integrating data will be particularly important. Case study data from the Edwards is used to evaluate aquifer and hydrologic system conditions over time as well as to discover patterns and possible relationships across the information sources. Prior research that evaluated trends in discharge and recharge of the aquifer is revisited by considering new data from 1992-2015, and the sustainability of the Edwards as a water resource within the more recent time period is addressed. Reusable and shareable analytical data pipelines are constructed using Jupyter Notebooks and Python libraries, and an interactive visualization is implemented with the information. In addition to the data sources that are utilized for the water balance analyses, the Global Surface Water Monitoring System from the University of Minnesota, a tool that integrates a large number of satellite datasets with known surface water dynamics and machine learning, is used to evaluate water body persistence and change over time at regional scales. Preliminary results indicate that surface water bodies over the Edwards with differing areal extents are declining, excepting some dam-controlled lakes in the region. Other existing tools and machine learning applications are also considered. Results are useful to the Texas Water Research Network and provide a reproducible geoinformatics approach to integrated data analysis for water resources at regional scales.
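
    A minimal sketch of a reusable pipeline step of the kind described above: load annual recharge and discharge series and compute a smoothed water balance in a notebook-friendly function. The file name and column names are hypothetical placeholders, not the project's actual data schema.

        import pandas as pd

        def water_balance(csv_path="edwards_annual.csv", window=5):
            """Return the annual recharge-discharge balance with a centred rolling mean."""
            df = pd.read_csv(csv_path, parse_dates=["year"], index_col="year")
            df["balance"] = df["recharge_af"] - df["discharge_af"]   # acre-feet per year
            df["balance_smooth"] = df["balance"].rolling(window, center=True).mean()
            return df

        # In a Jupyter Notebook the result can be plotted directly, e.g.
        # water_balance().loc["1992":"2015", ["balance", "balance_smooth"]].plot()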

  8. The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources

    NASA Astrophysics Data System (ADS)

    Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.

    2004-12-01

    The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the software and expanding its use in the scientific community.

  9. IGR J17329-2731: The birth of a symbiotic X-ray binary

    NASA Astrophysics Data System (ADS)

    Bozzo, E.; Bahramian, A.; Ferrigno, C.; Sanna, A.; Strader, J.; Lewis, F.; Russell, D. M.; di Salvo, T.; Burderi, L.; Riggio, A.; Papitto, A.; Gandhi, P.; Romano, P.

    2018-05-01

    We report on the results of the multiwavelength campaign carried out after the discovery of the INTEGRAL transient IGR J17329-2731. The optical data collected with the SOAR telescope allowed us to identify the donor star in this system as a late M giant at a distance of 2.7 (+3.4, -1.2) kpc. The data collected quasi-simultaneously with XMM-Newton and NuSTAR showed the presence of a modulation with a period of 6680 ± 3 s in the X-ray light curves of the source. This reveals that the compact object hosted in this system is a slowly rotating neutron star. The broadband X-ray spectrum showed the presence of a strong absorption (≫10^23 cm^-2) and prominent emission lines at 6.4 and 7.1 keV. These features are usually found in wind-fed systems, in which the emission lines result from the fluorescence of the X-rays from the accreting compact object on the surrounding stellar wind. The presence of a strong absorption line around 21 keV in the spectrum suggests a cyclotron origin, thus allowing us to estimate the neutron star magnetic field as 2.4 × 10^12 G. All evidence thus suggests IGR J17329-2731 is a symbiotic X-ray binary. As no X-ray emission was ever observed from the location of IGR J17329-2731 by INTEGRAL (or other X-ray facilities) during the past 15 yr in orbit, and considering that symbiotic X-ray binaries are known to be variable but persistent X-ray sources, we concluded that INTEGRAL caught the first detectable X-ray emission from IGR J17329-2731 when the source shone as a symbiotic X-ray binary. The Swift XRT monitoring performed up to 3 months after the discovery of the source showed that it maintained a relatively stable X-ray flux and spectral properties.

  10. The role of the insula in intuitive expert bug detection in computer code: an fMRI study.

    PubMed

    Castelhano, Joao; Duarte, Isabel C; Ferreira, Carlos; Duraes, Joao; Madeira, Henrique; Castelo-Branco, Miguel

    2018-05-09

    Software programming is a complex and relatively recent human activity, involving the integration of mathematical, recursive thinking and language processing. The neural correlates of this recent human activity are still poorly understood. Error monitoring during this type of task, requiring the integration of language, logical symbol manipulation and other mathematical skills, is particularly challenging. We therefore aimed to investigate the neural correlates of decision-making during source code understanding and mental manipulation in professional participants with high expertise. The present fMRI study directly addressed error monitoring during source code comprehension, expert bug detection and decision-making. We used C code, which triggers the same sort of processing irrespective of the native language of the programmer. We discovered a distinct role for the insula in bug monitoring and detection and a novel connectivity pattern that goes beyond the expected activation pattern evoked by source code understanding in semantic language and mathematical processing regions. Importantly, insula activity levels were critically related to the quality of error detection, involving intuition, as signalled by reported initial bug suspicion, prior to final decision and bug detection. Activity in this salience network (SN) region evoked by bug suspicion was predictive of bug detection precision, suggesting that it encodes the quality of the behavioral evidence. Connectivity analysis provided evidence for top-down circuit "reutilization" stemming from anterior cingulate cortex (BA32), a core region in the SN that evolved for complex error monitoring such as required for this type of recent human activity. Cingulate (BA32) and anterolateral (BA10) frontal regions causally modulated decision processes in the insula, which in turn was related to activity of math processing regions in early parietal cortex. In other words, earlier brain regions used during evolution for other functions seem to be reutilized in a top-down manner for a new complex function, in an analogous manner as described for other cultural creations such as reading and literacy.

  11. Mapping Phenotypic Information in Heterogeneous Textual Sources to a Domain-Specific Terminological Resource

    PubMed Central

    Ananiadou, Sophia

    2016-01-01

    Biomedical literature articles and narrative content from Electronic Health Records (EHRs) both constitute rich sources of disease-phenotype information. Phenotype concepts may be mentioned in text in multiple ways, using phrases with a variety of structures. This variability stems partly from the different backgrounds of the authors, but also from the different writing styles typically used in each text type. Since EHR narrative reports and literature articles contain different but complementary types of valuable information, combining details from each text type can help to uncover new disease-phenotype associations. However, the alternative ways in which the same concept may be mentioned in each source constitutes a barrier to the automatic integration of information. Accordingly, identification of the unique concepts represented by phrases in text can help to bridge the gap between text types. We describe our development of a novel method, PhenoNorm, which integrates a number of different similarity measures to allow automatic linking of phenotype concept mentions to known concepts in the UMLS Metathesaurus, a biomedical terminological resource. PhenoNorm was developed using the PhenoCHF corpus—a collection of literature articles and narratives in EHRs, annotated for phenotypic information relating to congestive heart failure (CHF). We evaluate the performance of PhenoNorm in linking CHF-related phenotype mentions to Metathesaurus concepts, using a newly enriched version of PhenoCHF, in which each phenotype mention has an expert-verified link to a concept in the UMLS Metathesaurus. We show that PhenoNorm outperforms a number of alternative methods applied to the same task. Furthermore, we demonstrate PhenoNorm’s wider utility, by evaluating its ability to link mentions of various other types of medically-related information, occurring in texts covering wider subject areas, to concepts in different terminological resources. We show that PhenoNorm can maintain performance levels, and that its accuracy compares favourably to other methods applied to these tasks. PMID:27643689
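
    The combination-of-similarity-measures idea can be sketched as follows; the two measures, the weights, and the candidate list are purely illustrative and are not PhenoNorm's actual components or the UMLS Metathesaurus lookup it performs.

        from difflib import SequenceMatcher

        def char_sim(a, b):
            """Character-level similarity between two strings."""
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def token_sim(a, b):
            """Jaccard overlap of the word tokens of two strings."""
            ta, tb = set(a.lower().split()), set(b.lower().split())
            return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

        def link_mention(mention, candidates, w_char=0.5, w_token=0.5):
            """Pick the candidate concept name with the highest combined similarity."""
            return max(candidates, key=lambda c: w_char * char_sim(mention, c)
                                                 + w_token * token_sim(mention, c))

        # Example: link_mention("shortness of breath", ["Dyspnea", "Breath shortness", "Edema"])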

  12. Sources of motivation, interpersonal conflict management styles, and leadership effectiveness: a structural model.

    PubMed

    Barbuto, John E; Xu, Ye

    2006-02-01

    A sample of 126 leaders and 624 employees was used to test the relationship between leaders' sources of motivation and conflict management styles and how these variables influence leadership effectiveness. Five sources of motivation measured by the Motivation Sources Inventory were tested: intrinsic process, instrumental, self-concept external, self-concept internal, and goal internalization. These sources of work motivation were related to Rahim's modes of interpersonal conflict management (dominating, avoiding, obliging, compromising, and integrating) and to perceived leadership effectiveness. A structural equation model tested leaders' conflict management styles and leadership effectiveness based upon different sources of work motivation. The model explained variance for obliging (65%), dominating (79%), avoiding (76%), and compromising (68%), but explained little variance for integrating (7%). The model explained only 28% of the variance in leader effectiveness.

  13. Selling Health to the Distracted: Consumer Responses to Source Credibility and Ad Appeal Type in a Direct-to-Consumer Advertisement.

    PubMed

    Lemanski, Jennifer L; Villegas, Jorge

    2015-01-01

    Since 1997, when the U.S. Food and Drug Administration first allowed prescription drug companies to release ads directly targeting the public, direct-to-consumer (DTC) advertising has become an integral part of the pharmaceutical industry marketing toolkit, reaching over $4 billion in 2005. In an experiment that manipulated cognitive load (a task that occupies a subject's memory with an unrelated activity), source credibility, and advertising appeal type (affective or cognitive), attitude toward the ad was measured for a print DTC meningitis vaccine ad. Main effects of source credibility and advertising appeal type on attitude toward the ad were found, and interactions between the manipulated variables were apparent when individual difference variables related to a specific illness (vaccination history, living in a dorm, and family members or friends who had suffered the illness) were taken into account.

  14. Contraindications for superficial heat and therapeutic ultrasound: do sources agree?

    PubMed

    Batavia, Mitchell

    2004-06-01

    To determine the amount of agreement among general rehabilitation sources for both superficial heating and therapeutic ultrasound contraindications. English-language textbook and peer-reviewed journal sources, from January 1992 to July 2002. Searches of computerized databases (HealthSTAR, CINAHL, MEDLINE, Embase) as well as Library of Congress Online Catalogs, Books in Print, and AcqWeb's Directory of Publishers and Venders. Sources were excluded if they (1) were published before 1992, (2) failed to address general rehabilitation audiences, or (3) were identified as a researcher's related publication with similar information on the topic. Type and number of contraindications, type of audience, year of publication, number of references, rationales, and alternative treatment strategies. Eighteen superficial heat and 20 ultrasound sources identified anywhere from 5 to 22 and 9 to 36 contraindications/precautions, respectively. Agreement among sources was generally high but ranged from 11% to 95%, with lower agreement noted for pregnancy, metal implants, edema, skin integrity, and cognitive/communicative concerns. Seventy-two percent of superficial heat sources and 25% of ultrasound sources failed to reference at least 1 contraindication claim. Agreement among contraindication sources was generally good for both superficial heat and therapeutic ultrasound. Sources varied with regard to the number of contraindications, references, and rationales cited. Greater reliance on objective data and standardized classification systems may serve to develop more uniform guidelines for superficial heat and therapeutic ultrasound.

  15. Ozone Depletion, Greenhouse Gases, and Climate Change. Proceedings of a Joint Symposium by the Board on Atmospheric Sciences and Climate and the Committee on Global Change, National Research Council (Washington, D.C., March 23, 1988).

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    The motivation for the organization of this symposium was the accumulation of evidence from many sources, both short- and long-term, that the global climate is in a state of change. Data which defy integrated explanation including temperature, ozone, methane, precipitation and other climate-related trends have presented troubling problems for…

  16. Ophthalmic laser system integrated with speckle variance optical coherence tomography for real-time temperature monitoring

    NASA Astrophysics Data System (ADS)

    Lee, Soohyun; Lee, Changho; Cheon, Gyeongwoo; Kim, Jongmin; Jo, Dongki; Lee, Jihoon; Kang, Jin U.

    2018-02-01

    A commercial ophthalmic laser system (R;GEN, Lutronic Corp) was integrated with a swept-source optical coherence tomography (OCT) imaging system for real-time tissue temperature monitoring. M-scan OCT images were acquired during laser-pulse irradiation, and speckle variance OCT (svOCT) images were analyzed to deduce temporal signal variations related to the tissue temperature change caused by the laser pulse. A phantom study shows that the svOCT magnitude increases abruptly after laser-pulse irradiation and recovers exponentially, and that the peak intensity of the svOCT image is linearly dependent on the pulse laser energy until it saturates. A study using bovine iris also showed that the signal variation depends on the laser-pulse irradiation, and the variation was more distinct at higher energy levels.
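
    The speckle variance quantity tracked above is, in essence, a per-pixel variance across repeated scans at the same location; the following sketch assumes the repeats are stacked along the first array axis, which is an assumption about data layout rather than the authors' processing chain.

        import numpy as np

        def speckle_variance(frames):
            """frames: array of shape (N, depth[, width]) of repeated OCT intensity scans."""
            return np.var(frames, axis=0)          # per-pixel variance across N repeats

        def sv_time_course(groups):
            """Peak svOCT magnitude per group of repeats, e.g. around a laser pulse."""
            return np.array([speckle_variance(g).max() for g in groups])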

  17. Magnetic fringe field interference between the quadrupole and corrector magnets in the CSNS/RCS

    NASA Astrophysics Data System (ADS)

    Yang, Mei; Kang, Wen; Deng, Changdong; Sun, Xianjing; Li, Li; Wu, Xi; Gong, Lingling; Cheng, Da; Zhu, Yingshun; Chen, Fusan

    2017-03-01

    The Rapid Cycling Synchrotron (RCS) of the China Spallation Neutron Source (CSNS) employs large-aperture quadrupole and corrector magnets with small aspect ratios and relatively short iron-to-iron separations, so the fringe field interference becomes significant, resulting in reduced integral field strength and extra field harmonics. We have performed 3D magnetic field simulations to investigate the magnetic field interference in the magnet assemblies and made some adjustments to the magnet arrangement. Fourier analysis is used to quantify the integral gradient reduction and the field harmonic changes of the quadrupole magnets. Magnetic field measurements were undertaken to verify the simulation results. The simulation details and the major results are presented in this paper.
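
    As a hedged illustration of the Fourier step mentioned above, the sketch below extracts harmonic amplitudes from a field map sampled at equally spaced angles on a reference circle; the sampling scheme, normalization, and naming are assumptions for illustration rather than the authors' analysis code.

        import numpy as np

        def field_harmonics(b_values, n_max=10):
            """b_values: field sampled at equally spaced angles on a reference radius."""
            coeffs = np.fft.rfft(np.asarray(b_values)) / len(b_values)
            # n = 1 is the dipole term, n = 2 the quadrupole, n = 3 the sextupole, etc.
            return {n: 2.0 * abs(coeffs[n]) for n in range(1, n_max + 1)}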

  18. The potential of the inventory of learning styles to study students' learning patterns in three types of medical curricula.

    PubMed

    Van der Veken, J; Valcke, M; Muijtjens, A; De Maeseneer, J; Derese, A

    2008-01-01

    The introduction of innovative curricular designs can be evaluated by scrutinizing the learning patterns students use. To study the potential of Vermunt's Inventory of Learning Styles (ILS) in detecting differences in student learning patterns in different medical curricula. Cross-sectional between-subjects comparison of ILS scores in third-year medical students in a conventional, an integrated contextual, and a PBL curriculum using one-way post hoc ANOVA. The response rate was 85%: 197 conventional, 130 integrated contextual, and 301 PBL students. The results show a differential impact from the three curricula. In relation to processing strategies, the students in the problem-based curriculum showed less rote learning and rehearsing, greater variety in sources of knowledge used, and less ability to express study content in a personal manner than did the students in the conventional curriculum. The students of the integrated contextual curriculum showed more structuring of subject matter by integrating different aspects into a whole. In relation to regulation strategies, the students in the problem-based curriculum showed significantly more self-regulation of learning content, and the students in the integrated contextual curriculum showed lower levels of regulation. As to learning orientations, the students in the problem-based curriculum showed less ambivalence and the students of the conventional curriculum were less vocationally oriented. The study provides empirical support for expected effects of traditional and innovative curricula that thus far had not been well supported by empirical studies.

  19. Parameterizing microphysical effects on variances and covariances of moisture and heat content using a multivariate probability density function: a study with CLUBB (tag MVCS)

    DOE PAGES

    Griffin, Brian M.; Larson, Vincent E.

    2016-11-25

    Microphysical processes, such as the formation, growth, and evaporation of precipitation, interact with variability and covariances (e.g., fluxes) in moisture and heat content. For instance, evaporation of rain may produce cold pools, which in turn may trigger fresh convection and precipitation. These effects are usually omitted or else crudely parameterized at subgrid scales in weather and climate models. A more formal approach is pursued here, based on predictive, horizontally averaged equations for the variances, covariances, and fluxes of moisture and heat content. These higher-order moment equations contain microphysical source terms. The microphysics terms can be integrated analytically, given a suitably simple warm-rain microphysics scheme and an approximate assumption about the multivariate distribution of cloud-related and precipitation-related variables. Performing the integrations provides exact expressions within an idealized context. A large-eddy simulation (LES) of a shallow precipitating cumulus case is performed here, and it indicates that the microphysical effects on (co)variances and fluxes can be large. In some budgets and altitude ranges, they are dominant terms. The analytic expressions for the integrals are implemented in a single-column, higher-order closure model. Interactive single-column simulations agree qualitatively with the LES. The analytic integrations form a parameterization of microphysical effects in their own right, and they also serve as benchmark solutions that can be compared to non-analytic integration methods.

  20. Integrated Access to Heliospheric and Magnetospheric Data

    NASA Astrophysics Data System (ADS)

    Merka, J.; Szabo, A.; Narock, T. W.

    2007-05-01

    Heliospheric and magnetospheric data are provided by a variety of diverse sources. For space physics scientists, knowing that such data sources exist and where they are located are only the first hurdles to overcome before they can utilize the data for research. As a solution, the NASA Heliophysics Division has established a group of virtual observatories (VOs) to provide the scientific community with integrated access to well documented data and related services. The VOs are organized by scientific discipline and yet their essential characteristic is cross-discipline data discovery and exchange. In this talk, we will demonstrate the architecture and features of two distributed data systems, the Virtual Heliospheric Observatory (VHO) and the Virtual Magnetospheric Observatory at NASA Goddard Space Flight Center (VMO/G). The VHO and VMO/G are designed to share most of the components to facilitate faster development and to ease communication between the two VxOs. Since different communities are served by the two observatories, slightly, and sometimes even significantly, different terms and expectations must be accommodated and correctly processed. In our approach the interfaces are tuned for a particular community while the standard SPASE data model is employed internally. Together with other VxOs, we are also developing a standard query language for metadata exchange among the VxOs, data providers, and VxO-related services. Specific examples will be given. http://vho.nasa.gov
