Sample records for combining multiple sources

  1. Ignition probability of polymer-bonded explosives accounting for multiple sources of material stochasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S.; Barua, A.; Zhou, M., E-mail: min.zhou@me.gatech.edu

    2014-05-07

    Accounting for the combined effect of multiple sources of stochasticity in material attributes, we develop an approach that computationally predicts the probability of ignition of polymer-bonded explosives (PBXs) under impact loading. The probabilistic nature of the specific ignition processes is assumed to arise from two sources of stochasticity. The first source involves random variations in material microstructural morphology; the second source involves random fluctuations in grain-binder interfacial bonding strength. The effect of the first source of stochasticity is analyzed with multiple sets of statistically similar microstructures and constant interfacial bonding strength. Subsequently, each of the microstructures in the multiple sets is assigned multiple instantiations of randomly varying grain-binder interfacial strengths to analyze the effect of the second source of stochasticity. Critical hotspot size-temperature states reaching the threshold for ignition are calculated through finite element simulations that explicitly account for microstructure and bulk and interfacial dissipation to quantify the time to criticality (t_c) of individual samples, allowing the probability distribution of the time to criticality that results from each source of stochastic variation for a material to be analyzed. Two probability superposition models are considered to combine the effects of the multiple sources of stochasticity. The first is a parallel and series combination model, and the second is a nested probability function model. Results show that the nested Weibull distribution provides an accurate description of the combined ignition probability. The approach developed here represents a general framework for analyzing the stochasticity in material behavior that arises out of multiple types of uncertainty associated with the structure, design, synthesis and processing of materials.
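
The series ("weakest-link") half of the parallel-and-series superposition mentioned in the abstract can be sketched with two Weibull ignition-probability distributions. The shape and scale parameters below are purely illustrative assumptions, not values from the paper:

```python
import math

def weibull_cdf(t, shape, scale):
    """Weibull CDF: probability of ignition by time t."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def series_combination(p1, p2):
    """Weakest-link (series) superposition: ignition occurs if either
    source of stochasticity triggers it."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

# Hypothetical Weibull parameters for the two sources of stochasticity
# (microstructural morphology vs. interfacial bonding strength).
t = 2.0  # illustrative time value
p_morph = weibull_cdf(t, shape=2.5, scale=3.0)
p_bond = weibull_cdf(t, shape=1.8, scale=4.0)
p_combined = series_combination(p_morph, p_bond)
```

The combined probability always dominates either single-source probability, which is the qualitative behavior the superposition models are meant to capture.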

  2. Developing a compact multiple laser diode combiner with a single fiber stub output for handheld IoT devices

    NASA Astrophysics Data System (ADS)

    Lee, Minseok; June, Seunghyeok; Kim, Sehwan

    2018-01-01

    Many biomedical applications require an efficient combination and localization of multiple discrete light sources (e.g., fluorescence and absorbance imaging). We present a compact 6-channel combiner that couples the output of independent solid-state light sources into a single 400-μm-diameter fiber stub for handheld Internet of Things (IoT) devices. We demonstrate average coupling efficiencies > 80% for each of the 6 laser diodes installed into the prototype. The design supports the use of continuous-wave and intensity-modulated laser diodes. This fiber-stub-type beam combiner could be used to construct custom multi-wavelength sources for tissue oximeters, microscopes and molecular imaging technologies. To validate its suitability, we applied the developed fiber-stub-type beam combiner to a multi-wavelength light source for a handheld IoT device and demonstrated its feasibility for smart healthcare through a tumor-mimicking silicone phantom.

  3. Single-channel mixed signal blind source separation algorithm based on multiple ICA processing

    NASA Astrophysics Data System (ADS)

    Cheng, Xiefeng; Li, Ji

    2017-01-01

    Taking the separation of the fetal heart sound signal from the mixed signal acquired with an electronic stethoscope as the research background, this paper proposes a single-channel mixed-signal blind source separation algorithm based on multiple ICA processing. First, empirical mode decomposition (EMD) decomposes the single-channel mixed signal into multiple orthogonal signal components, which are then processed by ICA. The resulting independent signal components are called independent sub-components of the mixed signal. By combining these independent sub-components with the single-channel mixed signal, the single channel is expanded into multiple channels, which turns the under-determined blind source separation problem into a well-posed blind source separation problem. An estimate of the source signal is then obtained by further ICA processing. Finally, if the separation is unsatisfactory, the previous separation result is combined with the single-channel mixed signal and the ICA processing is repeated until the desired estimate of the source signal is obtained. Simulation results show that the algorithm separates single-channel mixed physiological signals effectively.
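
The ICA demixing step at the core of the algorithm can be illustrated with a minimal FastICA implementation. The signals and mixing matrix below are synthetic stand-ins, and the EMD-based channel expansion described in the abstract is replaced here by an explicit two-channel mixture:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)
# Synthetic stand-ins for two independent physiological components.
S = np.vstack([np.sin(2 * np.pi * 1.0 * t),
               np.sign(np.sin(2 * np.pi * 2.3 * t))])

# Two-channel observation via a mixing matrix (the paper instead expands
# one channel into several via EMD sub-components).
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
X = X - X.mean(axis=1, keepdims=True)

# Whitening.
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# Symmetric FastICA with a tanh contrast function.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = (G @ Z.T) / n - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt  # symmetric decorrelation: (W W^T)^(-1/2) W

Y = W @ Z  # estimated sources, recovered up to permutation and sign
```

Each row of `Y` should correlate strongly (up to sign) with one of the original sources.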

  4. Simultaneous Exposure to Multiple Air Pollutants Influences Alveolar Epithelial Cell Ion Transport

    EPA Science Inventory

    Purpose. Air pollution sources generally release multiple pollutants simultaneously and yet, research has historically focused on the source-to-health linkages of individual air pollutants. We recently showed that exposure of alveolar epithelial cells to a combination of particul...

  5. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provides a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  6. Combined mining: discovering informative knowledge in complex data.

    PubMed

    Cao, Longbing; Zhang, Huaifeng; Zhao, Yanchang; Luo, Dan; Zhang, Chengqi

    2011-06-01

    Enterprise data mining applications often involve complex data such as multiple large heterogeneous data sources, user preferences, and business impact. In such situations, a single method or one-step mining is often limited in discovering informative knowledge. It would also be very time- and space-consuming, if not impossible, to join relevant large data sources for mining patterns consisting of multiple aspects of information. It is crucial to develop effective approaches for mining patterns combining necessary information from multiple relevant business lines, catering for real business settings and decision-making actions rather than just providing a single line of patterns. Recent years have seen increasing efforts on mining more informative patterns, e.g., integrating frequent pattern mining with classification to generate frequent-pattern-based classifiers. Rather than presenting a specific algorithm, this paper builds on our existing works and proposes combined mining as a general approach to mining for informative patterns combining components from either multiple data sets or multiple features or by multiple methods on demand. We summarize general frameworks, paradigms, and basic processes for multifeature combined mining, multisource combined mining, and multimethod combined mining. Novel types of combined patterns, such as incremental cluster patterns, can result from such frameworks, which cannot be directly produced by the existing methods. A set of real-world case studies has been conducted to test the frameworks, with some of them briefed in this paper. They identify combined patterns for informing government debt prevention and improving government service objectives, which show the flexibility and instantiation capability of combined mining in discovering informative knowledge in complex data.

  7. Passive radio frequency peak power multiplier

    DOEpatents

    Farkas, Zoltan D.; Wilson, Perry B.

    1977-01-01

    Peak power multiplication of a radio frequency source is achieved by simultaneously charging two high-Q resonant microwave cavities, applying the source output through a directional coupler to the cavities and then reversing the phase of the source power to the coupler, thereby permitting the power in the cavities to discharge simultaneously through the coupler to the load in combination with power from the source, so that the peak power applied to the load is a multiple of the source peak power.

  8. The Chandra Source Catalog : Automated Source Correlation

    NASA Astrophysics Data System (ADS)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed different numbers of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
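
The straightforward case described above (matching single detections by position and folding them into the master list) can be sketched as follows. The matching radius and the simple running-average position update are assumptions for illustration, not the CSC's actual matching criteria:

```python
import math

def separation_deg(s1, s2):
    """Approximate angular separation in degrees for small offsets."""
    dra = (s1["ra"] - s2["ra"]) * math.cos(math.radians(s1["dec"]))
    ddec = s1["dec"] - s2["dec"]
    return math.hypot(dra, ddec)

def merge_detections(master, detections, radius=0.001):
    """Fold a new observation's detections into a master source list.
    A detection within `radius` degrees of an existing master source
    updates that source; otherwise it becomes a new master entry."""
    for det in detections:
        for src in master:
            if separation_deg(src, det) < radius:
                n = src["n_obs"]
                src["ra"] = (src["ra"] * n + det["ra"]) / (n + 1)
                src["dec"] = (src["dec"] * n + det["dec"]) / (n + 1)
                src["n_obs"] = n + 1
                break
        else:
            master.append({"ra": det["ra"], "dec": det["dec"], "n_obs": 1})
    return master

master = [{"ra": 10.0000, "dec": -5.0000, "n_obs": 1}]
new_obs = [{"ra": 10.0002, "dec": -5.0001},   # re-detection of source 1
           {"ra": 11.5000, "dec": -5.2000}]   # previously unseen source
master = merge_detections(master, new_obs)
```

The harder cases in the abstract (one large detection overlapping several small ones) require resolution logic beyond this nearest-match sketch.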

  9. Intelligent power management in a vehicular system with multiple power sources

    NASA Astrophysics Data System (ADS)

    Murphey, Yi L.; Chen, ZhiHang; Kiliaris, Leonidas; Masrur, M. Abul

    This paper presents an optimal online power management strategy applied to a vehicular power system that contains multiple power sources and handles widely fluctuating load requests. The optimal online power management strategy is developed using machine learning and fuzzy logic. A machine learning algorithm has been developed to learn the knowledge about minimizing power loss in a Multiple Power Sources and Loads (M_PS&LD) system. The algorithm exploits the fact that different power sources used to deliver a load request have different power losses under different vehicle states. The machine learning algorithm is used to train an intelligent online fuzzy power controller, FPC_MPS, that has the capability of finding combinations of power sources that minimize power losses while satisfying a given set of system and component constraints during a drive cycle. The FPC_MPS was implemented in two simulated systems, a power system of four power sources, and a vehicle system of three power sources. Experimental results show that the proposed machine learning approach combined with fuzzy control is a promising technology for intelligent vehicle power management in a M_PS&LD power system.
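
A brute-force baseline for the subset-selection problem the fuzzy controller learns can be sketched directly. The source capacities, loss fractions, and proportional load split below are all hypothetical; the paper learns the allocation with machine learning and fuzzy logic rather than enumerating subsets:

```python
from itertools import combinations

# Hypothetical sources: (name, max power in kW, loss fraction at delivery).
SOURCES = [("battery", 30.0, 0.05), ("generator", 50.0, 0.12),
           ("ultracap", 15.0, 0.03), ("fuel_cell", 40.0, 0.08)]

def best_combination(load_kw):
    """Enumerate every source subset able to meet the load and pick the
    one with the smallest total loss, splitting the load proportionally
    to capacity (a simplifying assumption)."""
    best, best_loss = None, float("inf")
    for r in range(1, len(SOURCES) + 1):
        for combo in combinations(SOURCES, r):
            capacity = sum(p for _, p, _ in combo)
            if capacity < load_kw:
                continue  # this subset cannot satisfy the request
            loss = sum(load_kw * (p / capacity) * f for _, p, f in combo)
            if loss < best_loss:
                best, best_loss = combo, loss
    return best, best_loss

combo, loss = best_combination(35.0)
```

Exhaustive enumeration like this is only feasible offline for small systems, which is one motivation for training an online controller instead.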

  10. Site Features

    EPA Pesticide Factsheets

    This dataset consists of various site features from multiple Superfund sites in U.S. EPA Region 8. These data were acquired from multiple sources at different times and were combined into one region-wide layer.

  11. Quantifying seasonal shifts in nitrogen sources to Oregon estuaries using a transport model combined with stable isotopes

    EPA Science Inventory

    Identifying the sources of dissolved inorganic nitrogen (DIN) in estuaries is complicated by the multiple sources, temporal variability in inputs, and variations in transport. We used a hydrodynamic model to simulate the transport and uptake of three sources of DIN (oceanic, riv...

  12. 40 CFR 437.43 - Effluent limitations attainable by the application of the best conventional pollutant control...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... § 437.42(d). (e) Combined waste receipts from subparts B and C of this part: Limitations for BOD5, O&G... CENTRALIZED WASTE TREATMENT POINT SOURCE CATEGORY Multiple Wastestreams § 437.43 Effluent limitations... combines treated or untreated wastes from subparts A, B, or C of this part may be subject to Multiple...

  13. An evaluation of talker localization based on direction of arrival estimation and statistical sound source identification

    NASA Astrophysics Data System (ADS)

    Nishiura, Takanobu; Nakamura, Satoshi

    2002-11-01

    Capturing distant-talking speech with high quality is very important for a hands-free speech interface. A microphone array is an ideal candidate for this purpose. However, this approach requires localizing the target talker. Conventional talker localization algorithms in multiple-sound-source environments not only have difficulty localizing the multiple sound sources accurately, but also have difficulty localizing the target talker among known multiple sound source positions. To cope with these problems, we propose a new talker localization algorithm consisting of two algorithms. One is a DOA (direction of arrival) estimation algorithm for multiple sound source localization based on the CSP (cross-power spectrum phase) coefficient addition method. The other is a statistical sound source identification algorithm based on a GMM (Gaussian mixture model) for localizing the target talker position among the localized multiple sound sources. In this paper, we particularly focus on the talker localization performance based on the combination of these two algorithms with a microphone array. We conducted evaluation experiments in real noisy reverberant environments. As a result, we confirmed that multiple sound signals can be identified accurately as "speech" or "non-speech" by the proposed algorithm. [Work supported by ATR and MEXT of Japan.]
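
The CSP (cross-power spectrum phase) coefficient is the quantity known elsewhere as GCC-PHAT: the cross-spectrum is normalized to unit magnitude so that only phase (i.e., delay) information survives. A minimal two-microphone delay estimate, with synthetic signals, might look like:

```python
import numpy as np

def csp_delay(x1, x2):
    """Estimate the inter-microphone delay (in samples) from the
    cross-power spectrum phase (CSP / GCC-PHAT) coefficient."""
    n = len(x1) + len(x2)
    X1 = np.fft.rfft(x1, n)
    X2 = np.fft.rfft(x2, n)
    cross = X1 * np.conj(X2)
    cross /= np.abs(cross) + 1e-12        # keep only the phase
    csp = np.fft.irfft(cross, n)
    lag = int(np.argmax(csp))
    return lag if lag < n // 2 else lag - n  # map wrap-around to negative lags

rng = np.random.default_rng(1)
s = rng.standard_normal(4096)
delay = 7
x1 = np.concatenate([np.zeros(delay), s])  # mic 1 hears the source 7 samples late
x2 = np.concatenate([s, np.zeros(delay)])
```

In the paper, such per-pair CSP coefficients are summed across microphone pairs to sharpen the DOA estimate for multiple sources.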

  14. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective: This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods: We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results: At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion: Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853
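
The audit of relation combinations can be sketched as a simple classifier over the set of relations asserted between a pair of terms. The relation labels, the list of contradictory pairs, and the concept identifiers below are hypothetical placeholders, not the UMLS's actual relation codes:

```python
# Hypothetical relation labels; the UMLS uses its own codes (e.g., RB/RN, PAR/CHD).
CONTRADICTORY = {frozenset({"broader", "narrower"}),
                 frozenset({"parent", "child"})}

def audit(relations):
    """Classify the multiple relations asserted between a pair of terms as
    homogeneous (all identical), contradictory (a known opposing pair
    co-occurs), or merely heterogeneous."""
    rels = set(relations)
    if len(rels) == 1:
        return "homogeneous"
    if any(pair <= rels for pair in CONTRADICTORY):
        return "contradictory"
    return "heterogeneous"

pairs = {
    ("C0000001", "C0000002"): ["broader", "narrower"],
    ("C0000003", "C0000004"): ["parent", "parent"],
    ("C0000005", "C0000006"): ["parent", "mapped_to"],
}
results = {pair: audit(rels) for pair, rels in pairs.items()}
```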

  15. NicoLase—An open-source diode laser combiner, fiber launch, and sequencing controller for fluorescence microscopy

    PubMed Central

    Walsh, James; Böcking, Till; Gaus, Katharina

    2017-01-01

    Modern fluorescence microscopy requires software-controlled illumination sources with high power across a wide range of wavelengths. Diode lasers meet the power requirements and combining multiple units into a single fiber launch expands their capability across the required spectral range. We present the NicoLase, an open-source diode laser combiner, fiber launch, and software sequence controller for fluorescence microscopy and super-resolution microscopy applications. Two configurations are described, giving four or six output wavelengths and one or two single-mode fiber outputs, with all CAD files, machinist drawings, and controller source code openly available. PMID:28301563

  16. Use of ultrasonic array method for positioning multiple partial discharge sources in transformer oil.

    PubMed

    Xie, Qing; Tao, Junhan; Wang, Yongqiang; Geng, Jianghai; Cheng, Shuyi; Lü, Fangcheng

    2014-08-01

    Fast and accurate positioning of partial discharge (PD) sources in transformer oil is very important for the safe, stable operation of power systems because it allows timely elimination of insulation faults. There is usually more than one PD source once an insulation fault occurs in the transformer oil. This study, which has both theoretical and practical significance, proposes a method of identifying multiple PD sources in the transformer oil. The method combines the two-sided correlation transformation algorithm for broadband signal focusing with the modified Gerschgorin disk estimator. The multiple signal classification (MUSIC) method is used to determine the directions of arrival of signals from multiple PD sources. The ultrasonic array positioning method is based on multi-platform direction finding and global optimization searching. Both a 4 × 4 square planar ultrasonic sensor array and an ultrasonic array detection platform were built to test the method of identifying and positioning multiple PD sources. The obtained results verify the validity and the engineering practicability of this method.

  17. Density estimation in tiger populations: combining information for strong inference

    USGS Publications Warehouse

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.

    2012-01-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
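
As a crude stand-in for the joint Bayesian spatial capture-recapture model, inverse-variance weighting of the two single-source estimates reported above happens to land close to the joint estimate. This is only an illustration of why combining sources tightens the estimate, not the paper's method:

```python
def inverse_variance_pool(estimates):
    """Combine independent (mean, sd) estimates by inverse-variance
    weighting; the pooled sd is always smaller than either input sd."""
    weights = [1.0 / sd ** 2 for _, sd in estimates]
    mean = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
    sd = (1.0 / sum(weights)) ** 0.5
    return mean, sd

# Single-source posterior summaries from the abstract (tigers/100 km^2).
photo = (12.02, 3.02)
fecal = (6.65, 2.37)
mean, sd = inverse_variance_pool([photo, fecal])
```

The pooled value comes out near 8.7 ± 1.9, comparable to the joint model's reported 8.5 ± 1.95, though the full model also shares spatial information the pooling ignores.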

  18. Density estimation in tiger populations: combining information for strong inference.

    PubMed

    Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W

    2012-07-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.

  19. Collaborative Research: Atmospheric Pressure Microplasma Chemistry-Photon Synergies Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graves, David

    Combining the effects of low-temperature, atmospheric-pressure microplasmas and microplasma photon sources greatly expands the range of applications of each. The plasma sources create active chemical species, and these can be activated further by the addition of photons and associated photochemistry. There are many ways to combine the effects of plasma chemistry and photochemistry, especially if multiple phases are present. The project combines the construction of appropriate experimental test systems, various spectroscopic diagnostics, and mathematical modeling.

  20. A Survey of Insider Attack Detection Research

    DTIC Science & Technology

    2008-08-25

    modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through...forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration...network level. Schultz pointed out that no one approach will work but that solutions need to be based on multiple sensors to be able to find any combination

  1. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    NASA Astrophysics Data System (ADS)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.

  2. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
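
The view of calibration as estimation can be sketched with a toy structural model fitted by grid search and then confronted with data it was not fitted to. The model, the data, and the parameter range below are all hypothetical:

```python
# Calibration as estimation: choose the parameter value that minimizes the
# squared disagreement between model outputs and calibration targets.

def model(rate, t):
    """Toy structural model: exponential growth from a known baseline."""
    return 100.0 * (1.0 + rate) ** t

observed = [(1, 105.1), (2, 110.3), (3, 115.9)]  # calibration targets
holdout = (5, 127.8)                              # reserved for validation

def loss(rate):
    return sum((model(rate, t) - y) ** 2 for t, y in observed)

# Simple grid search over the assumed plausible parameter range 0..0.1.
rates = [i / 10000 for i in range(0, 1001)]
best_rate = min(rates, key=loss)

# Validation: confront the calibrated model with held-out data.
validation_error = abs(model(best_rate, holdout[0]) - holdout[1])
```

When the fitted parameter also reproduces the held-out observation, that consistency check is a (weak) form of the model validation the abstract describes.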

  3. Multiple Sources of Prescription Payment and Risky Opioid Therapy Among Veterans.

    PubMed

    Becker, William C; Fenton, Brenda T; Brandt, Cynthia A; Doyle, Erin L; Francis, Joseph; Goulet, Joseph L; Moore, Brent A; Torrise, Virginia; Kerns, Robert D; Kreiner, Peter W

    2017-07-01

    Opioid overdose and other related harms are a major source of morbidity and mortality among US Veterans, in part due to high-risk opioid prescribing. We sought to determine whether having multiple sources of payment for opioids-as a marker for out-of-system access-is associated with risky opioid therapy among veterans. Cross-sectional study examining the association between multiple sources of payment and risky opioid therapy among all individuals with Veterans Health Administration (VHA) payment for opioid analgesic prescriptions in Kentucky during fiscal year 2014-2015. Sources of payment fell into three categories: (1) VHA as the only source of payment (sole source); (2) VHA plus at least one cash payment [VHA+cash payment(s)], whether or not there was a third source of payment; and (3) VHA plus at least one other noncash source: Medicare, Medicaid, or private insurance [VHA+noncash source(s)]. Our outcomes were 2 risky opioid therapies: combination opioid/benzodiazepine therapy and high-dose opioid therapy, defined as morphine equivalent daily dose ≥90 mg. Of the 14,795 individuals in the analytic sample, 81.9% were in the sole source category, 6.6% in the VHA+cash payment(s) category, and 11.5% in the VHA+noncash source(s) category. In logistic regression, controlling for age and sex, persons with multiple payment sources had significantly higher odds of each risky opioid therapy, with those in the VHA+cash payment(s) group having significantly higher odds than those in the VHA+noncash source(s) group. Prescribers should examine the prescription monitoring program, as multiple payment sources increase the odds of risky opioid therapy.

  4. Text mining electronic hospital records to automatically classify admissions against disease: Measuring the impact of linking data sources.

    PubMed

    Kocbek, Simon; Cavedon, Lawrence; Martinez, David; Bain, Christopher; Manus, Chris Mac; Haffari, Gholamreza; Zukerman, Ingrid; Verspoor, Karin

    2016-12-01

    Text and data mining play an important role in obtaining insights from Health and Hospital Information Systems. This paper presents a text mining system for detecting admissions marked as positive for several diseases: Lung Cancer, Breast Cancer, Colon Cancer, Secondary Malignant Neoplasm of Respiratory and Digestive Organs, Multiple Myeloma and Malignant Plasma Cell Neoplasms, Pneumonia, and Pulmonary Embolism. We specifically examine the effect of linking multiple data sources on text classification performance. Support Vector Machine classifiers are built for eight data source combinations, and evaluated using the metrics of Precision, Recall and F-Score. Sub-sampling techniques are used to address unbalanced datasets of medical records. We use radiology reports as an initial data source and add other sources, such as pathology reports and patient and hospital admission data, in order to assess the research question regarding the impact of the value of multiple data sources. Statistical significance is measured using the Wilcoxon signed-rank test. A second set of experiments explores aspects of the system in greater depth, focusing on Lung Cancer. We explore the impact of feature selection; analyse the learning curve; examine the effect of restricting admissions to only those containing reports from all data sources; and examine the impact of reducing the sub-sampling. These experiments provide better understanding of how to best apply text classification in the context of imbalanced data of variable completeness. Radiology questions plus patient and hospital admission data contribute valuable information for detecting most of the diseases, significantly improving performance when added to radiology reports alone or to the combination of radiology and pathology reports. Overall, linking data sources significantly improved classification performance for all the diseases examined. 
However, there is no single approach that suits all scenarios; the choice of the most effective combination of data sources depends on the specific disease to be classified.
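
The Precision/Recall/F-Score comparison across data-source combinations reduces to confusion-matrix arithmetic. The counts below are hypothetical, invented only to show the computation, not results from the paper:

```python
def prf(tp, fp, fn):
    """Precision, recall, and F-score from confusion counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f = 2 * precision * recall / (precision + recall)
    return precision, recall, f

# Hypothetical confusion counts for two data-source combinations on one disease.
radiology_only = prf(tp=80, fp=30, fn=40)
radiology_plus_admissions = prf(tp=95, fp=20, fn=25)
```

Linking a second data source "helps" in this framework exactly when it raises the F-score; the paper tests such gains for significance with the Wilcoxon signed-rank test.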

  5. Objective consensus from decision trees.

    PubMed

    Putora, Paul Martin; Panje, Cedric M; Papachristofilou, Alexandros; Dal Pra, Alan; Hundsberger, Thomas; Plasswilm, Ludwig

    2014-12-05

    Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information: objective consensus based on recommendations in decision tree format from multiple sources. Based on nine sample recommendations in decision tree format, a representative analysis was performed. The most common (mode) recommendation for each eventuality (each permutation of parameters) was determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data were collected from 16 radiation oncology centres, converted into decision tree format, and analyzed in order to determine the objective consensus. Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage), resulting in a total of 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
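
The mode-recommendation procedure can be sketched directly: each centre's decision tree is flattened into a mapping from parameter combination to treatment, and the consensus for each combination is the most frequent answer. The centre recommendations and treatment labels below are invented for illustration:

```python
from collections import Counter

# Hypothetical recommendations from three centres, keyed by the parameter
# combination (Gleason group, PSA group, T-stage group).
centres = [
    {("low", "low", "low"): "RT", ("low", "low", "high"): "RT+ADT",
     ("high", "low", "low"): "RT+ADT"},
    {("low", "low", "low"): "RT", ("low", "low", "high"): "RT",
     ("high", "low", "low"): "RT+ADT"},
    {("low", "low", "low"): "RT", ("low", "low", "high"): "RT+ADT",
     ("high", "low", "low"): "RT"},
]

def objective_consensus(trees):
    """For every parameter combination, return the most common (mode)
    recommendation across all decision trees that cover it."""
    consensus = {}
    for key in set().union(*trees):
        votes = Counter(tree[key] for tree in trees if key in tree)
        consensus[key] = votes.most_common(1)[0][0]
    return consensus

consensus = objective_consensus(centres)
```

No centre has to change its tree: the mode is computed per combination, which is what allows consensus "without compromise or confrontation among the parties."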

  6. Auditing the multiply-related concepts within the UMLS.

    PubMed

    Mougin, Fleur; Grabar, Natalia

    2014-10-01

    This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  7. Eyetracking Reveals Multiple-Category Use in Induction

    ERIC Educational Resources Information Center

    Chen, Stephanie Y.; Ross, Brian H.; Murphy, Gregory L.

    2016-01-01

    Category information is used to predict properties of new category members. When categorization is uncertain, people often rely on only one, most likely category to make predictions. Yet studies of perception and action often conclude that people combine multiple sources of information near-optimally. We present a perception-action analog of…

  8. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
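    The logical combination rules described ("AND", "OR", "at least n") can be sketched as follows; the thresholded index calls and diagnosis labels are invented for illustration, not drawn from the AD data sets:

    ```python
    def combine_indices(flags, rule="or", n=None):
        """Combine per-index binary calls with AND, OR, or 'at least n'."""
        if rule == "and":
            return all(flags)
        if rule == "or":
            return any(flags)
        if rule == "at_least":
            return sum(flags) >= n
        raise ValueError(rule)

    def sens_spec(y_true, y_pred):
        """Sensitivity and specificity of binary calls against true labels."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
        tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)
        pos = sum(y_true)
        return tp / pos, tn / (len(y_true) - pos)

    # Toy data: 1 = AD; two thresholded indices per subject.
    y_true = [1, 1, 1, 0, 0, 0]
    index1 = [1, 0, 1, 0, 0, 1]
    index2 = [0, 1, 1, 0, 1, 0]
    or_calls  = [combine_indices(f, "or")  for f in zip(index1, index2)]
    and_calls = [combine_indices(f, "and") for f in zip(index1, index2)]
    ```

    Sweeping the per-index thresholds and recomputing sensitivity/specificity for each logical rule traces out the multiV-ROC operating points; on this toy data "OR" maximizes sensitivity while "AND" maximizes specificity.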

  9. The Receiver Operational Characteristic for Binary Classification with Multiple Indices and Its Application to the Neuroimaging Study of Alzheimer’s Disease

    PubMed Central

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2014-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis. PMID:23702553

  10. Electrophysiological correlates of cocktail-party listening.

    PubMed

    Lewald, Jörg; Getzmann, Stephan

    2015-10-01

    Detecting, localizing, and selectively attending to a particular sound source of interest in complex auditory scenes composed of multiple competing sources is a remarkable capacity of the human auditory system. The neural basis of this so-called "cocktail-party effect" has remained largely unknown. Here, we studied the cortical network engaged in solving the "cocktail-party" problem, using event-related potentials (ERPs) in combination with two tasks demanding horizontal localization of a naturalistic target sound presented either in silence or in the presence of multiple competing sound sources. Presentation of multiple sound sources, as compared to single sources, induced an increased P1 amplitude, a reduction in N1, and a strong N2 component, resulting in a pronounced negativity in the ERP difference waveform (N2d) around 260 ms after stimulus onset. About 100 ms later, the anterior contralateral N2 subcomponent (N2ac) occurred in the multiple-sources condition, as computed from the amplitude difference for targets in the left minus right hemispaces. Cortical source analyses of the ERP modulation, resulting from the contrast of multiple vs. single sources, generally revealed an initial enhancement of electrical activity in right temporo-parietal areas, including auditory cortex, by multiple sources (at P1) that is followed by a reduction, with the primary sources shifting from right inferior parietal lobule (at N1) to left dorso-frontal cortex (at N2d). Thus, cocktail-party listening, as compared to single-source localization, appears to be based on a complex chronology of successive electrical activities within a specific cortical network involved in spatial hearing in complex situations. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Under-reporting of pertussis in Ontario: A Canadian Immunization Research Network (CIRN) study using capture-recapture

    PubMed Central

    Crowcroft, Natasha S.; Johnson, Caitlin; Chen, Cynthia; Li, Ye; Marchand-Austin, Alex; Bolotin, Shelly; Schwartz, Kevin; Deeks, Shelley L.; Jamieson, Frances; Drews, Steven; Russell, Margaret L.; Svenson, Lawrence W.; Simmonds, Kimberley; Mahmud, Salaheddin M.; Kwong, Jeffrey C.

    2018-01-01

    Introduction Under-reporting of pertussis cases is a longstanding challenge. We estimated the true number of pertussis cases in Ontario using multiple data sources, and evaluated the completeness of each source. Methods We linked data from multiple sources for the period 2009 to 2015: public health reportable disease surveillance data, public health laboratory data, and health administrative data (hospitalizations, emergency department visits, and physician office visits). To estimate the total number of pertussis cases in Ontario, we used a three-source capture-recapture analysis stratified by age (infants, or aged one year and older) and adjusting for dependency between sources. We used the Bayesian Information Criterion to compare models. Results Using probable and confirmed reported cases, laboratory data, and combined hospitalizations/emergency department visits, the estimated total number of cases during the six-year period amongst infants was 924, compared with 545 unique observed cases from all sources. Using the same sources, the estimated total for those aged 1 year and older was 12,883, compared with 3,304 observed cases from all sources. Only 37% of infants and 11% of those aged 1 year and over admitted to hospital or seen in an emergency department for pertussis were reported to public health. Public health reporting sensitivity varied from 2% to 68% depending on age group and the combination of data sources included. Sensitivity of combined hospitalizations and emergency department visits varied from 37% to 49% and of laboratory data from 1% to 50%. Conclusions All data sources contribute cases and are complementary, suggesting that the incidence of pertussis is substantially higher than suggested by routine reports. The sensitivity of different data sources varies. Better case identification is required to improve pertussis control in Ontario. PMID:29718945
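    The study used a three-source, age-stratified capture-recapture analysis with dependency adjustment; as a minimal sketch of the underlying idea, the two-source Chapman estimator below infers a total case count from the overlap between two incomplete sources (the counts are hypothetical, not the Ontario figures):

    ```python
    def chapman_estimate(n1, n2, m):
        """Bias-corrected two-source capture-recapture (Chapman) estimate
        of the total number of cases, given n1 and n2 cases captured by
        each source and m cases captured by both."""
        return (n1 + 1) * (n2 + 1) / (m + 1) - 1

    # Hypothetical counts: surveillance cases, hospital/ED cases, overlap.
    total = chapman_estimate(545, 400, 300)
    ```

    The smaller the overlap `m` relative to each source's count, the larger the estimated pool of cases missed by both sources, which is why estimated totals can far exceed the observed case counts.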

  12. Development of a Biomass Burning Emissions Inventory by Combining Satellite and Ground-based Information

    EPA Science Inventory

    A 2005 biomass burning (wildfire, prescribed, and agricultural) emission inventory has been developed for the contiguous United States using a newly developed simplified method of combining information from multiple sources for use in the US EPA’s national Emission Inventory (NEI...

  13. Multi-diversity combining and selection for relay-assisted mixed RF/FSO system

    NASA Astrophysics Data System (ADS)

    Chen, Li; Wang, Weidong

    2017-12-01

    We propose and analyze multi-diversity combining and selection to enhance the performance of relay-assisted mixed radio frequency/free-space optics (RF/FSO) system. We focus on a practical scenario for cellular network where a single-antenna source is communicating to a multi-aperture destination through a relay equipped with multiple receive antennas and multiple transmit apertures. The RF single input multiple output (SIMO) links employ either maximal-ratio combining (MRC) or receive antenna selection (RAS), and the FSO multiple input multiple output (MIMO) links adopt either repetition coding (RC) or transmit laser selection (TLS). The performance is evaluated via an outage probability analysis over Rayleigh fading RF links and Gamma-Gamma atmospheric turbulence FSO links with pointing errors where channel state information (CSI) assisted amplify-and-forward (AF) scheme is considered. Asymptotic closed-form expressions at high signal-to-noise ratio (SNR) are also derived. Coding gain and diversity order for different combining and selection schemes are further discussed. Numerical results are provided to verify and illustrate the analytical results.
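    In linear SNR terms, the two RF combining schemes compared here have simple textbook relations: MRC's output SNR is the sum of the branch SNRs, while antenna selection takes the single best branch. A minimal sketch (the branch SNR values are illustrative):

    ```python
    def mrc_snr(branch_snrs):
        """Maximal-ratio combining: output SNR equals the sum of branch SNRs."""
        return sum(branch_snrs)

    def selection_snr(branch_snrs):
        """Receive antenna selection: output SNR is the best single branch."""
        return max(branch_snrs)

    branches = [4.0, 2.5, 1.5]  # illustrative per-antenna SNRs, linear scale
    ```

    MRC always matches or beats selection in output SNR, at the cost of requiring full channel state information and a combiner on every branch; selection needs only the strongest antenna.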

  14. Considerations for Creating Multi-Language Personality Norms: A Three-Component Model of Error

    ERIC Educational Resources Information Center

    Meyer, Kevin D.; Foster, Jeff L.

    2008-01-01

    With the increasing globalization of human resources practices, a commensurate increase in demand has occurred for multi-language ("global") personality norms for use in selection and development efforts. The combination of data from multiple translations of a personality assessment into a single norm engenders error from multiple sources. This…

  15. Methane source identification in Boston, Massachusetts using isotopic and ethane measurements

    NASA Astrophysics Data System (ADS)

    Down, A.; Jackson, R. B.; Plata, D.; McKain, K.; Wofsy, S. C.; Rella, C.; Crosson, E.; Phillips, N. G.

    2012-12-01

    Methane has substantial greenhouse warming potential and is the principle component of natural gas. Fugitive natural gas emissions could be a significant source of methane to the atmosphere. However, the cumulative magnitude of natural gas leaks is not yet well constrained. We used a combination of point source measurements and ambient monitoring to characterize the methane sources in the Boston urban area. We developed distinct fingerprints for natural gas and multiple biogenic methane sources based on hydrocarbon concentration and isotopic composition. We combine these data with periodic measurements of atmospheric methane and ethane concentration to estimate the fractional contribution of natural gas and biogenic methane sources to the cumulative urban methane flux in Boston. These results are used to inform an inverse model of urban methane concentration and emissions.

  16. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    NASA Astrophysics Data System (ADS)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely used multiple-suppression methods. However, differences remain between the predicted multiples and those in the source seismic records, which can prevent conventional adaptive multiple subtraction methods from effectively suppressing multiples in production settings. This paper introduces a combined adaptive multiple attenuation method based on the optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time window FK filtering. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method with those of the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples while minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
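    Adaptive subtraction fits the predicted multiples to the recorded data before subtracting them. The sketch below is a single-coefficient (zero-lag) least-squares match on a toy trace, a deliberately simplified stand-in for the multi-channel extended Wiener filtering described in the abstract:

    ```python
    def match_and_subtract(data, predicted):
        """Least-squares scalar match of the predicted multiples to the
        recorded trace, then subtraction (a zero-lag, one-coefficient
        Wiener filter)."""
        num = sum(d * m for d, m in zip(data, predicted))
        den = sum(m * m for m in predicted)
        a = num / den if den else 0.0
        return [d - a * m for d, m in zip(data, predicted)]

    # Toy trace: a primary spike plus a half-amplitude predicted multiple.
    data = [1.0, 0.0, 1.0, 0.0]
    predicted_multiple = [0.0, 0.0, 2.0, 0.0]
    primary_estimate = match_and_subtract(data, predicted_multiple)
    ```

    A full Wiener matching filter estimates many filter coefficients per window rather than one global scalar, but the fitting criterion, minimizing the energy of the subtraction residual, is the same.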

  17. Radiation pattern synthesis of planar antennas using the iterative sampling method

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Coffey, E. L.

    1975-01-01

    A synthesis method is presented for determining an excitation of an arbitrary (but fixed) planar source configuration. The desired radiation pattern is specified over all or part of the visible region. It may have multiple and/or shaped main beams with low sidelobes. The iterative sampling method is used to find an excitation of the source which yields a radiation pattern that approximates the desired pattern to within a specified tolerance. In this paper the method is used to calculate excitations for line sources, linear arrays (equally and unequally spaced), rectangular apertures, rectangular arrays (arbitrary spacing grid), and circular apertures. Examples using these sources to form patterns with shaped main beams, multiple main beams, shaped sidelobe levels, and combinations thereof are given.

  18. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy set theory and Dempster-Shafer theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the maturity levels of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and the imprecision associated with experience using fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence with DST, and (3) combining multiple pieces of evidence by utilizing John Yen's generalized Dempster-Shafer theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple pieces of evidence through GDST's rule of combination.

  19. A deterministic (non-stochastic) low frequency method for geoacoustic inversion.

    PubMed

    Tolstoy, A

    2010-06-01

    It is well known that multiple frequency sources are necessary for accurate geoacoustic inversion. This paper presents an inversion method which uses the low frequency (LF) spectrum only to estimate bottom properties even in the presence of expected errors in source location, phone depths, and ocean sound-speed profiles. Matched field processing (MFP) along a vertical array is used. The LF method first conducts an exhaustive search of the (five) parameter search space (sediment thickness, sound-speed at the top of the sediment layer, the sediment layer sound-speed gradient, the half-space sound-speed, and water depth) at 25 Hz and continues by retaining only the high MFP value parameter combinations. Next, frequency is slowly increased while again retaining only the high value combinations. At each stage of the process, only those parameter combinations which give high MFP values at all previous LF predictions are considered (an ever shrinking set). It is important to note that a complete search of each relevant parameter space seems to be necessary not only at multiple (sequential) frequencies but also at multiple ranges in order to eliminate sidelobes, i.e., false solutions. Even so, there are no mathematical guarantees that one final, unique "solution" will be found.
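    The sequential pruning idea, retaining only the parameter combinations that score highly at every frequency examined so far, can be sketched generically. The two-parameter grid, toy objective function and threshold below are invented stand-ins for the five-parameter MFP search described:

    ```python
    from itertools import product

    def lf_prune(grids, objective, freqs, threshold):
        """Keep only the parameter combinations whose objective stays at
        or above the threshold at every frequency, lowest first; the
        surviving set shrinks monotonically."""
        surviving = set(product(*grids))
        for f in sorted(freqs):
            surviving = {c for c in surviving if objective(c, f) >= threshold}
        return surviving

    # Toy 2-D search: the objective peaks at (2, 5) at every frequency.
    TRUE = (2, 5)
    def mfp(combo, freq):
        dist = abs(combo[0] - TRUE[0]) + abs(combo[1] - TRUE[1])
        return 1.0 if combo == TRUE else 1.0 / (1.0 + freq * dist)

    found = lf_prune([range(1, 4), range(4, 7)], mfp, [25, 50, 75], 0.9)
    ```

    In the real method the same pruning is also repeated at multiple source ranges, since sidelobes (false solutions) that survive one geometry are typically eliminated by another.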

  20. Combining results of multiple search engines in proteomics.

    PubMed

    Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W

    2013-09-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.

  1. Combining Results of Multiple Search Engines in Proteomics*

    PubMed Central

    Shteynberg, David; Nesvizhskii, Alexey I.; Moritz, Robert L.; Deutsch, Eric W.

    2013-01-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques. PMID:23720762

  2. The influence of the interactions between anthropogenic activities and multiple ecological factors on land surface temperatures of urban forests

    NASA Astrophysics Data System (ADS)

    Ren, Y.

    2017-12-01

    Context The spatio-temporal distribution patterns of land surface temperatures (LSTs) in urban forests are influenced by many ecological factors; identifying the interactions between these factors can improve simulations and predictions of the spatial patterns of urban cold islands. This quantitative research requires an integrated method that combines multi-source data with spatial statistical analysis. Objectives The purpose of this study was to clarify how interactions between anthropogenic activities and multiple ecological factors influence urban forest LST, using cluster analysis of hot and cold spots and the GeoDetector model. We introduced the hypothesis that anthropogenic activity interacts with certain ecological factors, and that their combination influences urban forest LST. We also assumed that the spatio-temporal distributions of urban forest LST are similar to those of the ecological factors and can be represented quantitatively. Methods We used Jinjiang, a representative city in China, as a case study. Population density was employed to represent anthropogenic activity. We built a multi-source dataset (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) on a unified urban scale to support research on the interactions influencing urban forest LST. Through a combination of spatial statistical analysis results, multi-source spatial data, and the GeoDetector model, the interaction mechanisms of urban forest LST were revealed. Results Although different ecological factors have different influences on forest LST, in two periods with different hot and cold spots, patch area and dominant tree species were the main factors contributing to LST clustering in urban forests. The interaction between anthropogenic activity and multiple ecological factors increased LST in urban forest stands, both linearly and nonlinearly. Strong interactions between elevation and dominant species were generally observed and were prevalent in both hot and cold spot areas in different years. Conclusions In conclusion, a combination of spatial statistics and GeoDetector models should be effective for quantitatively evaluating the interactive relationships among ecological factors, anthropogenic activity and LST.

  3. Constraining Carbonaceous Aerosol Sources in a Receptor Model Using Combined 14C, Redox Species, Organic Tracers, and Elementary/Organic Carbon Measurements

    EPA Science Inventory

    Sources of carbonaceous PM2.5 were quantified in downtown Cleveland, OH and Chippewa Lake, OH located ~40 miles southwest of Cleveland during the Cleveland Multiple Air Pollutant Study (CMAPS). PM2.5 filter samples were collected daily during July-August 200...

  4. Combining Multiple Rupture Models in Real-Time for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Wu, S.; Beck, J. L.; Heaton, T. H.

    2015-12-01

    The ShakeAlert earthquake early warning system for the west coast of the United States is designed to combine information from multiple independent earthquake analysis algorithms in order to provide the public with robust predictions of shaking intensity at each user's location before they are affected by strong shaking. The current contributing analyses come from algorithms that determine the origin time, epicenter, and magnitude of an earthquake (On-site, ElarmS, and Virtual Seismologist). A second generation of algorithms will provide seismic line source information (FinDer), as well as geodetically-constrained slip models (BEFORES, GPSlip, G-larmS, G-FAST). These new algorithms will provide more information about the spatial extent of the earthquake rupture and thus improve the quality of the resulting shaking forecasts. Each of the contributing algorithms exploits different features of the observed seismic and geodetic data, and thus each algorithm may perform differently for different data availability and earthquake source characteristics. Thus the ShakeAlert system requires a central mediator, called the Central Decision Module (CDM). The CDM acts to combine disparate earthquake source information into one unified shaking forecast. Here we will present a new design for the CDM that uses a Bayesian framework to combine earthquake reports from multiple analysis algorithms and compares them to observed shaking information in order to both assess the relative plausibility of each earthquake report and to create an improved unified shaking forecast complete with appropriate uncertainties. We will describe how these probabilistic shaking forecasts can be used to provide each user with a personalized decision-making tool that can help decide whether or not to take a protective action (such as opening fire house doors or stopping trains) based on that user's distance to the earthquake, vulnerability to shaking, false alarm tolerance, and time required to act.

  5. Comparing and Combining Data across Multiple Sources via Integration of Paired-sample Data to Correct for Measurement Error

    PubMed Central

    Huang, Yunda; Huang, Ying; Moodie, Zoe; Li, Sue; Self, Steve

    2014-01-01

    Summary In biomedical research such as the development of vaccines for infectious diseases or cancer, measures from the same assay are often collected from multiple sources or laboratories. Measurement error that may vary between laboratories needs to be adjusted for when combining samples across laboratories. We incorporate such adjustment in comparing and combining independent samples from different labs via integration of external data, collected on paired samples from the same two laboratories. We propose: 1) normalization of individual level data from two laboratories to the same scale via the expectation of true measurements conditioning on the observed; 2) comparison of mean assay values between two independent samples in the Main study accounting for inter-source measurement error; and 3) sample size calculations of the paired-sample study so that hypothesis testing error rates are appropriately controlled in the Main study comparison. Because the goal is not to estimate the true underlying measurements but to combine data on the same scale, our proposed methods do not require that the true values for the error-prone measurements are known in the external data. Simulation results under a variety of scenarios demonstrate satisfactory finite sample performance of our proposed methods when measurement errors vary. We illustrate our methods using real ELISpot assay data generated by two HIV vaccine laboratories. PMID:22764070
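    The paper's normalization works through the conditional expectation of the true measurement given the observed one; as a simplified stand-in for that idea, the sketch below fits a plain least-squares calibration line from external paired-sample data and uses it to map one lab's readings onto the other's scale (the paired counts are hypothetical):

    ```python
    def fit_calibration(pairs):
        """Least-squares line mapping lab-2 readings (x) onto the lab-1
        scale (y), fitted from external paired-sample data."""
        n = len(pairs)
        mx = sum(x for x, _ in pairs) / n
        my = sum(y for _, y in pairs) / n
        sxx = sum((x - mx) ** 2 for x, _ in pairs)
        sxy = sum((x - mx) * (y - my) for x, y in pairs)
        slope = sxy / sxx
        return slope, my - slope * mx

    # Hypothetical paired ELISpot readings (lab 2, lab 1) on the same samples.
    pairs = [(10, 21), (20, 41), (30, 61)]
    slope, intercept = fit_calibration(pairs)
    lab1_scale = [slope * x + intercept for x in (15, 25)]  # normalized lab-2 values
    ```

    Once both labs' readings sit on a common scale, the Main-study samples can be compared with a standard two-sample test, with the paired-sample study sized so that the extra calibration uncertainty does not inflate the testing error rates.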

  6. Dempster-Shafer theory applied to regulatory decision process for selecting safer alternatives to toxic chemicals in consumer products.

    PubMed

    Park, Sung Jin; Ogunseitan, Oladele A; Lejano, Raul P

    2014-01-01

    Regulatory agencies often face a dilemma when regulating chemicals in consumer products, namely that of making decisions in the face of multiple, and sometimes conflicting, lines of evidence. We present an integrative approach for dealing with uncertainty and multiple pieces of evidence in toxics regulation. The integrative risk analytic framework is grounded in the Dempster-Shafer (D-S) theory that allows the analyst to combine multiple pieces of evidence and judgments from independent sources of information. We apply the integrative approach to the comparative risk assessment of bisphenol-A (BPA)-based polycarbonate and the functionally equivalent alternative, Eastman Tritan copolyester (ETC). Our results show that according to cumulative empirical evidence, the estimated probability of toxicity of BPA is 0.034, whereas the toxicity probability for ETC is 0.097. However, when we combine extant evidence with strength of confidence in the source (or expert judgment), we are guided by a richer interval measure, (Bel(t), Pl(t)). With the D-S derived measure, we arrive at various intervals for BPA, with the low-range estimate at (0.034, 0.250), and (0.097, 0.688) for ETC. These new measures allow a reasonable basis for comparison and a justifiable procedure for decision making that takes advantage of multiple sources of evidence. Through the application of D-S theory to toxicity risk assessment, we show how a multiplicity of scientific evidence can be converted into a unified risk estimate, and how this information can be effectively used for comparative assessments to select potentially less toxic alternative chemicals. © 2013 SETAC.
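    Dempster's rule of combination, which underlies the belief/plausibility intervals above, can be sketched for a two-element frame of discernment; the mass assignments below are toy values, not the paper's estimates:

    ```python
    def dempster_combine(m1, m2):
        """Dempster's rule of combination: masses are keyed by frozensets
        of hypotheses; conflicting (disjoint) mass is renormalized away."""
        combined, conflict = {}, 0.0
        for a, pa in m1.items():
            for b, pb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + pa * pb
                else:
                    conflict += pa * pb
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    TOXIC, SAFE = frozenset({"toxic"}), frozenset({"safe"})
    THETA = TOXIC | SAFE                             # whole frame = ignorance
    m_lab    = {TOXIC: 0.3, THETA: 0.7}              # toy empirical evidence
    m_expert = {TOXIC: 0.2, SAFE: 0.1, THETA: 0.7}   # toy expert judgment
    m = dempster_combine(m_lab, m_expert)
    belief_toxic = m.get(TOXIC, 0.0)                 # Bel(toxic)
    plaus_toxic = belief_toxic + m.get(THETA, 0.0)   # Pl(toxic)
    ```

    The gap between Bel and Pl reflects the mass still assigned to the whole frame, which is exactly the kind of residual ignorance the interval measure (Bel(t), Pl(t)) is designed to carry into the decision.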

  7. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  8. A control system for a powered prosthesis using positional and myoelectric inputs from the shoulder complex.

    PubMed

    Losier, Y; Englehart, K; Hudgins, B

    2007-01-01

    The integration of multiple input sources within a control strategy for powered upper limb prostheses could provide smoother, more intuitive multi-joint reaching movements based on the user's intended motion. This paper presents the results of using myoelectric signals (MES) of the shoulder area in combination with the position of the shoulder as input sources to multiple linear discriminant analysis classifiers. Such an approach may provide users with control signals capable of controlling three degrees of freedom (DOF). This work is another important step in the development of hybrid systems that will enable simultaneous control of multiple degrees of freedom used for reaching tasks in a prosthetic limb.

  9. Effect of Aggregation Operators on Network-Based Disease Gene Prioritization: A Case Study on Blood Disorders.

    PubMed

    Grewal, Nivit; Singh, Shailendra; Chand, Trilok

    2017-01-01

    Owing to the innate noise in biological data sources, a single source or a single measure does not suffice for effective disease gene prioritization, so the integration of multiple data sources or the aggregation of multiple measures is needed. Aggregation operators combine multiple related data values into a single value that reflects the effect of all the individual values. In this paper, fuzzy aggregation is applied to network-based disease gene prioritization, and its effect under noisy conditions is investigated. The study has been conducted for a set of 15 blood disorders by fusing four different network measures, computed from the protein interaction network, using a selected set of aggregation operators and ranking the genes on the basis of the aggregated value. The aggregation operator-based rankings have been compared with the "Random walk with restart" gene prioritization method. The impact of noise has also been investigated by adding varying proportions of noise to the seed set. The results reveal that for all the selected blood disorders, the Mean of Maximal operator relatively outperformed the other aggregation operators for noisy as well as non-noisy data.
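
    As a toy illustration of how aggregation operators behave, the sketch below fuses four invented network-measure values for a single gene with several common operators. The measure values, weights, and operator set are hypothetical; the paper's Mean of Maximal operator itself is not reproduced here.

```python
# Hypothetical sketch of aggregation operators fusing four network measures
# for one gene into a single prioritization score.
from statistics import mean

def owa(values, weights):
    """Ordered weighted averaging: weights apply to values sorted descending."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# e.g. degree, closeness, betweenness, clustering coefficient (made up)
measures = [0.8, 0.4, 0.6, 0.9]

scores = {
    "min":  min(measures),                        # pessimistic aggregation
    "max":  max(measures),                        # optimistic aggregation
    "mean": mean(measures),                       # neutral aggregation
    "owa":  owa(measures, [0.4, 0.3, 0.2, 0.1]),  # weighted toward high values
}
```

    Genes would then be ranked by the chosen aggregated score; as the abstract notes, the combined value carries the effect of all the individual measures.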

  10. Multiple frequency optical mixer and demultiplexer and apparatus for remote sensing

    NASA Technical Reports Server (NTRS)

    Chen, Jeffrey R. (Inventor)

    2010-01-01

    A pulsed laser system includes a modulator module configured to provide pulsed electrical signals and a plurality of solid-state seed sources coupled to the modulator module and configured to operate, responsive to the pulsed electrical signals, in a pulse mode. Each of the plurality of solid-state seed sources is tuned to a different frequency channel separated from any adjacent frequency channel by a frequency offset. The pulsed laser system also includes a combiner that combines outputs from each of the solid state seed sources into a single optical path and an optical doubler and demultiplexer coupled to the single optical path and providing each doubled seed frequency on a separate output path.

  11. High current polarized electron source for future eRHIC

    NASA Astrophysics Data System (ADS)

    Wang, Erdong

    2018-05-01

    A high-current, high-bunch-charge polarized electron source is essential for cost reduction of the Linac-Ring (L-R) eRHIC. In the baseline design, electron beams from multiple guns (probably 4-8) will be combined using deflection plates or an accumulator ring. Each gun aims to deliver an electron beam with 10 mA average current and 5.3 nC bunch charge. With a total 50 mA, 5.3 nC electron beam, this beam combining design could also be used for generating positrons. The gun has been designed and fabricated, and commissioning is expected to start by the middle of this year. In this paper, we present the DC gun design parameters and beam combining schemes. We also describe the details of the gun design and the strategies for demonstrating a high-current, high-charge polarized electron beam from this source.

  12. Hospice Use among Urban Black and White U.S. Nursing Home Decedents in 2006

    ERIC Educational Resources Information Center

    Lepore, Michael J.; Miller, Susan C.; Gozalo, Pedro

    2011-01-01

    Purpose: Medicare hospice is a valuable source of quality care at the end of life, but its lower use by racial minority groups is of concern. This study identifies factors associated with hospice use among urban Black and White nursing home (NH) decedents in the United States. Design and Methods: Multiple data sources are combined and multilevel…

  13. A Versatile Integrated Ambient Ionization Source Platform.

    PubMed

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-30

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  14. A Versatile Integrated Ambient Ionization Source Platform

    NASA Astrophysics Data System (ADS)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  15. Feature extraction from multiple data sources using genetic programming

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Brumby, Steven P.; Pope, Paul A.; Eads, Damian R.; Esch-Mosher, Diana M.; Galassi, Mark C.; Harvey, Neal R.; McCulloch, Hersey D.; Perkins, Simon J.; Porter, Reid B.; Theiler, James P.; Young, Aaron C.; Bloch, Jeffrey J.; David, Nancy A.

    2002-08-01

    Feature extraction from imagery is an important and long-standing problem in remote sensing. In this paper, we report on work using genetic programming to perform feature extraction simultaneously from multispectral and digital elevation model (DEM) data. We use the GENetic Imagery Exploitation (GENIE) software for this purpose, which produces image-processing software that inherently combines spatial and spectral processing. GENIE is particularly useful in exploratory studies of imagery, such as one often does in combining data from multiple sources. The user trains the software by painting the feature of interest with a simple graphical user interface. GENIE then uses genetic programming techniques to produce an image-processing pipeline. Here, we demonstrate evolution of image processing algorithms that extract a range of land cover features including towns, wildfire burnscars, and forest. We use imagery from the DOE/NNSA Multispectral Thermal Imager (MTI) spacecraft, fused with USGS 1:24000 scale DEM data.

  16. Two-Microphone Spatial Filtering Improves Speech Reception for Cochlear-Implant Users in Reverberant Conditions With Multiple Noise Sources

    PubMed Central

    2014-01-01

    This study evaluates a spatial-filtering algorithm as a method to improve speech reception for cochlear-implant (CI) users in reverberant environments with multiple noise sources. The algorithm was designed to filter sounds using phase differences between two microphones situated 1 cm apart in a behind-the-ear hearing-aid capsule. Speech reception thresholds (SRTs) were measured using a Coordinate Response Measure for six CI users in 27 listening conditions including each combination of reverberation level (T60 = 0, 270, and 540 ms), number of noise sources (1, 4, and 11), and signal-processing algorithm (omnidirectional response, dipole-directional response, and spatial-filtering algorithm). Noise sources were time-reversed speech segments randomly drawn from the Institute of Electrical and Electronics Engineers sentence recordings. Target speech and noise sources were processed using a room simulation method allowing precise control over reverberation times and sound-source locations. The spatial-filtering algorithm was found to provide improvements in SRTs on the order of 6.5 to 11.0 dB across listening conditions compared with the omnidirectional response. This result indicates that such phase-based spatial filtering can improve speech reception for CI users even in highly reverberant conditions with multiple noise sources. PMID:25330772
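
    The phase-difference principle behind such a two-microphone filter can be sketched as follows: a source at azimuth theta produces a per-frequency inter-microphone phase difference of 2*pi*f*d*sin(theta)/c, and frequency bins inconsistent with a frontal target can be attenuated. The tolerance and gain values below are illustrative assumptions, not the paper's algorithm.

```python
# Sketch of phase-based spatial filtering with two closely spaced microphones.
import math

C = 343.0   # speed of sound (m/s)
D = 0.01    # microphone spacing (m), 1 cm as in the study

def expected_phase_diff(f_hz, theta_rad):
    """Inter-microphone phase difference for a plane wave at azimuth theta."""
    return 2.0 * math.pi * f_hz * D * math.sin(theta_rad) / C

def bin_gain(measured_diff, tol=0.05):
    """Keep bins consistent with a frontal (theta = 0) target; attenuate others."""
    return 1.0 if abs(measured_diff) <= tol else 0.1

front = expected_phase_diff(1000.0, 0.0)            # frontal target: zero difference
side = expected_phase_diff(1000.0, math.pi / 2.0)   # interferer at 90 degrees
```

    Because the test uses phase rather than level differences, the 1 cm spacing of a behind-the-ear capsule suffices, which is consistent with the small-array design described in the abstract.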

  17. Hypermedia (Multimedia).

    ERIC Educational Resources Information Center

    Byrom, Elizabeth

    1990-01-01

    Hypermedia allows students to follow associative links among elements of nonsequential information, by combining information from multiple sources into one microcomputer-controlled system. Hypermedia products help teachers create lessons integrating text, motion film, color graphics, speech, and music, by linking such electronic devices as…

  18. A statistical approach to combining multisource information in one-class classifiers

    DOE PAGES

    Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.; ...

    2017-06-08

    A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.
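
    Fisher's technique for combining p-values, which the paper modifies to handle nonindependent sources, works in its standard independent-source form as sketched below; the dependence correction is not reproduced here.

```python
# Standard Fisher's method for combining k independent p-values.
import math

def fisher_combine(pvalues):
    """Return (chi-square statistic, combined p-value) with 2k degrees of freedom."""
    k = len(pvalues)
    stat = -2.0 * sum(math.log(p) for p in pvalues)
    # The chi-square survival function with 2k (even) dof has a closed form:
    # exp(-x) * sum_{i=0}^{k-1} x^i / i!, where x = stat / 2.
    x = stat / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= x / i
        total += term
    return stat, math.exp(-x) * total

stat, fused_p = fisher_combine([0.05, 0.10, 0.20])
```

    The fused p-value plays the role described in the abstract: a single consistency score for an unknown entity against a class hypothesis, traceable back to the constituent per-source p-values.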

  19. A statistical approach to combining multisource information in one-class classifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.

    A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.

  20. Multiple-energy Techniques in Industrial Computerized Tomography

    DOE R&D Accomplishments Database

    Schneberk, D.; Martz, H.; Azevedo, S.

    1990-08-01

    Considerable effort is being applied to develop multiple-energy industrial CT techniques for materials characterization. Multiple-energy CT can provide reliable estimates of effective Z (Z_eff), weight fraction, and rigorous calculations of absolute density, all at the spatial resolution of the scanner. Currently, a wide variety of techniques exist for CT scanners, but each has certain problems and limitations. Ultimately, the best multi-energy CT technique would combine the qualities of accuracy, reliability, and wide range of application, and would require the smallest number of additional measurements. We have developed techniques for calculating material properties of industrial objects that differ somewhat from currently used methods. In this paper, we present our methods for calculating Z_eff, weight fraction, and density. We begin with the simplest case -- methods for multiple-energy CT using isotopic sources -- and proceed to multiple-energy work with x-ray machine sources. The methods discussed here are illustrated on CT scans of PBX-9502 high explosives, a lexan-aluminum phantom, and a cylinder of glass beads used in a preliminary study to determine if CT can resolve three phases: air, water, and a high-Z oil. In the CT project at LLNL, we have constructed several CT scanners of varying scanning geometries using γ- and x-ray sources. In our research, we employed two of these scanners: pencil-beam CAT for CT data using isotopic sources and video-CAT equipped with an IRT micro-focal x-ray machine source.
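
    The flavor of a multiple-energy calculation can be seen in a two-energy, two-material decomposition: attenuation measured at two energies plus known basis-material coefficients yield the partial densities via a 2x2 linear system. All numbers below are invented for illustration, not calibrated values, and this is not the authors' specific method.

```python
# Two-energy, two-material decomposition (hypothetical coefficients).
mu_lo, mu_hi = 0.9, 0.5   # measured linear attenuation at E_lo, E_hi (1/cm)
a_lo, b_lo = 0.30, 0.20   # mass attenuation of materials A, B at E_lo (cm^2/g)
a_hi, b_hi = 0.15, 0.12   # mass attenuation of materials A, B at E_hi (cm^2/g)

# Solve  a*rho_A + b*rho_B = mu  at both energies (Cramer's rule).
det = a_lo * b_hi - b_lo * a_hi
rho_A = (mu_lo * b_hi - b_lo * mu_hi) / det
rho_B = (a_lo * mu_hi - mu_lo * a_hi) / det
```

    With the partial densities in hand, total density is their sum; the abstract's Z_eff and weight-fraction quantities build on the same kind of per-voxel multiple-energy system.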

  1. Multiple Kernel Learning with Random Effects for Predicting Longitudinal Outcomes and Data Integration

    PubMed Central

    Chen, Tianle; Zeng, Donglin

    2015-01-01

    Predicting disease risk and progression is one of the main goals in many clinical research studies. Cohort studies on the natural history and etiology of chronic diseases span years and data are collected at multiple visits. Although kernel-based statistical learning methods are proven to be powerful for a wide range of disease prediction problems, these methods are only well studied for independent data but not for longitudinal data. It is thus important to develop time-sensitive prediction rules that make use of the longitudinal nature of the data. In this paper, we develop a novel statistical learning method for longitudinal data by introducing subject-specific short-term and long-term latent effects through a designed kernel to account for within-subject correlation of longitudinal measurements. Since the presence of multiple sources of data is increasingly common, we embed our method in a multiple kernel learning framework and propose a regularized multiple kernel statistical learning with random effects to construct effective nonparametric prediction rules. Our method allows easy integration of various heterogeneous data sources and takes advantage of correlation among longitudinal measures to increase prediction power. We use different kernels for each data source taking advantage of the distinctive feature of each data modality, and then optimally combine data across modalities. We apply the developed methods to two large epidemiological studies, one on Huntington's disease and the other on Alzheimer's Disease (Alzheimer's Disease Neuroimaging Initiative, ADNI) where we explore a unique opportunity to combine imaging and genetic data to study prediction of mild cognitive impairment, and show a substantial gain in performance while accounting for the longitudinal aspect of the data. PMID:26177419
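
    The core multiple-kernel step, one Gram matrix per data source combined with nonnegative weights into a single valid kernel, can be sketched as follows. The data, weights, and kernel choices are hypothetical; the paper learns the weights and adds subject-specific random effects, neither of which is reproduced here.

```python
# Minimal multiple-kernel sketch: per-source Gram matrices, weighted sum.
import math

def rbf(u, v, gamma=0.5):
    """Gaussian (RBF) kernel on two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def linear(u, v):
    """Linear kernel (dot product)."""
    return sum(a * b for a, b in zip(u, v))

def gram(X, kernel):
    return [[kernel(u, v) for v in X] for u in X]

imaging = [[0.2, 1.1], [0.4, 0.9], [1.5, 0.3]]  # modality 1 features (invented)
genetic = [[1.0, 0.0], [0.8, 0.1], [0.1, 1.0]]  # modality 2 features (invented)

K1 = gram(imaging, rbf)       # kernel suited to source 1
K2 = gram(genetic, linear)    # kernel suited to source 2
w1, w2 = 0.6, 0.4             # nonnegative combination weights (fixed here)
K = [[w1 * x + w2 * y for x, y in zip(r1, r2)] for r1, r2 in zip(K1, K2)]
```

    A nonnegative combination of positive semidefinite Gram matrices remains a valid kernel, which is what lets each modality keep its own distinctive kernel while the learner operates on a single combined matrix.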

  2. Environmental monitoring of Galway Bay: fusing data from remote and in-situ sources

    NASA Astrophysics Data System (ADS)

    O'Connor, Edel; Hayes, Jer; Smeaton, Alan F.; O'Connor, Noel E.; Diamond, Dermot

    2009-09-01

    Changes in sea surface temperature can be used as an indicator of water quality. In-situ sensors are being used for continuous autonomous monitoring. However these sensors have limited spatial resolution as they are in effect single point sensors. Satellite remote sensing can be used to provide better spatial coverage at good temporal scales. However in-situ sensors have a richer temporal scale for a particular point of interest. Work carried out in Galway Bay has combined data from multiple satellite sources and in-situ sensors and investigated the benefits and drawbacks of using multiple sensing modalities for monitoring a marine location.

  3. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    NASA Astrophysics Data System (ADS)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography, in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors, including noise and a priori uncertainty of the forward (data) simulation, were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, relevant in the recovery of internal cavities.

  4. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of making quantitative conductivity image reconstruction effectively for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  5. Optimal bit allocation for hybrid scalable/multiple-description video transmission over wireless channels

    NASA Astrophysics Data System (ADS)

    Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.

    2006-01-01

    In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding and, product code Reed Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed for discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Also, comparisons with classical scalable coding show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.
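
    A toy exhaustive search conveys the spirit of the allocation problem: choose a source coding rate and an RCPC channel-code rate so that the transmitted rate fits the budget and the expected distortion is minimized. All rates, loss probabilities, and the distortion model below are hypothetical stand-ins, not the paper's codec or channel model.

```python
# Toy optimal bit allocation over discrete source/channel rate pairs.
from itertools import product

source_rates = [100, 200, 300]                        # source coding rates (kbps)
channel_rates = [1 / 2, 2 / 3, 3 / 4]                 # RCPC code rates
loss_prob = {1 / 2: 0.01, 2 / 3: 0.05, 3 / 4: 0.15}   # invented channel model
BUDGET = 500                                          # transmit budget (kbps)

def distortion(rs):
    return 1000.0 / rs   # invented model: more source bits, less distortion

best = None
for rs, rc in product(source_rates, channel_rates):
    transmitted = rs / rc            # channel coding inflates the bitrate
    if transmitted > BUDGET:
        continue
    # on a loss, fall back to base-layer quality (lowest source rate)
    exp_d = (1 - loss_prob[rc]) * distortion(rs) + loss_prob[rc] * distortion(source_rates[0])
    if best is None or exp_d < best[0]:
        best = (exp_d, rs, rc)
```

    The search captures the trade-off in the abstract: a stronger channel code protects better but leaves fewer bits for the source layers, so the optimum depends on the channel conditions encoded in the loss model.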

  6. Variable energy, high flux, ground-state atomic oxygen source

    NASA Technical Reports Server (NTRS)

    Chutjian, Ara (Inventor); Orient, Otto J. (Inventor)

    1987-01-01

    A variable energy, high flux atomic oxygen source is described which is comprised of a means for producing a high density beam of molecules which will emit O(-) ions when bombarded with electrons; a means of producing a high current stream of electrons at a low energy level passing through the high density beam of molecules to produce a combined stream of electrons and O(-) ions; means for accelerating the combined stream to a desired energy level; means for producing an intense magnetic field to confine the electrons and O(-) ions; means for directing a multiple pass laser beam through the combined stream to strip off the excess electrons from a plurality of the O(-) ions to produce ground-state O atoms within the combined stream; electrostatic deflection means for deflecting the path of the O(-) ions and the electrons in the combined stream; and, means for stopping the O(-) ions and the electrons and for allowing only the ground-state O atoms to continue as the source of the atoms of interest. The method and apparatus are also adaptable for producing other ground-state atoms and/or molecules.

  7. A hadoop-based method to predict potential effective drug combination.

    PubMed

    Sun, Yifan; Xiong, Yi; Xu, Qian; Wei, Dongqing

    2014-01-01

    Combination drugs that impact multiple targets simultaneously are promising candidates for combating complex diseases due to their improved efficacy and reduced side effects. However, exhaustive screening of all possible drug combinations is extremely time-consuming and impractical. Here, we present a novel Hadoop-based approach to predict drug combinations by taking advantage of the MapReduce programming model, which improves the scalability of the prediction algorithm. By integrating the gene expression data of multiple drugs, we implemented data preprocessing and the support vector machine and naïve Bayesian classifiers on Hadoop for prediction of drug combinations. The experimental results suggest that our Hadoop-based model achieves much higher efficiency in the big data processing steps with satisfactory performance. We believe that our proposed approach can help accelerate the prediction of potentially effective drugs as the number of possible combinations grows exponentially in the future. The source code and datasets are available upon request.

  8. A Hadoop-Based Method to Predict Potential Effective Drug Combination

    PubMed Central

    Xiong, Yi; Xu, Qian; Wei, Dongqing

    2014-01-01

    Combination drugs that impact multiple targets simultaneously are promising candidates for combating complex diseases due to their improved efficacy and reduced side effects. However, exhaustive screening of all possible drug combinations is extremely time-consuming and impractical. Here, we present a novel Hadoop-based approach to predict drug combinations by taking advantage of the MapReduce programming model, which improves the scalability of the prediction algorithm. By integrating the gene expression data of multiple drugs, we implemented data preprocessing and the support vector machine and naïve Bayesian classifiers on Hadoop for prediction of drug combinations. The experimental results suggest that our Hadoop-based model achieves much higher efficiency in the big data processing steps with satisfactory performance. We believe that our proposed approach can help accelerate the prediction of potentially effective drugs as the number of possible combinations grows exponentially in the future. The source code and datasets are available upon request. PMID:25147789

  9. An algorithm for separation of mixed sparse and Gaussian sources

    PubMed Central

    Akkalkotkar, Ameya

    2017-01-01

    Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated estimations technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as mixtures of unknown composition. PMID:28414814
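
    The Gaussian/nongaussian distinction that mixed ICA/PCA relies on can be illustrated with a classical statistic: excess kurtosis is zero for Gaussian sources and large for sparse, heavy-tailed ones. This is a toy sketch of that underlying idea only, not the paper's reproducibility-stability algorithm.

```python
# Excess kurtosis separates a Gaussian source from a sparse (heavy-tailed) one.
import random
from statistics import fmean

random.seed(0)
gaussian = [random.gauss(0.0, 1.0) for _ in range(20000)]
sparse = [random.gauss(0.0, 1.0) ** 3 for _ in range(20000)]  # heavy-tailed

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    m = fmean(x)
    s2 = fmean([(v - m) ** 2 for v in x])
    m4 = fmean([(v - m) ** 4 for v in x])
    return m4 / (s2 * s2) - 3.0
```

    A source scoring near zero is a candidate for the PCA-modeled Gaussian subspace, while a high-scoring source belongs to the nongaussian subspace that ICA can recover.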

  10. An algorithm for separation of mixed sparse and Gaussian sources.

    PubMed

    Akkalkotkar, Ameya; Brown, Kevin Scott

    2017-01-01

    Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated estimations technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as mixtures of unknown composition.

  11. Frequency multiplexed long range swept source optical coherence tomography

    PubMed Central

    Zurauskas, Mantas; Bradu, Adrian; Podoleanu, Adrian Gh.

    2013-01-01

    We present a novel swept source optical coherence tomography configuration, equipped with acousto-optic deflectors that can be used to simultaneously acquire multiple B-scans originating from different depths. The sensitivity range of the configuration is evaluated while acquiring five simultaneous B-scans. Then the configuration is employed to demonstrate long range B-scan imaging by combining two simultaneous B-scans from a mouse head sample. PMID:23760762

  12. Validation of a Sensor-Driven Modeling Paradigm for Multiple Source Reconstruction with FFT-07 Data

    DTIC Science & Technology

    2009-05-01

    operational warning and reporting (information) systems that combine automated data acquisition, analysis, source reconstruction, display and distribution of… report and to incorporate this operational capability into the integrative multiscale urban modeling system implemented in the computational…

  13. Automatic plankton image classification combining multiple view features via multiple kernel learning.

    PubMed

    Zheng, Haiyong; Wang, Ruchen; Yu, Zhibin; Wang, Nan; Gu, Zhaorui; Zheng, Bing

    2017-12-28

    Plankton, including phytoplankton and zooplankton, are the main source of food for organisms in the ocean and form the base of the marine food chain. As fundamental components of marine ecosystems, plankton are very sensitive to environmental changes, and the study of plankton abundance and distribution is crucial for understanding environmental changes and protecting marine ecosystems. This study was carried out to develop a widely applicable plankton classification system with high accuracy for the increasing number of various imaging devices. The literature shows that most plankton image classification systems have been limited to a single specific imaging device and a relatively narrow taxonomic scope; a truly practical system for automatic plankton classification is still lacking, and this study is partly intended to fill that gap. Motivated by this analysis and by the development of the technology, we focused on the requirements of practical application and propose an automatic system for plankton image classification that combines multiple view features via multiple kernel learning (MKL). First, to describe the biomorphic characteristics of plankton more completely and comprehensively, we combined general features with robust features, in particular by adding features such as Inner-Distance Shape Context for morphological representation. Second, we divided all the features into different types from multiple views and fed them to multiple classifiers rather than a single one, optimally combining the kernel matrices computed from the different feature types via multiple kernel learning. Moreover, we applied a feature selection method to choose optimal feature subsets from the redundant features, to accommodate datasets from different imaging devices. We implemented our proposed classification system on three different datasets covering more than 20 categories from phytoplankton to zooplankton. The experimental results validated that our system outperforms state-of-the-art plankton image classification systems in terms of accuracy and robustness. This study demonstrated an automatic plankton image classification system combining multiple view features via multiple kernel learning. The results indicate that multiple view features combined by NLMKL using three kernel functions (linear, polynomial, and Gaussian) describe and exploit feature information better and thus achieve higher classification accuracy.

  14. The roles of inoculants' carbon source use in the biocontrol of potato scab disease.

    PubMed

    Sun, Pingping; Zhao, Xinbei; Shangguan, Nini; Chang, Dongwei; Ma, Qing

    2015-04-01

    Despite the application of multiple strains in the biocontrol of plant diseases, multistrain inoculation is still constrained by its inconsistent performance in the field. Nutrients, especially carbon sources, play an important role in biocontrol processes. However, little work has been done to systematically estimate how inoculants' carbon source use affects biocontrol efficacy in vivo. In the present study, 7 nonpathogenic Streptomyces strains, alone and in different combinations, were inoculated as biocontrol agents against potato scab disease under field conditions and greenhouse treatments. The influence of the inoculants' carbon source use properties on biocontrol efficacy was investigated. The results showed that increasing the number of inoculated strains did not necessarily yield greater biocontrol efficacy in vivo. However, single strains with higher growth rates, or multiple strains with less competition for carbon sources, had positive effects on biocontrol efficacy. These findings may shed light on achieving consistent biocontrol of plant disease by taking inoculants' carbon source use properties into account.

  15. Lisbon 1755, a multiple-rupture earthquake

    NASA Astrophysics Data System (ADS)

    Fonseca, J. F. B. D.

    2017-12-01

    The Lisbon earthquake of 1755 poses a challenge to seismic hazard assessment. Reports pointing to MMI 8 or above at distances of the order of 500 km led to magnitude estimates near M9 in classic studies. A refined analysis of the coeval sources lowered the estimates to 8.7 (Johnston, 1998) and 8.5 (Martinez-Solares, 2004). I posit that even these lower magnitude values reflect the combined effect of multiple ruptures. Attempts to identify a single source capable of explaining the damage reports with published ground motion models have not gathered consensus and, compounding the challenge, analyses of tsunami traveltimes have led to disparate source models, sometimes separated by a few hundred kilometers. From this viewpoint, the most credible source would combine a subset of the multiple active structures identifiable in SW Iberia. No individual moment magnitude needs to be above M8.1, rendering the search for candidate structures less challenging. The possible combinations of active structures should be ranked by their power to explain the macroseismic intensities and tsunami traveltimes taken together. I argue that the Lisbon 1755 earthquake is an example of a previously unrecognized, distinct class of intraplate earthquake, of which the Indian Ocean earthquake of 2012 is the first instrumentally recorded example, showing space and time correlation over scales of the order of a few hundred km and a few minutes. Other examples may exist in the historical record, such as the M8 1556 Shaanxi earthquake, with an unusually large damage footprint (MMI equal to or above 6 in 10 provinces; 830,000 fatalities). The ability to trigger seismicity globally, observed after the 2012 Indian Ocean earthquake, may be a characteristic of this type of event: occurrences in Massachusetts (M5.9 Cape Ann earthquake on 18/11/1755), Morocco (M6.5 Fez earthquake on 27/11/1755), and Germany (M6.1 Düren earthquake on 18/02/1756) in all likelihood had a causal link to the Lisbon earthquake. This may reflect the very long period of surface waves generated by the combined sources as a result of the delays between ruptures. Recognition of this new class of large intraplate earthquakes may pave the way to a better understanding of the mechanisms driving intraplate deformation.

  16. FormScanner: Open-Source Solution for Grading Multiple-Choice Exams

    NASA Astrophysics Data System (ADS)

    Young, Chadwick; Lo, Glenn; Young, Kaisa; Borsetta, Alberto

    2016-01-01

    The multiple-choice exam remains a staple for many introductory physics courses. In the past, people graded these by hand or even with flaming needles. Today, one usually grades the exams with a form scanner that utilizes optical mark recognition (OMR). Several companies provide these scanners and the corresponding forms, such as the eponymous "Scantron." OMR scanners combine hardware and software (a scanner and an OMR program) to read and grade student-filled forms.

  17. Source-space ICA for MEG source imaging.

    PubMed

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography (EEG)/magnetoencephalography (MEG) source imaging is to apply an inverse technique (such as dipole modelling or sLORETA) to components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, minimum-variance beamformers offer high spatial resolution; however, sensor-space ICA + beamformer is not an ideal combination for obtaining both the beamformer's high spatial resolution and the ability to handle multiple concurrent sources. We propose source-space ICA for MEG as a powerful alternative that provides the high spatial resolution of the beamformer while handling multiple concurrent sources. The idea of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we compare source-space ICA with sensor-space ICA in both simulated and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in the spatial reconstruction of source maps, even though both techniques performed equally well from a temporal perspective. Real MEG data from two healthy subjects presented with visual stimuli were also used to compare the performance of sensor-space ICA and source-space ICA. We also propose a new variant of the minimum-variance beamformer called the weight-normalized linearly-constrained minimum-variance beamformer with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  18. Combining Multiple Knowledge Sources for Speech Recognition

    DTIC Science & Technology

    1988-09-15

    Thus, the first is able to clarify the pronunciation (TASSEA for the acronym TASA): best adaptation sentence, the second sentence, when added... 10 rapid adaptation sentences, and 15 spell-mode phrases... resource management... SPEAKER-DEPENDENT DATABASE sentences were randomly... combining the smoothed phoneme models with the detailed context models... system tested on a standard database using two well-... BYBLOS makes maximal use

  19. Reconfigurable optical interconnections via dynamic computer-generated holograms

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Zhou, Shaomin (Inventor)

    1994-01-01

    A system is proposed for optically providing one-to-many irregular interconnections, and strength-adjustable many-to-many irregular interconnections which may be provided with strengths (weights) w(sub ij), using multiple laser beams that address multiple holograms and means for combining the beams modified by the holograms to form multiple interconnections, such as a cross-bar switching network. The optical means for interconnection is based on entering a series of complex computer-generated holograms on an electrically addressed spatial light modulator for real-time reconfiguration, thus providing flexibility for interconnection networks for large-scale practical use. By employing multiple sources and holograms, the number of interconnection patterns achieved is greatly increased.

  20. Valid Statistical Analysis for Logistic Regression with Multiple Sources

    NASA Astrophysics Data System (ADS)

    Fienberg, Stephen E.; Nardi, Yuval; Slavković, Aleksandra B.

    Considerable effort has gone into understanding issues of privacy protection of individual information in single databases, and various solutions have been proposed depending on the nature of the data, the ways in which the database will be used, and the precise nature of the privacy protection being offered. Once data are merged across sources, however, the problem becomes far more complex, and a number of privacy issues arise for the linked individual files that go well beyond those considered for the data within individual sources. In this paper, we propose an approach that supports full statistical analysis of the combined database without actually combining it. We focus mainly on logistic regression, but the method and tools described may be applied to essentially any other statistical model as well.
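A minimal sketch of the share-only-aggregates idea: each data holder computes the logistic log-likelihood gradient on its own rows, and only those gradients are pooled, so raw records never leave their source. This illustrates the principle only, not the paper's privacy-preserving protocol; the data, step size, and iteration count are arbitrary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_gradient(X, y, beta):
    # log-likelihood gradient computed on one party's rows only
    return X.T @ (y - sigmoid(X @ beta))

rng = np.random.default_rng(1)
true_beta = np.array([1.5, -2.0])

# three separate data holders, each with its own private sample
parties = []
for _ in range(3):
    X = rng.normal(size=(200, 2))
    y = (rng.random(200) < sigmoid(X @ true_beta)).astype(float)
    parties.append((X, y))

# gradient ascent on the combined likelihood: only gradients are shared
beta = np.zeros(2)
for _ in range(300):
    g = sum(local_gradient(X, y, beta) for X, y in parties)
    beta += 0.001 * g
```

Because the summed gradient equals the gradient on the pooled data, the fit converges to the same estimate a single combined database would give.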

  1. A flat array large telescope concept for use on the moon, earth, and in space

    NASA Technical Reports Server (NTRS)

    Woodgate, Bruce E.

    1991-01-01

    An astronomical optical telescope concept is described which can provide very large collecting areas, of order 1000 sq m. This is an order of magnitude larger than the new generation of telescopes now being designed and built. Multiple gimballed flat mirrors direct the beams from a celestial source into a single telescope of the same aperture as each flat mirror. Multiple images of the same source are formed at the telescope focal plane. A beam combiner collects these images and superimposes them into a single image, onto a detector or spectrograph aperture. This telescope could be used on the earth, the moon, or in space.

  2. Computational Precision of Mental Inference as Critical Source of Human Choice Suboptimality.

    PubMed

    Drugowitsch, Jan; Wyart, Valentin; Devauchelle, Anne-Dominique; Koechlin, Etienne

    2016-12-21

    Making decisions in uncertain environments often requires combining multiple pieces of ambiguous information from external cues. In such conditions, human choices resemble optimal Bayesian inference, but typically show a large suboptimal variability whose origin remains poorly understood. In particular, this choice suboptimality might arise from imperfections in mental inference rather than in peripheral stages, such as sensory processing and response selection. Here, we dissociate these three sources of suboptimality in human choices based on combining multiple ambiguous cues. Using a novel quantitative approach for identifying the origin and structure of choice variability, we show that imperfections in inference alone cause a dominant fraction of suboptimal choices. Furthermore, two-thirds of this suboptimality appear to derive from the limited precision of neural computations implementing inference rather than from systematic deviations from Bayes-optimal inference. These findings set an upper bound on the accuracy and ultimate predictability of human choices in uncertain environments. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Flex fuel polygeneration: Integrating renewable natural gas

    NASA Astrophysics Data System (ADS)

    Kieffer, Matthew

    Flex Fuel Polygeneration (FFPG) is the use of multiple primary energy sources for the production of multiple energy carriers to achieve increased market opportunities. FFPG allows for adjustments in energy supply to meet market fluctuations and increases resiliency to contingencies such as weather disruptions, technological changes, and variations in the supply of energy resources. In this study an FFPG plant is examined that uses a combination of the primary energy sources natural gas and renewable natural gas (RNG), derived from municipal solid waste (MSW) and livestock manure, and converts them into electricity and fuels through anaerobic digestion (AD), Fischer-Tropsch synthesis (FTS), and gas turbine cycles. Previous techno-economic analyses of conventional energy production plants are combined to obtain equipment and operating costs, and the 20-year NPVs of the FFPG plant designs are then evaluated by static and stochastic simulations. The effects of changing operating parameters are investigated, as well as the effect of the number of anaerobic digestion plants on the 20-year NPV of the FTS and FFPG systems.

  4. Location of high-frequency P wave microseismic noise in the Pacific Ocean using multiple small aperture arrays

    DOE PAGES

    Pyle, Moira L.; Koper, Keith D.; Euler, Garrett G.; ...

    2015-04-20

    We investigate source locations of P-wave microseisms within a narrow frequency band (0.67–1.33 Hz) that is significantly higher than the classic microseism band (~0.05–0.3 Hz). Employing a backprojection method, we analyze data recorded during January 2010 from five International Monitoring System arrays that border the Pacific Ocean. We develop a ranking scheme that allows us to combine beam power from multiple arrays to obtain robust locations of the microseisms. Some individual arrays exhibit a strong regional component, but results from the combination of all arrays show high-frequency P wave energy emanating from the North Pacific basin, in general agreement withmore » previous observations in the double-frequency (DF) microseism band (~0.1–0.3 Hz). This suggests that the North Pacific source of ambient P noise covers a broad range of frequencies and that the wave-wave interaction model is likely valid at shorter periods.« less
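The shift-and-stack idea behind backprojection can be sketched on synthetic data: for each trial grid node, the predicted station delays are removed and the traces stacked, and the node whose stack has the highest power is taken as the source. The geometry, velocity, and signal shape below are invented, and the study's multi-array ranking scheme is not reproduced.

```python
import numpy as np

v, fs = 6.0, 50.0          # assumed P velocity (km/s) and sample rate (Hz)
rng = np.random.default_rng(2)

stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, -80]], float)
true_src = np.array([60.0, 40.0])

# synthetic records: a common pulse delayed by travel time, plus noise
t = np.arange(0.0, 60.0, 1.0 / fs)
delays = np.linalg.norm(stations - true_src, axis=1) / v
traces = np.array([np.exp(-((t - 10.0 - d) ** 2) / 0.5) for d in delays])
traces += 0.05 * rng.normal(size=traces.shape)

# backprojection: undo each node's predicted delays, stack, keep max power
grid = np.arange(0.0, 101.0, 5.0)
best, best_power = None, -1.0
for x in grid:
    for y in grid:
        d = np.linalg.norm(stations - np.array([x, y]), axis=1) / v
        shifts = np.round((d - d.min()) * fs).astype(int)
        stack = sum(np.roll(tr, -s) for tr, s in zip(traces, shifts))
        power = (stack ** 2).max()
        if power > best_power:
            best_power, best = power, (x, y)
```

Only at the true node do the shifted pulses add coherently, so beam power peaks there; combining such beams across arrays is what stabilizes the locations in the study.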

  5. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs together with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, integrating them into other GIS libraries requires constructing the geoinformatics visualizations manually and separately, or mixing the code in an unintuitive way. We developed GeoJS with the following motivations:
    • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
    • To develop an extensible library that can combine data from multiple sources and render them using multiple backends
    • To build a library that works well with existing scientific visualization tools such as VTK
    We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives.
Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  6. Locating non-volcanic tremor along the San Andreas Fault using a multiple array source imaging technique

    USGS Publications Warehouse

    Ryberg, T.; Haberland, C.H.; Fuis, G.S.; Ellsworth, W.L.; Shelly, D.R.

    2010-01-01

    Non-volcanic tremor (NVT) has been observed at several subduction zones and at the San Andreas Fault (SAF). Tremor locations are commonly derived by cross-correlating envelope-transformed seismic traces in combination with source-scanning techniques. Recently, they have also been located by using relative relocations with master events, that is, low-frequency earthquakes that are part of the tremor; these locations are derived by conventional traveltime-based methods. Here we present a method to locate the sources of NVT using an imaging approach for multiple array data. The performance of the method is checked with synthetic tests and the relocation of earthquakes. We also applied the method to tremor occurring near Cholame, California. A set of small-aperture arrays (i.e. an array consisting of arrays) installed around Cholame provided the data set for this study. We observed several tremor episodes and located tremor sources in the vicinity of the SAF. During individual tremor episodes, we observed a systematic change of source location, indicating rapid migration of the tremor source along the SAF. © 2010 The Authors, Geophysical Journal International © 2010 RAS.

  7. The Chandra Source Catalog 2.0: Combining Data for Processing (or How I learned 17 different words for "group")

    NASA Astrophysics Data System (ADS)

    Hain, Roger; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula

    2018-01-01

    The Second Chandra Source Catalog (CSC2.0) combines data at multiple stages to improve detection efficiency, enhance source region identification, and match observations of the same celestial source taken with significantly different point spread functions on Chandra's detectors. The need to group data for different reasons at different times in processing results in a hierarchy of groups to which individual sources belong. Source data are initially identified as belonging to each Chandra observation ID (an "obsid"). Data from each obsid whose pointings are within sixty arcseconds of each other are reprojected to the same aspect reference coordinates and grouped into stacks. Detection is performed on all data in the same stack, and individual sources are identified. Finer source position and region data are determined by further processing sources whose photons may be commingled together, grouping such sources into bundles. Individual stacks which overlap to any extent are grouped into ensembles, and all stacks in the same ensemble are later processed together to identify master sources and determine their properties. We discuss the basis for the various methods of combining data for processing and precisely define how the groups are determined. We also investigate some of the issues related to grouping data and discuss what options exist and how groups have evolved from prior releases. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
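The stack/ensemble grouping described above is transitive: observations chain together whenever pairwise separations fall within the threshold. A union-find sketch over flat arcsecond offsets (a deliberate simplification of celestial coordinates, with invented pointings) captures that behaviour:

```python
def find(parent, i):
    # follow parent pointers to the group representative, compressing paths
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def group_pointings(pointings, radius_arcsec=60.0):
    """Group pointings whose separation is within radius_arcsec, transitively,
    mimicking how obsids chain together into stacks."""
    n = len(pointings)
    parent = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            dx = pointings[i][0] - pointings[j][0]
            dy = pointings[i][1] - pointings[j][1]
            if (dx * dx + dy * dy) ** 0.5 <= radius_arcsec:
                union(parent, i, j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(parent, i), []).append(i)
    return sorted(groups.values())

# toy pointings in arcsec offsets: 0 and 1 chain together; 2 is isolated
stacks = group_pointings([(0, 0), (50, 0), (300, 300)])
```

The same transitive-closure logic applies one level up, when overlapping stacks are merged into ensembles.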

  8. Combinations of Multiple Neuroimaging Markers using Logistic Regression for Auxiliary Diagnosis of Alzheimer Disease and Mild Cognitive Impairment.

    PubMed

    Mao, Nini; Liu, Yunting; Chen, Kewei; Yao, Li; Wu, Xia

    2018-06-05

    Multiple neuroimaging modalities have been developed, providing various types of information on the human brain. Used together properly, these complementary multimodal neuroimaging data integrate multisource information, which can facilitate a diagnosis and improve diagnostic accuracy. In this study, 3 types of brain imaging data (sMRI, FDG-PET, and florbetapir-PET) were fused in the hope of improving diagnostic accuracy, and multivariate methods (logistic regression) were applied to these trimodal neuroimaging indices. Then, the receiver-operating characteristic (ROC) method was used to analyze the outcomes of the logistic classifier, with each index alone, multiple indices from each modality, or all indices from all 3 modalities, to investigate their differential abilities to identify the disease. With increasing numbers of indices within each modality and across modalities, the accuracy of identifying Alzheimer disease (AD) increases to varying degrees. For example, the area under the ROC curve is above 0.98 when all the indices from the 3 imaging data types are combined. Using a combination of different indices, the results confirmed the initial hypothesis that different biomarkers are potentially complementary, and thus the joint analysis of information from multiple sources improves the capability to identify diseases such as AD and mild cognitive impairment. © 2018 S. Karger AG, Basel.
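The complementarity claim can be illustrated with synthetic markers: two noisy indices that each carry independent information about the diagnosis yield a higher ROC area combined than either does alone. The equal-weight sum below stands in for the fitted logistic score, and the Gaussian toy data are invented.

```python
import numpy as np

def auc(scores, labels):
    # Mann-Whitney U formulation of the area under the ROC curve
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    return ((pos[:, None] > neg[None, :]).mean()
            + 0.5 * (pos[:, None] == neg[None, :]).mean())

rng = np.random.default_rng(3)
n = 1000
labels = (rng.random(n) < 0.5).astype(int)

# two weakly informative, complementary markers (e.g. sMRI- and PET-like indices)
m1 = labels + rng.normal(scale=1.5, size=n)
m2 = labels + rng.normal(scale=1.5, size=n)
combined = m1 + m2  # equal-weight stand-in for a fitted logistic score

a1, a2, ac = auc(m1, labels), auc(m2, labels), auc(combined, labels)
```

Because the markers' noise terms are independent, summing them raises the effective signal-to-noise ratio, so the combined AUC exceeds either single-marker AUC.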

  9. A Simulated Environment Experiment on Annoyance Due to Combined Road Traffic and Industrial Noises.

    PubMed

    Marquis-Favre, Catherine; Morel, Julien

    2015-07-21

    Total annoyance due to combined noises is still difficult to predict adequately. This scientific gap is an obstacle to noise action planning, especially in urban areas where inhabitants are usually exposed to high noise levels from multiple sources. In this context, this work aims to highlight potential ways to enhance the prediction of total annoyance. The work is based on a simulated environment experiment in which participants performed activities in a living room while exposed to combined road traffic and industrial noises. The first objective of the experiment presented in this paper was to gain further understanding of the effects on annoyance of acoustical factors, non-acoustical factors, and potential interactions between the combined noise sources. The second was to assess total annoyance models constructed from the data collected during the experiment and tested against data gathered in situ. The results obtained in this work highlighted the superiority of perceptual models. In particular, perceptual models with an interaction term appeared to be the best predictors for the two combined noise sources under study, even at large differences in sound pressure level. These results thus reinforce the need to focus on perceptual models and to improve the prediction of partial annoyances.

  10. A data fusion approach for track monitoring from multiple in-service trains

    NASA Astrophysics Data System (ADS)

    Lederman, George; Chen, Siheng; Garrett, James H.; Kovačević, Jelena; Noh, Hae Young; Bielak, Jacobo

    2017-10-01

    We present a data fusion approach for enabling data-driven rail-infrastructure monitoring from multiple in-service trains. A number of researchers have proposed using vibration data collected from in-service trains as a low-cost method to monitor track geometry. The majority of this work has focused on developing novel features to extract information about the tracks from data produced by individual sensors on individual trains. We extend this work by presenting a technique to combine extracted features from multiple passes over the tracks from multiple sensors aboard multiple vehicles. There are a number of challenges in combining multiple data sources, like different relative position coordinates depending on the location of the sensor within the train. Furthermore, as the number of sensors increases, the likelihood that some will malfunction also increases. We use a two-step approach that first minimizes position offset errors through data alignment, then fuses the data with a novel adaptive Kalman filter that weights data according to its estimated reliability. We show the efficacy of this approach both through simulations and on a data-set collected from two instrumented trains operating over a one-year period. Combining data from numerous in-service trains allows for more continuous and more reliable data-driven monitoring than analyzing data from any one train alone; as the number of instrumented trains increases, the proposed fusion approach could facilitate track monitoring of entire rail-networks.
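A scalar sketch of the reliability-weighted fusion step: in a Kalman update, a sensor's weight (gain) shrinks as its measurement variance grows, so noisier channels contribute less to the fused estimate. Here the per-sensor variance R is fixed for clarity, whereas the paper's adaptive filter estimates reliability from the data; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
truth = 2.0               # a static track-geometry value at one location
R = np.array([0.1, 1.0])  # measurement variance per sensor (sensor 2 is noisier)

x, P = 0.0, 10.0          # state estimate and its variance (vague prior)
for _ in range(50):       # fifty passes, one reading per sensor each pass
    z = truth + rng.normal(scale=np.sqrt(R))
    for zi, Ri in zip(z, R):
        K = P / (P + Ri)  # Kalman gain: noisier sensors get a smaller gain
        x = x + K * (zi - x)
        P = (1 - K) * P
```

After many passes the estimate is dominated by the low-noise sensor, which is the behaviour the adaptive weighting in the paper generalizes to malfunctioning channels.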

  11. Integrated model of multiple kernel learning and differential evolution for EUR/USD trading.

    PubMed

    Deng, Shangkun; Sakurai, Akito

    2014-01-01

    Currency trading is an important area for individual investors, government policy decisions, and organizational investments. In this study, we propose a hybrid approach referred to as MKL-DE, which combines multiple kernel learning (MKL) with differential evolution (DE) for trading a currency pair. MKL is used to learn a model that predicts changes in the target currency pair, whereas DE is used to generate the buy and sell signals for the target currency pair based on the relative strength index (RSI), which is also combined with the MKL output into a trading signal. The new hybrid implementation is applied to EUR/USD trading, the most traded foreign exchange (FX) currency pair. MKL is essential for utilizing information from multiple information sources, and DE is essential for formulating a trading rule based on a mixture of discrete structures and continuous parameters. Initially, the prediction model optimized by MKL predicts returns based on a technical indicator called the moving average convergence divergence (MACD). Next, a combined trading signal is optimized by DE using the inputs from the prediction model and the technical indicator RSI obtained from multiple timeframes. The experimental results showed that trading using the prediction learned by MKL yielded consistent profits.
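Differential evolution itself is easy to sketch: mutate with scaled difference vectors, crossover, and keep trials that improve the objective. The toy objective below (recovering classic RSI-style thresholds of 30/70) is invented and far simpler than the paper's profit-based trading-rule fitness.

```python
import numpy as np

def diff_evolution(f, bounds, pop=20, gens=100, F=0.8, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    X = lo + rng.random((pop, len(bounds))) * (hi - lo)  # random initial population
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # mutation: base vector plus a scaled difference of two others
            others = [j for j in range(pop) if j != i]
            a, b, c = X[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # crossover: mix mutant and current member coordinate-wise
            cross = rng.random(len(bounds)) < CR
            trial = np.where(cross, mutant, X[i])
            # selection: keep the trial only if it improves the objective
            ft = f(trial)
            if ft < fit[i]:
                X[i], fit[i] = trial, ft
    return X[fit.argmin()], fit.min()

# toy objective: recover RSI-style buy/sell thresholds near (30, 70)
best, val = diff_evolution(lambda p: (p[0] - 30.0) ** 2 + (p[1] - 70.0) ** 2,
                           [(0, 50), (50, 100)])
```

The same loop applies unchanged to a non-differentiable trading-rule fitness, which is why DE suits mixed discrete/continuous rule parameters.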

  12. Fusion or confusion: knowledge or nonsense?

    NASA Astrophysics Data System (ADS)

    Rothman, Peter L.; Denton, Richard V.

    1991-08-01

    The terms 'data fusion,' 'sensor fusion,' 'multi-sensor integration,' and 'multi-source integration' have been used widely in the technical literature to refer to a variety of techniques, technologies, systems, and applications which employ and/or combine data derived from multiple information sources. Applications of data fusion range from real-time fusion of sensor information for the navigation of mobile robots to the off-line fusion of both human and technical strategic intelligence data. The Department of Defense Critical Technologies Plan lists data fusion in the highest priority group of critical technologies, but just what is data fusion? The DoD Critical Technologies Plan states that data fusion involves 'the acquisition, integration, filtering, correlation, and synthesis of useful data from diverse sources for the purposes of situation/environment assessment, planning, detecting, verifying, diagnosing problems, aiding tactical and strategic decisions, and improving system performance and utility.' More simply stated, sensor fusion refers to the combination of data from multiple sources to provide enhanced information quality and availability over that which is available from any individual source alone. This paper presents a survey of the state-of-the-art in data fusion technologies, system components, and applications. A set of characteristics which can be utilized to classify data fusion systems is presented. Additionally, a unifying mathematical and conceptual framework within which to understand and organize fusion technologies is described. A discussion of often overlooked issues in the development of sensor fusion systems is also presented.

  13. Data registration for automated non-destructive inspection with multiple data sets

    NASA Astrophysics Data System (ADS)

    Tippetts, T.; Brierley, N.; Cawley, P.

    2013-01-01

    In many NDE applications, multiple sources of data are available covering the same region of a part under inspection. These overlapping data can come from intersecting scan patterns, sensors in an array configuration, or repeated inspections at different times. In many cases these data sets are analysed independently, with separate assessments for each channel or data file. It should be possible to improve the overall reliability of the inspection by combining multiple sources of information, simultaneously increasing the Probability of Detection (POD) and decreasing the Probability of False Alarm (PFA). Data registration, i.e. mapping the data to matching coordinates in space, is both an essential prerequisite and a challenging obstacle to this type of data fusion. This paper describes optimization techniques for matching and aligning features in NDE data. Examples from automated ultrasound inspection of aircraft engine discs illustrate the approach.
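The alignment step can be sketched in one dimension: when two scans record the same feature profile at an unknown offset, cross-correlation recovers the shift that best registers them. The signals and offset below are synthetic, and the paper's registration handles richer multichannel ultrasound geometry.

```python
import numpy as np

rng = np.random.default_rng(5)
n, true_shift = 500, 37

# two noisy scans of the same underlying feature profile, offset in position
base = rng.normal(size=n)
a = base + 0.1 * rng.normal(size=n)
b = np.roll(base, true_shift) + 0.1 * rng.normal(size=n)

# circular cross-correlation via FFT; its peak gives the offset estimate
corr = np.fft.ifft(np.fft.fft(a).conj() * np.fft.fft(b)).real
est = int(corr.argmax())
```

Once the offset is known, the two data sets share a coordinate system and can be fused, e.g. to raise POD without raising PFA.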

  14. Gas-phase broadband spectroscopy using active sources: progress, status, and applications

    PubMed Central

    Cossel, Kevin C.; Waxman, Eleanor M.; Finneran, Ian A.; Blake, Geoffrey A.; Ye, Jun; Newbury, Nathan R.

    2017-01-01

    Broadband spectroscopy is an invaluable tool for measuring multiple gas-phase species simultaneously. In this work we review basic techniques, implementations, and current applications for broadband spectroscopy. We discuss components of broadband spectroscopy including light sources, absorption cells, and detection methods, and then discuss specific combinations of these components in commonly used techniques. We finish this review by discussing potential future advances in techniques and applications of broadband spectroscopy. PMID:28630530

  15. Multifactorial antimicrobial wood protectants

    Treesearch

    Robert D. Coleman; Carol A. Clausen

    2008-01-01

    It is unlikely that a single antimicrobial compound, whether synthetic or natural, will provide the ‘magic bullet’ for eliminating multiple biological agents affecting wood products. Development of synergistic combinations of selected compounds, especially those derived from natural sources, is recognized as a promising approach to improved wood protection. Recent...

  16. Quantitation of mycotoxins using direct analysis in real time (DART)-mass spectrometry (MS)

    USDA-ARS?s Scientific Manuscript database

    Ambient ionization represents a new generation of mass spectrometry ion sources which is used for rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and mass spectrometry allows analyzing multiple food samples with simple or no sample treatment, or in...

  17. COMBINING EVIDENCE ON AIR POLLUTION AND DAILY MORTALITY FROM 20 LARGEST U.S. CITIES: A HIERARCHICAL MODELING STRATEGY

    EPA Science Inventory

    Environmental science and management are fed by individual studies of pollution effects, often focused on single locations. Data are encountered data, typically from multiple sources and on different time and spatial scales. Statistical issues including publication bias and m...

  18. Spatiotemporal source tuning filter bank for multiclass EEG based brain computer interfaces.

    PubMed

    Acharya, Soumyadipta; Mollazadeh, Moshen; Murari, Kartikeya; Thakor, Nitish

    2006-01-01

    Non-invasive brain-computer interfaces (BCI) allow people to communicate by modulating features of their electroencephalogram (EEG). Spatiotemporal filtering plays a vital role in multi-class, EEG-based BCI. In this study, we used a novel combination of principal component analysis, independent component analysis, and dipole source localization to design a spatiotemporal multiple source tuning (SPAMSORT) filter bank, each channel of which was tuned to the activity of an underlying dipole source. Changes in the event-related spectral perturbation (ERSP) were measured and used to train a linear support vector machine to classify between four classes of motor imagery tasks (left hand, right hand, foot, and tongue) for one subject. ERSP values were significantly (p<0.01) different across tasks and better (p<0.01) than those obtained with conventional spatial filtering methods (large Laplacian and common average reference). Classification resulted in an average accuracy of 82.5%. This approach could lead to promising BCI applications such as control of a prosthesis with multiple degrees of freedom.

  19. Reconfigurable Optical Interconnections Via Dynamic Computer-Generated Holograms

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Zhou, Shao-Min (Inventor)

    1996-01-01

    A system is presented for optically providing one-to-many irregular interconnections, as well as many-to-many irregular interconnections with adjustable strengths (weights) w_ij, using multiple laser beams that address multiple holograms, together with means for combining the beams modified by the holograms to form multiple interconnections, such as a cross-bar switching network. The optical means of interconnection is based on entering a series of complex computer-generated holograms on an electrically addressed spatial light modulator for real-time reconfiguration, thus providing the flexibility needed for large-scale practical interconnection networks. By employing multiple sources and holograms, the number of achievable interconnection patterns is greatly increased.

  20. Probing midrapidity source characteristics with charged particles and neutrons in the 35Cl+natTa reaction at 43 MeV/nucleon

    NASA Astrophysics Data System (ADS)

    Larochelle, Y.; St-Pierre, C.; Beaulieu, L.; Colonna, N.; Gingras, L.; Ball, G. C.; Bowman, D. R.; Colonna, M.; D'erasmo, G.; Fiore, E.; Fox, D.; Galindo-Uribarri, A.; Hagberg, E.; Horn, D.; Laforest, R.; Pantaleo, A.; Roy, R.; Tagliente, G.

    1999-02-01

    The characteristics of the midrapidity and target sources (apparent temperatures, velocities, and neutron multiplicities) extracted from the neutron energy spectra have been measured for various quasiprojectile (QP) excitation energies, reconstructed from charged particles of well-defined peripheral events in the 35Cl+natTa reaction at 43 MeV/nucleon. The reconstructed excitation energy of the QP is always smaller than the excitation energy calculated from its velocity assuming a purely dissipative binary collision. The latter observation, combined with the neutron multiplicity at midrapidity and the apparent temperature, suggests important preequilibrium and/or dynamical effects in the entrance channel. The midrapidity source moves at a velocity lower than the nucleon-nucleon center-of-mass velocity, showing the importance of the attractive mean-field potential from the target even at 43 MeV/nucleon. The above picture is confirmed by comparison to Boltzmann-Nordheim-Vlasov (BNV) simulations.

  1. Combined rule extraction and feature elimination in supervised classification.

    PubMed

    Liu, Sheng; Patel, Ronak Y; Daga, Pankaj R; Liu, Haining; Fu, Gang; Doerksen, Robert J; Chen, Yixin; Wilkins, Dawn E

    2012-09-01

    A vast number of biology-related research problems involve combining multiple sources of data to achieve a better understanding of the underlying problems. It is important to select and interpret the most important information from these sources, so it is beneficial to have an algorithm that simultaneously extracts rules and selects features for better interpretation of the predictive model. We propose an efficient algorithm, Combined Rule Extraction and Feature Elimination (CRF), based on 1-norm regularized random forests. CRF simultaneously extracts a small number of rules generated by random forests and selects important features. We applied CRF to several drug activity prediction and microarray data sets. CRF is capable of producing performance comparable with state-of-the-art prediction algorithms using a small number of decision rules, and some of the decision rules are biologically significant.

  2. Audio-visual speech cue combination.

    PubMed

    Arnold, Derek H; Tear, Morgan; Schindel, Ryan; Roseboom, Warrick

    2010-04-16

    Different sources of sensory information can interact, often shaping what we think we have seen or heard. This can enhance the precision of perceptual decisions relative to those made on the basis of a single source of information. From a computational perspective, there are multiple reasons why this might happen, and each predicts a different degree of enhanced precision. Relatively slight improvements can arise when perceptual decisions are made on the basis of multiple independent sensory estimates, as opposed to just one. These improvements can arise as a consequence of probability summation. Greater improvements can occur if two initially independent estimates are summated to form a single integrated code, especially if the summation is weighted in accordance with the variance associated with each independent estimate. This form of combination is often described as a Bayesian maximum likelihood estimate. Still greater improvements are possible if the two sources of information are encoded via a common physiological process. Here we show that the provision of simultaneous audio and visual speech cues can result in substantial sensitivity improvements, relative to single sensory modality based decisions. The magnitude of the improvements is greater than can be predicted on the basis of either a Bayesian maximum likelihood estimate or a probability summation. Our data suggest that primary estimates of speech content are determined by a physiological process that takes input from both visual and auditory processing, resulting in greater sensitivity than would be possible if initially independent audio and visual estimates were formed and then subsequently combined.
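
    The variance-weighted summation this abstract describes is the standard maximum-likelihood cue-combination rule: each estimate is weighted by its inverse variance, and the fused variance is never larger than the best single cue. A minimal sketch with hypothetical numbers (not the authors' data):

    ```python
    def combine_cues(est_a, var_a, est_b, var_b):
        """Maximum-likelihood (inverse-variance weighted) fusion of two
        independent Gaussian estimates of the same quantity."""
        w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
        w_b = 1.0 - w_a
        combined_est = w_a * est_a + w_b * est_b
        # Fused variance is always <= the smaller of the input variances.
        combined_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
        return combined_est, combined_var

    # Two equally reliable cues: the fused estimate is their mean,
    # and the variance is halved.
    est, var = combine_cues(10.0, 4.0, 12.0, 4.0)
    ```

    Probability summation and the common-encoding account the abstract contrasts with this rule predict smaller and larger sensitivity gains, respectively.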

  3. A method of classification for multisource data in remote sensing based on interval-valued probabilities

    NASA Technical Reports Server (NTRS)

    Kim, Hakil; Swain, Philip H.

    1990-01-01

    An axiomatic approach to interval-valued (IV) probabilities is presented, where an IV probability is defined by a pair of set-theoretic functions satisfying pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally large data into smaller, more manageable pieces based on global statistical correlation information. By this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
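
    The abstract does not spell out its combination rule, but the classic rule for combining bodies of evidence in this interval-valued setting is Dempster's rule, where the interval [Bel(A), Pl(A)] plays the role of the IV probability. A sketch with hypothetical ground-cover classes and masses:

    ```python
    def dempster_combine(m1, m2):
        """Dempster's rule: combine two mass functions whose focal elements
        are frozensets of hypotheses, renormalizing away the conflict mass."""
        combined = {}
        conflict = 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb
        norm = 1.0 - conflict
        return {k: v / norm for k, v in combined.items()}

    # Hypothetical evidence from two sources about a pixel's ground cover:
    theta = frozenset({'forest', 'urban'})          # frame of discernment
    m_optical = {frozenset({'forest'}): 0.6, theta: 0.4}
    m_radar   = {frozenset({'forest'}): 0.5, theta: 0.5}
    fused = dempster_combine(m_optical, m_radar)
    # Bel(forest) = fused mass on {forest}; Pl(forest) adds the mass on theta,
    # giving the interval-valued probability [Bel, Pl] for 'forest'.
    ```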

  4. Multiwavelength counterparts of the point sources in the Chandra Source Catalog

    NASA Astrophysics Data System (ADS)

    Reynolds, Michael; Civano, Francesca Maria; Fabbiano, Giuseppina; D'Abrusco, Raffaele

    2018-01-01

    The most recent release of the Chandra Source Catalog (CSC), version 2.0, comprises more than ~350,000 point sources, down to fluxes of ~10^-16 erg/cm^2/s and covering ~500 deg^2 of the sky, making it one of the best X-ray catalogs available to date. There are many reasons to seek multiwavelength counterparts for these sources; one is that X-ray information alone is not enough to identify the sources and separate those of galactic and extragalactic origin, so multiwavelength data associated with each X-ray source are crucial for classification and scientific analysis of the sample. To perform this multiwavelength association, we employ the recently released, versatile tool NWAY (Salvato et al. 2017), based on a Bayesian algorithm for cross-matching multiple catalogs. NWAY allows the combination of multiple catalogs at the same time, provides a probability for each match even in the case of non-detections due to the different depths of the matching catalogs, and can be used with priors on the nature of the sources (e.g. colors, magnitudes, etc.). In this poster, we present a preliminary analysis of the CSC sources above the galactic plane matched to the WISE All-Sky catalog, SDSS, Pan-STARRS, and GALEX.

  5. Sensor Fusion Techniques for Phased-Array Eddy Current and Phased-Array Ultrasound Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrowood, Lloyd F.

    Sensor (or data) fusion is the process of integrating multiple data sources to produce more consistent, accurate, and comprehensive information than is provided by a single data source. Sensor fusion may also be used to combine multiple signals from a single modality to improve the performance of a particular inspection technique. Industrial nondestructive testing may utilize multiple sensors to acquire inspection data, depending upon the object under inspection and the anticipated types of defects. Sensor fusion can be performed at various levels of signal abstraction, each with its strengths and weaknesses. A multimodal data fusion strategy, first proposed by Heideklang and Shokouhi, that combines spatially scattered detection locations to improve detection of surface-breaking and near-surface cracks in ferromagnetic metals is demonstrated on a surface inspection example and is then extended to volumetric inspections. Using data acquired with an Olympus Omniscan MX2 from both phased-array eddy current and ultrasound probes on test phantoms, single- and multilevel fusion techniques are employed to integrate signals from the two modalities. Preliminary results demonstrate how confidence in defect identification and interpretation benefits from sensor fusion techniques. Lastly, techniques for integrating data into radiographic and volumetric imagery from computed tomography are described and results are presented.
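
    At the decision level, fusing two modalities' detection confidences at a common location can be as simple as a noisy-OR combination under an independence assumption. This is a generic illustration, not the Heideklang-Shokouhi method, and the confidence values are hypothetical:

    ```python
    def fuse_noisy_or(probs):
        """Decision-level fusion: probability that at least one of several
        independent sensors correctly flags a defect, given each sensor's
        detection confidence at the same spatial location."""
        p_miss_all = 1.0
        for p in probs:
            p_miss_all *= (1.0 - p)
        return 1.0 - p_miss_all

    # Eddy-current and ultrasound each 70% confident at the same indication:
    fused_conf = fuse_noisy_or([0.7, 0.7])
    ```

    Real fusion schemes additionally cluster nearby detections spatially before combining, so that indications from different probes are matched to the same physical flaw.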

  6. Achieving high CRI from warm to super white

    NASA Astrophysics Data System (ADS)

    Bailey, Edward; Tormey, Ellen S.

    2007-09-01

    Light sources that produce a high color rendering index (CRI) have many applications in the lighting industry today. High color rendering accents the rich color that abounds in nature, interior design, theatrical costumes and props, clothing and fabric, jewelry, and machine vision applications. Multi-wavelength LED sources can pump phosphors at multiple Stokes-shift emission regimes and, when combined with selected direct-emission sources, allow greater flexibility in the production of warm-white and cool-white light of specialty interest. Unique solutions achieving R8 and R14 CRI >95 at 2850 K, 4750 K, 5250 K, and 6750 K are presented.

  7. Limiting the public cost of stationary battery deployment by combining applications

    NASA Astrophysics Data System (ADS)

    Stephan, A.; Battke, B.; Beuse, M. D.; Clausdeinken, J. H.; Schmidt, T. S.

    2016-07-01

    Batteries could be central to low-carbon energy systems with high shares of intermittent renewable energy sources. However, the investment attractiveness of batteries is still perceived as low, eliciting calls for policy to support deployment. Here we show how the cost of battery deployment can potentially be minimized by introducing an aspect that has been largely overlooked in policy debates and underlying analyses: the fact that a single battery can serve multiple applications. Batteries thereby can not only tap into different value streams, but also combine different risk exposures. To address this gap, we develop a techno-economic model and apply it to the case of lithium-ion batteries serving multiple stationary applications in Germany. Our results show that batteries could be attractive for investors even now if non-market barriers impeding the combination of applications were removed. The current policy debate should therefore be refocused so as to encompass the removal of such barriers.

  8. Finite-Length Line Source Superposition Model (FLLSSM)

    NASA Astrophysics Data System (ADS)

    1980-03-01

    A linearized thermal conduction model was developed to economically determine media temperatures in geologic repositories for nuclear wastes. Individual canisters containing either high-level waste or spent fuel assemblies were represented as finite-length line sources in a continuous medium. The combined effects of multiple canisters in a representative storage pattern were established at selected points of interest by superposition of the temperature rises calculated for each canister. The methodology is outlined, and the computer code FLLSSM, which performs the required numerical integrations and superposition operations, is described.
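
    Because the conduction model is linearized, the temperature rise from many canisters is just the sum of each canister's individual rise. The sketch below illustrates that superposition by discretizing each finite-length line source into point sources in a steady-state infinite medium; it is an illustrative simplification (FLLSSM performs proper transient numerical integrations), and the geometry, source strength, and conductivity are hypothetical:

    ```python
    import math

    def point_source_rise(q, k, r):
        """Steady-state temperature rise [K] of a point heat source of
        strength q [W] in an infinite medium of conductivity k [W/m/K]
        at distance r [m]."""
        return q / (4.0 * math.pi * k * r)

    def superposed_rise(point, canisters, q, k, n_seg=50):
        """Approximate each canister (a vertical finite-length line source
        from z0 to z1 at (cx, cy), total power q) by n_seg point sources,
        then superpose the temperature rises at `point`."""
        x, y, z = point
        total = 0.0
        for (cx, cy, z0, z1) in canisters:
            dq = q / n_seg
            for i in range(n_seg):
                zs = z0 + (i + 0.5) * (z1 - z0) / n_seg
                r = math.sqrt((x - cx)**2 + (y - cy)**2 + (z - zs)**2)
                total += point_source_rise(dq, k, r)
        return total

    # Superposition: the rise from two canisters equals the sum of the
    # rises from each canister alone.
    obs = (10.0, 0.0, 2.0)
    c1, c2 = (0.0, 0.0, 0.0, 4.0), (5.0, 5.0, 0.0, 4.0)
    both = superposed_rise(obs, [c1, c2], 1000.0, 2.5)
    ```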

  9. Measurement-device-independent quantum key distribution with multiple crystal heralded source with post-selection

    NASA Astrophysics Data System (ADS)

    Chen, Dong; Shang-Hong, Zhao; MengYi, Deng

    2018-03-01

    The multiple crystal heralded source with post-selection (MHPS), originally introduced to improve the single-photon character of the heralded source, has specific applications in quantum information protocols. In this paper, by combining decoy-state measurement-device-independent quantum key distribution (MDI-QKD) with the spontaneous parametric downconversion process, we present a modified MDI-QKD scheme with MHPS in which two architectures are proposed, corresponding to a symmetric scheme and an asymmetric scheme. The symmetric scheme, linked by photon switches in a log-tree structure, is adopted to overcome the limitation of the currently low efficiency of m-to-1 optical switches. The asymmetric scheme, which has a chained structure, is used to cope with the scalability issue the symmetric scheme suffers as the number of crystals increases. Numerical simulations show that our modified scheme has clear advantages in both transmission distance and key generation rate compared with the original MDI-QKD using a weak coherent source or a traditional heralded source with post-selection. Furthermore, recent advances in integrated photonics suggest that, if built into a single chip, the MHPS might be a practical alternative source for quantum key distribution tasks requiring single photons.

  10. ARTIP: Automated Radio Telescope Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging radio-interferometric data. ARTIP starts with raw data, i.e. a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral-line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts, and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets containing multiple spectral windows as well as multiple target sources that may have arbitrary combinations of flux/bandpass/phase calibrators.

  11. Source Term Estimation of Radioxenon Released from the Fukushima Dai-ichi Nuclear Reactors Using Measured Air Concentrations and Atmospheric Transport Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Biegalski, S.; Bowyer, Ted W.

    2014-01-01

    Systems designed to monitor airborne radionuclides released from underground nuclear explosions detected radioactive fallout from the Fukushima Daiichi nuclear accident in March 2011. Atmospheric transport modeling (ATM) of plumes of noble gases and particulates was performed soon after the accident to determine plausible detection locations of any radioactive releases to the atmosphere. We combine sampling data from multiple International Monitoring System (IMS) locations in a new way to estimate the magnitude and time sequence of the releases. Dilution factors from the modeled plume at five different detection locations were combined with 57 atmospheric concentration measurements of 133-Xe taken from March 18 to March 23 to estimate the source term. This approach estimates that 59% of the 1.24×10^19 Bq of 133-Xe present in the reactors at the time of the earthquake was released to the atmosphere over a three-day period. Source term estimates from combinations of detection sites have lower spread than estimates based on measurements at single detection sites. Sensitivity cases based on data from four or more detection locations bound the source term between 35% and 255% of the available xenon inventory.
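
    The inversion described here, combining modeled dilution factors with measured air concentrations to recover a release history, can be sketched as a linear least-squares problem: each measurement is a dilution-weighted sum of the releases in each time bin. All numbers below are synthetic illustrations, not the paper's data, and a real analysis would add noise models and non-negativity constraints:

    ```python
    import numpy as np

    # Hypothetical dilution factors D[j, t]: modeled concentration at
    # detector j per unit release in time bin t (s/m^3).
    D = np.array([[2.0e-9, 1.0e-9, 0.0],
                  [1.0e-9, 3.0e-9, 1.0e-9],
                  [0.0,    1.0e-9, 2.0e-9],
                  [1.0e-9, 1.0e-9, 1.0e-9]])
    s_true = np.array([5.0e9, 2.0e9, 1.0e9])   # release per time bin (Bq)
    c = D @ s_true                             # noise-free synthetic measurements

    # With noise-free data and more measurements than time bins, plain
    # least squares recovers the source term exactly.
    s_hat, *_ = np.linalg.lstsq(D, c, rcond=None)
    ```

    Using measurements from several detection sites at once (more rows in D) is what shrinks the spread of the estimate relative to single-site inversions.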

  12. Articulation Management for Intelligent Integration of Information

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    When combining data from distinct sources, there is a need to share metadata and other knowledge about the various source domains. Due to semantic inconsistencies and heterogeneity of representations, problems arise when multiple domains are merged: knowledge that is irrelevant to the task of interoperation is included, making the result unnecessarily complex. This heterogeneity problem can be eliminated by mediating the conflicts and managing the intersections of the domains. For interoperation and intelligent access to heterogeneous information, the focus is on the intersection of the knowledge, since the intersection defines the required articulation rules. An algebra over domains has been proposed that uses articulation rules to support disciplined manipulation of domain knowledge resources. The objective of a domain algebra is to provide the capability of interrogating many domain knowledge resources that are largely semantically disjoint. The algebra formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper presents a domain algebra and demonstrates the use of articulation rules to link declarative interfaces for Internet and enterprise applications. In particular, it discusses the articulation implementation as part of a production system capable of operating over the domain described by the IDL (interface description language) of objects registered in multiple CORBA servers.

  13. First-Degree Relatives of Young Children with Autism Spectrum Disorders: Some Gender Aspects

    ERIC Educational Resources Information Center

    Eriksson, Mats Anders; Westerlund, Joakim; Anderlid, Britt Marie; Gillberg, Christopher; Fernell, Elisabeth

    2012-01-01

    Prenatal risk factors, with special focus on gender distribution of neurodevelopmental and psychiatric conditions were analysed in first-degree relatives in a population-based group of young children with autism spectrum disorders (ASD). Multiple information sources were combined. This group was contrasted with the general population regarding…

  14. Astronomy with the Color Blind

    ERIC Educational Resources Information Center

    Smith, Donald A.; Melrose, Justyn

    2014-01-01

    The standard method to create dramatic color images in astrophotography is to record multiple black and white images, each with a different color filter in the optical path, and then tint each frame with a color appropriate to the corresponding filter. When combined, the resulting image conveys information about the sources of emission in the…

  15. Wildlife monitoring across multiple spatial scales using grid-based sampling

    Treesearch

    Kevin S. McKelvey; Samuel A. Cushman; Michael K. Schwartz; Leonard F. Ruggiero

    2009-01-01

    Recently, noninvasive genetic sampling has become the most effective way to reliably sample occurrence of many species. In addition, genetic data provide a rich data source enabling the monitoring of population status. The combination of genetically based animal data collected at known spatial coordinates with vegetation, topography, and other available covariates...

  16. Nocturnal air, road, and rail traffic noise and daytime cognitive performance and annoyance.

    PubMed

    Elmenhorst, Eva-Maria; Quehl, Julia; Müller, Uwe; Basner, Mathias

    2014-01-01

    Various studies indicate that, at the same noise level and during the daytime, annoyance increases in the order of rail, road, and aircraft noise. The present study investigates whether the same ranking holds for annoyance to nocturnal exposure and next-day cognitive performance. Annoyance ratings and performance change during combined noise exposure were also tested. In the laboratory, 72 participants were exposed to air, road, or rail traffic noise and all combinations. The number of noise events and the L_AS,eq were kept constant. Each morning, noise annoyance questionnaires and performance tasks were administered. Aircraft noise annoyance ranked first, followed by railway and road noise. A possible explanation is the longer duration of the aircraft noise events used in this study compared with the road and railway noise events. In contrast to road and rail traffic, aircraft noise annoyance was higher after nights with combined exposure. Pooled noise exposure data showed small but significant impairments in reaction times (6 ms) compared with nights without noise. The noise sources did not have a differential impact on performance. Combined exposure to multiple traffic noise sources did not induce stronger impairments than a single noise source, which was also reflected in low workload ratings.

  17. Evaluating the Use of Existing Data Sources, Probabilistic Linkage, and Multiple Imputation to Build Population-based Injury Databases Across Phases of Trauma Care

    PubMed Central

    Newgard, Craig; Malveau, Susan; Staudenmayer, Kristan; Wang, N. Ewen; Hsia, Renee Y.; Mann, N. Clay; Holmes, James F.; Kuppermann, Nathan; Haukoos, Jason S.; Bulger, Eileen M.; Dai, Mengtao; Cook, Lawrence J.

    2012-01-01

    Objectives The objective was to evaluate the process of using existing data sources, probabilistic linkage, and multiple imputation to create large population-based injury databases matched to outcomes. Methods This was a retrospective cohort study of injured children and adults transported by 94 emergency medical systems (EMS) agencies to 122 hospitals in seven regions of the western United States over a 36-month period (2006 to 2008). All injured patients evaluated by EMS personnel within specific geographic catchment areas were included, regardless of field disposition or outcome. The authors performed probabilistic linkage of EMS records to four hospital and postdischarge data sources (emergency department [ED] data, patient discharge data, trauma registries, and vital statistics files) and then handled missing values using multiple imputation. The authors compare and evaluate matched records, match rates (proportion of matches among eligible patients), and injury outcomes within and across sites. Results There were 381,719 injured patients evaluated by EMS personnel in the seven regions. Among transported patients, match rates ranged from 14.9% to 87.5% and were directly affected by the availability of hospital data sources and proportion of missing values for key linkage variables. For vital statistics records (1-year mortality), estimated match rates ranged from 88.0% to 98.7%. Use of multiple imputation (compared to complete case analysis) reduced bias for injury outcomes, although sample size, percentage missing, type of variable, and combined-site versus single-site imputation models all affected the resulting estimates and variance. Conclusions This project demonstrates the feasibility and describes the process of constructing population-based injury databases across multiple phases of care using existing data sources and commonly available analytic methods. Attention to key linkage variables and decisions for handling missing values can be used to increase match rates between data sources, minimize bias, and preserve sampling design. PMID:22506952
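
    Probabilistic linkage of the kind described is commonly scored with Fellegi-Sunter match weights: each linkage field contributes log2(m/u) when it agrees and log2((1-m)/(1-u)) when it disagrees, where m = P(agree | true match) and u = P(agree | non-match). A sketch with hypothetical field parameters (not those of this study):

    ```python
    import math

    def match_weight(agreements, m_probs, u_probs):
        """Fellegi-Sunter composite weight for one candidate record pair:
        sum of per-field agreement/disagreement log-likelihood ratios."""
        w = 0.0
        for agree, m, u in zip(agreements, m_probs, u_probs):
            w += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
        return w

    # Hypothetical parameters for three linkage fields:
    # date of birth, sex, destination hospital.
    m = [0.95, 0.98, 0.90]
    u = [0.01, 0.50, 0.10]
    w_all_agree = match_weight([True, True, True], m, u)
    w_dob_off   = match_weight([False, True, True], m, u)
    # Pairs with weights above an upper threshold are declared matches;
    # pairs between thresholds go to clerical review.
    ```

    A rare-by-chance field like date of birth (low u) carries far more weight than sex, which agrees half the time between random pairs.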

  18. A Simulated Environment Experiment on Annoyance Due to Combined Road Traffic and Industrial Noises

    PubMed Central

    Marquis-Favre, Catherine; Morel, Julien

    2015-01-01

    Total annoyance due to combined noises is still difficult to predict adequately. This scientific gap is an obstacle for noise action planning, especially in urban areas where inhabitants are usually exposed to high noise levels from multiple sources. In this context, this work aims to highlight potential to enhance the prediction of total annoyance. The work is based on a simulated environment experiment where participants performed activities in a living room while exposed to combined road traffic and industrial noises. The first objective of the experiment presented in this paper was to gain further understanding of the effects on annoyance of some acoustical factors, non-acoustical factors and potential interactions between the combined noise sources. The second one was to assess total annoyance models constructed from the data collected during the experiment and tested using data gathered in situ. The results obtained in this work highlighted the superiority of perceptual models. In particular, perceptual models with an interaction term seemed to be the best predictors for the two combined noise sources under study, even with high differences in sound pressure level. Thus, these results reinforced the need to focus on perceptual models and to improve the prediction of partial annoyances. PMID:26197326

  19. Bias-field controlled phasing and power combination of gyromagnetic nonlinear transmission lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reale, D. V., E-mail: david.reale@ttu.edu; Bragg, J.-W. B.; Gonsalves, N. R.

    2014-05-15

    Gyromagnetic Nonlinear Transmission Lines (NLTLs) generate microwaves through the damped gyromagnetic precession of the magnetic moments in ferrimagnetic material, and are thus utilized as compact, solid-state, frequency agile, high power microwave (HPM) sources. The output frequency of a NLTL can be adjusted by control of the externally applied bias field and incident voltage pulse without physical alteration to the structure of the device. This property provides a frequency tuning capability not seen in many conventional e-beam based HPM sources. The NLTLs developed and tested are mesoband sources capable of generating MW power levels in the L, S, and C bands of the microwave spectrum. For an individual NLTL the output power at a given frequency is determined by several factors including the intrinsic properties of the ferrimagnetic material and the transmission line structure. Hence, if higher power levels are to be achieved, it is necessary to combine the outputs of multiple NLTLs. This can be accomplished in free space using antennas or in a transmission line via a power combiner. Using a bias-field controlled delay, a transient, high voltage, coaxial, three port, power combiner was designed and tested. Experimental results are compared with the results of a transient COMSOL simulation to evaluate combiner performance.

  20. Bias-field controlled phasing and power combination of gyromagnetic nonlinear transmission lines.

    PubMed

    Reale, D V; Bragg, J-W B; Gonsalves, N R; Johnson, J M; Neuber, A A; Dickens, J C; Mankowski, J J

    2014-05-01

    Gyromagnetic Nonlinear Transmission Lines (NLTLs) generate microwaves through the damped gyromagnetic precession of the magnetic moments in ferrimagnetic material, and are thus utilized as compact, solid-state, frequency agile, high power microwave (HPM) sources. The output frequency of a NLTL can be adjusted by control of the externally applied bias field and incident voltage pulse without physical alteration to the structure of the device. This property provides a frequency tuning capability not seen in many conventional e-beam based HPM sources. The NLTLs developed and tested are mesoband sources capable of generating MW power levels in the L, S, and C bands of the microwave spectrum. For an individual NLTL the output power at a given frequency is determined by several factors including the intrinsic properties of the ferrimagnetic material and the transmission line structure. Hence, if higher power levels are to be achieved, it is necessary to combine the outputs of multiple NLTLs. This can be accomplished in free space using antennas or in a transmission line via a power combiner. Using a bias-field controlled delay, a transient, high voltage, coaxial, three port, power combiner was designed and tested. Experimental results are compared with the results of a transient COMSOL simulation to evaluate combiner performance.

  1. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  2. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  3. Multiplicity of High-z Submillimeter Galaxies from Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Ball, David; Narayanan, Desika; Hopkins, Philip F.; Turk, Matthew

    2015-01-01

    Sub-millimeter galaxies (SMGs) are some of the most luminous galaxies in the universe, yet are nearly invisible at optical wavelengths. Theorists have long struggled to simulate SMGs and accurately match their spectral properties and abundance to observations. Recent high-resolution observations, however, suggest that what were previously thought to be single sub-millimeter sources on the sky may break up into multiple components when viewed with sufficient resolving power. Here, we present a combination of high-resolution cosmological hydrodynamic zoom simulations of massive galaxies in formation with a new dust radiative transfer package in order to understand this multiplicity in simulated SMGs. We find that multiplicity is a natural element of SMG formation, as numerous subhalos bombard the central galaxy during its peak growth phase.

  4. Optical frequency switching scheme for a high-speed broadband THz measurement system based on the photomixing technique.

    PubMed

    Song, Hajun; Hwang, Sejin; Song, Jong-In

    2017-05-15

    This study presents an optical frequency switching scheme for a high-speed broadband terahertz (THz) measurement system based on the photomixing technique. The proposed system can achieve high-speed broadband THz measurements using narrow optical frequency scanning of a tunable laser source combined with a wavelength-switchable laser source. In addition, this scheme can provide a larger output power of an individual THz signal compared with that of a multi-mode THz signal generated by multiple CW laser sources. A swept-source THz tomography system implemented with a two-channel wavelength-switchable laser source achieves a reduced time for acquisition of a point spread function and a higher depth resolution in the same amount of measurement time compared with a system with a single optical source.

  5. Integrated Model of Multiple Kernel Learning and Differential Evolution for EUR/USD Trading

    PubMed Central

    Deng, Shangkun; Sakurai, Akito

    2014-01-01

    Currency trading is an important area for individual investors, government policy decisions, and organization investments. In this study, we propose a hybrid approach referred to as MKL-DE, which combines multiple kernel learning (MKL) with differential evolution (DE) for trading a currency pair. MKL is used to learn a model that predicts changes in the target currency pair, whereas DE is used to generate the buy and sell signals for the target currency pair based on the relative strength index (RSI), which DE also combines with the MKL prediction to form the final trading signal. The new hybrid implementation is applied to EUR/USD trading, which is the most traded foreign exchange (FX) currency pair. MKL is essential for utilizing information from multiple information sources and DE is essential for formulating a trading rule based on a mixture of discrete structures and continuous parameters. Initially, the prediction model optimized by MKL predicts the returns based on a technical indicator called the moving average convergence and divergence. Next, a combined trading signal is optimized by DE using the inputs from the prediction model and the technical indicator RSI obtained from multiple timeframes. The experimental results showed that trading using the prediction learned by MKL yielded consistent profits. PMID:25097891
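    The RSI mentioned in the abstract is a standard technical indicator; a minimal sketch of a simple-average RSI over the trailing window follows (the function name and default period are illustrative, not taken from the paper):

    ```python
    def rsi(prices, period=14):
        """Relative Strength Index over the last `period` price changes (simple averages)."""
        deltas = [b - a for a, b in zip(prices, prices[1:])][-period:]
        avg_gain = sum(d for d in deltas if d > 0) / period
        avg_loss = sum(-d for d in deltas if d < 0) / period
        if avg_loss == 0:  # no losses in the window: maximally overbought
            return 100.0
        rs = avg_gain / avg_loss
        return 100.0 - 100.0 / (1.0 + rs)
    ```

    Conventionally, readings above about 70 are read as overbought and below about 30 as oversold, which is the kind of raw signal a DE-evolved rule could map to buy/sell decisions.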

  6. Collective odor source estimation and search in time-variant airflow environments using mobile robots.

    PubMed

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots' search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot's detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection-diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method.
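    The particle-swarm coordination step described above can be illustrated with a generic one-dimensional PSO maximizing a fitness function that stands in for the estimated odor-source probability map (all names, constants, and the toy fitness are illustrative, not the paper's):

    ```python
    import random

    def pso(fitness, n=10, iters=100, lo=0.0, hi=10.0, seed=0):
        """Maximize `fitness` with a basic particle swarm; returns the global best position."""
        rng = random.Random(seed)
        pos = [rng.uniform(lo, hi) for _ in range(n)]
        vel = [0.0] * n
        pbest = pos[:]                      # personal best positions
        gbest = max(pos, key=fitness)       # global best position
        for _ in range(iters):
            for i in range(n):
                r1, r2 = rng.random(), rng.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
                pos[i] += vel[i]
                if fitness(pos[i]) > fitness(pbest[i]):
                    pbest[i] = pos[i]
                if fitness(pos[i]) > fitness(gbest):
                    gbest = pos[i]
        return gbest
    ```

    In the OSL setting, the fitness at a position would be looked up from the fused odor-source probability map rather than computed from a closed-form function.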

  7. Collective Odor Source Estimation and Search in Time-Variant Airflow Environments Using Mobile Robots

    PubMed Central

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots’ search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot’s detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection–diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method. PMID:22346650

  8. Site Area Boundaries

    EPA Pesticide Factsheets

    This dataset consists of site boundaries from multiple Superfund sites in U.S. EPA Region 8. These data were acquired from multiple sources at different times and were combined into one region-wide layer. Thus far the sources include:
    1. California Gulch (Irrigated Meadows) - ESAT Contractor.
    2. Manning Canyon - U.S. EPA Region 8; ESAT Contractor.
    3. Rapid City Small Arms Range - U.S. EPA Region 8; ESAT Contractor.
    4. Animas River/Cement Creek - U.S. EPA Region 8; ESAT Contractor.
    5. Monticello Mill Tailings (USDOE) - USDOE; ESAT Contractor.
    6. Pinon Canyon - USDOD.
    7. Rocky Flats Industrial Park - U.S. EPA Region 8.
    8. Bountiful/Woods Cross - U.S. EPA Region 8.
    9. Lincoln Park - U.S. EPA Region 8.
    10. Marshall Landfill - U.S. EPA Region 8.
    11. U.S. Magnesium - Pacific Western Technologies Inc.

  9. Effects of correlated noise on the full-spectrum combining and complex-symbol combining arraying techniques

    NASA Technical Reports Server (NTRS)

    Vazirani, P.

    1995-01-01

    The process of combining telemetry signals received at multiple antennas, commonly referred to as arraying, can be used to improve communication link performance in the Deep Space Network (DSN). By coherently adding telemetry from multiple receiving sites, arraying produces an enhancement in signal-to-noise ratio (SNR) over that achievable with any single antenna in the array. A number of different techniques for arraying have been proposed and their performances analyzed in past literature. These analyses have compared different arraying schemes under the assumption that the signals contain additive white Gaussian noise (AWGN) and that the noise observed at distinct antennas is independent. In situations where an unwanted background body is visible to multiple antennas in the array, however, the assumption of independent noises is no longer applicable. A planet with significant radiation emissions in the frequency band of interest can be one such source of correlated noise. For example, during much of Galileo's tour of Jupiter, the planet will contribute significantly to the total system noise at various ground stations. This article analyzes the effects of correlated noise on two arraying schemes currently being considered for DSN applications: full-spectrum combining (FSC) and complex-symbol combining (CSC). A framework is presented for characterizing the correlated noise based on physical parameters, and the impact of the noise correlation on the array performance is assessed for each scheme.

  10. Marker-assisted combination of major genes for pathogen resistance in potato.

    PubMed

    Gebhardt, C; Bellin, D; Henselewski, H; Lehmann, W; Schwarzfischer, J; Valkonen, J P T

    2006-05-01

    Closely linked PCR-based markers facilitate the tracing and combining of resistance factors that have been introgressed previously into cultivated potato from different sources. Crosses were performed to combine the Ry(adg) gene for extreme resistance to Potato virus Y (PVY) with the Gro1 gene for resistance to the root cyst nematode Globodera rostochiensis and the Rx1 gene for extreme resistance to Potato virus X (PVX), or with resistance to potato wart (Synchytrium endobioticum). Marker-assisted selection (MAS) using four PCR-based diagnostic assays was applied to 110 F1 hybrids resulting from four 2x by 4x cross-combinations. Thirty tetraploid plants having the appropriate marker combinations were selected and tested for presence of the corresponding resistance traits. All plants tested showed the expected resistant phenotype. Unexpectedly, the plants segregated for additional resistance to pathotypes 1, 2 and 6 of S. endobioticum, which was subsequently shown to be inherited from the PVY resistant parents of the crosses. The selected plants can be used as sources of multiple resistance traits in pedigree breeding and are available from a potato germplasm bank.
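    The marker-assisted selection step above amounts to keeping only the genotypes whose diagnostic assays detect every required marker; a minimal sketch, with marker names written as plain identifiers purely for illustration:

    ```python
    def select_by_markers(plants, required):
        """Return plants whose diagnostic marker set contains every required marker."""
        return [name for name, markers in plants.items() if required <= markers]
    ```

    Here `required <= markers` is Python's subset test on sets, so a plant passes only if all required markers were scored positive.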

  11. The combination of energy-dependent internal adaptation mechanisms and external factors enables Listeria monocytogenes to express a strong starvation survival response during multiple-nutrient starvation.

    PubMed

    Lungu, Bwalya; Saldivar, Joshua C; Story, Robert; Ricke, Steven C; Johnson, Michael G

    2010-05-01

    The goal of this study was to characterize the starvation survival response (SSR) of a wild-type Listeria monocytogenes 10403S and an isogenic ΔsigB mutant strain during multiple-nutrient starvation conditions over 28 days. This study examined the effects of inhibitors of protein synthesis, the proton motive force, substrate-level phosphorylation, and oxidative phosphorylation on the SSR of L. monocytogenes 10403S and a ΔsigB mutant during multiple-nutrient starvation. The effects of starvation buffer changes on viability were also examined. During multiple-nutrient starvation, both strains expressed a strong SSR, suggesting that L. monocytogenes possesses SigB-independent mechanism(s) for survival during multiple-nutrient starvation. Neither strain was able to express an SSR following starvation buffer changes, indicating that the nutrients/factors present in the starvation buffer could be a source of energy for cell maintenance and survival. Neither the wild-type nor the ΔsigB mutant strain was able to elicit an SSR when exposed to the protein synthesis inhibitor chloramphenicol within the first 4 h of starvation. However, both strains expressed an SSR when exposed to chloramphenicol after 6 h or more of starvation, suggesting that the majority of proteins required to elicit an effective SSR in L. monocytogenes are likely produced somewhere between 4 and 6 h of starvation. The varying SSRs of both strains to the different metabolic inhibitors under aerobic or anaerobic conditions suggested that (1) energy derived from the proton motive force is important for an effective SSR, (2) L. monocytogenes utilizes anaerobic electron transport during multiple-nutrient starvation conditions, and (3) the glycolytic pathway is an important energy source during multiple-nutrient starvation when oxygen is available, and less important under anaerobic conditions. Collectively, the data suggest that the combination of energy-dependent internal adaptation mechanisms of cells and external nutrients/factors enables L. monocytogenes to express a strong SSR.

  12. Efficient methods for joint estimation of multiple fundamental frequencies in music signals

    NASA Astrophysics Data System (ADS)

    Pertusa, Antonio; Iñesta, José M.

    2012-12-01

    This study presents efficient techniques for multiple fundamental frequency estimation in music signals. The proposed methodology can infer harmonic patterns from a mixture considering interactions with other sources and evaluate them in a joint estimation scheme. For this purpose, a set of fundamental frequency candidates are first selected at each frame, and several hypothetical combinations of them are generated. Combinations are independently evaluated, and the most likely is selected taking into account the intensity and spectral smoothness of its inferred patterns. The method is extended considering adjacent frames in order to smooth the detection in time, and a pitch tracking stage is finally performed to increase the temporal coherence. The proposed algorithms were evaluated in MIREX contests yielding state of the art results with a very low computational burden.
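    The joint-estimation idea above, scoring hypothetical combinations of candidates rather than each candidate alone, can be illustrated with a brute-force sketch; the scoring function here is a placeholder for the paper's intensity and spectral-smoothness criterion:

    ```python
    import itertools

    def best_combination(candidates, score, max_size=3):
        """Exhaustively evaluate all candidate subsets up to max_size; return the best-scoring one."""
        best, best_score = (), float("-inf")
        for k in range(1, max_size + 1):
            for combo in itertools.combinations(candidates, k):
                s = score(combo)
                if s > best_score:
                    best, best_score = combo, s
        return best
    ```

    The actual method keeps this search tractable by preselecting a small set of F0 candidates per frame and then smoothing the winning hypotheses across adjacent frames.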

  13. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed

    Stout, N; Bell, C

    1991-06-01

    The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports, 32%. Variations by state and value added through the use of multiple sources are presented and discussed. This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems.
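    The capture-rate calculation behind these figures is straightforward once a multi-source universe has been assembled; a sketch with hypothetical case identifiers:

    ```python
    def capture_rates(sources):
        """Union all sources into the case universe; report each source's share of it."""
        universe = set().union(*sources.values())
        return {name: len(cases) / len(universe) for name, cases in sources.items()}
    ```

    Each source's rate is the fraction of the union it identifies, which is why no single source can score 100% unless it alone captures every case found by any source.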

  14. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed Central

    Stout, N; Bell, C

    1991-01-01

    BACKGROUND: The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. METHODS: At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. RESULTS: The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports 32%. Variations by state and value added through the use of multiple sources are presented and discussed. CONCLUSIONS: This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems. PMID:1827569

  15. Swallow segmentation with artificial neural networks and multi-sensor fusion.

    PubMed

    Lee, Joon; Steele, Catriona M; Chau, Tom

    2009-11-01

    Swallow segmentation is a critical precursory step to the analysis of swallowing signal characteristics. In an effort to automatically segment swallows, we investigated artificial neural networks (ANN) with information from cervical dual-axis accelerometry, submental mechanomyography (MMG), and nasal airflow. Our objectives were (1) to investigate the relationship between segmentation performance and the number of signal sources and (2) to identify the signals or signal combinations most useful for swallow segmentation. Signals were acquired from 17 healthy adults in both discrete and continuous swallowing tasks using five stimuli. Training and test feature vectors were constructed with variances from single or multiple signals, estimated within 200 ms moving windows with 50% overlap. Corresponding binary target labels (swallow or non-swallow) were derived by manual segmentation. A separate 3-layer ANN was trained for each participant-signal combination, and all possible signal combinations were investigated. As more signal sources were included, segmentation performance improved in terms of sensitivity, specificity, accuracy, and adjusted accuracy. The combination of all four signal sources achieved the highest mean accuracy and adjusted accuracy of 88.5% and 89.6%, respectively. Anterior-posterior (A-P) accelerometry proved to be the most discriminatory source, while the inclusion of MMG or nasal airflow resulted in the least performance improvement. These findings suggest that an ANN, multi-sensor fusion approach to segmentation is worthy of further investigation in swallowing studies.
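    The feature construction described, per-window signal variances with 50% overlap, can be sketched directly; the window length is given in samples and stands in for the paper's 200 ms window:

    ```python
    def window_variances(signal, win, step):
        """Variance of each length-`win` window, advancing by `step` samples
        (50% overlap when step == win // 2)."""
        feats = []
        for start in range(0, len(signal) - win + 1, step):
            seg = signal[start:start + win]
            mean = sum(seg) / win
            feats.append(sum((x - mean) ** 2 for x in seg) / win)
        return feats
    ```

    One such variance sequence per sensor, concatenated across sensors, would form the multi-source feature vectors fed to the ANN.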

  16. Flood extent and water level estimation from SAR using data-model integration

    NASA Astrophysics Data System (ADS)

    Ajadi, O. A.; Meyer, F. J.

    2017-12-01

    Synthetic Aperture Radar (SAR) images have long been recognized as a valuable data source for flood mapping. Compared to other sources, SAR's weather and illumination independence and large area coverage at high spatial resolution supports reliable, frequent, and detailed observations of developing flood events. Accordingly, SAR has the potential to greatly aid in the near real-time monitoring of natural hazards, such as flood detection, if combined with automated image processing. This research works towards increasing the reliability and temporal sampling of SAR-derived flood hazard information by integrating information from multiple SAR sensors and SAR modalities (images and Interferometric SAR (InSAR) coherence) and by combining SAR-derived change detection information with hydrologic and hydraulic flood forecast models. First, the combination of multi-temporal SAR intensity images and coherence information for generating flood extent maps is introduced. The application of least-squares estimation integrates flood information from multiple SAR sensors, thus increasing the temporal sampling. SAR-based flood extent information will be combined with a Digital Elevation Model (DEM) to reduce false alarms and to estimate water depth and flood volume. The SAR-based flood extent map is assimilated into the Hydrologic Engineering Center River Analysis System (HEC-RAS) model to aid in hydraulic model calibration. The developed technology improves the accuracy of flood information by exploiting information from data and models. It also provides enhanced flood information to decision-makers, supporting flood response and improving emergency relief efforts.
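    A common building block for the kind of multi-temporal SAR change detection described above is the log-ratio of pre- and post-event backscatter intensities; a per-pixel sketch, with the threshold value purely illustrative:

    ```python
    import math

    def change_mask(pre, post, thresh=0.5):
        """Flag pixels whose absolute log intensity ratio exceeds the threshold
        (candidate flood-induced change)."""
        return [abs(math.log(b / a)) > thresh for a, b in zip(pre, post)]
    ```

    The log-ratio is preferred over a plain difference for SAR because it symmetrizes multiplicative speckle; in the full pipeline such masks from several sensors would be fused and screened against the DEM.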

  17. Developing CCUS system models to handle the complexity of multiple sources and sinks: An update on Tasks 5.3 and 5.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Richard Stephen

    2017-05-22

    This presentation is part of US-China Clean Coal project and describes the impact of power plant cycling, techno economic modeling of combined IGCC and CCS, integrated capacity generation decision making for power utilities, and a new decision support tool for integrated assessment of CCUS.

  18. Complex within complex: integrative taxonomy reveals hidden diversity in Cicadetta brevipennis (Hemiptera: Cicadidae) and unexpected relationships with a song divergent relative

    USDA-ARS?s Scientific Manuscript database

    Multiple sources of data in combination are essential for species delimitation and classification of difficult taxonomic groups. Here we investigate a cicada taxon with unusual cryptic diversity and we attempt to resolve seemingly contradictory data sets. Cicada songs act as species-specific premati...

  19. Mashing up Multiple Web Feeds Using Yahoo! Pipes

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2007-01-01

    Pipes is an interactive data aggregator and manipulator that lets you mashup your favorite online data sources. Pipes could be used to "combine many feeds into one, then sort, filter and translate to create your ultimate custom feed." In this article, the author describes how to use Yahoo! Pipes. The author shares what she has learned in…

  20. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and limited lifetimes of satellite monitoring instruments demands that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  1. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-02

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.

  2. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the “source-pathway-target” in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  3. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  4. True color scanning laser ophthalmoscopy and optical coherence tomography handheld probe

    PubMed Central

    LaRocca, Francesco; Nankivil, Derek; Farsiu, Sina; Izatt, Joseph A.

    2014-01-01

    Scanning laser ophthalmoscopes (SLOs) are able to achieve superior contrast and axial sectioning capability compared to fundus photography. However, SLOs typically use monochromatic illumination and are thus unable to extract color information of the retina. Previous color SLO imaging techniques utilized multiple lasers or narrow band sources for illumination, which allowed for multiple color but not “true color” imaging as done in fundus photography. We describe the first “true color” SLO, handheld color SLO, and combined color SLO integrated with a spectral domain optical coherence tomography (OCT) system. To achieve accurate color imaging, the SLO was calibrated with a color test target and utilized an achromatizing lens when imaging the retina to correct for the eye’s longitudinal chromatic aberration. Color SLO and OCT images from volunteers were then acquired simultaneously with a combined power under the ANSI limit. Images from this system were then compared with those from commercially available SLOs featuring multiple narrow-band color imaging. PMID:25401032

  5. Origins of extrinsic variability in eukaryotic gene expression

    NASA Astrophysics Data System (ADS)

    Volfson, Dmitri; Marciniak, Jennifer; Blake, William J.; Ostroff, Natalie; Tsimring, Lev S.; Hasty, Jeff

    2006-02-01

    Variable gene expression within a clonal population of cells has been implicated in a number of important processes including mutation and evolution, determination of cell fates and the development of genetic disease. Recent studies have demonstrated that a significant component of expression variability arises from extrinsic factors thought to influence multiple genes simultaneously, yet the biological origins of this extrinsic variability have received little attention. Here we combine computational modelling with fluorescence data generated from multiple promoter-gene inserts in Saccharomyces cerevisiae to identify two major sources of extrinsic variability. One unavoidable source arising from the coupling of gene expression with population dynamics leads to a ubiquitous lower limit for expression variability. A second source, which is modelled as originating from a common upstream transcription factor, exemplifies how regulatory networks can convert noise in upstream regulator expression into extrinsic noise at the output of a target gene. Our results highlight the importance of the interplay of gene regulatory networks with population heterogeneity for understanding the origins of cellular diversity.
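    As background to the intrinsic/extrinsic split discussed above, the standard dual-reporter decomposition (intrinsic noise from the difference between two identical reporters, extrinsic from their covariance) can be sketched as follows; this is the textbook formulation, offered as context rather than the authors' exact computation:

    ```python
    def noise_decomposition(r1, r2):
        """Split expression variability of two identical reporters per cell into
        intrinsic and extrinsic components."""
        n = len(r1)
        m1, m2 = sum(r1) / n, sum(r2) / n
        extrinsic = sum((a - m1) * (b - m2) for a, b in zip(r1, r2)) / n  # covariance
        intrinsic = sum((a - b) ** 2 for a, b in zip(r1, r2)) / (2 * n)
        return intrinsic, extrinsic
    ```

    Perfectly correlated reporters yield zero intrinsic noise, with all variability attributed to extrinsic factors shared across genes.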

  6. Origins of extrinsic variability in eukaryotic gene expression

    NASA Astrophysics Data System (ADS)

    Volfson, Dmitri; Marciniak, Jennifer; Blake, William J.; Ostroff, Natalie; Tsimring, Lev S.; Hasty, Jeff

    2006-03-01

    Variable gene expression within a clonal population of cells has been implicated in a number of important processes including mutation and evolution, determination of cell fates and the development of genetic disease. Recent studies have demonstrated that a significant component of expression variability arises from extrinsic factors thought to influence multiple genes in concert, yet the biological origins of this extrinsic variability have received little attention. Here we combine computational modeling with fluorescence data generated from multiple promoter-gene inserts in Saccharomyces cerevisiae to identify two major sources of extrinsic variability. One unavoidable source arising from the coupling of gene expression with population dynamics leads to a ubiquitous noise floor in expression variability. A second source which is modeled as originating from a common upstream transcription factor exemplifies how regulatory networks can convert noise in upstream regulator expression into extrinsic noise at the output of a target gene. Our results highlight the importance of the interplay of gene regulatory networks with population heterogeneity for understanding the origins of cellular diversity.

  7. Human annoyance and reactions to hotel room specific noises

    NASA Astrophysics Data System (ADS)

    Everhard, Ian L.

    2004-05-01

A new formula is presented in which multiple annoyance sources and the transmission loss values of any partition are combined to produce a single-number rating of annoyance. The formula is grounded in theoretical psychoacoustics, with survey testing used to derive the variables that weight the results. A hypothetical hotel room is processed through the new formula and rated on the basis of theoretical survey results that would be collected from guests of the hotel. The new single-number rating compares the multiple sources of annoyance to a single idealized unbiased source for which absolute level is the only factor stimulating a linear rise in annoyance [Fidell et al., J. Acoust. Soc. Am. 66, 1427 (1979); D. M. Jones and D. E. Broadbent, ``Human performance and noise,'' in Handbook of Noise Control, 3rd ed., edited by C. M. Harris (ASA, New York, 1998), Chap. 24; J. P. Conroy and J. S. Roland, ``STC Field Testing and Results,'' in Sound and Vibration Magazine, Acoustical Publications, pp. 10-15 (July 2003)].

  8. A PC-based telemetry system for acquiring and reducing data from multiple PCM streams

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1991-07-01

The Solar Energy Research Institute's (SERI) Wind Research Program is using Pulse Code Modulation (PCM) Telemetry Data-Acquisition Systems to study horizontal-axis wind turbines. Many PCM systems are combined for use in test installations that require accurate measurements from a variety of different locations. SERI has found them ideal for data acquisition from multiple wind turbines and meteorological towers in wind parks. A major problem has been providing the capability to quickly combine and examine incoming data from multiple PCM sources in the field. To solve this problem, SERI has developed a low-cost PC-based PCM Telemetry Data-Reduction System (PC-PCM System) to facilitate quick, in-the-field multiple-channel data analysis. The PC-PCM System consists of two basic components. First, PC-compatible hardware boards are used to decode and combine multiple PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for use under DOS was developed to simplify data-acquisition control and management. The software, called the Quick-Look Data Management Program, provides a quick, easy-to-use interface between the PC and multiple PCM data streams. The Quick-Look Data Management Program is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. The paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in an experimental test environment for quickly examining and verifying incoming data from multiple PCM streams. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.

  9. An optimal merging technique for high-resolution precipitation products: OPTIMAL MERGING OF PRECIPITATION METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Roshan; Houser, Paul R.; Anantharaj, Valentine G.

    2011-04-01

Precipitation products are currently available from various sources at higher spatial and temporal resolution than at any time in the past. Each precipitation product has its strengths and weaknesses in availability, accuracy, resolution, retrieval techniques and quality control. By merging precipitation data obtained from multiple sources, one can improve the information content by minimizing these issues. However, precipitation data merging poses challenges of scale mismatch and of accurate error and bias assessment. In this paper we present Optimal Merging of Precipitation (OMP), a new method to merge precipitation data from multiple sources that are of different spatial and temporal resolutions and accuracies. This method is a combination of scale conversion and merging-weight optimization, involving performance tracing based on Bayesian statistics and trend analysis, which yields merging weights for each precipitation data source. The weights are optimized at multiple scales to facilitate multiscale merging and better precipitation downscaling. Precipitation data used in the experiment include products from the 12-km resolution North American Land Data Assimilation System (NLDAS), the 8-km resolution CMORPH and the 4-km resolution National Stage-IV QPE. The test cases demonstrate that the OMP method is capable of identifying better data sources and allocating higher priority to them in the merging procedure, dynamically over the region and time period. This method is also effective in filtering out poor-quality data introduced into the merging process.
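The weighted-merging idea in the abstract above can be illustrated with a much simpler scheme: weighting each co-registered source by its inverse error variance, so that more accurate sources dominate. This is a minimal sketch under that assumption, not the OMP method itself (which optimizes weights via Bayesian performance tracing at multiple scales); all values are invented.

```python
# Minimal sketch: inverse-error-variance weighted merging of co-registered
# precipitation estimates for a single grid cell. Illustrative only; the
# OMP method's Bayesian performance tracing is not reproduced here.

def merge_precip(fields, error_vars):
    """Merge several precipitation estimates for one grid cell.

    Each source is weighted by the inverse of its error variance, then the
    weights are normalized to sum to 1 before combining the field values.
    """
    weights = [1.0 / v for v in error_vars]
    total = sum(weights)
    weights = [w / total for w in weights]          # normalize to sum to 1
    merged = sum(w * f for w, f in zip(weights, fields))
    return merged, weights

# Three sources report rainfall (mm/h) for the same cell; the third source
# has the smallest error variance and so receives the largest weight.
value, w = merge_precip([4.0, 5.0, 6.0], error_vars=[1.0, 4.0, 0.5])
```

In a full merging system this weighting would be computed per scale and per region, which is the role OMP's weight optimization plays.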

  10. Searching Across the International Space Station Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; McDermott, William J.; Smith, Ernest E.; Bell, David G.; Gurram, Mohana

    2007-01-01

Data access in the enterprise generally requires combining data from different sources and different formats. It is thus advantageous to focus on the intersection of the knowledge across sources and domains; keeping irrelevant knowledge around only serves to make the integration more unwieldy and more complicated than necessary. A context search over multiple domains is proposed in this paper, using context-sensitive queries to support disciplined manipulation of domain knowledge resources. The objective of a context search is to provide the capability for interrogating many domain knowledge resources, which are largely semantically disjoint. The search formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper demonstrates a new paradigm in composition of information for enterprise applications. In particular, it discusses an approach to achieving data integration across multiple sources in a manner that does not require heavy investment in database and middleware maintenance. This lean approach to integration leads to cost-effectiveness and scalability of data integration with an underlying schemaless object-relational database management system. This highly scalable, information-on-demand framework, called NX-Search, is an implementation of an information system built on NETMARK, a flexible, high-throughput open database integration framework for managing, storing, and searching unstructured or semi-structured arbitrary XML and HTML that is used widely at the National Aeronautics and Space Administration (NASA) and in industry.

  11. Design of optical element combining Fresnel lens with microlens array for uniform light-emitting diode lighting.

    PubMed

    Wang, Guangzhen; Wang, Lili; Li, Fuli; Kong, Depeng

    2012-09-01

One kind of optical element, combining a Fresnel lens with a microlens array, is simply designed for LED lighting based on geometrical optics and nonimaging optics. This design method imposes no restriction on the source intensity pattern. The designed element has a compact construction and can produce multiple shapes of illumination distribution. Taking square lighting as an example, tolerance analysis is carried out to determine tolerance limits for applying the element in the assembly process. The element can produce both on-axis and off-axis lighting.

  12. Graphical function mapping as a new way to explore cause-and-effect chains

    USGS Publications Warehouse

    Evans, Mary Anne

    2016-01-01

    Graphical function mapping provides a simple method for improving communication within interdisciplinary research teams and between scientists and nonscientists. This article introduces graphical function mapping using two examples and discusses its usefulness. Function mapping projects the outcome of one function into another to show the combined effect. Using this mathematical property in a simpler, even cartoon-like, graphical way allows the rapid combination of multiple information sources (models, empirical data, expert judgment, and guesses) in an intuitive visual to promote further discussion, scenario development, and clear communication.
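Function mapping as described above is essentially function composition: the outcome of one relation is projected into another to show the combined cause-and-effect chain. A minimal sketch, in which both relations and their coefficients are entirely hypothetical:

```python
# Minimal sketch of graphical function mapping: the output of one relation
# (nutrient load -> algal biomass, say from empirical data) feeds a second
# (algal biomass -> water clarity, say from expert judgment), showing the
# combined effect. Both functions are hypothetical.

def algal_biomass(nutrient_load):          # empirical: more nutrients, more algae
    return 2.0 * nutrient_load

def water_clarity(biomass):                # expert judgment: clarity falls with algae
    return max(0.0, 10.0 - 0.5 * biomass)

def combined(nutrient_load):
    """Project the first function's outcome into the second."""
    return water_clarity(algal_biomass(nutrient_load))
```

Plotting each function as a simple curve and chaining them graphically is what lets mixed audiences discuss the combined relationship without the algebra.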

  13. Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence

    NASA Astrophysics Data System (ADS)

    Lewis, Nicholas; Grünwald, Peter

    2018-03-01

    Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous, subjective Bayesian approaches involved selection of a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods—objective Bayesian and frequentist likelihood-ratio—is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown accurately to fit a wide range of estimated climate sensitivity PDFs. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
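The core combination step, multiplying likelihood functions from independent lines of evidence on a common grid and normalizing, can be sketched numerically. The lognormal-shaped likelihoods below are illustrative stand-ins, not the AR5-based ones, and the sketch omits the paper's derivation of a noninformative prior for the combined inference.

```python
# Minimal sketch: combine two independent evidence sources for climate
# sensitivity S by multiplying their likelihoods on a grid and normalizing.
# The skewed likelihood shapes and their parameters are invented.
import math

S = [0.5 + 0.01 * i for i in range(600)]              # sensitivity grid, 0.5-6.5 K

def lik(s, mode, spread):
    """Lognormal-shaped likelihood, peaked at `mode` with relative width `spread`."""
    return math.exp(-0.5 * ((math.log(s) - math.log(mode)) / spread) ** 2)

instrumental = [lik(s, mode=1.8, spread=0.35) for s in S]
paleo        = [lik(s, mode=2.5, spread=0.50) for s in S]

combined = [a * b for a, b in zip(instrumental, paleo)]
norm = sum(combined) * 0.01                           # grid spacing = 0.01 K
posterior = [c / norm for c in combined]              # integrates to ~1 on the grid
```

Because the two likelihoods are multiplied in a single step, the result does not depend on the order in which the evidence is introduced, which is the order-dependency the paper's single-step approach avoids.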

  14. Integrating Multiple Data Sources for Combinatorial Marker Discovery: A Study in Tumorigenesis.

    PubMed

    Bandyopadhyay, Sanghamitra; Mallik, Saurav

    2018-01-01

Identification of combinatorial markers from multiple data sources is a challenging task in bioinformatics. Here, we propose a novel computational framework for identifying significant combinatorial markers using both gene expression and methylation data. The gene expression and methylation data are integrated into a single continuous dataset, as well as a (post-discretized) boolean dataset, based on their intrinsic (i.e., inverse) relationship. A novel combined score of methylation and expression data is introduced, computed on the integrated continuous data, to identify an initial non-redundant set of genes. Thereafter, (maximal) frequent closed homogeneous genesets are identified using a well-known biclustering algorithm applied to the integrated boolean data of the determined non-redundant set of genes. A novel sample-based weighted support is then proposed, calculated on the same integrated boolean data, in order to identify the non-redundant significant genesets. The top few resulting genesets are identified as potential combinatorial markers. Since our proposed method generates a smaller number of significant non-redundant genesets than other popular methods, it is much faster than they are. Application of the proposed technique to expression and methylation data for uterine tumor or prostate carcinoma produces a set of significant combinations of markers. We expect that such combinations of markers will produce fewer false positives than individual markers.

  15. Enhancing the production of eicosapentaenoic acid (EPA) from Nannochloropsis oceanica CY2 using innovative photobioreactors with optimal light source arrangements.

    PubMed

    Chen, Chun-Yen; Chen, Yu-Chun; Huang, Hsiao-Chen; Ho, Shih-Hsin; Chang, Jo-Shu

    2015-09-01

Binary combinations of LEDs with four different colors were used as light sources to identify the effects of multiple wavelengths on the production of eicosapentaenoic acid (EPA) by an isolated microalga Nannochloropsis oceanica CY2. Combining LED-Blue and LED-Red could give the highest EPA productivity of 13.24 mg L(-1) d(-1), which was further enhanced to 14.4 mg L(-1) d(-1) when using semi-batch operations at a 40% medium replacement ratio. A novel photobioreactor with additional immersed light sources improved light penetration efficiency and led to a 38% (0.170-0.235 g L(-1) d(-1)) increase in the microalgae biomass productivity and a 9% decrease in electricity consumption yield of EPA (10.15-9.33 kW-h (g EPA)(-1)) when compared with the control (i.e., without immersed light sources). Operating the immersed LEDs at a flashing-frequency of 9 Hz further lowered the energy consumption yield to 8.87 kW-h (g EPA)(-1). Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Estimation of gross land-use change and its uncertainty using a Bayesian data assimilation approach

    NASA Astrophysics Data System (ADS)

    Levy, Peter; van Oijen, Marcel; Buys, Gwen; Tomlinson, Sam

    2018-03-01

    We present a method for estimating land-use change using a Bayesian data assimilation approach. The approach provides a general framework for combining multiple disparate data sources with a simple model. This allows us to constrain estimates of gross land-use change with reliable national-scale census data, whilst retaining the detailed information available from several other sources. Eight different data sources, with three different data structures, were combined in our posterior estimate of land use and land-use change, and other data sources could easily be added in future. The tendency for observations to underestimate gross land-use change is accounted for by allowing for a skewed distribution in the likelihood function. The data structure produced has high temporal and spatial resolution, and is appropriate for dynamic process-based modelling. Uncertainty is propagated appropriately into the output, so we have a full posterior distribution of output and parameters. The data are available in the widely used netCDF file format from http://eidc.ceh.ac.uk/.

  17. The XMM-SERVS Survey: first results in the 5 deg^2 XMM-LSS region

    NASA Astrophysics Data System (ADS)

    Chen, Chien-Ting; Brandt, William; Luo, Bin; X-SERVS team

    2018-01-01

We present an X-ray source catalog obtained with XMM-Newton in the XMM-LSS region as part of the X-SERVS survey (XMM-SERVS-LSS), which aims to expand the parameter space of current X-ray surveys with medium-deep X-ray observations in multiple large fields with superb multiwavelength coverage. Within the 5 deg$^2$ XMM-SERVS-LSS field, we combine the 1.3 Ms of XMM observations allocated in XMM AO-15 with archival data and identify 5218 X-ray sources, of which 2400 are new. We reach $1.2\times10^{-15}$ erg s$^{-1}$ cm$^{-2}$ over 50% of the area, which is comparable to the XMM-COSMOS survey but with 2.5 times more sources. We also present multiwavelength identifications, basic photometric properties, and spectroscopic redshifts obtained from the literature. These data, combined with the existing data from COSMOS, will enable a wide range of science on AGN evolution, including studying SMBH growth across the full range of cosmic environments and minimizing cosmic variance.

  18. Mixed Infections and their Control

    DTIC Science & Technology

    1983-04-29

endocarditis, bacteremia, and closed-space infections, such as brain or lung abscesses that cannot be surgically drained. Combination therapy should not be...Systemic infection is a common complication of multiple injury despite the availability of potent and specific antibiotics. Infections following...trauma are due to opportunistic pathogens that originate from endogenous or exogenous sources. These pathogens, often present as mixed infections

  19. The Light-Emitting Diode as a Light Detector

    ERIC Educational Resources Information Center

    Baird, William H.; Hack, W. Nathan; Tran, Kiet; Vira, Zeeshan; Pickett, Matthew

    2011-01-01

    A light-emitting diode (LED) and operational amplifier can be used as an affordable method to provide a digital output indicating detection of an intense light source such as a laser beam or high-output LED. When coupled with a microcontroller, the combination can be used as a multiple photogate and timer for under $50. A similar circuit is used…

  20. The Role of Gaze and Road Edge Information during High-Speed Locomotion

    ERIC Educational Resources Information Center

    Kountouriotis, Georgios K.; Floyd, Rosalind C.; Gardner, Peter H.; Merat, Natasha; Wilkie, Richard M.

    2012-01-01

    Robust control of skilled actions requires the flexible combination of multiple sources of information. Here we examined the role of gaze during high-speed locomotor steering and in particular the role of feedback from the visible road edges. Participants were required to maintain one of three lateral positions on the road when one or both edges…

  1. Control Coordination of Multiple Agents Through Decision Theoretic and Economic Methods

    DTIC Science & Technology

    2003-02-01

    instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information...investigated the design of test data for benchmarking such optimization algorithms. Our other research on combinatorial auctions included I...average combination rule. We exemplified these theoretical results with experiments on stock market data , demonstrating how ensembles of classifiers can

  2. White oak seed source performance across multiple sites in Indiana through age 16

    Treesearch

Philip A. O'Connor; Mark V. Coggeshall

    2011-01-01

    In 1984, a series of combined provenance/progeny tests was established by the Indiana Department of Natural Resources, Division of Forestry. Three plantings were established using a maximum of 70 open-pollinated families representing 17 natural stands (15 Indiana, 1 Illinois, and 1 Missouri). Height data were collected periodically through age 16. Early analyses (ages...

  3. An Agent-Based Interface to Terrestrial Ecological Forecasting

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Nemani, Ramakrishna; Pang, Wan-Lin; Votava, Petr; Etzioni, Oren

    2004-01-01

This paper describes a flexible agent-based ecological forecasting system that combines multiple distributed data sources and models to provide near-real-time answers to questions about the state of the Earth system. We build on novel techniques in automated constraint-based planning and natural language interfaces to automatically generate data products based on descriptions of the desired data products.

  4. A Comparison of Methods to Screen Middle School Students for Reading and Math Difficulties

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Lackner, Stacey K.

    2016-01-01

    The current study explored multiple ways in which middle schools can use and integrate data sources to predict proficiency on future high-stakes state achievement tests. The diagnostic accuracy of (a) prior achievement data, (b) teacher rating scale scores, (c) a composite score combining state test scores and rating scale responses, and (d) two…

  5. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
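The late-data-integration idea above, one model per data source plus a meta-learner that combines their predictions, can be sketched with mocked base-model outputs. The probabilities, labels, and the simple grid-searched mixing weight are all invented stand-ins; the paper's actual meta-learner and ICD-9-CM task are not reproduced.

```python
# Minimal sketch of late data integration (stacking): a text model and a
# structured-data model each emit a probability per patient stay, and a
# meta-learner finds the convex combination that best fits held-out labels.
# Base-model outputs are mocked; real systems would train each separately.

def fit_meta_weight(p_text, p_struct, labels, steps=101):
    """Grid-search the mixing weight w for w*p_text + (1-w)*p_struct."""
    best_w, best_acc = 0.0, -1.0
    for i in range(steps):
        w = i / (steps - 1)
        preds = [1 if w * a + (1 - w) * b >= 0.5 else 0
                 for a, b in zip(p_text, p_struct)]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:                 # keep the first weight reaching the best accuracy
            best_w, best_acc = w, acc
    return best_w, best_acc

# Mock validation-set probabilities from the two source-specific models.
p_text   = [0.9, 0.2, 0.6, 0.4, 0.8]
p_struct = [0.6, 0.1, 0.9, 0.2, 0.2]
labels   = [1,   0,   1,   0,   1]
w, acc = fit_meta_weight(p_text, p_struct, labels)
```

Early integration would instead concatenate the two sources' features into one model; the stacking approach above keeps each source's model separate, which is what the paper finds more consistent.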

  6. Non-optically combined multispectral source for IR, visible, and laser testing

    NASA Astrophysics Data System (ADS)

    Laveigne, Joe; Rich, Brian; McHugh, Steve; Chua, Peter

    2010-04-01

    Electro Optical technology continues to advance, incorporating developments in infrared and laser technology into smaller, more tightly-integrated systems that can see and discriminate military targets at ever-increasing distances. New systems incorporate laser illumination and ranging with gated sensors that allow unparalleled vision at a distance. These new capabilities augment existing all-weather performance in the mid-wave infrared (MWIR) and long-wave infrared (LWIR), as well as low light level visible and near infrared (VNIR), giving the user multiple means of looking at targets of interest. There is a need in the test industry to generate imagery in the relevant spectral bands, and to provide temporal stimulus for testing range-gated systems. Santa Barbara Infrared (SBIR) has developed a new means of combining a uniform infrared source with uniform laser and visible sources for electro-optics (EO) testing. The source has been designed to allow laboratory testing of surveillance systems incorporating an infrared imager and a range-gated camera; and for field testing of emerging multi-spectral/fused sensor systems. A description of the source will be presented along with performance data relating to EO testing, including output in pertinent spectral bands, stability and resolution.

  7. Public Health Surveillance of Fatal Child Maltreatment: Analysis of 3 State Programs

    PubMed Central

    Schnitzer, Patricia G.; Covington, Theresa M.; Wirtz, Stephen J.; Verhoek-Oftedahl, Wendy; Palusci, Vincent J.

    2008-01-01

    Objectives. We sought to describe approaches to surveillance of fatal child maltreatment and to identify options for improving case ascertainment. Methods. Three states—California, Michigan, and Rhode Island—used multiple data sources for surveillance. Potential cases were identified, operational definitions were applied, and the number of maltreatment deaths was determined. Results. These programs identified 258 maltreatment deaths in California, 192 in Michigan, and 60 in Rhode Island. Corresponding maltreatment fatality rates ranged from 2.5 per 100000 population in Michigan to 8.8 in Rhode Island. Most deaths were identified by child death review teams in Rhode Island (98%), Uniform Crime Reports in California (56%), and child welfare agency data in Michigan (44%). Compared with the total number of cases identified, child welfare agency (the official source for maltreatment reports) and death certificate data underascertain child maltreatment deaths by 55% to 76% and 80% to 90%, respectively. In all 3 states, more than 90% of cases ascertained could be identified by combining 2 data sources. Conclusions. No single data source was adequate for thorough surveillance of fatal child maltreatment, but combining just 2 sources substantially increased case ascertainment. The child death review team process may be the most promising surveillance approach. PMID:17538060

  8. Public health surveillance of fatal child maltreatment: analysis of 3 state programs.

    PubMed

    Schnitzer, Patricia G; Covington, Theresa M; Wirtz, Stephen J; Verhoek-Oftedahl, Wendy; Palusci, Vincent J

    2008-02-01

    We sought to describe approaches to surveillance of fatal child maltreatment and to identify options for improving case ascertainment. Three states--California, Michigan, and Rhode Island--used multiple data sources for surveillance. Potential cases were identified, operational definitions were applied, and the number of maltreatment deaths was determined. These programs identified 258 maltreatment deaths in California, 192 in Michigan, and 60 in Rhode Island. Corresponding maltreatment fatality rates ranged from 2.5 per 100,000 population in Michigan to 8.8 in Rhode Island. Most deaths were identified by child death review teams in Rhode Island (98%), Uniform Crime Reports in California (56%), and child welfare agency data in Michigan (44%). Compared with the total number of cases identified, child welfare agency (the official source for maltreatment reports) and death certificate data underascertain child maltreatment deaths by 55% to 76% and 80% to 90%, respectively. In all 3 states, more than 90% of cases ascertained could be identified by combining 2 data sources. No single data source was adequate for thorough surveillance of fatal child maltreatment, but combining just 2 sources substantially increased case ascertainment. The child death review team process may be the most promising surveillance approach.
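The central observation in the two records above, that combining just two incomplete surveillance sources substantially increases case ascertainment, can be sketched with invented case identifiers:

```python
# Minimal sketch: case ascertainment from two incomplete surveillance
# sources. Case IDs are invented; the point is that the union of two
# partial sources covers far more of the known deaths than either alone.

child_welfare = {"c01", "c02", "c03", "c04"}              # official report source
death_certs   = {"c03", "c04", "c05", "c06", "c07"}       # death certificate data
all_known     = {"c0%d" % i for i in range(1, 9)}         # cases found by any source

combined = child_welfare | death_certs                    # union of the two sources
coverage_cw  = len(child_welfare & all_known) / len(all_known)
coverage_two = len(combined & all_known) / len(all_known)
```

Here the single source ascertains half the known cases while the two-source union ascertains nearly all of them, mirroring the underascertainment percentages reported in the study.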

  9. Time-resolved multicolor two-photon excitation fluorescence microscopy of cells and tissues

    NASA Astrophysics Data System (ADS)

    Zheng, Wei

    2014-11-01

Multilabeling, which maps the distribution of different targets, is an indispensable technique in many biochemical and biophysical studies. Two-photon excitation fluorescence (TPEF) microscopy of endogenous fluorophores, combined with conventional fluorescence labeling techniques such as genetically encoded fluorescent proteins (FPs) and fluorescent dye staining, could be a powerful tool for imaging living cells. However, the challenge is that the excitation and emission wavelengths of these endogenous fluorophores and fluorescent labels are very different. A multicolor ultrafast source is required for the excitation of multiple fluorescent molecules. In this study, we developed a two-photon imaging system with excitations from the pump femtosecond laser and the selected supercontinuum generated from a photonic crystal fiber (PCF). Multiple endogenous fluorophores, fluorescent proteins and fluorescent dyes were excited at their optimal wavelengths simultaneously. A time- and spectral-resolved detection system was used to record the TPEF signals. This detection technique separated the TPEF signals from multiple sources in the time and wavelength domains. Cellular organelles such as the nucleus, mitochondria, microtubules and endoplasmic reticulum were clearly revealed in the TPEF images. The simultaneous imaging of multiple fluorophores in cells will greatly aid the study of sub-cellular compartments and protein localization.

  10. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    PubMed

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (proteins, genes, RNAs...) is scattered across several data sources such as SGD, Yeastract, CYGD-MIPS, BioGrid, PhosphoGrid, etc. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also reduces and limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML- and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the required query transformation and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specifically designed for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously in multiple data sources, and it has an easy-to-use, biologist-friendly interface. The system is available at http://www.khaos.uma.es/yeastmed/.

  11. Data Transfer for Multiple Sensor Networks Over a Broad Temperature Range

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael

    2013-01-01

At extreme temperatures, cryogenic and over 300 C, few electronic components are available to support intelligent data transfer over a common, linear combining medium. This innovation allows many sensors to operate on the same wire bus (or on the same airwaves or optical channel: any linearly combining medium), transmitting simultaneously but individually recoverable at a node in a cooler part of the test area. This innovation has been demonstrated using room-temperature silicon microcircuits as a proxy. The microcircuits have analog functionality comparable to componentry designed using silicon carbide. Given a common, linearly combining medium, multiple sending units may transmit information simultaneously. A listening node, using various techniques, can pick out the signal from a single sender if it has unique qualities, e.g. a voice. The problem being solved is commonly referred to as the cocktail party problem. The human brain uses the cocktail party effect when it is able to recognize and follow a single conversation in a party full of talkers and other noise sources. High-temperature sensors have been used in silicon carbide electronic oscillator circuits. The frequency of the oscillator changes as a function of the changes in the sensed parameter, such as pressure. This change is analogous to changes in the pitch of a person's voice. The output of this oscillator and many others may be superimposed onto a single medium. This medium may be the power lines supplying current to the sensors, a third wire dedicated to data transmission, the airwaves through radio transmission, an optical medium, etc. However, with nothing to distinguish the identities of each source, that is, without source separation, this system is useless. Using digital electronic functions, unique codes or patterns are created and used to modulate the output of the sensor.
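The unique-code scheme in the last sentences above resembles code-division multiplexing, which can be illustrated with a toy amplitude version: each sensor's reading modulates a unique +/-1 spreading code, the coded signals add linearly on the shared medium, and the listening node recovers each sensor by correlating against its code. The codes, sensor names, and readings are invented, and the real system modulates oscillator frequency rather than amplitude.

```python
# Minimal sketch of code-division sharing of one linearly combining medium.
# Each sensor multiplies its reading by its own orthogonal +/-1 code; the
# bus carries the sum; correlating with a code recovers that sensor alone.

codes = {                      # orthogonal +/-1 codes (rows of a Walsh matrix)
    "pressure": [+1, +1, +1, +1],
    "temp":     [+1, -1, +1, -1],
}
readings = {"pressure": 3.0, "temp": -2.0}

# Superposition on the shared wire: per time slot, sum of reading * code chip.
bus = [sum(readings[s] * codes[s][t] for s in codes) for t in range(4)]

def despread(bus, code):
    """Correlate the bus signal with one sensor's code to recover its reading."""
    return sum(b * c for b, c in zip(bus, code)) / len(code)
```

Because the two codes are orthogonal, each correlation cancels the other sensor's contribution exactly, which is why simultaneous transmission on one medium remains individually recoverable.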

  12. High brightness fiber laser pump sources based on single emitters and multiple single emitters

    NASA Astrophysics Data System (ADS)

    Scheller, Torsten; Wagner, Lars; Wolf, Jürgen; Bonati, Guido; Dörfel, Falk; Gabler, Thomas

    2008-02-01

    Driven by the potential of the fiber laser market, the development of high brightness pump sources has been pushed during recent years. The main approaches to reach the targets of this market have been the direct coupling of single emitters (SE) on the one hand and the beam shaping of bars and stacks on the other, the latter often incurring a higher cost per watt. Meanwhile, the power of single emitters with 100 μm emitter size for direct coupling has increased dramatically, which has also pushed a new generation of wide-stripe emitters, or multi emitters (ME), of up to 1000 μm emitter size, as well as "minibars" with apertures of 3 to 5 mm. The advantage of this emitter type compared to traditional bars is its scalability to power levels of 40 W to 60 W combined with a small aperture, which is advantageous when coupling into a fiber. We show concepts using these multiple single emitters for fiber-coupled systems of 25 W up to 40 W out of a 100 μm fiber with NA 0.22 at a reasonable optical efficiency. Taking into account further efficiency optimization and an increase in the power of these devices in the near future, the EUR/W ratio pushed by the fiber laser manufacturers will further decrease. Results are also shown for higher-power pump sources. Additionally, state-of-the-art tapered fiber bundles for photonic crystal fibers are used to combine 7 (19) pump sources to output powers of 100 W (370 W) out of a 130 μm (250 μm) fiber with NA 0.6 at nominally 20 W per port. Improving those TFBs in the near future and utilizing 40 W per pump leg, an output power of even 750 W out of a 250 μm fiber with NA 0.6 will be possible. Combined counter- and co-propagated pumping of the fiber will then lead to the first 1 kW fiber laser oscillator.

  13. Enhanced provenance interpretation using combined U-Pb and (U-Th)/He double dating of detrital zircon grains from lower Miocene strata, proximal Gulf of Mexico Basin, North America

    NASA Astrophysics Data System (ADS)

    Xu, Jie; Stockli, Daniel F.; Snedden, John W.

    2017-10-01

    Detrital zircon U-Pb analysis is an effective approach for investigating sediment provenance by relating crystallization age to potential crystalline source terranes. Studies of large passive margin basins, such as the Gulf of Mexico Basin, that have received sediment from multiple terranes with non-unique crystallization ages or from sedimentary strata benefit from additional constraints to sharpen provenance interpretation. In this study, U-Pb and (U-Th)/He double dating analyses on single zircons from lower Miocene sandstones in the northern Gulf of Mexico Basin reveal a detailed history of sediment source evolution. U-Pb age data indicate that most zircons originated from five major crystalline provinces: the Western Cordillera Arc (<250 Ma), the Appalachian-Ouachita orogen (500-260 Ma), the Grenville orogen (1300-950 Ma), the Mid-Continent Granite-Rhyolite (1500-1300 Ma), and the Yavapai-Mazatzal (1800-1600 Ma) terranes, as well as sparse Pan-African (700-500 Ma) and Canadian Shield (>1800 Ma) terranes. Zircon (U-Th)/He ages record tectonic cooling and exhumation in the U.S. since the Mesoproterozoic, related to the Grenville through Laramide orogenies. The combined crystallization and cooling information from single-zircon double dating can differentiate volcanic from plutonic zircons. Importantly, the U-Pb-He double dating approach allows differentiation between multiple possible crystallization-age sources on the basis of their subsequent tectonic evolution. In particular, for the Grenville zircons present in all lower Miocene samples, four distinct zircon U-Pb-He age combinations are recognizable that can be traced back to four different possible sources. The integrated U-Pb and (U-Th)/He data eliminate some ambiguities, improve the provenance interpretation for the lower Miocene strata in the northern Gulf of Mexico Basin, and illustrate the applicability of this approach to other large-scale basins for reconstructing sediment provenance and dispersal patterns.

  14. Sequential combination of multi-source satellite observations for separation of surface deformation associated with serial seismic events

    NASA Astrophysics Data System (ADS)

    Chen, Qiang; Xu, Qian; Zhang, Yijun; Yang, Yinghui; Yong, Qi; Liu, Guoxiang; Liu, Xianwen

    2018-03-01

    A single satellite geodetic technique is poorly suited to mapping the sequence of ground deformation associated with serial seismic events; InSAR, for example, with its long revisit period, readily yields mixed, complex deformation signals from multiple events. This challenges the capability of any single satellite geodetic technique to accurately recognize the surface deformation and earthquake model of each individual event. The rapidly increasing availability of various satellite observations provides a good solution to this issue. In this study, we explore a sequential combination of multiple overlapping datasets from ALOS/PALSAR, ENVISAT/ASAR and GPS observations to separate the surface deformation associated with the 2011 Mw 9.0 Tohoku-Oki mainshock and two strong aftershocks, the Mw 6.6 Iwaki and Mw 5.8 Ibaraki events. We first estimate the fault slip model of the mainshock with ASAR interferometry and GPS displacements as constraints. Because the PALSAR interferogram spans the period of all the events, we then remove the surface deformation of the mainshock using the forward-calculated prediction, thus obtaining the PALSAR InSAR deformation associated with the two strong aftershocks. The inversion for the source parameters of the Iwaki aftershock is conducted using the refined PALSAR deformation, considering that the higher-magnitude Iwaki earthquake contributes more deformation than the Ibaraki event. After removing the deformation component of the Iwaki event, we determine the fault slip distribution of the Ibaraki shock using the remaining PALSAR InSAR deformation. Finally, the complete source models for the serial seismic events are clearly identified from the sequential combination of multi-source satellite observations, which suggest that the mainshock is a predominantly mega-thrust rupture, whereas the two aftershocks are normal-faulting events. The estimated moment magnitudes for the Tohoku-Oki, Iwaki and Ibaraki events are Mw 9.0, Mw 6.85 and Mw 6.11, respectively.
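
    The sequential separation amounts to subtracting each larger event's forward-modeled deformation from an interferogram spanning all events, largest event first. A toy sketch with synthetic numbers (the grids, values, and names are hypothetical, purely to show the bookkeeping):

```python
# Hypothetical sketch: peel forward-modeled deformation off an interferogram
# that spans several events, working from the largest event downward.
def peel(total_los, forward_model):
    # Subtract one event's predicted line-of-sight deformation (same grid).
    return [t - f for t, f in zip(total_los, forward_model)]

mainshock = [10.0, 8.0, 5.0]   # synthetic LOS deformation, arbitrary units
iwaki     = [1.0, 2.0, 0.5]
ibaraki   = [0.2, 0.1, 0.4]
interferogram = [m + a + b for m, a, b in zip(mainshock, iwaki, ibaraki)]

after_main  = peel(interferogram, mainshock)  # Iwaki + Ibaraki remain
after_iwaki = peel(after_main, iwaki)         # only Ibaraki remains
```

    Each residual then constrains the inversion for the next, smaller event's source parameters.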

  15. Community Response to Multiple Sound Sources: Integrating Acoustic and Contextual Approaches in the Analysis

    PubMed Central

    Lercher, Peter; De Coensel, Bert; Dekonink, Luc; Botteldooren, Dick

    2017-01-01

    Ample data document the prevalence of exposure to combined sound from mixed traffic sources in many nations. Furthermore, consideration of the potential effects of combined sound exposure is required in legal procedures such as environmental health impact assessments. Nevertheless, current practice still uses single-source exposure-response functions; it is silently assumed that these standard exposure-response curves also accommodate mixed exposures, although some evidence from experimental and field studies casts doubt on this practice. The ALPNAP study population (N = 1641) contains sufficiently large subgroups with combinations of rail-highway, highway-main road and rail-highway-main road sound exposure. In this paper we apply several approaches suggested in the literature to investigate exposure-response curves and their major determinants in the case of exposure to multiple traffic sources. High/moderate annoyance and full-scale mean annoyance served as outcomes. The results show several limitations of the current approaches. Even given the inherent methodological limitations (energy-equivalent summation of sound, rating of overall annoyance), considering the main contextual factors that jointly occur with the sources (such as vibration and air pollution), or coping activities and judgments of the wider-area soundscape, increases the explained variance from up to 8% (bivariate) to up to 15% (base adjustments) and up to 55% (full contextual model). The added predictors vary significantly depending on the source combination (e.g., significant vibration effects with main road/railway, but not with highway exposure). Although no significant interactions were found, the observed additive effects are of public health importance. Especially in the three-source exposure situation, overall annoyance is already high at lower levels, and the contribution of the acoustic indicators is small compared with the non-acoustic and contextual predictors. 
    Noise mapping needs to go down to levels of 40 dBA Lden to ensure the protection of quiet areas and to prevent the silent “filling up” of these areas with new sound sources. Eventually, to better predict annoyance in the exposure range between 40 and 60 dBA and to support the protection of quiet areas in urban and rural planning, sound indicators need to be oriented toward the noticeability of sound and to consider other traffic-related by-products (air quality, vibration, coping strain) in future studies and environmental impact assessments. PMID:28632198

  16. Impact of the Medicare interim payment system on length of use in home care among patients with Medicare-only payment source.

    PubMed

    Han, Beth; Remsburg, Robin E

    2005-01-01

    Using data from the 1996 and 2000 National Home and Hospice Care Surveys (N = 2,455), we examined length of use in home care among patients with Medicare-only payment source before and during the Medicare interim payment system (IPS). Logistic regression analyses revealed that patients were 2.9 times more likely to be discharged within 60 days during the IPS than before the IPS. The impact of the Medicare IPS on length of use in home care among Medicare-only patients was stronger than the existing literature indicates, because that literature combines Medicare patients with multiple payment sources and Medicare-only patients.
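
    As a reminder of how such a figure arises, the reported "2.9 times more likely" is an odds ratio, i.e., the exponential of the fitted logistic regression coefficient for the IPS-period indicator (the coefficient below is hypothetical, back-computed from the reported ratio rather than taken from the paper):

```python
import math

# Illustrative only: an odds ratio reported from a logistic regression is
# exp(beta) for the corresponding indicator variable.
beta_ips = math.log(2.9)          # hypothetical coefficient implied by the OR
odds_ratio = math.exp(beta_ips)   # recovers the reported 2.9
```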

  17. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g., economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. 
Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  18. Development of Physics and Control of Multiple Forcing Mechanisms for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Whitmore, P.; Macpherson, K. A.; Knight, W. R.

    2016-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast the propagation and inundation of tsunamis generated by earthquakes or other mechanisms in the Pacific Ocean, the Atlantic Ocean, or the Gulf of Mexico. At the U.S. National Tsunami Warning Center (NTWC), the model has been used mainly for tsunami pre-computation due to earthquakes: results for hundreds of hypothetical events are computed before alerts, then accessed and calibrated with observations during tsunamis to immediately produce forecasts. The model has also been used for tsunami hindcasting due to submarine landslides and atmospheric pressure jumps, but in a very case-specific and somewhat limited manner. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids and two-way communication between the domains of each parent-child pair as waves approach coastal waters. The shallow-water wave physics is readily applicable to all of the above tsunamis as well as to tides. Recently, the model has been expanded to include multiple forcing mechanisms in a systematic fashion and to enhance the model physics for non-earthquake events. ATFM is now able to handle multiple source mechanisms, either individually or jointly, including earthquake, submarine landslide, meteo-tsunami and tidal forcing. For earthquakes, the source can be a single unit source or multiple, interacting source blocks, and a horizontal slip contribution can be added to the sea-floor displacement. The model now includes submarine landslide physics, modeling the source either as a rigid slump or as a viscous fluid; additional shallow-water physics have been implemented for viscous submarine landslides, and with rigid slumping any trajectory can be followed. For meteo-tsunamis, the forcing mechanism is likewise capable of following any trajectory shape, and wind stress physics has been implemented where required. 
As an example of multiple sources, a near-field model of the tsunami produced by a combination of earthquake and submarine landslide forcing which happened in Papua New Guinea on July 17, 1998 is provided.

  19. Photoneutron cross sections for 59Co : Systematic uncertainties of data from various experiments

    NASA Astrophysics Data System (ADS)

    Varlamov, V. V.; Davydov, A. I.; Ishkhanov, B. S.

    2017-09-01

    Data on partial photoneutron reaction cross sections (γ,1n), (γ,2n), and (γ,3n) for 59Co obtained in two experiments carried out at Livermore (USA) were analyzed. The radiation sources in both experiments were monoenergetic photon beams from the annihilation in flight of relativistic positrons. The total yield was sorted by neutron multiplicity, taking into account the difference in the neutron energy spectra for different multiplicities. The two quoted studies differ in the method used to determine the neutron multiplicity. Significant systematic disagreements between the results of the two experiments exist. They are considered to be caused by large systematic uncertainties in the partial cross sections, since these do not satisfy physical criteria for data reliability. To obtain reliable cross sections of partial and total photoneutron reactions, a new method combining experimental data and theoretical evaluation was used. It is based on the experimental neutron yield cross section, which is rather independent of neutron multiplicity, and on the transitional neutron multiplicity functions of the combined photonucleon reaction model (CPNRM). The model transitional multiplicity functions were used to decompose the neutron yield cross section into the contributions of the partial reactions. The results of the new evaluation differ noticeably from the partial cross sections obtained in the two experimental studies; these differences are discussed.

  20. HerMES: ALMA Imaging of Herschel-selected Dusty Star-forming Galaxies

    NASA Astrophysics Data System (ADS)

    Bussmann, R. S.; Riechers, D.; Fialkov, A.; Scudder, J.; Hayward, C. C.; Cowley, W. I.; Bock, J.; Calanog, J.; Chapman, S. C.; Cooray, A.; De Bernardis, F.; Farrah, D.; Fu, Hai; Gavazzi, R.; Hopwood, R.; Ivison, R. J.; Jarvis, M.; Lacey, C.; Loeb, A.; Oliver, S. J.; Pérez-Fournon, I.; Rigopoulou, D.; Roseboom, I. G.; Scott, Douglas; Smith, A. J.; Vieira, J. D.; Wang, L.; Wardlow, J.

    2015-10-01

    The Herschel Multi-tiered Extragalactic Survey (HerMES) has identified large numbers of dusty star-forming galaxies (DSFGs) over a wide range in redshift. A detailed understanding of these DSFGs is hampered by the limited spatial resolution of Herschel. We present 870 μm 0.″45 resolution imaging obtained with the Atacama Large Millimeter/submillimeter Array (ALMA) of a sample of 29 HerMES DSFGs that have far-infrared (FIR) flux densities that lie between the brightest of sources found by Herschel and fainter DSFGs found via ground-based surveys in the submillimeter region. The ALMA imaging reveals that these DSFGs comprise a total of 62 sources (down to the 5σ point-source sensitivity limit in our ALMA sample; σ ≈ 0.2 mJy). Optical or near-infrared imaging indicates that 36 of the ALMA sources experience a significant flux boost from gravitational lensing (μ > 1.1), but only six are strongly lensed and show multiple images. We introduce and make use of uvmcmcfit, a general-purpose and publicly available Markov chain Monte Carlo visibility-plane analysis tool, to analyze the source properties. Combined with our previous work on brighter Herschel sources, the lens models presented here tentatively favor intrinsic number counts for DSFGs with a break near 8 mJy at 880 μm and a steep fall-off at higher flux densities. Nearly 70% of the Herschel sources break down into multiple ALMA counterparts, consistent with previous research indicating that the multiplicity rate is high in bright sources discovered in single-dish submillimeter or FIR surveys. The ALMA counterparts to our Herschel targets are located significantly closer to each other than ALMA counterparts to sources found in the LABOCA ECDFS Submillimeter Survey. Theoretical models underpredict the excess number of sources with small separations seen in our ALMA sample. 
    The high multiplicity rate and small projected separations between sources seen in our sample argue in favor of interactions and mergers plausibly driving both the prodigious emission from the brightest DSFGs and the sharp downturn above S880 = 8 mJy. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.

  1. Evidence Combination From an Evolutionary Game Theory Perspective

    PubMed Central

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2017-01-01

    Dempster-Shafer evidence theory is a primary methodology for multi-source information fusion because it is good at dealing with uncertain information. This theory provides Dempster’s rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results, based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multi-evidence system. Within the proposed ECR, we develop a Jaccard matrix game (JMG) to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution’s stability and convergence, have been mathematically proved as well. PMID:26285231
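
    The counter-intuitive behavior that motivates the paper can be reproduced in a few lines. This is a standard textbook sketch of Dempster's rule (not the authors' ECR), applied to Zadeh's classic paradox, where near-total conflict forces all mass onto a proposition that both sources barely support:

```python
from itertools import product

def dempster(m1, m2):
    # Dempster's rule of combination: conjunctive pooling of two mass
    # functions (dicts mapping frozenset focal elements to masses),
    # renormalized by the total conflict k.
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {s: v / (1.0 - conflict) for s, v in combined.items()}, conflict

A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
m1 = {A: 0.99, B: 0.01}   # source 1: almost certainly A
m2 = {C: 0.99, B: 0.01}   # source 2: almost certainly C
fused, k = dempster(m1, m2)
# k ≈ 0.9999, and fused assigns all mass to B: the counter-intuitive result
```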

  2. The effect of spatial distribution on the annoyance caused by simultaneous sounds

    NASA Astrophysics Data System (ADS)

    Vos, Joos; Bronkhorst, Adelbert W.; Fedtke, Thomas

    2004-05-01

    A considerable part of the population is exposed to simultaneous and/or successive environmental sounds from different sources. In many cases, these sources also differ in their locations. In a laboratory study, it was investigated whether the annoyance caused by the multiple sounds is affected by the spatial distribution of the sources. There were four independent variables: (1) sound category (stationary or moving), (2) sound type (stationary: lawn-mower, leaf-blower, and chain saw; moving: road traffic, railway, and motorbike), (3) spatial location (left, right, and combinations), and (4) A-weighted sound exposure level (ASEL of single sources equal to 50, 60, or 70 dB). In addition to the individual sounds in isolation, various combinations of two or three different sources within each sound category and sound level were presented for rating. The annoyance was mainly determined by sound level and sound source type. In most cases there were neither significant main effects of spatial distribution nor significant interaction effects between spatial distribution and the other variables. It was concluded that for rating the spatially distributed sounds investigated, the noise dose can simply be determined by a summation of the levels for the left and right channels. [Work supported by CEU.]
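
    The level summation referred to in the conclusion is energy-equivalent: it operates on linear sound energy, not on the dB values themselves. A minimal sketch of the standard acoustics arithmetic (illustrative, not code from the study):

```python
import math

def energy_sum(levels_db):
    # Energy-equivalent summation of simultaneous source levels:
    # convert each level to linear energy, add, convert back to dB.
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels_db))

print(round(energy_sum([70.0, 70.0]), 1))  # two equal 70 dB sources: 73.0
```

    Doubling the number of equal sources thus adds about 3 dB, which is why the noise dose can be computed by summing the left- and right-channel levels this way.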

  3. MICA: Multiple interval-based curve alignment

    NASA Astrophysics Data System (ADS)

    Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf

    2018-01-01

    MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA has already been successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and a graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analysis pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA
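
    The core landmark idea can be illustrated with a piecewise-linear warp that maps one curve's characteristic points onto another's (a simplified sketch of the registration step, not MICA's actual code; all names are hypothetical):

```python
# Simplified sketch of landmark-based registration: piecewise-linear warping
# between corresponding landmark positions of two profiles.
def warp(x, src_marks, dst_marks):
    # Map position x from the source profile's axis to the destination's by
    # linear interpolation between the enclosing pair of landmarks.
    for (s0, s1), (d0, d1) in zip(zip(src_marks, src_marks[1:]),
                                  zip(dst_marks, dst_marks[1:])):
        if s0 <= x <= s1:
            t = (x - s0) / (s1 - s0)
            return d0 + t * (d1 - d0)
    raise ValueError("x outside landmark range")

# Landmarks (e.g., density extrema) at 0, 4, 10 on one curve vs 0, 6, 10
print(warp(2.0, [0.0, 4.0, 10.0], [0.0, 6.0, 10.0]))  # prints 3.0
```

    Applying such warps pairwise, then progressively, is what allows a consensus profile to be computed from many curves.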

  4. Contribution of Changing Sources and Sinks to the Growth Rate of Atmospheric Methane Concentrations for the Last Two Decades

    NASA Technical Reports Server (NTRS)

    Matthews, Elaine; Walter, B.; Bogner, J.; Sarma, D.; Portmey, G.; Travis, Larry (Technical Monitor)

    2001-01-01

    In situ measurements of atmospheric methane concentrations begun in the early 1980s show decadal trends, as well as large interannual variations, in growth rate. Recent research indicates that while wetlands can explain several of the large growth anomalies for individual years, the decadal trend may be the combined effect of increasing sinks, due to increases in tropospheric OH, and stabilizing sources. We discuss new 20-year histories of annual, global source strengths for all major methane sources, i.e., natural wetlands, rice cultivation, ruminant animals, landfills, fossil fuels, and biomass burning. We also present estimates of the temporal pattern of the sink required to reconcile these sources and atmospheric concentrations over this time period. Analysis of the individual emission sources, together with model-derived estimates of the OH sink strength, indicates that the growth rate of atmospheric methane observed over the last 20 years can only be explained by a combination of changes in source emissions and an increasing tropospheric sink. Direct validation of the global sources and the terrestrial sink is not straightforward, in part because some sources/sinks are relatively small and diffuse (e.g., landfills and soil consumption), as well as because the atmospheric record integrates multiple and substantial sources and tropospheric sinks in regions such as the tropics. We discuss ways to develop and test criteria for rejecting and/or accepting a suite of scenarios for the methane budget.

  5. VizieR Online Data Catalog: Mid-infrared study of RR Lyrae stars (Gavrilchenko+, 2014)

    NASA Astrophysics Data System (ADS)

    Gavrilchenko, T.; Klein, C. R.; Bloom, J. S.; Richards, J. W.

    2015-02-01

    The first goal was to find a large sample of WISE-observed RR Lyrae stars. A database of previously identified RR Lyrae stars was created, combining information from the General Catalogue of Variable Stars (GCVS), the All Sky Automated Survey (ASAS), SIMBAD, VizieR, and individual papers. For many of the sources in this database the only available data were the coordinates and the RR Lyrae classification. When provided, information about the period, distance, subclass, and magnitude in several different wavebands was also stored. If a single source appeared in multiple surveys or papers, information from all relevant surveys was included, with markers indicating contradictory measurements between surveys. The resulting database contains about 17000 sources, of which about 5000 have documented V-band periods. (3 data files).

  6. You are not always what we think you eat. Selective assimilation across multiple whole-stream isotopic tracer studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dodds, W. K.; Collins, S. M.; Hamilton, S. K.

    Analyses of 21 15N stable isotope tracer experiments, designed to examine food web dynamics in streams around the world, indicated that the isotopic composition of food resources assimilated by primary consumers (mostly invertebrates) poorly reflected the presumed food sources. Modeling indicated that consumers assimilated only 33–50% of the N available in sampled food sources such as decomposing leaves, epilithon, and fine particulate detritus over feeding periods of weeks or more. Thus, common methods of sampling food sources consumed by animals in streams do not sufficiently reflect the pool of N they assimilate. Lastly, isotope tracer studies, combined with modeling and food separation techniques, can improve estimation of N pools in food sources that are assimilated by consumers.

  7. A method for classification of multisource data using interval-valued probabilities and its application to HIRIS data

    NASA Technical Reports Server (NTRS)

    Kim, H.; Swain, P. H.

    1991-01-01

    A method of classifying multisource data in remote sensing is presented. The proposed method considers each data source as an information source providing a body of evidence, represents statistical evidence by interval-valued probabilities, and uses Dempster's rule to integrate information based on multiple data sources. The method is applied to the problem of ground-cover classification of multispectral data combined with digital terrain data such as elevation, slope, and aspect. The method is then applied to simulated 201-band High Resolution Imaging Spectrometer (HIRIS) data by dividing the dimensionally huge data source into smaller, more manageable pieces based on global statistical correlation information. It produces higher classification accuracy than the maximum likelihood (ML) classification method when the Hughes phenomenon is apparent.

  8. You are not always what we think you eat. Selective assimilation across multiple whole-stream isotopic tracer studies

    DOE PAGES

    Dodds, W. K.; Collins, S. M.; Hamilton, S. K.; ...

    2014-10-01

    Analyses of 21 15N stable isotope tracer experiments, designed to examine food web dynamics in streams around the world, indicated that the isotopic composition of food resources assimilated by primary consumers (mostly invertebrates) poorly reflected the presumed food sources. Modeling indicated that consumers assimilated only 33–50% of the N available in sampled food sources such as decomposing leaves, epilithon, and fine particulate detritus over feeding periods of weeks or more. Thus, common methods of sampling food sources consumed by animals in streams do not sufficiently reflect the pool of N they assimilate. Lastly, isotope tracer studies, combined with modeling and food separation techniques, can improve estimation of N pools in food sources that are assimilated by consumers.

  9. Adapting Word Embeddings from Multiple Domains to Symptom Recognition from Psychiatric Notes

    PubMed Central

    Zhang, Yaoyun; Li, Hee-Jin; Wang, Jingqi; Cohen, Trevor; Roberts, Kirk; Xu, Hua

    2018-01-01

    Mental health is increasingly recognized as an important topic in healthcare. Information concerning psychiatric symptoms is critical for the timely diagnosis of mental disorders, as well as for the personalization of interventions. However, the diversity and sparsity of psychiatric symptoms make it challenging for conventional natural language processing techniques to automatically extract such information from clinical text. To address this problem, this study uses and adapts word embeddings from four source domains – intensive care, biomedical literature, Wikipedia and Psychiatric Forum – to recognize symptoms in the target domain of psychiatry. We investigated four different approaches: 1) only using word embeddings of the source domain, 2) directly combining data of the source and target to generate word embeddings, 3) assigning different weights to word embeddings, and 4) retraining the word embedding model of the source domain using a corpus of the target domain. To the best of our knowledge, this is the first work to adapt multiple word embeddings from external domains to improve psychiatric symptom recognition in clinical text. Experimental results showed that the last two approaches outperformed the baseline methods, indicating the effectiveness of our new strategies to leverage embeddings from other domains. PMID:29888086
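
    The weighting idea in approach 3 can be sketched as a per-dimension weighted average of one word's vectors from several source domains (an illustrative toy, not the paper's implementation; the vectors and weights are made up):

```python
# Toy sketch: weighted combination of one word's embedding vectors taken
# from several source domains (higher weight = more trusted domain).
def combine(vectors, weights):
    total = sum(weights)
    dim = len(vectors[0])
    return [sum(w * v[i] for w, v in zip(weights, vectors)) / total
            for i in range(dim)]

emb_icu  = [0.2, 0.8]   # hypothetical 2-d embedding of one symptom term
emb_wiki = [0.6, 0.4]
blended = combine([emb_icu, emb_wiki], [3.0, 1.0])  # ICU weighted higher
```

    In practice the weights would be tuned on target-domain data, and the vectors would have hundreds of dimensions rather than two.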

  10. Household trends in access to improved water sources and sanitation facilities in Vietnam and associated factors: findings from the Multiple Indicator Cluster Surveys, 2000-2011.

    PubMed

    Tuyet-Hanh, Tran Thi; Lee, Jong-Koo; Oh, Juhwan; Van Minh, Hoang; Ou Lee, Chul; Hoan, Le Thi; Nam, You-Seon; Long, Tran Khanh

    2016-01-01

    Despite progress made by the Millennium Development Goal (MDG) number 7.C, Vietnam still faces challenges with regard to the provision of access to safe drinking water and basic sanitation. This paper describes household trends in access to improved water sources and sanitation facilities separately, and analyses factors associated with access to improved water sources and sanitation facilities in combination. Secondary data from the Vietnam Multiple Indicator Cluster Survey in 2000, 2006, and 2011 were analyzed. Descriptive statistics and tests of significance describe trends over time in access to water and sanitation by location, demographic and socio-economic factors. Binary logistic regressions (2000, 2006, and 2011) describe associations between access to water and sanitation, and geographic, demographic, and socio-economic factors. There have been some outstanding developments in access to improved water sources and sanitation facilities from 2000 to 2011. In 2011, the proportion of households with access to improved water sources and sanitation facilities reached 90% and 77%, respectively, meeting the 2015 MDG targets for safe drinking water and basic sanitation set at 88% and 75%, respectively. However, despite these achievements, in 2011, only 74% of households overall had access to combined improved drinking water and sanitation facilities. There were also stark differences between regions. In 2011, only 47% of households had access to both improved water and sanitation facilities in the Mekong River Delta compared with 94% in the Red River Delta. In 2011, households in urban compared to rural areas were more than twice as likely (odds ratio [OR]: 2.2; 95% confidence interval [CI]: 1.9-2.5) to have access to improved water and sanitation facilities in combination, and households in the highest compared with the lowest wealth quintile were over 40 times more likely (OR: 42.3; 95% CI: 29.8-60.0). 
More efforts are required to increase household access to both improved water and sanitation facilities in the Mekong River Delta, South East and Central Highlands regions of Vietnam. There is also a need to address socio-economic factors associated with inadequate access to improved sanitation facilities.
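
    The urban-versus-rural odds ratio reported above can be recomputed from any 2x2 table of household counts. A minimal sketch with invented counts (not the actual MICS tabulations), using a Wald 95% confidence interval on the log-odds scale:

```python
import math

# Hypothetical 2x2 table: households with combined improved water and
# sanitation access, urban vs. rural (counts invented for illustration).
urban_yes, urban_no = 880, 120
rural_yes, rural_no = 770, 230

odds_ratio = (urban_yes * rural_no) / (urban_no * rural_yes)

# 95% Wald confidence interval, computed on the log-odds scale
se = math.sqrt(1 / urban_yes + 1 / urban_no + 1 / rural_yes + 1 / rural_no)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)

print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```

    The survey papers report model-adjusted odds ratios from logistic regression; the raw 2x2 calculation above is the unadjusted special case.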

  11. Opportunities and Challenges in Supply-Side Simulation: Physician-Based Models

    PubMed Central

    Gresenz, Carole Roan; Auerbach, David I; Duarte, Fabian

    2013-01-01

    Objective: To provide a conceptual framework and to assess the availability of empirical data for supply-side microsimulation modeling in the context of health care. Data Sources: Multiple secondary data sources, including the American Community Survey, Health Tracking Physician Survey, and SK&A physician database. Study Design: We apply our conceptual framework to one entity in the health care market, physicians, and identify, assess, and compare data available for physician-based simulation models. Principal Findings: Our conceptual framework describes three broad types of data required for supply-side microsimulation modeling. Our assessment of available data for modeling physician behavior suggests broad comparability across various sources on several dimensions and highlights the need for significant integration of data across multiple sources to provide a platform adequate for modeling. A growing literature provides potential estimates for use as behavioral parameters that could serve as the models' engines. Sources of data for simulation modeling that account for the complex organizational and financial relationships among physicians and other supply-side entities are limited. Conclusions: A key challenge for supply-side microsimulation modeling is optimally combining available data to harness their collective power. Several possibilities also exist for novel data collection. These have the potential to serve as catalysts for the next generation of supply-side-focused simulation models to inform health policy. PMID:23347041

  12. On the source of cross-grain lineations in the central Pacific gravity field

    NASA Technical Reports Server (NTRS)

    Mcadoo, David C.; Sandwell, David T.

    1989-01-01

    The source of cross-grain lineations in the marine gravity field observed in the central Pacific was investigated by comparing multiple collinear gravity profiles from Geosat data with coincident bathymetry profiles in the Fourier transform domain. Bathymetric data were collected by multibeam sonar systems operating from two research vessels, one in June-August 1985, the other in February and March 1987. The results of this analysis indicate that the lineations are superficial features that appear to result from a combination of subsurface and surface loads supported by a thin (2 km to 5 km) lithosphere.

  13. Safe, Multiphase Bounds Check Elimination in Java

    DTIC Science & Technology

    2010-01-28

    production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses...invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual...unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds

  14. Determining Object Orientation from a Single Image Using Multiple Information Sources.

    DTIC Science & Technology

    1984-06-01

    object surface. Location of the image ellipse is accomplished by exploiting knowledge about object boundaries and image intensity gradients. ...Using Intensity Gradient Information for Ellipse Fitting... Orientation From Ellipses... Application...object boundaries and image intensity gradients. The orientation information from each of these three methods is combined using a "plausibility" function

  15. Fast and Efficient Feature Engineering for Multi-Cohort Analysis of EHR Data.

    PubMed

    Ozery-Flato, Michal; Yanover, Chen; Gottlieb, Assaf; Weissbrod, Omer; Parush Shear-Yashuv, Naama; Goldschmidt, Yaara

    2017-01-01

    We present a framework for feature engineering, tailored for longitudinal structured data, such as electronic health records (EHRs). To fast-track feature engineering and extraction, the framework combines general-use plug-in extractors, a multi-cohort management mechanism, and modular memoization. Using this framework, we rapidly extracted thousands of features from diverse and large healthcare data sources in multiple projects.
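
    The memoization idea (compute each cohort/feature combination once, then reuse it across projects) can be sketched in a few lines; the extractor below is an invented stand-in, not the authors' framework:

```python
from functools import lru_cache

# Stand-in for modular memoization: each (cohort, feature) extraction
# runs once, then later requests are served from the cache.
CALLS = {"count": 0}

@lru_cache(maxsize=None)
def extract_feature(cohort: str, feature: str) -> tuple:
    CALLS["count"] += 1  # count actual (non-cached) computations
    # A real extractor would aggregate EHR events; return a dummy value here.
    return (cohort, feature, 42)

extract_feature("cohort_2010", "mean_hba1c")
extract_feature("cohort_2010", "mean_hba1c")   # cache hit, no recomputation
extract_feature("cohort_2011", "mean_hba1c")   # new cohort, computed afresh
print(CALLS["count"])
```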

  16. Identifying Greater Sage-Grouse source and sink habitats for conservation planning in an energy development landscape.

    PubMed

    Kirol, Christopher P; Beck, Jeffrey L; Huzurbazar, Snehalata V; Holloran, Matthew J; Miller, Scott N

    2015-06-01

    Conserving a declining species that is facing many threats, including overlap of its habitats with energy extraction activities, depends upon identifying and prioritizing the value of the habitats that remain. In addition, habitat quality is often compromised when source habitats are lost or fragmented due to anthropogenic development. Our objective was to build an ecological model to classify and map habitat quality in terms of source or sink dynamics for Greater Sage-Grouse (Centrocercus urophasianus) in the Atlantic Rim Project Area (ARPA), a developing coalbed natural gas field in south-central Wyoming, USA. We used occurrence and survival modeling to evaluate relationships between environmental and anthropogenic variables at multiple spatial scales and for all female summer life stages, including nesting, brood-rearing, and non-brooding females. For each life stage, we created resource selection functions (RSFs). We weighted the RSFs and combined them to form a female summer occurrence map. We also modeled survival as a function of spatial variables for nest, brood, and adult female summer survival. Our survival models were mapped as survival probability functions individually and then combined with fixed vital rates in a fitness metric model that, when mapped, predicted habitat productivity (productivity map). Our results identified a suite of environmental and anthropogenic variables at multiple scales that were predictive of occurrence and survival. We created a source-sink map by overlaying our female summer occurrence map and productivity map to predict habitats contributing to population surpluses (source habitats) or deficits (sink habitats) and low-occurrence habitats on the landscape. The source-sink map predicted that of the Sage-Grouse habitat within the ARPA, 30% was primary source, 29% was secondary source, 4% was primary sink, 6% was secondary sink, and 31% was low occurrence.
Our results provide evidence that energy development and avoidance of energy infrastructure were probably reducing the amount of source habitat within the ARPA landscape. Our source-sink map provides managers with a means of prioritizing habitats for conservation planning based on source and sink dynamics. The spatial identification of high value (i.e., primary source) as well as suboptimal (i.e., primary sink) habitats allows for informed energy development to minimize effects on local wildlife populations.
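
    The overlay step (occurrence map combined with productivity map to yield the five mapped classes) can be sketched as a per-cell rule; the thresholds below are invented for illustration, not the study's fitted values (1.0 stands for a replacement-level fitness boundary):

```python
# Per-cell overlay of an occurrence surface and a productivity (fitness)
# surface into source/sink classes; thresholds are illustrative only.
def classify(occurrence: float, productivity: float) -> str:
    if occurrence < 0.3:
        return "low occurrence"
    if productivity >= 1.0:  # at or above population replacement
        return "primary source" if occurrence >= 0.6 else "secondary source"
    return "primary sink" if occurrence >= 0.6 else "secondary sink"

print(classify(0.8, 1.2))
print(classify(0.7, 0.6))
print(classify(0.1, 1.5))
```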

  17. Methodological issues in current practice may lead to bias in the development of biomarker combinations for predicting acute kidney injury.

    PubMed

    Meisner, Allison; Kerr, Kathleen F; Thiessen-Philbrook, Heather; Coca, Steven G; Parikh, Chirag R

    2016-02-01

    Individual biomarkers of renal injury are only modestly predictive of acute kidney injury (AKI). Using multiple biomarkers has the potential to improve predictive capacity. In this systematic review, statistical methods of articles developing biomarker combinations to predict AKI were assessed. We identified and described three potential sources of bias (resubstitution bias, model selection bias, and bias due to center differences) that may compromise the development of biomarker combinations. Fifteen studies reported developing kidney injury biomarker combinations for the prediction of AKI after cardiac surgery (8 articles), in the intensive care unit (4 articles), or other settings (3 articles). All studies were susceptible to at least one source of bias and did not account for or acknowledge the bias. Inadequate reporting often hindered our assessment of the articles. We then evaluated, when possible (7 articles), the performance of published biomarker combinations in the TRIBE-AKI cardiac surgery cohort. Predictive performance was markedly attenuated in six out of seven cases. Thus, deficiencies in analysis and reporting are avoidable, and care should be taken to provide accurate estimates of risk prediction model performance. Hence, rigorous design, analysis, and reporting of biomarker combination studies are essential to realizing the promise of biomarkers in clinical practice.
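
    Resubstitution bias, the first source of bias identified above, is easy to demonstrate: evaluating a model on the same data used to fit it inflates apparent performance. A self-contained sketch with pure-noise "biomarkers" and a 1-nearest-neighbour rule (illustrative only, not any study's model):

```python
import random

random.seed(0)
# Pure-noise "biomarkers": any apparent predictive skill is overfitting.
train = [([random.gauss(0, 1) for _ in range(5)], random.randint(0, 1))
         for _ in range(100)]
test = [([random.gauss(0, 1) for _ in range(5)], random.randint(0, 1))
        for _ in range(200)]

def predict(x, pool):
    # 1-nearest-neighbour by squared Euclidean distance
    return min(pool, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

# Resubstitution: scoring on the training data itself is perfect by construction
resub = sum(predict(x, train) == y for x, y in train) / len(train)
# Honest estimate: scoring on held-out data is near chance
heldout = sum(predict(x, train) == y for x, y in test) / len(test)
print(resub, round(heldout, 2))
```

    The gap between the two numbers is exactly the kind of optimism the review found uncorrected in published biomarker-combination studies.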

  18. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  19. SynergyFinder: a web application for analyzing drug combination dose-response matrix data.

    PubMed

    Ianevski, Aleksandr; He, Liye; Aittokallio, Tero; Tang, Jing

    2017-08-01

    Rational design of drug combinations has become a promising strategy to tackle the drug sensitivity and resistance problem in cancer treatment. To systematically evaluate the pre-clinical significance of pairwise drug combinations, functional screening assays that probe combination effects in a dose-response matrix assay are commonly used. To facilitate the analysis of such drug combination experiments, we implemented a web application that uses key functions of the R-package SynergyFinder, and provides not only the flexibility of using multiple synergy scoring models, but also a user-friendly interface for visualizing the drug combination landscapes in an interactive manner. The SynergyFinder web application is freely accessible at https://synergyfinder.fimm.fi; the R-package and its source code are freely available at http://bioconductor.org/packages/release/bioc/html/synergyfinder.html. Contact: jing.tang@helsinki.fi. © The Author(s) 2017. Published by Oxford University Press.
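
    The scoring itself lives in the R-package; as a language-agnostic illustration, one of the standard synergy models, Bliss independence, scores the excess of the observed combination response over the expected non-interacting response (values below are invented, not app output):

```python
# Bliss independence: the expected fractional inhibition of a
# non-interacting drug pair is y_a + y_b - y_a * y_b; the synergy score
# is the observed combination response minus that expectation.
def bliss_excess(y_a: float, y_b: float, y_ab: float) -> float:
    expected = y_a + y_b - y_a * y_b
    return y_ab - expected  # > 0 suggests synergy, < 0 antagonism

print(round(bliss_excess(0.30, 0.40, 0.70), 2))
```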

  20. Integrative data analysis in clinical psychology research.

    PubMed

    Hussong, Andrea M; Curran, Patrick J; Bauer, Daniel J

    2013-01-01

    Integrative data analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology.

  1. Integrative Data Analysis in Clinical Psychology Research

    PubMed Central

    Hussong, Andrea M.; Curran, Patrick J.; Bauer, Daniel J.

    2013-01-01

    Integrative Data Analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology. PMID:23394226

  2. Strong ground motion simulation of the 2016 Kumamoto earthquake of April 16 using multiple point sources

    NASA Astrophysics Data System (ADS)

    Nagasaka, Yosuke; Nozu, Atsushi

    2017-02-01

    The pseudo point-source model approximates the rupture process on faults with multiple point sources for simulating strong ground motions. A simulation with this point-source model is conducted by combining a simple source spectrum following the omega-square model with a path spectrum, an empirical site amplification factor, and phase characteristics. Realistic waveforms can be synthesized using the empirical site amplification factor and phase models even though the source model is simple. The Kumamoto earthquake occurred on April 16, 2016, with MJMA 7.3. Many strong motions were recorded at stations around the source region. Some records were considered to be affected by the rupture directivity effect. This earthquake was suitable for investigating the applicability of the pseudo point-source model, the current version of which does not consider the rupture directivity effect. Three subevents (point sources) were located on the fault plane, and the parameters of the simulation were determined. The simulated results were compared with the observed records at K-NET and KiK-net stations. It was found that the synthetic Fourier spectra and velocity waveforms generally explained the characteristics of the observed records, except for underestimation in the low frequency range. Troughs in the observed Fourier spectra were also well reproduced by placing multiple subevents near the hypocenter. The underestimation is presumably due to the following two reasons. The first is that the pseudo point-source model targets subevents that generate strong ground motions and does not consider the shallow large slip. The second reason is that the current version of the pseudo point-source model does not consider the rupture directivity effect. Consequently, strong pulses were not fully reproduced at stations northeast of Subevent 3 such as KMM004, where the effect of rupture directivity was significant, while the amplitude was well reproduced at most of the other stations.
    This result indicates the need to improve the pseudo point-source model, for example by introducing an azimuth-dependent corner frequency, so that it can incorporate the effect of rupture directivity.
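
    The omega-square source spectrum assigned to each subevent has a simple closed form; a sketch in normalised units (the moment and corner frequency below are illustrative, not values fitted to the Kumamoto records):

```python
# Omega-square (Brune-type) source displacement spectrum:
#   |S(f)| = M0 / (1 + (f / fc)**2)
# flat below the corner frequency fc, rolling off as f**-2 above it.
def omega_square(f: float, m0: float, fc: float) -> float:
    return m0 / (1.0 + (f / fc) ** 2)

m0, fc = 1.0, 1.0  # normalised moment and corner frequency
print(omega_square(0.01, m0, fc))  # plateau well below fc
print(omega_square(10.0, m0, fc))  # ~f**-2 decay well above fc
```

    An azimuth-dependent corner frequency, as suggested above, would make fc a function of the station azimuth relative to the rupture propagation direction.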

  3. Viability of NLCD Products From IRS-P6, And From Landsat 7 Scan-gap Data

    NASA Technical Reports Server (NTRS)

    Coan, Michael

    2007-01-01

    A land cover test on the Salt Lake test site illustrates potential issues with AWiFS/LISS-III for classification of certain land cover classes (evergreen, shrub/scrub, woody wetlands, emergent wetlands). Canopy and impervious graphs of product differences from source indicate slightly lower overall accuracies (shorter peaks, wider bases) for AWiFS/LISS-III compared to L5/L7. Inspection of individual products from the canopy and impervious estimate tests revealed issues with combining AWiFS quadrants, and similar but less severe effects when combining multiple dates of L7 scan-gap data.

  4. Open Source Platform Application to Groundwater Characterization and Monitoring

    NASA Astrophysics Data System (ADS)

    Ntarlagiannis, D.; Day-Lewis, F. D.; Falzone, S.; Lane, J. W., Jr.; Slater, L. D.; Robinson, J.; Hammett, S.

    2017-12-01

    Groundwater characterization and monitoring commonly rely on the use of multiple point sensors and human labor. Due to the number of sensors, labor, and other resources needed, establishing and maintaining an adequate groundwater monitoring network can be both labor intensive and expensive. To improve and optimize the monitoring network design, open-source software and hardware components could potentially provide the platform to control robust and efficient sensors, thereby reducing costs and labor. This work presents early attempts to create a groundwater monitoring system incorporating open-source software and hardware that will control the remote operation of multiple sensors along with data management and file transfer functions. The system is built around a Raspberry Pi 3 that controls multiple sensors to perform on-demand, continuous, or 'smart decision' measurements while providing flexibility to incorporate additional sensors to meet the demands of different projects. The current objective of our technology is to monitor exchange of ionic tracers between mobile and immobile porosity using a combination of fluid and bulk electrical-conductivity measurements. To meet this objective, our configuration uses four sensors (pH, specific conductance, pressure, temperature) that can monitor the fluid electrical properties of interest and guide the bulk electrical measurement. This system highlights the potential of using open-source software and hardware components for earth science applications. The versatility of the system makes it ideal for use in a large number of applications, and the low cost allows for high-resolution (spatial and temporal) monitoring.
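
    A polling routine of the kind described (four probes read on demand or on a schedule) might look like the following; read_sensor() and the probe names are hypothetical stand-ins for the real driver calls:

```python
import time

# Hypothetical readout for the four probes; read_sensor() stands in for
# real driver calls on the single-board computer.
def read_sensor(name: str) -> float:
    fake = {"pH": 7.1, "EC_uS_cm": 480.0, "pressure_kPa": 101.4, "temp_C": 18.2}
    return fake[name]

def take_reading(sensors=("pH", "EC_uS_cm", "pressure_kPa", "temp_C")):
    # One timestamped record per polling cycle
    return {"t": time.time(), **{s: read_sensor(s) for s in sensors}}

record = take_reading()
print(sorted(k for k in record if k != "t"))
```

    In a deployed system the same record dict would feed the data-management and file-transfer functions mentioned above.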

  5. Combining data from multiple sources using the CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ames, D. P.; Horsburgh, J. S.; Goodall, J. L.

    2012-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has developed a Hydrologic Information System (HIS) to provide better access to data by enabling the publication, cataloging, discovery, retrieval, and analysis of hydrologic data using web services. The CUAHSI HIS is an Internet-based system comprising hydrologic databases and servers connected through web services, as well as software for data publication, discovery, and access. The HIS metadata catalog lists close to 100 web services registered to provide data through this system, ranging from large federal agency data sets to experimental watersheds managed by university investigators. The system's flexibility in storing and enabling public access to similarly formatted data and metadata has created a community data resource from governmental and academic data that might otherwise remain private or be analyzed only in isolation. Comprehensive understanding of hydrology requires integration of this information from multiple sources. HydroDesktop is the client application developed as part of HIS to support data discovery and access through this system. HydroDesktop is founded on an open-source GIS client and has a plug-in architecture that has enabled the integration of modeling and analysis capability with the functionality for data discovery and access. Model integration is possible through a plug-in built on the OpenMI standard, and data visualization and analysis are supported by an R plug-in. This presentation will demonstrate HydroDesktop, showing how it provides an analysis environment within which data from multiple sources can be discovered, accessed, and integrated.

  6. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties: the properties of sediment sources should remain constant through sediment detachment, transportation and deposition processes, or at the very least vary in a predictable and measurable way. One approach to selecting conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward, and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
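
    A minimal version of the discrimination screen mentioned above (a Kruskal-Wallis H statistic, without tie correction) applied to invented tracer concentrations for three hypothetical source groups:

```python
# Minimal Kruskal-Wallis H statistic (no tie correction; assumes all
# pooled values are distinct). Concentrations are invented for three
# hypothetical sediment source groups.
def kruskal_h(*groups):
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n = len(pooled)
    r = sum(len(g) * (sum(rank[v] for v in g) / len(g)) ** 2 for g in groups)
    return 12.0 * r / (n * (n + 1)) - 3.0 * (n + 1)

surface = [5.1, 4.8, 5.6]
subsurface = [2.0, 2.4, 1.9]
channel = [3.3, 3.0, 3.7]
# Fully separated groups give the maximum H for this design
print(round(kruskal_h(surface, subsurface, channel), 2))
```

    In practice a library implementation with tie correction (e.g. scipy.stats.kruskal) would be used; the point of the sketch is only the rank-based discrimination logic.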

  7. A method for separation of heavy metal sources in urban groundwater using multiple lines of evidence.

    PubMed

    Hepburn, Emily; Northway, Anne; Bekele, Dawit; Liu, Gang-Jun; Currell, Matthew

    2018-06-11

    Determining sources of heavy metals in soils, sediments and groundwater is important for understanding their fate and transport and mitigating human and environmental exposures. Artificially imported fill, natural sediments and groundwater from 240 ha of reclaimed land at Fishermans Bend in Australia were analysed for heavy metals and other parameters to determine the relative contributions from different possible sources. Fishermans Bend is Australia's largest urban re-development project; however, its complicated land-use history, geology, and multiple contamination sources pose challenges to successful re-development. We developed a method for heavy metal source separation in groundwater using statistical categorisation of the data, analysis of soil leaching values and fill/sediment XRF profiling. The method identified two major sources of heavy metals in groundwater: 1. point sources from local or up-gradient groundwater contaminated by industrial activities and/or legacy landfills; and 2. contaminated fill, where leaching of Cu, Mn, Pb and Zn was observed. Across the precinct, metals were most commonly sourced from a combination of these sources; however, eight locations indicated at least one metal sourced solely from fill leaching, and 23 locations indicated at least one metal sourced solely from impacted groundwater. Concentrations of heavy metals in groundwater spanned 0.0001-0.003 mg/L (Cd), 0.001-0.1 mg/L (Cr), 0.001-0.2 mg/L (Cu), 0.001-0.5 mg/L (Ni), 0.001-0.01 mg/L (Pb), and 0.005-1.2 mg/L (Zn). Our method can determine the likely contribution of different metal sources to groundwater, helping inform more detailed contamination assessments and precinct-wide management and remediation strategies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. The use of coded PCR primers enables high-throughput sequencing of multiple homolog amplification products by 454 parallel sequencing.

    PubMed

    Binladen, Jonas; Gilbert, M Thomas P; Bollback, Jonathan P; Panitz, Frank; Bendixen, Christian; Nielsen, Rasmus; Willerslev, Eske

    2007-02-14

    The invention of the Genome Sequence 20 DNA Sequencing System (454 parallel sequencing platform) has enabled the rapid and high-volume production of sequence data. Until now, however, individual emulsion PCR (emPCR) reactions and subsequent sequencing runs have been unable to combine template DNA from multiple individuals, as homologous sequences cannot be subsequently assigned to their original sources. We use conventional PCR with 5'-nucleotide tagged primers to generate homologous DNA amplification products from multiple specimens, followed by sequencing through the high-throughput Genome Sequence 20 DNA Sequencing System (GS20, Roche/454 Life Sciences). Each DNA sequence is subsequently traced back to its individual source through 5' tag analysis. We demonstrate that this new approach enables the assignment of virtually all the generated DNA sequences to the correct source once sequencing anomalies are accounted for (misassignment rate < 0.4%). Therefore, the method enables accurate sequencing and assignment of homologous DNA sequences from multiple sources in a single high-throughput GS20 run. We observe a bias in the distribution of the differently tagged primers that is dependent on the 5' nucleotide of the tag. In particular, primers 5' labelled with a cytosine are heavily overrepresented among the final sequences, while those 5' labelled with a thymine are strongly underrepresented. A weaker bias also exists with regard to the distribution of the sequences as sorted by the second nucleotide of the dinucleotide tags. As the results are based on a single GS20 run, the general applicability of the approach requires confirmation. However, our experiments demonstrate that 5' primer tagging is a useful method in which the sequencing power of the GS20 can be applied to PCR-based assays of multiple homologous PCR products.
The new approach will be of value to a broad range of research areas, such as those of comparative genomics, complete mitochondrial analyses, population genetics, and phylogenetics.
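
    The tag-to-source assignment step amounts to a lookup on the first bases of each read; the tags and reads below are invented for illustration, not the study's actual primer tags:

```python
# Each read is traced back to its specimen via its 5' tag; unrecognised
# tags (e.g. sequencing anomalies) yield None and would be discarded.
TAGS = {"ACGT": "specimen_1", "TGCA": "specimen_2"}

def assign(read: str):
    return TAGS.get(read[:4])

reads = ["ACGTGGATTA", "TGCAGGATTC", "NNNNGGATTA"]
assigned = [assign(r) for r in reads]
print(assigned)
```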

  9. Multi-point laser ignition device

    DOEpatents

    McIntyre, Dustin L.; Woodruff, Steven D.

    2017-01-17

    A multi-point laser device comprising a plurality of optical pumping sources. Each optical pumping source is configured to create pumping excitation energy along a corresponding optical path directed through a high-reflectivity mirror and into substantially different locations within the laser media thereby producing atomic optical emissions at substantially different locations within the laser media and directed along a corresponding optical path of the optical pumping source. An output coupler and one or more output lenses are configured to produce a plurality of lasing events at substantially different times, locations or a combination thereof from the multiple atomic optical emissions produced at substantially different locations within the laser media. The laser media is a single continuous media, preferably grown on a single substrate.

  10. Multisource Data Integration in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1991-01-01

    Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer) model of this world. Multiple sources may give complementary views of the world: consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.

  11. Parameter estimation for slit-type scanning sensors

    NASA Technical Reports Server (NTRS)

    Fowler, J. W.; Rolfe, E. G.

    1981-01-01

    The Infrared Astronomical Satellite, scheduled for launch into a 900 km near-polar orbit in August 1982, will perform an infrared point source survey by scanning the sky with slit-type sensors. The description of position information is shown to require the use of a non-Gaussian random variable. Methods are described for deciding whether separate detections stem from a single common source, and a formalism is developed for the scan-to-scan problem of identifying multiple sightings of inertially fixed point sources and combining their individual measurements into a refined estimate. Several cases are given where the general theory yields results that are quite different from the corresponding Gaussian applications, showing that argument by Gaussian analogy would lead to error.

  12. Quantifying methane emission from fugitive sources by combining tracer release and downwind measurements - a sensitivity analysis based on multiple field surveys.

    PubMed

    Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte

    2014-08-01

    Using a dual-species methane/acetylene instrument based on cavity ring-down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) a landfill with unknown emission source locations, (3) a landfill with closely located emission sources, and (4) comparison with a Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gases for source separation. The new real-time, high-precision instrument can measure methane plumes more than 1.2 km away from small sources (about 5 kg/h) in urban areas with a measurement frequency allowing plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can be used to quantify individual sources with the right choice of wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification, and uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement, and model calculations showed an uncertainty of less than 5% in both urban and open country for placing the trace gas 100 m from the source, when measurements were done more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared to the ratio of the highest concentrations in the plume, the direct concentration ratio, and using a Gaussian plume model.
Under suitable weather and road conditions, the CRDS system can quantify the emission from different sources located close to each other using only one kind of trace gas due to the high time resolution, while the FTIR system can measure multiple trace gasses but with a lower time resolution. Copyright © 2014 Elsevier Ltd. All rights reserved.
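The core of the tracer dispersion method described above is a ratio calculation: the unknown methane emission rate equals the known tracer release rate scaled by the ratio of the plume-integrated methane and tracer enhancements, with a molar-mass correction. A minimal sketch with invented transect numbers (not data from the surveys; the function name and values are illustrative):

```python
import numpy as np

def tracer_dilution_emission(ch4_ppb, tracer_ppb, q_tracer_kg_h,
                             m_ch4=16.04, m_tracer=26.04):
    """Methane emission rate from one plume transect via the tracer
    ratio: known tracer (acetylene) release rate times the ratio of
    plume-integrated mixing-ratio enhancements, converted from a mole
    ratio to a mass ratio with the molar masses."""
    ratio = np.sum(ch4_ppb) / np.sum(tracer_ppb)
    return q_tracer_kg_h * ratio * (m_ch4 / m_tracer)

# Hypothetical background-subtracted transect (ppb above background)
ch4 = np.array([0.0, 10.0, 20.0, 10.0, 0.0])
c2h2 = np.array([0.0, 5.0, 10.0, 5.0, 0.0])
rate = tracer_dilution_emission(ch4, c2h2, q_tracer_kg_h=1.0)
# rate is 2 x (16.04 / 26.04), about 1.23 kg h^-1
```

Co-locating the tracer with the source makes this ratio distance-independent, which is why misplacement of the tracer bottle dominates the error budget at short range.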

  13. Combining multiple sources of data to inform conservation of Lesser Prairie-Chicken populations

    USGS Publications Warehouse

    Ross, Beth; Haukos, David A.; Hagen, Christian A.; Pitman, James

    2018-01-01

Conservation of small populations is often based on limited data from spatially and temporally restricted studies, resulting in management actions based on an incomplete assessment of the population drivers. If fluctuations in abundance are related to changes in weather, proper management is especially important, because extreme weather events could disproportionately affect population abundance. Conservation assessments, especially for vulnerable populations, are aided by knowledge of how extreme events influence population status and trends. Although important for conservation efforts, data may be limited for small or vulnerable populations. Integrated population models maximize information from various sources of data to yield population estimates that fully incorporate uncertainty from multiple data sources while allowing for the explicit incorporation of environmental covariates of interest. Our goal was to assess the relative influence of population drivers for the Lesser Prairie-Chicken (Tympanuchus pallidicinctus) in the core of its range, western and southern Kansas, USA. We used data from roadside lek count surveys, nest monitoring surveys, and survival data from telemetry monitoring combined with climate (Palmer drought severity index) data in an integrated population model. Our results indicate that variability in population growth rate was most influenced by variability in juvenile survival. The Palmer drought severity index had no measurable direct effects on adult survival or mean number of offspring per female; however, there were declines in population growth rate following severe drought. Because declines in population growth rate occurred at a broad spatial scale, declines in response to drought were likely due to decreases in chick and juvenile survival rather than emigration outside of the study area. 
Overall, our model highlights the importance of accounting for environmental and demographic sources of variability, and provides a thorough method for simultaneously evaluating population demography in response to long-term climate effects.

  14. A boundary element approach to optimization of active noise control sources on three-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cunefare, K. A.; Koopmann, G. H.

    1991-01-01

This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total radiated power from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation, and thus the optimization analysis applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
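The optimization at the heart of such an analysis is quadratic: total radiated power is a Hermitian form in the complex source strengths, so the optimum control-source drive solves a small linear system. A numpy sketch with a random stand-in coupling matrix (in the paper this matrix would come from the boundary-element model at each frequency; all values here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in Hermitian positive-definite "radiation resistance" matrix
# coupling 1 primary source and 2 control sources.
n = 3
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
R = M.conj().T @ M + n * np.eye(n)

q_p = np.array([1.0 + 0j])            # fixed primary source strength

R_pc = R[:1, 1:]                      # primary-control coupling block
R_cc = R[1:, 1:]                      # control-control block

# Total radiated power P(q) = q^H R q is quadratic in the control
# strengths q_c; the minimizer solves R_cc q_c = -R_cp q_p.
q_c = np.linalg.solve(R_cc, -R_pc.conj().T @ q_p)

def radiated_power(q_c):
    q = np.concatenate([q_p, np.asarray(q_c, complex)])
    return (q.conj() @ R @ q).real

reduction_db = 10 * np.log10(radiated_power(np.zeros(2))
                             / radiated_power(q_c))
```

Because the optimum depends on the coupling matrix, which is frequency dependent, the linear solve must be repeated per frequency, matching the single-frequency scope noted above.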

  15. Microwave Power Combiners for Signals of Arbitrary Amplitude

    NASA Technical Reports Server (NTRS)

    Conroy, Bruce; Hoppe, Daniel

    2009-01-01

Schemes for combining power from coherent microwave sources of arbitrary (unequal or equal) amplitude have been proposed. Most prior microwave-power-combining schemes are limited to sources of equal amplitude. The basic principle of the schemes now proposed is to use quasi-optical components to manipulate the polarizations and phases of two arbitrary-amplitude input signals in such a way as to combine them into one output signal having a specified, fixed polarization. To combine power from more than two sources, one could use multiple power-combining stages based on this principle, feeding the outputs of lower-power stages as inputs to higher-power stages. Quasi-optical components suitable for implementing these schemes include grids of parallel wires, vane polarizers, and a variety of waveguide structures. For the sake of brevity, the remainder of this article illustrates the basic principle by focusing on one scheme in which a wire grid and two vane polarizers would be used. Wire grids are the key quasi-optical elements in many prior equal-power combiners. In somewhat oversimplified terms, a wire grid reflects an incident beam having an electric field parallel to the wires and passes an incident beam having an electric field perpendicular to the wires. In a typical prior equal-power combining scheme, one provides for two properly phased, equal-amplitude signals having mutually perpendicular linear polarizations to impinge from two mutually perpendicular directions on a wire grid in a plane oriented at an angle of 45° with respect to both beam axes. The wires in the grid are oriented to pass one of the incident beams straight through onto the output path and to reflect the other incident beam onto the output path along with the first-mentioned beam.
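The combining principle can be written down with Jones vectors: the grid superposes the two orthogonally polarized inputs into one beam whose polarization angle depends on the amplitude ratio, and an ideal polarization rotation (standing in here for the vane polarizers) maps it onto the fixed output polarization. A sketch for in-phase inputs, with idealized lossless components:

```python
import numpy as np

def combine(a, b):
    """Combine two coherent, in-phase beams of amplitudes a and b.

    The wire grid passes the horizontally polarized beam and reflects
    the vertically polarized one onto the same axis, giving a single
    linearly polarized beam at angle arctan(b/a); an ideal rotation
    (a Jones rotation matrix, a simplified model of the vane
    polarizers) then aligns it with the fixed output polarization."""
    e = np.array([a, b], dtype=float)   # Jones vector after the grid
    theta = np.arctan2(b, a)
    rot = np.array([[np.cos(theta), np.sin(theta)],
                    [-np.sin(theta), np.cos(theta)]])
    return rot @ e                      # all power in one component

out = combine(3.0, 4.0)
# out is approximately [5, 0]: the combined power (3^2 + 4^2 = 25)
# emerges in a single fixed polarization.
```

The key difference from the equal-amplitude case is only that the required rotation angle is arctan(b/a) rather than the fixed 45°.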

  16. Disentangling formation of multiple-core holes in aminophenol molecules exposed to bright X-FEL radiation

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Kamińska, M.; Mucke, M.; Squibb, R. J.; Eland, J. H. D.; Piancastelli, M. N.; Frasinski, L. J.; Grilj, J.; Koch, M.; McFarland, B. K.; Sistrunk, E.; Gühr, M.; Coffee, R. N.; Bostedt, C.; Bozek, J. D.; Salén, P.; Meulen, P. v. d.; Linusson, P.; Thomas, R. D.; Larsson, M.; Foucar, L.; Ullrich, J.; Motomura, K.; Mondal, S.; Ueda, K.; Richter, R.; Prince, K. C.; Takahashi, O.; Osipov, T.; Fang, L.; Murphy, B. F.; Berrah, N.; Feifel, R.

    2015-12-01

    Competing multi-photon ionization processes, some leading to the formation of double core hole states, have been examined in 4-aminophenol. The experiments used the linac coherent light source (LCLS) x-ray free electron laser, in combination with a time-of-flight magnetic bottle electron spectrometer and the correlation analysis method of covariance mapping. The results imply that 4-aminophenol molecules exposed to the focused x-ray pulses of the LCLS sequentially absorb more than two x-ray photons, resulting in the formation of multiple core holes as well as in the sequential removal of photoelectrons and Auger electrons (so-called PAPA sequences).

  17. Disentangling formation of multiple-core holes in aminophenol molecules exposed to bright X-FEL radiation

    DOE PAGES

    Zhaunerchyk, V.; Kaminska, M.; Mucke, M.; ...

    2015-10-28

Competing multi-photon ionization processes, some leading to the formation of double core hole states, have been examined in 4-aminophenol. The experiments used the linac coherent light source (LCLS) x-ray free electron laser, in combination with a time-of-flight magnetic bottle electron spectrometer and the correlation analysis method of covariance mapping. The results imply that 4-aminophenol molecules exposed to the focused x-ray pulses of the LCLS sequentially absorb more than two x-ray photons, resulting in the formation of multiple core holes as well as in the sequential removal of photoelectrons and Auger electrons (so-called PAPA sequences).

  18. Localizing the sources of two independent noises: Role of time varying amplitude differences

    PubMed Central

    Yost, William A.; Brown, Christopher A.

    2013-01-01

    Listeners localized the free-field sources of either one or two simultaneous and independently generated noise bursts. Listeners' localization performance was better when localizing one rather than two sound sources. With two sound sources, localization performance was better when the listener was provided prior information about the location of one of them. Listeners also localized two simultaneous noise bursts that had sinusoidal amplitude modulation (AM) applied, in which the modulation envelope was in-phase across the two source locations or was 180° out-of-phase. The AM was employed to investigate a hypothesis as to what process listeners might use to localize multiple sound sources. The results supported the hypothesis that localization of two sound sources might be based on temporal-spectral regions of the combined waveform in which the sound from one source was more intense than that from the other source. The interaural information extracted from such temporal-spectral regions might provide reliable estimates of the sound source location that produced the more intense sound in that temporal-spectral region. PMID:23556597

  19. Localizing the sources of two independent noises: role of time varying amplitude differences.

    PubMed

    Yost, William A; Brown, Christopher A

    2013-04-01

    Listeners localized the free-field sources of either one or two simultaneous and independently generated noise bursts. Listeners' localization performance was better when localizing one rather than two sound sources. With two sound sources, localization performance was better when the listener was provided prior information about the location of one of them. Listeners also localized two simultaneous noise bursts that had sinusoidal amplitude modulation (AM) applied, in which the modulation envelope was in-phase across the two source locations or was 180° out-of-phase. The AM was employed to investigate a hypothesis as to what process listeners might use to localize multiple sound sources. The results supported the hypothesis that localization of two sound sources might be based on temporal-spectral regions of the combined waveform in which the sound from one source was more intense than that from the other source. The interaural information extracted from such temporal-spectral regions might provide reliable estimates of the sound source location that produced the more intense sound in that temporal-spectral region.
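The hypothesized mechanism in the two records above, extracting interaural cues from temporal-spectral regions where one source is more intense, can be illustrated with a toy binaural simulation (all signal parameters, gains and frame sizes are invented for illustration, not taken from the experiments):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 8000
t = np.arange(fs) / fs                   # 1 s of audio

# Two independent noises with 4 Hz AM envelopes 180 degrees out of phase
env1 = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))
env2 = 0.5 * (1 - np.sin(2 * np.pi * 4 * t))
s1 = env1 * rng.standard_normal(t.size)  # source on the left
s2 = env2 * rng.standard_normal(t.size)  # source on the right

# Toy binaural mixture: each source is more intense at its near ear
left = 1.0 * s1 + 0.3 * s2
right = 0.3 * s1 + 1.0 * s2

# Frame-wise interaural level difference (ILD): frames dominated by one
# source carry that source's interaural cue nearly uncorrupted
frame = 400
ild = np.array([
    10 * np.log10(np.mean(left[i:i + frame] ** 2)
                  / np.mean(right[i:i + frame] ** 2))
    for i in range(0, t.size, frame)])
left_dominant = ild > 0                  # one mode per source location
```

With in-phase envelopes the frames mix both sources and the per-frame cues blur together, which is consistent with the out-of-phase AM condition being the informative one.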

  20. The extended Beer-Lambert theory for ray tracing modeling of LED chip-scaled packaging application with multiple luminescence materials

    NASA Astrophysics Data System (ADS)

    Yuan, Cadmus C. A.

    2015-12-01

Optical ray tracing modeling has applied the Beer-Lambert method to single luminescence material systems to model the white light pattern from a blue LED light source. This paper extends the algorithm to a mixed multiple luminescence material system by introducing the equivalent excitation and emission spectra of the individual luminescence materials. The quantum efficiencies of the individual materials and the self-absorption of the multiple luminescence material system are considered as well. With this combination, researchers are able to model the luminescence characteristics of LED chip-scaled packaging (CSP), which provides simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results; a further parametric investigation is then conducted.
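The Beer-Lambert bookkeeping for a blend of luminescence materials can be sketched as follows. All coefficients are hypothetical, and the inter-material self-absorption that the paper also treats is omitted from this sketch:

```python
import numpy as np

def transmit_and_convert(i0, alphas, qes, path_mm):
    """Extended Beer-Lambert step for mixed luminescence materials.

    Pump light decays as exp(-sum(alpha_i) * d); material i absorbs a
    share alpha_i / sum(alpha) of the absorbed power and re-emits it
    scaled by its quantum efficiency qe_i. Self-absorption between
    materials is deliberately left out of this sketch."""
    alphas = np.asarray(alphas, float)
    qes = np.asarray(qes, float)
    a_tot = alphas.sum()
    transmitted = i0 * np.exp(-a_tot * path_mm)
    absorbed = i0 - transmitted
    converted = absorbed * (alphas / a_tot) * qes
    return transmitted, converted

# Blue pump through a 1.5 mm layer containing two mixed phosphors
blue_out, emitted = transmit_and_convert(
    i0=1.0, alphas=[0.8, 0.4], qes=[0.9, 0.85], path_mm=1.5)
```

In a ray tracer this per-segment update would be applied along each ray path, with each material's converted power re-launched using its emission spectrum.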

  1. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.

  2. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  3. A Robust, Scalable Framework for Conducting Climate Change Susceptibility Analyses

    DTIC Science & Technology

    2014-05-01

    for identifying areas of heightened risk from varying forms of climate forcings is needed. Based on global climate model projections, deviations from...framework provides an opportunity to easily combine multiple data sources — that are often freely available from many federal, state, and global ...Climate change and extreme weather events: implications for food production, plant diseases, and pests. Global Change and Human Health 2:90–104. ERDC/EL

  4. Optical design of a light-emitting diode lamp for a maritime lighthouse.

    PubMed

    Jafrancesco, D; Mercatelli, L; Sansoni, P; Fontani, D; Sani, E; Coraggia, S; Meucci, M; Francini, F

    2015-04-10

    Traffic signaling is an emerging field for light-emitting diode (LED) applications. This sustainable power-saving illumination technology can be used in maritime signaling thanks to the recently updated norms, where the possibility to utilize LED sources is explicitly cited, and to the availability of high-power white LEDs that, combined with suitable lenses, permit us to obtain well-collimated beams. This paper describes the optical design of a LED-based lamp that can replace a traditional lamp in an authentic marine lighthouse. This source recombines multiple separated LEDs realizing a quasi-punctual localized source. Advantages can be lower energy consumption, higher efficiency, longer life, fewer faults, slower aging, and minor maintenance costs. The proposed LED source allows us to keep and to utilize the old Fresnel lenses of the lighthouse, which very often have historical value.

  5. Combining peak- and chromatogram-based retention time alignment algorithms for multiple chromatography-mass spectrometry datasets.

    PubMed

    Hoffmann, Nils; Keck, Matthias; Neuweger, Heiko; Wilhelm, Mathias; Högy, Petra; Niehaus, Karsten; Stoye, Jens

    2012-08-27

    Modern analytical methods in biology and chemistry use separation techniques coupled to sensitive detectors, such as gas chromatography-mass spectrometry (GC-MS) and liquid chromatography-mass spectrometry (LC-MS). These hyphenated methods provide high-dimensional data. Comparing such data manually to find corresponding signals is a laborious task, as each experiment usually consists of thousands of individual scans, each containing hundreds or even thousands of distinct signals. In order to allow for successful identification of metabolites or proteins within such data, especially in the context of metabolomics and proteomics, an accurate alignment and matching of corresponding features between two or more experiments is required. Such a matching algorithm should capture fluctuations in the chromatographic system which lead to non-linear distortions on the time axis, as well as systematic changes in recorded intensities. Many different algorithms for the retention time alignment of GC-MS and LC-MS data have been proposed and published, but all of them focus either on aligning previously extracted peak features or on aligning and comparing the complete raw data containing all available features. In this paper we introduce two algorithms for retention time alignment of multiple GC-MS datasets: multiple alignment by bidirectional best hits peak assignment and cluster extension (BIPACE) and center-star multiple alignment by pairwise partitioned dynamic time warping (CeMAPP-DTW). We show how the similarity-based peak group matching method BIPACE may be used for multiple alignment calculation individually and how it can be used as a preprocessing step for the pairwise alignments performed by CeMAPP-DTW. We evaluate the algorithms individually and in combination on a previously published small GC-MS dataset studying the Leishmania parasite and on a larger GC-MS dataset studying grains of wheat (Triticum aestivum). 
We have shown that BIPACE achieves very high precision and recall and a very low number of false positive peak assignments on both evaluation datasets. CeMAPP-DTW finds a high number of true positives when executed on its own, but achieves even better results when BIPACE is used to constrain its search space. The source code of both algorithms is included in the OpenSource software framework Maltcms, which is available from http://maltcms.sf.net. The evaluation scripts of the present study are available from the same source.
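The pairwise warping step underlying CeMAPP-DTW can be illustrated with a plain dynamic time warping distance; the published algorithm adds peak-based partitioning and center-star multiple alignment, which this sketch omits:

```python
import numpy as np

def dtw(x, y):
    """Plain dynamic time warping distance between two 1-D profiles,
    a minimal stand-in for the pairwise step in CeMAPP-DTW."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the cheapest of the three predecessor alignments
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1],
                                 D[i - 1, j - 1])
    return D[n, m]

# A chromatogram-like trace and a phase-shifted copy
a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.sin(np.linspace(0, 2 * np.pi, 50) + 0.5)
d_self = dtw(a, a)   # identical traces align at zero cost
d_warp = dtw(a, b)   # never worse than rigid point-by-point matching
```

Constraining the warping path, as BIPACE anchors do for CeMAPP-DTW, shrinks the O(nm) search space and avoids pathological alignments.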

  6. Combining peak- and chromatogram-based retention time alignment algorithms for multiple chromatography-mass spectrometry datasets

    PubMed Central

    2012-01-01

    Background Modern analytical methods in biology and chemistry use separation techniques coupled to sensitive detectors, such as gas chromatography-mass spectrometry (GC-MS) and liquid chromatography-mass spectrometry (LC-MS). These hyphenated methods provide high-dimensional data. Comparing such data manually to find corresponding signals is a laborious task, as each experiment usually consists of thousands of individual scans, each containing hundreds or even thousands of distinct signals. In order to allow for successful identification of metabolites or proteins within such data, especially in the context of metabolomics and proteomics, an accurate alignment and matching of corresponding features between two or more experiments is required. Such a matching algorithm should capture fluctuations in the chromatographic system which lead to non-linear distortions on the time axis, as well as systematic changes in recorded intensities. Many different algorithms for the retention time alignment of GC-MS and LC-MS data have been proposed and published, but all of them focus either on aligning previously extracted peak features or on aligning and comparing the complete raw data containing all available features. Results In this paper we introduce two algorithms for retention time alignment of multiple GC-MS datasets: multiple alignment by bidirectional best hits peak assignment and cluster extension (BIPACE) and center-star multiple alignment by pairwise partitioned dynamic time warping (CeMAPP-DTW). We show how the similarity-based peak group matching method BIPACE may be used for multiple alignment calculation individually and how it can be used as a preprocessing step for the pairwise alignments performed by CeMAPP-DTW. We evaluate the algorithms individually and in combination on a previously published small GC-MS dataset studying the Leishmania parasite and on a larger GC-MS dataset studying grains of wheat (Triticum aestivum). 
Conclusions We have shown that BIPACE achieves very high precision and recall and a very low number of false positive peak assignments on both evaluation datasets. CeMAPP-DTW finds a high number of true positives when executed on its own, but achieves even better results when BIPACE is used to constrain its search space. The source code of both algorithms is included in the OpenSource software framework Maltcms, which is available from http://maltcms.sf.net. The evaluation scripts of the present study are available from the same source. PMID:22920415

  7. MODEL-FREE MULTI-PROBE LENSING RECONSTRUCTION OF CLUSTER MASS PROFILES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umetsu, Keiichi

    2013-05-20

Lens magnification by galaxy clusters induces characteristic spatial variations in the number counts of background sources, amplifying their observed fluxes and expanding the area of sky, the net effect of which, known as magnification bias, depends on the intrinsic faint-end slope of the source luminosity function. The bias is strongly negative for red galaxies, dominated by the geometric area distortion, whereas it is mildly positive for blue galaxies, enhancing the blue counts toward the cluster center. We generalize the Bayesian approach of Umetsu et al. for reconstructing projected cluster mass profiles, by incorporating multiple populations of background sources for magnification-bias measurements and combining them with complementary lens-distortion measurements, effectively breaking the mass-sheet degeneracy and improving the statistical precision of cluster mass measurements. The approach can be further extended to include strong-lensing projected mass estimates, thus allowing for non-parametric absolute mass determinations in both the weak and strong regimes. We apply this method to our recent CLASH lensing measurements of MACS J1206.2-0847, and demonstrate how combining multi-probe lensing constraints can improve the reconstruction of cluster mass profiles. This method will also be useful for a stacked lensing analysis, combining all lensing-related effects in the cluster regime, for a definitive determination of the averaged mass profile.

  8. A highly sensitive search strategy for clinical trials in Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) was developed.

    PubMed

    Manríquez, Juan J

    2008-04-01

Systematic reviews should include as many articles as possible. However, many systematic reviews use only databases with high English language content as sources of trials. Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) is an underused source of trials, and there is no validated strategy for searching clinical trials in this database. The objective of this study was to develop a sensitive search strategy for clinical trials in LILACS. An analytical survey was performed. Several single and multiple-term search strategies were tested for their ability to retrieve clinical trials in LILACS. Sensitivity, specificity, and accuracy of each single and multiple-term strategy were calculated using the results of a hand-search of 44 Chilean journals as the gold standard. After combining the most sensitive, specific, and accurate single and multiple-term search strategies, a strategy with a sensitivity of 97.75% (95% confidence interval [CI]=95.98-99.53) and a specificity of 61.85% (95% CI=61.19-62.51) was obtained. LILACS is a source of trials that could improve systematic reviews. A new highly sensitive search strategy for clinical trials in LILACS has been developed. It is hoped this search strategy will improve and increase the utilization of LILACS in future systematic reviews.
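The reported measures follow from standard contingency-table counts against the hand-search gold standard. A sketch with invented counts, not the study's 44-journal data:

```python
def search_stats(retrieved, relevant, total):
    """Sensitivity, specificity and accuracy of a search strategy
    against a hand-search gold standard."""
    tp = len(retrieved & relevant)      # trials found by the strategy
    fp = len(retrieved - relevant)      # non-trials retrieved
    fn = len(relevant - retrieved)      # trials missed
    tn = total - tp - fp - fn           # non-trials correctly excluded
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / total

# Toy example: 10 true trials among 100 indexed records
relevant = set(range(10))
retrieved = set(range(9)) | {50, 51}    # misses 1 trial, 2 false hits
sens, spec, acc = search_stats(retrieved, relevant, total=100)
# sens = 0.90, spec about 0.978, acc = 0.97
```

For a systematic-review filter the asymmetry in the published figures (97.75% sensitivity versus 61.85% specificity) is the intended trade-off: missed trials are costlier than extra records to screen.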

  9. An integrated network of Arabidopsis growth regulators and its use for gene prioritization.

    PubMed

    Sabaghian, Ehsan; Drebert, Zuzanna; Inzé, Dirk; Saeys, Yvan

    2015-12-01

    Elucidating the molecular mechanisms that govern plant growth has been an important topic in plant research, and current advances in large-scale data generation call for computational tools that efficiently combine these different data sources to generate novel hypotheses. In this work, we present a novel, integrated network that combines multiple large-scale data sources to characterize growth regulatory genes in Arabidopsis, one of the main plant model organisms. The contributions of this work are twofold: first, we characterized a set of carefully selected growth regulators with respect to their connectivity patterns in the integrated network, and, subsequently, we explored to which extent these connectivity patterns can be used to suggest new growth regulators. Using a large-scale comparative study, we designed new supervised machine learning methods to prioritize growth regulators. Our results show that these methods significantly improve current state-of-the-art prioritization techniques, and are able to suggest meaningful new growth regulators. In addition, the integrated network is made available to the scientific community, providing a rich data source that will be useful for many biological processes, not necessarily restricted to plant growth.

  10. A predictive assessment of genetic correlations between traits in chickens using markers.

    PubMed

    Momen, Mehdi; Mehrgardi, Ahmad Ayatollahi; Sheikhy, Ayoub; Esmailizadeh, Ali; Fozi, Masood Asadi; Kranis, Andreas; Valente, Bruno D; Rosa, Guilherme J M; Gianola, Daniel

    2017-02-01

    Genomic selection has been successfully implemented in plant and animal breeding programs to shorten generation intervals and accelerate genetic progress per unit of time. In practice, genomic selection can be used to improve several correlated traits simultaneously via multiple-trait prediction, which exploits correlations between traits. However, few studies have explored multiple-trait genomic selection. Our aim was to infer genetic correlations between three traits measured in broiler chickens by exploring kinship matrices based on a linear combination of measures of pedigree and marker-based relatedness. A predictive assessment was used to gauge genetic correlations. A multivariate genomic best linear unbiased prediction model was designed to combine information from pedigree and genome-wide markers in order to assess genetic correlations between three complex traits in chickens, i.e. body weight at 35 days of age (BW), ultrasound area of breast meat (BM) and hen-house egg production (HHP). A dataset with 1351 birds that were genotyped with the 600 K Affymetrix platform was used. A kinship kernel (K) was constructed as K = λ G + (1 - λ)A, where A is the numerator relationship matrix, measuring pedigree-based relatedness, and G is a genomic relationship matrix. The weight (λ) assigned to each source of information varied over the grid λ = (0, 0.2, 0.4, 0.6, 0.8, 1). Maximum likelihood estimates of heritability and genetic correlations were obtained at each λ, and the "optimum" λ was determined using cross-validation. Estimates of genetic correlations were affected by the weight placed on the source of information used to build K. For example, the genetic correlation between BW-HHP and BM-HHP changed markedly when λ varied from 0 (only A used for measuring relatedness) to 1 (only genomic information used). 
As λ increased, predictive correlations (correlation between observed phenotypes and predicted breeding values) increased and mean-squared predictive error decreased. However, the improvement in predictive ability was not monotonic, with an optimum found at some 0 < λ < 1, i.e., when both sources of information were used together. Our findings indicate that multiple-trait prediction may benefit from combining pedigree and marker information. Also, it appeared that expected correlated responses to selection computed from standard theory may differ from realized responses. The predictive assessment provided a metric for performance evaluation as well as a means for expressing uncertainty of outcomes of multiple-trait selection.
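The kinship blending at the heart of the model is a one-line convex combination. A small numpy sketch over the study's λ grid, with random stand-in matrices in place of the real pedigree and genomic relationship matrices (the helper name and dimensions are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_kinship(n):
    """Random symmetric positive-definite stand-in with unit diagonal;
    the real A and G come from pedigree records and SNP genotypes."""
    m = rng.standard_normal((n, n))
    k = m @ m.T / n + np.eye(n)
    d = np.sqrt(np.diag(k))
    return k / np.outer(d, d)

n = 6
A = random_kinship(n)   # pedigree-based numerator relationship matrix
G = random_kinship(n)   # marker-based genomic relationship matrix

# Blend over the grid used in the study; each K would feed a GBLUP
# model, with cross-validation picking the best-predicting weight.
grid = (0.0, 0.2, 0.4, 0.6, 0.8, 1.0)
kernels = {lam: lam * G + (1 - lam) * A for lam in grid}

# A convex combination of positive-definite matrices stays positive
# definite, so every K on the grid is a valid covariance kernel.
min_eigs = [np.linalg.eigvalsh(K).min() for K in kernels.values()]
```

The finding that the optimum lies strictly between 0 and 1 corresponds to neither endpoint kernel alone capturing the full covariance structure.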

  11. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components which can represent certain key model system processes (e.g., the groundwater recharge process, the flow reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. 
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
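The first-order variance-based (Sobol) index underlying such analyses can be estimated with the classic pick-freeze Monte Carlo scheme. A sketch on a hypothetical linear surrogate with two uncertainty components (a real study would replace `model` with the simulator or its emulator):

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    # Hypothetical surrogate: two uncertainty components with very
    # different weights
    return 4.0 * x[:, 0] + 1.0 * x[:, 1]

def first_order_sobol(f, d=2, n=100_000):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices
    for independent U(0,1) inputs."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    S = np.empty(d)
    for i in range(d):
        C = B.copy()
        C[:, i] = A[:, i]      # share only input i with sample A
        yC = f(C)
        # Cov(yA, yC) isolates the variance contributed by input i
        S[i] = (np.mean(yA * yC)
                - np.mean(yA) * np.mean(yC)) / yA.var()
    return S

S = first_order_sobol(model)
# Analytically S = (16/17, 1/17), about (0.94, 0.06)
```

Grouping columns before the freeze step yields group indices, which is the flexibility the Bayesian-network framework above exploits for combined uncertainty components.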

  12. A portable neutron spectroscope (NSPECT) for detection, imaging and identification of nuclear material

    NASA Astrophysics Data System (ADS)

    Ryan, James M.; Bancroft, Christopher; Bloser, Peter; Bravar, Ulisse; Fourguette, Dominique; Frost, Colin; Larocque, Liane; McConnell, Mark L.; Legere, Jason; Pavlich, Jane; Ritter, Greg; Wassick, Greg; Wood, Joshua; Woolf, Richard

    2010-08-01

We have developed, fabricated and tested a prototype imaging neutron spectrometer designed for real-time neutron source location and identification. Real-time detection and identification are important for locating nuclear materials. These materials, specifically uranium and transuranics, emit neutrons via spontaneous or induced fission. Unlike other forms of radiation (e.g. gamma rays), penetrating neutron emission is very uncommon. The instrument detects these neutrons, constructs images of the emission pattern, and reports the neutron spectrum. The device will be useful for security and proliferation deterrence, as well as for nuclear waste characterization and monitoring. The instrument is optimized for imaging and spectroscopy in the 1-20 MeV range. The detection principle is based upon multiple elastic neutron-proton scatters in organic scintillator. Two detector panel layers are utilized. By measuring the recoil proton and scattered neutron locations and energies, the direction and energy spectrum of the incident neutrons can be determined and discrete and extended sources identified. Event reconstruction yields an image of the source and its location. The hardware is low power, low mass, and rugged. Its modular design allows the user to combine multiple units for increased sensitivity. We will report the results of laboratory testing of the instrument, including exposure to a calibrated Cf-252 source. Instrument parameters include energy and angular resolution, gamma rejection, minimum source identification distances and times, and projected effective area for a fully populated instrument.
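The event reconstruction described above rests on simple non-relativistic n-p elastic kinematics. A sketch under idealized assumptions (perfect energy measurement, no detector resolution effects):

```python
import numpy as np

def reconstruct_neutron(e_proton_mev, e_scattered_mev):
    """Non-relativistic n-p elastic scatter kinematics.

    The incident energy is the sum of the recoil-proton and scattered-
    neutron energies, and E_n' = E_0 cos^2(theta) gives the neutron
    scattering angle. The source direction then lies on a cone of
    half-angle theta around the measured flight path between the two
    detector panels; overlapping many event cones forms the image."""
    e0 = e_proton_mev + e_scattered_mev
    theta = np.degrees(np.arccos(np.sqrt(e_scattered_mev / e0)))
    return e0, theta

# A 5 MeV neutron depositing half its energy on the recoil proton
# scatters at 45 degrees
e0, theta = reconstruct_neutron(2.5, 2.5)
```

Accumulating (e0, cone) pairs over many events gives both the spectrum and, via cone intersection, the source location.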

  13. Combination of Complex-Based and Magnitude-Based Multiecho Water-Fat Separation for Accurate Quantification of Fat-Fraction

    PubMed Central

    Yu, Huanzhou; Shimakawa, Ann; Hines, Catherine D. G.; McKenzie, Charles A.; Hamilton, Gavin; Sirlin, Claude B.; Brittain, Jean H.; Reeder, Scott B.

    2011-01-01

    Multipoint water–fat separation techniques rely on different water–fat phase shifts generated at multiple echo times to decompose water and fat. Therefore, these methods require complex source images and allow unambiguous separation of water and fat signals. However, complex-based water–fat separation methods are sensitive to phase errors in the source images, which may lead to clinically important errors. An alternative approach to quantify fat is through “magnitude-based” methods that acquire multiecho magnitude images. Magnitude-based methods are insensitive to phase errors, but cannot estimate fat-fraction greater than 50%. In this work, we introduce a water–fat separation approach that combines the strengths of both complex and magnitude reconstruction algorithms. A magnitude-based reconstruction is applied after complex-based water–fat separation to remove the effect of phase errors. The results from the two reconstructions are then combined. We demonstrate that using this hybrid method, 0–100% fat-fraction can be estimated with improved accuracy at low fat-fractions. PMID:21695724
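A minimal sketch of the hybrid idea, under the simplifying assumption that each method yields a single per-voxel fat-fraction estimate: the magnitude fit cannot distinguish a fat-fraction f from 1 − f, so the complex-based estimate is used only to select the correct branch. This illustrates the combination logic, not the authors' actual reconstruction pipeline:

```python
def hybrid_fat_fraction(ff_complex, ff_magnitude):
    """ff_magnitude is phase-robust but ambiguous between f and 1 - f;
    ff_complex spans 0-1 but may carry phase errors. Use the complex
    estimate only to choose the magnitude branch."""
    candidates = (ff_magnitude, 1.0 - ff_magnitude)
    return min(candidates, key=lambda c: abs(c - ff_complex))

# Complex fit says ~0.85, magnitude fit returned the ambiguous value 0.15:
ff = hybrid_fat_fraction(0.85, 0.15)
```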

  14. Combining multiple earthquake models in real time for earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
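One simple instance of such a Bayesian combination, assuming each algorithm reports an independent Gaussian prediction of (log) shaking intensity at a site, is precision-weighted fusion. The actual EEW framework is richer than this, so treat it as a sketch with illustrative numbers:

```python
def combine_predictions(means, sigmas):
    """Precision-weighted (inverse-variance) fusion of independent Gaussian
    predictions into a single posterior mean and standard deviation."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / total
    return mean, (1.0 / total) ** 0.5

# A point-source algorithm and a finite-fault algorithm each predict
# log peak ground acceleration at one site (illustrative numbers):
mu, sigma = combine_predictions([0.3, 0.5], [0.1, 0.2])
```

The combined prediction is pulled toward the more confident algorithm and is tighter than either input, which is the property that lets a downstream alert decision be tuned to a user's false-alarm tolerance.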

  15. Integrating data types to enhance shoreline change assessments

    NASA Astrophysics Data System (ADS)

    Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.

    2016-12-01

    Shorelines represent the variable boundary between terrestrial and marine environments. Assessment of geographic and temporal variability in shoreline position and related variability in shoreline change rates is an important part of studies and applications related to impacts from sea-level rise and storms. The results from these assessments are used to quantify future ecosystem services and coastal resilience and guide selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations because they are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus on either the long-term trajectory using sparse data over multiple decades or shorter-term evolution using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single or multiple data source(s), due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data will both improve historical assessments and enhance the predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data. Biases or simple uncertainties in the shoreline metric from different data types and varying temporal/spatial resolution of the data are examined. As part of this test, we also demonstrate application of data assimilation techniques to predict shoreline position by accurately including the uncertainty in each type of data.
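A minimal illustration of assimilating heterogeneous shoreline observations, assuming each data type provides a position estimate with a known error variance: a sequential scalar Kalman update, in which noisier data types (e.g. older aerial photographs) pull the estimate less than precise ones (e.g. lidar). The numbers are illustrative:

```python
def assimilate(prior_mean, prior_var, observations):
    """Sequential scalar Kalman update: fold in (value, error_variance)
    observations of shoreline position one at a time."""
    x, p = prior_mean, prior_var
    for y, r in observations:
        k = p / (p + r)        # gain: how much this observation is trusted
        x = x + k * (y - x)
        p = (1.0 - k) * p
    return x, p

# A precise lidar shoreline (variance 4 m^2) and a noisier aerial-photo
# shoreline (variance 9 m^2), starting from an uninformative prior:
x, p = assimilate(0.0, 1.0e6, [(100.0, 4.0), (110.0, 9.0)])
```

The posterior lands between the two observations, closer to the lidar value, and its variance is smaller than either observation's alone.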

  16. Steering and positioning targets for HWIL IR testing at cryogenic conditions

    NASA Astrophysics Data System (ADS)

    Perkes, D. W.; Jensen, G. L.; Higham, D. L.; Lowry, H. S.; Simpson, W. R.

    2006-05-01

    In order to increase the fidelity of hardware-in-the-loop ground-truth testing, it is desirable to create a dynamic scene of multiple, independently controlled IR point sources. ATK-Mission Research has developed and supplied the steering mirror systems for the 7V and 10V Space Simulation Test Chambers at the Arnold Engineering Development Center (AEDC), Air Force Materiel Command (AFMC). A portion of the 10V system incorporates multiple target sources beam-combined at the focal point of a 20K cryogenic collimator. Each IR source consists of a precision blackbody with cryogenic aperture and filter wheels mounted on a cryogenic two-axis translation stage. This point source target scene is steered by a high-speed steering mirror to produce further complex motion. The scene changes dynamically in order to simulate an actual operational scene as viewed by the System Under Test (SUT) as it executes various dynamic look-direction changes during its flight to a target. Synchronization and real-time hardware-in-the-loop control is accomplished using reflective memory for each subsystem control and feedback loop. This paper focuses on the steering mirror system and the required tradeoffs of optical performance, precision, repeatability and high-speed motion as well as the complications of encoder feedback calibration and operation at 20K.

  17. Water sources and mixing in riparian wetlands revealed by tracers and geospatial analysis.

    PubMed

    Lessels, Jason S; Tetzlaff, Doerthe; Birkel, Christian; Dick, Jonathan; Soulsby, Chris

    2016-01-01

    Mixing of waters within riparian zones has been identified as an important influence on runoff generation and water quality. Improved understanding of the controls on the spatial and temporal variability of water sources and how they mix in riparian zones is therefore of both fundamental and applied interest. In this study, we have combined topographic indices derived from a high-resolution Digital Elevation Model (DEM) with repeated, spatially high-resolution synoptic sampling of multiple tracers to investigate such dynamics of source water mixing. We use geostatistics to estimate concentrations of three different tracers (deuterium, alkalinity, and dissolved organic carbon) across an extended riparian zone in a headwater catchment in NE Scotland, to identify spatial and temporal influences on mixing of source waters. The various biogeochemical tracers and stable isotopes helped constrain the sources of runoff and their temporal dynamics. Results show that spatial variability in all three tracers was evident in all sampling campaigns, but more pronounced in warmer, drier periods. The extent of mixing areas within the riparian area reflected strong hydroclimatic controls and showed large degrees of expansion and contraction that were not strongly related to topographic indices. The integrated approach of using multiple tracers, geospatial statistics, and topographic analysis allowed us to classify three main riparian source areas and mixing zones. This study underlines the importance of riparian zones for mixing soil water and groundwater and introduces a novel approach by which this mixing can be quantified and its effect on downstream chemistry assessed.
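The study used geostatistical (kriging-type) estimation; as a simpler stand-in for the same spatial-interpolation idea, inverse-distance weighting of scattered tracer samples can be sketched as follows. Sample coordinates and values are illustrative:

```python
def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from scattered
    (xs, ys, value) samples -- a simple stand-in for kriging."""
    num = den = 0.0
    for xs, ys, value in samples:
        d2 = (x - xs) ** 2 + (y - ys) ** 2
        if d2 == 0.0:
            return value       # exact at a sampled location
        w = d2 ** (-power / 2.0)
        num += w * value
        den += w
    return num / den

# Two alkalinity samples (easting, northing, concentration):
samples = [(0.0, 0.0, 10.0), (2.0, 0.0, 30.0)]
est = idw(samples, 1.0, 0.0)   # midway between the two samples
```

Unlike kriging, IDW provides no uncertainty estimate, which is one reason geostatistical methods are preferred for mapping mixing zones.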

  18. Development of Remote Sampling ESI Mass Spectrometry for the Rapid and Automatic Analysis of Multiple Samples

    PubMed Central

    Yamada, Yuki; Ninomiya, Satoshi; Hiraoka, Kenzo; Chen, Lee Chuin

    2016-01-01

    We report on combining a self-aspirated sampling probe and an ESI source using a single metal capillary which is electrically grounded and safe for use by the operator. To generate an electrospray, a negative H.V. is applied to the counter electrode of the ESI emitter to operate in positive ion mode. The sampling/ESI capillary is enclosed within another concentric capillary similar to the arrangement for a standard pneumatically assisted ESI source. The suction of the liquid sample is due to the Venturi effect created by the high-velocity gas flow near the ESI tip. In addition to serving as the mechanism for suction, the high-velocity gas flow also assists in the nebulization of charged droplets, thus producing a stable ion signal. Even though the potential of the ion source counter electrode is more negative than the mass spectrometer in the positive ion mode, the electric field effect is not significant if the ion source and the mass spectrometer are separated by a sufficient distance. Ion transmission is achieved by the viscous flow of the carrier gas. Using the present arrangement, the user can hold the ion source in a bare hand and the ion signal appears almost immediately when the sampling capillary is brought into contact with the liquid sample. The automated analysis of multiple samples can also be achieved by using a motorized sample stage and an automated ion source holder. PMID:28616373

  19. Development of Remote Sampling ESI Mass Spectrometry for the Rapid and Automatic Analysis of Multiple Samples.

    PubMed

    Yamada, Yuki; Ninomiya, Satoshi; Hiraoka, Kenzo; Chen, Lee Chuin

    2016-01-01

    We report on combining a self-aspirated sampling probe and an ESI source using a single metal capillary which is electrically grounded and safe for use by the operator. To generate an electrospray, a negative H.V. is applied to the counter electrode of the ESI emitter to operate in positive ion mode. The sampling/ESI capillary is enclosed within another concentric capillary similar to the arrangement for a standard pneumatically assisted ESI source. The suction of the liquid sample is due to the Venturi effect created by the high-velocity gas flow near the ESI tip. In addition to serving as the mechanism for suction, the high-velocity gas flow also assists in the nebulization of charged droplets, thus producing a stable ion signal. Even though the potential of the ion source counter electrode is more negative than the mass spectrometer in the positive ion mode, the electric field effect is not significant if the ion source and the mass spectrometer are separated by a sufficient distance. Ion transmission is achieved by the viscous flow of the carrier gas. Using the present arrangement, the user can hold the ion source in a bare hand and the ion signal appears almost immediately when the sampling capillary is brought into contact with the liquid sample. The automated analysis of multiple samples can also be achieved by using a motorized sample stage and an automated ion source holder.

  20. Assessing CO2 emissions from Canada's oil sands developments - an inversion approach combined with stable isotope data

    NASA Astrophysics Data System (ADS)

    Kim, M. G.; Lin, J. C.; Huang, L.; Edwards, T. W.; Jones, J. P.; Polavarapu, S.; Nassar, R.

    2012-12-01

    Reducing uncertainties in the projections of atmospheric CO2 concentration levels relies on increasing our scientific understanding of the exchange processes between atmosphere and land at regional scales, which are highly dependent on climate, ecosystem processes, and anthropogenic disturbances. To reduce these uncertainties, a combined framework that jointly addresses these processes is invaluable. In this research, an example of a top-down inversion modeling approach combined with stable isotope measurement data is presented. The potential for the proposed analysis framework is demonstrated using the Stochastic Time-Inverted Lagrangian Transport (STILT) model runs combined with high precision CO2 concentration data measured at a Canadian greenhouse gas monitoring site as well as multiple tracers: stable isotopes and combustion-related species. This framework yields a unique regional scale constraint that can be used to relate the measured changes of tracer concentrations to processes in their upwind source regions. The inversion approach both reproduces source areas in a spatially explicit way through sophisticated Lagrangian transport modeling and infers emission processes that leave imprints on atmospheric tracers. The understanding gained through the combined approach can also be used to verify reported emissions as part of regulatory regimes. The results indicate that changes in CO2 concentration are strongly influenced by regional sources, including significant fossil fuel emissions, and that the combined approach can be used to test reported emissions of the greenhouse gas from oil sands developments. Also, methods to further reduce uncertainties in the retrieved emissions by incorporating additional constraints, including tracer-to-tracer correlations and satellite measurements, are discussed briefly.
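A toy version of the top-down inversion, assuming a linear relationship y = Hx between regional fluxes x and concentration observations y with Gaussian prior and observation errors; H plays the role of the STILT footprints, and all numbers are illustrative:

```python
import numpy as np

# Observations y relate to regional fluxes x through a transport
# "footprint" matrix H (the role STILT back-trajectories play above).
H = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.1, 0.9]])          # 3 observations, 2 source regions
x_b = np.array([1.0, 1.0])          # prior (bottom-up) flux estimate
B = np.diag([0.5, 0.5])             # prior error covariance
R = np.diag([0.01, 0.01, 0.01])     # observation error covariance
y = H @ np.array([2.0, 0.5])        # synthetic, noise-free observations

# Posterior mean: x_b + B H^T (H B H^T + R)^(-1) (y - H x_b)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_post = x_b + K @ (y - H @ x_b)
```

With precise observations the posterior fluxes move from the prior toward the values that generated the data; verifying reported emissions amounts to comparing them against this posterior and its uncertainty.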

  1. Effect of multiple-source entry on price competition after patent expiration in the pharmaceutical industry.

    PubMed Central

    Suh, D C; Manning, W G; Schondelmeyer, S; Hadsall, R S

    2000-01-01

    OBJECTIVE: To analyze the effect of multiple-source drug entry on price competition after patent expiration in the pharmaceutical industry. DATA SOURCES: Originators and their multiple-source drugs selected from the 35 chemical entities whose patents expired from 1984 through 1987. Data were obtained from various primary and secondary sources for the patents' expiration dates, sales volume and units sold, and characteristics of drugs in the sample markets. STUDY DESIGN: The study was designed to determine significant factors using the study model developed under the assumption that the off-patented market is an imperfectly segmented market. PRINCIPAL FINDINGS: After patent expiration, the originators' prices continued to increase, while the price of multiple-source drugs decreased significantly over time. By the fourth year after patent expiration, originators' sales had decreased 12 percent in dollars and 30 percent in quantity. Multiple-source drugs increased their sales twofold in dollars and threefold in quantity, and possessed about one-fourth (in dollars) and half (in quantity) of the total market three years after entry. CONCLUSION: After patent expiration, multiple-source drugs compete largely with other multiple-source drugs in the price-sensitive sector, but indirectly with the originator in the price-insensitive sector. Originators have first-mover advantages, and therefore have a market that is less price sensitive after multiple-source drugs enter. On the other hand, multiple-source drugs target the price-sensitive sector, using their lower-priced drugs. This trend may indicate that the off-patented market is imperfectly segmented between the price-sensitive and insensitive sector. Consumers as a whole can gain from the entry of multiple-source drugs because the average price of the market continually declines after patent expiration. PMID:10857475

  2. Exploring uncertainty in the Earth Sciences - the potential field perspective

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Blakely, R. J.

    2013-12-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
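The depth-wavelength argument in this abstract can be made concrete with the classic point-mass (buried sphere) anomaly formula, g_z(x) = Gmz/(x² + z²)^(3/2): the anomaly's half-width at half-maximum is about 0.766 times the source depth, so a short-wavelength (narrow) anomaly cannot come from a deep compact source. A small sketch:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly(x, depth, mass):
    """Vertical gravity anomaly of a buried point mass at horizontal
    offset x from directly above it: g_z = G m z / (x^2 + z^2)^(3/2)."""
    return G * mass * depth / (x ** 2 + depth ** 2) ** 1.5

def half_width(depth):
    """Offset at which the anomaly falls to half its peak value:
    x_half = depth * sqrt(2^(2/3) - 1), roughly 0.766 * depth."""
    return depth * math.sqrt(2.0 ** (2.0 / 3.0) - 1.0)

# A compact source at 1 km depth produces an anomaly ~766 m wide at
# half-maximum; an independent depth constraint therefore rules out
# broad families of theoretically admissible sources.
w = half_width(1000.0)
```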

  3. Combining Multiple Knowledge Sources for Continuous Speech Recognition

    DTIC Science & Technology

    1989-08-01

    derived by estimating probabilities from a training set, or a linguistically-based model that uses syntactic and semantic information explicitly. The...into a hierarchical set of rules that would cover a much larger percentage of new sentences than the original sentence patterns. We applied this tool...statistical grammars typically used by the use of linguistic knowledge. In particular, we group the different words in the vocabulary into classes, under the

  4. Adult exposure to ocean acidification is maladaptive for larvae of the Sydney rock oyster Saccostrea glomerata in the presence of multiple stressors.

    PubMed

    Parker, Laura M; O'Connor, Wayne A; Byrne, Maria; Coleman, Ross A; Virtue, Patti; Dove, Michael; Gibbs, Mitchell; Spohr, Lorraine; Scanes, Elliot; Ross, Pauline M

    2017-02-01

    Parental effects passed from adults to their offspring have been identified as a source of rapid acclimation that may allow marine populations to persist as our surface oceans continue to decrease in pH. Little is known, however, about whether parental effects are beneficial for offspring in the presence of multiple stressors. We exposed adults of the oyster Saccostrea glomerata to elevated CO2 and examined the impacts of elevated CO2 (control = 392; 856 µatm) combined with elevated temperature (control = 24; 28°C), reduced salinity (control = 35; 25) and reduced food concentration (control = full; half diet) on their larvae. Adult exposure to elevated CO2 had a positive impact on larvae reared at elevated CO2 as a sole stressor, which were 8% larger and developed faster at elevated CO2 compared with larvae from adults exposed to ambient CO2. These larvae, however, had significantly reduced survival in all multistressor treatments. This was particularly evident for larvae reared at elevated CO2 combined with elevated temperature or reduced food concentration, with no larvae surviving in some treatment combinations. Larvae from CO2-exposed adults had a higher standard metabolic rate. Our results provide evidence that parental exposure to ocean acidification may be maladaptive when larvae experience multiple stressors. © 2017 The Author(s).

  5. Adult exposure to ocean acidification is maladaptive for larvae of the Sydney rock oyster Saccostrea glomerata in the presence of multiple stressors

    PubMed Central

    O'Connor, Wayne A.; Byrne, Maria; Virtue, Patti; Dove, Michael; Gibbs, Mitchell; Spohr, Lorraine; Scanes, Elliot; Ross, Pauline M.

    2017-01-01

    Parental effects passed from adults to their offspring have been identified as a source of rapid acclimation that may allow marine populations to persist as our surface oceans continue to decrease in pH. Little is known, however, about whether parental effects are beneficial for offspring in the presence of multiple stressors. We exposed adults of the oyster Saccostrea glomerata to elevated CO2 and examined the impacts of elevated CO2 (control = 392; 856 µatm) combined with elevated temperature (control = 24; 28°C), reduced salinity (control = 35; 25) and reduced food concentration (control = full; half diet) on their larvae. Adult exposure to elevated CO2 had a positive impact on larvae reared at elevated CO2 as a sole stressor, which were 8% larger and developed faster at elevated CO2 compared with larvae from adults exposed to ambient CO2. These larvae, however, had significantly reduced survival in all multistressor treatments. This was particularly evident for larvae reared at elevated CO2 combined with elevated temperature or reduced food concentration, with no larvae surviving in some treatment combinations. Larvae from CO2-exposed adults had a higher standard metabolic rate. Our results provide evidence that parental exposure to ocean acidification may be maladaptive when larvae experience multiple stressors. PMID:28202683

  6. Peculiarities of section topograms for the multiple diffraction of X rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohn, V. G., E-mail: kohnvict@yandex.ru; Smirnova, I. A.

    The distortion of interference fringes on the section topograms of a single crystal due to the multiple diffraction of X rays has been investigated. The cases of the 220 and 400 reflections in a silicon crystal in the form of a plate with a surface oriented normally to the [001] direction are considered both theoretically and experimentally. The same section topogram exhibits five cases of multiple diffraction at small azimuthal angles for the 400 reflection and MoKα radiation, while the topogram for the 220 reflection demonstrates two cases of multiple diffraction. All these cases correspond to different combinations of reciprocal lattice vectors. Exact theoretical calculations of section topograms for the aforementioned cases of multiple diffraction have been performed for the first time. The section topograms exhibit two different distortion regions. The distortions in the central region of the structure are fairly complex and depend strongly on the azimuthal angle. In the tails of the multiple diffraction region, there is a shift of two-beam interference fringes, which can be observed even with a laboratory X-ray source.

  7. [Sources analysis and contribution identification of polycyclic aromatic hydrocarbons in indoor and outdoor air of Hangzhou].

    PubMed

    Liu, Y; Zhu, L; Wang, J; Shen, X; Chen, X

    2001-11-01

    Twelve polycyclic aromatic hydrocarbons (PAHs) were measured in eight homes in Hangzhou during the summer and autumn in 1999. The sources of PAHs and the contributions of the sources to the total concentration of PAHs in the indoor air were identified by the combination of correlation analysis, factor analysis and multiple regression, and the equations relating the concentrations of PAHs in indoor and outdoor air to the factors were obtained. It was indicated that the factors of PAHs in the indoor air were domestic cooking, volatilization of mothballs, cigarette smoke and heating, and waste gas from vehicles. In the smokers' homes, cigarette smoke was the most important factor, contributing 25.8% of the BaP in the indoor air.

  8. A broadband ASE light source-based full-duplex FTTX/ROF transport system.

    PubMed

    Chang, Ching-Hung; Lu, Hai-Han; Su, Heng-Sheng; Shih, Chien-Liang; Chen, Kai-Jen

    2009-11-23

    A full-duplex fiber-to-the-X (FTTX)/radio-over-fiber (ROF) transport system based on a broadband amplified spontaneous emission (ASE) light source is proposed and demonstrated for widespread rural villages. Combining the concepts of long-haul transmission and ring topology, a long-haul single-mode fiber (SMF) trunk is shared among multiple rural villages. Externally modulated baseband (BB) (1.25 Gbps) and radio-frequency (RF) (622 Mbps/10 GHz) signals are successfully transmitted simultaneously. Good bit error rate (BER) performance was achieved, demonstrating the practicality of providing wired/wireless connections for long-haul, widespread rural villages. Since our proposed system uses only a broadband ASE light source to achieve multi-wavelength transmission, it is also simpler and more economical than alternatives.

  9. Resource assessment in Western Australia using a geographic information system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, A.

    1991-03-01

    Three study areas in Western Australia covering from 77,000 to 425,000 mi² were examined for oil and gas potential using a geographic information system (GIS). A data base of source rock thickness, source richness, maturity, and expulsion efficiency was created for each interval. The GIS (Arc/Info) was used to create, manage, and analyze data for each interval in each study area. Source rock thickness and source richness data were added to the data base from digitized data. Maturity information was generated with Arc/Info by combining geochemical and depth-to-structure data. Expulsion efficiency data was created by a system-level Arc/Info program. After the data base for each interval was built, the GIS was used to analyze the geologic data. The analysis consisted of converting each data layer into a lattice (grid) and using the lattice operations in Arc/Info (addition, multiplication, division, and subtraction) to combine the data layers. Additional techniques for combining and selecting data were developed using Arc/Info system-level programs. The procedure for performing the analyses was written as macros in Arc/Info's macro programming language (AML). The results of the analysis were estimates of oil and gas volumes for each interval. The resultant volumes were produced in tabular form for reports and cartographic form for presentation. The geographic information system provided several clear advantages over traditional methods of resource assessment, including simplified management, updating, and editing of geologic data.
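The lattice (grid) algebra described above amounts to element-wise arithmetic on co-registered rasters. A minimal sketch with hypothetical layers and a hypothetical multiplicative scoring scheme, not the study's actual one:

```python
import numpy as np

# Each geologic data layer becomes a co-registered grid; combining layers
# is element-wise arithmetic. All values below are illustrative.
thickness = np.array([[10.0, 40.0], [80.0, 20.0]])  # source rock, m
richness  = np.array([[1.0,  2.0],  [4.0,  1.0]])   # organic richness, wt% TOC
maturity  = np.array([[0.0,  1.0],  [1.0,  1.0]])   # 1 = thermally mature
expulsion = np.array([[0.5,  0.5],  [0.8,  0.5]])   # expelled fraction

# Immature cells (maturity 0) are zeroed out by the multiplication:
score = thickness * richness * maturity * expulsion
```

Summing `score` over the cells of an interval (times cell area) is the kind of operation that yields the per-interval volume estimates mentioned above.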

  10. Chemical complexity and source of the White River Ash, Alaska and Yukon

    USGS Publications Warehouse

    Preece, S.J.; McGimsey, Robert G.; Westgate, J.A.; Pearce, N.J.G.; Hartmann, W.K.; Perkins, W.T.

    2014-01-01

    The White River Ash, a prominent stratigraphic marker bed in Alaska (USA) and Yukon (Canada), consists of multiple compositional units belonging to two geochemical groups. The compositional units are characterized using multiple criteria, with combined glass and ilmenite compositions being the best discriminators. Two compositional units compose the northern group (WRA-Na and WRA-Nb), and two units are present in the eastern group (WRA-Ea and the younger, WRA-Eb). In the proximal area, the ca. 1900 yr B.P. (Lerbekmo et al., 1975) WRA-Na displays reverse zoning in the glass phase and systematic changes in ilmenite composition and estimated oxygen fugacity from the base to the top of the unit. The eruption probably tapped different magma batches or bodies within the magma reservoir with limited mixing or mingling between them. The 1147 cal yr B.P. (calibrated years, approximately equivalent to calendric years) (Clague et al., 1995) WRA-Ea eruption is only weakly zoned, but pumices with different glass compositions are present, along with gray and white intermingled glass in individual pumice clasts, indicating the presence of multiple magmatic bodies or layers. All White River Ash products are high-silica adakites and are sourced from the Mount Churchill magmatic system.

  11. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Sources of uncertainty that conventional weight of evidence approaches too often ignore can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
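A minimal numeric illustration of combining two conditionally independent lines of evidence with probabilistic calculus, using made-up probabilities for a single candidate cause:

```python
# One candidate cause C with prior P(C), and two conditionally independent
# lines of evidence E1, E2; all probabilities are illustrative.
p_c = 0.3
p_e1_given = {True: 0.8, False: 0.2}   # P(E1 | C), P(E1 | not C)
p_e2_given = {True: 0.7, False: 0.4}   # P(E2 | C), P(E2 | not C)

# Bayes' rule with both lines of evidence observed:
numerator = p_c * p_e1_given[True] * p_e2_given[True]
evidence = numerator + (1.0 - p_c) * p_e1_given[False] * p_e2_given[False]
posterior = numerator / evidence       # P(C | E1, E2)
```

Here the two lines of evidence raise the probability that C is the cause from 0.3 to 0.75; a rule-based tally of the same evidence would give no such quantified update.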

  12. Platelet lysate-based pro-angiogenic nanocoatings.

    PubMed

    Oliveira, Sara M; Pirraco, Rogério P; Marques, Alexandra P; Santo, Vítor E; Gomes, Manuela E; Reis, Rui L; Mano, João F

    2016-03-01

    Human platelet lysate (PL) is a cost-effective human source of multiple potent pro-angiogenic factors, such as vascular endothelial growth factor A (VEGF A), fibroblast growth factor b (FGF b) and angiopoietin-1. Previously characterized nanocoatings were prepared by layer-by-layer assembly, incorporating PL with marine-origin polysaccharides, and were shown to activate human umbilical vein endothelial cells (HUVECs). Within 20 h of incubation, the more sulfated coatings induced the HUVECs to form tube-like structures, accompanied by an increased expression of angiogenesis-associated genes, such as angiopoietin-1 and VEGF A. This may be a cost-effective approach to modify 2D/3D constructs to instruct angiogenic cells towards the formation of neo-vascularization, driven by multiple and synergistic stimulations from the PL combined with sulfated polysaccharides. The presence, or fast induction, of a stable and mature vasculature inside 3D constructs is crucial for new tissue formation and its viability. This has been one of the major tissue engineering challenges, limiting the dimensions of efficient tissue constructs. Many approaches based on cells, growth factors, 3D bioprinting and channel incorporation have been proposed. Herein, we explored a versatile technique, layer-by-layer assembly in combination with platelet lysate (PL), which is a cost-effective source of many potent pro-angiogenic proteins and growth factors. Results suggest that the combination of PL with sulfated polyelectrolytes might be used to introduce interfaces onto 2D/3D constructs with potential to induce the formation of cell-based tubular structures. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  13. An Informatics Approach to Evaluating Combined Chemical Exposures from Consumer Products: A Case Study of Asthma-Associated Chemicals and Potential Endocrine Disruptors.

    PubMed

    Gabb, Henry A; Blake, Catherine

    2016-08-01

    Simultaneous or sequential exposure to multiple environmental stressors can affect chemical toxicity. Cumulative risk assessments consider multiple stressors but it is impractical to test every chemical combination to which people are exposed. New methods are needed to prioritize chemical combinations based on their prevalence and possible health impacts. We introduce an informatics approach that uses publicly available data to identify chemicals that co-occur in consumer products, which account for a significant proportion of overall chemical load. Fifty-five asthma-associated and endocrine disrupting chemicals (target chemicals) were selected. A database of 38,975 distinct consumer products and 32,231 distinct ingredient names was created from online sources, and PubChem and the Unified Medical Language System were used to resolve synonymous ingredient names. Synonymous ingredient names are different names for the same chemical (e.g., vitamin E and tocopherol). Nearly one-third of the products (11,688 products, 30%) contained ≥ 1 target chemical and 5,229 products (13%) contained > 1. Of the 55 target chemicals, 31 (56%) appear in ≥ 1 product and 19 (35%) appear under more than one name. The most frequent three-way chemical combination (2-phenoxyethanol, methyl paraben, and ethyl paraben) appears in 1,059 products. Further work is needed to assess combined chemical exposures related to the use of multiple products. The informatics approach increased the number of products considered in a traditional analysis by two orders of magnitude, but missing/incomplete product labels can limit the effectiveness of this approach. Such an approach must resolve synonymy to ensure that chemicals of interest are not missed. Commonly occurring chemical combinations can be used to prioritize cumulative toxicology risk assessments. Gabb HA, Blake C. 2016. 
An informatics approach to evaluating combined chemical exposures from consumer products: a case study of asthma-associated chemicals and potential endocrine disruptors. Environ Health Perspect 124:1155-1165; http://dx.doi.org/10.1289/ehp.1510529.
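
The two core steps of this informatics approach, resolving synonymous ingredient names to a canonical chemical and then counting chemical combinations across product ingredient lists, can be sketched as follows. The synonym map and product lists are toy data, not the study's database; the study resolved synonyms via PubChem and the Unified Medical Language System.

```python
from collections import Counter
from itertools import combinations

# Toy synonym table: different names that map to the same canonical chemical.
SYNONYMS = {"vitamin e": "tocopherol", "tocopherol": "tocopherol",
            "methylparaben": "methyl paraben", "methyl paraben": "methyl paraben"}

def canonical(name):
    return SYNONYMS.get(name.lower(), name.lower())

# Toy product ingredient lists (real labels were scraped from online sources).
products = [
    ["Vitamin E", "methyl paraben"],
    ["tocopherol", "methylparaben", "water"],
    ["water", "fragrance"],
]

pair_counts = Counter()
for ingredients in products:
    # De-duplicate after synonym resolution so one chemical counts once per product.
    chems = sorted({canonical(i) for i in ingredients})
    pair_counts.update(combinations(chems, 2))
```

Without the synonym-resolution step, the co-occurrence of tocopherol and methyl paraben in both products would be missed, which is the failure mode the abstract warns about.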

  14. Parallel excitation-emission multiplexed fluorescence lifetime confocal microscopy for live cell imaging.

    PubMed

    Zhao, Ming; Li, Yu; Peng, Leilei

    2014-05-05

    We present a novel excitation-emission multiplexed fluorescence lifetime microscopy (FLIM) method that surpasses current FLIM techniques in multiplexing capability. The method employs Fourier multiplexing to simultaneously acquire confocal fluorescence lifetime images of multiple excitation wavelength and emission color combinations at 44,000 pixels/sec. The system is built with low-cost CW laser sources and standard PMTs with a versatile spectral configuration, and can be implemented as an add-on to commercial confocal microscopes. The Fourier lifetime confocal method allows fast multiplexed FLIM imaging, which makes it possible to monitor multiple biological processes in live cells. The low cost and compatibility with commercial systems could also make multiplexed FLIM more accessible to the biological research community.

  15. Hybrid-dual-Fourier tomographic algorithm for a fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing the computational burden of 3D image reconstruction, wherein the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual-Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to thereby provide high-speed inverse computations. The inverse algorithm uses a hybrid transform to provide fast Fourier inversion for data of multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of a radiative transfer equation. The accurate analytical form of the solution to the radiative transfer equation provides an efficient formalism for fast computation of the forward model.

  16. Computational overlay metrology with adaptive data analytics

    NASA Astrophysics Data System (ADS)

    Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris

    2017-03-01

    With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how adaptive analytics, data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, can be used to show how data collected before the wafer is exposed can detect small process-dependent wafer-to-wafer changes in overlay.

  17. Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation

    PubMed Central

    De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan

    2017-01-01

    Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular-resolution plant tissue simulators have been developed, yet they typically describe physiological processes in an isolated way, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models within the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study the parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website https://vptissue.bitbucket.io. PMID:28523006

  18. Highly porous micro-roughened structures developed on aluminum surface using the jet of rotating arc discharges at atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Asadollahi, Siavash; Farzaneh, Masoud; Stafford, Luc

    2018-02-01

    Aluminum 6061 samples were exposed to the jet of an atmospheric pressure rotating arc discharge operated in either nitrogen or air. After multiple passes of treatment with an air-based plasma jet at very short source-to-substrate distances, scanning electron microscopy combined with x-ray photoelectron spectroscopy revealed a highly porous micro-roughened alumina-based structure on the surface of aluminum. Based on optical emission spectroscopy and high-speed optical imaging of the jet interacting with aluminum samples, it was found that the process is mainly driven by the energy transfer from the plasma source to the surface through transient plasma-transferred arcs. The occurrence of multiple arc discharges over very short time scales can induce rapid phase transformations of aluminum with characteristics similar to the ones usually observed during laser ablation of materials with femto- to nanosecond laser pulses or during the formation of cathode spots on the surface of metals.

  19. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    USGS Publications Warehouse

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality constitute a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCL), performing trends tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
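
The first analysis mentioned, counting how often each sample location exceeds a Maximum Contaminant Level, reduces to a simple tally once records from the combined database are in a uniform shape. This sketch uses invented records and illustrative MCL values rather than the actual USGS dataset.

```python
# Illustrative MCLs in mg/L (toy values for the sketch, not regulatory advice).
MCL = {"nitrate": 10.0, "arsenic": 0.010}

# Toy combined records: (site, analyte, measured concentration in mg/L).
records = [
    ("well_A", "nitrate", 12.3),
    ("well_A", "nitrate", 8.1),
    ("well_A", "arsenic", 0.012),
    ("well_B", "nitrate", 4.0),
]

# Count exceedances per sample location; analytes without an MCL never exceed.
exceedances = {}
for site, analyte, value in records:
    if value > MCL.get(analyte, float("inf")):
        exceedances[site] = exceedances.get(site, 0) + 1
```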

  20. Household trends in access to improved water sources and sanitation facilities in Vietnam and associated factors: findings from the Multiple Indicator Cluster Surveys, 2000–2011

    PubMed Central

    Tuyet-Hanh, Tran Thi; Lee, Jong-Koo; Oh, Juhwan; Van Minh, Hoang; Ou Lee, Chul; Hoan, Le Thi; Nam, You-Seon; Long, Tran Khanh

    2016-01-01

    Background Despite progress made by the Millennium Development Goal (MDG) number 7.C, Vietnam still faces challenges with regard to the provision of access to safe drinking water and basic sanitation. Objective This paper describes household trends in access to improved water sources and sanitation facilities separately, and analyses factors associated with access to improved water sources and sanitation facilities in combination. Design Secondary data from the Vietnam Multiple Indicator Cluster Survey in 2000, 2006, and 2011 were analyzed. Descriptive statistics and tests of significance describe trends over time in access to water and sanitation by location, demographic and socio-economic factors. Binary logistic regressions (2000, 2006, and 2011) describe associations between access to water and sanitation, and geographic, demographic, and socio-economic factors. Results There have been some outstanding developments in access to improved water sources and sanitation facilities from 2000 to 2011. In 2011, the proportion of households with access to improved water sources and sanitation facilities reached 90% and 77%, respectively, meeting the 2015 MDG targets for safe drinking water and basic sanitation set at 88% and 75%, respectively. However, despite these achievements, in 2011, only 74% of households overall had access to combined improved drinking water and sanitation facilities. There were also stark differences between regions. In 2011, only 47% of households had access to both improved water and sanitation facilities in the Mekong River Delta compared with 94% in the Red River Delta. In 2011, households in urban compared to rural areas were more than twice as likely (odds ratio [OR]: 2.2; 95% confidence interval [CI]: 1.9–2.5) to have access to improved water and sanitation facilities in combination, and households in the highest compared with the lowest wealth quintile were over 40 times more likely (OR: 42.3; 95% CI: 29.8–60.0). 
Conclusions More efforts are required to increase household access to both improved water and sanitation facilities in the Mekong River Delta, South East and Central Highlands regions of Vietnam. There is also a need to address socio-economic factors associated with inadequate access to improved sanitation facilities. PMID:26950563

  1. A Hybrid Algorithm for Period Analysis from Multiband Data with Sparse and Irregular Sampling for Arbitrary Light-curve Shapes

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Vivas, A. Katherina

    2017-12-01

    Ongoing and future surveys with repeat imaging in multiple bands are producing (or will produce) time-spaced measurements of brightness, resulting in the identification of large numbers of variable sources in the sky. A large fraction of these are periodic variables: compilations of these are of scientific interest for a variety of purposes. Unavoidably, the data sets from many such surveys not only have sparse sampling, but also have embedded frequencies in the observing cadence that beat against the natural periodicities of any object under investigation. Such limitations can make period determination ambiguous and uncertain. For multiband data sets with asynchronous measurements in multiple passbands, we wish to maximally use the information on periodicity in a manner that is agnostic of differences in the light-curve shapes across the different channels. Given large volumes of data, computational efficiency is also at a premium. This paper develops and presents a computationally economic method for determining periodicity that combines the results from two different classes of period-determination algorithms. The underlying principles are illustrated through examples. The effectiveness of this approach for combining asynchronously sampled measurements in multiple observables that share an underlying fundamental frequency is also demonstrated.
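
The idea of pooling asynchronous multiband measurements without assuming a common light-curve shape can be illustrated with a simple shape-agnostic statistic. This sketch is not the paper's algorithm: it applies phase-dispersion minimization to each band separately and sums the scores, so two synthetic bands with different amplitudes and phases vote jointly on a shared period.

```python
import numpy as np

rng = np.random.default_rng(0)
true_period = 2.5

def band(n, amp, phase):
    """One synthetic band: sparse, irregular sampling of a noisy sinusoid."""
    t = np.sort(rng.uniform(0, 50, n))
    y = amp * np.sin(2 * np.pi * t / true_period + phase)
    return t, y + rng.normal(0, 0.05, n)

# Two asynchronous bands sharing a period but not a shape (amplitude/phase).
bands = [band(80, 1.0, 0.0), band(60, 0.5, 1.3)]

def dispersion(t, y, period, nbins=10):
    """Sum of within-phase-bin variances; small when the fold is coherent."""
    phase = (t / period) % 1.0
    idx = np.minimum((phase * nbins).astype(int), nbins - 1)
    return sum(y[idx == b].var() for b in range(nbins) if np.any(idx == b))

trial = np.linspace(1.0, 5.0, 2001)
score = [sum(dispersion(t, y, p) for t, y in bands) for p in trial]
best_period = trial[int(np.argmin(score))]
```

Because each band contributes only its own within-bin variance, no relationship between the bands' light-curve shapes is ever assumed, which is the property the abstract emphasizes.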

  2. Combining non selective gas sensors on a mobile robot for identification and mapping of multiple chemical compounds.

    PubMed

    Bennetts, Victor Hernandez; Schaffernicht, Erik; Pomareda, Victor; Lilienthal, Achim J; Marco, Santiago; Trincavelli, Marco

    2014-09-17

    In this paper, we address the task of gas distribution modeling in scenarios where multiple heterogeneous compounds are present. Gas distribution modeling is particularly useful in emission monitoring applications where spatial representations of the gaseous patches can be used to identify emission hot spots. In realistic environments, the presence of multiple chemicals is expected, and therefore gas discrimination has to be incorporated in the modeling process. The approach presented in this work addresses the task of gas distribution modeling by combining different non-selective gas sensors. Gas discrimination is addressed with an open sampling system, composed of an array of metal oxide sensors and a probabilistic algorithm tailored to uncontrolled environments. For each of the identified compounds, the mapping algorithm generates a calibrated gas distribution model using the classification uncertainty and the concentration readings acquired with a photoionization detector. The meta parameters of the proposed modeling algorithm are automatically learned from the data. The approach was validated with a gas sensitive robot patrolling outdoor and indoor scenarios, where two different chemicals were released simultaneously. The experimental results show that the generated multi compound maps can be used to accurately predict the location of emitting gas sources.

  3. Multiple Household Water Sources and Their Use in Remote Communities With Evidence From Pacific Island Countries

    NASA Astrophysics Data System (ADS)

    Elliott, Mark; MacDonald, Morgan C.; Chan, Terence; Kearton, Annika; Shields, Katherine F.; Bartram, Jamie K.; Hadwen, Wade L.

    2017-11-01

    Global water research and monitoring typically focus on the household's "main source of drinking-water." Use of multiple water sources to meet daily household needs has been noted in many developing countries but rarely quantified or reported in detail. We gathered self-reported data using a cross-sectional survey of 405 households in eight communities of the Republic of the Marshall Islands (RMI) and five Solomon Islands (SI) communities. Over 90% of households used multiple sources, with differences in sources and uses between wet and dry seasons. Most RMI households had large rainwater tanks and rationed stored rainwater for drinking throughout the dry season, whereas most SI households collected rainwater in small pots, precluding storage across seasons. Use of a source for cooking was strongly positively correlated with use for drinking, whereas use for cooking was negatively correlated or uncorrelated with nonconsumptive uses (e.g., bathing). Dry season water uses implied greater risk of water-borne disease, with fewer (frequently zero) handwashing sources reported and more unimproved sources consumed. Use of multiple sources is fundamental to household water management and feasible to monitor using electronic survey tools. We contend that recognizing multiple water sources can greatly improve understanding of household-level and community-level climate change resilience, that use of multiple sources confounds health impact studies of water interventions, and that incorporating multiple sources into water supply interventions can yield heretofore-unrealized benefits. We propose that failure to consider multiple sources undermines the design and effectiveness of global water monitoring, data interpretation, implementation, policy, and research.

  4. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

    This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of GW Sensitivity Space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports use of full current LISA Testbeds, Synthetic data systems, and Simulators already in existence through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.

  5. Multiple diagnosis in posttraumatic stress disorder. The role of war stressors.

    PubMed

    Green, B L; Lindy, J D; Grace, M C; Gleser, G C

    1989-06-01

    Prior studies have shown that posttraumatic stress disorder (PTSD) in Vietnam veterans is associated with various aspects of war stressors and that other diagnoses often co-occur with PTSD in this population. The present report examines the prediction of other diagnoses, in combination with PTSD, from a variety of war stressor experiences in a broad sample of veterans recruited from clinical and nonclinical sources. The results show that PTSD with panic disorder is better explained by war stressors than other diagnostic combinations and that high-risk assignments and exposure to grotesque deaths were more salient than other stressor experiences in accounting for different diagnostic combinations. Implications of the findings for PTSD's placement in the DSM-III-R and for psychological and pharmacological treatments were discussed.

  6. A physics based method for combining multiple anatomy models with application to medical simulation.

    PubMed

    Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David

    2009-01-01

    We present a physics based approach to the construction of anatomy models by combining components from different sources; different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. Combination can be either replacing/modifying an existing component, or inserting a new component. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation, by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
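
The mass-spring idea at the core of this approach, components relax under spring forces until spatial relationships are physically valid, can be shown in its simplest form. This is a minimal 1D sketch with invented parameters, far from the paper's full anatomical model: two point masses joined by one spring relax toward the rest length under damped explicit integration.

```python
# Minimal 1D mass-spring relaxation sketch (toy parameters, unit masses).
def relax(p1, p2, rest=1.0, k=5.0, dt=0.01, damping=0.9, steps=2000):
    v1 = v2 = 0.0
    for _ in range(steps):
        d = p2 - p1
        # Spring force on p1: pulls toward p2 when stretched, pushes when compressed.
        f = k * (abs(d) - rest) * (1 if d > 0 else -1)
        v1 = (v1 + f * dt) * damping    # damping keeps the explicit scheme stable
        v2 = (v2 - f * dt) * damping
        p1 += v1 * dt
        p2 += v2 * dt
    return p1, p2

a, b = relax(0.0, 2.0)   # start stretched to length 2; relaxes to rest length 1
```

A full model repeats this force accumulation over many masses and springs (and adds the external Gradient Vector Flow and Distance Transform forces the abstract describes) but the per-step update is the same.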

  7. Multiple site receptor modeling with a minimal spanning tree combined with a Kohonen neural network

    NASA Astrophysics Data System (ADS)

    Hopke, Philip K.

    1999-12-01

    A combination of two pattern recognition methods has been developed that allows the generation of geographical emission maps from multivariate environmental data. In such a projection into a visually interpretable subspace by a Kohonen Self-Organizing Feature Map, the topology of the higher-dimensional variable space can be preserved, but part of the information about the correct neighborhood among the sample vectors will be lost. This can partly be compensated for by an additional projection of Prim's Minimal Spanning Tree into the trained neural network. This new environmental receptor modeling technique has been adapted for multiple sampling sites. The behavior of the method has been studied using simulated data. Subsequently, the method has been applied to mapping data sets from the Southern California Air Quality Study. The projection of 17 chemical variables measured at up to 8 sampling sites provided a 2D, visually interpretable, geometrically reasonable arrangement of air pollution sources in the South Coast Air Basin.
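
The second ingredient of this technique, Prim's minimal spanning tree, is straightforward to sketch on its own. The graph and edge weights below are toy values; in the method the nodes are sample vectors and the tree is projected onto the trained Kohonen map to restore neighborhood information.

```python
# Prim's minimal spanning tree on a small undirected weighted graph.
def prim_mst(weights):
    """weights: dict {(i, j): w} for an undirected graph on integer nodes."""
    nodes = {n for edge in weights for n in edge}
    adj = {}
    for (i, j), w in weights.items():
        adj.setdefault(i, []).append((j, w))
        adj.setdefault(j, []).append((i, w))
    visited, edges = {0}, []
    while len(visited) < len(nodes):
        # Cheapest edge leaving the visited set (Prim's greedy step).
        _, i, j = min((w, i, j) for i in visited for j, w in adj[i]
                      if j not in visited)
        visited.add(j)
        edges.append((i, j))
    return edges

tree = prim_mst({(0, 1): 1.0, (1, 2): 2.0, (0, 2): 3.0, (2, 3): 1.5})
```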

  8. System and method for determination of the reflection wavelength of multiple low-reflectivity bragg gratings in a sensing optical fiber

    NASA Technical Reports Server (NTRS)

    Moore, Jason P. (Inventor)

    2009-01-01

    A system and method for determining a reflection wavelength of multiple Bragg gratings in a sensing optical fiber comprise: (1) a source laser; (2) an optical detector configured to detect a reflected signal from the sensing optical fiber; (3) a plurality of frequency generators configured to generate a signal having a frequency corresponding to an interferometer frequency of a different one of the plurality of Bragg gratings; (4) a plurality of demodulation elements, each demodulation element configured to combine the signal produced by a different one of the plurality of frequency generators with the detected signal from the sensing optical fiber; (5) a plurality of peak detectors, each peak detector configured to detect a peak of the combined signal from a different one of the demodulation elements; and (6) a laser wavenumber detection element configured to determine a wavenumber of the laser when any of the peak detectors detects a peak.
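
The demodulation idea in this patent, mixing the detected signal with a reference at each grating's interferometer frequency so that each grating's contribution can be isolated, is essentially a software lock-in. The sketch below uses toy frequencies and amplitudes, not the patent's optical parameters, and recovers two components from a summed signal.

```python
import math

# Toy detected signal: two "gratings" contributing at 100 Hz and 250 Hz.
N, fs = 4000, 4000.0
t = [n / fs for n in range(N)]
signal = [0.8 * math.cos(2 * math.pi * 100 * x)
          + 0.3 * math.cos(2 * math.pi * 250 * x) for x in t]

def lock_in_amplitude(sig, f):
    """Mix with in-phase and quadrature references at f, then average."""
    i = sum(s * math.cos(2 * math.pi * f * x) for s, x in zip(sig, t)) / len(sig)
    q = sum(s * math.sin(2 * math.pi * f * x) for s, x in zip(sig, t)) / len(sig)
    return 2.0 * math.hypot(i, q)

a100 = lock_in_amplitude(signal, 100.0)   # recovers about 0.8
a250 = lock_in_amplitude(signal, 250.0)   # recovers about 0.3
```

The hardware analogue is the claimed plurality of frequency generators and demodulation elements, one per grating, feeding peak detectors.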

  9. Regression Models for the Analysis of Longitudinal Gaussian Data from Multiple Sources

    PubMed Central

    O’Brien, Liam M.; Fitzmaurice, Garrett M.

    2006-01-01

    We present a regression model for the joint analysis of longitudinal multiple source Gaussian data. Longitudinal multiple source data arise when repeated measurements are taken from two or more sources, and each source provides a measure of the same underlying variable and on the same scale. This type of data generally produces a relatively large number of observations per subject; thus estimation of an unstructured covariance matrix often may not be possible. We consider two methods by which parsimonious models for the covariance can be obtained for longitudinal multiple source data. The methods are illustrated with an example of multiple informant data arising from a longitudinal interventional trial in psychiatry. PMID:15726666

  10. Bayesian module identification from multiple noisy networks.

    PubMed

    Zamani Dadaneh, Siamak; Qian, Xiaoning

    2016-12-01

    Module identification has been studied extensively in order to gain deeper understanding of complex systems, such as social networks as well as biological networks. Modules are often defined as groups of vertices in these networks that are topologically cohesive with similar interaction patterns with the rest of the vertices. Most of the existing module identification algorithms assume that the given networks are faithfully measured without errors. However, in many real-world applications, for example, when analyzing protein-protein interaction networks from high-throughput profiling techniques, there is significant noise with both false positive and missing links between vertices. In this paper, we propose a new model for more robust module identification by taking advantage of multiple observed networks with significant noise so that signals in multiple networks can be strengthened and help improve the solution quality by combining information from various sources. We adopt a hierarchical Bayesian model to integrate multiple noisy snapshots that capture the underlying modular structure of the networks under study. By introducing a latent root assignment matrix and its relations to instantaneous module assignments in all the observed networks to capture the underlying modular structure and combine information across multiple networks, an efficient variational Bayes algorithm can be derived to accurately and robustly identify the underlying modules from multiple noisy networks. Experiments on synthetic and protein-protein interaction data sets show that our proposed model enhances both the accuracy and resolution in detecting cohesive modules, and it is less vulnerable to noise in the observed data. In addition, it shows higher power in predicting missing edges compared to individual-network methods.

  11. Source Identification of Human Biological Materials and Its Prospect in Forensic Science.

    PubMed

    Zou, K N; Gui, C; Gao, Y; Yang, F; Zhou, H G

    2016-06-01

    Source identification of human biological materials in crime scene plays an important role in reconstructing the crime process. Searching specific genetic markers to identify the source of different human biological materials is the emphasis and difficulty of the research work of legal medical experts in recent years. This paper reviews the genetic markers which are used for identifying the source of human biological materials and studied widely, such as DNA methylation, mRNA, microRNA, microflora and protein, etc. By comparing the principles and methods of source identification of human biological materials using different kinds of genetic markers, different source of human biological material owns suitable marker types and can be identified by detecting single genetic marker or combined multiple genetic markers. Though there is no uniform standard and method for identifying the source of human biological materials in forensic laboratories at present, the research and development of a series of mature and reliable methods for distinguishing different human biological materials play the role as forensic evidence which will be the future development direction. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  12. Blind source separation and localization using microphone arrays

    NASA Astrophysics Data System (ADS)

    Sun, Longji

    The blind source separation and localization problem for audio signals is studied using microphone arrays. Pure delay mixtures of source signals typically encountered in outdoor environments are considered. Our proposed approach utilizes the subspace methods, including multiple signal classification (MUSIC) and estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithms, to estimate the directions of arrival (DOAs) of the sources from the collected mixtures. Since audio signals are generally considered broadband, the DOA estimates at frequencies with the large sum of squared amplitude values are combined to obtain the final DOA estimates. Using the estimated DOAs, the corresponding mixing and demixing matrices are computed, and the source signals are recovered using the inverse short time Fourier transform. Subspace methods take advantage of the spatial covariance matrix of the collected mixtures to achieve robustness to noise. While the subspace methods have been studied for localizing radio frequency signals, audio signals have their special properties. For instance, they are nonstationary, naturally broadband and analog. All of these make the separation and localization for the audio signals more challenging. Moreover, our algorithm is essentially equivalent to the beamforming technique, which suppresses the signals in unwanted directions and only recovers the signals in the estimated DOAs. Several crucial issues related to our algorithm and their solutions have been discussed, including source number estimation, spatial aliasing, artifact filtering, different ways of mixture generation, and source coordinate estimation using multiple arrays. Additionally, comprehensive simulations and experiments have been conducted to examine various aspects of the algorithm. 
Unlike the existing blind source separation and localization methods, which are generally time consuming, our algorithm needs signal mixtures of only a short duration and therefore supports real-time implementation.
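
The MUSIC step the thesis builds on can be sketched for a single narrowband frequency bin; the broadband audio case repeats this per bin and combines the estimates, as described above. Everything here is a toy setup: a six-element uniform linear array at half-wavelength spacing, two synthetic uncorrelated sources, and a dense angle scan.

```python
import numpy as np

M, d = 6, 0.5                       # sensors, spacing in wavelengths
angles_true = [-20.0, 30.0]         # toy source DOAs in degrees

def steering(theta_deg):
    k = 2 * np.pi * d * np.sin(np.radians(theta_deg))
    return np.exp(1j * k * np.arange(M))

rng = np.random.default_rng(1)
snapshots = 500
S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
A = np.column_stack([steering(a) for a in angles_true])
noise = rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots))
X = A @ S + 0.01 * noise            # array snapshots: mixtures plus noise

R = X @ X.conj().T / snapshots      # spatial covariance matrix
eigvals, eigvecs = np.linalg.eigh(R)          # eigenvalues ascending
En = eigvecs[:, :M - 2]                       # noise subspace (smallest M-K)
grid = np.arange(-90.0, 90.0, 0.1)
spec = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(g)) ** 2
                 for g in grid])              # MUSIC pseudo-spectrum

# Pick the two strongest local maxima as the DOA estimates.
local_max = (spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:])
cand, vals = grid[1:-1][local_max], spec[1:-1][local_max]
doas = np.sort(cand[np.argsort(vals)[-2:]])
```

With the DOAs in hand, the mixing matrix is rebuilt from the steering vectors and inverted to demix, which is the beamforming equivalence the text notes.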

  13. Landsat-8 Operational Land Imager On-Orbit Radiometric Calibration

    NASA Technical Reports Server (NTRS)

    Markham, Brian L.; Barsi, Julia A.

    2017-01-01

    The Operational Land Imager (OLI), the VIS/NIR/SWIR sensor on Landsat-8, has been successfully acquiring Earth imagery for more than four years. The OLI incorporates two on-board radiometric calibration systems, one diffuser based and one lamp based, each with multiple sources. For each system, one source is treated as primary and used frequently, and the other source(s) are used less frequently to assist in tracking any degradation in the primary sources. In addition, via a spacecraft maneuver, the OLI instrument views the moon once a lunar cycle (approx. 29 days). The integrated lunar irradiances from these acquisitions are compared to the output of a lunar irradiance model. The results from all these techniques, combined with cross calibrations with other sensors and ground-based vicarious measurements, are used to monitor the OLI's stability and correct for any changes observed. To date, the various techniques have only detected significant changes in the shortest-wavelength OLI band, centered at 443 nm, and these are currently being adjusted in the operational processing.

  14. Hybrid Modeling Approach to Estimate Exposures of Hazardous Air Pollutants (HAPs) for the National Air Toxics Assessment (NATA).

    PubMed

    Scheffe, Richard D; Strum, Madeleine; Phillips, Sharon B; Thurman, James; Eyth, Alison; Fudge, Steve; Morris, Mark; Palma, Ted; Cook, Richard

    2016-11-15

    A hybrid air quality model has been developed and applied to estimate annual concentrations of 40 hazardous air pollutants (HAPs) across the continental United States (CONUS) to support the 2011 calendar year National Air Toxics Assessment (NATA). By combining a chemical transport model (CTM) with a Gaussian dispersion model, both reactive and nonreactive HAPs are accommodated across local to regional spatial scales, through a multiplicative technique designed to improve mass conservation relative to previous additive methods. The broad scope of multiple pollutants capturing regional to local spatial scale patterns across a vast spatial domain is precedent setting within the air toxics community. The hybrid design exhibits improved performance relative to the stand alone CTM and dispersion model. However, model performance varies widely across pollutant categories and quantifiably definitive performance assessments are hampered by a limited observation base and challenged by the multiple physical and chemical attributes of HAPs. Formaldehyde and acetaldehyde are the dominant HAP concentration and cancer risk drivers, characterized by strong regional signals associated with naturally emitted carbonyl precursors enhanced in urban transport corridors with strong mobile source sector emissions. The multiple pollutant emission characteristics of combustion dominated source sectors creates largely similar concentration patterns across the majority of HAPs. However, reactive carbonyls exhibit significantly less spatial variability relative to nonreactive HAPs across the CONUS.
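
The multiplicative technique mentioned for improving mass conservation can be illustrated with one grid cell: scale the regional CTM value by the within-cell spatial pattern from the dispersion model, normalized so the cell mean is unchanged. The numbers are toy values, and this is only a one-cell caricature of the hybrid design, not the NATA implementation.

```python
# Toy multiplicative hybrid for a single CTM grid cell.
ctm_cell = 4.0                       # regional CTM concentration for the cell
disp = [1.0, 3.0, 2.0, 2.0]          # dispersion-model values at receptors
mean_disp = sum(disp) / len(disp)    # cell-average of the dispersion field

# Each receptor keeps the local pattern; the cell mean stays at the CTM value,
# which is the mass-conservation property an additive blend can violate.
hybrid = [ctm_cell * v / mean_disp for v in disp]
```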

  15. Geographical Heterogeneity of Multiple Sclerosis Prevalence in France.

    PubMed

    Pivot, Diane; Debouverie, Marc; Grzebyk, Michel; Brassat, David; Clanet, Michel; Clavelou, Pierre; Confavreux, Christian; Edan, Gilles; Leray, Emmanuelle; Moreau, Thibault; Vukusic, Sandra; Hédelin, Guy; Guillemin, Francis

    2016-01-01

    Geographical variation in the prevalence of multiple sclerosis (MS) is controversial. Acknowledging heterogeneity is important for adapting the provision of care within the healthcare system. We aimed to investigate differences in the prevalence of MS across departments of the French territory. We estimated MS prevalence on October 31, 2004 in 21 administrative departments in France (22% of the metropolitan departments) by using multiple data sources: the main French health insurance systems, neurologist networks devoted to MS, and the Technical Information Agency of Hospitalization. We used a spatial Bayesian approach, based on estimating the number of MS cases from 2005 and 2008 capture-recapture studies, to analyze differences in prevalence. The age- and sex-standardized prevalence of MS per 100,000 inhabitants ranged from 68.1 (95% credible interval 54.6, 84.4) in Hautes-Pyrénées (southwest France) to 296.5 (258.8, 338.9) in Moselle (northeast France). The greatest prevalence was in the northeast departments, and the other departments showed great variability. By combining multiple data sources into a spatial Bayesian model, we found heterogeneity in MS prevalence among the 21 departments of France, some with higher prevalence than anticipated from previous publications. No clear explanation related to health insurance coverage or hospital facilities can be advanced. Population migration, socioeconomic status of the population studied and environmental effects are suspected.
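
    The capture-recapture idea mentioned above, reduced to its simplest two-source form (Chapman's nearly unbiased estimator). All counts and the department population below are hypothetical; the study combined more sources within a spatial Bayesian model.

```python
# Two-source capture-recapture via Chapman's estimator.
n1 = 120        # cases found by source 1 (e.g. health insurance records)
n2 = 150        # cases found by source 2 (e.g. a neurologist network)
m = 90          # cases found by both sources

# Estimated total number of cases, including those missed by both sources.
N_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Crude prevalence per 100,000 inhabitants for a hypothetical department.
population = 150_000
prevalence = 100_000 * N_hat / population
```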

  16. Combining Evidence of Preferential Gene-Tissue Relationships from Multiple Sources

    PubMed Central

    Guo, Jing; Hammar, Mårten; Öberg, Lisa; Padmanabhuni, Shanmukha S.; Bjäreland, Marcus; Dalevi, Daniel

    2013-01-01

    An important challenge in drug discovery and disease prognosis is to predict genes that are preferentially expressed in one or a few tissues, i.e. that show a considerably higher expression in those tissues compared to the others. Although several data sources and methods have been published explicitly for this purpose, they often disagree, and it is not evident how to retrieve these genes or how to distinguish true biological findings from those that are due to choice of method and/or experimental settings. In this work we have developed a computational approach that combines results from multiple methods and datasets with the aim of eliminating method/study-specific biases and improving the predictability of preferentially expressed human genes. A rule-based score is used to merge and assign support to the results. Five sets of genes with known tissue specificity were used for parameter pruning and cross-validation. In total we identify 3434 tissue-specific genes. We compare the genes of highest scores with the public databases PaGenBase (microarray), TiGER (EST) and HPA (protein expression data). The results have 85% overlap with PaGenBase, 71% with TiGER and only 28% with HPA. 99% of our predictions have support from at least one of these databases. Our approach also performs better than any of the databases on identifying drug targets and biomarkers with known tissue-specificity. PMID:23950964
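
    A toy version of merging tissue-specificity calls from several methods and counting cross-method support. The gene/tissue names and the "at least two methods" rule are illustrative only; the paper's rule-based score is more elaborate.

```python
# Each method contributes a set of (gene, tissue) specificity calls.
calls = {
    "microarray": {("GENE_A", "liver"), ("GENE_B", "brain")},
    "EST":        {("GENE_A", "liver"), ("GENE_C", "kidney")},
    "RNA-seq":    {("GENE_A", "liver"), ("GENE_B", "brain")},
}

support = {}   # (gene, tissue) -> number of supporting methods
for method, pairs in calls.items():
    for pair in pairs:
        support[pair] = support.get(pair, 0) + 1

# Keep only gene-tissue pairs supported by at least two independent methods.
consensus = {pair for pair, n in support.items() if n >= 2}
```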

  17. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  18. Effects of extrusion temperature and dwell time on aflatoxin levels in cottonseed.

    PubMed

    Buser, Michael D; Abbas, Hamed K

    2002-04-24

    Cottonseed is an economical source of protein and is commonly used in balancing livestock rations; however, its use is typically limited by protein, fat, gossypol, and aflatoxin contents. Whole cottonseed was extruded to determine if the temperature and dwell time (multiple stages of processing) associated with the process affected aflatoxin levels. The extrusion temperature study showed that aflatoxin levels were reduced by an additional 33% when the cottonseed was extruded at 160 degrees C as compared to 104 degrees C. Furthermore, the multiple-pass extrusion study indicated that aflatoxin levels were reduced by an additional 55% when the cottonseed was extruded four times as compared to one time. To estimate the aflatoxin reductions due to extrusion temperature and dwell time, the least mean fits obtained for the individual studies were combined. Total estimated reductions of 55% (three stages of processing at 104 degrees C), 50% (two stages of processing at 132 degrees C), and 47% (one stage of processing at 160 degrees C) were obtained from the combined equations. If the extreme conditions (four stages of processing at 160 degrees C) of the evaluation studies are applied to the combined temperature and processing equation, the resulting aflatoxin reduction would be 76%.

  19. The application of chemical and isotopic tracers to characterize aerosol sources and processing in marine air

    NASA Astrophysics Data System (ADS)

    Turekian, Vaughan Charles

    2000-12-01

    Aerosol production, transport, chemical and physical evolution and deposition impact the environment by influencing radiation budgets, altering the composition of the atmosphere, and delivering nutrients to marine and terrestrial ecosystems. The objective of this research was to combine high-resolution chemical measurements with stable isotopic analysis in order to characterize the sources and processing of carbon-, nitrogen- and sulfur-bearing compounds associated with size-segregated aerosols on Bermuda during spring. Chemical tracers combined with forward and backward trajectories demonstrated the transport of biomass burning products from North America to Bermuda. The size distributions of NH4+ from 1998 differed from those during spring 1997, a year without the large-scale burning. These results suggest that transport of biomass burning products altered the pH of the aerosols. Marine and continentally derived carbon was associated with all aerosol size fractions. Supermicron-radius sea-salt aerosol was enriched in marine-derived carbon by two orders of magnitude compared to bulk surface seawater. Enrichments of oxalate relative to methanesulfonic acid (MSA) in supermicron-radius aerosol suggested in situ formation of oxalate within the sea-salt solution, or direct injection from the organic-rich surface microlayer. Compound-specific isotope analysis of oxalic acid indicated a marine source for all aerosol size fractions, and formation in the gas phase for the submicron-radius aerosol. Stable sulfur isotopes indicated that the biogenic non-sea-salt (nss) SO42-/MSA ratio varied with aerosol size, implying that MSA may not be a conservative tracer of biogenic nss SO42- in bulk aerosol sampling. The biogenic nss SO42-/MSA ratio calculated from stable isotopes and sized aerosol sampling was 3 times lower than previous estimates for Bermuda. Stable nitrogen isotope values for submicron and supermicron aerosol were significantly different, consistent with their different chemical compositions. Results suggested that HNO3 incorporation into supermicron aerosol was essentially unidirectional, whereas submicron aerosol was both a source and a sink for NH3(g). Variable aerosol liquid water content over the relatively longer atmospheric lifetimes of submicron aerosol may lead to multiple NH3 phase changes. This study was the first to combine sized aerosol sampling, high-resolution chemical analysis and multiple stable isotopes to characterize both the sources and the processing of aerosols in marine air. The results of this study, therefore, provide crucial information for source apportionment of environmentally important atmospheric species in continentally impacted marine air.

  20. Age differences in coping and locus of control: a study of managerial stress in Hong Kong.

    PubMed

    Siu, O; Cooper, C L; Spector, P E; Donald, I

    2001-12-01

    The present study involved data collection from 3 samples of Hong Kong managers to examine mechanisms by which age relates to work well-being. A total of 634 managers were drawn by random sampling and purposive sampling methods. The results showed that age was positively related to well-being (job satisfaction and mental well-being). Furthermore, older managers reported fewer sources of stress, better coping, and a more internal locus of control. Multiple regression analyses suggested that the relations of age with the 2 well-being indicators can be attributed to various combinations of coping, work locus of control, sources of stress, managerial level, and organizational tenure.
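
    The regression step described above can be sketched as follows on synthetic data; the effect sizes and variable construction are invented solely to show the mechanics of a multiple regression with age and candidate mediators.

```python
import numpy as np

# Synthetic sample: well-being depends on coping and locus of control,
# which themselves vary with age (all coefficients are made up).
rng = np.random.default_rng(0)
n = 200
age = rng.uniform(25, 60, n)
coping = 0.05 * age + rng.normal(0.0, 1.0, n)    # older managers cope better
locus = -0.03 * age + rng.normal(0.0, 1.0, n)    # and score more "internal" (lower)
wellbeing = 2.0 + 0.5 * coping - 0.4 * locus + rng.normal(0.0, 1.0, n)

# Ordinary least squares via the design matrix [1, age, coping, locus].
X = np.column_stack([np.ones(n), age, coping, locus])
beta, *_ = np.linalg.lstsq(X, wellbeing, rcond=None)
```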

  1. Synchrotron X-ray micro-tomography at the Advanced Light Source: Developments in high-temperature in-situ mechanical testing

    NASA Astrophysics Data System (ADS)

    Barnard, Harold S.; MacDowell, A. A.; Parkinson, D. Y.; Mandal, P.; Czabaj, M.; Gao, Y.; Maillet, E.; Blank, B.; Larson, N. M.; Ritchie, R. O.; Gludovatz, B.; Acevedo, C.; Liu, D.

    2017-06-01

    At the Advanced Light Source (ALS), Beamline 8.3.2 performs hard X-ray micro-tomography under conditions of high temperature, pressure, mechanical loading, and other realistic conditions using environmental test cells. With scan times of 10s-100s of seconds, the microstructural evolution of materials can be directly observed over multiple time steps spanning prescribed changes in the sample environment. This capability enables in-situ quasi-static mechanical testing of materials. We present an overview of our in-situ mechanical testing capabilities and recent hardware developments that enable flexural testing at high temperature and in combination with acoustic emission analysis.

  2. Algae biofuels: versatility for the future of bioenergy.

    PubMed

    Jones, Carla S; Mayfield, Stephen P

    2012-06-01

    The world continues to increase its energy use, brought about by an expanding population and a desire for a greater standard of living. This energy use, coupled with the realization of the impact of carbon dioxide on the climate, has led us to reanalyze the potential of plant-based biofuels. Of the potential sources of biofuels, the most efficient producers of biomass are the photosynthetic microalgae and cyanobacteria. These versatile organisms can be used for the production of bioethanol, biodiesel, biohydrogen, and biogas. In fact, one of the most economical methods for algal biofuel production may be the combined biorefinery approach, where multiple biofuels are produced from one biomass source. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D.J.; Warner, J.A.; LeBarron, N.

    Processes that use energetic ions for large substrates require that the time-averaged erosion effects from the ion flux be uniform across the surface. A numerical model has been developed to determine this flux and its effects on surface etching of a silica/photoresist combination. The geometry of the source and substrate is very similar to a typical deposition geometry with single or planetary substrate rotation. The model was used to tune an inert ion-etching process that used single or multiple Kaufman sources to within 3% uniformity over a 30-cm aperture after etching 8 {micro}m of material. The same model can be used to predict uniformity for ion-assisted deposition (IAD).

  4. Coherent Terahertz Radiation from Multiple Electron Beams Excitation within a Plasmonic Crystal-like structure.

    PubMed

    Zhang, Yaxin; Zhou, Yucong; Gang, Yin; Jiang, Guili; Yang, Ziqiang

    2017-01-23

    Coherent terahertz radiation excited by multiple electron beams in a plasmonic crystal-like structure (a three-dimensional hole array composed of multiple stacked layers, each with a 3 × 3 subwavelength hole array) is proposed in this paper. It is found that the electromagnetic fields in the individual holes can couple with one another to construct a composite mode with strong field intensity, which multiple injected electron beams can excite and interact with efficiently. Meanwhile, coupling among the electron beams takes place during the interaction, so that very strong coherent terahertz radiation with high electron conversion efficiency can be generated. Furthermore, owing to this coupling, the starting current density of this mechanism is much lower than that of traditional electron-beam-driven terahertz sources. This multi-beam radiation system may provide a favorable way to combine photonic structures with electronic excitation to generate medium- to high-power terahertz radiation.

  5. Coherent Terahertz Radiation from Multiple Electron Beams Excitation within a Plasmonic Crystal-like structure

    PubMed Central

    Zhang, Yaxin; Zhou, Yucong; Gang, Yin; Jiang, Guili; Yang, Ziqiang

    2017-01-01

    Coherent terahertz radiation excited by multiple electron beams in a plasmonic crystal-like structure (a three-dimensional hole array composed of multiple stacked layers, each with a 3 × 3 subwavelength hole array) is proposed in this paper. It is found that the electromagnetic fields in the individual holes can couple with one another to construct a composite mode with strong field intensity, which multiple injected electron beams can excite and interact with efficiently. Meanwhile, coupling among the electron beams takes place during the interaction, so that very strong coherent terahertz radiation with high electron conversion efficiency can be generated. Furthermore, owing to this coupling, the starting current density of this mechanism is much lower than that of traditional electron-beam-driven terahertz sources. This multi-beam radiation system may provide a favorable way to combine photonic structures with electronic excitation to generate medium- to high-power terahertz radiation. PMID:28112234

  6. Fusion of Remote Sensing Methods, UAV Photogrammetry and LiDAR Scanning products for monitoring fluvial dynamics

    NASA Astrophysics Data System (ADS)

    Lendzioch, Theodora; Langhammer, Jakub; Hartvich, Filip

    2015-04-01

    Fusion of remote sensing data is a common and rapidly developing discipline which combines data from multiple sources with different spatial and spectral resolutions, from satellite sensors, aircraft and ground platforms. Fused data contain more detailed information than any single source, enhancing the interpretation performance and accuracy of the source data and producing a high-quality visualisation of the final data. Especially in fluvial geomorphology, it is essential to obtain imagery at sub-meter resolution, providing high-quality 2D and 3D information for detailed identification, extraction and description of channel features of different river regimes and for rapid mapping of changes in river topography. In order to design, test and evaluate a new approach for detection of river morphology, we combine different research techniques, from remote sensing products to drone-based photogrammetry and LiDAR products (aerial LiDAR scanning and TLS). Topographic information (e.g. changes in river channel morphology, surface roughness, evaluation of floodplain inundation, mapping of gravel bars and slope characteristics) will be extracted either from a single layer or from combined layers in order to detect fluvial topographic changes before and after flood events. Besides statistical approaches for predictive geomorphological mapping and the determination of errors and uncertainties in the data, we will also provide 3D modelling of small fluvial features.

  7. Source apportionment of PM2.5 at the Lin'an regional background site in China with three receptor models

    NASA Astrophysics Data System (ADS)

    Deng, Junjun; Zhang, Yanru; Qiu, Yuqing; Zhang, Hongliang; Du, Wenjiao; Xu, Lingling; Hong, Youwei; Chen, Yanting; Chen, Jinsheng

    2018-04-01

    Source apportionment of fine particulate matter (PM2.5) was conducted at the Lin'an Regional Atmospheric Background Station (LA) in the Yangtze River Delta (YRD) region of China from July 2014 to April 2015 with three receptor models: principal component analysis combined with multiple linear regression (PCA-MLR), UNMIX and Positive Matrix Factorization (PMF). The model performance, source identification and source contributions of the three models were analyzed and inter-compared. Good correlations between the reconstructed and measured concentrations of PM2.5 and its major chemical species were obtained for all models. PMF resolved almost all of the PM2.5 mass, while PCA-MLR and UNMIX explained about 80%. Five, four and seven sources were identified by PCA-MLR, UNMIX and PMF, respectively. Combustion, secondary sources, marine sources, dust and industrial activities were identified by all three receptor models. Combustion and secondary sources were the major contributors, together accounting for over 60% of PM2.5. The PMF model performed better at separating the different combustion sources. These findings improve the understanding of PM2.5 sources in background regions.
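
    A stripped-down sketch of the PCA-MLR idea on synthetic data: principal component scores of the species matrix are regressed against total PM2.5 to apportion mass. Real applications use varimax-rotated absolute principal component scores; the two-source data generator below is purely illustrative.

```python
import numpy as np

# Synthetic dataset: two hidden "sources" generate six measured species.
rng = np.random.default_rng(1)
n_samples, n_species = 100, 6
s1 = rng.lognormal(0, 0.5, n_samples)
s2 = rng.lognormal(0, 0.5, n_samples)
profiles = rng.uniform(0, 1, (2, n_species))
species = (np.outer(s1, profiles[0]) + np.outer(s2, profiles[1])
           + rng.normal(0, 0.05, (n_samples, n_species)))
pm25 = s1 * profiles[0].sum() + s2 * profiles[1].sum()

# PCA on standardized species data; keep the two leading components.
Z = (species - species.mean(0)) / species.std(0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
scores = Z @ eigvecs[:, -2:]

# MLR of PM2.5 on the factor scores apportions mass to each factor.
X = np.column_stack([np.ones(n_samples), scores])
beta, *_ = np.linalg.lstsq(X, pm25, rcond=None)
reconstructed = X @ beta
```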

  8. Application of Molecular Typing Results in Source Attribution Models: The Case of Multiple Locus Variable Number Tandem Repeat Analysis (MLVA) of Salmonella Isolates Obtained from Integrated Surveillance in Denmark.

    PubMed

    de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine

    2016-03-01

    Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) replaced phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that the loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that the discriminatory level of the subtyping method applied will often require adjustment to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.

  9. Filling Terrorism Gaps: VEOs, Evaluating Databases, and Applying Risk Terrain Modeling to Terrorism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, Ross F.

    2016-08-29

    This paper aims to address three issues: the lack of literature differentiating terrorism and violent extremist organizations (VEOs), terrorism incident databases, and the applicability of Risk Terrain Modeling (RTM) to terrorism. Current open source literature and publicly available government sources do not differentiate between terrorism and VEOs; furthermore, they fail to define them. Addressing the lack of a comprehensive comparison of existing terrorism data sources, a matrix comparing a dozen terrorism databases is constructed, providing insight into the array of data available. RTM, a method for spatial risk analysis at a micro level, has some applicability to terrorism research, particularly for studies looking at risk indicators of terrorism. Leveraging attack data from multiple databases, combined with RTM, offers one avenue for closing existing research gaps in terrorism literature.

  10. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  11. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  12. Interactive visualization and analysis of multimodal datasets for surgical applications.

    PubMed

    Kirmizibayrak, Can; Yim, Yeny; Wakid, Mike; Hahn, James

    2012-12-01

    Surgeons use information from multiple sources when making surgical decisions. These include volumetric datasets (such as CT, PET, MRI, and their variants), 2D datasets (such as endoscopic videos), and vector-valued datasets (such as computer simulations). Presenting all the information to the user in an effective manner is a challenging problem. In this paper, we present a visualization approach that displays the information from various sources in a single coherent view. The system allows the user to explore and manipulate volumetric datasets, display analysis of dataset values in local regions, combine 2D and 3D imaging modalities and display results of vector-based computer simulations. Several interaction methods are discussed: in addition to traditional interfaces including mouse and trackers, gesture-based natural interaction methods are shown to control these visualizations with real-time performance. An example of a medical application (medialization laryngoplasty) is presented to demonstrate how the combination of different modalities can be used in a surgical setting with our approach.

  13. Source term estimation of radioxenon released from the Fukushima Dai-ichi nuclear reactors using measured air concentrations and atmospheric transport modeling.

    PubMed

    Eslinger, P W; Biegalski, S R; Bowyer, T W; Cooper, M W; Haas, D A; Hayes, J C; Hoffman, I; Korpach, E; Yi, J; Miley, H S; Rishel, J P; Ungar, K; White, B; Woods, V T

    2014-01-01

    Systems designed to monitor airborne radionuclides released from underground nuclear explosions detected radioactive fallout across the northern hemisphere resulting from the Fukushima Dai-ichi Nuclear Power Plant accident in March 2011. Sampling data from multiple International Monitoring System locations are combined with atmospheric transport modeling to estimate the magnitude and time sequence of releases of (133)Xe. Modeled dilution factors at five different detection locations were combined with 57 atmospheric concentration measurements of (133)Xe taken from March 18 to March 23 to estimate the source term. This analysis suggests that 92% of the 1.24 × 10(19) Bq of (133)Xe present in the three operating reactors at the time of the earthquake was released to the atmosphere over a 3 d period. An uncertainty analysis bounds the release estimates to 54-129% of the available (133)Xe inventory. Copyright © 2013 Elsevier Ltd. All rights reserved.
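
    The inversion step can be sketched as a linear least-squares problem: measured concentrations C relate to the interval-by-interval release q through modeled dilution factors D, so C ≈ Dq. The dilution matrix and release values below are hypothetical, not the paper's.

```python
import numpy as np

# One row per measurement, one column per release interval (s/m^3).
D = np.array([
    [2.0e-12, 1.0e-12, 0.0],
    [0.5e-12, 3.0e-12, 1.0e-12],
    [0.0,     1.0e-12, 2.5e-12],
    [1.0e-12, 0.5e-12, 0.5e-12],
])
q_true = np.array([4.0e18, 6.0e18, 2.0e18])   # Bq released per interval
C = D @ q_true                                # synthetic "measurements" (Bq/m^3)

# Least-squares estimate of the release, clipped to physical (nonnegative) values.
q_hat, *_ = np.linalg.lstsq(D, C, rcond=None)
q_hat = np.clip(q_hat, 0, None)
```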

  14. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 km × 100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
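
    The two comparison metrics named above, maximum wave amplitude and root mean square error, can be computed as below; the tide-gauge series are synthetic stand-ins, not Hilo data.

```python
import numpy as np

# Synthetic observed and simulated tide-gauge tsunami series (m vs hours).
t = np.linspace(0, 6, 361)
observed = 0.8 * np.sin(2 * np.pi * t / 0.5) * np.exp(-t / 3)
simulated = 0.7 * np.sin(2 * np.pi * (t - 0.05) / 0.5) * np.exp(-t / 3)

# Metrics used to rank candidate source solutions against the gauge record.
max_amp_obs = np.max(np.abs(observed))
max_amp_sim = np.max(np.abs(simulated))
rmse = np.sqrt(np.mean((simulated - observed) ** 2))
```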

  15. Efficient Assignment of Multiple E-MBMS Sessions towards LTE

    NASA Astrophysics Data System (ADS)

    Alexiou, Antonios; Bouras, Christos; Kokkinos, Vasileios

    One of the major prerequisites for Long Term Evolution (LTE) networks is the mass provision of multimedia services to mobile users. To this end, Evolved Multimedia Broadcast/Multicast Service (E-MBMS) is envisaged to play an instrumental role in the LTE standardization process and to ensure LTE's proliferation in the mobile market. E-MBMS targets the economical delivery, in terms of power and spectral efficiency, of multimedia data from a single source entity to multiple destinations. This paper proposes a novel mechanism for efficient radio bearer selection during E-MBMS transmissions in LTE networks. The proposed mechanism is based on the concept of combining transport channels in any cell of the network. Most significantly, the mechanism manages to deliver multiple E-MBMS sessions efficiently. The performance of the proposed mechanism is evaluated and compared with several radio bearer selection mechanisms in order to highlight the enhancements it provides.

  16. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on the merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity across several molecular changes that drive a disease phenotype. This tool was developed to facilitate the use of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  17. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, there is no one single tool that defines gene sets based on merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface to define genes according to their similarity in several molecular changes that are driving a disease phenotype. This tool was developed to help facilitate the usability of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications that go beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  18. Design of Xen Hybrid Multiple Police Model

    NASA Astrophysics Data System (ADS)

    Sun, Lei; Lin, Renhao; Zhu, Xianwei

    2017-10-01

    Virtualization technology has attracted more and more attention. As a popular open-source virtualization tool, Xen is used more and more frequently, and XSM, the Xen security model, has also drawn widespread concern. XSM does not establish a safety status classification, and it treats the virtual machine as the managed object, making Dom0 a unique administrative domain that does not satisfy the principle of least privilege. To address these issues, we design a hybrid multiple-policy model named SV_HMPMD that organically integrates multiple single security policy models, including DTE, RBAC, and BLP. It can fulfill the confidentiality and integrity requirements of a security model and apply different granularity to different domains. To improve BLP’s practicability, the model introduces multi-level security labels. To divide privileges in detail, we combine DTE with RBAC; to avoid oversized privileges, we limit the privileges of Dom0.

  19. How to retrieve additional information from the multiplicity distributions

    NASA Astrophysics Data System (ADS)

    Wilk, Grzegorz; Włodarczyk, Zbigniew

    2017-01-01

    Multiplicity distributions (MDs) P(N) measured in multiparticle production processes are most frequently described by the negative binomial distribution (NBD). However, with increasing collision energy some systematic discrepancies have become more and more apparent. They are usually attributed to the possible multi-source structure of the production process and described using a multi-NBD form of the MD. We investigate the possibility of keeping a single NBD but with its parameters depending on the multiplicity N. This is done by modifying the widely known clan model of particle production leading to the NBD form of P(N). This is then confronted with the approach based on the so-called cascade-stochastic formalism, which relies on different types of recurrence relations defining P(N). We demonstrate that a combination of both approaches allows the retrieval of additional valuable information from the MDs, namely the oscillatory behavior of the counting statistics apparently visible in the high energy data.
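    As a concrete illustration of the distribution at the heart of this abstract, the NBD can be generated directly from its recurrence relation, whose coefficient is linear in N; the minimal Python sketch below (parameter values are illustrative, not taken from the paper) builds P(N) that way:

```python
def nbd_pmf(n_max, mean, k):
    """Negative binomial multiplicity distribution P(N) for N = 0..n_max,
    built from the recurrence
        P(N+1) = P(N) * (N + k) / (N + 1) * mean / (mean + k),
    whose coefficient is linear in N -- the kind of recurrence relation
    that the cascade-stochastic formalism works with."""
    p = [(k / (mean + k)) ** k]  # P(0)
    for n in range(n_max):
        p.append(p[n] * (n + k) / (n + 1) * mean / (mean + k))
    return p

# illustrative parameters, not fitted to any data set
probs = nbd_pmf(400, mean=25.0, k=2.0)
total = sum(probs)                                # close to 1.0
avg = sum(n * pn for n, pn in enumerate(probs))   # close to the chosen mean
```

Checking that the recurrence reproduces a normalized distribution with the chosen mean is a quick sanity test of the linear-coefficient property.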

  20. Steady-State Ion Beam Modeling with MICHELLE

    NASA Astrophysics Data System (ADS)

    Petillo, John

    2003-10-01

    There is a need to efficiently model ion beam physics for ion implantation, chemical vapor deposition, and ion thrusters. Common to all is the need for three-dimensional (3D) simulation of volumetric ion sources, ion acceleration, and optics, with the ability to model charge exchange of the ion beam with a background neutral gas. The two pieces of physics that stand out as significant are the modeling of the volumetric source and charge exchange. In the MICHELLE code, the method for modeling the plasma sheath in ion sources assumes that the electron distribution function is a Maxwellian function of electrostatic potential over electron temperature. Charge exchange is the process by which a neutral background gas exchanges an electron with a "fast" charged particle streaming through it. An efficient method for capturing this is essential, and the model presented is based on semi-empirical collision cross section functions. This appears to be the first steady-state 3D algorithm of its type to contain multiple generations of charge exchange, work with multiple species and multiple charge state beam/source particles simultaneously, take into account the self-consistent space charge effects, and track the subsequent fast neutral particles. The solution used by MICHELLE is to combine finite element analysis with particle-in-cell (PIC) methods. The basic physics model is based on the equilibrium steady-state application of the electrostatic PIC approximation employing a conformal computational mesh. The foundation stems from the same basic model introduced in codes such as EGUN. Here, Poisson's equation is used to self-consistently include the effects of space charge on the fields, and the relativistic Lorentz equation is used to integrate the particle trajectories through those fields. The presentation will consider the complexity of modeling ion thrusters.
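    The sheath assumption described above is the standard Boltzmann electron relation, in which the electron density depends exponentially on the electrostatic potential measured in units of the electron temperature. A minimal sketch, with illustrative values that are not taken from MICHELLE:

```python
import math

def electron_density(phi_volts, n0=1.0e16, phi_ref_volts=0.0, te_ev=5.0):
    """Boltzmann (Maxwellian) electron response: n_e = n0 * exp(dphi / Te),
    with the potential difference and electron temperature expressed in
    volts and eV respectively. All parameter values here are illustrative,
    not MICHELLE's defaults."""
    return n0 * math.exp((phi_volts - phi_ref_volts) / te_ev)

# electrons are exponentially depleted where the potential drops by a few Te:
ratio = electron_density(-15.0) / electron_density(0.0)  # a drop of 3*Te
```

A potential drop of 3 electron temperatures depletes the electron density by a factor of e³, which is what confines most electrons to the quasi-neutral plasma and forms the sheath.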

  1. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  2. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  3. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  4. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  5. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  6. Legacy of contaminant N sources to the NO3- signature in rivers: a combined isotopic (δ15N-NO3-, δ18O-NO3-, δ11B) and microbiological investigation

    NASA Astrophysics Data System (ADS)

    Briand, Cyrielle; Sebilo, Mathieu; Louvat, Pascale; Chesnot, Thierry; Vaury, Véronique; Schneider, Maude; Plagnes, Valérie

    2017-02-01

    The nitrate content of surface waters results from the complex mixing of multiple sources, whose signatures can be modified through N reactions occurring within the different compartments of the whole catchment. Despite this complexity, determining the origin of nitrate is the first and crucial step for water resource preservation. Here, for the first time, we combined, at the catchment scale, stable isotopic tracers (δ15N and δ18O of nitrate, and δ11B) and fecal indicators to trace nitrate sources and pathways to the stream. We tested this approach on two rivers in an agricultural region of SW France. Boron isotopic ratios evidenced inflows from anthropogenic waters, while microbiological markers revealed organic contamination from both human and animal wastes. Nitrate δ15N and δ18O traced inputs from surface leaching during high flow events and from subsurface drainage in the base flow regime. They also showed that denitrification occurred within the soils before nitrate reached the rivers. Furthermore, this study highlighted the determinant role of the soil compartment in nitrate formation and recycling, with important spatial heterogeneity and temporal variability.

  7. Tracking iron in multiple sclerosis: a combined imaging and histopathological study at 7 Tesla

    PubMed Central

    Hametner, Simon; Yao, Bing; van Gelderen, Peter; Merkle, Hellmut; Cantor, Fredric K.; Lassmann, Hans; Duyn, Jeff H.

    2011-01-01

    Previous authors have shown that the transverse relaxivity R2* and frequency shifts that characterize gradient echo signal decay in magnetic resonance imaging are closely associated with the distribution of iron and myelin in the brain's white matter. In multiple sclerosis, iron accumulation in brain tissue may reflect a multiplicity of pathological processes. Hence, iron may have the unique potential to serve as an in vivo magnetic resonance imaging tracer of disease pathology. To investigate the ability of iron in tracking multiple sclerosis-induced pathology by magnetic resonance imaging, we performed qualitative histopathological analysis of white matter lesions and normal-appearing white matter regions with variable appearance on gradient echo magnetic resonance imaging at 7 Tesla. The samples used for this study derive from two patients with multiple sclerosis and one non-multiple sclerosis donor. Magnetic resonance images were acquired using a whole body 7 Tesla magnetic resonance imaging scanner equipped with a 24-channel receive-only array designed for tissue imaging. A 3D multi-gradient echo sequence was obtained and quantitative R2* and phase maps were reconstructed. Immunohistochemical stainings for myelin and oligodendrocytes, microglia and macrophages, ferritin and ferritin light polypeptide were performed on 3- to 5-µm thick paraffin sections. Iron was detected with Perl's staining and 3,3′-diaminobenzidine-tetrahydrochloride enhanced Turnbull blue staining. In multiple sclerosis tissue, iron presence invariably matched with an increase in R2*. Conversely, R2* increase was not always associated with the presence of iron on histochemical staining. We interpret this finding as the effect of embedding, sectioning and staining procedures. These processes likely affected the histopathological analysis results but not the magnetic resonance imaging that was obtained before tissue manipulations. Several cellular sources of iron were identified. 
These sources included oligodendrocytes in normal-appearing white matter and activated macrophages/microglia at the edges of white matter lesions. Additionally, in white matter lesions, iron precipitation in aggregates typical of microbleeds was shown by the Perl's staining. Our combined imaging and pathological study shows that multi-gradient echo magnetic resonance imaging is a sensitive technique for the identification of iron in the brain tissue of patients with multiple sclerosis. However, magnetic resonance imaging-identified iron does not necessarily reflect pathology and may also be seen in apparently normal tissue. Iron identification by multi-gradient echo magnetic resonance imaging in diseased tissues can shed light on the pathological processes when coupled with topographical information and patient disease history. PMID:22171355

  8. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.

    2015-12-01

    Seismology addresses both fundamental problems in understanding the Earth's internal wave sources and structures as well as societal applications such as earthquake and tsunami hazard assessment and risk mitigation, and puts a premium on the open data made accessible by the Federated Digital Seismological Networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs) along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by gathering data from multiple sources, managing Earth models and meshes, distilling them as input data, and capturing results with meta-data. The dispel4py data-intensive framework allows for developing data-analysis applications using Python and the ObsPy library, which can be executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observation System (EPOS), and is part of the EPOS implementation phase. 
Its cross-disciplinary capabilities are being extended for the EPOS implementation phase.

  9. Psychosocial Intervention for Young Children With Chronic Tics

    ClinicalTrials.gov

    2018-06-18

    Tourette's Syndrome; Tourette's Disorder; Tourette's Disease; Tourette Disorder; Tourette Disease; Tic Disorder, Combined Vocal and Multiple Motor; Multiple Motor and Vocal Tic Disorder, Combined; Gilles de La Tourette's Disease; Gilles de la Tourette Syndrome; Gilles De La Tourette's Syndrome; Combined Vocal and Multiple Motor Tic Disorder; Combined Multiple Motor and Vocal Tic Disorder; Chronic Motor and Vocal Tic Disorder

  10. MetaTracker: integration and abstraction of 3D motion tracking data from multiple hardware systems

    NASA Astrophysics Data System (ADS)

    Kopecky, Ken; Winer, Eliot

    2014-06-01

    Motion tracking has long been one of the primary challenges in mixed reality (MR), augmented reality (AR), and virtual reality (VR). Military and defense training can provide particularly difficult challenges for motion tracking, such as in the case of Military Operations in Urban Terrain (MOUT) and other dismounted, close quarters simulations. These simulations can take place across multiple rooms, with many fast-moving objects that need to be tracked with a high degree of accuracy and low latency. Many tracking technologies exist, such as optical, inertial, ultrasonic, and magnetic. Some tracking systems even combine these technologies to complement each other. However, there are no systems that provide a high-resolution, flexible, wide-area solution that is resistant to occlusion. While frameworks exist that simplify the use of tracking systems and other input devices, none allow data from multiple tracking systems to be combined, as if from a single system. In this paper, we introduce a method for compensating for the weaknesses of individual tracking systems by combining data from multiple sources and presenting it as a single tracking system. Individual tracked objects are identified by name, and their data is provided to simulation applications through a server program. This allows tracked objects to transition seamlessly from the area of one tracking system to another. Furthermore, it abstracts away the individual drivers, APIs, and data formats for each system, providing a simplified API that can be used to receive data from any of the available tracking systems. Finally, when single-piece tracking systems are used, those systems can themselves be tracked, allowing for real-time adjustment of the trackable area. This allows simulation operators to leverage limited resources in more effective ways, improving the quality of training.
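    The idea of presenting several tracking systems as one, with tracked objects identified by name and served through a single program, can be sketched roughly as follows; all class and function names here are hypothetical illustrations, not MetaTracker's actual API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A tracked object's position plus the time it was reported."""
    x: float
    y: float
    z: float
    timestamp: float

class TrackerAggregator:
    """Merges reports from multiple tracking systems and exposes them
    as if they came from a single system: each named object keeps only
    its most recent pose, whichever system produced it."""

    def __init__(self):
        self._latest = {}  # object name -> (source name, Pose)

    def report(self, source, name, pose):
        current = self._latest.get(name)
        # keep the freshest pose, regardless of which system sent it
        if current is None or pose.timestamp >= current[1].timestamp:
            self._latest[name] = (source, pose)

    def query(self, name):
        entry = self._latest.get(name)
        return None if entry is None else entry[1]

agg = TrackerAggregator()
agg.report("optical_room_a", "trainee_1", Pose(0.0, 1.0, 2.0, timestamp=10.0))
# the object moves into the area covered by a different tracking system;
# the transition is seamless from the consumer's point of view
agg.report("inertial_room_b", "trainee_1", Pose(5.0, 1.0, 2.0, timestamp=11.0))
pose = agg.query("trainee_1")
```

A consumer simulation only ever calls `query(name)`, so the individual drivers, data formats, and coverage areas stay hidden behind the aggregator, which is the abstraction the paper describes.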

  11. Passive synthetic aperture radar imaging of ground moving targets

    NASA Astrophysics Data System (ADS)

    Wacks, Steven; Yazici, Birsen

    2012-05-01

    In this paper we present a method for imaging ground moving targets using passive synthetic aperture radar. A passive radar imaging system uses small, mobile receivers that do not radiate any energy. For these reasons, passive imaging systems result in significant cost, manufacturing, and stealth advantages. The received signals are obtained by multiple airborne receivers collecting scattered waves due to illuminating sources of opportunity such as commercial television, radio, and cell phone towers. We describe a novel forward model and a corresponding filtered-backprojection type image reconstruction method combined with entropy optimization. Our method determines the location and velocity of multiple targets moving at different velocities. Furthermore, it can accommodate arbitrary imaging geometries. We present numerical simulations to verify the imaging method.

  12. In Situ Electrochemical Synthesis of Oriented and Defect-Free AEL Molecular-Sieve Films Using Ionic Liquids.

    PubMed

    Yu, Tongwen; Chu, Wenling; Cai, Rui; Liu, Yanchun; Yang, Weishen

    2015-10-26

    Preparing oriented and defect-free molecular-sieve films by simple means has been a long-standing challenge in both academia and industry. Most early works focus on careful, multi-step control of the seed layer or the synthesis conditions. Herein, we report a one-step in situ electrochemical ionothermal method that combines a controllable electric field with ionic liquids. We demonstrate that an in-plane oriented and defect-free AEL (one molecular-sieve framework type) molecular-sieve film was obtained using an Al electrode as the Al source. The excellent corrosion-resistant performance of the film makes this technology promising in multiple applications, such as anti-corrosion coatings. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Parallel excitation-emission multiplexed fluorescence lifetime confocal microscopy for live cell imaging

    PubMed Central

    Zhao, Ming; Li, Yu; Peng, Leilei

    2014-01-01

    We present a novel excitation-emission multiplexed fluorescence lifetime microscopy (FLIM) method that surpasses current FLIM techniques in multiplexing capability. The method employs Fourier multiplexing to simultaneously acquire confocal fluorescence lifetime images of multiple excitation wavelength and emission color combinations at 44,000 pixels/sec. The system is built with low-cost CW laser sources and standard PMTs with versatile spectral configuration, and can be implemented as an add-on to commercial confocal microscopes. The Fourier lifetime confocal method allows fast multiplexed FLIM imaging, which makes it possible to monitor multiple biological processes in live cells. The low cost and compatibility with commercial systems could also make multiplexed FLIM more accessible to the biological research community. PMID:24921725

  14. Developing a system for blind acoustic source localization and separation

    NASA Astrophysics Data System (ADS)

    Kulkarni, Raghavendra

    This dissertation presents innovative methodologies for locating, extracting, and separating multiple incoherent sound sources in three-dimensional (3D) space, and applications of the time reversal (TR) algorithm to pinpoint the hyperactive neural activities inside the brain auditory structure that are correlated to the tinnitus pathology. Specifically, an acoustic modeling based method is developed for locating arbitrary and incoherent sound sources in 3D space in real time by using a minimal number of microphones, and the Point Source Separation (PSS) method is developed for extracting target signals from directly measured mixed signals. Combining these two approaches leads to a novel technology known as Blind Sources Localization and Separation (BSLS) that enables one to locate multiple incoherent sound signals in 3D space and separate original individual sources simultaneously, based on the directly measured mixed signals. These technologies have been validated through numerical simulations and experiments conducted in various non-ideal environments where there are non-negligible, unspecified sound reflections and reverberation as well as interference from random background noise. Another innovation presented in this dissertation is concerned with applications of the TR algorithm to pinpoint the exact locations of hyperactive neurons in the brain auditory structure that are directly correlated to the tinnitus perception. Benchmark tests conducted on normal rats have confirmed the localization results provided by the TR algorithm. Results demonstrate that the spatial resolution of this source localization can be as high as the micrometer level. This high-precision localization may lead to a paradigm shift in tinnitus diagnosis, which may in turn produce a more cost-effective treatment for tinnitus than any of the existing ones.

  15. Intelligence, Dataveillance, and Information Privacy

    NASA Astrophysics Data System (ADS)

    Mace, Robyn R.

    The extent and scope of intelligence activities are expanding in response to technological and economic transformations of the past decades. Intelligence efforts involving aggregated data from multiple public and private sources combined with past abuses of domestic intelligence functions have generated significant concerns among privacy advocates and citizens about the protection of individual civil liberties and information privacy from corporate and governmental misuse. In the information age, effective regulation and oversight are key components in the legitimacy and success of government domestic intelligence activities.

  16. Capnography and chest wall impedance algorithms for ventilation detection during cardiopulmonary resuscitation

    PubMed Central

    Edelson, Dana P.; Eilevstjønn, Joar; Weidman, Elizabeth K.; Retzer, Elizabeth; Vanden Hoek, Terry L.; Abella, Benjamin S.

    2009-01-01

    Objective Hyperventilation is both common and detrimental during cardiopulmonary resuscitation (CPR). Chest wall impedance algorithms have been developed to detect ventilations during CPR. However, impedance signals are challenged by noise artifact from multiple sources, including chest compressions. Capnography has been proposed as an alternate method to measure ventilations. We sought to assess and compare the adequacy of these two approaches. Methods Continuous chest wall impedance and capnography were recorded during consecutive in-hospital cardiac arrests. Algorithms utilizing each of these data sources were compared to a manually determined “gold standard” reference ventilation rate. In addition, a combination algorithm, which utilized the highest of the impedance or capnography values in any given minute, was similarly evaluated. Results Data were collected from 37 cardiac arrests, yielding 438 min of data with continuous chest compressions and concurrent recording of impedance and capnography. The manually calculated mean ventilation rate was 13.3±4.3/min. In comparison, the defibrillator’s impedance-based algorithm yielded an average rate of 11.3±4.4/min (p=0.0001) while the capnography rate was 11.7±3.7/min (p=0.0009). There was no significant difference in sensitivity and positive predictive value between the two methods. The combination algorithm rate was 12.4±3.5/min (p=0.02), which yielded the highest fraction of minutes with respiratory rates within 2/min of the reference. The impedance signal was uninterpretable 19.5% of the time, compared with 9.7% for capnography. However, the signals were only simultaneously non-interpretable 0.8% of the time. Conclusions Both the impedance and capnography-based algorithms underestimated the ventilation rate. Reliable ventilation rate determination may require a novel combination of multiple algorithms during resuscitation. PMID:20036047
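    The per-minute combination rule described in the abstract (take the higher of the impedance and capnography rates in any given minute) is simple to state in code; this sketch uses hypothetical names and illustrative numbers, not the study's data:

```python
def combined_rate(impedance_rates, capnography_rates):
    """Per-minute combination as described in the abstract: for each
    minute, take the higher of the impedance- and capnography-derived
    ventilation rates. None marks a minute in which that signal was
    uninterpretable; a minute is lost only if both signals are."""
    combined = []
    for imp, cap in zip(impedance_rates, capnography_rates):
        candidates = [r for r in (imp, cap) if r is not None]
        combined.append(max(candidates) if candidates else None)
    return combined

# illustrative minutes: impedance noisy in minute 2, capnography in minute 3
rates = combined_rate([10, None, 14], [12, 11, None])  # [12, 11, 14]
```

Because the two signals were simultaneously uninterpretable only 0.8% of the time, a rule like this preserves almost every minute of monitoring while leaning on whichever signal is cleaner.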

  17. SEARCHES FOR HIGH-ENERGY NEUTRINO EMISSION IN THE GALAXY WITH THE COMBINED ICECUBE-AMANDA DETECTOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbasi, R.; Ahlers, M.; Andeen, K.

    2013-01-20

    We report on searches for neutrino sources at energies above 200 GeV in the Northern sky of the Galactic plane, using the data collected by the South Pole neutrino telescope, IceCube, and AMANDA. The Galactic region considered in this work includes the local arm toward the Cygnus region and our closest approach to the Perseus Arm. The searches are based on the data collected between 2007 and 2009. During this time AMANDA was an integrated part of IceCube, which was still under construction and operated with 22 strings (2007-2008) and 40 strings (2008-2009) of optical modules deployed in the ice. By combining the advantages of the larger IceCube detector with the lower energy threshold of the more compact AMANDA detector, we obtain an improved sensitivity at energies below ~10 TeV with respect to previous searches. The analyses presented here are a scan for point sources within the Galactic plane, a search optimized for multiple and extended sources in the Cygnus region, which might be below the sensitivity of the point source scan, and studies of seven pre-selected neutrino source candidates. For one of them, Cygnus X-3, a time-dependent search for neutrino emission in coincidence with observed radio and X-ray flares has been performed. No evidence of a signal is found, and upper limits are reported for each of the searches. We investigate neutrino spectra proportional to E⁻² and E⁻³ in order to cover the entire range of possible neutrino spectra. The steeply falling E⁻³ neutrino spectrum can also be used to approximate neutrino energy spectra with energy cutoffs below 50 TeV since these result in a similar energy distribution of events in the detector. 
For the region of the Galactic plane visible in the Northern sky, the 90% confidence level muon neutrino flux upper limits are in the range E³ dN/dE ≈ 5.4-19.5 × 10⁻¹¹ TeV² cm⁻² s⁻¹ for point-like neutrino sources in the energy region [180.0 GeV-20.5 TeV]. These represent the most stringent upper limits for soft-spectra neutrino sources within the Galaxy reported to date.

  18. Negative effects of item repetition on source memory.

    PubMed

    Kim, Kyungmi; Yi, Do-Joon; Raye, Carol L; Johnson, Marcia K

    2012-08-01

    In the present study, we explored how item repetition affects source memory for new item-feature associations (picture-location or picture-color). We presented line drawings varying numbers of times in Phase 1. In Phase 2, each drawing was presented once with a critical new feature. In Phase 3, we tested memory for the new source feature of each item from Phase 2. Experiments 1 and 2 demonstrated and replicated the negative effects of item repetition on incidental source memory. Prior item repetition also had a negative effect on source memory when different source dimensions were used in Phases 1 and 2 (Experiment 3) and when participants were explicitly instructed to learn source information in Phase 2 (Experiments 4 and 5). Importantly, when the order between Phases 1 and 2 was reversed, such that item repetition occurred after the encoding of critical item-source combinations, item repetition no longer affected source memory (Experiment 6). Overall, our findings did not support predictions based on item predifferentiation, within-dimension source interference, or general interference from multiple traces of an item. Rather, the findings were consistent with the idea that prior item repetition reduces attention to subsequent presentations of the item, decreasing the likelihood that critical item-source associations will be encoded.

  19. Spatio-Temporal Data Model for Integrating Evolving Nation-Level Datasets

    NASA Astrophysics Data System (ADS)

    Sorokine, A.; Stewart, R. N.

    2017-10-01

    The ability to easily combine data from diverse sources in a single analytical workflow is one of the greatest promises of Big Data technologies. However, such integration is often challenging, as datasets originate from different vendors, governments, and research communities, which results in multiple incompatibilities in data representations, formats, and semantics. Semantic differences are the hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by differences in how the boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking the histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving the identity of geographic entities over time by defining criteria for an entity's existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). The proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. A practical implementation of our model is demonstrated using a PostgreSQL object-relational database with temporal, geospatial, and NoSQL database extensions.
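    The event-based existence criteria described in this abstract might be sketched as follows; the class and event names are hypothetical illustrations, not the authors' actual schema, and the toy history is only an example of a unit that is created and later split:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """An event affecting the existence of a geographic unit;
    the event kinds here ("created", "split", "merged") are
    illustrative, not an exhaustive taxonomy."""
    year: int
    kind: str
    successors: list = field(default_factory=list)

@dataclass
class GeoUnit:
    name: str
    events: list = field(default_factory=list)

    def exists_in(self, year):
        """A unit exists from its creation event until the first event
        that dissolves it (a split or a merger), mimicking the model's
        idea of explicit existence criteria."""
        created = next(e.year for e in self.events if e.kind == "created")
        ended = [e.year for e in self.events if e.kind in ("split", "merged")]
        return created <= year and (not ended or year < min(ended))

# toy history: a country created in 1918 that splits into two states in 1993
unit = GeoUnit("Czechoslovakia",
               [Event(1918, "created"),
                Event(1993, "split", ["Czechia", "Slovakia"])])
ok_1950 = unit.exists_in(1950)   # existed
ok_2000 = unit.exists_in(2000)   # dissolved by then
```

Attaching successor units to dissolution events is what lets a socioeconomic time series recorded against the old unit be mapped onto the entities that replace it.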

  20. (Sub)millimetre interferometric imaging of a sample of COSMOS/AzTEC submillimetre galaxies. I. Multiwavelength identifications and redshift distribution

    NASA Astrophysics Data System (ADS)

    Miettinen, O.; Smolčić, V.; Novak, M.; Aravena, M.; Karim, A.; Masters, D.; Riechers, D. A.; Bussmann, R. S.; McCracken, H. J.; Ilbert, O.; Bertoldi, F.; Capak, P.; Feruglio, C.; Halliday, C.; Kartaltepe, J. S.; Navarrete, F.; Salvato, M.; Sanders, D.; Schinnerer, E.; Sheth, K.

    2015-05-01

    We used the Plateau de Bure Interferometer (PdBI) to map a sample of 15 submillimetre galaxies (SMGs) in the COSMOS field at the wavelength of 1.3 mm. The target SMGs were originally discovered in the James Clerk Maxwell Telescope (JCMT)/AzTEC 1.1 mm continuum survey at S/N(1.1 mm) = 4-4.5. This paper presents, for the first time, interferometric millimetre-wavelength observations of these sources. The angular resolution of our observations, 1.″8, allowed us to accurately determine the positions of the target SMGs. Using a detection threshold of S/N(1.3 mm) > 4.5 regardless of multiwavelength counterpart association, and 4

  1. Controlled Release Strategies for Bone, Cartilage, and Osteochondral Engineering—Part II: Challenges on the Evolution from Single to Multiple Bioactive Factor Delivery

    PubMed Central

    Santo, Vítor E.; Mano, João F.; Reis, Rui L.

    2013-01-01

    The development of controlled release systems for the regeneration of bone, cartilage, and osteochondral interface is one of the hot topics in the field of tissue engineering and regenerative medicine. However, the majority of the developed systems consider only the release of a single growth factor, which is a limiting step for the success of the therapy. More recent studies have been focused on the design and tailoring of appropriate combinations of bioactive factors to match the desired goals regarding tissue regeneration. In fact, considering the complexity of extracellular matrix and the diversity of growth factors and cytokines involved in each biological response, it is expected that an appropriate combination of bioactive factors could lead to more successful outcomes in tissue regeneration. In this review, the evolution on the development of dual and multiple bioactive factor release systems for bone, cartilage, and osteochondral interface is overviewed, specifically the relevance of parameters such as dosage and spatiotemporal distribution of bioactive factors. A comprehensive collection of studies focused on the delivery of bioactive factors is also presented while highlighting the increasing impact of platelet-rich plasma as an autologous source of multiple growth factors. PMID:23249320

  2. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated by functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It computes reliability indices for critical bridge components and individual bridge spans, and it includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics yield insights that help bridge owners address problems faster.

  3. Combinatorial Fusion Analysis for Meta Search Information Retrieval

    NASA Astrophysics Data System (ADS)

    Hsu, D. Frank; Taksa, Isak

    Leading commercial search engines are built as single event systems. In response to a particular search query, the search engine returns a single list of ranked search results. To find more relevant results the user must frequently try several other search engines. A meta search engine was developed to enhance the process of multi-engine querying. The meta search engine queries several engines at the same time and fuses individual engine results into a single search results list. The fusion of multiple search results has been shown (mostly experimentally) to be highly effective. However, the question of why and how the fusion should be done still remains largely unanswered. In this chapter, we utilize the combinatorial fusion analysis proposed by Hsu et al. to analyze combination and fusion of multiple sources of information. A rank/score function is used in the design and analysis of our framework. The framework provides a better understanding of the fusion phenomenon in information retrieval. For example, to improve the performance of the combined multiple scoring systems, it is necessary that each of the individual scoring systems has relatively high performance and the individual scoring systems are diverse. Additionally, we illustrate various applications of the framework using two examples from the information retrieval domain.
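The rank/score combination at the core of combinatorial fusion can be sketched as follows; the normalization and the toy engine scores are illustrative assumptions, not Hsu et al.'s data:

```python
# Minimal sketch of score and rank combination across multiple scoring
# systems (e.g. search engines answering the same query).
def rank_function(scores):
    """Map each item to its rank (1 = best) under a score function."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {item: i + 1 for i, item in enumerate(ordered)}

def score_combination(systems):
    """Average the scores assigned by each scoring system."""
    items = set().union(*systems)
    return {d: sum(s.get(d, 0.0) for s in systems) / len(systems) for d in items}

def rank_combination(systems):
    """Average the ranks assigned by each scoring system (lower = better)."""
    ranks = [rank_function(s) for s in systems]
    items = set().union(*systems)
    worst = len(items) + 1
    return {d: sum(r.get(d, worst) for r in ranks) / len(ranks) for d in items}

# Two diverse engines scoring three documents for one query:
a = {"doc1": 0.9, "doc2": 0.5, "doc3": 0.1}
b = {"doc1": 0.4, "doc2": 0.7, "doc3": 0.3}
fused = score_combination([a, b])
ranks = rank_combination([a, b])
best = max(fused, key=fused.get)
print(best, ranks["doc3"])  # doc1 3.0
```

As the chapter argues, fusion helps most when the individual systems are both strong and diverse; with two nearly identical engines the combined ranking simply reproduces either input.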

  4. Tracking Vessels to Illegal Pollutant Discharges Using Multisource Vessel Information

    NASA Astrophysics Data System (ADS)

    Busler, J.; Wehn, H.; Woodhouse, L.

    2015-04-01

    Illegal discharge of bilge waters is a significant source of oil and other environmental pollutants in Canadian and international waters. Imaging satellites are commonly used to monitor large areas to detect oily discharges from vessels, off-shore platforms and other sources. While remotely sensed imagery provides a snap-shot picture useful for detecting a spill or the presence of vessels in the vicinity, it is difficult to directly associate a vessel to an observed spill unless the vessel is observed while the discharge is occurring. The situation then becomes more challenging with increased vessel traffic as multiple vessels may be associated with a spill event. By combining multiple sources of vessel location data, such as Automated Information Systems (AIS), Long Range Identification and Tracking (LRIT) and SAR-based ship detection, with spill detections and drift models we have created a system that associates detected spill events with vessels in the area using a probabilistic model that intersects vessel tracks and spill drift trajectories in both time and space. Working with the Canadian Space Agency and the Canadian Ice Service's Integrated Satellite Tracking of Pollution (ISTOP) program, we use spills observed in Canadian waters to demonstrate the investigative value of augmenting spill detections with temporally sequenced vessel and spill tracking information.

  5. Diversity-based acoustic communication with a glider in deep water.

    PubMed

    Song, H C; Howe, Bruce M; Brown, Michael G; Andrew, Rex K

    2014-03-01

    The primary use of underwater gliders is to collect oceanographic data within the water column and periodically relay the data at the surface via a satellite connection. In summer 2006, a Seaglider equipped with an acoustic recording system received transmissions from a broadband acoustic source centered at 75 Hz deployed on the bottom off Kauai, Hawaii, while moving away from the source at ranges up to ∼200 km in deep water and diving up to 1000-m depth. The transmitted signal was an m-sequence that can be treated as a binary-phase shift-keying communication signal. In this letter multiple receptions are exploited (i.e., diversity combining) to demonstrate the feasibility of using the glider as a mobile communication gateway.
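The diversity-combining idea can be illustrated with a toy BPSK simulation; the symbol count, noise level, and number of receptions below are invented parameters, not the Kauai experiment's actual channel conditions:

```python
import numpy as np

# Several noisy receptions of the same BPSK symbol sequence are averaged
# (equal-gain diversity combining) before the sign decision, reducing the
# effective noise and hence the number of bit errors.
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=2000)   # transmitted BPSK symbols
noise_std = 1.5                                # per-reception noise level

receptions = [symbols + rng.normal(0, noise_std, symbols.size) for _ in range(4)]

def bit_errors(rx):
    return int(np.sum(np.sign(rx) != symbols))

single = bit_errors(receptions[0])
combined = bit_errors(np.mean(receptions, axis=0))
print(single, combined)  # combining four receptions cuts the error count
```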

  6. Integration of Schemas on the Pre-Design Level Using the KCPM-Approach

    NASA Astrophysics Data System (ADS)

    Vöhringer, Jürgen; Mayr, Heinrich C.

    Integration is a central research and operational issue in information system design and development. It can be conducted on the system, schema, view, or data level. On the system level, integration deals with the progressive linking and testing of system components to merge their functional and technical characteristics and behavior into a comprehensive, interoperable system. Schema integration comprises the comparison and merging of two or more schemas, usually conceptual database schemas. Data integration deals with merging the contents of multiple sources of related data. View integration is similar to schema integration but focuses on views and the queries over them rather than on the schemas themselves. All these types of integration have in common that two or more sources are first compared, in order to identify matches, mismatches, conflicts, and inconsistencies, and then merged. The sources may stem from heterogeneous companies, organizational units, or projects. Integration enables the reuse and combined use of source components.

  7. Chemometric techniques in distribution, characterisation and source apportionment of polycyclic aromatic hydrocarbons (PAHS) in aquaculture sediments in Malaysia.

    PubMed

    Retnam, Ananthy; Zakaria, Mohamad Pauzi; Juahir, Hafizan; Aris, Ahmad Zaharin; Zali, Munirah Abdul; Kasim, Mohd Fadhil

    2013-04-15

    This study investigated polycyclic aromatic hydrocarbon (PAH) pollution in surface sediments within aquaculture areas in Peninsular Malaysia using chemometric techniques, forensics, and univariate methods. The samples were analysed using Soxhlet extraction, silica gel column clean-up, and gas chromatography-mass spectrometry. The total PAH concentrations ranged from 20 to 1841 ng/g with a mean of 363 ng/g dw. The application of chemometric techniques enabled clustering and discrimination of the aquaculture sediments into four groups according to contamination level. A combination of chemometric and molecular indices was used to identify the sources of PAHs, which could be attributed to vehicle emissions, oil combustion, and biomass combustion. Source apportionment using absolute principal component scores-multiple linear regression showed that the main sources of PAHs are vehicle emissions (54%), oil combustion (37%), and biomass combustion (9%). Land-based pollution from vehicle emissions is the predominant contributor of PAHs in the aquaculture sediments of Peninsular Malaysia.
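The absolute principal component scores-multiple linear regression (APCS-MLR) apportionment step works roughly as follows; the scores, loadings, and noise level are synthetic stand-ins for the study's measured PAH profiles:

```python
import numpy as np

# Total concentrations are regressed on source scores; each source's share
# of the total is its mean fitted contribution.
rng = np.random.default_rng(1)
n = 200
scores = rng.uniform(0, 1, size=(n, 3))        # stand-in APCS for 3 sources
true_coef = np.array([54.0, 37.0, 9.0])        # synthetic source loadings
total = scores @ true_coef + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), scores])      # intercept + source scores
coef, *_ = np.linalg.lstsq(X, total, rcond=None)

contrib = coef[1:] * scores.mean(axis=0)       # mean contribution per source
shares = 100 * contrib / contrib.sum()         # percentage apportionment
print(np.round(shares, 1))
```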

  8. Tomographic gamma ray apparatus and method

    DOEpatents

    Anger, Hal O.

    1976-09-07

    This invention provides a radiation detecting apparatus for imaging the distribution of radioactive substances in a three-dimensional subject such as a medical patient. Radiating substances introduced into the subject are viewed by a radiation image detector that provides an image of the distribution of radiating sources within its field of view. By viewing the area of interest from two or more positions, as by scanning the detector over the area, the radiating sources seen by the detector have relative positions that are a function of their depth in the subject. The images seen by the detector are transformed into first output signals which are combined in a readout device with second output signals that indicate the position of the detector relative to the subject. The readout device adjusts the signals and provides multiple radiation distribution readouts of the subject, each readout comprising a sharply resolved picture that shows the distribution and intensity of radiating sources lying in a selected plane in the subject, while sources lying on other planes are blurred in that particular readout.

  9. Who do we think we are? Analysing the content and form of identity work in the English National Health Service.

    PubMed

    McDermott, Imelda; Checkland, Kath; Harrison, Stephen; Snow, Stephanie; Coleman, Anna

    2013-01-01

    The language used by National Health Service (NHS) "commissioning" managers when discussing their roles and responsibilities can be seen as a manifestation of "identity work", defined as a process of identifying. This paper aims to offer a novel approach to analysing "identity work" by triangulation of multiple analytical methods, combining analysis of the content of text with analysis of its form. Fairclough's discourse analytic methodology is used as a framework. Following Fairclough, the authors use analytical methods associated with Halliday's systemic functional linguistics. While analysis of the content of interviews provides some information about NHS Commissioners' perceptions of their roles and responsibilities, analysis of the form of discourse that they use provides a more detailed and nuanced view. Overall, the authors found that commissioning managers have a higher level of certainty about what commissioning is not rather than what commissioning is; GP managers have a high level of certainty of their identity as a GP rather than as a manager; and both GP managers and non-GP managers oscillate between multiple identities depending on the different situations they are in. This paper offers a novel approach to triangulation, based not on the usual comparison of multiple data sources, but rather based on the application of multiple analytical methods to a single source of data. This paper also shows the latent uncertainty about the nature of commissioning enterprise in the English NHS.

  10. Matched field localization based on CS-MUSIC algorithm

    NASA Astrophysics Data System (ADS)

    Guo, Shuangle; Tang, Ruichun; Peng, Linhui; Ji, Xiaopeng

    2016-04-01

    The problems caused by too few or too many snapshots and by coherent sources in underwater acoustic positioning are considered. A matched field localization algorithm based on CS-MUSIC (Compressive Sensing Multiple Signal Classification) is proposed, built on a sparse mathematical model of underwater positioning. The signal matrix is calculated through the SVD (Singular Value Decomposition) of the observation matrix. The observation matrix in the sparse model is then replaced by the signal matrix, yielding a new, more compact sparse model in which both the scale of the localization problem and the noise level are reduced. The new sparse model is solved by the CS-MUSIC algorithm, which combines the CS (Compressive Sensing) and MUSIC (Multiple Signal Classification) methods. The proposed algorithm effectively overcomes the difficulties caused by correlated sources and by a shortage of snapshots, and, when the number of snapshots is large, it also reduces the time complexity and noise level of the localization problem by using the SVD of the observation matrix, as demonstrated in this paper.
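The SVD reduction at the heart of the method can be sketched as follows; the array size, snapshot count, and random stand-in steering vectors are assumptions for illustration only:

```python
import numpy as np

# The observation matrix (sensors x snapshots) is reduced to its dominant
# signal subspace, shrinking the problem size and suppressing noise.
rng = np.random.default_rng(2)
sensors, snapshots, n_src = 16, 500, 2

steer = rng.normal(size=(sensors, n_src))          # stand-in steering vectors
sig = rng.normal(size=(n_src, snapshots))
Y = steer @ sig + 0.1 * rng.normal(size=(sensors, snapshots))

U, s, Vh = np.linalg.svd(Y, full_matrices=False)
Y_sig = U[:, :n_src] * s[:n_src]                   # compact signal matrix

print(Y.shape, Y_sig.shape)   # (16, 500) (16, 2)
```

The localization problem is then solved against `Y_sig` instead of `Y`, which is why a large snapshot count no longer inflates the cost.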

  11. Implant for in-vivo parameter monitoring, processing and transmitting

    DOEpatents

    Ericson, Milton N [Knoxville, TN; McKnight, Timothy E [Greenback, TN; Smith, Stephen F [London, TN; Hylton, James O [Clinton, TN

    2009-11-24

    The present invention relates to a completely implantable intracranial pressure monitor, which can couple to existing fluid shunting systems as well as other internal monitoring probes. The implant sensor produces an analog data signal which is then converted electronically to a digital pulse by generation of a spreading code signal and then transmitted to a location outside the patient by a radio-frequency transmitter to an external receiver. The implanted device can receive power from an internal source as well as an inductive external source. Remote control of the implant is also provided by a control receiver which passes commands from an external source to the implant system logic. Alarm parameters can be programmed into the device which are capable of producing an audible or visual alarm signal. The utility of the monitor can be greatly expanded by using multiple pressure sensors simultaneously or by combining sensors of various physiological types.

  12. Implantable device for in-vivo intracranial and cerebrospinal fluid pressure monitoring

    DOEpatents

    Ericson, Milton N.; McKnight, Timothy E.; Smith, Stephen F.; Hylton, James O.

    2003-01-01

    The present invention relates to a completely implantable intracranial pressure monitor, which can couple to existing fluid shunting systems as well as other internal monitoring probes. The implant sensor produces an analog data signal which is then converted electronically to a digital pulse by generation of a spreading code signal and then transmitted to a location outside the patient by a radio-frequency transmitter to an external receiver. The implanted device can receive power from an internal source as well as an inductive external source. Remote control of the implant is also provided by a control receiver which passes commands from an external source to the implant system logic. Alarm parameters can be programmed into the device which are capable of producing an audible or visual alarm signal. The utility of the monitor can be greatly expanded by using multiple pressure sensors simultaneously or by combining sensors of various physiological types.

  13. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues by building full-text search queries from combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engine to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
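The evaluation loop can be sketched with a toy scorer and a three-entry stand-in for the ICD-10 catalogue; both the scoring function and the code texts are invented, not the engines or codes actually tested:

```python
from itertools import combinations

# Toy catalogue standing in for ICD-10: short codes whose descriptions
# share words, so that some codes need more than one query word to win.
codes = {
    "C01": "chronic disease",
    "C02": "pulmonary disease",
    "C03": "chronic pulmonary disease",
}

def score(query_words, text):
    """Relative score: matched words divided by target length."""
    words = text.split()
    hits = sum(1 for w in query_words if w in words)
    return hits / len(words)

def min_words_to_match(code):
    """Smallest number of the code's own words that ranks it first."""
    words = codes[code].split()
    for k in range(1, len(words) + 1):
        for combo in combinations(words, k):
            ranked = max(codes, key=lambda c: score(combo, codes[c]))
            if ranked == code:
                return k
    return None

print(min_words_to_match("C01"))  # 1
print(min_words_to_match("C03"))  # 2
```

`C03` needs two words because each of its words taken alone scores higher against one of the shorter codes.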

  14. Sol-gel precursors and products thereof

    DOEpatents

    Warren, Scott C.; DiSalvo, Jr., Francis J.; Weisner, Ulrich B.

    2017-02-14

    The present invention provides a generalizable single-source sol-gel precursor capable of introducing a wide range of functionalities to metal oxides such as silica. The sol-gel precursor facilitates a one-molecule, one-step approach to the synthesis of metal-silica hybrids with combinations of biological, catalytic, magnetic, and optical functionalities. The single-source precursor also provides a flexible route for simultaneously incorporating functional species of many different types. The ligands employed for functionalizing the metal oxides are derived from a library of amino acids, hydroxy acids, or peptides and a silicon alkoxide, allowing many biological functionalities to be built into silica hybrids. The ligands can coordinate with a wide range of metals via a carboxylic acid, thereby allowing direct incorporation of inorganic functionalities from across the periodic table. Using the single-source precursor a wide range of functionalized nanostructures such as monolith structures, mesostructures, multiple metal gradient mesostructures and Stober-type nanoparticles can be synthesized.

  15. Eutrophication assessment and management methodology of multiple pollution sources of a landscape lake in North China.

    PubMed

    Chen, Yanxi; Niu, Zhiguang; Zhang, Hongwei

    2013-06-01

    Landscape lakes in cities face a high risk of eutrophication because of their special characteristics and functions in the urban water circulation system. Using HMLA, a landscape lake located in Tianjin City, North China, that receives a mixture of point source (PS) and non-point source (NPS) pollution, we explored a methodology combining Fluent and AQUATOX to simulate and predict the state of the lake, with a trophic index used to assess the eutrophication state. We then used water compensation optimization and three scenarios to determine the optimal management strategy: an ecological restoration scenario, a best management practices (BMPs) scenario, and a scenario combining both. Our results suggest that maintaining a healthy ecosystem through ecoremediation is necessary and that BMPs have a far-reaching effect on water reuse and NPS pollution control. This study has implications for eutrophication control and management under the ongoing urbanization in China.

  16. Meteorological and air pollution modeling for an urban airport

    NASA Technical Reports Server (NTRS)

    Swan, P. R.; Lee, I. Y.

    1980-01-01

    Results are presented of numerical experiments modeling meteorology, multiple pollutant sources, and nonlinear photochemical reactions for the case of an airport in a large urban area with complex terrain. A planetary boundary-layer model which predicts the mixing depth and generates wind, moisture, and temperature fields was used; it utilizes only surface and synoptic boundary conditions as input data. A version of the Hecht-Seinfeld-Dodge chemical kinetics model is integrated with a new, rapid numerical technique; both the San Francisco Bay Area Air Quality Management District source inventory and the San Jose Airport aircraft inventory are utilized. The air quality model results are presented in contour plots; the combined results illustrate that the highly nonlinear interactions which are present require that the chemistry and meteorology be considered simultaneously to make a valid assessment of the effects of individual sources on regional air quality.

  17. Standardized shrinking LORETA-FOCUSS (SSLOFO): a new algorithm for spatio-temporal EEG source reconstruction.

    PubMed

    Liu, Hesheng; Schimpf, Paul H; Dong, Guoya; Gao, Xiaorong; Yang, Fusheng; Gao, Shangkai

    2005-10-01

    This paper presents a new algorithm called Standardized Shrinking LORETA-FOCUSS (SSLOFO) for solving the electroencephalogram (EEG) inverse problem. Multiple techniques are combined in a single procedure to robustly reconstruct the underlying source distribution with high spatial resolution. This algorithm uses a recursive process which takes the smooth estimate of sLORETA as initialization and then employs the re-weighted minimum norm introduced by FOCUSS. An important technique called standardization is involved in the recursive process to enhance the localization ability. The algorithm is further improved by automatically adjusting the source space according to the estimate of the previous step, and by the inclusion of temporal information. Simulation studies are carried out on both spherical and realistic head models. The algorithm achieves very good localization ability on noise-free data. It is capable of recovering complex source configurations with arbitrary shapes and can produce high quality images of extended source distributions. We also characterized the performance with noisy data in a realistic head model. An important feature of this algorithm is that the temporal waveforms are clearly reconstructed, even for closely spaced sources. This provides a convenient way to estimate neural dynamics directly from the cortical sources.
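The re-weighted minimum norm recursion that SSLOFO borrows from FOCUSS can be sketched in a few lines of linear algebra; the lead field, source grid, and iteration count below are synthetic assumptions, and the sketch omits the standardization and source-space shrinking steps the paper adds:

```python
import numpy as np

# A smooth minimum norm estimate initializes the recursion; each iteration
# re-weights the minimum norm solution by the previous estimate, focusing
# energy onto a few sources while staying consistent with the data.
rng = np.random.default_rng(3)
L = rng.normal(size=(10, 40))            # synthetic lead field: 10 sensors, 40 sources
x_true = np.zeros(40)
x_true[[7, 31]] = [1.0, -0.8]            # two focal sources
b = L @ x_true                           # noise-free sensor measurements

x = np.linalg.pinv(L) @ b                # smooth initialization (minimum norm)
for _ in range(15):
    W = np.diag(np.abs(x))               # re-weight by the current estimate
    x = W @ np.linalg.pinv(L @ W) @ b    # weighted minimum norm update

print(np.sum(np.abs(x) > 0.05), float(np.linalg.norm(L @ x - b)))
```

The estimate remains an exact fit to the measurements while becoming far sparser than the initial minimum norm solution.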

  18. Comparison of two trajectory based models for locating particle sources for two rural New York sites

    NASA Astrophysics Data System (ADS)

    Zhou, Liming; Hopke, Philip K.; Liu, Wei

    Two back trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their ability to identify likely locations of the source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. QTBA attempts to take into account the distribution of concentrations around the directions of the back trajectories. The full QTBA approach also considers wet and dry deposition processes; simplified QTBA omits deposition and is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained from a previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six sources common to the two sites (sulfate, soil, zinc smelter, nitrate, wood smoke, and copper smelter) were analyzed. The results of the two methods are consistent and locate large, clearly defined sources well. The RTWC approach can find more minor sources, but it may also give unrealistic estimates of source locations.

  19. Groundwater vulnerability assessment for organic compounds: fuzzy multi-criteria approach for Mexico city.

    PubMed

    Mazari-Hiriart, Marisa; Cruz-Bello, Gustavo; Bojórquez-Tapia, Luis A; Juárez-Marusich, Lourdes; Alcantar-López, Georgina; Marín, Luis E; Soto-Galera, Ernesto

    2006-03-01

    This study was based on a groundwater vulnerability assessment approach implemented for the Mexico City Metropolitan Area (MCMA). The approach is based on a fuzzy multi-criteria procedure integrated in a geographic information system. The approach combined the potential contaminant sources with the permeability of geological materials. Initially, contaminant sources were ranked by experts through the Analytic Hierarchy Process. An aggregated contaminant sources map layer was obtained through the simple additive weighting method, using a scalar multiplication of criteria weights and binary maps showing the location of each source. A permeability map layer was obtained through the reclassification of a geology map using the respective hydraulic conductivity values, followed by a linear normalization of these values against a compatible scale. A fuzzy logic procedure was then applied to transform and combine the two map layers, resulting in a groundwater vulnerability map layer of five classes: very low, low, moderate, high, and very high. Results provided a more coherent assessment of the policy-making priorities considered when discussing the vulnerability of groundwater to organic compounds. The very high and high vulnerability areas covered a relatively small area (71 km² or 1.5% of the total study area), allowing the identification of the more critical locations. The advantage of a fuzzy logic procedure is that it enables the best possible use to be made of the information available regarding groundwater vulnerability in the MCMA.
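The two-layer combination can be sketched as follows; the weights, binary source map, permeability value, and the particular t-norm chosen for the fuzzy AND are illustrative assumptions, not the study's calibrated values:

```python
# Contaminant-source criteria are aggregated by simple additive weighting
# (SAW), then fused with normalized permeability and binned into the five
# vulnerability classes.
def saw(weights, values):
    """Simple additive weighting: dot product of criteria weights and values."""
    return sum(w * v for w, v in zip(weights, values))

CLASSES = ("very low", "low", "moderate", "high", "very high")

def vulnerability(source_score, permeability):
    combined = min(source_score, permeability)   # a common fuzzy AND (t-norm)
    idx = min(int(combined * len(CLASSES)), len(CLASSES) - 1)
    return CLASSES[idx]

weights = [0.5, 0.3, 0.2]            # hypothetical expert weights (e.g. from AHP)
cell_sources = [1, 0, 1]             # binary presence of each source type in a cell
score = saw(weights, cell_sources)   # 0.7
print(vulnerability(score, permeability=0.9))  # high
```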

  20. Multi-Source Autonomous Response for Targeting and Monitoring of Volcanic Activity

    NASA Technical Reports Server (NTRS)

    Davies, Ashley G.; Doubleday, Joshua R.; Tran, Daniel Q.

    2014-01-01

    The study of volcanoes is important for both purely scientific and human survival reasons. From a scientific standpoint, volcanic gas and ash emissions contribute significantly to the terrestrial atmosphere. Ash depositions and lava flows can also greatly affect local environments. From a human survival standpoint, many people live within the reach of active volcanoes, and therefore can be endangered by both atmospheric (ash, debris) toxicity and lava flow. There are many potential information sources that can be used to determine how to best monitor volcanic activity worldwide. These are of varying temporal frequency, spatial regard, method of access, and reliability. The problem is how to incorporate all of these inputs in a general framework to assign/task/reconfigure assets to monitor events in a timely fashion. In situ sensing can provide a valuable range of complementary information such as seismographic, discharge, acoustic, and other data. However, many volcanoes are not instrumented with in situ sensors, and those that have sensor networks are restricted to a relatively small numbers of point sensors. Consequently, ideal volcanic study synergistically combines space and in situ measurements. This work demonstrates an effort to integrate spaceborne sensing from MODIS (Terra and Aqua), ALI (EO-1), Worldview-2, and in situ sensing in an automated scheme to improve global volcano monitoring. Specifically, it is a "sensor web" concept in which a number of volcano monitoring systems are linked together to monitor volcanic activity more accurately, and this activity measurement automatically tasks space assets to acquire further satellite imagery of ongoing volcanic activity. 
A general framework was developed for evidence combination that accounts for multiple information sources in a scientist-directed fashion, weighing inputs and allocating observations based on the confidence of an event's occurrence, the rarity of the event at that location, and other scientists' inputs. The software framework uses multiple source languages and provides a general mechanism for combining inputs and incrementally submitting observation requests/reconfigurations while accounting for prior requests. The autonomous aspect of operations is unique, especially given the wide range of inputs, which includes manually entered electronic reports (such as the Air Force Weather Advisories), automated satellite-based detection methods (such as MODVOLC and GOESVOLC), and in situ sensor networks.

  1. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to determine the current at each coil in the two-coil experiment. Based on the results, we could establish the feasibility of multiple ICP sources, owing to the direct change of impedance with current and the saturation of impedance due to the skin-depth effect. However, a helicon plasma source is difficult to adapt to multiple sources because of the continual change of real impedance during mode transitions and the low uniformity of the B-field confinement. As a result, ICP is expected to be adaptable to multiple sources for large-area processes.

  2. An Informatics Approach to Evaluating Combined Chemical Exposures from Consumer Products: A Case Study of Asthma-Associated Chemicals and Potential Endocrine Disruptors

    PubMed Central

    Gabb, Henry A.; Blake, Catherine

    2016-01-01

    Background: Simultaneous or sequential exposure to multiple environmental stressors can affect chemical toxicity. Cumulative risk assessments consider multiple stressors but it is impractical to test every chemical combination to which people are exposed. New methods are needed to prioritize chemical combinations based on their prevalence and possible health impacts. Objectives: We introduce an informatics approach that uses publicly available data to identify chemicals that co-occur in consumer products, which account for a significant proportion of overall chemical load. Methods: Fifty-five asthma-associated and endocrine disrupting chemicals (target chemicals) were selected. A database of 38,975 distinct consumer products and 32,231 distinct ingredient names was created from online sources, and PubChem and the Unified Medical Language System were used to resolve synonymous ingredient names. Synonymous ingredient names are different names for the same chemical (e.g., vitamin E and tocopherol). Results: Nearly one-third of the products (11,688 products, 30%) contained ≥ 1 target chemical and 5,229 products (13%) contained > 1. Of the 55 target chemicals, 31 (56%) appear in ≥ 1 product and 19 (35%) appear under more than one name. The most frequent three-way chemical combination (2-phenoxyethanol, methyl paraben, and ethyl paraben) appears in 1,059 products. Further work is needed to assess combined chemical exposures related to the use of multiple products. Conclusions: The informatics approach increased the number of products considered in a traditional analysis by two orders of magnitude, but missing/incomplete product labels can limit the effectiveness of this approach. Such an approach must resolve synonymy to ensure that chemicals of interest are not missed. Commonly occurring chemical combinations can be used to prioritize cumulative toxicology risk assessments. Citation: Gabb HA, Blake C. 2016. 
An informatics approach to evaluating combined chemical exposures from consumer products: a case study of asthma-associated chemicals and potential endocrine disruptors. Environ Health Perspect 124:1155–1165; http://dx.doi.org/10.1289/ehp.1510529 PMID:26955064
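
    The product-level co-occurrence counting behind results like the 2-phenoxyethanol / methyl paraben / ethyl paraben figure reduces to counting k-way combinations of target chemicals per product. A minimal sketch with hypothetical toy data, not the authors' pipeline; it assumes ingredient names have already been synonym-resolved (e.g., via PubChem):

```python
from itertools import combinations
from collections import Counter

def top_chemical_combinations(products, targets, k=3):
    """Count how often each k-way combination of target chemicals
    co-occurs within a single product's ingredient list."""
    counts = Counter()
    for ingredients in products:
        present = sorted(set(ingredients) & targets)
        for combo in combinations(present, k):
            counts[combo] += 1
    return counts.most_common()

# Hypothetical toy data: each product is a list of (already
# synonym-resolved) ingredient names.
products = [
    ["2-phenoxyethanol", "methyl paraben", "ethyl paraben", "water"],
    ["2-phenoxyethanol", "methyl paraben", "ethyl paraben"],
    ["methyl paraben", "fragrance"],
]
targets = {"2-phenoxyethanol", "methyl paraben", "ethyl paraben"}
print(top_chemical_combinations(products, targets, k=3))
# → [(('2-phenoxyethanol', 'ethyl paraben', 'methyl paraben'), 2)]
```

    Sorting the present chemicals before forming combinations ensures that the same chemical set always maps to one canonical key, regardless of ingredient-list order.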

  3. Optimized Biasing of Pump Laser Diodes in a Highly Reliable Metrology Source for Long-Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Poberezhskiy, Ilya Y.; Chang, Daniel H.; Erlig, Herman

    2011-01-01

    Optical metrology system reliability during a prolonged space mission is often limited by the reliability of pump laser diodes. We developed a metrology laser pump module architecture that meets NASA SIM Lite instrument optical power and reliability requirements by combining the outputs of multiple single-mode pump diodes in a low-loss, high port count fiber coupler. We describe Monte-Carlo simulations used to calculate the reliability of the laser pump module and introduce a combined laser farm aging parameter that serves as a load-sharing optimization metric. Employing these tools, we select pump module architecture, operating conditions, biasing approach and perform parameter sensitivity studies to investigate the robustness of the obtained solution.

  4. Aspiring to Spectral Ignorance in Earth Observation

    NASA Astrophysics Data System (ADS)

    Oliver, S. A.

    2016-12-01

    Enabling robust, defensible and integrated decision making in the Era of Big Earth Data requires the fusion of data from multiple and diverse sensor platforms and networks. While the application of standardised global grid systems provides a common spatial analytics framework that facilitates the computationally efficient and statistically valid integration and analysis of these various data sources across multiple scales, there remains the challenge of sensor equivalency, particularly when combining data from different earth observation satellite sensors (e.g. combining Landsat and Sentinel-2 observations). To realise the vision of a sensor-ignorant analytics platform for earth observation, we require automation of spectral matching across the available sensors. Ultimately, the aim is to remove the requirement for the user to possess any sensor knowledge in order to undertake analysis. This paper introduces the concept of spectral equivalence and proposes a methodology through which equivalent bands may be sourced from a set of potential target sensors through application of equivalence metrics and thresholds. A number of parameters can be used to determine whether a pair of spectra are equivalent for the purposes of analysis. A baseline set of thresholds for these parameters, and a systematic way to apply them to relate spectral bands amongst numerous different sensors, is proposed. The base unit for comparison in this work is the relative spectral response. From this input, users can determine what constitutes equivalence based on their own conceptualisation of equivalence.

  5. Evaluation of Electroencephalography Source Localization Algorithms with Multiple Cortical Sources.

    PubMed

    Bradley, Allison; Yao, Jun; Dewald, Jules; Richter, Claus-Peter

    2016-01-01

    Source localization algorithms often show multiple active cortical areas as the source of electroencephalography (EEG). Yet, there is little data quantifying the accuracy of these results. In this paper, the performance of current source density source localization algorithms for the detection of multiple cortical sources of EEG data has been characterized. EEG data were generated by simulating multiple cortical sources (2-4) with the same strength or two sources with relative strength ratios of 1:1 to 4:1, and adding noise. These data were used to reconstruct the cortical sources using current source density (CSD) algorithms: sLORETA, MNLS, and LORETA using a p-norm with p equal to 1, 1.5 and 2. Precision (percentage of the reconstructed activity corresponding to simulated activity) and Recall (percentage of the simulated sources reconstructed) of each of the CSD algorithms were calculated. While sLORETA has the best performance when only one source is present, when two or more sources are present LORETA with p equal to 1.5 performs better. When the relative strength of one of the sources is decreased, all algorithms have more difficulty reconstructing that source. However, LORETA 1.5 continues to outperform other algorithms. If only the strongest source is of interest sLORETA is recommended, while LORETA with p equal to 1.5 is recommended if two or more of the cortical sources are of interest. These results provide guidance for choosing a CSD algorithm to locate multiple cortical sources of EEG and for interpreting the results of these algorithms.

  6. Evaluation of Electroencephalography Source Localization Algorithms with Multiple Cortical Sources

    PubMed Central

    Bradley, Allison; Yao, Jun; Dewald, Jules; Richter, Claus-Peter

    2016-01-01

    Background Source localization algorithms often show multiple active cortical areas as the source of electroencephalography (EEG). Yet, there is little data quantifying the accuracy of these results. In this paper, the performance of current source density source localization algorithms for the detection of multiple cortical sources of EEG data has been characterized. Methods EEG data were generated by simulating multiple cortical sources (2–4) with the same strength or two sources with relative strength ratios of 1:1 to 4:1, and adding noise. These data were used to reconstruct the cortical sources using current source density (CSD) algorithms: sLORETA, MNLS, and LORETA using a p-norm with p equal to 1, 1.5 and 2. Precision (percentage of the reconstructed activity corresponding to simulated activity) and Recall (percentage of the simulated sources reconstructed) of each of the CSD algorithms were calculated. Results While sLORETA has the best performance when only one source is present, when two or more sources are present LORETA with p equal to 1.5 performs better. When the relative strength of one of the sources is decreased, all algorithms have more difficulty reconstructing that source. However, LORETA 1.5 continues to outperform other algorithms. If only the strongest source is of interest sLORETA is recommended, while LORETA with p equal to 1.5 is recommended if two or more of the cortical sources are of interest. These results provide guidance for choosing a CSD algorithm to locate multiple cortical sources of EEG and for interpreting the results of these algorithms. PMID:26809000
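
    The Precision and Recall measures described in this record reduce to simple set arithmetic over active source locations. A minimal sketch with hypothetical vertex indices; the actual study scores reconstructed current-density maps rather than bare sets:

```python
def precision_recall(simulated, reconstructed):
    """Precision: fraction of reconstructed activity that corresponds to
    simulated activity.  Recall: fraction of simulated sources recovered.
    Locations are hashable identifiers (e.g., cortical vertex indices)."""
    simulated, reconstructed = set(simulated), set(reconstructed)
    true_pos = simulated & reconstructed
    precision = len(true_pos) / len(reconstructed) if reconstructed else 0.0
    recall = len(true_pos) / len(simulated) if simulated else 0.0
    return precision, recall

# Hypothetical example: 3 simulated sources, 4 reconstructed locations,
# 2 of which coincide with simulated sources.
p, r = precision_recall({10, 42, 99}, {10, 42, 7, 55})
print(round(p, 2), round(r, 2))  # → 0.5 0.67
```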

  7. Thermoregulatory behavior and orientation preference in bearded dragons.

    PubMed

    Black, Ian R G; Tattersall, Glenn J

    2017-10-01

    The regulation of body temperature is a critical function for animals. Although reliant on ambient temperature as a heat source, reptiles, and especially lizards, make use of multiple voluntary and involuntary behaviors to thermoregulate, including postural changes in body orientation, either toward or away from solar sources of heat. This thermal orientation may also result from a thermoregulatory drive to maintain precise control over cranial temperatures or a rostrally-driven sensory bias. The purpose of this work was to examine thermal orientation behavior in adult and neonatal bearded dragons (Pogona vitticeps), to ascertain its prevalence across different life stages within a laboratory situation and its interaction with behavioral thermoregulation. Both adult and neonatal bearded dragons were placed in a thermal gradient and allowed to voluntarily select temperatures for up to 8h to observe the presence and development of a thermoregulatory orientation preference. Both adult and neonatal dragons displayed a non-random orientation, preferring to face toward a heat source while achieving mean thermal preferences of ~ 33-34°C. Specifically, adult dragons were more likely to face a heat source when at cooler ambient temperatures and less likely at warmer temperatures, suggesting that orientation behavior counter-balances local selected temperatures but contributes to their thermoregulatory response. Neonates were also more likely to select cooler temperatures when facing a heat source, but required more experience before this orientation behavior emerged. Combined, these results demonstrate the importance of orientation to behavioral thermoregulation in multiple life stages of bearded dragons. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Combined use of a new SNP-based assay and multilocus SSR markers to assess genetic diversity of Xylella fastidiosa subsp. pauca infecting citrus and coffee plants.

    PubMed

    Montes-Borrego, Miguel; Lopes, Joao R S; Jiménez-Díaz, Rafael M; Landa, Blanca B

    2015-03-01

    Two haplotypes of Xylella fastidiosa subsp. pauca (Xfp) that correlated with their host of origin were identified in a collection of 90 isolates infecting citrus and coffee plants in Brazil, based on a single-nucleotide polymorphism in the gyrB sequence. A new single-nucleotide primer extension (SNuPE) protocol was designed for rapid identification of Xfp according to the host source. The protocol proved to be robust for the prediction of the Xfp host source in blind tests using DNA from cultures of the bacterium, infected plants, and insect vectors allowed to feed on Xfp-infected citrus plants. AMOVA and STRUCTURE analyses of microsatellite data separated most Xfp populations on the basis of their host source, indicating that they were genetically distinct. The combined use of the SNaPshot protocol and three previously developed multilocus SSR markers showed that two haplotypes and distinct isolates of Xfp infect citrus and coffee in Brazil and that multiple, genetically different isolates can be present in a single orchard or infect a single tree. This combined approach will be very useful in studies of the epidemiology of Xfp-induced diseases, host specificity of bacterial genotypes, the occurrence of Xfp host jumping, vector feeding habits, etc., in economically important cultivated plants or weed host reservoirs of Xfp in Brazil and elsewhere. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.

  9. A Multiple-star Combined Solution Program - Application to the Population II Binary μ Cas

    NASA Astrophysics Data System (ADS)

    Gudehus, D. H.

    2001-05-01

    A multiple-star combined-solution computer program which can simultaneously fit astrometric, speckle, and spectroscopic data, and solve for the orbital parameters, parallax, proper motion, and masses has been written and is now publicly available. Some features of the program are the ability to scale the weights at run time, hold selected parameters constant, handle up to five spectroscopic subcomponents for the primary and the secondary each, account for the light travel time across the system, account for apsidal motion, plot the results, and write the residuals in position to a standard file for further analysis. The spectroscopic subcomponent data can be represented by reflex velocities and/or by independent measurements. A companion editing program which can manage the data files is included in the package. The program has been applied to the Population II binary μ Cas to derive improved masses and an estimate of the primordial helium abundance. The source code, executables, sample data files, and documentation for OpenVMS and Unix, including Linux, are available at http://www.chara.gsu.edu/~gudehus/binary.html.

  10. Application of a color scanner for 60Co high dose rate brachytherapy dosimetry with EBT radiochromic film

    PubMed Central

    Ghorbani, Mahdi; Toossi, Mohammad Taghi Bahreyni; Mowlavi, Ali Asghar; Roodi, Shahram Bayani; Meigooni, Ali Soleimani

    2012-01-01

    Background. The aim of this study is to evaluate the performance of a color scanner as a radiochromic film reader in two dimensional dosimetry around a high dose rate brachytherapy source. Materials and methods. A Microtek ScanMaker 1000XL film scanner was utilized for the measurement of dose distribution around a high dose rate GZP6 60Co brachytherapy source with GafChromic® EBT radiochromic films. In these investigations, the combined non-uniformity of the film and scanner response, as well as the film's sensitivity to the scanner's light source, was evaluated using multiple samples of films prior to the source dosimetry. The results of these measurements were compared with the Monte Carlo simulated data using the MCNPX code. In addition, isodose curves acquired by radiochromic films and Monte Carlo simulation were compared with those provided by the GZP6 treatment planning system. Results. Scanning of samples of uniformly irradiated films demonstrated approximately 2.85% and 4.97% nonuniformity of the response in the longitudinal and transverse directions of the film, respectively. Our findings also indicated that the film response is not affected by exposure to the scanner's light source, particularly in multiple scanning of film. The results of radiochromic film measurements are in good agreement with the Monte Carlo calculations (4%) and the corresponding dose values presented by the GZP6 treatment planning system (5%). Conclusions. The results of these investigations indicate that the Microtek ScanMaker 1000XL color scanner in conjunction with GafChromic EBT film is a reliable system for dosimetric evaluation of a high dose rate brachytherapy source. PMID:23411947

  11. Toward multimodal signal detection of adverse drug reactions.

    PubMed

    Harpaz, Rave; DuMouchel, William; Schuemie, Martijn; Bodenreider, Olivier; Friedman, Carol; Horvitz, Eric; Ripple, Anna; Sorbello, Alfred; White, Ryen W; Winnenburg, Rainer; Shah, Nigam H

    2017-12-01

    Improving mechanisms to detect adverse drug reactions (ADRs) is key to strengthening post-marketing drug safety surveillance. Signal detection is presently unimodal, relying on a single information source. Multimodal signal detection is based on jointly analyzing multiple information sources. Building on, and expanding, the work done in prior studies, the aim of this article is to further research on multimodal signal detection, explore its potential benefits, and propose methods for its construction and evaluation. Four data sources are investigated: FDA's adverse event reporting system, insurance claims, the MEDLINE citation database, and the logs of major Web search engines. Published methods are used to generate and combine signals from each data source. Two distinct reference benchmarks, corresponding to well-established and recently labeled ADRs respectively, are used to evaluate the performance of multimodal signal detection in terms of area under the ROC curve (AUC) and lead-time-to-detection, with the latter relative to labeling revision dates. Limited to our reference benchmarks, multimodal signal detection provides AUC improvements ranging from 0.04 to 0.09 based on a widely used evaluation benchmark, and a comparative added lead time of 7-22 months relative to labeling revision dates from a time-indexed benchmark. The results support the notion that utilizing and jointly analyzing multiple data sources may lead to improved signal detection. Given certain data and benchmark limitations, the early stage of development, and the complexity of ADRs, it is currently not possible to make definitive statements about the ultimate utility of the concept. Continued development of multimodal signal detection requires a deeper understanding of the data sources used, additional benchmarks, and further research on methods to generate and synthesize signals. Copyright © 2017 Elsevier Inc. All rights reserved.
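
    As a toy illustration of the evaluation described above, the sketch below averages pre-normalized signal scores from two hypothetical sources and scores the combination with AUC via the rank-sum identity. The data and the equal-weight averaging are illustrative assumptions, not the paper's published methods:

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def combined_score(per_source_scores):
    """Naive multimodal combination: average the pre-normalized signal
    scores available for a drug-event pair across sources."""
    return sum(per_source_scores) / len(per_source_scores)

# Hypothetical drug-event pairs: per-source scores, label 1 = known ADR.
pairs = [([0.9, 0.7], 1), ([0.2, 0.3], 1), ([0.3, 0.1], 0), ([0.6, 0.2], 0)]
combined = [combined_score(s) for s, y in pairs]
labels = [y for s, y in pairs]
print(auc(combined, labels))  # → 0.75
```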

  12. Aircraft and background noise annoyance effects

    NASA Technical Reports Server (NTRS)

    Willshire, K. F.

    1984-01-01

    To investigate annoyance of multiple noise sources, two experiments were conducted. The first experiment, which used 48 subjects, was designed to establish annoyance-noise level functions for three community noise sources presented individually: jet aircraft flyovers, air conditioner, and traffic. The second experiment, which used 216 subjects, investigated the effects of background noise on aircraft annoyance as a function of noise level and spectrum shape; and the differences between overall, aircraft, and background noise annoyance. In both experiments, rated annoyance was the dependent measure. Results indicate that the slope of the linear relationship between annoyance and noise level for traffic is significantly different from that of flyover and air conditioner noise and that further research was justified to determine the influence of the two background noises on overall, aircraft, and background noise annoyance (e.g., experiment two). In experiment two, total noise exposure, signal-to-noise ratio, and background source type were found to have effects on all three types of annoyance. Thus, both signal-to-noise ratio, and the background source must be considered when trying to determine community response to combined noise sources.

  13. Uncovering the Protostars in Serpens South with ALMA: Continuum Sources and Their Outflow Activity

    NASA Astrophysics Data System (ADS)

    Plunkett, Adele; Arce, H.; Corder, S.; Dunham, M.

    2017-06-01

    Serpens South is an appealing protostellar cluster to study due to the combination of several factors: (1) a high protostar fraction that shows evidence for very recent and ongoing star formation; (2) iconic clustered star formation along a filamentary structure; (3) its relative proximity within a few hundred parsecs. An effective study requires the sensitivity, angular and spectral resolution, and mapping capabilities recently provided with ALMA. Here we present a multi-faceted data set acquired from Cycles 1 through 3 with ALMA, including maps of continuum sources and molecular outflows throughout the region, as well as a more focused kinematical study of the protostar that is the strongest continuum source at the cluster center. Together these data span spatial scales over several orders of magnitude, allowing us to investigate the outflow-driving sources and the impact of the outflows on the cluster environment. Currently, we focus on the census of protostars in the cluster center, numbering about 20, including low-flux, low-mass sources never before detected in mm-wavelengths and evidence for multiplicity that was previously unresolved.

  14. Pollutant source identification model for water pollution incidents in small straight rivers based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Shou-ping; Xin, Xiao-kang

    2017-07-01

    Identification of pollutant sources for river pollution incidents is an important and difficult task in emergency rescue, and an intelligent optimization method can effectively compensate for the weakness of traditional methods. An intelligent model for pollutant source identification has been established using the basic genetic algorithm (BGA) as an optimization search tool and applying an analytic solution formula of the one-dimensional unsteady water quality equation to construct the objective function. Experimental tests show that the identification model is effective and efficient: the model can accurately figure out the pollutant amounts or positions, whether for a single pollution source or multiple sources. Especially when the population size of the BGA is set to 10, the computed results agree soundly with the analytic results for single-source amount and position identification, with relative errors of no more than 5 %. For cases of multi-point sources and multiple variables, there are some errors in the computed results because many possible combinations of the pollution sources exist. But, with the help of previous experience to narrow the search scope, the relative errors of the identification results are less than 5 %, which proves the established source identification model can be used to direct emergency responses.

  15. Hearing Scenes: A Neuromagnetic Signature of Auditory Source and Reverberant Space Separation

    PubMed Central

    Oliva, Aude

    2017-01-01

    Perceiving the geometry of surrounding space is a multisensory process, crucial to contextualizing object perception and guiding navigation behavior. Humans can make judgments about surrounding spaces from reverberation cues, caused by sounds reflecting off multiple interior surfaces. However, it remains unclear how the brain represents reverberant spaces separately from sound sources. Here, we report separable neural signatures of auditory space and source perception during magnetoencephalography (MEG) recording as subjects listened to brief sounds convolved with monaural room impulse responses (RIRs). The decoding signature of sound sources began at 57 ms after stimulus onset and peaked at 130 ms, while space decoding started at 138 ms and peaked at 386 ms. Importantly, these neuromagnetic responses were readily dissociable in form and time: while sound source decoding exhibited an early and transient response, the neural signature of space was sustained and independent of the original source that produced it. The reverberant space response was robust to variations in sound source, and vice versa, indicating a generalized response not tied to specific source-space combinations. These results provide the first neuromagnetic evidence for robust, dissociable auditory source and reverberant space representations in the human brain and reveal the temporal dynamics of how auditory scene analysis extracts percepts from complex naturalistic auditory signals. PMID:28451630

  16. Language Model Combination and Adaptation Using Weighted Finite State Transducers

    NASA Technical Reports Server (NTRS)

    Liu, X.; Gales, M. J. F.; Hieronymus, J. L.; Woodland, P. C.

    2010-01-01

    In speech recognition systems, language models (LMs) are often constructed by training and combining multiple n-gram models. They can be used either to represent different genres or tasks found in diverse text sources, or to capture stochastic properties of different linguistic symbol sequences, for example, syllables and words. Unsupervised LM adaptation may also be used to further improve robustness to varying styles or tasks. When using these techniques, extensive software changes are often required. In this paper an alternative and more general approach based on weighted finite state transducers (WFSTs) is investigated for LM combination and adaptation. As it is entirely based on well-defined WFST operations, minimal change to decoding tools is needed. A wide range of LM combination configurations can be flexibly supported. An efficient on-the-fly WFST decoding algorithm is also proposed. Significant error rate gains of 7.3% relative were obtained on a state-of-the-art broadcast audio recognition task using a history-dependently adapted multi-level LM modelling both syllable and word sequences.
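
    Leaving the WFST machinery aside, the underlying idea of combining multiple n-gram models can be illustrated with plain linear interpolation, a standard LM-combination technique; the toy probability tables below are hypothetical:

```python
def interpolate(models, weights):
    """Linear interpolation of probability tables from several component
    LMs: P(w) = sum_i weight_i * P_i(w), with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    vocab = set().union(*models)
    return {w: sum(wt * m.get(w, 0.0) for wt, m in zip(weights, models))
            for w in vocab}

# Hypothetical unigram tables trained on two text genres.
news  = {"market": 0.6, "game": 0.1, "the": 0.3}
sport = {"market": 0.1, "game": 0.6, "the": 0.3}
mixed = interpolate([news, sport], [0.7, 0.3])
print(round(mixed["market"], 2))  # → 0.45
```

    The WFST formulation in the paper expresses this kind of combination as operations on transducers, so the decoder itself needs no LM-specific changes.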

  17. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  18. Reaction schemes visualized in network form: the syntheses of strychnine as an example.

    PubMed

    Proudfoot, John R

    2013-05-24

    Representation of synthesis sequences in network form provides an effective method for the comparison of multiple reaction schemes and an opportunity to emphasize features, such as reaction scale, that are often relegated to experimental sections. An example of data formatting that allows construction of network maps in Cytoscape is presented, along with maps that illustrate the comparison of multiple reaction sequences, comparison of scaffold changes within sequences, and consolidation to highlight common key intermediates used across sequences. The 17 different synthetic routes reported for strychnine are used as an example basis set. The reaction maps presented required significant data extraction and curation; a standardized tabular format for reporting reaction information, if applied consistently, could allow the automated combination of reaction information across different sources.

  19. Body-Vortex Interaction, Sound Generation and Destructive Interference

    NASA Technical Reports Server (NTRS)

    Kao, Hsiao C.

    2000-01-01

    It is generally recognized that interaction of vortices with downstream blades is a major source of noise production. To analyze this problem numerically, a two-dimensional model of inviscid flow together with the method of matched asymptotic expansions is proposed. The method of matched asymptotic expansions is used to match the inner region of incompressible flow to the outer region of compressible flow. Because of incompressibility, relatively simple numerical methods are available to treat multiple vortices and multiple bodies of arbitrary shape. Disturbances from vortices and bodies propagate outward as sound waves. Due to their interactions, either constructive or destructive interference may result. When it is destructive, the combined sound intensity can be reduced, sometimes substantially. In addition, an analytical solution to sound generation by the cascade-vortex interaction is given.

  20. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the World, which necessitate scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered as a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectra analysis, parameters retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  1. Organometallic chemical vapor deposition and characterization of ZnGeP2/GaP multiple heterostructures on GaP substrates

    NASA Technical Reports Server (NTRS)

    Xing, G. C.; Bachmann, Klaus J.

    1993-01-01

    The growth of ZnGeP2/GaP double and multiple heterostructures on GaP substrates by organometallic chemical vapor deposition is reported. These epitaxial films were deposited at a temperature of 580 C using dimethylzinc, trimethylgallium, germane, and phosphine as source gases. With appropriate deposition conditions, mirror-smooth epitaxial GaP/ZnGeP2 multiple heterostructures were obtained on (001) GaP substrates. Transmission electron microscopy (TEM) and secondary ion mass spectroscopy (SIMS) studies of the films showed that the interfaces are sharp and smooth. An etching study of the films showed a dislocation density on the order of 5x10^4 cm^-2. The growth rates of the GaP layers depend linearly on the flow rates of trimethylgallium. While the GaP layers crystallize in the zinc-blende structure, the ZnGeP2 layers crystallize in the chalcopyrite structure, as determined by the (010) electron diffraction pattern. This is the first time that multiple heterostructures combining these two crystal structures have been made.

  2. Intuitive theories of information: beliefs about the value of redundancy.

    PubMed

    Soll, J B

    1999-03-01

    In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
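
    The normative point, that correlated (redundant) sources reduce the benefit of averaging, can be checked by simulation: for two unbiased estimates with error SD σ and error correlation ρ, the error of their average has SD σ·sqrt((1 + ρ)/2). A quick Monte Carlo sketch, not from the paper:

```python
import math
import random
import statistics

def averaged_error_sd(rho, sigma=1.0, n=100_000, seed=1):
    """Simulate two unbiased estimates whose errors share correlation rho
    and return the standard deviation of the error of their average."""
    rng = random.Random(seed)
    errs = []
    for _ in range(n):
        shared = rng.gauss(0, 1)  # common error component (bias-like)
        e1 = sigma * (math.sqrt(rho) * shared + math.sqrt(1 - rho) * rng.gauss(0, 1))
        e2 = sigma * (math.sqrt(rho) * shared + math.sqrt(1 - rho) * rng.gauss(0, 1))
        errs.append((e1 + e2) / 2)
    return statistics.pstdev(errs)

# Independent sources beat redundant ones: theory gives
# sqrt(0.5) ≈ 0.707 for rho = 0 and sqrt(0.9) ≈ 0.949 for rho = 0.8.
print(round(averaged_error_sd(0.0), 2))  # ≈ 0.71
print(round(averaged_error_sd(0.8), 2))  # ≈ 0.95
```

    The construction gives each estimate unit error variance and pairwise correlation rho, so the simulated SDs should track the sqrt((1 + ρ)/2) prediction.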

  3. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper aims at a geographic data integration and sharing method for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI-source cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we can construct relationship links among geographic features distributed across diverse VGI platforms by using linked data modeling methods, then deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unique data representation model among different online social geographic data sources. We propose a mixed strategy that combines spatial-distance similarity and feature-name attribute similarity as the measure for comparing and matching geographic features across various VGI data sets. Our work focuses on how to apply Markov logic networks to achieve interlinks of the same linked data in different VGI-based linked data sets; the automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network spanning loosely-coupled VGI web sites. The experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
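
    The mixed matching strategy, combining spatial-distance similarity with feature-name similarity, can be sketched as a single weighted score. The weighting, distance cutoff, and string measure below are illustrative assumptions rather than the paper's calibrated model:

```python
import math
from difflib import SequenceMatcher

def match_score(feat_a, feat_b, max_dist_m=100.0, w_space=0.5):
    """Mixed matching strategy: a weighted combination of spatial-distance
    similarity and feature-name similarity, both scaled to [0, 1]."""
    (lat1, lon1, name1), (lat2, lon2, name2) = feat_a, feat_b
    # Equirectangular approximation; adequate at typical matching distances.
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    dist_m = 6_371_000 * math.hypot(dx, dy)
    spatial_sim = max(0.0, 1.0 - dist_m / max_dist_m)
    name_sim = SequenceMatcher(None, name1.lower(), name2.lower()).ratio()
    return w_space * spatial_sim + (1 - w_space) * name_sim

# Hypothetical features from two VGI sources describing the same cafe.
a = (48.85837, 2.29448, "Cafe de la Tour")
b = (48.85841, 2.29452, "Café de la Tour")
print(match_score(a, b) > 0.8)  # → True
```

    Thresholding such a score yields candidate co-reference pairs; in the paper these candidate interlinks are then refined jointly with Markov logic networks rather than accepted independently.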

  4. Human Health Risk Implications of Multiple Sources of Faecal Indicator Bacteria in a Recreational Waterbody

    EPA Science Inventory

    We evaluate the influence of multiple sources of faecal indicator bacteria in recreational water bodies on potential human health risk by considering waters impacted by human and animal sources, human and non-pathogenic sources, and animal and non-pathogenic sources. We illustrat...

  5. Incomplete Multisource Transfer Learning.

    PubMed

    Ding, Zhengming; Shao, Ming; Fu, Yun

    2018-02-01

    Transfer learning is generally exploited to adapt well-established source knowledge to learning tasks in a weakly labeled or unlabeled target domain. Nowadays, multiple sources are commonly available for knowledge transfer, yet each may not include complete class information for the target domain. Naively merging multiple sources leads to inferior results because of the large divergence among them. In this paper, we utilize incomplete multiple sources for effective knowledge transfer to facilitate the learning task in the target domain. To this end, we propose incomplete multisource transfer learning with two directions of knowledge transfer: cross-domain transfer, from each source to the target, and cross-source transfer. In the cross-domain direction, we deploy latent low-rank transfer learning, guided by iterative structure learning, to transfer knowledge from each single source to the target domain; this compensates for missing data in each source using the complete target data. In the cross-source direction, an unsupervised manifold regularizer and effective multisource alignment jointly compensate for data missing from one source but present in another. In this way, both marginal and conditional distribution discrepancies in the two directions are mitigated. Experimental results on standard cross-domain benchmarks and synthetic data sets demonstrate the effectiveness of the proposed model in knowledge transfer from incomplete multiple sources.

  6. Legacy of contaminant N sources to the NO3− signature in rivers: a combined isotopic (δ15N-NO3−, δ18O-NO3−, δ11B) and microbiological investigation

    PubMed Central

    Briand, Cyrielle; Sebilo, Mathieu; Louvat, Pascale; Chesnot, Thierry; Vaury, Véronique; Schneider, Maude; Plagnes, Valérie

    2017-01-01

    Nitrate content of surface waters results from complex mixing of multiple sources, whose signatures can be modified through N reactions occurring within the different compartments of the catchment. Despite this complexity, determining the origin of nitrate is the first and crucial step for water resource preservation. Here, for the first time, we combined stable isotopic tracers (δ15N and δ18O of nitrate and δ11B) and fecal indicators at the catchment scale to trace nitrate sources and pathways to the stream. We tested this approach on two rivers in an agricultural region of SW France. Boron isotopic ratios evidenced inflows of anthropogenic waters, while microbiological markers revealed organic contamination from both human and animal wastes. Nitrate δ15N and δ18O traced inputs from surface leaching during high flow events and from subsurface drainage under base flow. They also showed that denitrification occurred within the soils before nitrate reached the rivers. Furthermore, this study highlighted the determining role of the soil compartment in nitrate formation and recycling, with important spatial heterogeneity and temporal variability. PMID:28150819

  7. Measurements of Carbon Dioxide, Methane, and Other Related Tracers at High Spatial and Temporal Resolution in an Urban Environment

    NASA Astrophysics Data System (ADS)

    Yasuhara, Scott; Forgeron, Jeff; Rella, Chris; Franz, Patrick; Jacobson, Gloria; Chiao, Sen; Saad, Nabil

    2013-04-01

    The ability to quantify sources and sinks of carbon dioxide and methane on the urban scale is essential for understanding the atmospheric drivers of global climate change. In the 'top-down' approach, overall carbon fluxes are determined by combining remote measurements of carbon dioxide concentrations with complex atmospheric transport models, and these emissions measurements are compared to 'bottom-up' predictions based on detailed inventories of the sources and sinks of carbon, both anthropogenic and biogenic in nature. This approach, which has proven effective at continental scales, becomes challenging to implement at urban scales, due to poorly understood atmospheric transport models and high variability of the emissions sources in space (e.g., factories, highways, green spaces) and time (rush hours, factory shifts and shutdowns, and diurnal and seasonal variation in residential energy use). New measurement and analysis techniques are required to make sense of the carbon dioxide signal in cities. Here we present detailed, high spatial- and temporal-resolution greenhouse gas measurements made by multiple Picarro CRDS analyzers in Silicon Valley, California. Real-time carbon dioxide data from a 20-month period are combined with real-time carbon monoxide, methane, and acetylene to partition the observed carbon dioxide concentrations between different anthropogenic sectors (e.g., transport, residential) and biogenic sources. Real-time wind rose data are also combined with real-time methane data to help identify the direction of local methane emissions. High resolution WRF models are also included to better understand the dynamics of the boundary layer. The ratio between carbon dioxide and carbon monoxide is shown to vary by more than a factor of two from season to season or even from day to night, indicating rapid and frequent shifts in the balance between different carbon dioxide sources. Additional information is given by acetylene, a fossil fuel combustion tracer that provides complementary information to carbon monoxide. In spring and summer, the combined signal of the urban center and the surrounding biosphere and urban green space is explored. These methods show great promise for identifying, quantifying, and partitioning urban-ecological (carbon) emissions.

  8. Evidence Combination From an Evolutionary Game Theory Perspective.

    PubMed

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

    Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it deals well with uncertain information. The theory provides Dempster's rule of combination to synthesize multiple pieces of evidence from various information sources. However, in some cases, counter-intuitive results may be obtained from that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results from perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper takes a biological and evolutionary perspective on the combination of evidence. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidence, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
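Dempster's rule of combination, the baseline this record's ECR is compared against, can be implemented in a few lines. The focal elements and mass values below are an arbitrary illustration; the rule normalizes the product of masses over non-conflicting intersections.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                      # compatible hypotheses reinforce
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:                          # empty intersection -> conflict mass
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    # renormalize by the non-conflicting mass 1 - K
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Toy example with frame {a, b}:
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
m = dempster_combine(m1, m2)   # conflict K = 0.6 * 0.5 = 0.3
```

The counter-intuitive behavior the paper targets arises when the conflict mass K approaches 1, so the normalization amplifies a tiny residual agreement.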

  9. A multi-channel tunable source for atomic sensors

    NASA Astrophysics Data System (ADS)

    Bigelow, Matthew S.; Roberts, Tony D.; McNeil, Shirley A.; Hawthorne, Todd; Battle, Phil

    2015-09-01

    We have designed and completed initial testing of a laser source suitable for atomic interferometry, built from compact, robust, integrated components. Our design capitalizes on well-commercialized, low-noise telecom components whose high reliability and declining costs will help drive widespread deployment of this system. The key innovation is the combination of current telecom-based fiber laser and modulator technology with periodically poled waveguide technology to produce tunable laser light at rubidium D1 and D2 wavelengths (expandable to other alkalis) using second harmonic generation (SHG). Unlike direct-diode sources, this source is immune to feedback at the Rb line, eliminating the need for bulky high-power isolators in the system. In addition, the source has GHz-level frequency agility and, in our experiments, was found to be limited only by the agility of our RF generator. As a proof of principle, the source was scanned through the Doppler-broadened Rb D2 absorption line. With this technology, multiple channels can be independently tuned to produce the fields needed for addressing atomic states in atom interferometers and clocks. Thus, this technology could be useful in the development of cold-atom inertial sensors and gyroscopes.

  10. UNMIX Methods Applied to Characterize Sources of Volatile Organic Compounds in Toronto, Ontario

    PubMed Central

    Porada, Eugeniusz; Szyszkowicz, Mieczysław

    2016-01-01

    UNMIX, a receptor modeling routine from the U.S. Environmental Protection Agency (EPA), was used to model volatile organic compound (VOC) receptors at four urban sites in Toronto, Ontario. VOC ambient concentration data acquired in 2000–2009 for 175 VOC species at four air quality monitoring stations were analyzed. UNMIX, by performing multiple modeling attempts on varying VOC menus while rejecting unreliable results, allowed sources to be discriminated by their most consistent chemical characteristics. The method assessed occurrences of VOCs in sources typical of the urban environment (traffic, evaporative emissions of fuels, banks of fugitive inert gases), in industrial point sources (plastic-, polymer-, and metalworking manufactures), and in secondary sources (releases from water, sediments, and contaminated urban soil). The robust receptor modeling used here produces chemical profiles of putative VOC sources that, combined with known environmental fates of VOCs, can be used to assign physical sources' shares of VOC emissions into the atmosphere. This in turn provides a means of assessing the impact of environmental policies on one hand, and industrial activities on the other, on VOC air pollution. PMID:29051416

  11. Combined use of stable isotopes and hydrologic modeling to better understand nutrient sources and cycling in highly altered systems (Invited)

    NASA Astrophysics Data System (ADS)

    Young, M. B.; Kendall, C.; Guerin, M.; Stringfellow, W. T.; Silva, S. R.; Harter, T.; Parker, A.

    2013-12-01

    The Sacramento and San Joaquin Rivers provide the majority of freshwater for the San Francisco Bay Delta. Both rivers are important sources of drinking and irrigation water for California, and play critical roles in the health of California fisheries. Understanding the factors controlling water quality and primary productivity in these rivers and the Delta is essential for making sound economic and environmental water management decisions. However, these highly altered surface water systems present many challenges for water quality monitoring studies due to factors such as multiple potential nutrient and contaminant inputs, dynamic source water inputs, and changing flow regimes controlled by both natural and engineered conditions. The watersheds for both rivers contain areas of intensive agriculture along with many other land uses, and the Sacramento River receives significant amounts of treated wastewater from the large population around the City of Sacramento. We have used a multi-isotope approach combined with mass balance and hydrodynamic modeling in order to better understand the dominant nutrient sources for each of these rivers, and to track nutrient sources and cycling within the complex Delta region around the confluence of the rivers. High nitrate concentrations within the San Joaquin River fuel summer algal blooms, contributing to low dissolved oxygen conditions. High δ15N-NO3 values combined with the high nitrate concentrations suggest that animal manure is a significant source of nitrate to the San Joaquin River. In contrast, the Sacramento River has lower nitrate concentrations but elevated ammonium concentrations from wastewater discharge. Downstream nitrification of the ammonium can be clearly traced using δ15N-NH4. Flow conditions for these rivers and the Delta have strong seasonal and inter-annual variations, resulting in significant changes in nutrient delivery and cycling. 
Isotopic measurements and estimates of source water contributions derived from the DSM2-HYDRO hydrologic model demonstrate that mixing between San Joaquin and Sacramento River water can occur as far as 30 miles upstream of the confluence within the San Joaquin channel, and that San Joaquin-derived nitrate only reaches the western Delta during periods of high flow.
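The mass-balance reasoning used alongside the isotope measurements in this record reduces, in the simplest case, to two-endmember mixing. The δ15N endmember values below are hypothetical placeholders, not measurements from the study.

```python
def endmember_fraction(delta_mix, delta_a, delta_b):
    """Fraction f of endmember A in a two-endmember isotope mixing model:
       delta_mix = f * delta_a + (1 - f) * delta_b
       =>  f = (delta_mix - delta_b) / (delta_a - delta_b)"""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Hypothetical d15N-NO3 endmembers (permil): animal manure ~ +15, synthetic fertilizer ~ 0.
# A stream sample at +9 permil would then be ~60% manure-derived nitrate.
f_manure = endmember_fraction(delta_mix=9.0, delta_a=15.0, delta_b=0.0)
```

Real applications, as in this study, must also correct for fractionation (e.g. denitrification shifts δ15N), which is why a second tracer such as δ18O is carried alongside.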

  12. Capacity of MIMO free space optical communications using multiple partially coherent beams propagation through non-Kolmogorov strong turbulence.

    PubMed

    Deng, Peng; Kavehrad, Mohsen; Liu, Zhiwen; Zhou, Zhou; Yuan, Xiuhua

    2013-07-01

    We study the average capacity of multiple-input multiple-output (MIMO) free-space optical (FSO) communication systems using multiple partially coherent beams propagating through non-Kolmogorov strong turbulence, assuming an equal-gain-combining diversity configuration and modeling the received signal as the sum of multiple independent gamma-gamma random variables for the partially coherent beams. Closed-form expressions for the scintillation and average capacity are derived and then used to analyze the dependence on the number of independent diversity branches, the power law α, the refractive-index structure parameter, the propagation distance, and the spatial coherence length of the source beams. The results show that the average capacity increases more significantly with the rank of the MIMO channel matrix than with the diversity order. The effect of the diversity order on the average capacity is independent of the power law, turbulence strength parameter, and spatial coherence length, and these effects on the average capacity are gradually mitigated as the diversity order increases. The average capacity increases and saturates as the spatial coherence length decreases, at rates depending on the diversity order, power law, and turbulence strength. There exist optimal values of the spatial coherence length and diversity configuration that maximize the average capacity of MIMO FSO links over a variety of atmospheric turbulence conditions.
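The sum-of-gamma-gamma diversity model in this record can be checked numerically. A gamma-gamma irradiance can be sampled as the product of two unit-mean gamma variates with shapes α and β; summing independent branches (as in the abstract's combining model) should reduce the scintillation index roughly as 1/N. The α, β values and branch counts below are illustrative, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_gamma(alpha, beta, size):
    """Gamma-gamma irradiance: product of two independent unit-mean gamma variates."""
    return rng.gamma(alpha, 1.0 / alpha, size) * rng.gamma(beta, 1.0 / beta, size)

def scintillation_index(alpha, beta, n_branches, n=200_000):
    """SI = var(I) / mean(I)^2 of the summed irradiance over n_branches."""
    i = gamma_gamma(alpha, beta, (n, n_branches)).sum(axis=1)
    return i.var() / i.mean() ** 2

# Single branch: SI = 1/alpha + 1/beta + 1/(alpha*beta) = 0.875 for (4, 2).
si1 = scintillation_index(4.0, 2.0, 1)
si4 = scintillation_index(4.0, 2.0, 4)   # ~ si1 / 4 for independent branches
```

This Monte Carlo reproduces the qualitative trend in the abstract: diversity suppresses scintillation, which is what drives the capacity gains.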

  13. Simultaneous monitoring of multiple contrast agents using a hybrid MR-DOT system

    NASA Astrophysics Data System (ADS)

    Gulsen, Gultekin; Unlu, Mehmet Burcin; Birgul, Ozlem; Nalcioglu, Orhan

    2007-02-01

    Frequency domain diffuse optical tomography (DOT) is an emerging technique that uses arrays of sources and detectors to obtain spatially resolved optical parameters of tissue. Here, we describe the design of a hybrid MR-DOT system for dynamic imaging of cancer. The combined system acquires MR and optical data simultaneously. The performance of the system was tested with phantom and in-vivo studies: Gd-DTPA and ICG were used for this purpose, and the enhancement kinetics of both agents were recorded using the hybrid system.

  14. Modeling Passive Propagation of Malwares on the WWW

    NASA Astrophysics Data System (ADS)

    Chunbo, Liu; Chunfu, Jia

    Web-based malware resides on fixed websites and is downloaded to users' computers automatically as they browse. This passive propagation pattern differs from that of traditional viruses and worms. A propagation model based on the reverse web graph is proposed. In this model, propagation of malware is analyzed by means of a random jump matrix that combines the orderliness and randomness of user browsing behavior. Experiments with single and multiple propagation sources validate the model. Using this model, one can evaluate the hazard posed by specific websites and take corresponding countermeasures.
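The random jump matrix described here is structurally the same as a PageRank-style "Google matrix": with some probability the surfer follows a link (orderly browsing), otherwise jumps to a random page (random browsing). A minimal sketch, with a hypothetical three-page graph and damping factor, follows; it is not the paper's actual model.

```python
import numpy as np

def propagation_scores(adj, damping=0.85, iters=100):
    """Stationary visit probabilities of a random-jump surfer:
    follow an outgoing link with prob. `damping`, else jump uniformly."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # row-stochastic link matrix; dangling pages jump uniformly
    p = np.where(out > 0, adj / np.where(out == 0, 1, out), 1.0 / n)
    g = damping * p + (1.0 - damping) / n   # random jump matrix
    v = np.full(n, 1.0 / n)
    for _ in range(iters):                  # power iteration
        v = v @ g
    return v

# Toy reverse web graph: pages 1 and 2 both link to page 0 (the malware host),
# so browsing mass concentrates on page 0.
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
scores = propagation_scores(adj)
```

A page's score can then be read as its relative exposure to passively propagated malware, which is how such a model supports ranking websites by hazard.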

  15. Frequency division multiplex technique

    NASA Technical Reports Server (NTRS)

    Brey, H. (Inventor)

    1973-01-01

    A system for monitoring a plurality of condition responsive devices is described. It consists of a master control station and a remote station. The master control station transmits command signals, which include a parity signal, to a remote station that transmits the signals back to the command station so that they can be compared with the original signals to detect transmission errors. The system utilizes frequency sources that are 1.21 multiples of each other so that no linear combination of any harmonics will interfere with another frequency.
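The 1.21-ratio claim can be spot-checked numerically. The sketch below (base frequency and harmonic cutoff are arbitrary choices, and it only compares harmonics against channel fundamentals, a simplification of "any linear combination") confirms that low-order harmonics of each channel stay several percent away from every other channel.

```python
f0 = 1.0  # base frequency, arbitrary units
channels = [f0 * 1.21 ** k for k in range(6)]

def min_harmonic_gap(freqs, max_harmonic=5):
    """Smallest relative offset between any harmonic n*fi of one channel
    and any other channel's fundamental fj."""
    worst = float("inf")
    for i, fi in enumerate(freqs):
        for n in range(1, max_harmonic + 1):
            for j, fj in enumerate(freqs):
                if i == j and n == 1:
                    continue  # a channel trivially coincides with itself
                worst = min(worst, abs(n * fi - fj) / fj)
    return worst

gap = min_harmonic_gap(channels)
```

With six channels and harmonics up to the fifth, the closest approach is the second harmonic of one channel against the fundamental four steps up (2 vs. 1.21^4 ≈ 2.144), still about a 6.7% offset, so a reasonably selective filter can separate them.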

  16. Detection of complex cyber attacks

    NASA Astrophysics Data System (ADS)

    Gregorio-de Souza, Ian; Berk, Vincent H.; Giani, Annarita; Bakos, George; Bates, Marion; Cybenko, George; Madory, Doug

    2006-05-01

    One significant drawback of currently available security products is their inability to correlate diverse sensor input. For instance, using only network intrusion detection data, a rootkit installed through a weak username-password combination may go unnoticed. Similarly, an administrator may never make the link between deteriorating response times from the database server and an attacker exfiltrating trusted data if these facts are not presented together. Current Security Information Management Systems (SIMS) can collect and represent diverse data but lack sufficient correlation algorithms. By using a Process Query System (PQS), we were able to quickly bring together data flowing from many sources, including NIDS, HIDS, server logs, CPU load, and memory usage. We constructed PQS models that describe the dynamic behavior of complicated attacks and failures, allowing us to detect and differentiate simultaneous sophisticated attacks on a target network. In this paper, we discuss the benefits of implementing such a multistage cyber attack detection system using PQS. We focus on how data from multiple sources can be combined and used to detect and track comprehensive network security events that go unnoticed using conventional tools.

  17. Tectonic Recycling in the Paleozoic Ouachita Assemblage from U-Pb Detrital Zircon Studies

    NASA Astrophysics Data System (ADS)

    Gleason, J. D.; Gehrels, G. E.; Finney, S. C.

    2001-05-01

    The Paleozoic Ouachita deep-marine clastic sedimentary assemblage records a complex provenance over the course of its 200 m.y. history, with evidence for mixed sources and multiple dispersal paths. Combined neodymium and U-Pb detrital zircon work has established that most of the assemblage in Arkansas and Oklahoma is derived from Laurentian sources, meaning that regardless of the multiple pathways by which sediment was delivered to the Ouachita seafloor, the material had its ultimate origin on the North American continent. More detailed work is in progress to elucidate specific dispersal paths, in particular for the middle to late Ordovician when a major change in provenance is recorded, and during the Carboniferous when voluminous turbidites entered the basin. We sampled three formations for U-Pb detrital zircon studies: the lower Middle Ordovician Blakely Sandstone, the Upper Ordovician/Lower Silurian Blaylock Sandstone, and the Pennsylvanian Jackfork Group. Individual zircon ages from these units document a major change in provenance between deposition of the Blakely Sandstone and Blaylock Sandstone, which is also reflected in the neodymium isotopic signature. Both units have a large population of Grenvillian-age zircons (1.0-1.2 Ga), and a less abundant population of 1.3-1.4 Ga zircons likely derived from sources in the mid-continent region. The Blakely Sandstone also contains abundant Archean zircons (2.5-2.7 Ga, likely derived from the Superior Province), and one grain apparently derived from the Penokean orogen (1.9 Ga). Zircon morphology (highly rounded, spherical), combined with the pure quartz sandstone lithology of the Blakely Sandstone, indicates very mature sedimentary sources. We conclude that zircons from this source were recycled ultimately from source terranes in the North American craton. This is reinforced by neodymium isotopes (eNd = -15), paleocurrents (from the north) and olistoliths (1.3 Ga granites), the latter indicating that Blakely turbidites were delivered to the Ouachita seafloor from the North American shelf. In contrast, the Blaylock Sandstone lacks any grains older than 1.4 Ga. A single grain dated at 467 Ma (Taconian) is consistent with the primary source of the Blaylock turbidites being the southern Appalachian Mountains. This is reinforced by neodymium isotopes (eNd = -8), paleocurrent data (sources to the east-southeast), sandstone petrography (quartzolithic, indicating recycled fold-thrust belt sources), and the zircon morphology we observed (fewer rounded grains, indicating less mature sources). Sandstone from the Carboniferous Jackfork Group yields a wide spectrum of zircon ages (1.0 - 3.5 Ga), suggesting that it was derived in part by tectonic recycling of the pre-Carboniferous seafloor assemblage as the Ouachita remnant ocean basin closed between North America and Gondwana. In addition to Grenvillian-, Penokean- and Archean-age grains, there are also grains with ages of 1.4 and 1.5 Ga, all of which suggest a North American provenance. Dispersal paths for sediment entering the Carboniferous Ouachita basin are still a matter of debate, but the U-Pb zircon data are consistent with well-mixed material from the Appalachian-Ouachita orogen entering the basin from multiple directions. The preponderance of Grenvillian-age zircons in all three units reinforces the notion that sediment eroded from the Grenville orogen had widespread distribution across much of the North American continent.

  18. Laboratory multiple-crystal X-ray topography and reciprocal-space mapping of protein crystals: influence of impurities on crystal perfection

    NASA Technical Reports Server (NTRS)

    Hu, Z. W.; Thomas, B. R.; Chernov, A. A.

    2001-01-01

    Double-axis multiple-crystal X-ray topography, rocking-curve measurements and triple-axis reciprocal-space mapping have been combined to characterize protein crystals using a laboratory source. Crystals of lysozyme and lysozyme crystals doped with acetylated lysozyme impurities were examined. It was shown that the incorporation of acetylated lysozyme into crystals of lysozyme induces mosaic domains that are responsible for the broadening and/or splitting of rocking curves and diffraction-space maps along the direction normal to the reciprocal-lattice vector, while the overall elastic lattice strain of the impurity-doped crystals does not appear to be appreciable in high angular resolution reciprocal-space maps. Multiple-crystal monochromatic X-ray topography, which is highly sensitive to lattice distortions, was used to reveal the spatial distribution of mosaic domains in crystals which correlates with the diffraction features in reciprocal space. Discussions of the influence of acetylated lysozyme on crystal perfection are given in terms of our observations.

  19. Laboratory multiple-crystal X-ray topography and reciprocal-space mapping of protein crystals: influence of impurities on crystal perfection.

    PubMed

    Hu, Z W; Thomas, B R; Chernov, A A

    2001-06-01

    Double-axis multiple-crystal X-ray topography, rocking-curve measurements and triple-axis reciprocal-space mapping have been combined to characterize protein crystals using a laboratory source. Crystals of lysozyme and lysozyme crystals doped with acetylated lysozyme impurities were examined. It was shown that the incorporation of acetylated lysozyme into crystals of lysozyme induces mosaic domains that are responsible for the broadening and/or splitting of rocking curves and diffraction-space maps along the direction normal to the reciprocal-lattice vector, while the overall elastic lattice strain of the impurity-doped crystals does not appear to be appreciable in high angular resolution reciprocal-space maps. Multiple-crystal monochromatic X-ray topography, which is highly sensitive to lattice distortions, was used to reveal the spatial distribution of mosaic domains in crystals which correlates with the diffraction features in reciprocal space. Discussions of the influence of acetylated lysozyme on crystal perfection are given in terms of our observations.

  20. The ASAS-SN Bright Supernova Catalog – II. 2015

    DOE PAGES

    Holoien, T. W. -S.; Brown, J. S.; Stanek, K. Z.; ...

    2017-01-16

    Here, this paper presents information for all supernovae discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN) during 2015, its second full year of operations. The same information is presented for bright (mV ≤ 17), spectroscopically confirmed supernovae discovered by other sources in 2015. As with the first ASAS-SN bright supernova catalogue, we also present redshifts and near-ultraviolet through infrared magnitudes for all supernova host galaxies in both samples. Combined with our previous catalogue, this work comprises a complete catalogue of 455 supernovae from multiple professional and amateur sources, allowing for population studies that were previously impossible. This is the second of a series of yearly papers on bright supernovae and their hosts from the ASAS-SN team.

  1. The ASAS-SN Bright Supernova Catalog – II. 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holoien, T. W. -S.; Brown, J. S.; Stanek, K. Z.

    Here, this paper presents information for all supernovae discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN) during 2015, its second full year of operations. The same information is presented for bright (mV ≤ 17), spectroscopically confirmed supernovae discovered by other sources in 2015. As with the first ASAS-SN bright supernova catalogue, we also present redshifts and near-ultraviolet through infrared magnitudes for all supernova host galaxies in both samples. Combined with our previous catalogue, this work comprises a complete catalogue of 455 supernovae from multiple professional and amateur sources, allowing for population studies that were previously impossible. This is the second of a series of yearly papers on bright supernovae and their hosts from the ASAS-SN team.

  2. En face projection imaging of the human choroidal layers with tracking SLO and swept source OCT angiography methods

    NASA Astrophysics Data System (ADS)

    Gorczynska, Iwona; Migacz, Justin; Zawadzki, Robert J.; Sudheendran, Narendran; Jian, Yifan; Tiruveedhula, Pavan K.; Roorda, Austin; Werner, John S.

    2015-07-01

    We tested and compared the capability of multiple optical coherence tomography (OCT) angiography methods (phase variance, amplitude decorrelation, and speckle variance, each with application of the split spectrum technique) to image the chorioretinal complex of the human eye. To improve OCT imaging stability, we utilized a real-time tracking scanning laser ophthalmoscopy (TSLO) system combined with a swept source OCT setup. In addition, we implemented a post-processing volume averaging method for improved angiographic image quality and reduction of motion artifacts. The OCT system operated at a central wavelength of 1040 nm to enable sufficient depth penetration into the choroid. Imaging was performed in the eyes of healthy volunteers and patients diagnosed with age-related macular degeneration.
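Of the angiography methods named in this record, speckle variance is the simplest to sketch: the per-pixel intensity variance across N repeated B-scans is high where scatterers move (blood flow) and near zero in static tissue. The synthetic stack below is an illustrative stand-in for real OCT data.

```python
import numpy as np

rng = np.random.default_rng(7)

def speckle_variance(bscans):
    """Per-pixel intensity variance across repeated B-scans (axis 0).
    Static tissue -> low variance; flowing blood -> high variance."""
    return bscans.var(axis=0)

# Synthetic stack: 8 repeats of a 64x64 B-scan. Most of the frame is a
# frozen speckle pattern; a 10x10 'vessel' patch decorrelates between repeats.
n, h, w = 8, 64, 64
static = rng.normal(1.0, 0.01, (h, w))
stack = np.repeat(static[None], n, axis=0).copy()
stack[:, 20:30, 20:30] = rng.normal(1.0, 0.3, (n, 10, 10))
sv = speckle_variance(stack)   # bright only inside the vessel patch
```

Phase variance and amplitude decorrelation follow the same repeat-and-compare idea but operate on the complex phase and on frame-to-frame correlation, respectively; split-spectrum processing averages such maps over sub-bands to suppress noise.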

  3. Identification of biased sectors in emission data using a combination of chemical transport model and receptor model

    NASA Astrophysics Data System (ADS)

    Uranishi, Katsushige; Ikemori, Fumikazu; Nakatsubo, Ryohei; Shimadera, Hikari; Kondo, Akira; Kikutani, Yuki; Asano, Katsuyoshi; Sugata, Seiji

    2017-10-01

    This study presents a comparison approach using multiple source apportionment methods to identify which sectors of emission data have large biases. The source apportionment methods include both receptor and chemical transport models, which are widely used to quantify the impacts of emission sources on fine particulate matter smaller than 2.5 μm in diameter (PM2.5). We used daily chemical component concentration data for 2013, including water-soluble ions, elements, and carbonaceous species of PM2.5, at 11 sites in the Kinki-Tokai district of Japan to apply the Positive Matrix Factorization (PMF) model for source apportionment. Seven PMF factors of PM2.5 were identified based on their temporal and spatial variation patterns and site-specific features. These factors comprised two types of secondary sulfate, road transportation, heavy oil combustion by ships, biomass burning, secondary nitrate, and soil and industrial dust, accounting for 46%, 17%, 7%, 14%, 13%, and 3% of the PM2.5, respectively. The multiple-site data enabled a comprehensive identification of the PM2.5 sources. For the same period, source contributions were estimated by air quality simulations using the Community Multiscale Air Quality model (CMAQ) with the brute-force method (BFM) for four source categories. Both models provided consistent results for three of the four source categories: secondary sulfates, road transportation, and heavy oil combustion. For these three target categories, the models' agreement was supported by the small differences and high correlations between the CMAQ/BFM- and PMF-estimated source contributions to the concentrations of PM2.5, SO42-, and EC.
    In contrast, contributions of the biomass burning sources apportioned by CMAQ/BFM were much lower than, and poorly correlated with, those captured by the PMF model, indicating large uncertainties in the biomass burning emissions used in the CMAQ simulations. Thus, this comparison approach using the two contrasting models enables us to identify which sectors of emission data have large biases, informing improvement of future air quality simulations.
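The PMF side of this comparison factorizes a samples-by-species matrix into nonnegative source contributions and profiles, X ≈ G·F. A minimal sketch using unweighted Lee-Seung multiplicative updates follows; actual PMF additionally weights residuals by per-measurement uncertainty, and the synthetic data here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def nmf(x, k, iters=500, eps=1e-9):
    """Nonnegative factorization X ~ G @ F via Lee-Seung multiplicative updates
    (the unweighted special case of PMF's uncertainty-weighted objective)."""
    n, m = x.shape
    g = rng.random((n, k)) + 0.1   # source contributions per sample
    f = rng.random((k, m)) + 0.1   # chemical profile per source
    for _ in range(iters):
        f *= (g.T @ x) / (g.T @ g @ f + eps)
        g *= (x @ f.T) / (g @ f @ f.T + eps)
    return g, f

# Synthetic receptor data: 200 daily samples x 10 species from 3 hidden sources.
true_g = rng.random((200, 3))
true_f = rng.random((3, 10))
x = true_g @ true_f
g, f = nmf(x, 3)
err = np.linalg.norm(x - g @ f) / np.linalg.norm(x)   # relative reconstruction error
```

The recovered rows of `f` play the role of the chemical source profiles that are then interpreted (e.g. as sulfate, traffic, or biomass burning factors) and compared against CMAQ/BFM contributions.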

  4. An integrated modelling framework for neural circuits with multiple neuromodulators.

    PubMed

    Joshi, Alok; Youssofzadeh, Vahab; Vemana, Vinith; McGinnity, T M; Prasad, Girijesh; Wong-Lin, KongFatt

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes, and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental science and pharmacological studies. © 2017 The Authors.

  5. Modeling the combined influence of host dispersal and waterborne fate and transport on pathogen spread in complex landscapes

    PubMed Central

    Lu, Ding; McDowell, Julia Z.; Davis, George M.; Spear, Robert C.; Remais, Justin V.

    2012-01-01

    Environmental models, often applied to questions on the fate and transport of chemical hazards, have recently become important in tracing certain environmental pathogens to their upstream sources of contamination. These tools, such as first order decay models applied to contaminants in surface waters, offer promise for quantifying the fate and transport of pathogens with multiple environmental stages and/or multiple hosts, in addition to those pathogens whose environmental stages are entirely waterborne. Here we consider the fate and transport capabilities of the human schistosome Schistosoma japonicum, which exhibits two waterborne stages and is carried by an amphibious intermediate snail host. We present experimentally-derived dispersal estimates for the intermediate snail host and fate and transport estimates for the passive downstream diffusion of cercariae, the waterborne, human-infective parasite stage. Using a one dimensional advective transport model exhibiting first-order decay, we simulate the added spatial reach and relative increase in cercarial concentrations that dispersing snail hosts contribute to downstream sites. Simulation results suggest that snail dispersal can substantially increase the concentrations of cercariae reaching downstream locations, relative to no snail dispersal, effectively putting otherwise isolated downstream sites at increased risk of exposure to cercariae from upstream sources. The models developed here can be applied to other infectious diseases with multiple life-stages and hosts, and have important implications for targeted ecological control of disease spread. PMID:23162675
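The one-dimensional advective transport model with first-order decay described above reduces, for a steady point source, to C(x) = C0·exp(-kx/v). A minimal sketch of how dispersing snail hosts add upstream source terms that raise downstream cercarial concentrations; all parameter values below are illustrative, not the paper's fitted estimates.

```python
import math

def cercariae_conc(x_m, c0, k_per_h, v_m_per_h):
    """Steady-state concentration a distance x downstream of a point
    source, for 1-D advective transport with first-order decay:
    C(x) = C0 * exp(-k * x / v)."""
    return c0 * math.exp(-k_per_h * x_m / v_m_per_h)

def total_conc(x_m, sources, k_per_h, v_m_per_h):
    """Superpose contributions from several upstream snail-host
    locations, given as (position_m, source_strength) pairs."""
    return sum(cercariae_conc(x_m - x0, c0, k_per_h, v_m_per_h)
               for x0, c0 in sources if x0 <= x_m)

# Illustrative values only: decay rate 0.5/h, stream velocity 500 m/h.
no_dispersal = total_conc(1000.0, [(0.0, 100.0)], 0.5, 500.0)
# Snail dispersal adds a weaker source 400 m downstream of the first.
with_dispersal = total_conc(1000.0, [(0.0, 100.0), (400.0, 30.0)],
                            0.5, 500.0)
```

Because a dispersed source sits closer to the downstream site, its contribution decays less before arrival, which is why dispersal measurably raises downstream exposure relative to the no-dispersal case.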

  6. An integrated modelling framework for neural circuits with multiple neuromodulators

    PubMed Central

    Vemana, Vinith

    2017-01-01

Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of the systemic effects of popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828

  7. RSAT matrix-clustering: dynamic exploration and redundancy reduction of transcription factor binding motif collections

    PubMed Central

    Jaeger, Sébastien; Thieffry, Denis

    2017-01-01

Abstract Transcription factor (TF) databases contain multitudes of binding motifs (TFBMs) from various sources, from which non-redundant collections are derived by manual curation. The advent of high-throughput methods stimulated the production of novel collections with increasing numbers of motifs. Meta-databases, built by merging these collections, contain redundant versions, because available tools are not suited to automatically identify and explore biologically relevant clusters among thousands of motifs. Motif discovery from genome-scale data sets (e.g. ChIP-seq) also produces redundant motifs, hampering the interpretation of results. We present matrix-clustering, a versatile tool that clusters similar TFBMs into multiple trees and automatically creates non-redundant TFBM collections. A feature unique to matrix-clustering is its dynamic visualisation of aligned TFBMs, and its capability to simultaneously treat multiple collections from various sources. We demonstrate that matrix-clustering considerably simplifies the interpretation of combined results from multiple motif discovery tools and highlights biologically relevant variations of similar motifs. In a large-scale application clustering ∼11 000 motifs from 24 entire databases, matrix-clustering correctly grouped motifs belonging to the same TF families and drastically reduced motif redundancy. matrix-clustering is integrated within the RSAT suite (http://rsat.eu/), accessible through a user-friendly web interface or via the command line for integration in pipelines. PMID:28591841
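A simplified stand-in for the motif-to-motif similarity underlying such clustering: scan all ungapped offsets of two position frequency matrices and keep the best column-wise Pearson correlation. RSAT's actual scores (e.g. Ncor) normalise differently; this is only a sketch, and the toy matrices are invented.

```python
import numpy as np

def motif_similarity(m1, m2, min_overlap=3):
    """Best Pearson correlation over all ungapped offsets of two
    position frequency matrices (rows = A,C,G,T; columns = positions).
    A simplified stand-in for alignment-based motif comparison."""
    best = -1.0
    n1, n2 = m1.shape[1], m2.shape[1]
    for off in range(-(n2 - 1), n1):
        lo, hi = max(0, off), min(n1, off + n2)
        if hi - lo < min_overlap:        # require a minimal overlap
            continue
        a = m1[:, lo:hi].ravel()
        b = m2[:, lo - off:hi - off].ravel()
        best = max(best, np.corrcoef(a, b)[0, 1])
    return best

# Toy 4 x N frequency matrices; the second is the first with its
# leading column trimmed, so they should score as near-identical.
pfm = np.array([[0.7, 0.1, 0.1, 0.1, 0.6, 0.1],
                [0.1, 0.7, 0.1, 0.1, 0.2, 0.1],
                [0.1, 0.1, 0.7, 0.1, 0.1, 0.7],
                [0.1, 0.1, 0.1, 0.7, 0.1, 0.1]])
score = motif_similarity(pfm, pfm[:, 1:])
```

Pairwise scores like this can feed a standard hierarchical clustering to group motifs into trees, with a similarity threshold deciding where redundant motifs are merged.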

  8. Intravenous glucose preparation as the source of an outbreak of extended-spectrum beta-lactamase-producing Klebsiella pneumoniae infections in the neonatal unit of a regional hospital in KwaZulu-Natal.

    PubMed

    Moodley, Prashini; Coovadia, Yacoob M; Sturm, A Willem

    2005-11-01

    In the last week of May 2005, staff at Mahatma Gandhi Memorial Hospital in KwaZulu-Natal realised that many babies in the high-care nursery ward had bloodstream infections involving Klebsiella pneumoniae bacteria. Attempts to identify a common source of infection failed. The ward was therefore closed and new babies needing high care were admitted to another empty ward. Despite this, babies still became infected. This resulted in a request for assistance from the Department of Medical Microbiology of the Nelson R Mandela School of Medicine. A search for common factors through case history studies of the 26 infected babies showed that blood cultures of the babies remained positive despite the administration of appropriate antibiotics. Different options that could explain this were investigated. The organism was found in intravenous glucose preparations used for multiple dosing. Unopened vials of the same medication were sterile. The nursery was found to lack proper hand-wash facilities and to be overcrowded and understaffed. Reinforcement of hand hygiene and a ban on the multiple dosing of medicines stopped the outbreak. In conclusion, this outbreak resulted from a combination of factors among which lack of hand hygiene and multiple dosing of an intravenous glucose preparation were most significant.

  9. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight of evidence approaches ignore sources of uncertainty that Bayesian networks can account for. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
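The probabilistic combination of lines of evidence can be illustrated with the simplest possible network: a single cause node with conditionally independent evidence nodes. All probabilities below are invented for illustration; a full Bayesian network also relaxes the independence assumption made here.

```python
def posterior(prior, likelihood_pairs):
    """Combine conditionally independent lines of evidence with Bayes'
    rule. Each pair is (P(evidence | impairment),
    P(evidence | no impairment))."""
    num, den = prior, 1.0 - prior
    for p_e_given_c, p_e_given_not_c in likelihood_pairs:
        num *= p_e_given_c
        den *= p_e_given_not_c
    return num / (num + den)

# Two individually weak lines of evidence; numbers are purely illustrative.
p_one = posterior(0.2, [(0.8, 0.3)])
p_combined = posterior(0.2, [(0.8, 0.3), (0.7, 0.4)])
```

Each added line of evidence updates the posterior, so two individually weak lines can be jointly informative, which is the quantitative advantage over rule-based tallies of evidence.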

  10. Genomic-based multiple-trait evaluation in Eucalyptus grandis using dominant DArT markers.

    PubMed

    Cappa, Eduardo P; El-Kassaby, Yousry A; Muñoz, Facundo; Garcia, Martín N; Villalba, Pamela V; Klápště, Jaroslav; Marcucci Poltri, Susana N

    2018-06-01

    We investigated the impact of combining the pedigree- and genomic-based relationship matrices in a multiple-trait individual-tree mixed model (a.k.a., multiple-trait combined approach) on the estimates of heritability and on the genomic correlations between growth and stem straightness in an open-pollinated Eucalyptus grandis population. Additionally, the added advantage of incorporating genomic information on the theoretical accuracies of parents and offspring breeding values was evaluated. Our results suggested that the use of the combined approach for estimating heritabilities and additive genetic correlations in multiple-trait evaluations is advantageous and including genomic information increases the expected accuracy of breeding values. Furthermore, the multiple-trait combined approach was proven to be superior to the single-trait combined approach in predicting breeding values, in particular for low-heritability traits. Finally, our results advocate the use of the combined approach in forest tree progeny testing trials, specifically when a multiple-trait individual-tree mixed model is considered. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Stochastic Industrial Source Detection Using Lower Cost Methods

    NASA Astrophysics Data System (ADS)

    Thoma, E.; George, I. J.; Brantley, H.; Deshmukh, P.; Cansler, J.; Tang, W.

    2017-12-01

Hazardous air pollutants (HAPs) can be emitted from a variety of sources in industrial facilities, energy production, and commercial operations. Stochastic industrial sources (SISs) represent a subcategory of emissions from fugitive leaks, variable area sources, malfunctioning processes, and improperly controlled operations. From the shared perspective of industries and communities, cost-effective detection of mitigable SIS emissions can yield benefits such as safer working environments, cost savings through reduced product loss, lower air shed pollutant impacts, and improved transparency and community relations. Methods for SIS detection can be categorized by their spatial regime of operation, ranging from component-level inspection to high-sensitivity kilometer-scale surveys. Methods can be temporally intensive (providing snap-shot measures) or sustained, in both time-integrated and continuous forms. Each method category has demonstrated utility; however, broad adoption (or routine use) has thus far been limited by cost and implementation viability. Described here is a subset of SIS methods explored by the U.S. EPA's next generation emission measurement (NGEM) program that focus on lower cost methods and models. An emerging systems approach that combines multiple forms to help compensate for the reduced performance of lower cost systems is discussed. A case study of a multi-day HAP emission event observed by a combination of low-cost sensors, open-path spectroscopy, and passive samplers is detailed. Early field results from a novel field gas chromatograph coupled with a fast HAP concentration sensor are described. Progress toward near real-time inverse source triangulation, assisted by pre-modeled facility profiles using the Los Alamos Quick Urban & Industrial Complex (QUIC) model, is discussed.

  12. Nonlinear inversion of tilt-affected very long period records of explosive eruptions at Fuego volcano

    NASA Astrophysics Data System (ADS)

    Waite, Gregory P.; Lanza, Federica

    2016-10-01

Magmatic processes produce a rich variety of volcano seismic signals, spanning several orders of magnitude in frequency and a wide range of mechanism types. We examined signals with periods from 400 to 10 s associated with explosive eruptions at Fuego volcano, Guatemala, recorded over 19 days in 2009 on broadband stations with 30 s and 60 s corner periods. The raw data from the closest stations include tilt effects on the horizontal components, but also have significant signal at periods below the instrument corners on the vertical components, where tilt effects should be negligible. We address the problem of tilt-affected horizontal waveforms through a joint waveform inversion of translation and rotation, which allows the varying influence of tilt with period to be investigated. Using a phase-weighted stack of six similar events, we invert for the source moment tensor in multiple bands. We use a grid search over source type and constrained inversions, which provides a quantitative measure of source-mechanism reliability. The 30-10 s band-pass results are consistent with previous work that modeled the data with a combined two-crack or crack-and-pipe model. In the longest-period band examined, 400-60 s, the source mechanism is pipe-like and could represent the shallowest portion of the conduit. On the other hand, source mechanisms in some bands are unconstrained, presumably owing to the combined tilt-dominated and translation-dominated signals, which are not coincident in space and have different time spans.

  13. Spatial filters and automated spike detection based on brain topographies improve sensitivity of EEG-fMRI studies in focal epilepsy.

    PubMed

    Siniatchkin, Michael; Moeller, Friederike; Jacobs, Julia; Stephani, Ulrich; Boor, Rainer; Wolff, Stephan; Jansen, Olav; Siebner, Hartwig; Scherg, Michael

    2007-09-01

    The ballistocardiogram (BCG) represents one of the most prominent sources of artifacts that contaminate the electroencephalogram (EEG) during functional MRI. The BCG artifacts may affect the detection of interictal epileptiform discharges (IED) in patients with epilepsy, reducing the sensitivity of the combined EEG-fMRI method. In this study we improved the BCG artifact correction using a multiple source correction (MSC) approach. On the one hand, a source analysis of the IEDs was applied to the EEG data obtained outside the MRI scanner to prevent the distortion of EEG signals of interest during the correction of BCG artifacts. On the other hand, the topographies of the BCG artifacts were defined based on the EEG recorded inside the scanner. The topographies of the BCG artifacts were then added to the surrogate model of IED sources and a combined source model was applied to the data obtained inside the scanner. The artifact signal was then subtracted without considerable distortion of the IED topography. The MSC approach was compared with the traditional averaged artifact subtraction (AAS) method. Both methods reduced the spectral power of BCG-related harmonics and enabled better detection of IEDs. Compared with the conventional AAS method, the MSC approach increased the sensitivity of IED detection because the IED signal was less attenuated when subtracting the BCG artifacts. The proposed MSC method is particularly useful in situations in which the BCG artifact is spatially correlated and time-locked with the EEG signal produced by the focal brain activity of interest.

  14. A study of impact of Asian dusts and their transport pathways to Hong Kong using multiple AERONET data, trajectory, and in-situ measurements

    NASA Astrophysics Data System (ADS)

    Wong, Man Sing; Nichol, Janet Elizabeth; Lee, Kwon Ho

    2010-10-01

Hong Kong, a commercial and financial city in south-east China, has suffered serious air pollution over the last decade, due largely to the rapid urban and industrial expansion of the cities of mainland China. However, the potential sources and pathways of aerosols transported to Hong Kong have not been well researched, owing to the lack of air quality monitoring stations in southern China. Here, an integrated method combining AErosol RObotic NETwork (AERONET) data, trajectory analysis and Potential Source Contribution Function (PSCF) modeling is used to identify the potential transport pathways and source contributions of four characteristic aerosol types. The four types, defined using a total of 730 AERONET measurements between 2005 and 2008, are coastal urban, polluted urban, dust (likely long-distance desert dust), and heavy pollution. Results show that the sources of the polluted urban and heavy pollution types are associated with industrial emissions in southern China, whereas coastal urban aerosols are affected by both natural marine aerosols and emissions. The PSCF map of dust shows a wide range of pathways following east- and south-eastward trajectories from northwest China to Hong Kong. Although the contribution from dust sources is small compared with the anthropogenic aerosols, a serious recent dust outbreak was observed in Hong Kong, elevating the Air Pollution Index to 500, compared with 50-100 on normal days. The combined use of clustered AERONET data, trajectories and PSCF models can therefore help resolve the longstanding question of the source regions and characteristics of pollutants carried to Hong Kong.

  15. All-Solid-State 2.45-to-2.78-THz Source

    NASA Technical Reports Server (NTRS)

    Mehdi, Imran; Chattopadhyay, Goutam; Schlecht, Erich T.; Lin, Robert H.; Sin, Seith; Peralta, Alejandro; Lee, Choonsup; Gill, John J.; Pearson, John C.; Goldsmith, Paul F.

    2011-01-01

    Sources in the THz range are required in order for NASA to implement heterodyne instruments in this frequency range. The source demonstrated here will be used for an instrument on the SOFIA platform as well as for upcoming astrophysics missions. There are currently no electronic sources in the 2-3-THz frequency range. An electronically tunable compact source in this frequency range is needed for lab spectroscopy as well as for compact space-deployable heterodyne receivers. This solution for obtaining useful power levels in the 2-3-THz range is based on power-combined multiplier stages. With power combining, the input power can be distributed between different multiplier chips and then recombined after frequency multiplication. A continuous-wave (CW) coherent source covering 2.48-2.75 THz, with greater than 10 percent instantaneous and tuning bandwidth, and with ∼14 μW of output power at room temperature, has been demonstrated. This source is based on a 91.8-101.8-GHz synthesizer followed by a power amplifier and three cascaded frequency triplers. It demonstrates that purely electronic solid-state sources can generate a useful amount of power in a region of the electromagnetic spectrum where lasers (solid-state or gas) were previously the only available coherent sources. The bandwidth, agility, and operability of this THz source have enabled wideband, high-resolution spectroscopic measurements of water, methanol, and carbon monoxide with a resolution and signal-to-noise ratio unmatched by other existing systems, providing new insight into the physics of these molecules. Furthermore, the power and optical beam quality are high enough to observe the Lamb-dip effect in water. The source frequency has an absolute accuracy better than 1 part in 10^12, and the spectrometer achieves sub-Doppler frequency resolution better than 1 part in 10^8. The harmonic purity is better than 25 dB.
This source can serve as a local oscillator for a variety of heterodyne systems, and can be used as a method for precision control of more powerful but much less frequency-agile quantum mechanical terahertz sources.
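The quoted coverage follows directly from the multiplier arithmetic: three cascaded triplers multiply the synthesizer frequency by 3^3 = 27. A quick check:

```python
# Three cascaded frequency triplers give a total multiplication of 27,
# mapping the 91.8-101.8 GHz synthesizer onto the quoted 2.48-2.75 THz band.
f_lo_ghz, f_hi_ghz = 91.8, 101.8
mult = 3 ** 3
out_lo_thz = f_lo_ghz * mult / 1000.0
out_hi_thz = f_hi_ghz * mult / 1000.0
# Fractional tuning bandwidth relative to the band centre.
frac_bw = 2.0 * (out_hi_thz - out_lo_thz) / (out_hi_thz + out_lo_thz)
```

The result, roughly 2.479-2.749 THz with a fractional bandwidth just over 10 percent, matches the figures quoted in the record.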

  16. Discovering Anti-platelet Drug Combinations with an Integrated Model of Activator-Inhibitor Relationships, Activator-Activator Synergies and Inhibitor-Inhibitor Synergies

    PubMed Central

    Lombardi, Federica; Golla, Kalyan; Fitzpatrick, Darren J.; Casey, Fergal P.; Moran, Niamh; Shields, Denis C.

    2015-01-01

    Identifying effective therapeutic drug combinations that modulate complex signaling pathways in platelets is central to the advancement of effective anti-thrombotic therapies. However, there is no systems model of the platelet that predicts responses to different inhibitor combinations. We developed an approach which goes beyond current inhibitor-inhibitor combination screening to efficiently consider other signaling aspects that may give insights into the behaviour of the platelet as a system. We investigated combinations of platelet inhibitors and activators. We evaluated three distinct strands of information, namely: activator-inhibitor combination screens (testing a panel of inhibitors against a panel of activators); inhibitor-inhibitor synergy screens; and activator-activator synergy screens. We demonstrated how these analyses may be efficiently performed, both experimentally and computationally, to identify particular combinations of most interest. Robust tests of activator-activator synergy and of inhibitor-inhibitor synergy required combinations to show significant excesses over the double doses of each component. Modeling identified multiple effects of an inhibitor of the P2Y12 ADP receptor, and complementarity between inhibitor-inhibitor synergy effects and activator-inhibitor combination effects. This approach accelerates the mapping of combination effects of compounds to develop combinations that may be therapeutically beneficial. We integrated the three information sources into a unified model that predicted the benefits of a triple drug combination targeting ADP, thromboxane and thrombin signaling. PMID:25875950

  17. Accurate Influenza Monitoring and Forecasting Using Novel Internet Data Streams: A Case Study in the Boston Metropolis

    PubMed Central

    Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna

    2018-01-01

    Background Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolis separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. 
Results Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
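A hedged sketch of an ARGO-style model: regress current flu activity on its own recent lags plus a same-week exogenous Internet signal. ARGO proper uses L1-regularised regression over dynamic training windows and ensembles several such models; plain least squares on synthetic data keeps this sketch self-contained, and all coefficients below are invented.

```python
import numpy as np

# Synthetic weekly flu activity driven by one "Internet" signal; the
# generating coefficients (0.6 autoregressive, 0.5 exogenous) are
# arbitrary and exist only so the fit can be sanity-checked.
rng = np.random.default_rng(0)
T, n_lags = 120, 2
search = rng.normal(size=(T, 1))     # stand-in search-volume predictor
flu = np.zeros(T)
for t in range(1, T):
    flu[t] = 0.6 * flu[t - 1] + 0.5 * search[t, 0]

def fit_nowcast(flu, exog, n_lags):
    """Regress current flu activity on its own lags plus current-week
    exogenous signals, which are available in near real time."""
    rows, y = [], []
    for t in range(n_lags, len(flu)):
        rows.append(np.concatenate(([1.0], flu[t - n_lags:t], exog[t])))
        y.append(flu[t])
    X, y = np.array(rows), np.array(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X @ beta - y

beta, resid = fit_nowcast(flu, search, n_lags)
```

On this noiseless synthetic series the fit recovers the generating coefficients exactly; on real surveillance data the same design matrix would be refit each week against the latest BPHC reports.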

  18. Theoretical and Experimental Investigation of Mufflers with Comments on Engine-Exhaust Muffler Design

    NASA Technical Reports Server (NTRS)

    Davis, Don D., Jr.; Stokes, George M.; Moore, Dewey; Stevens, George L., Jr.

    1954-01-01

    Equations are presented for the attenuation characteristics of single-chamber and multiple-chamber mufflers of both the expansion-chamber and resonator types, for tuned side-branch tubes, and for the combination of an expansion chamber with a resonator. Experimental curves of attenuation plotted against frequency are presented for 77 different mufflers with a reflection-free tailpipe termination. The experiments were made at room temperature without flow; the sound source was a loudspeaker. A method is given for including the tailpipe reflections in the calculations. Experimental attenuation curves are presented for four different muffler-tailpipe combinations, and the results are compared with the theory. The application of the theory to the design of engine-exhaust mufflers is discussed, and charts are included for the assistance of the designer.
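For the single expansion chamber, plane-wave analysis of the kind presented in this report leads to the standard transmission-loss formula TL = 10·log10[1 + ¼(m − 1/m)²·sin²(kL)], with m the area expansion ratio, k the acoustic wavenumber and L the chamber length. A small sketch (chamber dimensions chosen arbitrarily):

```python
import math

def expansion_chamber_TL(m, f_hz, L_m, c=343.0):
    """Transmission loss (dB) of a single expansion chamber with area
    expansion ratio m and length L, from the classic plane-wave result
    TL = 10*log10(1 + (0.5*(m - 1/m)*sin(kL))**2)."""
    k = 2.0 * math.pi * f_hz / c
    return 10.0 * math.log10(
        1.0 + (0.5 * (m - 1.0 / m) * math.sin(k * L_m)) ** 2)

# Attenuation peaks where the chamber is an odd quarter-wavelength long
# (kL = pi/2) and vanishes where it is a half-wavelength long (kL = pi).
peak = expansion_chamber_TL(4.0, 171.5, 0.5)   # kL = pi/2 at c = 343 m/s
null = expansion_chamber_TL(4.0, 343.0, 0.5)   # kL = pi
```

The periodic attenuation nulls reproduce the characteristic pass bands seen in the report's measured attenuation-versus-frequency curves.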

  19. Everolimus.

    PubMed

    Hasskarl, Jens

    2014-01-01

Everolimus (RAD001, Afinitor®) is an oral protein kinase inhibitor of the mammalian target of rapamycin (mTOR) serine/threonine kinase signal transduction pathway. The mTOR pathway regulates cell growth, proliferation, and survival and is frequently deregulated in cancer. Everolimus has been approved by the FDA and the EMA for the treatment of advanced renal cell carcinoma (RCC), subependymal giant cell astrocytoma (SEGA) associated with tuberous sclerosis (TSC), pancreatic neuroendocrine tumors (PNET), and, in combination with exemestane, advanced hormone-receptor (HR)-positive, HER2-negative breast cancer. Everolimus shows promising clinical activity in additional indications. Multiple phase 2 and phase 3 trials of everolimus alone or in combination are ongoing and will help to further elucidate the role of mTOR in oncology. For a review on everolimus as an immunosuppressant, please consult other sources.

  20. A Murine Model to Study Epilepsy and SUDEP Induced by Malaria Infection

    PubMed Central

    Ssentongo, Paddy; Robuccio, Anna E.; Thuku, Godfrey; Sim, Derek G.; Nabi, Ali; Bahari, Fatemeh; Shanmugasundaram, Balaji; Billard, Myles W.; Geronimo, Andrew; Short, Kurt W.; Drew, Patrick J.; Baccon, Jennifer; Weinstein, Steven L.; Gilliam, Frank G.; Stoute, José A.; Chinchilli, Vernon M.; Read, Andrew F.; Gluckman, Bruce J.; Schiff, Steven J.

    2017-01-01

    One of the largest single sources of epilepsy in the world is produced as a neurological sequela in survivors of cerebral malaria. Nevertheless, the pathophysiological mechanisms of such epileptogenesis remain unknown and no adjunctive therapy during cerebral malaria has been shown to reduce the rate of subsequent epilepsy. There is no existing animal model of postmalarial epilepsy. In this technical report we demonstrate the first such animal models. These models were created from multiple mouse and parasite strain combinations, so that the epilepsy observed retained universality with respect to genetic background. We also discovered spontaneous sudden unexpected death in epilepsy (SUDEP) in two of our strain combinations. These models offer a platform to enable new preclinical research into mechanisms and prevention of epilepsy and SUDEP. PMID:28272506

  1. SU-G-201-17: Verification of Dose Distributions From High-Dose-Rate Brachytherapy Ir-192 Source Using a Multiple-Array-Diode-Detector (MapCheck2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harpool, K; De La Fuente Herman, T; Ahmad, S

    Purpose: To investigate quantitatively the accuracy of dose distributions for the Ir-192 high-dose-rate (HDR) brachytherapy source calculated by the brachytherapy planning system (BPS) and measured with a multiple-array diode detector in a heterogeneous medium. Methods: A two-dimensional diode-array detector (MapCheck2) was scanned with a catheter, and the CT images were loaded into the Varian brachytherapy planning system, which uses the TG-43 formalism for dose calculation. Treatment plans were calculated for different combinations of one dwell position with varying irradiation times, and of different dwell positions with a fixed irradiation time, with the source placed 12 mm from the diode-array plane. The calculated dose distributions were compared to the doses measured with MapCheck2, delivered by an Ir-192 source from a Nucletron Microselectron-V2 remote afterloader. The linearity of MapCheck2 was tested for a range of dwell times (2-600 seconds). The angular effect was tested with a 30-second irradiation delivered to the central diode, then moving the source away in increments of 10 mm. Results: Large differences were found between calculated and measured dose distributions, mainly due to the absence of heterogeneity in the dose calculation and diode artifacts in the measurements. The dose differences between measurement and calculation due to heterogeneity ranged from 5%-12%, depending on the position of the source relative to the diodes in MapCheck2 and the different heterogeneities in the beam path. The linearity test of the diode detector showed 3.98%, 2.61%, and 2.27% over-response at short irradiation times of 2, 5, and 10 seconds, respectively, and agreement within 2% for 20 to 600 seconds (p-value=0.05), depending strongly on MapCheck2 noise. The angular dependency was more pronounced at acute angles, ranging up to 34% at 5.7 degrees.
Conclusion: Large deviations between measured and calculated dose distributions for HDR brachytherapy with Ir-192 may be reduced by accounting for medium heterogeneity and the dose artifacts of the diodes. This study demonstrates that multiple-array diode detectors provide a practical and accurate dosimeter for verifying doses delivered from the brachytherapy Ir-192 source.

  2. Cold Atom Source Containing Multiple Magneto-Optical Traps

    NASA Technical Reports Server (NTRS)

    Ramirez-Serrano, Jaime; Kohel, James; Kellogg, James; Lim, Lawrence; Yu, Nan; Maleki, Lute

    2007-01-01

    An apparatus that serves as a source of a cold beam of atoms contains multiple two-dimensional (2D) magneto-optical traps (MOTs). (Cold beams of atoms are used in atomic clocks and in diverse scientific experiments and applications.) The multiple-2D-MOT design of this cold atom source stands in contrast to the single-2D-MOT designs of prior cold atom sources of the same type. The advantage afforded by the present design is that this apparatus is smaller than prior designs.

  3. Source facies and oil families of the Malay Basin, Malaysia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Creaney, S.; Hussein, A.H.; Curry, D.J.

    1994-07-01

    The Malay Basin consists of a number of separate petroleum systems, driven exclusively by nonmarine source rocks. These systems range from lower Oligocene to middle Miocene and show a progression from lacustrine-dominated source facies in the lower Oligocene to lower Miocene section to coastal plain/delta plain coal-related sources in the lower to middle Miocene section. Two lacustrine sources are recognized in the older section, and multiple source/reservoir pairs are recognized in the younger coaly section. The lacustrine sources can be recognized using well-log analysis combined with detailed core and sidewall core sampling. Chemically, they are characterized by low pristane/phytane ratios, low oleanane contents, and a general absence of resin-derived terpanes. These sources have TOCs in the 1.0-4.0% range and hydrogen indices of up to 750. In contrast, the coal-related sources are chemically distinct, with pristane/phytane ratios of up to 8, very high oleanane contents, and often abundant resinous compounds. All these sources are generally overmature in the basin center and immature toward the basin margin. The oils sourced from all sources in the Malay Basin are generally low in sulfur and of very high economic value. Detailed biomarker analysis of the oils in the Malay Basin has allowed the recognition of families associated with the above sources and demonstrated that oil migration has been largely strata-parallel, with little cross-stratal mixing of families.

  4. Trends and Patterns in a New Time Series of Natural and Anthropogenic Methane Emissions, 1980-2000

    NASA Astrophysics Data System (ADS)

    Matthews, E.; Bruhwiler, L.; Themelis, N. J.

    2007-12-01

    We report on a new time series of methane (CH4) emissions from anthropogenic and natural sources developed for a multi-decadal methane modeling study (see the following presentation by Bruhwiler et al.). The emission series extends from 1980 through the early 2000s, with annual emissions for all countries, and has several features distinct from the source histories based on IPCC methods typically employed in modeling the global methane cycle. Fossil fuel emissions rely on 7 fuel-process emission combinations and minimize reliance on highly uncertain emission factors. Emissions from ruminant animals employ regional profiles of bovine populations that account for the influence of variable age and size demographics on emissions, and are ~15% lower than other estimates. Waste-related emissions are developed using an approach that avoids the use of data-poor emission factors and accounts for the impacts of recycling and thermal treatment of waste in diverting material from landfills, as well as CH4 capture at landfill facilities. Emissions from irrigated rice use rice-harvest areas under 3 water-management systems and a new historical data set that analyzes multiple sources for trends in water management since 1980. A time series of emissions from natural wetlands was developed by applying a multiple-regression model derived from the full process-based model of Walter with analyzed meteorology from the ERA-40 reanalysis.

  5. Multiple vesicle recycling pathways in central synapses and their impact on neurotransmission

    PubMed Central

    Kavalali, Ege T

    2007-01-01

    Short-term synaptic depression during repetitive activity is a common property of most synapses. Multiple mechanisms contribute to this rapid depression in neurotransmission including a decrease in vesicle fusion probability, inactivation of voltage-gated Ca2+ channels or use-dependent inhibition of release machinery by presynaptic receptors. In addition, synaptic depression can arise from a rapid reduction in the number of vesicles available for release. This reduction can be countered by two sources. One source is replenishment from a set of reserve vesicles. The second source is the reuse of vesicles that have undergone exocytosis and endocytosis. If the synaptic vesicle reuse is fast enough then it can replenish vesicles during a brief burst of action potentials and play a substantial role in regulating the rate of synaptic depression. In the last 5 years, we have examined the impact of synaptic vesicle reuse on neurotransmission using fluorescence imaging of synaptic vesicle trafficking in combination with electrophysiological detection of short-term synaptic plasticity. These studies have revealed that synaptic vesicle reuse shapes the kinetics of short-term synaptic depression in a frequency-dependent manner. In addition, synaptic vesicle recycling helps maintain the level of neurotransmission at steady state. Moreover, our studies showed that synaptic vesicle reuse is a highly plastic process as it varies widely among synapses and can adapt to changes in chronic activity levels. PMID:17690145

  6. Combining multiple ecosystem productivity measurements to constrain carbon uptake estimates in semiarid grasslands and shrublands

    NASA Astrophysics Data System (ADS)

    Maurer, G. E.; Krofcheck, D. J.; Collins, S. L.; Litvak, M. E.

    2016-12-01

    Recent observational and modeling studies have indicated that semiarid ecosystems are more dynamic contributors to the global carbon budget than once thought. Semiarid carbon fluxes, however, are generally small, with high interannual and spatial variability, which suggests that validating their global significance may depend on examining multiple productivity measures and their associated uncertainties and inconsistencies. We examined ecosystem productivity from eddy covariance (NEE), harvest (NPP), and terrestrial biome models (NEPm) at two very similar grassland sites and one creosote shrubland site in the Sevilleta National Wildlife Refuge of central New Mexico, USA. Our goal was to assess site and methodological correspondence in annual carbon uptake, patterns of interannual variability, and measurement uncertainty. One grassland site was a perennial carbon source, losing 30 g C m-2 per year on average, while the other two sites were carbon sources or sinks depending on the year, with average net uptake of 5 and 25 g C m-2 per year at the grassland and shrubland site, respectively. Uncertainty values for cumulative annual NEE overlapped between the three sites in most years. When combined, aboveground and belowground annual NPP measurements were 15% higher than annual NEE values and did not confirm a loss of carbon at any site in any year. Despite differences in mean site carbon balance, year-to-year changes in cumulative annual NEE and NPP were similar at all sites, with 2010 and 2013 being favorable for carbon uptake and 2011 and 2012 being unfavorable at all sites. Modeled NEPm data for a number of nearby grid cells reproduced only a fraction of the observed range in carbon uptake and its interannual variability. These three sites are highly similar in location and climate, and multiple carbon flux measurements confirm the high interannual variability in carbon flux. The exact magnitude of these fluxes, however, remains difficult to discern.

  7. Typologies of Social Support and Associations with Mental Health Outcomes Among LGBT Youth.

    PubMed

    McConnell, Elizabeth A; Birkett, Michelle A; Mustanski, Brian

    2015-03-01

    Lesbian, gay, bisexual, and transgender (LGBT) youth show increased risk for a number of negative mental health outcomes, which research has linked to minority stressors such as victimization. Further, social support promotes positive mental health outcomes for LGBT youth, and different sources of social support show differential relationships with mental health outcomes. However, little is known about how combinations of different sources of support impact mental health. In the present study, we identify clusters of family, peer, and significant other social support and then examine demographic and mental health differences by cluster in an analytic sample of 232 LGBT youth between the ages of 16 and 20 years. Using k-means cluster analysis, three social support cluster types were identified: high support (44.0% of participants), low support (21.6%), and non-family support (34.5%). A series of chi-square tests were used to examine demographic differences between these clusters, which were found for socio-economic status (SES). Regression analyses indicated that, while controlling for victimization, individuals within the three clusters showed different relationships with multiple mental health outcomes: loneliness, hopelessness, depression, anxiety, somatization, general symptom severity, and symptoms of major depressive disorder (MDD). Findings suggest the combinations of sources of support LGBT youth receive are related to their mental health. Higher SES youth are more likely to receive support from family, peers, and significant others. For most mental health outcomes, family support appears to be an especially relevant and important source of support to target for LGBT youth.
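
    The k-means clustering step described in this record can be sketched with a minimal NumPy implementation. The support scores below are invented illustrative values (one row per participant, with family, peer, and significant-other scores), not the study's data, and the deterministic initialization is a simplification:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Minimal k-means with deterministic init (first k points as centroids)."""
    centroids = X[:k].copy()
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids; keep the old one if a cluster empties
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Invented support profiles: columns = family, peer, significant-other scores.
X = np.array([
    [5.0, 5.0, 5.0], [4.8, 4.9, 4.7],  # "high support" pattern
    [1.0, 1.2, 0.9], [0.8, 1.1, 1.0],  # "low support" pattern
    [0.9, 4.8, 4.9], [1.1, 5.0, 4.6],  # "non-family support" pattern
])
centroids, labels = kmeans(X, k=3)
```

    In practice a library implementation with multiple random restarts would be preferable; the point here is only that cluster typologies fall out of distances between multi-source support profiles.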

  8. Mechanisms for hydrogen production by different bacteria during mixed-acid and photo-fermentation and perspectives of hydrogen production biotechnology.

    PubMed

    Trchounian, Armen

    2015-03-01

    H2 has a great potential as an ecologically-clean, renewable and capable fuel. It can be mainly produced via hydrogenases (Hyd) by different bacteria, especially Escherichia coli and Rhodobacter sphaeroides. The operation direction and activity of multiple Hyd enzymes in E. coli during mixed-acid fermentation might determine H2 production; some metabolic cross-talk between Hyd enzymes is proposed. Manipulating the activity of different Hyd enzymes is an effective way to enhance H2 production by E. coli in biotechnology. Moreover, a novel approach would be the use of glycerol as feedstock in fermentation processes leading to H2 production. Mixed carbon (sugar and glycerol) utilization studies enlarge the kind of organic wastes used in biotechnology. During photo-fermentation under limited nitrogen conditions, H2 production by Rh. sphaeroides is observed when carbon and nitrogen sources are supplemented. The relationship of H2 production with H(+) transport across the membrane and membrane-associated ATPase activity is shown. On the other hand, combination of carbon sources (succinate, malate) with different nitrogen sources (yeast extract, glutamate, glycine) as well as different metal (Fe, Ni, Mg) ions might regulate H2 production. All these can enhance H2 production yield by Rh. sphaeroides in biotechnology Finally, two of these bacteria might be combined to develop and consequently to optimize two stages of H2 production biotechnology with high efficiency transformation of different organic sources.

  9. The Grism Lens-Amplified Survey from Space (GLASS). VI. Comparing the Mass and Light in MACS J0416.1-2403 Using Frontier Field Imaging and GLASS Spectroscopy

    NASA Astrophysics Data System (ADS)

    Hoag, A.; Huang, K.-H.; Treu, T.; Bradač, M.; Schmidt, K. B.; Wang, X.; Brammer, G. B.; Broussard, A.; Amorin, R.; Castellano, M.; Fontana, A.; Merlin, E.; Schrabback, T.; Trenti, M.; Vulcani, B.

    2016-11-01

    We present a model using both strong and weak gravitational lensing of the galaxy cluster MACS J0416.1-2403, constrained using spectroscopy from the Grism Lens-Amplified Survey from Space (GLASS) and Hubble Frontier Fields (HFF) imaging data. We search for emission lines in known multiply imaged sources in the GLASS spectra, obtaining secure spectroscopic redshifts of 30 multiple images belonging to 15 distinct source galaxies. The GLASS spectra provide the first spectroscopic measurements for five of the source galaxies. The weak lensing signal is acquired from 884 galaxies in the F606W HFF image. By combining the weak lensing constraints with 15 multiple image systems with spectroscopic redshifts and nine multiple image systems with photometric redshifts, we reconstruct the gravitational potential of the cluster on an adaptive grid. The resulting map of total mass density is compared with a map of stellar mass density obtained from the deep Spitzer Frontier Fields imaging data to study the relative distribution of stellar and total mass in the cluster. We find that the projected stellar mass to total mass ratio, f⋆, varies considerably with the stellar surface mass density. The mean projected stellar mass to total mass ratio is ⟨f⋆⟩ = 0.009 ± 0.003 (stat.), but with a systematic error as large as 0.004-0.005, dominated by the choice of the initial mass function. We find agreement with several recent measurements of f⋆ in massive cluster environments. The lensing maps of convergence, shear, and magnification are made available to the broader community in the standard HFF format.

  10. MEASURING THE GEOMETRY OF THE UNIVERSE FROM WEAK GRAVITATIONAL LENSING BEHIND GALAXY GROUPS IN THE HST COSMOS SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, James E.; Massey, Richard J.; Leauthaud, Alexie

    2012-04-20

    Gravitational lensing can provide pure geometric tests of the structure of spacetime, for instance by determining empirically the angular diameter distance-redshift relation. This geometric test has been demonstrated several times using massive clusters which produce a large lensing signal. In this case, matter at a single redshift dominates the lensing signal, so the analysis is straightforward. It is less clear how weaker signals from multiple sources at different redshifts can be stacked to demonstrate the geometric dependence. We introduce a simple measure of relative shear which for flat cosmologies separates the effect of lens and source positions into multiplicative terms, allowing signals from many different source-lens pairs to be combined. Applying this technique to a sample of groups and low-mass clusters in the COSMOS survey, we detect a clear variation of shear with distance behind the lens. This represents the first detection of the geometric effect using weak lensing by multiple, low-mass groups. The variation of distance with redshift is measured with sufficient precision to constrain the equation of state of the universe under the assumption of flatness, equivalent to a detection of a dark energy component Ω_X at greater than 99% confidence for an equation-of-state parameter -2.5 ≤ w ≤ -0.1. For the case w = -1, we find a value for the cosmological constant density parameter Ω_Λ = 0.85^{+0.044}_{-0.19} (68% CL) and detect cosmic acceleration (q_0 < 0) at the 98% CL. We consider the systematic uncertainties associated with this technique and discuss the prospects for applying it in forthcoming weak-lensing surveys.
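
    The geometric quantity behind this test is easy to illustrate: in a flat universe the lensing efficiency D_ls/D_s reduces to 1 − χ(z_lens)/χ(z_source), which separates lens-dependent and source-dependent factors. A minimal numerical sketch (assuming a flat ΛCDM with Ωm = 0.3 purely for illustration, not the paper's fitted cosmology):

```python
import numpy as np

def comoving_distance(z, omega_m=0.3, n=4096):
    """Line-of-sight comoving distance in units of c/H0, flat LambdaCDM."""
    zs = np.linspace(0.0, z, n)
    integrand = 1.0 / np.sqrt(omega_m * (1.0 + zs) ** 3 + (1.0 - omega_m))
    # trapezoidal rule over the redshift grid
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs)))

def lensing_efficiency(z_lens, z_src):
    """D_ls / D_s in a flat universe: 1 - chi(z_lens) / chi(z_src)."""
    return 1.0 - comoving_distance(z_lens) / comoving_distance(z_src)

# Shear behind a fixed lens scales with this factor, rising as sources recede.
effs = [lensing_efficiency(0.5, zs) for zs in (0.6, 1.0, 2.0, 4.0)]
```

    The monotonic rise of this factor with source redshift behind a fixed lens is exactly the "variation of shear with distance behind the lens" the survey stacks across many lens-source pairs.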

  11. JDet: interactive calculation and visualization of function-related conservation patterns in multiple sequence alignments and structures.

    PubMed

    Muth, Thilo; García-Martín, Juan A; Rausell, Antonio; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2012-02-15

    We have implemented in a single package all the features required for extracting, visualizing and manipulating fully conserved positions, as well as those with a family-dependent conservation pattern, in multiple sequence alignments. The program allows the user, among other things, to run different methods for extracting these positions, combine the results, and visualize them in protein 3D structures and sequence spaces. JDet is a multiplatform application written in Java. It is freely available, including the source code, at http://csbg.cnb.csic.es/JDet. The package includes two of our recently developed programs for detecting functional positions in protein alignments (Xdet and S3Det), and support for other methods can be added as plug-ins. A help file and a guided tutorial for JDet are also available.
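
    A basic building block of the conservation analysis JDet visualizes is per-column Shannon entropy of an alignment: fully conserved columns score zero. The following is a toy sketch of that idea with an invented four-column alignment, not the Xdet/S3Det algorithms themselves:

```python
import math
from collections import Counter

def column_entropies(alignment):
    """Shannon entropy (bits) per column; 0.0 marks a fully conserved position."""
    n_cols = len(alignment[0])
    ents = []
    for i in range(n_cols):
        counts = Counter(seq[i] for seq in alignment)
        total = sum(counts.values())
        ent = -sum((c / total) * math.log2(c / total) for c in counts.values())
        ents.append(ent)
    return ents

# Toy alignment: columns 0 and 2 are fully conserved, columns 1 and 3 vary.
msa = ["MKLA", "MRLG", "MKLG"]
ents = column_entropies(msa)
conserved = [i for i, e in enumerate(ents) if e == 0.0]
```

    Family-dependent conservation (the harder case the package targets) additionally asks whether a column is conserved *within* subfamilies while differing *between* them; the entropy above is only the fully-conserved end of that spectrum.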

  12. Particle image velocimetry based on wavelength division multiplexing

    NASA Astrophysics Data System (ADS)

    Tang, Chunxiao; Li, Enbang; Li, Hongqiang

    2018-01-01

    This paper introduces a technical approach to wavelength division multiplexing (WDM) based particle image velocimetry (PIV). It is designed to measure transient flows with different scales of velocity by capturing multiple particle images in one exposure. These images are separated by different wavelengths, and thus the pulse separation time is not influenced by the frame rate of the camera. A triple-pulsed PIV system has been created in order to prove the feasibility of WDM-PIV. This is demonstrated in a sieve plate extraction column model by simultaneously measuring the fast flow in the downcomer and the slow vortices inside the plates. A simple displacement/velocity field combination method has also been developed. The main constraints on WDM-PIV are the limited wavelength choices of available light sources and cameras. The use of the WDM technique represents a feasible way to realize multiple-pulsed PIV.

  13. Magnetour: Surfing planetary systems on electromagnetic and multi-body gravity fields

    NASA Astrophysics Data System (ADS)

    Lantoine, Gregory; Russell, Ryan P.; Anderson, Rodney L.; Garrett, Henry B.

    2017-09-01

    A comprehensive tour of the complex outer planet systems is a central goal in space science. However, orbiting multiple moons of the same planet would be prohibitively expensive using traditional propulsion and power technologies. In this paper, a new mission concept, named Magnetour, is presented to facilitate the exploration of outer planet systems and address both power and propulsion challenges. This approach would enable a single spacecraft to orbit and travel between multiple moons of an outer planet without significant propellant or an onboard power source. To achieve this free-lunch 'Grand Tour', Magnetour exploits the unexplored combination of magnetic and multi-body gravitational fields of planetary systems, with a unique focus on using a bare electrodynamic tether for power and propulsion. Preliminary results indicate that the Magnetour concept is sound and potentially highly promising at Jupiter.

  14. Trait synergisms and the rarity, extirpation, and extinction risk of desert fishes.

    PubMed

    Olden, Julian D; Poff, N LeRoy; Bestgen, Kevin R

    2008-03-01

    Understanding the causes and consequences of species extinctions is a central goal in ecology. Faced with the difficult task of identifying those species with the greatest need for conservation, ecologists have turned to using predictive suites of ecological and life-history traits to provide reasonable estimates of species extinction risk. Previous studies have linked individual traits to extinction risk, yet the nonadditive contribution of multiple traits to the entire extinction process, from species rarity to local extirpation to global extinction, has not been examined. This study asks whether trait synergisms predispose native fishes of the Lower Colorado River Basin (USA) to risk of extinction through their effects on rarity and local extirpation and their vulnerability to different sources of threat. Fish species with "slow" life histories (e.g., large body size, long life, and delayed maturity), minimal parental care to offspring, and specialized feeding behaviors are associated with smaller geographic distribution, greater frequency of local extirpation, and higher perceived extinction risk than that expected by simple additive effects of traits in combination. This supports the notion that trait synergisms increase the susceptibility of native fishes to multiple stages of the extinction process, thus making them prone to the multiple jeopardies resulting from a combination of fewer individuals, narrow environmental tolerances, and long recovery times following environmental change. Given that particular traits, some acting in concert, may differentially predispose native fishes to rarity, extirpation, and extinction, we suggest that management efforts in the Lower Colorado River Basin should be congruent with the life-history requirements of multiple species over large spatial and temporal scales.

  15. Rigorous ILT optimization for advanced patterning and design-process co-optimization

    NASA Astrophysics Data System (ADS)

    Selinidis, Kosta; Kuechler, Bernd; Cai, Howard; Braam, Kyle; Hoppe, Wolfgang; Domnenko, Vitaly; Poonawala, Amyn; Xiao, Guangming

    2018-03-01

    Despite the large difficulties involved in extending 193i multiple patterning and the slow ramp of EUV lithography to full manufacturing readiness, the pace of development for new technology node variations has been accelerating. Multiple new variations of new and existing technology nodes have been introduced for a range of device applications, each variation with at least a few new process integration methods, layout constructs and/or design rules. This has led to a strong increase in the demand for predictive technology tools that can be used to quickly guide important patterning and design co-optimization decisions. In this paper, we introduce a novel hybrid predictive patterning method combining two patterning technologies which have each individually been widely used for process tuning, mask correction and process-design co-optimization. These technologies are rigorous lithography simulation and inverse lithography technology (ILT). Rigorous lithography simulation has been extensively used for process development/tuning, lithography tool user setup, photoresist hot-spot detection, photoresist-etch interaction analysis, lithography-TCAD interactions/sensitivities, source optimization and basic lithography design rule exploration. ILT has been extensively used in a range of lithographic areas including logic hot-spot fixing, memory layout correction, dense memory cell optimization, assist feature (AF) optimization, source optimization, complex patterning design rules and design-technology co-optimization (DTCO). The combined optimization capability of these two technologies will therefore have a wide range of useful applications. We investigate the benefits of the new functionality for a few of these advanced applications, including correction for photoresist top loss and resist scumming hotspots.

  16. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources while dealing with the inherent uncertainty, incompleteness and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network based framework called the complex adaptive Bayesian Knowledge Base (caBKB). The caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  17. Common source-multiple load vs. separate source-individual load photovoltaic system

    NASA Technical Reports Server (NTRS)

    Appelbaum, Joseph

    1989-01-01

    A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.

  18. Use of Multi-class Empirical Orthogonal Function for Identification of Hydrogeological Parameters and Spatiotemporal Pattern of Multiple Recharges in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Huang, C. L.; Hsu, N. S.; Yeh, W. W. G.; Hsieh, I. H.

    2017-12-01

    This study develops an innovative calibration method for regional groundwater modeling using multi-class empirical orthogonal functions (EOFs). The developed method is an iterative approach. Prior to carrying out the iterative procedures, the groundwater storage hydrographs associated with the observation wells are calculated. The combined multi-class EOF amplitudes and EOF expansion coefficients of the storage hydrographs are then used to compute the initial guess of the temporal and spatial patterns of the multiple recharges. The initial guess of the hydrogeological parameters is also assigned according to in-situ pumping experiments. The recharges include net rainfall recharge and boundary recharge, and the hydrogeological parameters are riverbed leakage conductivity, horizontal hydraulic conductivity, vertical hydraulic conductivity, storage coefficient, and specific yield. The first step of the iterative algorithm is to run the numerical model (i.e., MODFLOW) with the initial guess or adjusted values of the recharges and parameters. Second, in order to determine the best EOF combination of the error storage hydrographs for determining the correction vectors, the objective function is devised as minimizing the root mean square error (RMSE) of the simulated storage hydrographs. The error storage hydrographs are the differences between the storage hydrographs computed from observed and simulated groundwater level fluctuations. Third, the values of the recharges and parameters are adjusted, and the iterative procedures are repeated until the stopping criterion is reached. The established methodology was applied to the groundwater system of the Ming-Chu Basin, Taiwan. The study period is from January 1st to December 2nd, 2012. Results showed that the optimal EOF combination for the multiple recharges and hydrogeological parameters can decrease the RMSE of the simulated storage hydrographs dramatically within three calibration iterations.
This demonstrates that the iterative approach using EOF techniques can capture the groundwater flow tendency and detect the correction vectors of the simulated error sources. Hence, the established EOF-based methodology can effectively and accurately identify the multiple recharges and hydrogeological parameters.
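
    At its core, an EOF analysis of the storage hydrographs is an SVD of the time-mean-removed data matrix: the right singular vectors give the spatial patterns (EOFs) and the scaled left singular vectors give the temporal expansion coefficients. A minimal sketch on synthetic hydrographs (invented data, not the Ming-Chu Basin records, and without the multi-class and iterative-calibration machinery):

```python
import numpy as np

# Synthetic "storage hydrographs": rows = time steps, columns = observation wells.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 120)
signal = np.outer(np.sin(t), [1.0, 0.8, -0.5])        # one dominant recharge mode
data = signal + 0.05 * rng.standard_normal((120, 3))  # plus observation noise

# EOF analysis = SVD of the time-mean-removed anomaly matrix.
anom = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                           # rows: spatial patterns (EOFs)
pcs = u * s                         # columns: temporal expansion coefficients
var_frac = s ** 2 / np.sum(s ** 2)  # fraction of variance per mode
```

    The leading mode's dominance in `var_frac` is what lets a small set of EOF amplitudes and expansion coefficients stand in for the full hydrograph field when building the initial guess and correction vectors.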

  19. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array.

    PubMed

    Yan, Gang; Zhou, Li

    2018-02-21

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from the view of image processing. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to consider the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimal criterion using minimum Shannon entropy is used to find the image with the identified AE source locations and occurrence time that mostly approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method.
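
    The minimum-Shannon-entropy criterion used to pick the best-focused migration image can be illustrated directly: treating the absolute image as a probability map, a sharply focused image (energy concentrated at the AE source location) has lower entropy than a diffuse one. A toy sketch, assuming this simple normalization rather than the authors' exact formulation:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of an image treated as a probability map (lower = sharper)."""
    p = np.abs(img).ravel()
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Energy concentrated at one pixel (a well-localized AE source image) scores
# lower entropy than the same total energy smeared across the whole map.
focused = np.zeros((8, 8))
focused[3, 4] = 1.0
diffuse = np.ones((8, 8))
```

    In the paper's scheme, the ABC optimizer searches candidate occurrence times, and this entropy serves as the fitness: the migration image that best collapses onto point-like sources wins.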

  20. Rhythmic entrainment source separation: Optimizing analyses of neural responses to rhythmic sensory stimulation.

    PubMed

    Cohen, Michael X; Gulbinaite, Rasa

    2017-02-15

    Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in the absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
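
    Denoising source separation of this kind typically reduces to a generalized eigendecomposition: find the channel weighting that maximizes power in a narrow band at the stimulation frequency relative to neighboring bands. The sketch below is a simplified stand-in for RESS (invented four-channel forward model, crude FFT band-pass, NumPy-only generalized eigensolver via Cholesky whitening), not the authors' published Matlab code:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 500
t = np.arange(0, 10, 1 / fs)
mix = np.array([0.9, 0.5, -0.3, 0.1])   # invented forward model (channel weights)
ssep = np.sin(2 * np.pi * 7 * t)        # 7 Hz steady-state response
data = np.outer(mix, ssep) + 0.5 * rng.standard_normal((4, t.size))

def narrowband(x, f0, fs, bw=1.0):
    """Crude FFT-domain band-pass, adequate for this illustration only."""
    X = np.fft.rfft(x, axis=1)
    freqs = np.fft.rfftfreq(x.shape[1], 1 / fs)
    X[:, np.abs(freqs - f0) > bw] = 0.0
    return np.fft.irfft(X, n=x.shape[1], axis=1)

# Covariance at the stimulation frequency vs. flanking frequencies.
S = np.cov(narrowband(data, 7, fs))
R = np.cov(narrowband(data, 5, fs) + narrowband(data, 9, fs))
R += 1e-6 * np.trace(R) / R.shape[0] * np.eye(R.shape[0])  # ridge for stability

# Generalized eigenproblem S w = lambda R w, solved by Cholesky whitening.
Linv = np.linalg.inv(np.linalg.cholesky(R))
evals, V = np.linalg.eigh(Linv @ S @ Linv.T)  # ascending eigenvalues
w = Linv.T @ V[:, -1]                         # filter with largest S/R ratio
component = w @ data                          # single RESS-like time course
```

    The recovered component tracks the underlying 7 Hz response far better than any single channel, which is the practical payoff of the multivariate filter over electrode selection.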

  1. ShiftNMFk 1.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian S.; Vesselinov, Velimir V.; Stanev, Valentin

    The ShiftNMFk 1.2 code, or as we call it, GreenNMFk, represents a hybrid algorithm combining unsupervised adaptive machine learning and a Green's-function inverse method. GreenNMFk allows efficient, high-performance de-mixing and feature extraction of a multitude of nonnegative signals that change their shape while propagating through the medium. The signals are mixed and recorded by a network of uncorrelated sensors. The code couples Non-negative Matrix Factorization (NMF) and the inverse-analysis Green's functions method. GreenNMFk synergistically performs decomposition of the recorded mixtures, finds the number of unknown sources, and uses the Green's function of the governing partial differential equation to identify the unknown sources and their characteristics. GreenNMFk can be applied directly to any problem controlled by a known parabolic partial differential equation where mixtures of an unknown number of sources are measured at multiple locations. The full GreenNMFk method is the subject of LANL U.S. patent application S133364.000 (August 2017). The ShiftNMFk 1.2 version here is a toy version of this method that can work with a limited number of unknown sources (4 or fewer).
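
    The NMF half of this coupling can be sketched with the classic Lee-Seung multiplicative updates on a synthetic nonnegative mixture. The source shapes and mixing matrix below are invented for illustration and carry none of the Green's-function or shift machinery:

```python
import numpy as np

def nmf(V, k, n_iter=1000, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - W H||_F."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Two hypothetical nonnegative source signatures mixed at 5 sensors.
x = np.linspace(0, 1, 200)
sources = np.vstack([np.exp(-((x - 0.3) / 0.05) ** 2),
                     np.exp(-((x - 0.7) / 0.05) ** 2)])
mixing = np.array([[1.0, 0.2], [0.5, 0.5], [0.1, 1.0], [0.8, 0.4], [0.3, 0.9]])
V = mixing @ sources
W, H = nmf(V, k=2)
recon_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

    Estimating the *number* of sources (the "k" in NMFk) is done in the full method by factorizing for several candidate k values and checking the stability of the recovered factors; the sketch fixes k = 2 for brevity.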

  2. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array

    PubMed Central

    Zhou, Li

    2018-01-01

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from an image-processing perspective. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to account for the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimality criterion of minimum Shannon entropy is used to find the image whose identified AE source locations and occurrence time most closely approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method. PMID:29466310
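    The minimum-Shannon-entropy criterion exploits the fact that a well-focused migrated image concentrates energy in few pixels and therefore has low entropy. A pure-Python sketch of that criterion (the candidate "images" here are toy stand-ins, not f-k migration output):

```python
import math

def shannon_entropy(image):
    """Entropy of the normalized pixel-energy distribution of a 2-D image."""
    flat = [abs(p) for row in image for p in row]
    total = sum(flat)
    probs = [p / total for p in flat if p > 0]
    return -sum(p * math.log2(p) for p in probs)

# A focused image (energy concentrated at one "source" pixel) vs a defocused one.
focused   = [[0, 0, 0], [0, 9, 0], [0, 0, 1]]
defocused = [[1, 1, 1], [1, 2, 1], [1, 1, 1]]

e_focused, e_defocused = shannon_entropy(focused), shannon_entropy(defocused)
```

    In the paper's scheme the ABC algorithm searches over candidate occurrence times, and the migrated image with minimum entropy pinpoints the AE sources.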

  3. A Self-Adaptive Dynamic Recognition Model for Fatigue Driving Based on Multi-Source Information and Two Levels of Fusion

    PubMed Central

    Sun, Wei; Zhang, Xiaorui; Peeta, Srinivas; He, Xiaozheng; Li, Yongfu; Zhu, Senlai

    2015-01-01

    To improve the effectiveness and robustness of fatigue driving recognition, a self-adaptive dynamic recognition model is proposed that incorporates information from multiple sources and involves two sequential levels of fusion, constructed at the feature level and the decision level. Compared with existing models, the proposed model introduces a dynamic basic probability assignment (BPA) to the decision-level fusion such that the weight of each feature source can change dynamically with the real-time fatigue feature measurements. Further, the proposed model incorporates the fatigue state from the previous time step into the decision-level fusion to improve the robustness of the fatigue driving recognition. An improved correction strategy for the BPA is also proposed to accommodate the decision conflict caused by external disturbances. Results from field experiments demonstrate that the effectiveness and robustness of the proposed model are better than those of models based on a single fatigue feature and/or single-source information fusion, especially when the most effective fatigue features are used in the proposed model. PMID:26393615
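    Decision-level fusion with basic probability assignments typically follows Dempster's rule of combination, which the correction strategy above refines when sources conflict. A pure-Python sketch of the standard rule for two BPAs over the frame {fatigued, alert} (the mass values are illustrative, not the paper's):

```python
from itertools import product

def dempster(m1, m2):
    """Combine two BPAs (dicts: frozenset hypothesis -> mass) via Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

F, A = frozenset({"fatigued"}), frozenset({"alert"})
theta = F | A                            # the whole frame (ignorance)

eye_feature   = {F: 0.6, A: 0.2, theta: 0.2}   # e.g. eyelid-closure evidence
mouth_feature = {F: 0.5, A: 0.3, theta: 0.2}   # e.g. yawning evidence

fused = dempster(eye_feature, mouth_feature)
```

    The division by (1 − conflict) renormalizes away conflicting mass; it is exactly this step that misbehaves under strong external disturbances, motivating corrected BPA strategies like the one in the paper.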

  4. Total-dose radiation effects data for semiconductor devices, volume 3

    NASA Technical Reports Server (NTRS)

    Price, W. E.; Martin, K. E.; Nichols, D. K.; Gauthier, M. K.; Brown, S. F.

    1982-01-01

    Volume 3 of this three-volume set provides a detailed analysis of the data in Volumes 1 and 2, most of which was generated for the Galileo Orbiter Program in support of NASA space programs. Volume 1 includes total ionizing dose radiation test data on diodes, bipolar transistors, field effect transistors, and miscellaneous discrete solid-state devices. Volume 2 includes similar data on integrated circuits and a few large-scale integrated circuits. The data of Volumes 1 and 2 are combined in graphic format in Volume 3 to provide a comparison of radiation sensitivities of devices of a given type and different manufacturer, a comparison of multiple tests for a single date code, a comparison of multiple tests for a single lot, and a comparison of radiation sensitivities vs time (date codes). All data were generated using a steady-state 2.5-MeV electron source (Dynamitron) or a Cobalt-60 gamma ray source. The data that compose Volume 3 represent 26 different device types, 224 tests, and a total of 1040 devices. A comparison of the effects of steady-state electrons and Cobalt-60 gamma rays is also presented.

  5. Infrared Multiple Photon Dissociation Spectroscopy Of Metal Cluster-Adducts

    NASA Astrophysics Data System (ADS)

    Cox, D. M.; Kaldor, A.; Zakin, M. R.

    1987-01-01

    Recent development of the laser vaporization technique combined with mass-selective detection has made possible new studies of the fundamental chemical and physical properties of unsupported transition metal clusters as a function of the number of constituent atoms. A variety of experimental techniques have been developed in our laboratory to measure ionization threshold energies, magnetic moments, and gas phase reactivity of clusters. However, studies have so far been unable to determine the cluster structure or the chemical state of chemisorbed species on gas phase clusters. The application of infrared multiple photon dissociation (IRMPD) to obtain the IR absorption properties of metal cluster-adsorbate species in a molecular beam is described here. Specifically, using a high power, pulsed CO2 laser as the infrared source, the IRMPD spectrum for methanol chemisorbed on small iron clusters is measured as a function of the number of both iron atoms and methanols in the complex for different methanol isotopes. Both the feasibility and potential utility of IRMPD for characterizing metal cluster-adsorbate interactions are demonstrated. The method is generally applicable to any cluster or cluster-adsorbate system, dependent only upon the availability of appropriate high power infrared sources.

  6. Moving Environmental Justice Indoors: Understanding Structural Influences on Residential Exposure Patterns in Low-Income Communities

    PubMed Central

    Zota, Ami R.; Fabian, M. Patricia; Chahine, Teresa; Julien, Rhona; Spengler, John D.; Levy, Jonathan I.

    2011-01-01

    Objectives. The indoor environment has not been fully incorporated into the environmental justice dialogue. To inform strategies to reduce disparities, we developed a framework to identify the individual and place-based drivers of indoor environment quality. Methods. We reviewed empirical evidence of socioeconomic disparities in indoor exposures and key determinants of these exposures for air pollutants, lead, allergens, and semivolatile organic compounds. We also used an indoor air quality model applied to multifamily housing to illustrate how nitrogen dioxide (NO2) and fine particulate matter (PM2.5) vary as a function of factors known to be influenced by socioeconomic status. Results. Indoor concentrations of multiple pollutants are elevated in low-socioeconomic status households. Differences in these exposures are driven by the combined influences of indoor sources, outdoor sources, physical structures, and residential activity patterns. Simulation models confirmed indoor sources’ importance in determining indoor NO2 and PM2.5 exposures and showed the influence of household-specific determinants. Conclusions. Both theoretical models and empirical evidence emphasized that disparities in indoor environmental exposure can be significant. Understanding key determinants of multiple indoor exposures can aid in developing policies to reduce these disparities. PMID:21836112
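    The kind of indoor air quality simulation described above usually rests on a steady-state single-zone mass balance, C_in = (P·a·C_out + E/V) / (a + k), where a is the air-exchange rate, P the envelope penetration factor, E the indoor emission rate, V the zone volume, and k the first-order indoor loss rate. A pure-Python sketch with illustrative parameter values (not the study's):

```python
def indoor_concentration(c_out, air_exchange, penetration, emission, volume, loss):
    """Steady-state indoor concentration from a single-zone mass balance.

    c_out        outdoor concentration (ug/m3)
    air_exchange air-exchange rate a (1/h)
    penetration  fraction of outdoor pollutant penetrating the envelope
    emission     indoor source strength E (ug/h)
    volume       zone volume V (m3)
    loss         first-order indoor loss rate k (1/h, deposition/decay)
    """
    return (penetration * air_exchange * c_out + emission / volume) / (air_exchange + loss)

# Identical outdoor PM2.5, but a smaller, leakier unit with an active indoor source
# (e.g. unvented cooking) sees a much higher indoor concentration.
with_source = indoor_concentration(c_out=15, air_exchange=1.5, penetration=0.8,
                                   emission=600, volume=150, loss=0.4)
no_source   = indoor_concentration(c_out=15, air_exchange=0.5, penetration=0.8,
                                   emission=0, volume=400, loss=0.4)
```

    Because indoor source strength, ventilation, and housing volume all vary with socioeconomic status, the same outdoor concentration can map to very different indoor exposures, which is the disparity mechanism the article highlights.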

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase

    A number of Department of Energy (DOE) science applications, involving exascale computing systems and large experimental facilities, are expected to generate large volumes of data, in the range of petabytes to exabytes, which will be transported over wide-area networks for the purpose of storage, visualization, and analysis. The objectives of this proposal are to (1) develop and test the component technologies and their synthesis methods to achieve source-to-sink high-performance flows, and (2) develop tools that provide these capabilities through simple interfaces to users and applications. In terms of the former, we propose to develop (1) optimization methods that align and transition multiple storage flows to multiple network flows on multicore, multibus hosts; and (2) edge and long-haul network path realization and maintenance using advanced provisioning methods including OSCARS and OpenFlow. We also propose synthesis methods that combine these individual technologies to compose high-performance flows using a collection of constituent storage-network flows, and realize them across the storage and local network connections as well as long-haul connections. We propose to develop automated user tools that profile the hosts, storage systems, and network connections; compose the source-to-sink complex flows; and set up and maintain the needed network connections.

  8. Presentation Extensions of the SOAP

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Stodden, David; Coggi, John

    2009-01-01

    A set of extensions of the Satellite Orbit Analysis Program (SOAP) enables simultaneous and/or sequential presentation of information from multiple sources. SOAP is used in the aerospace community as a means of collaborative visualization and analysis of data on planned spacecraft missions. The following definitions describe the display modalities of SOAP as now extended: a) "View" signifies an animated three-dimensional (3D) scene, a two-dimensional still image, a plot of numerical data, or any other visible display derived from a computational simulation or other data source; b) "Viewport" signifies a rectangular portion of a computer-display window containing a view; c) "Palette" signifies a collection of one or more viewports configured for simultaneous (split-screen) display in the same window; d) "Slide" signifies a palette with a beginning and ending time and an animation time step; and e) "Presentation" signifies a prescribed sequence of slides. For example, multiple 3D views from different locations can be crafted for simultaneous display and combined with numerical plots and other representations of data for both qualitative and quantitative analysis. The resulting sets of views can be temporally sequenced to convey visual impressions of a sequence of events for a planned mission.
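    The view/viewport/palette/slide/presentation terminology describes a straightforward containment hierarchy; a minimal sketch in Python (type and field names are illustrative, not SOAP's actual API):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class View:
    kind: str                 # e.g. "3d-scene", "plot", "image"
    source: str               # simulation or data source being rendered

@dataclass
class Viewport:
    view: View
    x: int; y: int; width: int; height: int   # rectangle within the window

@dataclass
class Palette:
    viewports: List[Viewport] = field(default_factory=list)   # split-screen layout

@dataclass
class Slide:
    palette: Palette
    t_start: float            # beginning time
    t_end: float              # ending time
    time_step: float          # animation step

@dataclass
class Presentation:
    slides: List[Slide] = field(default_factory=list)         # played in sequence

orbit = Viewport(View("3d-scene", "spacecraft-orbit"), 0, 0, 640, 480)
elev  = Viewport(View("plot", "elevation-angle"), 640, 0, 320, 480)
show  = Presentation([Slide(Palette([orbit, elev]), t_start=0.0, t_end=3600.0, time_step=60.0)])
```

    A 3D scene and a numerical plot share one palette for side-by-side qualitative and quantitative analysis, and sequencing several such slides yields the temporal "presentation" the extensions introduce.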

  9. Profiling Students' Multiple Source Use by Question Type

    ERIC Educational Resources Information Center

    List, Alexandra; Grossnickle, Emily M.; Alexander, Patricia A.

    2016-01-01

    The present study examined undergraduate students' multiple source use in response to two different types of academic questions, one discrete and one open-ended. Participants (N = 240) responded to two questions using a library of eight digital sources, varying in source type (e.g., newspaper article) and reliability (e.g., authors' credentials).…

  10. The Multiple Source Effect and Synthesized Speech: Doubly-Disembodied Language as a Conceptual Framework

    ERIC Educational Resources Information Center

    Lee, Kwan Min; Nass, Clifford

    2004-01-01

    Two experiments examine the effect of multiple synthetic voices in an e-commerce context. In Study 1, participants (N=40) heard five positive reviews about a book from five different synthetic voices or from a single synthetic voice. Consistent with the multiple source effect, results showed that participants hearing multiple synthetic voices…

  11. Studying extragalactic background fluctuations with the Cosmic Infrared Background ExpeRiment 2 (CIBER-2)

    NASA Astrophysics Data System (ADS)

    Lanz, Alicia; Arai, Toshiaki; Battle, John; Bock, James; Cooray, Asantha; Hristov, Viktor; Korngut, Phillip; Lee, Dae Hee; Mason, Peter; Matsumoto, Toshio; Matsuura, Shuji; Morford, Tracy; Onishi, Yosuke; Shirahata, Mai; Tsumura, Kohji; Wada, Takehiko; Zemcov, Michael

    2014-08-01

    Fluctuations in the extragalactic background light trace emission from the history of galaxy formation, including the emission from the earliest sources from the epoch of reionization. A number of recent near-infrared measurements show excess spatial power at large angular scales inconsistent with models of z < 5 emission from galaxies. These measurements have been interpreted as arising from either redshifted stellar and quasar emission from the epoch of reionization, or the combined intra-halo light from stars thrown out of galaxies during merging activity at lower redshifts. Though astrophysically distinct, both interpretations arise from faint, low surface brightness source populations that are difficult to detect except by statistical approaches using careful observations with suitable instruments. The key to determining the source of these background anisotropies will be wide-field imaging measurements spanning multiple bands from the optical to the near-infrared. The Cosmic Infrared Background ExpeRiment 2 (CIBER-2) will measure spatial anisotropies in the extragalactic infrared background caused by cosmological structure using six broad spectral bands. The experiment uses three 2048 x 2048 Hawaii-2RG near-infrared arrays in three cameras coupled to a single 28.5 cm telescope housed in a reusable sounding rocket-borne payload. A small portion of each array will also be combined with a linear-variable filter to make absolute measurements of the spectrum of the extragalactic background with high spatial resolution for deep subtraction of Galactic starlight. The large field of view and multiple spectral bands make CIBER-2 unique in its sensitivity to fluctuations predicted by models of lower limits on the luminosity of the first stars and galaxies and in its ability to distinguish between primordial and foreground anisotropies.
In this paper the scientific motivation for CIBER-2 and details of its first flight instrumentation will be discussed, including detailed designs of the mechanical, cryogenic, and electrical systems. Plans for the future will also be presented.

  12. THE CHANDRA COSMOS SURVEY. III. OPTICAL AND INFRARED IDENTIFICATION OF X-RAY POINT SOURCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Civano, F.; Elvis, M.; Aldcroft, T.

    2012-08-01

    The Chandra COSMOS Survey (C-COSMOS) is a large, 1.8 Ms, Chandra program that has imaged the central 0.9 deg^2 of the COSMOS field down to limiting depths of 1.9 × 10^-16 erg cm^-2 s^-1 in the soft (0.5-2 keV) band, 7.3 × 10^-16 erg cm^-2 s^-1 in the hard (2-10 keV) band, and 5.7 × 10^-16 erg cm^-2 s^-1 in the full (0.5-10 keV) band. In this paper we report the i, K, and 3.6 μm identifications of the 1761 X-ray point sources. We use the likelihood ratio technique to derive the association of optical/infrared counterparts for 97% of the X-ray sources. For most of the remaining 3%, the presence of multiple counterparts or the faintness of the possible counterpart prevented a unique association. For only 10 X-ray sources we were not able to associate a counterpart, mostly due to the presence of a very bright field source close by. Only two sources are truly empty fields. The full catalog, including spectroscopic and photometric redshifts and classification described here in detail, is available online. Making use of the large number of X-ray sources, we update the 'classic locus' of active galactic nuclei (AGNs) defined 20 years ago in soft X-ray surveys and define a new locus containing 90% of the AGNs in the survey with full-band luminosity >10^42 erg s^-1. We present the linear fit between the total i-band magnitude and the X-ray flux in the soft and hard bands, drawn over two orders of magnitude in X-ray flux, obtained using the combined C-COSMOS and XMM-COSMOS samples. We focus on the X-ray to optical flux ratio (X/O) and we test its known correlation with redshift and luminosity, and a recently introduced anti-correlation with the concentration index (C). 
We find a strong anti-correlation (though the dispersion is of the order of 0.5 dex) between X/O computed in the hard band and C, and that 90% of the obscured AGNs in the sample with morphological information live in galaxies with regular morphology (bulgy and disky/spiral), suggesting that secular processes govern a significant fraction of the black hole growth at X-ray luminosities of 10^43-10^44.5 erg s^-1. We also investigate the degree of obscuration of the sample using the hardness ratio, and we compare the X-ray color with the near-infrared to optical color.
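    The X-ray to optical flux ratio used in such surveys is conventionally defined as log(X/O) = log10(f_X) + m/2.5 + C, where m is the optical magnitude and C a band-dependent zero-point constant. A pure-Python sketch (the zero-point below is an assumed, illustrative value; the survey's actual constant depends on the i-band calibration):

```python
import math

def log_xo_ratio(flux_x, mag_opt, zero_point=5.91):
    """log10 of the X-ray-to-optical flux ratio.

    flux_x     X-ray flux in erg cm^-2 s^-1
    mag_opt    optical (e.g. i-band) magnitude
    zero_point band-dependent constant converting magnitude to flux
               (5.91 here is an illustrative placeholder, not the paper's value)
    """
    return math.log10(flux_x) + mag_opt / 2.5 + zero_point

# The classic AGN locus corresponds roughly to -1 < log(X/O) < 1.
xo = log_xo_ratio(flux_x=5e-15, mag_opt=21.5)
```

    Sources falling well outside that band (very high or very low X/O) are candidates for obscured AGNs or passive galaxies, which is why the locus is a useful classification tool.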

  13. Electrophysiology quantitative electroencephalography/low resolution brain electromagnetic tomography functional brain imaging (QEEG LORETA): Case report: Subjective idiopathic tinnitus - predominantly central type severe disabling tinnitus.

    PubMed

    Shulman, Abraham; Goldstein, Barbara

    2014-01-01

    The clinical significance of QEEG LORETA data analysis performed sequentially within 6 months is presented in a case report of a predominantly central type severe disabling subjective idiopathic tinnitus (SIT) before and following treatment. The QEEG LORETA data are reported as Z-scores of z = ±2.54, p < 0.013. The focus is on demonstrating patterns of brain wave oscillations reflecting multiple brain functions in multiple ROIs in the presence of the tinnitus signal (SIT). The patterns of brain activity across high, middle, and low frequencies are hypothesized to reflect connectivities within and between multiple neuronal networks in the brain. The LORETA source-localization images of non-auditory ROIs at the maximal abnormality in the very narrow band frequency spectra (24.21 Hz) showed the mathematically most probable underlying sources of the scalp-recorded data to be greatest in the mid-cingulate, bilateral precuneus, cingulate, and bilateral caudate nucleus. Clinical correlation of the data with the history and course of the SIT is considered an objective demonstration of the affective, behavioral, and emotional components of the SIT. The correlation of caudate activity and SIT as the traumatic event with the clinical course and diagnosis of PTSD is discussed. 
The clinical translation for patient care is highlighted in a SIT patient with multiple comorbidities by translation of QEEG/LORETA electrophysiologic data, as an adjunct to: 1) provide objective patterns of brain wave activity in multiple regions of interest (ROIs) reflecting multiple brain functions, in response to and in the presence of the tinnitus signal, recorded from the scalp and analyzed with the metrics of absolute power, relative power, asymmetry, and coherence, for the subjective tinnitus complaint (SIT); 2) provide an increase in the accuracy of the tinnitus diagnosis; 3) assess/monitor treatment efficacy; 4) provide a rationale for selection of a combined tinnitus-targeted therapy of behavioral, pharmacologic, and sound therapy modalities of treatment attempting tinnitus relief; 5) provide insight into the medical significance of the SIT; 6) attempt discriminant function analysis for identification of a particular diagnostic clinical category of CNS neuropsychiatric disease; and 7) attempt to translate what is known of the neuroscience of sensation, brain function, and QEEG/LORETA source localization for the etiology and prognosis of the individual SIT patient.

  14. The Polarimeter for Relativistic Astrophysical X-ray Sources

    NASA Astrophysics Data System (ADS)

    Jahoda, Keith; Kallman, Timothy R.; Kouveliotou, Chryssa; Angelini, Lorella; Black, J. Kevin; Hill, Joanne E.; Jaeger, Theodore; Kaaret, Philip E.; Markwardt, Craig B.; Okajima, Takashi; Petre, Robert; Schnittman, Jeremy; Soong, Yang; Strohmayer, Tod E.; Tamagawa, Toru; Tawara, Yuzuru

    2016-07-01

    The Polarimeter for Relativistic Astrophysical X-ray Sources (PRAXyS) is one of three Small Explorer (SMEX) missions selected by NASA for Phase A study, with a launch date in 2020. The PRAXyS Observatory exploits grazing incidence X-ray mirrors and Time Projection Chamber Polarimeters capable of measuring the linear polarization of cosmic X-ray sources in the 2-10 keV band. PRAXyS combines well-characterized instruments with spacecraft rotation to ensure low systematic errors. The PRAXyS payload is developed at the Goddard Space Flight Center with the Johns Hopkins University Applied Physics Laboratory, University of Iowa, and RIKEN (JAXA) collaborating on the Polarimeter Assembly. The LEOStar-2 spacecraft bus is developed by Orbital ATK, which also supplies the extendable optical bench that enables the Observatory to be compatible with a Pegasus class launch vehicle. A nine-month primary mission will provide sensitive observations of multiple black hole and neutron star sources, where theory predicts polarization is a strong diagnostic, as well as exploratory observations of other high energy sources. The primary mission data will be released to the community rapidly and a Guest Observer extended mission will be vigorously proposed.

  15. Analysis and Application of Microgrids

    NASA Astrophysics Data System (ADS)

    Yue, Lu

    New trends of generating electricity locally and utilizing non-conventional or renewable energy sources have attracted increasing interest due to the gradual depletion of conventional fossil fuel energy sources. The new type of power generation is called Distributed Generation (DG), and the energy sources utilized by Distributed Generation are termed Distributed Energy Resources (DERs). With DGs embedded in them, distribution networks evolve from passive networks to active networks enabling bidirectional power flows. Further incorporating flexible and intelligent controllers and employing future technologies, active distribution networks will evolve into a Microgrid. A Microgrid is a small-scale, low voltage Combined Heat and Power (CHP) supply network designed to supply electrical and heat loads for a small community. To further implement Microgrids, a sophisticated Microgrid Management System must be integrated. However, because a Microgrid integrates multiple DERs and is likely to be deregulated, the ability to perform real-time OPF and economic dispatch over a fast, advanced communication network is necessary. In this thesis, first, problems such as power system modelling, power flow solving, and power system optimization are studied. Then, Distributed Generation and Microgrids are studied and reviewed, including a comprehensive review of current distributed generation technologies and Microgrid Management Systems. Finally, a computer-based AC optimization method which minimizes the total transmission loss and generation cost of a Microgrid is proposed, and a wireless communication scheme based on synchronized Code Division Multiple Access (sCDMA) is proposed. The algorithm is tested with a 6-bus power system and a 9-bus power system.
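    A classic building block of the economic dispatch mentioned above is the equal incremental-cost condition: with quadratic generator costs a·P² + b·P, demand is split so every unit runs at the same marginal cost λ. A pure-Python sketch with illustrative cost coefficients (capacity limits and network losses, which the thesis's AC method handles, are omitted):

```python
def economic_dispatch(demand, gens):
    """Equal incremental-cost dispatch for generators with cost a*P^2 + b*P.

    gens: list of (a, b) quadratic cost coefficients.
    Returns per-generator outputs summing to the demand (limits ignored).
    """
    # Marginal cost of unit i: 2*a_i*P_i + b_i = lambda  =>  P_i = (lambda - b_i)/(2*a_i)
    inv = [1.0 / (2.0 * a) for a, _ in gens]
    lam = (demand + sum(b / (2.0 * a) for a, b in gens)) / sum(inv)
    return [(lam - b) / (2.0 * a) for a, b in gens]

outputs = economic_dispatch(demand=12.0, gens=[(0.10, 2.0), (0.05, 3.0)])
```

    The cheaper-at-the-margin unit automatically absorbs more of the load; full Microgrid management layers OPF constraints (voltage, line limits, losses) on top of this core balance.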

  16. WATER QUALITY IN SOURCE WATER, TREATMENT, AND DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Most drinking water utilities practice the multiple-barrier concept as the guiding principle for providing safe water. This chapter discusses multiple barriers as they relate to the basic criteria for selecting and protecting source waters, including known and potential sources ...

  17. Natural health products that inhibit angiogenesis: a potential source for investigational new agents to treat cancer—Part 1

    PubMed Central

    Sagar, S.M.; Yance, D.; Wong, R.K.

    2006-01-01

    An integrative approach for managing a patient with cancer should target the multiple biochemical and physiologic pathways that support tumour development and minimize normal-tissue toxicity. Angiogenesis is a key process in the promotion of cancer. Many natural health products that inhibit angiogenesis also manifest other anticancer activities. The present article focuses on products that have a high degree of anti-angiogenic activity, but it also describes some of the many other actions of these agents that can inhibit tumour progression and reduce the risk of metastasis. Natural health products target molecular pathways other than angiogenesis, including epidermal growth factor receptor, the HER2/neu gene, the cyclooxygenase-2 enzyme, the nuclear factor kappa-B transcription factor, the protein kinases, the Bcl-2 protein, and coagulation pathways. The herbs that are traditionally used for anticancer treatment and that are anti-angiogenic through multiple interdependent processes (including effects on gene expression, signal processing, and enzyme activities) include Artemisia annua (Chinese wormwood), Viscum album (European mistletoe), Curcuma longa (curcumin), Scutellaria baicalensis (Chinese skullcap), resveratrol and proanthocyanidin (grape seed extract), Magnolia officinalis (Chinese magnolia tree), Camellia sinensis (green tea), Ginkgo biloba, quercetin, Poria cocos, Zingiber officinalis (ginger), Panax ginseng, Rabdosia rubescens hora (Rabdosia), and Chinese destagnation herbs. Quality assurance of appropriate extracts is essential prior to embarking upon clinical trials. More data are required on dose–response, appropriate combinations, and potential toxicities. Given the multiple effects of these agents, their future use for cancer therapy probably lies in synergistic combinations. During active cancer therapy, they should generally be evaluated in combination with chemotherapy and radiation. 
In this role, they act as modifiers of biologic response or as adaptogens, potentially enhancing the efficacy of the conventional therapies. PMID:17576437

  18. Parse, simulation, and prediction of NOx emission across the Midwestern United States

    NASA Astrophysics Data System (ADS)

    Fang, H.; Michalski, G. M.; Spak, S.

    2017-12-01

    Accurately constraining N emissions in space and time has been a challenge for atmospheric scientists. It has been suggested that 15N isotopes may be a way of tracking N emission sources across various spatial and temporal scales. However, the complexity of multiple N sources that can quickly change in intensity has made this a difficult problem. We have used a SMOKE emission model to parse NOx emissions across the Midwestern United States for a one-year simulation. An isotope mass balance method was used to assign 15N values to road, non-road, point, and area sources. The SMOKE emissions and isotope mass balance were then combined to predict the 15N of NOx emissions (Figure 1). This 15N of NOx emissions model was then incorporated into CMAQ to assess how transport and chemistry would impact the 15N value of NOx through mixing and removal processes. The predicted 15N values of NOx were compared to recent measurements of NOx and atmospheric nitrate.
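    The isotope mass balance step is a flux-weighted average: the δ15N of the total emission is Σ f_i·δ15N_i over the source categories. A pure-Python sketch with illustrative category fluxes and signatures (not the study's values):

```python
def mixed_delta15n(sources):
    """Flux-weighted d15N (per mil) of a NOx mixture.

    sources: list of (emission_flux, delta15N_per_mil) per source category.
    """
    total = sum(flux for flux, _ in sources)
    return sum(flux * d15n for flux, d15n in sources) / total

# Hypothetical fluxes (arbitrary units) and signatures (per mil) for the four
# SMOKE categories: on-road, non-road, point (power plants), and area sources.
categories = [(40.0, -5.0), (15.0, -15.0), (30.0, 12.0), (15.0, -25.0)]
d15n_mix = mixed_delta15n(categories)
```

    Because each category's flux varies in space and time, the mixed δ15N varies too, which is what lets the modeled field be compared against NOx and nitrate isotope measurements.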

  19. Creation of a national resource with linked genealogy and phenotypic data: the Veterans Genealogy Project.

    PubMed

    Cannon-Albright, Lisa A; Dintelman, Sue; Maness, Tim; Backus, Steve; Thomas, Alun; Meyer, Laurence J

    2013-07-01

    Creation of a genealogy of the United States and its ancestral populations is under way. When complete, this US genealogy will be record-linked to National Veterans Health Administration medical data representing more than 8 million US veterans. Genealogical data are gathered from public sources, primarily the Internet. Record linking using data from relatives is performed to integrate multiple data sources and then to link genealogical data to the veterans' demographic data. This resource currently includes genealogy for more than 22 million individuals representing the Intermountain West and the East Coast. The demographic data for more than 40,000 veteran patients using Veterans Health Administration services in Utah and Massachusetts have already been record-linked. The resource is only in its second year of creation and already represents the largest such combination of genealogy and medical data in the world. The data sources, the creation of the genealogy, record-linking methods and results, proposed genetic analyses, and future directions are discussed.

  20. A Dashboard for the Italian Computing in ALICE

    NASA Astrophysics Data System (ADS)

    Elia, D.; Vino, G.; Bagnasco, S.; Crescente, A.; Donvito, G.; Franco, A.; Lusso, S.; Mura, D.; Piano, S.; Platania, G.; ALICE Collaboration

    2017-10-01

    A dashboard devoted to the computing in the Italian sites for the ALICE experiment at the LHC has been deployed. A combination of different complementary monitoring tools is typically used in most of the Tier-2 sites: this makes it somewhat difficult to assess the status of a site at a glance and to compare information extracted from different sources for debugging purposes. To overcome these limitations, a dedicated ALICE dashboard has been designed and implemented in each of the ALICE Tier-2 sites in Italy: in particular, it provides a single, interactive and easily customizable graphical interface where heterogeneous data are presented. The dashboard is based on two main ingredients: an open source time-series database and a dashboard builder tool for visualizing time-series metrics. Various sensors that collect data from the multiple data sources have also been written. A first version of a national computing dashboard has been implemented using a specific instance of the builder to gather data from all the local databases.
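    The sensor pattern described above is simple: each sensor normalizes one heterogeneous data source into named, timestamped metrics written to the time-series store. A pure-Python sketch with a toy in-memory store (class, metric, and site names are illustrative; the actual deployment uses an unnamed open source time-series database and dashboard builder):

```python
import time
from collections import defaultdict

class TimeSeriesStore:
    """Toy stand-in for the time-series database backing the dashboard."""
    def __init__(self):
        self.series = defaultdict(list)        # metric name -> [(timestamp, value)]

    def write(self, metric, value, ts=None):
        self.series[metric].append((ts if ts is not None else time.time(), value))

    def latest(self, metric):
        return self.series[metric][-1]

def batch_queue_sensor(store, site, raw_status):
    """A 'sensor': turns one data source's raw output into dashboard metrics."""
    store.write(f"{site}.jobs.running", raw_status["running"])
    store.write(f"{site}.jobs.queued", raw_status["queued"])

db = TimeSeriesStore()
batch_queue_sensor(db, "site-A", {"running": 412, "queued": 88})
```

    With every sensor writing into the same store under a common naming scheme, a single dashboard instance (or the national aggregator) can chart heterogeneous sources side by side.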

  1. Subtyping of Canadian isolates of Salmonella Enteritidis using Multiple Locus Variable Number Tandem Repeat Analysis (MLVA) alone and in combination with Pulsed-Field Gel Electrophoresis (PFGE) and phage typing.

    PubMed

    Ziebell, Kim; Chui, Linda; King, Robin; Johnson, Suzanne; Boerlin, Patrick; Johnson, Roger P

    2017-08-01

    Salmonella enterica subspecies enterica serovar Enteritidis (SE) is one of the most common causes of human salmonellosis and currently accounts for over 40% of human cases in Canada. Reliable subtyping of isolates is required for outbreak detection and source attribution. However, Pulsed-Field Gel Electrophoresis (PFGE), the current standard subtyping method for Salmonella spp., is compromised by the high genetic homogeneity of SE. Multiple Locus Variable Number Tandem Repeat Analysis (MLVA) was introduced to supplement PFGE, although there is a lack of data on the ability of MLVA to subtype Canadian isolates of SE. Three subtyping methods (PFGE, MLVA, and phage typing) were compared for their discriminatory power when applied to three panels of Canadian SE isolates: Panel 1: 70 isolates representing the diversity of phage types (PTs) and PFGE subtypes within these PTs; Panel 2: 214 apparently unrelated SE isolates of the most common PTs; and Panel 3: 27 isolates from 10 groups of epidemiologically related strains. Among Panel 2 isolates, four MLVA subtypes were shared by 74% of the unrelated isolates, and in Panel 3, one MLVA subtype accounted for 62% of the isolates. For all panels, combining results from PFGE, MLVA and PT gave the best discrimination, except in Panel 1, where the combination of PT and PFGE was equally high, due to the selection criteria for this panel. However, none of these methods is sufficiently discriminatory alone for reliable outbreak detection or source attribution, and they must be applied together to achieve sufficient discrimination for practical purposes. Even then, some large clusters were not differentiated adequately. More discriminatory methods are required for reliable subtyping of this genetically highly homogeneous serovar. This need will likely be met by whole-genome sequence analysis, given recent promising reports, as more laboratories implement this tool for outbreak response and surveillance. 
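    Discriminatory power of subtyping schemes is conventionally quantified with Simpson's index of diversity, D = 1 − Σ nj(nj−1)/(N(N−1)), where nj is the number of isolates assigned to subtype j. A pure-Python sketch comparing a single method against a combined scheme (the subtype counts are illustrative, not the paper's panels):

```python
from collections import Counter

def simpsons_index(subtypes):
    """Simpson's index of diversity for a list of per-isolate subtype labels."""
    n = len(subtypes)
    counts = Counter(subtypes).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# MLVA alone lumps many unrelated isolates into a few subtypes...
mlva = ["M1"] * 8 + ["M2"] * 6 + ["M3"] * 4 + ["M4"] * 2
# ...while combining MLVA with a second method (here a toy PFGE label) splits them further.
combined = [f"{m}-{p}" for m, p in zip(mlva, ["P1", "P2"] * 10)]

d_mlva = simpsons_index(mlva)
d_combined = simpsons_index(combined)
```

    D is the probability that two randomly drawn isolates fall into different subtypes; the combined scheme always scores at least as high as any single method, which mirrors the paper's finding that PFGE, MLVA, and phage type must be applied together.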

  2. Benefits of Localization and Speech Perception with Multiple Noise Sources in Listeners with a Short-electrode Cochlear Implant

    PubMed Central

    Dunn, Camille C.; Perreau, Ann; Gantz, Bruce; Tyler, Richard

    2009-01-01

    Background: Research suggests that for individuals with significant low-frequency hearing, implantation of a short-electrode cochlear implant may provide benefits of improved speech perception abilities. Because this strategy combines acoustic and electrical hearing within the same ear while at the same time preserving low-frequency residual acoustic hearing in both ears, localization abilities may also be improved. However, very little research has focused on the localization and spatial hearing abilities of users with a short-electrode cochlear implant. Purpose: The purpose of this study was to evaluate localization abilities for listeners with a short-electrode cochlear implant who continue to wear hearing aids in both ears. A secondary purpose was to document speech perception abilities using a speech-in-noise test with spatially separate noise sources. Research Design: Eleven subjects who utilized a short-electrode cochlear implant and bilateral hearing aids were tested on localization and speech perception with multiple noise locations using an eight-loudspeaker array. Performance was assessed across four listening conditions using various combinations of cochlear implant and/or hearing aid use. Results: Results for localization showed no significant difference between using bilateral hearing aids and bilateral hearing aids plus the cochlear implant. However, there was a significant difference between the bilateral hearing aid condition and the implant plus use of a contralateral hearing aid for all eleven subjects. Results for speech perception showed a significant benefit when using bilateral hearing aids plus the cochlear implant over use of the implant plus only one hearing aid. Conclusion: Combined use of both hearing aids and the cochlear implant show significant benefits for both localization and speech perception in noise for users with a short-electrode cochlear implant. 
These results emphasize the importance of low-frequency information in two ears for the purpose of localization and speech perception in noise. PMID:20085199

  3. Benefits of localization and speech perception with multiple noise sources in listeners with a short-electrode cochlear implant.

    PubMed

    Dunn, Camille C; Perreau, Ann; Gantz, Bruce; Tyler, Richard S

    2010-01-01

Research suggests that for individuals with significant low-frequency hearing, implantation of a short-electrode cochlear implant may provide benefits of improved speech perception abilities. Because this strategy combines acoustic and electrical hearing within the same ear while at the same time preserving low-frequency residual acoustic hearing in both ears, localization abilities may also be improved. However, very little research has focused on the localization and spatial hearing abilities of users with a short-electrode cochlear implant. The purpose of this study was to evaluate localization abilities for listeners with a short-electrode cochlear implant who continue to wear hearing aids in both ears. A secondary purpose was to document speech perception abilities using a speech-in-noise test with spatially separate noise sources. Eleven subjects who used a short-electrode cochlear implant and bilateral hearing aids were tested on localization and speech perception with multiple noise locations using an eight-loudspeaker array. Performance was assessed across four listening conditions using various combinations of cochlear implant and/or hearing aid use. Results for localization showed no significant difference between using bilateral hearing aids and bilateral hearing aids plus the cochlear implant. However, there was a significant difference between the bilateral hearing aid condition and the implant plus use of a contralateral hearing aid for all 11 subjects. Results for speech perception showed a significant benefit when using bilateral hearing aids plus the cochlear implant over use of the implant plus only one hearing aid. Combined use of both hearing aids and the cochlear implant shows significant benefits for both localization and speech perception in noise for users with a short-electrode cochlear implant.
These results emphasize the importance of low-frequency information in two ears for the purpose of localization and speech perception in noise.

  4. Domestic water service delivery indicators and frameworks for monitoring, evaluation, policy and planning: a review.

    PubMed

    Kayser, Georgia L; Moriarty, Patrick; Fonseca, Catarina; Bartram, Jamie

    2013-10-11

    Monitoring of water services informs policy and planning for national governments and the international community. Currently, the international monitoring system measures the type of drinking water source that households use. There have been calls for improved monitoring systems over several decades, some advocating use of multiple indicators. We review the literature on water service indicators and frameworks with a view to informing debate on their relevance to national and international monitoring. We describe the evidence concerning the relevance of each identified indicator to public health, economic development and human rights. We analyze the benefits and challenges of using these indicators separately and combined in an index as tools for planning, monitoring, and evaluating water services. We find substantial evidence on the importance of each commonly recommended indicator--service type, safety, quantity, accessibility, reliability or continuity of service, equity, and affordability. Several frameworks have been proposed that give structure to the relationships among individual indicators and some combine multiple indicator scores into a single index but few have been rigorously tested. More research is needed to understand if employing a composite metric of indicators is advantageous and how each indicator might be scored and scaled.

  5. Domestic Water Service Delivery Indicators and Frameworks for Monitoring, Evaluation, Policy and Planning: A Review

    PubMed Central

    Kayser, Georgia L.; Moriarty, Patrick; Fonseca, Catarina; Bartram, Jamie

    2013-01-01

    Monitoring of water services informs policy and planning for national governments and the international community. Currently, the international monitoring system measures the type of drinking water source that households use. There have been calls for improved monitoring systems over several decades, some advocating use of multiple indicators. We review the literature on water service indicators and frameworks with a view to informing debate on their relevance to national and international monitoring. We describe the evidence concerning the relevance of each identified indicator to public health, economic development and human rights. We analyze the benefits and challenges of using these indicators separately and combined in an index as tools for planning, monitoring, and evaluating water services. We find substantial evidence on the importance of each commonly recommended indicator—service type, safety, quantity, accessibility, reliability or continuity of service, equity, and affordability. Several frameworks have been proposed that give structure to the relationships among individual indicators and some combine multiple indicator scores into a single index but few have been rigorously tested. More research is needed to understand if employing a composite metric of indicators is advantageous and how each indicator might be scored and scaled. PMID:24157507
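The composite-index idea raised in this review can be illustrated with a toy weighted aggregation. The indicator names follow the list in the abstract, but the weights and scores below are invented placeholders, since the review notes that how each indicator should be scored and scaled remains an open question:

```python
# Toy composite water-service index: each indicator scored on a 0-1 scale
# and combined with weights summing to one. Weights and scores are
# invented placeholders, not values from the review.

weights = {"service_type": 0.20, "safety": 0.20, "quantity": 0.15,
           "accessibility": 0.15, "reliability": 0.10, "equity": 0.10,
           "affordability": 0.10}
scores = {"service_type": 0.8, "safety": 0.6, "quantity": 0.9,
          "accessibility": 0.7, "reliability": 0.5, "equity": 0.4,
          "affordability": 0.9}

# Sanity check: the weights must form a convex combination.
assert abs(sum(weights.values()) - 1.0) < 1e-9

index = sum(weights[k] * scores[k] for k in weights)
print(round(index, 3))  # → 0.7
```

A single number like this is convenient for ranking service levels across regions, but, as the review cautions, it hides which individual indicator (e.g. affordability vs. safety) is driving a low score.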

  6. Drying performance of fermented cassava (fercaf) using a convective multiple flash dryer

    NASA Astrophysics Data System (ADS)

    Handojo, Lienda A.; Zefanya, Samuel; Christanto, Yohanes

    2017-05-01

Fermented cassava (fercaf) is a versatile tropical carbohydrate-source flour produced by modifying the characteristics of cassava. The drying process is one of the processes that can influence the quality of fercaf. In general, convective and vacuum drying are used for food applications; however, an advanced method combining both, convective multiple flash drying (CMFD), has recently been proposed. This method repeats cycles of convective and vacuum drying in an intermittent manner. Cassava chips with a thickness of 0.1-0.2 cm were fermented for 24 hours at room conditions. The drying process was then conducted using three techniques, i.e. convective, vacuum, and the combined method (CMFD), at operating temperatures between 50 and 70°C for 10 hours or until the fermented cassava reached a moisture content of less than 20%. The study shows that CMFD was the fastest drying method, requiring only 5-6 hours, compared to 8-10 hours for vacuum drying and more than 10 hours for the convective method. CMFD also produces harder fercaf chips than the vacuum and convective methods. Moreover, this research also proves that the operating pressure and temperature influence the moisture content.

  7. Micronutrients in the treatment of stunting and moderate malnutrition.

    PubMed

    Penny, Mary Edith

    2012-01-01

    Linear growth retardation or stunting may occur with or without low weight-for-age, but in both cases stunted or moderately malnourished children are deficient in micronutrients. Pregnancy and the first 2 years are critical periods. Dietary deficiency of zinc, iron, calcium, and vitamin A are especially common and often occur together. Zinc is essential for adequate growth, and supplements have been shown to increase intrauterine femur length and to prevent stunting. However, in general, supplements which provide a mixture of micronutrients have been more successful in preventing stunting and are simpler to take and distribute. Multiple micronutrients together with energy and macronutrients are also needed for the management of moderate malnutrition. Multiple micronutrients may be delivered as medicinal-like supplements, but may also be combined with food, for instance in milk drinks, in fortified dried cereal mixes used to supplement complementary foods or in lipid nutrition supplements. The latter also provide essential fats necessary for growth. Micronutrient powders for home fortification are effective in preventing anemia, but present combinations do not prevent stunting. Improving the diets of infant and young children is also possible, and increased intake of animal source foods can improve growth. Copyright © 2012 S. Karger AG, Basel.

  8. Multiple-wavelength spectroscopic quantitation of light-absorbing species in scattering media

    DOEpatents

    Nathel, Howard; Cartland, Harry E.; Colston, Jr., Billy W.; Everett, Matthew J.; Roe, Jeffery N.

    2000-01-01

An oxygen concentration measurement system for blood hemoglobin comprises a multiple-wavelength low-coherence optical light source that is coupled by single mode fibers through a splitter and combiner and focused on both a target tissue sample and a reference mirror. Reflections from both the reference mirror and from the depths of the target tissue sample are carried back and mixed to produce interference fringes in the splitter and combiner. The reference mirror is set such that the distance traversed in the reference path is the same as the distance traversed into and back from the target tissue sample at some depth in the sample that will provide light attenuation information that is dependent on the oxygen in blood hemoglobin in the target tissue sample. Two wavelengths of light are used to obtain concentrations. The method can be used to measure total hemoglobin concentration [Hb_deoxy + Hb_oxy] or total blood volume in tissue and, in conjunction with oxygen saturation measurements from pulse oximetry, can be used to absolutely quantify oxyhemoglobin [HbO2] in tissue. The apparatus and method provide a general means for absolute quantitation of an absorber dispersed in a highly scattering medium.
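The two-wavelength quantitation step in this record amounts to inverting a 2×2 linear system of the modified Beer-Lambert form A(λᵢ) = Σⱼ εᵢⱼ·cⱼ·d. A minimal sketch, where the extinction matrix, attenuations, and path length are invented placeholders (not values from the patent):

```python
# Two-wavelength quantitation as a 2x2 linear system (modified Beer-Lambert).
# E, a1, a2, and d below are made-up toy values for illustration only.

def solve_two_wavelength(a1, a2, e, d):
    """Invert [a1, a2] = (e @ [c_oxy, c_deoxy]) * d by Cramer's rule."""
    det = (e[0][0] * e[1][1] - e[0][1] * e[1][0]) * d
    c_oxy = (a1 * e[1][1] - a2 * e[0][1]) / det
    c_deoxy = (a2 * e[0][0] - a1 * e[1][0]) / det
    return c_oxy, c_deoxy

# e[i][j]: extinction of species j (0 = oxy, 1 = deoxy) at wavelength i.
E = [[1.0, 3.0],
     [2.0, 1.0]]
c_oxy, c_deoxy = solve_two_wavelength(a1=7.0, a2=5.0, e=E, d=1.0)
total_hb = c_oxy + c_deoxy          # analogous to [Hb_oxy + Hb_deoxy]
print(c_oxy, c_deoxy)               # → 1.6 1.8
```

The inversion only works when the two species have sufficiently different extinction ratios at the chosen wavelengths (a well-conditioned determinant), which is why the wavelengths are picked on opposite sides of an isosbestic point in practice.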

  9. Combining stable isotopes with contamination indicators: A method for improved investigation of nitrate sources and dynamics in aquifers with mixed nitrogen inputs.

    PubMed

    Minet, E P; Goodhue, R; Meier-Augenstein, W; Kalin, R M; Fenton, O; Richards, K G; Coxon, C E

    2017-11-01

Excessive nitrate (NO3-) concentration in groundwater raises health and environmental issues that must be addressed by all European Union (EU) member states under the Nitrates Directive and the Water Framework Directive. The identification of NO3- sources is critical to efficiently control or reverse NO3- contamination that affects many aquifers. In that respect, the use of the stable isotope ratios 15N/14N and 18O/16O in NO3- (expressed as δ15N-NO3- and δ18O-NO3-, respectively) has long shown its value. However, limitations exist in complex environments where multiple nitrogen (N) sources coexist. This two-year study explores a method for improved NO3- source investigation in a shallow unconfined aquifer with mixed N inputs and a long-established NO3- problem. In this tillage-dominated area of free-draining soil and subsoil, suspected NO3- sources were diffuse applications of artificial fertiliser and organic point sources (septic tanks and farmyards). Bearing in mind that artificial diffuse sources were ubiquitous, groundwater samples were first classified according to a combination of two indicators relevant to point source contamination: presence/absence of organic point sources (i.e. septic tank and/or farmyard) near sampling wells and exceedance/non-exceedance of a contamination threshold value for sodium (Na+) in groundwater. This classification identified three contamination groups: agricultural diffuse source but no point source (D+P-), agricultural diffuse and point source (D+P+), and agricultural diffuse but point source occurrence ambiguous (D+P±). Thereafter δ15N-NO3- and δ18O-NO3- data were superimposed on the classification. As δ15N-NO3- was plotted against δ18O-NO3-, comparisons were made between the different contamination groups. Overall, both δ variables were significantly and positively correlated (p < 0.0001, rs = 0.599, slope of 0.5), which was indicative of denitrification. An inspection of the contamination groups revealed that denitrification did not occur in the absence of point source contamination (group D+P-). In fact, strong significant denitrification lines occurred only in the D+P+ and D+P± groups (p < 0.0001, rs > 0.6, 0.53 ≤ slope ≤ 0.76), i.e. where point source contamination was characterised or suspected. These lines originated from the 2-6‰ range for δ15N-NO3-, which suggests that i) NO3- contamination was dominated by an agricultural diffuse N source (most likely the large organic matter pool that has incorporated 15N-depleted nitrogen from artificial fertiliser in agricultural soils and whose nitrification is stimulated by ploughing and fertilisation) rather than point sources and ii) denitrification was possibly favoured by high dissolved organic content (DOC) from point sources. Combining contamination indicators and a large stable isotope dataset collected over a large study area could therefore improve our understanding of the NO3- contamination processes in groundwater for better land use management. We hypothesise that in future research, additional contamination indicators (e.g. pharmaceutical molecules) could also be combined to disentangle NO3- contamination from animal and human wastes. Copyright © 2017 Elsevier Ltd. All rights reserved.
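The correlation screening in this record can be made concrete: for paired δ15N and δ18O values, a positive rank correlation with a δ18O-vs-δ15N slope near 0.5 is the classic denitrification signature. The isotope pairs below are invented for illustration, not the study's data:

```python
# Hedged sketch of the denitrification-line check: ordinary least-squares
# slope and Spearman rank correlation of paired isotope values.
# The d15N/d18O pairs are toy numbers, not measurements from the paper.

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def spearman(x, y):
    """Spearman rank correlation (no-ties formula, fine for this toy data)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

d15N = [3.0, 5.0, 8.0, 12.0, 15.0, 20.0]   # per mil, toy values
d18O = [2.0, 3.1, 4.4, 6.5, 7.4, 10.6]     # per mil, toy values
print(round(ols_slope(d15N, d18O), 2), spearman(d15N, d18O))  # → 0.49 1.0
```

Here the slope of roughly 0.5 mimics the expected 1:2 enrichment of 18O relative to 15N during denitrification; real data would of course carry scatter and require the significance testing described in the abstract.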

  10. Using Robust Standard Errors to Combine Multiple Regression Estimates with Meta-Analysis

    ERIC Educational Resources Information Center

    Williams, Ryan T.

    2012-01-01

Combining multiple regression estimates with meta-analysis has continued to be a difficult task. A variety of methods have been proposed and used to combine multiple regression slope estimates with meta-analysis; however, most of these methods have serious methodological and practical limitations. The purpose of this study was to explore the use…

  11. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of noise multiple-harmonic. The suppression of such noise has been a focus of interest in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with feed-forward structure is proposed based on reference amplitude rectification and the conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. It can be seen from the numerical studies that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with a unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulations follow a similar procedure to real-life control and can be easily extended to a physical model platform.

  12. Searching Information Sources in Networks

    DTIC Science & Technology

    2017-06-14

During the course of this project, we made significant progress in multiple directions of information source detection: (1) a ... result on information source detection on non-tree networks; (2) the development of information source localization algorithms to detect multiple ... information sources. The algorithms have provable performance guarantees and outperform existing algorithms in ...

  13. Improved Multiple-Species Cyclotron Ion Source

    NASA Technical Reports Server (NTRS)

    Soli, George A.; Nichols, Donald K.

    1990-01-01

Use of the pure isotope 86Kr instead of natural krypton in a multiple-species ion source enables the source to produce krypton ions separated from argon ions by tuning the cyclotron with which the source is used. This adds the capability to produce and separate krypton ions at kinetic energies of 150 to 400 MeV, as necessary for simulation of worst-case ions occurring in outer space.

  14. The size of coronal hard X-ray sources in solar flares: How big are they?

    NASA Astrophysics Data System (ADS)

    Effenberger, F.; Krucker, S.; Rubio da Costa, F.

    2017-12-01

    Coronal hard X-ray sources are considered to be one of the key signatures of non-thermal particle acceleration and heating during the energy release in solar flares. In some cases, X-ray observations reveal multiple components spatially located near and above the loop top and even further up in the corona. Here, we combine a detailed RHESSI imaging analysis of near-limb solar flares with occulted footpoints and a multi-wavelength study of the flare loop evolution in SDO/AIA. We connect our findings to different current sheet formation and magnetic break-out scenarios and relate it to particle acceleration theory. We find that the upper and usually fainter emission regions can be underestimated in their size due to the majority of flux originating from the lower loops.

  15. The underutilization of street markets as a source of food security indicators in famine early warning systems: a case study of Ethiopia.

    PubMed

    Companion, Michèle

    2008-09-01

    Famine Early Warning Systems (EWS) are reliant on data aggregated from multiple sources. Consequently, they are often insensitive to localized changes in food security status, leading to delayed response or interventions. While price and infrastructural data are often gathered, this case study suggests that local street markets and vendor knowledge are underutilized. Few efforts have been made to monitor systematically the street markets as an indicator of local stressors. Findings from Ethiopia show that knowledge generated by expanding food security indicators in this sector can be used in combination with EWS to facilitate earlier intervention in, or to monitor more effectively, on-going humanitarian crises. Indicators developed from this study are accurate, cost effective, and sensitive to local climatic and food stressors.

  16. Multi-gas sensing with quantum cascade laser array in the mid-infrared region

    NASA Astrophysics Data System (ADS)

    Bizet, Laurent; Vallon, Raphael; Parvitte, Bertrand; Brun, Mickael; Maisons, Gregory; Carras, Mathieu; Zeninari, Virginie

    2017-05-01

Widely tunable laser sources are useful for spectroscopy of complex molecules that have broad absorption spectra and for sensing multiple smaller molecules. A region of interest is the mid-infrared, where many species have strong ro-vibrational modes. In this paper a novel broadly tunable source composed of a QCL DFB array and an arrayed waveguide grating (also called a multiplexer) was used to perform multi-species spectroscopy (CO, C2H2, CO2). The array and the multiplexer are combined in such a way that the prototype is insensitive to mechanical vibrations. A 2190-2220 cm^{-1} spectral range is covered by the chip. The arrayed waveguide grating combines the beams into a single output. A multi-pass White cell was used to demonstrate the efficiency of the multiplexer.

  17. Harnessing the web information ecosystem with wiki-based visualization dashboards.

    PubMed

    McKeon, Matt

    2009-01-01

We describe the design and deployment of Dashiki, a public website where users may collaboratively build visualization dashboards through a combination of a wiki-like syntax and interactive editors. Our goals are to extend existing research on social data analysis into presentation and organization of data from multiple sources, explore new metaphors for these activities, and participate more fully in the web's information ecology by providing tighter integration with real-time data. To support these goals, our design includes novel and low-barrier mechanisms for editing and layout of dashboard pages and visualizations, connection to data sources, and coordinating interaction between visualizations. In addition to describing these technologies, we provide a preliminary report on the public launch of a prototype based on this design, including a description of the activities of our users derived from observation and interviews.

  18. A Multi-Camera System for Bioluminescence Tomography in Preclinical Oncology Research

    PubMed Central

    Lewis, Matthew A.; Richer, Edmond; Slavine, Nikolai V.; Kodibagkar, Vikram D.; Soesbe, Todd C.; Antich, Peter P.; Mason, Ralph P.

    2013-01-01

    Bioluminescent imaging (BLI) of cells expressing luciferase is a valuable noninvasive technique for investigating molecular events and tumor dynamics in the living animal. Current usage is often limited to planar imaging, but tomographic imaging can enhance the usefulness of this technique in quantitative biomedical studies by allowing accurate determination of tumor size and attribution of the emitted light to a specific organ or tissue. Bioluminescence tomography based on a single camera with source rotation or mirrors to provide additional views has previously been reported. We report here in vivo studies using a novel approach with multiple rotating cameras that, when combined with image reconstruction software, provides the desired representation of point source metastases and other small lesions. Comparison with MRI validated the ability to detect lung tumor colonization in mouse lung. PMID:26824926

  19. Energetic Phenomena on the Sun: The Solar Maximum Mission Flare Workshop. Proceedings

    NASA Technical Reports Server (NTRS)

    Kundu, Mukul (Editor); Woodgate, Bruce (Editor)

    1986-01-01

The general objectives of the conference were as follows: (1) Synthesize flare studies after three years of Solar Maximum Mission (SMM) data analysis. Encourage a broader participation in the SMM data analysis and combine this more fully with theory and other data sources: data obtained with other spacecraft such as HINOTORI, P78-1, and ISEE-3, and with the Very Large Array (VLA) and many other ground-based instruments. Many coordinated data sets, unprecedented in their breadth of coverage and multiplicity of sources, had been obtained within the structure of the Solar Maximum Year (SMY). (2) Stimulate joint studies and publication in the general scientific literature. The intended primary benefit was for informal collaborations to be started or broadened at the Workshops, with subsequent publications. (3) Provide a special publication resulting from the Workshop.

  20. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
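The pattern algebra described above (point events from input streams combined with conjunction, disjunction, and negation, composed recursively) can be illustrated with a tiny interpreter. This is a hypothetical sketch of the general idea, not the actual CERA implementation or its pattern language:

```python
# Illustrative event-pattern matcher: patterns over (stream, name) events
# combine with AND, OR, and NOT, and nest recursively, in the spirit of
# the declarative pattern language described in the record above.

def match(pattern, events):
    """Return True if `pattern` is satisfied by the events seen so far."""
    kind = pattern[0]
    if kind == "event":                      # ("event", stream, name)
        return any(e == (pattern[1], pattern[2]) for e in events)
    if kind == "and":
        return all(match(p, events) for p in pattern[1:])
    if kind == "or":
        return any(match(p, events) for p in pattern[1:])
    if kind == "not":
        return not match(pattern[1], events)
    raise ValueError(f"unknown pattern kind: {kind}")

# Hypothetical anomaly: a pressure spike on stream A together with either
# an overheat on stream B or a missing heartbeat on the watchdog stream.
anomaly = ("and",
           ("event", "A", "pressure_spike"),
           ("or", ("event", "B", "overheat"),
                  ("not", ("event", "watchdog", "heartbeat"))))

log = [("A", "pressure_spike"), ("B", "overheat")]
print(match(anomaly, log))  # → True
```

A production recognizer like CERA additionally handles temporal ordering and incremental matching against streaming input; this sketch only evaluates a pattern against a static set of observed events.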

  1. Effects of multiple congruent cues on concurrent sound segregation during passive and active listening: an event-related potential (ERP) study.

    PubMed

    Kocsis, Zsuzsanna; Winkler, István; Szalárdy, Orsolya; Bendixen, Alexandra

    2014-07-01

    In two experiments, we assessed the effects of combining different cues of concurrent sound segregation on the object-related negativity (ORN) and the P400 event-related potential components. Participants were presented with sequences of complex tones, half of which contained some manipulation: one or two harmonic partials were mistuned, delayed, or presented from a different location than the rest. In separate conditions, one, two, or three of these manipulations were combined. Participants watched a silent movie (passive listening) or reported after each tone whether they perceived one or two concurrent sounds (active listening). ORN was found in almost all conditions except for location difference alone during passive listening. Combining several cues or manipulating more than one partial consistently led to sub-additive effects on the ORN amplitude. These results support the view that ORN reflects a combined, feature-unspecific assessment of the auditory system regarding the contribution of two sources to the incoming sound. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Enabling Resiliency Operations across Multiple Microgrids with Grid Friendly Appliance Controllers

    DOE PAGES

    Schneider, Kevin P.; Tuffner, Frank K.; Elizondo, Marcelo A.; ...

    2017-02-16

Changes in economic, technological, and environmental policies are resulting in a re-evaluation of the dependence on large central generation facilities and their associated transmission networks. Emerging concepts of smart communities/cities are examining the potential to leverage cleaner sources of generation, as well as integrating electricity generation with other municipal functions. When grid connected, these generation assets can supplement the existing interconnections with the bulk transmission system, and in the event of an extreme event, they can provide power via a collection of microgrids. To achieve the highest level of resiliency, it may be necessary to conduct switching operations to interconnect individual microgrids. While the interconnection of multiple microgrids can increase the resiliency of the system, the associated switching operations can cause large transients in low inertia microgrids. The combination of low system inertia and IEEE 1547 and 1547a-compliant inverters can prevent multiple microgrids from being interconnected during extreme weather events. This study will present a method of using end-use loads equipped with Grid Friendly™ Appliance controllers to facilitate the switching operations between multiple microgrids; operations that are necessary for optimal operations when islanded for resiliency.

  3. Enabling Resiliency Operations across Multiple Microgrids with Grid Friendly Appliance Controllers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Kevin P.; Tuffner, Frank K.; Elizondo, Marcelo A.

Changes in economic, technological, and environmental policies are resulting in a re-evaluation of the dependence on large central generation facilities and their associated transmission networks. Emerging concepts of smart communities/cities are examining the potential to leverage cleaner sources of generation, as well as integrating electricity generation with other municipal functions. When grid connected, these generation assets can supplement the existing interconnections with the bulk transmission system, and in the event of an extreme event, they can provide power via a collection of microgrids. To achieve the highest level of resiliency, it may be necessary to conduct switching operations to interconnect individual microgrids. While the interconnection of multiple microgrids can increase the resiliency of the system, the associated switching operations can cause large transients in low inertia microgrids. The combination of low system inertia and IEEE 1547 and 1547a-compliant inverters can prevent multiple microgrids from being interconnected during extreme weather events. This study will present a method of using end-use loads equipped with Grid Friendly™ Appliance controllers to facilitate the switching operations between multiple microgrids; operations that are necessary for optimal operations when islanded for resiliency.

  4. Fusion of magnetometer and gradiometer sensors of MEG in the presence of multiplicative error.

    PubMed

    Mohseni, Hamid R; Woolrich, Mark W; Kringelbach, Morten L; Luckhoo, Henry; Smith, Penny Probert; Aziz, Tipu Z

    2012-07-01

    Novel neuroimaging techniques have provided unprecedented information on the structure and function of the living human brain. Multimodal fusion of data from different sensors promises to radically improve this understanding, yet optimal methods have not been developed. Here, we demonstrate a novel method for combining multichannel signals. We show how this method can be used to fuse signals from the magnetometer and gradiometer sensors used in magnetoencephalography (MEG), and through extensive experiments using simulation, head phantom and real MEG data, show that it is both robust and accurate. This new approach works by assuming that the lead fields have multiplicative error. The criterion to estimate the error is given within a spatial filter framework such that the estimated power is minimized in the worst case scenario. The method is compared to, and found better than, existing approaches. The closed-form solution and the conditions under which the multiplicative error can be optimally estimated are provided. This novel approach can also be employed for multimodal fusion of other multichannel signals such as MEG and EEG. Although the multiplicative error is estimated based on beamforming, other methods for source analysis can equally be used after the lead-field modification.
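For context, the spatial-filter framework this record builds on is the minimum-variance beamformer, whose weights w = C⁻¹l / (lᵀC⁻¹l) minimize output power subject to unit gain toward the source. The sketch below uses toy values for the sensor covariance and lead field and does not reproduce the paper's multiplicative-error estimation itself:

```python
# Minimum-variance spatial filter (beamformer) sketch with a 2x2 toy
# covariance C and lead field l; the 2x2 inverse is written out explicitly
# to keep the example dependency-free.

def mv_weights(C, l):
    """Return w = C^{-1} l / (l^T C^{-1} l) for a 2x2 covariance."""
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    Ci_l = [(C[1][1] * l[0] - C[0][1] * l[1]) / det,
            (-C[1][0] * l[0] + C[0][0] * l[1]) / det]
    gain = l[0] * Ci_l[0] + l[1] * Ci_l[1]
    return [Ci_l[0] / gain, Ci_l[1] / gain]

C = [[2.0, 0.5],
     [0.5, 1.0]]          # sensor covariance (toy values)
l = [1.0, 0.5]            # lead field of the target source (toy values)
w = mv_weights(C, l)
print([round(v, 4) for v in w], round(w[0] * l[0] + w[1] * l[1], 6))
```

The second printed value confirms the unit-gain constraint wᵀl = 1; the paper's contribution is to rescale the lead field before this step so that the assumed multiplicative error is minimized in the worst case, which this toy omits.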

  5. Part 2. Development of Enhanced Statistical Methods for Assessing Health Effects Associated with an Unknown Number of Major Sources of Multiple Air Pollutants.

    PubMed

    Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford

    2015-06-01

A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source-apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas.
The Phoenix data included counts of cardiovascular deaths and daily PM2.5 speciation data from 1995-1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel in years 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation. For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion). 
The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to the highest posterior model probability for a model with five sources (that seemed to be refinery, petrochemical production, gasoline evaporation, natural gas, and vehicular exhaust) among several candidate models, with the number of sources varying between three and seven and with different identifiability conditions. Our multipollutant approach assessing source-specific health effects is more advantageous than a single-pollutant approach in that it can estimate total health effects from multiple pollutants and can also identify emission sources that are responsible for adverse health effects. Our Bayesian approach can incorporate not only uncertainty in the estimated source contributions, but also model uncertainty that has not been addressed in previous studies on assessing source-specific health effects. The new Bayesian spatial multivariate receptor modeling approach enables predictions of source contributions at unmonitored sites, minimizing exposure misclassification and providing improved exposure estimates along with their uncertainty estimates, as well as accounting for uncertainty in the number of sources and identifiability conditions.
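The posterior model probabilities in this study come from the authors' MCMC machinery; as a rough, illustrative stand-in (not the paper's method), the same idea can be sketched with a BIC-based approximation under equal model priors. The BIC values below are invented, with the five-source model (index 2 in a three-to-seven-source candidate list) made the best:

```python
import numpy as np

def posterior_model_probs(bic_values):
    """Approximate posterior model probabilities from BIC scores.

    Uses the standard approximation P(M_k | data) ∝ exp(-BIC_k / 2),
    assuming an equal prior probability for each candidate model.
    """
    bic = np.asarray(bic_values, dtype=float)
    # Subtract the minimum BIC for numerical stability before exponentiating.
    delta = bic - bic.min()
    weights = np.exp(-0.5 * delta)
    return weights / weights.sum()

# Hypothetical BIC values for candidate models with 3, 4, 5, 6, and 7 sources.
bic = [1520.4, 1498.7, 1490.2, 1495.9, 1503.3]
probs = posterior_model_probs(bic)
best = int(np.argmax(probs))  # index 2 -> the five-source model in this toy case
```

A fully Bayesian treatment, as in the paper, would instead estimate these probabilities jointly with the parameters inside the MCMC sampler.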

  6. Survival of the hermit crab, Clibanarius vittatus, exposed to selenium and other environmental factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Recent investigations of water quality criteria have frequently examined the effects of a pollutant; however, a more realistic investigation would consider effects of multiple environmental factors and their interactions with the pollutant. Awareness of selenium as a pollutant is increasing. The growing sulfur and petroleum industries are only two of the potential sources of the element on the Texas coast. This study examined the toxicity of selenium to hermit crab Clibanarius vittatus (Bosc) under twelve different combinations of temperature and salinity. Additionally, the impact of the organisms' original environment was considered as an environmental factor.

  7. Service line analytics in the new era.

    PubMed

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  8. A consensual neural network

    NASA Technical Reports Server (NTRS)

    Benediktsson, J. A.; Ersoy, O. K.; Swain, P. H.

    1991-01-01

    A neural network architecture called a consensual neural network (CNN) is proposed for the classification of data from multiple sources. Its relation to hierarchical and ensemble neural networks is discussed. CNN is based on the statistical consensus theory and uses nonlinearly transformed input data. The input data are transformed several times, and the different transformed data are applied as if they were independent inputs. The independent inputs are classified using stage neural networks and outputs from the stage networks are then weighted and combined to make a decision. Experimental results based on remote-sensing data and geographic data are given.
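A minimal sketch of the consensus idea described above, with toy two-class data and simple nearest-centroid stage classifiers standing in for the paper's stage neural networks (the transforms, data, and weighting scheme here are illustrative assumptions, not the CNN's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data standing in for multi-source remote-sensing samples.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Several nonlinear transforms of the same input, treated as independent views.
transforms = [lambda a: a, np.tanh, lambda a: np.sign(a) * np.sqrt(np.abs(a))]

def nearest_centroid_fit(Xt, y):
    return {c: Xt[y == c].mean(axis=0) for c in (0, 1)}

def nearest_centroid_predict(model, Xt):
    d0 = np.linalg.norm(Xt - model[0], axis=1)
    d1 = np.linalg.norm(Xt - model[1], axis=1)
    return (d1 < d0).astype(int)

# One stage classifier per transformed view; weight each by training accuracy.
stage_preds, weights = [], []
for f in transforms:
    Xt = f(X)
    model = nearest_centroid_fit(Xt, y)
    pred = nearest_centroid_predict(model, Xt)
    stage_preds.append(pred)
    weights.append((pred == y).mean())

# Consensus: accuracy-weighted vote over the stage outputs.
votes = np.average(np.array(stage_preds), axis=0, weights=weights)
consensus = (votes > 0.5).astype(int)
consensus_acc = (consensus == y).mean()
```

The key structural point carried over from the CNN is that the same input, nonlinearly transformed several times, feeds independent stage classifiers whose outputs are weighted and combined into one decision.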

  9. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
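One basic building block behind such privacy-preserving analytics is additive secret sharing: each party splits its value into random shares so that an aggregate (here a sum) can be reconstructed without any single party's value being revealed. The sketch below is a toy illustration with invented hospital counts; real deployments rely on vetted MPC frameworks and protocols:

```python
import random

def share(value, n_parties, modulus=2**31 - 1):
    """Split an integer into n additive shares that sum to value mod modulus."""
    shares = [random.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

# Three hypothetical hospitals each secret-share a patient count.
counts = [120, 340, 95]
modulus = 2**31 - 1
all_shares = [share(c, 3, modulus) for c in counts]

# Each party locally sums the shares it received -- no raw count is revealed.
party_sums = [sum(s[i] for s in all_shares) % modulus for i in range(3)]

# Only the final recombination reveals the aggregate statistic.
total = sum(party_sums) % modulus
assert total == sum(counts)
```

Each individual share is uniformly random, so a party learns nothing about another party's count from its own shares alone.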

  10. Application of the MCNP5 code to the Modeling of vaginal and intra-uterine applicators used in intracavitary brachytherapy: a first approach

    NASA Astrophysics Data System (ADS)

    Gerardy, I.; Rodenas, J.; Van Dycke, M.; Gallardo, S.; Tondeur, F.

    2008-02-01

    Brachytherapy is a radiotherapy treatment where encapsulated radioactive sources are introduced within a patient. Depending on the technique used, such sources can produce high, medium or low local dose rates. The Monte Carlo method is a powerful tool to simulate sources and devices in order to help physicists in treatment planning. In multiple types of gynaecological cancer, intracavitary brachytherapy (HDR Ir-192 source) is used in combination with other treatments to give an additional local dose to the tumour. Different types of applicators are used in order to increase the dose imparted to the tumour and to limit the effect on healthy surrounding tissues. The aim of this work is to model both applicator and HDR source in order to evaluate the dose at a reference point as well as the effect of the materials constituting the applicators on the near field dose. The MCNP5 code based on the Monte Carlo method has been used for the simulation. Dose calculations have been performed with *F8 energy deposition tally, taking into account photons and electrons. Results from simulation have been compared with experimental in-phantom dose measurements. Differences between calculations and measurements are lower than 5%. The importance of the source position has been underlined.

  11. Investigation of spherical loudspeaker arrays for local active control of sound.

    PubMed

    Peleg, Tomer; Rafaely, Boaz

    2011-10-01

    Active control of sound can be employed globally to reduce noise levels in an entire enclosure, or locally around a listener's head. Recently, spherical loudspeaker arrays have been studied as multiple-channel sources for local active control of sound, presenting the fundamental theory and several active control configurations. In this paper, important aspects of using a spherical loudspeaker array for local active control of sound are further investigated. First, the feasibility of creating sphere-shaped quiet zones away from the source is studied both theoretically and numerically, showing that these quiet zones are associated with sound amplification and poor system robustness. To mitigate the latter, the design of shell-shaped quiet zones around the source is investigated. A combination of two spherical sources is then studied with the aim of enlarging the quiet zone. The two sources are employed to generate quiet zones that surround a rigid sphere, investigating the application of active control around a listener's head. A significant improvement in performance is demonstrated in this case over a conventional headrest-type system that uses two monopole secondary sources. Finally, several simulations are presented to support the theoretical work and to demonstrate the performance and limitations of the system. © 2011 Acoustical Society of America

  12. Supporting spatial data harmonization process with the use of ontologies and Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Strzelecki, M.; Iwaniak, A.; Łukowicz, J.; Kaczmarek, I.

    2013-10-01

    Nowadays, spatial information is not only used by professionals, but also by common citizens, who use it in their daily activities. The Open Data initiative states that data should be freely and unreservedly available for all users. It also applies to spatial data. As spatial data becomes widely available it is essential to publish it in a form that guarantees the possibility of integrating it with other, heterogeneous data sources. Interoperability is the ability to combine spatial data sets from different sources in a consistent way, as well as to provide access to them. Providing syntactic interoperability based on well-known data formats is relatively simple, unlike providing semantic interoperability, due to the multiple possible interpretations of the data. One of the issues connected with the problem of achieving interoperability is data harmonization. It is a process of providing access to spatial data in a representation that allows combining it with other harmonized data in a coherent way by using a common set of data product specifications. Spatial data harmonization is performed by creating definitions of reclassification and transformation rules (mapping schema) for the source application schema. Creation of those rules is a very demanding task which requires wide domain knowledge and a detailed look into application schemas. The paper focuses on proposing methods for supporting the data harmonization process, by automated or supervised creation of mapping schemas with the use of ontologies, ontology matching methods and Semantic Web technologies.

  13. Combined Landsat-8 and Sentinel-2 Burned Area Mapping

    NASA Astrophysics Data System (ADS)

    Huang, H.; Roy, D. P.; Zhang, H.; Boschetti, L.; Yan, L.; Li, Z.

    2017-12-01

    Fire products derived from coarse spatial resolution satellite data have become an important source of information for the multiple user communities involved in fire science and applications. The advent of the MODIS on NASA's Terra and Aqua satellites enabled systematic production of 500m global burned area maps. There is, however, an unequivocal demand for systematically generated higher spatial resolution burned area products, in particular to examine the role of small-fires for various applications. Moderate spatial resolution contemporaneous satellite data from Landsat-8 and the Sentinel-2A and -2B sensors provide the opportunity for detailed spatial mapping of burned areas. Combined, these polar-orbiting systems provide 10m to 30m multi-spectral global coverage more than once every three days. This NASA funded research presents results to prototype a combined Landsat-8 Sentinel-2 burned area product. The Landsat-8 and Sentinel-2 pre-processing, the time-series burned area mapping algorithm, and preliminary results and validation using high spatial resolution commercial satellite data over Africa are presented.
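The prototype algorithm itself is a time-series method, but a standard ingredient of burned area mapping from Landsat/Sentinel-class imagery is the Normalized Burn Ratio and its pre/post-fire difference (dNBR). The sketch below is illustrative only: reflectances are invented and the 0.27 dNBR threshold is a commonly quoted convention, not a value from this research:

```python
import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio from NIR and SWIR-2 reflectance."""
    nir, swir2 = np.asarray(nir, float), np.asarray(swir2, float)
    return (nir - swir2) / (nir + swir2)

# Hypothetical pre- and post-fire reflectances for three pixels.
pre = nbr(nir=[0.40, 0.42, 0.38], swir2=[0.12, 0.11, 0.13])
post = nbr(nir=[0.18, 0.41, 0.17], swir2=[0.25, 0.12, 0.24])
dnbr = pre - post
burned = dnbr > 0.27        # an often-used dNBR threshold (assumed here)
```

Burning lowers NIR and raises SWIR-2 reflectance, so NBR drops sharply in burned pixels while staying roughly constant in unburned ones.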

  14. Airborne Dioxins, Furans and Polycyclic Aromatic Hydrocarbons Exposure to Military Personnel in Iraq

    PubMed Central

    Masiol, Mauro; Mallon, Timothy; Haines, Kevin M.; Utell, Mark J.; Hopke, Philip K.

    2016-01-01

    Objectives The objective was to use ambient polycyclic aromatic hydrocarbon (PAH), polychlorinated dibenzo-p-dioxins (PCDD) and polychlorinated dibenzofurans (PCDF) concentrations measured at Joint Base Balad in Iraq in 2007 to identify the sources of these species and their spatial patterns. Methods The ratios of the measured species were compared to literature data for likely emission sources. Using the multiple site measurements on specific days, contour maps have been drawn using inverse distance weighting (IDW). Results These analyses suggest multiple sources including the burn pit (primarily a source of PCDD/PCDFs), the transportation field (primarily a source of PAHs) and other sources of PAHs that include aircraft, space heating, and diesel power generation. Conclusions The nature and locations of the sources were identified. PCDD/PCDFs were emitted by the burn pit. Multiple PAH sources exist across the base. PMID:27501100
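The contour maps were produced with inverse distance weighting; a minimal sketch of IDW interpolation (the site coordinates and concentrations below are hypothetical, not the study's measurements):

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of point measurements."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.asarray(xy_query, float):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                      # query point is on a station
            out.append(values[d.argmin()])
            continue
        w = 1.0 / d**power                     # closer stations weigh more
        out.append((w * values).sum() / w.sum())
    return np.array(out)

# Hypothetical PAH concentrations at four monitoring sites (x, y in km).
sites = [(0, 0), (1, 0), (0, 1), (1, 1)]
conc = [2.0, 8.0, 3.0, 5.0]
grid = idw(sites, conc, [(0.5, 0.5), (0, 0)])
```

At the center of the unit square all four sites are equidistant, so IDW reduces to their plain mean; at a station location it returns the station's own value.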

  15. Challenges with secondary use of multi-source water-quality data in the United States

    USGS Publications Warehouse

    Sprague, Lori A.; Oelsner, Gretchen P.; Argue, Denise M.

    2017-01-01

    Combining water-quality data from multiple sources can help counterbalance diminishing resources for stream monitoring in the United States and lead to important regional and national insights that would not otherwise be possible. Individual monitoring organizations understand their own data very well, but issues can arise when their data are combined with data from other organizations that have used different methods for reporting the same common metadata elements. Such use of multi-source data is termed “secondary use”—the use of data beyond the original intent determined by the organization that collected the data. In this study, we surveyed more than 25 million nutrient records collected by 488 organizations in the United States since 1899 to identify major inconsistencies in metadata elements that limit the secondary use of multi-source data. Nearly 14.5 million of these records had missing or ambiguous information for one or more key metadata elements, including (in decreasing order of records affected) sample fraction, chemical form, parameter name, units of measurement, precise numerical value, and remark codes. As a result, metadata harmonization to make secondary use of these multi-source data will be time consuming, expensive, and inexact. Different data users may make different assumptions about the same ambiguous data, potentially resulting in different conclusions about important environmental issues. The value of these ambiguous data is estimated at $US12 billion, a substantial collective investment by water-resource organizations in the United States. By comparison, the value of unambiguous data is estimated at $US8.2 billion. The ambiguous data could be preserved for uses beyond the original intent by developing and implementing standardized metadata practices for future and legacy water-quality data throughout the United States.
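In practice, harmonizing metadata elements such as units and sample fraction amounts to mapping inconsistent codes onto canonical values and flagging ambiguous records for manual review. A small sketch under assumed crosswalk tables (the codes, factors, and records below are illustrative, not from the surveyed databases):

```python
# Hypothetical crosswalks; real harmonization needs organization-specific review.
UNIT_FACTORS = {"mg/l": 1.0, "ug/l": 0.001, "ppm": 1.0}      # -> mg/L
FRACTION_MAP = {"diss": "dissolved", "dissolved": "dissolved",
                "tot": "total", "total": "total"}

def harmonize(record):
    """Return a record with canonical units and sample fraction, or None if ambiguous."""
    unit = record.get("unit", "").strip().lower()
    frac = record.get("fraction", "").strip().lower()
    if unit not in UNIT_FACTORS or frac not in FRACTION_MAP:
        return None                 # ambiguous metadata: flag for manual review
    return {"parameter": record["parameter"].strip().lower(),
            "value_mg_per_l": record["value"] * UNIT_FACTORS[unit],
            "fraction": FRACTION_MAP[frac]}

records = [
    {"parameter": "Nitrate", "value": 1500.0, "unit": "ug/L", "fraction": "Diss"},
    {"parameter": "Nitrate", "value": 1.5, "unit": "mg/L", "fraction": "total"},
    {"parameter": "Nitrate", "value": 2.0, "unit": "", "fraction": "total"},
]
clean = [harmonize(r) for r in records]
```

The third record, with a missing unit, is exactly the kind of ambiguous entry the survey found in nearly 14.5 million records: it cannot be harmonized without going back to the originating organization.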

  16. Evaluating the distribution of terrestrial dissolved organic matter in a complex coastal ecosystem using fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamashita, Youhei; Boyer, Joseph N.; Jaffé, Rudolf

    2013-09-01

    The coastal zone of the Florida Keys features the only living coral reef in the continental United States and as such represents a unique regional environmental resource. Anthropogenic pressures combined with climate disturbances such as hurricanes can affect the biogeochemistry of the region and threaten the health of this unique ecosystem. As such, water quality monitoring has historically been implemented in the Florida Keys, and six spatially distinct zones have been identified. In these studies however, dissolved organic matter (DOM) has only been studied as a quantitative parameter, and DOM composition can be a valuable biogeochemical parameter in assessing environmental change in coastal regions. Here we report the first data of its kind on the application of optical properties of DOM, in particular excitation emission matrix fluorescence with parallel factor analysis (EEM-PARAFAC), throughout these six Florida Keys regions in an attempt to assess spatial differences in DOM sources. Our data suggest that while DOM in the Florida Keys can be influenced by distant terrestrial environments such as the Everglades, spatial differences in DOM distribution were also controlled in part by local surface runoff/fringe mangroves, contributions from seagrass communities, as well as the reefs and waters from the Florida Current. Application of principal component analysis (PCA) of the relative abundance of EEM-PARAFAC components allowed for a clear distinction between the sources of DOM (allochthonous vs. autochthonous), between different autochthonous sources and/or the diagenetic status of DOM, and further clarified the contribution of terrestrial DOM in zones where levels of DOM were low in abundance. The combination between EEM-PARAFAC and PCA proved to be ideally suited to discern DOM composition and source differences in coastal zones with complex hydrology and multiple DOM sources.
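The PCA step operates on the relative abundances of the fitted PARAFAC components across samples; a minimal sketch with a toy abundance matrix standing in for real EEM-PARAFAC output (component values and the two source groups are invented for illustration):

```python
import numpy as np

# Hypothetical relative abundances of 4 EEM-PARAFAC components at 6 stations
# (rows sum to 1); real inputs would come from the fitted PARAFAC model.
F = np.array([
    [0.60, 0.20, 0.15, 0.05],   # terrestrial/humic-like-dominated samples
    [0.55, 0.25, 0.15, 0.05],
    [0.50, 0.25, 0.20, 0.05],
    [0.20, 0.15, 0.45, 0.20],   # autochthonous/protein-like-dominated samples
    [0.15, 0.20, 0.45, 0.20],
    [0.10, 0.20, 0.50, 0.20],
])

# PCA via SVD of the column-centered matrix.
Xc = F - F.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                      # sample coordinates on the PCs
explained = s**2 / (s**2).sum()     # fraction of variance per PC

# In this toy example, PC1 separates the two source groups.
group_a = scores[:3, 0]
group_b = scores[3:, 0]
```

The allochthonous-vs-autochthonous distinction described in the abstract corresponds to samples falling into well-separated clusters along the leading principal components.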

  17. Tropical Rainfall Analysis Using TRMM in Combination With Other Satellite Gauge Data: Comparison with Global Precipitation Climatology Project (GPCP) Results

    NASA Technical Reports Server (NTRS)

    Adler, Robert F.; Huffman, George J.; Bolvin, David; Nelkin, Eric; Curtis, Scott

    1999-01-01

    This paper describes recent results of using Tropical Rainfall Measuring Mission (TRMM) information as the key calibration tool in a merged analysis on a 1 deg x 1 deg latitude/longitude monthly scale based on multiple satellite sources and raingauge analysis. The procedure used to produce the GPCP data set is a stepwise approach which first combines the satellite low-orbit microwave and geosynchronous IR observations into a "multi-satellite" product and than merges that result with the raingauge analysis. Preliminary results produced with the still-stabilizing TRMM algorithms indicate that TRMM shows tighter spatial gradients in tropical rain maxima with higher peaks in the center of the maxima. The TRMM analyses will be used to evaluate the evolution of the 1998 ENSO variations, again in comparison with the GPCP analyses.

  18. Bit error rate analysis of the K channel using wavelength diversity

    NASA Astrophysics Data System (ADS)

    Shah, Dhaval; Kothari, Dilip Kumar; Ghosh, Anjan K.

    2017-05-01

    The presence of atmospheric turbulence in the free space causes fading and degrades the performance of a free space optical (FSO) system. To mitigate the turbulence-induced fading, multiple copies of the signal can be transmitted on a different wavelength. Each signal, in this case, will undergo different fadings. This is known as the wavelength diversity technique. Bit error rate (BER) performance of the FSO systems with wavelength diversity under strong turbulence condition is investigated. K-distribution is chosen to model a strong turbulence scenario. The source information is transmitted onto three carrier wavelengths of 1.55, 1.31, and 0.85 μm. The signals at the receiver side are combined using three different methods: optical combining (OC), equal gain combining (EGC), and selection combining (SC). Mathematical expressions are derived for the calculation of the BER for all three schemes (OC, EGC, and SC). Results are presented for the link distance of 2 and 3 km under strong turbulence conditions for all the combining methods. The performance of all three schemes is also compared. It is observed that OC provides better performance than the other two techniques. Proposed method results are also compared with the published article.
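A quick way to see why wavelength diversity helps is to compare outage probabilities (the chance that the combined SNR falls below a threshold) for a single branch versus selection combining and equal gain combining. The sketch below uses gamma-distributed samples as a crude stand-in for K-distributed fading, with illustrative shape, scale, and SNR parameters rather than values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-wavelength irradiance fading on the three carriers; gamma
# samples stand in for K-distributed fading (shape/scale are illustrative).
n = 50_000
I = rng.gamma(shape=2.0, scale=0.5, size=(n, 3))
snr = 10.0 * I**2               # per-branch electrical SNR (intensity detection)

snr_sc = snr.max(axis=1)                      # selection combining: best branch
snr_egc = 10.0 * (I.mean(axis=1))**2          # equal gain combining (field sum)

def outage(s, threshold=1.0):
    """Probability that the (combined) SNR falls below a threshold."""
    return float((s < threshold).mean())

p_single = outage(snr[:, 0])
p_sc = outage(snr_sc)
p_egc = outage(snr_egc)
```

Because the three carriers fade independently, a deep fade must strike all branches at once to cause an outage under combining, so both SC and EGC show a much smaller outage probability than the single-wavelength link.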

  19. Examining the influence of urban definition when assessing relative safety of drinking-water in Nigeria.

    PubMed

    Christenson, Elizabeth; Bain, Robert; Wright, Jim; Aondoakaa, Stephen; Hossain, Rifat; Bartram, Jamie

    2014-08-15

    Reducing inequalities is a priority from a human rights perspective and in water and public health initiatives. There are periodic calls for differential national and global standards for rural and urban areas, often justified by the suggestion that, for a given water source type, safety is worse in urban areas. For instance, initially proposed post-2015 water targets included classifying urban but not rural protected dug wells as unimproved. The objectives of this study were to: (i) examine the influence of urban extent definition on water safety in Nigeria, (ii) compare the frequency of thermotolerant coliform (TTC) contamination and prevalence of sanitary risks between rural and urban water sources of a given type and (iii) investigate differences in exposure to contaminated drinking-water in rural and urban areas. We use spatially referenced data from a Nigerian national randomized sample survey of five improved water source types to assess the extent of any disparities in urban-rural safety. We combined the survey data on TTC and sanitary risk with map layers depicting urban versus rural areas according to eight urban definitions. When examining water safety separately for each improved source type, we found no significant urban-rural differences in TTC contamination and sanitary risk for groundwater sources (boreholes and protected dug wells) and inconclusive findings for piped water and stored water. However, when improved and unimproved source types were combined, TTC contamination was 1.6 to 2.3 times more likely in rural compared to urban water sources depending on the urban definition. Our results suggest that different targets for urban and rural water safety are not justified and that rural dwellers are more exposed to unsafe water than urban dwellers. 
Additionally, urban-rural analyses should consider multiple definitions or indicators of urban areas to assess the robustness of findings and to characterize a gradient that disaggregates the urban-rural dichotomy. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. The Early Detection of the Emerald Ash Borer (eab) Using Multi-Source Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Hu, B.; Naveed, F.; Tasneem, F.; Xing, C.

    2018-04-01

    The objectives of this study were to exploit the synergy of hyperspectral imagery, Light Detection And Ranging (LiDAR), and high-spatial-resolution data in the early detection of the EAB (Emerald Ash Borer) presence in trees within urban areas and to develop a framework to combine information extracted from multiple data sources. To achieve these, an object-oriented framework was developed to combine information derived from the available data sets to characterize ash trees. Within this framework, an advanced individual tree delineation method was developed to delineate individual trees using high-spatial-resolution WorldView-3 imagery together with LiDAR data. Individual trees were then classified into ash and non-ash trees using spectral and spatial information. In order to characterize the health state of individual ash trees, leaves from ash trees with various health states were sampled and measured using a field spectrometer. Based on the field measurements, the indices most sensitive to leaf chlorophyll content were selected. The developed framework and methods were tested using WorldView-3 imagery and airborne LiDAR data over the Keele campus of York University, Toronto, Canada. Satisfactory results were obtained in terms of individual tree crown delineation, ash tree identification, and characterization of the health state of individual ash trees. Quantitative evaluation is being carried out.
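The abstract does not name the chlorophyll-sensitive indices that were selected; one widely used candidate is the red-edge chlorophyll index, sketched below with invented crown-averaged reflectances for healthy versus EAB-stressed ash (both the index choice and the numbers are assumptions for illustration):

```python
import numpy as np

def red_edge_chlorophyll_index(nir, red_edge):
    """CI_red-edge = NIR / red-edge - 1; higher values ~ more chlorophyll.

    One of several indices sensitive to leaf chlorophyll content; the study's
    actual index selection was driven by its field-spectrometer measurements.
    """
    nir = np.asarray(nir, float)
    red_edge = np.asarray(red_edge, float)
    return nir / red_edge - 1.0

# Hypothetical crown-averaged reflectances: healthy vs. EAB-stressed ash.
healthy = red_edge_chlorophyll_index(nir=[0.45, 0.48], red_edge=[0.18, 0.19])
stressed = red_edge_chlorophyll_index(nir=[0.38, 0.36], red_edge=[0.24, 0.25])
```

Chlorophyll loss raises red-edge reflectance relative to NIR, so stressed crowns score lower on the index, which is the basis for ranking tree health states.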

  1. RSAT matrix-clustering: dynamic exploration and redundancy reduction of transcription factor binding motif collections.

    PubMed

    Castro-Mondragon, Jaime Abraham; Jaeger, Sébastien; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2017-07-27

    Transcription factor (TF) databases contain multitudes of binding motifs (TFBMs) from various sources, from which non-redundant collections are derived by manual curation. The advent of high-throughput methods stimulated the production of novel collections with increasing numbers of motifs. Meta-databases, built by merging these collections, contain redundant versions, because available tools are not suited to automatically identify and explore biologically relevant clusters among thousands of motifs. Motif discovery from genome-scale data sets (e.g. ChIP-seq) also produces redundant motifs, hampering the interpretation of results. We present matrix-clustering, a versatile tool that clusters similar TFBMs into multiple trees, and automatically creates non-redundant TFBM collections. A feature unique to matrix-clustering is its dynamic visualisation of aligned TFBMs, and its capability to simultaneously treat multiple collections from various sources. We demonstrate that matrix-clustering considerably simplifies the interpretation of combined results from multiple motif discovery tools, and highlights biologically relevant variations of similar motifs. We also ran a large-scale application to cluster ∼11 000 motifs from 24 entire databases, showing that matrix-clustering correctly groups motifs belonging to the same TF families, and drastically reduced motif redundancy. matrix-clustering is integrated within the RSAT suite (http://rsat.eu/), accessible through a user-friendly web interface or command-line for its integration in pipelines. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Content Integration across Multiple Documents Reduces Memory for Sources

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; McCabe, Rebecca M.; Daniel, Frances

    2016-01-01

    The current experiments systematically examined semantic content integration as a mechanism for explaining source inattention and forgetting when reading-to-remember multiple texts. For all 3 experiments, degree of semantic overlap was manipulated amongst messages provided by various information sources. In Experiment 1, readers' source…

  3. Assessing the chemical contamination dynamics in a mixed land use stream system.

    PubMed

    Sonne, Anne Th; McKnight, Ursula S; Rønde, Vinni; Bjerg, Poul L

    2017-11-15

    Traditionally, the monitoring of streams for chemical and ecological status has been limited to surface water concentrations, where the dominant focus has been on general water quality and the risk for eutrophication. Mixed land use stream systems, comprising urban areas and agricultural production, are challenging to assess with multiple chemical stressors impacting stream corridors. New approaches are urgently needed for identifying relevant sources, pathways and potential impacts for implementation of suitable source management and remedial measures. We developed a method for risk assessing chemical stressors in these systems and applied the approach to a 16-km groundwater-fed stream corridor (Grindsted, Denmark). Three methods were combined: (i) in-stream contaminant mass discharge for source quantification, (ii) Toxic Units and (iii) environmental standards. An evaluation of the chemical quality of all three stream compartments - stream water, hyporheic zone, streambed sediment - made it possible to link chemical stressors to their respective sources and obtain new knowledge about source composition and origin. Moreover, toxic unit estimation and comparison to environmental standards revealed the stream water quality was substantially impaired by both geogenic and diffuse anthropogenic sources of metals along the entire corridor, while the streambed was less impacted. Quantification of the contaminant mass discharge originating from a former pharmaceutical factory revealed that several hundred kilograms of chlorinated ethenes and pharmaceutical compounds discharge into the stream every year. The strongly reduced redox conditions in the plume result in high concentrations of dissolved iron and additionally release arsenic, generating the complex contaminant mixture found in the narrow discharge zone. The fingerprint of the plume was observed in the stream several km downgradient, while nutrients, inorganics and pesticides played a minor role for the stream health.
The results emphasize that future investigations should include multiple compounds and stream compartments, and highlight the need for holistic approaches when risk assessing these dynamic systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
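The Toxic Units approach used as method (ii) expresses each measured concentration as a fraction of an effect concentration and sums over stressors; a minimal sketch (the metal concentrations and EC50 values below are invented, not the Grindsted data):

```python
def toxic_units(concentrations, ec50s):
    """Sum of Toxic Units: TU = sum(C_i / EC50_i) over the measured stressors."""
    return sum(c / e for c, e in zip(concentrations, ec50s))

# Hypothetical dissolved-metal concentrations and EC50s (ug/L) at one station.
conc = {"As": 12.0, "Fe": 900.0, "Zn": 40.0}
ec50 = {"As": 300.0, "Fe": 18000.0, "Zn": 400.0}

tu = toxic_units(conc.values(), ec50.values())  # 0.04 + 0.05 + 0.10 = 0.19
```

A summed TU approaching or exceeding 1 flags a mixture likely to produce acute effects, which is how the approach links chemical measurements to potential ecological impact.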

  4. Multi-epoch VLBA Imaging of 20 New TeV Blazars: Apparent Jet Speeds

    NASA Astrophysics Data System (ADS)

    Piner, B. Glenn; Edwards, Philip G.

    2018-01-01

    We present 88 multi-epoch Very Long Baseline Array (VLBA) images (most at an observing frequency of 8 GHz) of 20 TeV blazars, all of the high-frequency-peaked BL Lac (HBL) class, that have not been previously studied at multiple epochs on the parsec scale. From these 20 sources, we analyze the apparent speeds of 43 jet components that are all detected at four or more epochs. As has been found for other TeV HBLs, the apparent speeds of these components are relatively slow. About two-thirds of the components have an apparent speed that is consistent (within 2σ) with no motion, and some of these components may be stationary patterns whose apparent speed does not relate to the underlying bulk flow speed. In addition, a superluminal tail to the apparent speed distribution of the TeV HBLs is detected for the first time, with eight components in seven sources having a 2σ lower limit on the apparent speed exceeding 1c. We combine the data from these 20 sources with an additional 18 sources from the literature to analyze the complete apparent speed distribution of all 38 TeV HBLs that have been studied with very long baseline interferometry at multiple epochs. The highest 2σ apparent speed lower limit considering all sources is 3.6c. This suggests that bulk Lorentz factors of up to about 4, but probably not much higher, exist in the parsec-scale radio-emitting regions of these sources, consistent with estimates obtained in the radio by other means such as brightness temperatures. This can be reconciled with the high Lorentz factors estimated from the high-energy data if the jet has velocity structures consisting of different emission regions with different Lorentz factors. In particular, we analyze the current apparent speed data for the TeV HBLs in the context of a model with a fast central spine and a slower outer layer.
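The connection between apparent speeds and bulk Lorentz factors follows from the standard superluminal-motion formula, beta_app = beta sin(theta) / (1 - beta cos(theta)), whose maximum over viewing angle is sqrt(gamma^2 - 1). A short sketch showing that a Lorentz factor of about 4 suffices for the mildly superluminal speeds reported here:

```python
import numpy as np

def beta_apparent(gamma, theta_deg):
    """Apparent transverse speed (in units of c) for bulk Lorentz factor gamma
    at viewing angle theta: beta_app = beta sin(theta) / (1 - beta cos(theta))."""
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    th = np.radians(theta_deg)
    return beta * np.sin(th) / (1.0 - beta * np.cos(th))

# A Lorentz factor of ~4 can reproduce mildly superluminal apparent speeds.
speeds = beta_apparent(4.0, np.arange(1.0, 30.0, 1.0))
peak = speeds.max()   # maximum over angle approaches sqrt(gamma^2 - 1) ~ 3.87
```

So apparent speeds up to the observed ~3.6c lower limit are consistent with gamma of roughly 4, while much larger apparent speeds would require higher Lorentz factors.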

  5. White-Light Supercontinuum Laser-Based Multiple Wavelength Excitation for TCSPC-FLIM of Cutaneous Nanocarrier Uptake

    NASA Astrophysics Data System (ADS)

    Volz, Pierre; Brodwolf, Robert; Zoschke, Christian; Haag, Rainer; Schäfer-Korting, Monika; Alexiev, Ulrike

    2018-05-01

    We report here on a custom-built time-correlated single photon-counting (TCSPC)-based fluorescence lifetime imaging microscopy (FLIM) setup with a continuously tunable white-light supercontinuum laser combined with acousto-optical tunable filters (AOTF) as an excitation source for simultaneous excitation of multiple spectrally separated fluorophores. We characterized the wavelength dependence of the white-light supercontinuum laser pulse properties and demonstrated the performance of the FLIM setup, aiming to show the experimental setup in depth together with a biomedical application. We herein summarize the physical-technical parameters as well as our approach to map the skin uptake of nanocarriers using FLIM with a resolution comparable to spectroscopy. As an example, we focus on the penetration study of indocarbocyanine-labeled dendritic core-multishell nanocarriers (CMS-ICC) into reconstructed human epidermis. Unique fluorescence lifetime signatures of indocarbocyanine-labeled nanocarriers indicate nanocarrier-tissue interactions within reconstructed human epidermis, bringing FLIM close to spectroscopic analysis.
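    Although the abstract does not detail the analysis step, fluorescence lifetimes in TCSPC-FLIM are ultimately extracted from per-pixel photon-arrival histograms. A minimal sketch, fitting a mono-exponential decay by log-linear least squares on noise-free synthetic data (real analyses typically add instrument-response deconvolution and maximum-likelihood fitting):

```python
import math

def fit_lifetime(times, counts):
    """Log-linear least-squares fit of counts = A * exp(-t / tau).
    Returns the fitted lifetime tau (same units as times)."""
    ys = [math.log(c) for c in counts]   # ln(counts) is linear in t
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
            sum((x - mx) ** 2 for x in times)
    return -1.0 / slope                  # slope = -1/tau

# Synthetic noise-free decay with tau = 2.5 ns, sampled every 0.1 ns:
tau_true = 2.5
times = [0.1 * i for i in range(100)]
counts = [1000.0 * math.exp(-t / tau_true) for t in times]
print(round(fit_lifetime(times, counts), 3))  # -> 2.5
```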

  6. Apparatus and method for operating internal combustion engines from variable mixtures of gaseous fuels

    DOEpatents

    Heffel, James W [Lake Matthews, CA; Scott, Paul B [Northridge, CA; Park, Chan Seung [Yorba Linda, CA

    2011-11-01

    An apparatus and method for utilizing any arbitrary mixture ratio of multiple fuel gases having differing combustion characteristics, such as natural gas and hydrogen gas, within an internal combustion engine. The gaseous fuel composition ratio is first sensed, such as by thermal conductivity, infrared signature, sound propagation speed, or equivalent mixture differentiation mechanisms and combinations thereof which are utilized as input(s) to a "multiple map" engine control module which modulates selected operating parameters of the engine, such as fuel injection and ignition timing, in response to the proportions of fuel gases available so that the engine operates correctly and at high efficiency irrespective of the gas mixture ratio being utilized. As a result, an engine configured according to the teachings of the present invention may be fueled from at least two different fuel sources without admixing constraints.

  7. Apparatus and method for operating internal combustion engines from variable mixtures of gaseous fuels

    DOEpatents

    Heffel, James W.; Scott, Paul B.

    2003-09-02

    An apparatus and method for utilizing any arbitrary mixture ratio of multiple fuel gases having differing combustion characteristics, such as natural gas and hydrogen gas, within an internal combustion engine. The gaseous fuel composition ratio is first sensed, such as by thermal conductivity, infrared signature, sound propagation speed, or equivalent mixture differentiation mechanisms and combinations thereof which are utilized as input(s) to a "multiple map" engine control module which modulates selected operating parameters of the engine, such as fuel injection and ignition timing, in response to the proportions of fuel gases available so that the engine operates correctly and at high efficiency irrespective of the gas mixture ratio being utilized. As a result, an engine configured according to the teachings of the present invention may be fueled from at least two different fuel sources without admixing constraints.

  8. Causal inference and the data-fusion problem

    PubMed Central

    Bareinboim, Elias; Pearl, Judea

    2016-01-01

    We review concepts, principles, and tools that unify current approaches to causal analysis and attend to new challenges presented by big data. In particular, we address the problem of data fusion—piecing together multiple datasets collected under heterogeneous conditions (i.e., different populations, regimes, and sampling methods) to obtain valid answers to queries of interest. The availability of multiple heterogeneous datasets presents new opportunities to big data analysts, because the knowledge that can be acquired from combined data would not be possible from any individual source alone. However, the biases that emerge in heterogeneous environments require new analytical tools. Some of these biases, including confounding, sampling selection, and cross-population biases, have been addressed in isolation, largely in restricted parametric models. We here present a general, nonparametric framework for handling these biases and, ultimately, a theoretical solution to the problem of data fusion in causal inference tasks. PMID:27382148
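    One of the biases named above, sampling selection, has a textbook correction when the selection mechanism is known: reweight each sampled unit by the inverse of its selection probability. The sketch below illustrates only that idea on synthetic data; it is not the nonparametric machinery of the paper.

```python
import random

random.seed(0)

# Target population: Z ~ Bernoulli(0.5); Y depends on Z, so E[Y] = 1.0.
# The sample over-represents Z = 1 (selection probability depends on Z),
# so the naive sample mean of Y is biased for the population E[Y].
p_select = {0: 0.2, 1: 0.8}            # P(selected | Z), assumed known

def draw_unit():
    z = random.randint(0, 1)
    y = 2.0 * z + random.gauss(0, 0.1)
    return z, y

sample = []
while len(sample) < 20000:
    z, y = draw_unit()
    if random.random() < p_select[z]:  # biased inclusion
        sample.append((z, y))

naive = sum(y for _, y in sample) / len(sample)
# Inverse-probability weighting undoes the selection bias:
w = [1.0 / p_select[z] for z, _ in sample]
ipw = sum(wi * y for wi, (_, y) in zip(w, sample)) / sum(w)
print(round(naive, 2), round(ipw, 2))  # naive ~1.6 (biased), ipw ~1.0
```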

  9. Photoelectron spectroscopic study of the anionic transition metal-organic complexes [Fe(1,2)(COT)](-) and [Co(COT)](-).

    PubMed

    Li, Xiang; Eustis, Soren N; Bowen, Kit H; Kandalam, Anil

    2008-09-28

    The gas-phase, iron and cobalt cyclooctatetraene cluster anions, [Fe(1,2)(COT)](-) and [Co(COT)](-), were generated using a laser vaporization source and studied using mass spectrometry and anion photoelectron spectroscopy. Density functional theory was employed to compute the structures and spin multiplicities of these cluster anions as well as those of their corresponding neutrals. Both experimental and theoretically predicted electron affinities and photodetachment transition energies are in good agreement, authenticating the structures and spin multiplicities predicted by theory. The implied spin magnetic moments of these systems suggest that [Fe(COT)], [Fe(2)(COT)], and [Co(COT)] retain the magnetic moments of the Fe atom, the Fe(2) dimer, and the Co atom, respectively. Thus, the interaction of these transition metal, atomic and dimeric moieties with a COT molecule does not quench their magnetic moments, leading to the possibility that these combinations may be useful in forming novel magnetic materials.

  10. Progress toward Consensus Estimates of Regional Glacier Mass Balances for IPCC AR5

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Gardner, A. S.; Cogley, J. G.

    2011-12-01

    Glaciers are potentially large contributors to rising sea level. Since the last IPCC report in 2007 (AR4), there has been a widespread increase in the use of geodetic observations from satellite and airborne platforms to complement field observations of glacier mass balance, as well as significant improvements in the global glacier inventory. Here we summarize our ongoing efforts to integrate data from multiple sources to arrive at a consensus estimate for each region, and to quantify uncertainties in those estimates. We will use examples from Alaska to illustrate methods for combining Gravity Recovery and Climate Experiment (GRACE), elevation differencing and field observations into a single time series with related uncertainty estimates. We will pay particular attention to reconciling discrepancies between GRACE estimates from multiple processing centers. We will also investigate the extent to which improvements in the glacier inventory affect the accuracy of our regional mass balances.
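    A standard way to merge independent estimates of the same quantity, which could serve as the "single time series with related uncertainty estimates" combination step described here, is inverse-variance weighting. The numbers below are hypothetical, purely for illustration:

```python
def combine_estimates(estimates):
    """Inverse-variance weighted mean of independent estimates.
    estimates: list of (value, one_sigma) pairs.
    Returns (combined_value, combined_one_sigma)."""
    weights = [1.0 / s ** 2 for _, s in estimates]
    total = sum(weights)
    mean = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    sigma = (1.0 / total) ** 0.5
    return mean, sigma

# Hypothetical regional mass-balance estimates in Gt/yr (value, 1-sigma):
grace     = (-50.0, 8.0)   # satellite gravimetry
altimetry = (-42.0, 6.0)   # elevation differencing
field     = (-60.0, 15.0)  # glaciological observations

mean, sigma = combine_estimates([grace, altimetry, field])
print(round(mean, 1), round(sigma, 1))  # -> -46.3 4.6
```

    The combined uncertainty is smaller than any individual one, which is the motivation for integrating data from multiple sources; in practice the estimates are not fully independent and systematic discrepancies (e.g. between GRACE processing centers) must be reconciled first.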

  11. Analysis of Radiation Damage in Light Water Reactors: Comparison of Cluster Analysis Methods for the Analysis of Atom Probe Data.

    PubMed

    Hyde, Jonathan M; DaCosta, Gérald; Hatzoglou, Constantinos; Weekes, Hannah; Radiguet, Bertrand; Styman, Paul D; Vurpillot, Francois; Pareige, Cristelle; Etienne, Auriane; Bonny, Giovanni; Castin, Nicolas; Malerba, Lorenzo; Pareige, Philippe

    2017-04-01

    Irradiation of reactor pressure vessel (RPV) steels causes the formation of nanoscale microstructural features (termed radiation damage), which affect the mechanical properties of the vessel. A key tool for characterizing these nanoscale features is atom probe tomography (APT), due to its high spatial resolution and the ability to identify different chemical species in three dimensions. Microstructural observations using APT can underpin development of a mechanistic understanding of defect formation. However, with atom probe analyses there are currently multiple methods for analyzing the data. This can result in inconsistencies between results obtained from different researchers and unnecessary scatter when combining data from multiple sources. This makes interpretation of results more complex and calibration of radiation damage models challenging. In this work simulations of a range of different microstructures are used to directly compare different cluster analysis algorithms and identify their strengths and weaknesses.
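    Many of the cluster-analysis algorithms compared in APT studies are variants of the maximum-separation (friends-of-friends) idea: solute atoms closer together than a distance d_max are linked, and linked groups with at least N_min atoms are kept as clusters. A minimal stdlib sketch (O(n^2), illustrative only; the thresholds are hypothetical):

```python
import math

def find_clusters(points, d_max, n_min):
    """Friends-of-friends clustering: link points closer than d_max,
    keep connected groups with at least n_min members.
    points: list of (x, y, z) tuples; returns lists of point indices."""
    n = len(points)
    visited = [False] * n
    clusters = []
    for i in range(n):
        if visited[i]:
            continue
        stack, group = [i], []
        visited[i] = True
        while stack:
            j = stack.pop()
            group.append(j)
            for k in range(n):
                if not visited[k] and math.dist(points[j], points[k]) <= d_max:
                    visited[k] = True
                    stack.append(k)
        if len(group) >= n_min:
            clusters.append(sorted(group))
    return clusters

# Two tight clumps of solute atoms plus one isolated atom:
pts = [(0, 0, 0), (0.3, 0, 0), (0, 0.3, 0),    # clump A
       (5, 5, 5), (5.3, 5, 5), (5, 5.3, 5),    # clump B
       (10, 0, 0)]                             # isolated
print(find_clusters(pts, d_max=0.5, n_min=3))  # -> [[0, 1, 2], [3, 4, 5]]
```

    The inconsistencies the abstract describes arise precisely because results depend on choices such as d_max and N_min, which differ between researchers and algorithms.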

  12. Observational constraints on the physical nature of submillimetre source multiplicity: chance projections are common

    NASA Astrophysics Data System (ADS)

    Hayward, Christopher C.; Chapman, Scott C.; Steidel, Charles C.; Golob, Anneya; Casey, Caitlin M.; Smith, Daniel J. B.; Zitrin, Adi; Blain, Andrew W.; Bremer, Malcolm N.; Chen, Chian-Chou; Coppin, Kristen E. K.; Farrah, Duncan; Ibar, Eduardo; Michałowski, Michał J.; Sawicki, Marcin; Scott, Douglas; van der Werf, Paul; Fazio, Giovanni G.; Geach, James E.; Gurwell, Mark; Petitpas, Glen; Wilner, David J.

    2018-05-01

    Interferometric observations have demonstrated that a significant fraction of single-dish submillimetre (submm) sources are blends of multiple submm galaxies (SMGs), but the nature of this multiplicity, i.e. whether the galaxies are physically associated or chance projections, has not been determined. We performed spectroscopy of 11 SMGs in six multicomponent submm sources, obtaining spectroscopic redshifts for nine of them. For two additional component SMGs, we detected continuum emission but no obvious features. We supplement our observed sources with four single-dish submm sources from the literature. This sample allows us to statistically constrain the physical nature of single-dish submm source multiplicity for the first time. In three (3/7, or 43 +39/-33 per cent at 95 per cent confidence) of the single-dish sources for which the nature of the blending is unambiguous, the components for which spectroscopic redshifts are available are physically associated, whereas 4/7 (57 +33/-39 per cent) have at least one unassociated component. When components whose spectra exhibit continuum but no features and for which the photometric redshift is significantly different from the spectroscopic redshift of the other component are also considered, 6/9 (67 +26/-37 per cent) of the single-dish sources contain at least one unassociated component SMG. The nature of the multiplicity of one single-dish source is ambiguous. We conclude that physically associated systems and chance projections both contribute to the multicomponent single-dish submm source population. This result contradicts the conventional wisdom that bright submm sources are solely a result of merger-induced starbursts, as blending of unassociated galaxies is also important.
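    The quoted asymmetric uncertainties are consistent with exact (Clopper-Pearson) binomial confidence intervals; assuming that convention, the interval for 3 of 7 sources can be reproduced by bisection using only the standard library:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact binomial confidence interval via bisection on the CDF."""
    def solve(f):
        lo, hi = 0.0, 1.0          # invariant: f(lo) True, f(hi) False
        for _ in range(60):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if f(mid) else (lo, mid)
        return (lo + hi) / 2
    # lower bound: P(X >= k | p) = alpha/2, i.e. 1 - CDF(k-1) = alpha/2
    lower = 0.0 if k == 0 else solve(lambda p: 1 - binom_cdf(k - 1, n, p) < alpha / 2)
    # upper bound: P(X <= k | p) = alpha/2
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(3, 7)
# 3/7 = 43 per cent, interval ~10-82 per cent, i.e. 43 (+39/-33) per cent
print(round(100 * lo), round(100 * hi))  # -> 10 82
```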

  13. Enabling technologies for fiber optic sensing

    NASA Astrophysics Data System (ADS)

    Ibrahim, Selwan K.; Farnan, Martin; Karabacak, Devrez M.; Singer, Johannes M.

    2016-04-01

    In order for fiber optic sensors to compete with electrical sensors, several critical parameters need to be addressed, such as performance, cost, size, and reliability. Relying on technologies developed in different industrial sectors helps to achieve this goal in a more efficient and cost-effective way. FAZ Technology has developed a tunable-laser-based optical interrogator built on technologies from the telecommunication sector, and optical transducers/sensors based on components sourced from the automotive market. By combining Fiber Bragg Grating (FBG) sensing technology with the above, high-speed, high-precision, reliable quasi-distributed optical sensing systems for temperature, pressure, acoustics, acceleration, and other quantities have been developed. Careful design is needed to filter out sources of measurement drift and error due to effects such as polarization and birefringence, coating imperfections, and sensor packaging. Also, to achieve high-speed, high-performance optical sensing systems, combining and synchronizing multiple optical interrogators, much as multiple processors are combined to deliver supercomputing power, is an attractive solution. This path can be achieved by using photonic integrated circuit (PIC) technology, which opens the door to scaling up and delivering powerful optical sensing systems in an efficient and cost-effective way.

  14. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    PubMed

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

    Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, which is implemented in JAVA and can work on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as an input and output, and also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana

    2014-05-01

    In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The identification of relevant vulnerability indicators allowed for the identification of the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility to critical infrastructures was assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining hazard and vulnerability analysis.
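    The final hazard-vulnerability combination is not spelled out in the abstract; a common convention, shown here purely as an illustration with hypothetical nodes and numbers, scores risk as the product of a hazard exceedance probability and a normalized vulnerability index:

```python
def risk_score(hazard, vulnerability):
    """Simple risk index: product of a hazard probability (0-1)
    and a normalized vulnerability index (0-1)."""
    return hazard * vulnerability

# Hypothetical territorial nodes: (name, P(tephra load > threshold), vulnerability)
nodes = [
    ("airport",     0.30, 0.9),
    ("power plant", 0.10, 0.8),
    ("farmland",    0.60, 0.3),
]
ranked = sorted(nodes, key=lambda n: risk_score(n[1], n[2]), reverse=True)
for name, h, v in ranked:
    print(f"{name}: {risk_score(h, v):.2f}")
```

    Ranking by the combined index rather than by hazard or vulnerability alone is what identifies the "most critical areas and territorial nodes": here the airport outranks the farmland despite a lower hazard probability.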

  16. A Location Method Using Sensor Arrays for Continuous Gas Leakage in Integrally Stiffened Plates Based on the Acoustic Characteristics of the Stiffener

    PubMed Central

    Bian, Xu; Li, Yibo; Feng, Hao; Wang, Jiaqiang; Qi, Lei; Jin, Shijiu

    2015-01-01

    This paper proposes a continuous leakage location method based on an ultrasonic sensor array, specific to continuous gas leakage in a pressure container with an integral stiffener. The method collects the ultrasonic signals generated at the leakage hole through the piezoelectric ultrasonic sensor array and analyzes the space-time correlation of every collected signal in the array. It then combines this with frequency compensation and superposition in the time domain (SITD), based on the acoustic characteristics of the stiffener, to obtain a high-accuracy location result on the stiffener wall. According to the experimental results, the method successfully solves the orientation problem for continuous ultrasonic signals generated by leakage sources, and acquires high-accuracy location information on the leakage source by combining multiple sets of orienting results. The mean absolute location error is 13.51 mm on a one-square-metre plate with an integral stiffener (4 mm width; 20 mm height; 197 mm spacing), and the maximum absolute location error generally falls within a ±25 mm interval. PMID:26404316
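    The space-time correlation step can be illustrated with its simplest ingredient: estimating the time difference of arrival between two sensors by cross-correlation. The signals and delay below are synthetic:

```python
def xcorr_delay(a, b):
    """Return the integer lag (in samples) maximizing the cross-correlation
    of b against a, i.e. the lag by which b is a delayed copy of a."""
    n = len(a)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        val = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if val > best_val:
            best_val, best_lag = val, lag
    return best_lag

# Synthetic burst arriving 7 samples later at sensor 2:
burst = [0, 1, 3, 5, 3, 1, 0]
s1 = burst + [0] * 20
s2 = [0] * 7 + burst + [0] * 13
delay = xcorr_delay(s1, s2)
print(delay)  # -> 7
```

    Given the sampling rate and the (stiffener-dependent) propagation speed, such lags between array elements translate into the orienting results that the method combines into a location estimate.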

  17. Combination of surface and borehole seismic data for robust target-oriented imaging

    NASA Astrophysics Data System (ADS)

    Liu, Yi; van der Neut, Joost; Arntsen, Børge; Wapenaar, Kees

    2016-05-01

    A novel application of seismic interferometry (SI) and Marchenko imaging using both surface and borehole data is presented. A series of redatuming schemes is proposed to combine both data sets for robust deep local imaging in the presence of velocity uncertainties. The redatuming schemes create a virtual acquisition geometry where both sources and receivers lie at the horizontal borehole level, thus only a local velocity model near the borehole is needed for imaging, and erroneous velocities in the shallow area have no effect on imaging around the borehole level. By joining the advantages of SI and Marchenko imaging, a macrovelocity model is no longer required and the proposed schemes use only single-component data. Furthermore, the schemes result in a set of virtual data that have fewer spurious events and internal multiples than previous virtual source redatuming methods. Two numerical examples are shown to illustrate the workflow and to demonstrate the benefits of the method. One is a synthetic model and the other is a realistic model of a field in the North Sea. In both tests, improved local images near the boreholes are obtained using the redatumed data without accurate velocities, because the redatumed data are close to the target.

  18. Detection, localization and classification of multiple dipole-like magnetic sources using magnetic gradient tensor data

    NASA Astrophysics Data System (ADS)

    Gang, Yin; Yingtang, Zhang; Hongbo, Fan; Zhining, Li; Guoquan, Ren

    2016-05-01

    We have developed a method for automatic detection, localization and classification (DLC) of multiple dipole sources using magnetic gradient tensor data. First, we define modified tilt angles to estimate the approximate horizontal locations of the multiple dipole-like magnetic sources simultaneously and detect the number of magnetic sources using a fixed threshold. Secondly, based on the isotropy of the normalized source strength (NSS) response of a dipole, we obtain accurate horizontal locations of the dipoles. Then the vertical locations are calculated using magnitude magnetic transforms of magnetic gradient tensor data. Finally, we invert for the magnetic moments of the sources using the measured magnetic gradient tensor data and forward model. Synthetic and field data sets demonstrate effectiveness and practicality of the proposed method.
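    The forward model in the moment-inversion step is the point-dipole field; outside the source the gradient tensor is symmetric and traceless (since div B = 0 and curl B = 0), properties that tensor invariants such as the NSS exploit. A sketch that builds the tensor by finite differences and checks both properties (the moment and position values are arbitrary):

```python
import math

MU0_4PI = 1e-7  # mu_0 / (4*pi) in SI units

def dipole_field(m, r):
    """Magnetic field (T) at displacement r (m) from a point dipole with
    moment m (A*m^2): B = mu0/4pi * (3(m.rhat)rhat - m) / |r|^3."""
    rr = math.sqrt(sum(x * x for x in r))
    rhat = [x / rr for x in r]
    mdotr = sum(mi * ri for mi, ri in zip(m, rhat))
    return [MU0_4PI * (3 * mdotr * rhat[i] - m[i]) / rr ** 3 for i in range(3)]

def gradient_tensor(m, r, h=1e-4):
    """Magnetic gradient tensor G[i][j] = dB_i/dx_j by central differences."""
    G = [[0.0] * 3 for _ in range(3)]
    for j in range(3):
        rp = list(r); rp[j] += h
        rm = list(r); rm[j] -= h
        bp, bm = dipole_field(m, rp), dipole_field(m, rm)
        for i in range(3):
            G[i][j] = (bp[i] - bm[i]) / (2 * h)
    return G

G = gradient_tensor(m=[0.0, 0.0, 100.0], r=[2.0, 1.0, 3.0])
trace = G[0][0] + G[1][1] + G[2][2]
print(abs(trace) < 1e-12, abs(G[0][1] - G[1][0]) < 1e-12)  # traceless, symmetric
```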

  19. ZnO-based multiple channel and multiple gate FinMOSFETs

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Ting; Huang, Hung-Lin; Tseng, Chun-Yen; Lee, Hsin-Ying

    2016-02-01

    In recent years, zinc oxide (ZnO)-based metal-oxide-semiconductor field-effect transistors (MOSFETs) have attracted much attention, because ZnO-based semiconductors possess several advantages, including large exciton binding energy, nontoxicity, biocompatibility, low material cost, and a wide direct bandgap. Moreover, the ZnO-based MOSFET is one of the most promising devices, due to its applications in microwave power amplifiers, logic circuits, large-scale integrated circuits, and logic swing. In this study, to enhance the performance of ZnO-based MOSFETs, ZnO-based multiple-channel and multiple-gate FinMOSFETs were fabricated using a simple laser interference photolithography method and a self-aligned photolithography method. The multiple-channel structure provided additional sidewall depletion-width control to improve channel controllability, because the sidewall portions of the multiple channels were surrounded by the gate electrode. Furthermore, the multiple-gate structure had a shorter distance between source and gate, and a shorter gate length between two gates, to enhance gate operating performance. In addition, the shorter source-gate distance could enhance the electron velocity in the channel fin structure of the multiple-gate structure. In this work, ninety-one channels and four gates were used in the FinMOSFETs. Consequently, the drain-source saturation current (IDSS) and maximum transconductance (gm) of the ZnO-based multiple-channel and multiple-gate FinMOSFETs, operated at a drain-source voltage (VDS) of 10 V and a gate-source voltage (VGS) of 0 V, were improved from 11.5 mA/mm to 13.7 mA/mm and from 4.1 mS/mm to 6.9 mS/mm, respectively, in comparison with conventional ZnO-based single-channel and single-gate MOSFETs.

  20. Technology Performance Level (TPL) Scoring Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Jochem; Roberts, Jesse D.; Costello, Ronan

    2016-09-01

    Three different ways of combining scores are used in the revised formulation: arithmetic mean, geometric mean, and multiplication with normalisation. Arithmetic mean is used when combining scores that measure similar attributes, e.g. for combining costs. The arithmetic mean has the property that it is similar to a logical OR; e.g. when combining costs it does not matter what the individual costs are, only what the combined cost is. Geometric mean and multiplication are used when combining scores that measure disparate attributes. Multiplication is similar to a logical AND; it is used to combine 'must haves.' As a result, this method is more punitive than the geometric mean: to get a good score in the combined result it is necessary to have a good score in ALL of the inputs, e.g. the different types of survivability are 'must haves.' On balance, the revised TPL is probably less punitive than the previous spreadsheet, as multiplication is used sparingly as a method of combining scores. This is in line with the feedback of the Wave Energy Prize judges.
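    The three combination rules described above can be sketched directly. The 0-9 scale and the sub-scores below are hypothetical, and 'normalisation' is taken here to mean rescaling the product back onto that scale:

```python
def arithmetic_mean(scores):
    """OR-like: strong scores can offset weak ones (used for costs)."""
    return sum(scores) / len(scores)

def geometric_mean(scores):
    """Penalizes low scores more heavily than the arithmetic mean."""
    prod = 1.0
    for s in scores:
        prod *= s
    return prod ** (1.0 / len(scores))

def normalized_product(scores, top=9.0):
    """AND-like 'must haves': multiply, then rescale back to the 0-top scale.
    A single near-zero input drags the combined score toward zero."""
    prod = 1.0
    for s in scores:
        prod *= s / top
    return top * prod

scores = [8.0, 7.0, 2.0]   # hypothetical sub-scores on a 0-9 scale
print(round(arithmetic_mean(scores), 2),
      round(geometric_mean(scores), 2),
      round(normalized_product(scores), 2))  # -> 5.67 4.82 1.38
```

    With the single low sub-score of 2, the product rule yields 1.38 against 4.82 for the geometric mean and 5.67 for the arithmetic mean, illustrating why multiplication is the most punitive of the three.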
