Sample records for multiple source analysis

  1. Regression Models for the Analysis of Longitudinal Gaussian Data from Multiple Sources

    PubMed Central

    O’Brien, Liam M.; Fitzmaurice, Garrett M.

    2006-01-01

    We present a regression model for the joint analysis of longitudinal multiple source Gaussian data. Longitudinal multiple source data arise when repeated measurements are taken from two or more sources, and each source provides a measure of the same underlying variable and on the same scale. This type of data generally produces a relatively large number of observations per subject; thus estimation of an unstructured covariance matrix often may not be possible. We consider two methods by which parsimonious models for the covariance can be obtained for longitudinal multiple source data. The methods are illustrated with an example of multiple informant data arising from a longitudinal interventional trial in psychiatry. PMID:15726666

  2. Analysis in Motion Initiative – Summarization Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin; Pirrung, Meg; Jasper, Rob

    2017-06-22

    Analysts are tasked with integrating information from multiple data sources for important and timely decision making. What if sense making and overall situation awareness could be improved through visualization techniques? The Analysis in Motion initiative is advancing the ability to summarize and abstract multiple streams and static data sources over time.

  3. Multiple fingerprinting analyses in quality control of Cassiae Semen polysaccharides.

    PubMed

    Cheng, Jing; He, Siyu; Wan, Qiang; Jing, Pu

    2018-03-01

    Quality control issues overshadow the potential health benefits of Cassiae Semen because of analytic limitations. In this study, multiple-fingerprint analysis integrated with several chemometric methods was performed to assess the polysaccharide quality of Cassiae Semen harvested from different locations. FT-IR, HPLC, and GC fingerprints of polysaccharide extracts from the authentic source were established as standard profiles and applied to assess the quality of foreign sources. Analyses of FT-IR fingerprints of polysaccharide extracts using either Pearson correlation analysis or principal component analysis (PCA), or of HPLC fingerprints of partially hydrolyzed polysaccharides with PCA, distinguished the foreign sources from the authentic source. However, HPLC or GC fingerprints of completely hydrolyzed polysaccharides could not identify all foreign sources, and the GC methodology is quite limited in determining the monosaccharide composition. This indicates that FT-IR/HPLC fingerprints of non-/partially-hydrolyzed polysaccharides, respectively, accompanied by multiple chemometric methods, might be applied in detecting and differentiating sources of Cassiae Semen. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Using real options analysis to support strategic management decisions

    NASA Astrophysics Data System (ADS)

    Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan

    2013-12-01

    Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision-making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numeric results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement in management decisions.
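    As a hedged illustration of the lattice machinery this abstract builds on, the sketch below values a simple European-style option on a standard Cox-Ross-Rubinstein binomial tree; the authors' modified tree for multiple sources of uncertainty is not reproduced here, and all parameter values are hypothetical.

```python
import math

def binomial_option_value(S0, K, r, sigma, T, steps, option="call"):
    """Value a European option on a CRR binomial lattice.

    In a real-options reading, S0 is the present value of the project,
    K the investment cost, and sigma the volatility of project value.
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)

    # terminal payoffs at the last layer of the lattice
    values = []
    for i in range(steps + 1):
        S = S0 * (u ** i) * (d ** (steps - i))
        payoff = max(S - K, 0.0) if option == "call" else max(K - S, 0.0)
        values.append(payoff)

    # backward induction through the lattice
    for step in range(steps, 0, -1):
        values = [disc * (p * values[i + 1] + (1 - p) * values[i])
                  for i in range(step)]
    return values[0]

print(round(binomial_option_value(100, 100, 0.05, 0.2, 1.0, 200), 2))
```

    With enough steps this converges to the Black-Scholes value for the same inputs, which is one reason the binomial tree is a convenient base for ROA extensions.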

  5. Joint Blind Source Separation by Multi-set Canonical Correlation Analysis

    PubMed Central

    Li, Yi-Ou; Adalı, Tülay; Wang, Wei; Calhoun, Vince D

    2009-01-01

    In this work, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multi-set canonical correlation analysis (M-CCA) [1]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions, and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by M-CCA through maximization of correlation among the extracted sources. We compare source separation performance of the M-CCA scheme with other joint BSS methods and demonstrate the superior performance of the M-CCA scheme in achieving joint BSS for a large number of datasets, group of corresponding sources with heterogeneous correlation values, and complex-valued sources with circular and non-circular distributions. We apply M-CCA to analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. PMID:20221319
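    A minimal sketch of the two-dataset case of the machinery behind M-CCA: classical CCA, computed as an SVD of the cross-correlation of whitened data, recovers near-unit canonical correlations when the datasets share latent sources. The mixing matrices, noise level, and Laplace sources below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# two latent sources observed (noisily) through two different mixings
n = 2000
S = rng.laplace(size=(2, n))
A1 = np.array([[1.0, 0.5], [0.3, 1.0]])
A2 = np.array([[0.8, -0.4], [0.2, 0.9]])
X1 = A1 @ S + 0.05 * rng.normal(size=(2, n))
X2 = A2 @ S + 0.05 * rng.normal(size=(2, n))

def whiten(X):
    """Center and decorrelate X (rows = channels)."""
    X = X - X.mean(axis=1, keepdims=True)
    U, sv, _ = np.linalg.svd(X, full_matrices=False)
    return (U / sv).T @ X * np.sqrt(X.shape[1])

Z1, Z2 = whiten(X1), whiten(X2)
# classical two-set CCA: singular values of the cross-correlation
# of whitened data are the canonical correlations
corr = np.linalg.svd(Z1 @ Z2.T / n, compute_uv=False)
print(corr)  # near 1 when the two datasets share latent sources
```

    M-CCA generalizes this pairwise construction to many datasets by maximizing correlation among all extracted sources jointly.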

  6. Multiple Component Event-Related Potential (mcERP) Estimation

    NASA Technical Reports Server (NTRS)

    Knuth, K. H.; Clanton, S. T.; Shah, A. S.; Truccolo, W. A.; Ding, M.; Bressler, S. L.; Trejo, L. J.; Schroeder, C. E.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We show how model-based estimation of the neural sources responsible for transient neuroelectric signals can be improved by the analysis of single-trial data. Previously, we showed that a multiple component event-related potential (mcERP) algorithm can extract the responses of individual sources from recordings of a mixture of multiple, possibly interacting, neural ensembles. McERP also estimated single-trial amplitudes and onset latencies, thus allowing more accurate estimation of ongoing neural activity during an experimental trial. The mcERP algorithm is related to infomax independent component analysis (ICA); however, the underlying signal model is more physiologically realistic in that a component is modeled as a stereotypic waveshape varying both in amplitude and onset latency from trial to trial. The result is a model that reflects quantities of interest to the neuroscientist. Here we demonstrate that the mcERP algorithm provides more accurate results than traditional methods such as factor analysis and the more recent ICA. Whereas factor analysis assumes the sources are orthogonal and ICA assumes the sources are statistically independent, the mcERP algorithm makes no such assumptions, thus allowing investigators to examine interactions among components by estimating the properties of single-trial responses.

  7. Joint source based analysis of multiple brain structures in studying major depressive disorder

    NASA Astrophysics Data System (ADS)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.

  8. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models, or methods. These differences can be quantitatively described from three main aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method is constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory is used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory is used to correct the multiple surface reflectance datasets based on the physical characteristics, mathematical distribution properties, and spatial variations obtained above. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for their corresponding consistency analysis and evaluation.

  9. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models, or methods. These differences can be quantitatively described from three main aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method is constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory is used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory is used to correct the multiple surface reflectance datasets based on the physical characteristics, mathematical distribution properties, and spatial variations obtained above. The proposed method was verified with two sets of multiple satellite images, obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for their corresponding consistency analysis and evaluation. PMID:25405760
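    The Gaussian-distribution-based correction step can be caricatured as mapping one dataset's first two moments onto the baseline's. The sketch below is a deliberately simplified stand-in (plain moment matching on synthetic reflectance values), not the authors' full method.

```python
import numpy as np

def gaussian_match(target, baseline):
    """Shift and scale `target` so its mean/std match `baseline`.

    Both reflectance datasets are treated as approximately Gaussian,
    and the coarser-scale one is mapped onto the baseline statistics.
    """
    t_mu, t_sd = target.mean(), target.std()
    b_mu, b_sd = baseline.mean(), baseline.std()
    return (target - t_mu) / t_sd * b_sd + b_mu

rng = np.random.default_rng(1)
fine   = rng.normal(0.25, 0.04, size=10_000)   # baseline reflectance
coarse = rng.normal(0.31, 0.06, size=10_000)   # biased, other sensor
corrected = gaussian_match(coarse, fine)
print(round(corrected.mean(), 3), round(corrected.std(), 3))
```

    After the transform the corrected dataset matches the baseline mean and standard deviation exactly, by construction.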

  10. MULGRES: a computer program for stepwise multiple regression analysis

    Treesearch

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck designed for multiple regression analysis employing the technique of stepwise deletion in the search for the most significant variables. The features of the program, along with its inputs and outputs, are briefly described, with a note on machine compatibility.
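    MULGRES itself is a FORTRAN-era card deck; as a modern sketch of the stepwise-deletion idea it implements, the routine below repeatedly drops the predictor with the smallest |t| statistic until all survivors are significant. The threshold and synthetic data are illustrative assumptions.

```python
import numpy as np

def stepwise_deletion(X, y, names, t_drop=2.0):
    """Backward-elimination OLS: fit the full model, then repeatedly
    delete the predictor with the smallest |t| statistic until every
    remaining predictor exceeds the `t_drop` threshold."""
    keep = list(range(X.shape[1]))
    while keep:
        A = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        dof = len(y) - A.shape[1]
        s2 = resid @ resid / dof                       # residual variance
        se = np.sqrt(s2 * np.diag(np.linalg.inv(A.T @ A)))
        t = np.abs(beta[1:] / se[1:])                  # skip the intercept
        worst = int(np.argmin(t))
        if t[worst] >= t_drop:
            break                                      # all survivors significant
        del keep[worst]
    return [names[i] for i in keep]

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=200)   # x1, x3 irrelevant
print(stepwise_deletion(X, y, ["x0", "x1", "x2", "x3"]))
```

    On data like this, the two predictors that actually drive `y` survive the deletion loop.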

  11. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression

    ERIC Educational Resources Information Center

    Beckstead, Jason W.

    2012-01-01

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic…

  12. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  13. Text-Based Argumentation with Multiple Sources: A Descriptive Study of Opportunity to Learn in Secondary English Language Arts, History, and Science

    ERIC Educational Resources Information Center

    Litman, Cindy; Marple, Stacy; Greenleaf, Cynthia; Charney-Sirott, Irisa; Bolz, Michael J.; Richardson, Lisa K.; Hall, Allison H.; George, MariAnne; Goldman, Susan R.

    2017-01-01

    This study presents a descriptive analysis of 71 videotaped lessons taught by 34 highly regarded secondary English language arts, history, and science teachers, collected to inform an intervention focused on evidence-based argumentation from multiple text sources. Studying the practices of highly regarded teachers is valuable for identifying…

  14. Innovations in the Analysis of Chandra-ACIS Observations

    NASA Astrophysics Data System (ADS)

    Broos, Patrick S.; Townsley, Leisa K.; Feigelson, Eric D.; Getman, Konstantin V.; Bauer, Franz E.; Garmire, Gordon P.

    2010-05-01

    As members of the instrument team for the Advanced CCD Imaging Spectrometer (ACIS) on NASA's Chandra X-ray Observatory and as Chandra General Observers, we have developed a wide variety of data analysis methods that we believe are useful to the Chandra community, and have constructed a significant body of publicly available software (the ACIS Extract package) addressing important ACIS data and science analysis tasks. This paper seeks to describe these data analysis methods for two purposes: to document the data analysis work performed in our own science projects and to help other ACIS observers judge whether these methods may be useful in their own projects (regardless of what tools and procedures they choose to implement those methods). The ACIS data analysis recommendations we offer here address much of the workflow in a typical ACIS project, including data preparation, point source detection via both wavelet decomposition and image reconstruction, masking point sources, identification of diffuse structures, event extraction for both point and diffuse sources, merging extractions from multiple observations, nonparametric broadband photometry, analysis of low-count spectra, and automation of these tasks. Many of the innovations presented here arise from several, often interwoven, complications that are found in many Chandra projects: large numbers of point sources (hundreds to several thousand), faint point sources, misaligned multiple observations of an astronomical field, point source crowding, and scientifically relevant diffuse emission.

  15. FIA: An Open Forensic Integration Architecture for Composing Digital Evidence

    NASA Astrophysics Data System (ADS)

    Raghavan, Sriram; Clark, Andrew; Mohay, George

    The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. We present the forensic integration architecture (FIA), which provides a framework for abstracting evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources, enabling an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology-independent approach. FIA is also open and extensible, making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings to the field.

  16. An analysis of the vapor flow and the heat conduction through the liquid-wick and pipe wall in a heat pipe with single or multiple heat sources

    NASA Technical Reports Server (NTRS)

    Chen, Ming-Ming; Faghri, Amir

    1990-01-01

    A numerical analysis is presented for the overall performance of heat pipes with single or multiple heat sources. The analysis includes the heat conduction in the wall and liquid-wick regions as well as the compressibility effect of the vapor inside the heat pipe. The two-dimensional elliptic governing equations in conjunction with the thermodynamic equilibrium relation and appropriate boundary conditions are solved numerically. The solutions are in agreement with existing experimental data for the vapor and wall temperatures at both low and high operating temperatures.

  17. Bayesian multiple-source localization in an uncertain ocean environment.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2011-06-01

    This paper considers simultaneous localization of multiple acoustic sources when properties of the ocean environment (water column and seabed) are poorly known. A Bayesian formulation is developed in which the environmental parameters, noise statistics, and locations and complex strengths (amplitudes and phases) of multiple sources are considered to be unknown random variables constrained by acoustic data and prior information. Two approaches are considered for estimating source parameters. Focalization maximizes the posterior probability density (PPD) over all parameters using adaptive hybrid optimization. Marginalization integrates the PPD using efficient Markov-chain Monte Carlo methods to produce joint marginal probability distributions for source ranges and depths, from which source locations are obtained. This approach also provides quantitative uncertainty analysis for all parameters, which can aid in understanding of the inverse problem and may be of practical interest (e.g., source-strength probability distributions). In both approaches, closed-form maximum-likelihood expressions for source strengths and noise variance at each frequency allow these parameters to be sampled implicitly, substantially reducing the dimensionality and difficulty of the inversion. Examples are presented of both approaches applied to single- and multi-frequency localization of multiple sources in an uncertain shallow-water environment, and a Monte Carlo performance evaluation study is carried out. © 2011 Acoustical Society of America
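    A toy, single-parameter illustration of the marginalization approach described above: random-walk Metropolis sampling of a source-range posterior under a deliberately simplified forward model (a linear travel-time relation, not the paper's ocean-acoustic propagation model); all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# toy forward model: observed delay = range / sound speed, plus noise
c, true_range, sigma = 1500.0, 4200.0, 0.05
data = true_range / c + sigma * rng.normal(size=20)

def log_posterior(r):
    if not (0.0 < r < 10_000.0):          # uniform prior on range
        return -np.inf
    return -0.5 * np.sum((data - r / c) ** 2) / sigma ** 2

# random-walk Metropolis sampling of the range posterior
samples, r = [], 5000.0
logp = log_posterior(r)
for _ in range(20_000):
    prop = r + 50.0 * rng.normal()        # symmetric proposal
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        r, logp = prop, logp_prop         # accept
    samples.append(r)

post = np.array(samples[5000:])           # drop burn-in
print(round(post.mean(), 1), round(post.std(), 1))
```

    The retained samples approximate the marginal posterior over range; its spread is the kind of quantitative uncertainty the paper extracts for source locations.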

  18. A Social Judgment Analysis of Information Source Preference Profiles: An Exploratory Study to Empirically Represent Media Selection Patterns.

    ERIC Educational Resources Information Center

    Stefl-Mabry, Joette

    2003-01-01

    Describes a study that empirically identified individual preferences profiles to understand information-seeking behavior among professional groups for six selected information sources. Highlights include Social Judgment Analysis; the development of the survey used, a copy of which is appended; hypotheses tested; results of multiple regression…

  19. Role of diversity in ICA and IVA: theory and applications

    NASA Astrophysics Data System (ADS)

    Adalı, Tülay

    2016-05-01

    Independent component analysis (ICA) has been the most popular approach for solving the blind source separation problem. Starting from a simple linear mixing model and the assumption of statistical independence, ICA can recover a set of linearly-mixed sources to within a scaling and permutation ambiguity. It has been successfully applied to numerous data analysis problems in areas as diverse as biomedicine, communications, finance, geophysics, and remote sensing. ICA can be achieved using different types of diversity (statistical properties) and can be posed to simultaneously account for multiple types of diversity such as higher-order statistics, sample dependence, non-circularity, and nonstationarity. A recent generalization of ICA, independent vector analysis (IVA), generalizes ICA to multiple data sets and adds the use of one more type of diversity, statistical dependence across the data sets, for jointly achieving independent decomposition of multiple data sets. With the addition of each new diversity type, identification of a broader class of signals becomes possible; in the case of IVA, this includes sources that are independent and identically distributed Gaussians. We review the fundamentals and properties of ICA and IVA when multiple types of diversity are taken into account, and then ask whether diversity plays an important role in practical applications as well. Examples from various domains are presented to demonstrate that in many scenarios it might be worthwhile to jointly account for multiple statistical properties. This paper is submitted in conjunction with the talk delivered for the "Unsupervised Learning and ICA Pioneer Award" at the 2016 SPIE Conference on Sensing and Analysis Technologies for Biomedical and Cognitive Applications.

  20. An algorithm for separation of mixed sparse and Gaussian sources

    PubMed Central

    Akkalkotkar, Ameya

    2017-01-01

    Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated estimations technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as a mixture of unknown composition. PMID:28414814
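    The reproducibility-by-subsampling idea at the heart of MIPReSt can be sketched in a few lines: re-estimate a component on random subsamples and score how stable its direction is. The toy below uses the leading principal component as the re-estimated component and synthetic sparse-plus-Gaussian data; MIPReSt's actual ICA/PCA machinery and ranking procedure are more involved.

```python
import numpy as np

rng = np.random.default_rng(4)

# toy mixture: 2 sparse (Laplace) sources plus 1 Gaussian source
n = 5000
S = np.vstack([rng.laplace(size=(2, n)), rng.normal(size=(1, n))])
A = np.array([[2.0, 0.5, 0.3],
              [0.4, 1.0, 0.2],
              [0.1, 0.3, 0.8]])
X = A @ S

def top_pc(X):
    """Leading principal direction of the (centered) data."""
    Xc = X - X.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, 0]

# reproducibility check: re-estimate the component on random halves
# of the data and compare directions with the full-data estimate
ref = top_pc(X)
scores = [abs(ref @ top_pc(X[:, rng.choice(n, n // 2, replace=False)]))
          for _ in range(20)]
print(round(min(scores), 3))
```

    A component that is stable across subsamples scores near 1; in MIPReSt, how such scores decay as the subsample shrinks is what reveals the nongaussian subspace dimension.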

  21. An algorithm for separation of mixed sparse and Gaussian sources.

    PubMed

    Akkalkotkar, Ameya; Brown, Kevin Scott

    2017-01-01

    Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated estimations technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as a mixture of unknown composition.

  22. Mass media in health promotion: an analysis using an extended information-processing model.

    PubMed

    Flay, B R; DiTecco, D; Schlegel, R P

    1980-01-01

    The information-processing model of the attitude and behavior change process was critically examined and extended from six to 12 levels for a better analysis of change due to mass media campaigns. Findings from social psychology and communications research, and from evaluations of mass media health promotion programs, were reviewed to determine how source, message, channel, receiver, and destination variables affect each of the levels of change of major interest (knowledge, beliefs, attitudes, intentions, and behavior). The factors found most likely to induce permanent attitude and behavior change (most important in health promotion) were: presentation and repetition over long time periods, via multiple sources, at different times (including "prime" or high-exposure times), in novel and involving ways, with appeals to multiple motives, development of social support, and provision of appropriate behavioral skills, alternatives, and reinforcement (preferably in ways that secure the active participation of the audience). Suggestions for evaluation of mass media programs that take account of this complexity were advanced.

  23. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drellack, Sig; Prothro, Lance

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty, in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  24. Defense Small Business Innovation Research Program (SBIR) Abstracts of Phase I Awards 1984.

    DTIC Science & Technology

    1985-04-16

    [Abstract excerpt garbled in OCR.] The recoverable fragments describe a Phase I award titled "Analysis and Performance Evaluation of Heat Pipes with Multiple Heat Sources" (Robert M. Shauback, 780 Eden Road, Lancaster, PA 17601): one proposed approach to protecting satellites from directed-energy weapons is the use of heat pipes within a shield structure; such heat pipes must accept heat from multiple heat sources, and there is no thorough analytical or experimental basis for their design.

  5. Data-optimized source modeling with the Backwards Liouville Test–Kinetic method

    DOE PAGES

    Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.; ...

    2017-09-14

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  6. Auditing the multiply-related concepts within the UMLS.

    PubMed

    Mougin, Fleur; Grabar, Natalia

    2014-10-01

    This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. Published by the BMJ Publishing Group Limited.

  7. Time-correlated neutron analysis of a multiplying HEU source

    NASA Astrophysics Data System (ADS)

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
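The gap-based burst identification described above can be sketched simply: sort the detection times and start a new burst whenever the gap to the previous detection exceeds a threshold. The threshold and timestamps below are illustrative, not values from the MARVEL measurements.

```python
import numpy as np

def split_into_bursts(times, max_gap):
    """Group sorted detection times into bursts: a new burst begins
    whenever the gap to the previous detection exceeds max_gap."""
    times = np.sort(np.asarray(times, dtype=float))
    breaks = np.flatnonzero(np.diff(times) > max_gap) + 1
    return np.split(times, breaks)

# Illustrative timestamps (ns): two fission-chain-like bursts.
detections = [0.0, 5.0, 12.0, 1000.0, 1004.0]
bursts = split_into_bursts(detections, max_gap=100.0)
print([len(b) for b in bursts])  # [3, 2]
```

Once detections are grouped this way, per-burst statistics such as the within-burst arrival-time distribution follow directly from each array in `bursts`.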

  8. Analysis of Ribosome Inactivating Protein (RIP): A Bioinformatics Approach

    NASA Astrophysics Data System (ADS)

    Jothi, G. Edward Gnana; Majilla, G. Sahaya Jose; Subhashini, D.; Deivasigamani, B.

    2012-10-01

    In spite of the medical advances in recent years, the world is in need of different sources to encounter certain health issues. Ribosome Inactivating Proteins (RIPs) were found to be one among them. In order to provide easy access to information about RIPs, there is a need to analyse RIPs towards constructing a database on RIPs. Also, multiple sequence alignment was performed to screen for homologues of significant RIPs from rare sources against RIPs from easily available sources in terms of similarity. Protein sequences were retrieved from SWISS-PROT and further analysed using pairwise and multiple sequence alignment. Analysis shows that 151 RIPs have been characterized to date. Amongst them, there are 87 type I, 37 type II, 1 type III and 25 unknown RIPs. The sequence length information of various RIPs, indicating the availability of full or partial sequences, was also recorded. The multiple sequence alignment of 37 type I RIPs using the online server Multalin indicates the presence of 20 conserved residues. Pairwise alignment and multiple sequence alignment of certain selected RIPs in two groups, namely Group I and Group II, were carried out, and the consensus levels were found to be 98%, 98% and 90%, respectively.

  9. Combining data from multiple sources using the CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ames, D. P.; Horsburgh, J. S.; Goodall, J. L.

    2012-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has developed a Hydrologic Information System (HIS) to provide better access to data by enabling the publication, cataloging, discovery, retrieval, and analysis of hydrologic data using web services. The CUAHSI HIS is an Internet-based system comprising hydrologic databases and servers connected through web services, as well as software for data publication, discovery and access. The HIS metadata catalog lists close to 100 web services registered to provide data through this system, ranging from large federal agency data sets to experimental watersheds managed by University investigators. The system's flexibility in storing and enabling public access to similarly formatted data and metadata has created a community data resource from governmental and academic data that might otherwise remain private or be analyzed only in isolation. Comprehensive understanding of hydrology requires integration of this information from multiple sources. HydroDesktop is the client application developed as part of HIS to support data discovery and access through this system. HydroDesktop is founded on an open source GIS client and has a plug-in architecture that has enabled the integration of modeling and analysis capability with the functionality for data discovery and access. Model integration is possible through a plug-in built on the OpenMI standard, and data visualization and analysis are supported by an R plug-in. This presentation will demonstrate HydroDesktop, showing how it provides an analysis environment within which data from multiple sources can be discovered, accessed and integrated.

  10. Analysis of the load selection on the error of source characteristics identification for an engine exhaust system

    NASA Astrophysics Data System (ADS)

    Zheng, Sifa; Liu, Haitao; Dan, Jiabi; Lian, Xiaomin

    2015-05-01

    The linear time-invariant assumption for determining acoustic source characteristics (the source strength and the source impedance) in the frequency domain has been proved reasonable in the design of an exhaust system. Different methods have been proposed for its identification, and the multi-load method is widely used for its convenience in varying the load number and impedance. Theoretical error analysis has rarely been addressed, although previous results have shown that an overdetermined set of open pipes can reduce the identification error. This paper contributes a theoretical error analysis for the load selection. The relationships between the error in the identification of source characteristics and the load selection were analysed. A general linear time-invariant model was built based on the four-load method. To analyse the error of the source impedance, an error estimation function was proposed. The dispersion of the source pressure was obtained by an inverse calculation as an indicator of the accuracy of the results. It was found that for a certain load length, the load resistance at frequency points of odd multiples of one-quarter wavelength produces peaks and the maximum error in source impedance identification. Therefore, load impedances in the frequency ranges around these odd quarter-wavelength multiples should not be used for source impedance identification. If the selected loads have more similar resistance values (i.e., the same order of magnitude), the identification error of the source impedance can be effectively reduced.
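The quarter-wavelength condition above is easy to tabulate: for an open-pipe load of length L, the problem frequencies are f_k = (2k - 1) c / (4 L). A minimal sketch, assuming sound in air at room temperature:

```python
C_AIR = 343.0  # speed of sound in air at ~20 C, m/s

def quarter_wave_frequencies(length_m, f_max_hz, c=C_AIR):
    """Frequencies below f_max_hz at which the load length is an odd
    multiple of a quarter wavelength: f_k = (2k - 1) * c / (4 * L)."""
    freqs = []
    k = 1
    while (2 * k - 1) * c / (4.0 * length_m) <= f_max_hz:
        freqs.append((2 * k - 1) * c / (4.0 * length_m))
        k += 1
    return freqs

# A 0.5 m load: frequency bands to avoid below 1 kHz.
print(quarter_wave_frequencies(0.5, 1000.0))  # [171.5, 514.5, 857.5]
```

Per the paper's finding, loads should be chosen (or frequency ranges excluded) so that the identification does not rely on impedances near these frequencies.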

  11. Skills Conversion Project: Chapter 19, Profile of the Technological Manpower Pool. Final Report.

    ERIC Educational Resources Information Center

    National Society of Professional Engineers, Washington, DC.

    In the absence of any one central source of data covering the profile of the unemployed aerospace and defense technical professional, an extensive analysis of multiple data sources was conducted for the U.S. Department of Labor by the National Society for Professional Engineers. The survey and analysis included data covering approximately 63,000…

  12. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.

  13. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
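At its core, a single flux balance analysis is a linear program: maximize an objective flux subject to the steady-state constraint S v = 0 and flux bounds. A toy sketch of that linear program (not the DistributedFBA.jl API, which is in Julia; this uses SciPy on a hypothetical three-reaction network):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake v0 feeds metabolite A; A drains via a leak v1
# (capped at 2) and a "biomass" flux v2. Steady state: v0 - v1 - v2 = 0.
S = np.array([[1.0, -1.0, -1.0]])      # stoichiometry, one metabolite
bounds = [(0, 10), (0, 2), (0, None)]  # flux bounds for v0, v1, v2

# linprog minimizes, so negate the biomass objective to maximize v2.
res = linprog(c=[0.0, 0.0, -1.0], A_eq=S, b_eq=[0.0],
              bounds=bounds, method="highs")
print(res.x)  # optimal flux vector; all uptake goes to biomass
```

Tools like DistributedFBA.jl scale this same linear program to thousands of reactions and distribute many such solves across threads or nodes.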

  14. Multiwavelength counterparts of the point sources in the Chandra Source Catalog

    NASA Astrophysics Data System (ADS)

    Reynolds, Michael; Civano, Francesca Maria; Fabbiano, Giuseppina; D'Abrusco, Raffaele

    2018-01-01

    The most recent release of the Chandra Source Catalog (CSC), version 2.0, comprises more than 350,000 point sources, down to fluxes of ~10^-16 erg/cm^2/s, covering ~500 deg^2 of the sky, making it one of the best available X-ray catalogs to date. There are many reasons to seek multiwavelength counterparts for these sources; one is that X-ray information alone is not enough to identify the sources and separate them into galactic and extragalactic origin, so multiwavelength data associated with each X-ray source are crucial for classification and scientific analysis of the sample. To perform this multiwavelength association, we employ the recently released versatile tool NWAY (Salvato et al. 2017), based on a Bayesian algorithm for cross-matching multiple catalogs. NWAY allows the combination of multiple catalogs at the same time, provides a probability for the matches, even in the case of non-detection due to the different depths of the matching catalogs, and can be used with priors on the nature of the sources (e.g. colors, magnitudes, etc.). In this poster, we present a preliminary analysis using the CSC sources above the galactic plane matched to the WISE All-Sky catalog, SDSS, Pan-STARRS and GALEX.

  15. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    PubMed

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and combine the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramer-Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.

  16. Estimating the mass variance in neutron multiplicity counting-A comparison of approaches

    NASA Astrophysics Data System (ADS)

    Dubi, C.; Croft, S.; Favalli, A.; Ocherashvili, A.; Pedersen, B.

    2017-12-01

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
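Of the three methods compared, the bootstrap is the most direct to sketch: resample the per-trigger neutron count list with replacement, recompute the first three sampled factorial moments each time, and take the spread of the replicates as the statistical uncertainty. The counts below are synthetic, not PSMC data:

```python
import numpy as np

def factorial_moments(counts):
    """First three sampled factorial moments of a count distribution."""
    n = np.asarray(counts, dtype=float)
    return np.array([np.mean(n),
                     np.mean(n * (n - 1.0)),
                     np.mean(n * (n - 1.0) * (n - 2.0))])

rng = np.random.default_rng(1)
counts = rng.poisson(2.0, size=5000)   # synthetic per-trigger counts

# Bootstrap: resample with replacement, recompute the moments each time.
boot = np.array([factorial_moments(rng.choice(counts, size=counts.size))
                 for _ in range(200)])
print(boot.mean(axis=0))  # moment estimates
print(boot.std(axis=0))   # their bootstrap standard errors
```

In the actual analysis chain, the moment uncertainties would then be propagated through the mass calibration, which the paper compares against variance propagation and cycle-data statistics.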

  17. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubi, C.; Croft, S.; Favalli, A.

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  18. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE PAGES

    Dubi, C.; Croft, S.; Favalli, A.; ...

    2017-09-14

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  19. A boundary element approach to optimization of active noise control sources on three-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cunefare, K. A.; Koopmann, G. H.

    1991-01-01

    This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total radiated power from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation; thus, the optimization analysis applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
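The underlying optimization in approaches of this kind is a Hermitian quadratic form: with complex control-source strengths q, the total radiated power can be written W(q) = q^H A q + 2 Re(b^H q) + c, minimized by q* = -A^{-1} b when A is positive definite. A numerical sketch with synthetic stand-in matrices (not a boundary element model):

```python
import numpy as np

# Synthetic Hermitian positive definite A and coupling vector b standing
# in for quantities a boundary element formulation would supply.
rng = np.random.default_rng(2)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
A = M @ M.conj().T + np.eye(2)
b = rng.normal(size=2) + 1j * rng.normal(size=2)
c = 5.0

def radiated_power(q):
    """W(q) = q^H A q + 2 Re(b^H q) + c (real-valued for Hermitian A)."""
    return float(np.real(q.conj() @ A @ q) + 2.0 * np.real(b.conj() @ q) + c)

q_opt = -np.linalg.solve(A, b)  # stationary point: A q + b = 0

# The optimum beats both "sources off" and a perturbed drive.
print(radiated_power(q_opt), radiated_power(np.zeros(2)))
```

The optimal complex amplitudes `q_opt` encode exactly the "optimum magnitude and phase" the abstract refers to, one solve per frequency.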

  20. Spatiotemporal source tuning filter bank for multiclass EEG based brain computer interfaces.

    PubMed

    Acharya, Soumyadipta; Mollazadeh, Moshen; Murari, Kartikeya; Thakor, Nitish

    2006-01-01

    Noninvasive brain-computer interfaces (BCI) allow people to communicate by modulating features of their electroencephalogram (EEG). Spatiotemporal filtering has a vital role in multi-class, EEG based BCI. In this study, we used a novel combination of principal component analysis, independent component analysis and dipole source localization to design a spatiotemporal multiple source tuning (SPAMSORT) filter bank, each channel of which was tuned to the activity of an underlying dipole source. Changes in the event-related spectral perturbation (ERSP) were measured and used to train a linear support vector machine to classify between four classes of motor imagery tasks (left hand, right hand, foot and tongue) for one subject. ERSP values were significantly (p<0.01) different across tasks and better (p<0.01) than conventional spatial filtering methods (large Laplacian and common average reference). Classification resulted in an average accuracy of 82.5%. This approach could lead to promising BCI applications such as control of a prosthesis with multiple degrees of freedom.
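The PCA-then-ICA stage of such a filter bank can be sketched with standard tools (the dipole source localization step is omitted here, and the signals are synthetic, not EEG):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Two synthetic "neural" sources mixed into 16 "electrode" channels.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0, 2000)
sources = np.c_[np.sin(7.0 * t), np.sign(np.sin(3.0 * t))]
mixing = rng.normal(size=(16, 2))
eeg = sources @ mixing.T + 0.05 * rng.normal(size=(2000, 16))

# PCA reduces the channel dimensionality; FastICA then unmixes the
# retained components into statistically independent activations.
reduced = PCA(n_components=2).fit_transform(eeg)
unmixed = FastICA(random_state=0).fit_transform(reduced)
print(unmixed.shape)  # (2000, 2)
```

Each column of `unmixed` plays the role of one filter-bank channel; spectral features (such as ERSP) would then be computed per channel before classification.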

  1. Deblending of simultaneous-source data using iterative seislet frame thresholding based on a robust slope estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Yatong; Han, Chunying; Chi, Yue

    2018-06-01

    In a simultaneous source survey, no limitation is imposed on the shot scheduling of nearby sources, so a large gain in acquisition efficiency can be obtained, but at the cost of recorded seismic data contaminated by strong blending interference. In this paper, we propose a multi-dip seislet frame based sparse inversion algorithm to iteratively separate simultaneous sources. We overcome two inherent drawbacks of the traditional seislet transform. For the multi-dip problem, we propose to apply a multi-dip seislet frame thresholding strategy instead of the traditional seislet transform for deblending simultaneous-source data that contain multiple dips, e.g., multiple reflections. The multi-dip seislet frame strategy solves the conflicting-dip problem that degrades the performance of the traditional seislet transform. For the noise issue, we propose a robust dip estimation algorithm that is based on velocity-slope transformation. Instead of calculating the local slope directly using the plane-wave destruction (PWD) based method, we first apply NMO-based velocity analysis and obtain NMO velocities for multi-dip components that correspond to multiples of different orders; a fairly accurate slope estimation can then be obtained using the velocity-slope conversion equation. An iterative deblending framework is given and validated through a comprehensive analysis of both numerical synthetic and field data examples.
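The velocity-to-slope conversion can be sketched under the hyperbolic moveout assumption t(x) = sqrt(t0^2 + (x/v)^2), which gives the local slope dt/dx = x / (v^2 t); the exact conversion equation used in the paper may differ, so treat this as an illustrative form:

```python
import numpy as np

def nmo_slope(x, t0, v_nmo):
    """Local slope dt/dx of the hyperbolic moveout curve
    t(x) = sqrt(t0^2 + (x / v_nmo)^2)."""
    t = np.sqrt(t0**2 + (x / v_nmo) ** 2)
    return x / (v_nmo**2 * t)

offsets = np.array([0.0, 500.0, 1000.0])           # m
slopes = nmo_slope(offsets, t0=1.0, v_nmo=2000.0)  # s/m
print(slopes)  # zero at zero offset, increasing with offset
```

Picking NMO velocities per multiple order and converting them this way yields a slope field without running a plane-wave-destruction estimation on the noisy blended data.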

  2. Molar Functional Relations and Clinical Behavior Analysis: Implications for Assessment and Treatment

    ERIC Educational Resources Information Center

    Waltz, Thomas J.; Follette, William C.

    2009-01-01

    The experimental analysis of behavior has identified several molar functional relations that are highly relevant to clinical behavior analysis. These include matching, discounting, momentum, and variability. Matching provides a broader analysis of how multiple sources of reinforcement influence how individuals choose to allocate their time and…

  3. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    PubMed

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order of monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At different time scales, land use types (such as farmland and forest) were always the dominant factor affecting the spatial distribution of nitrogen loss, while the effect of precipitation and runoff on nitrogen loss was evident only in months without fertilization and in several storm flood processes occurring on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Gary L.; Dubinsky, Eric A.

    Herein are described 1058 different bacterial taxa that were unique to either human, grazing mammal, or bird fecal wastes. These identified taxa can serve as specific identifier taxa for these sources in environmental waters. Two field tests in marine waters demonstrate the capacity of phylogenetic microarray analysis to track multiple sources with one test.

  5. Reduction and Analysis of GALFACTS Data in Search of Compact Variable Sources

    NASA Astrophysics Data System (ADS)

    Wenger, Trey; Barenfeld, S.; Ghosh, T.; Salter, C.

    2012-01-01

    The Galactic ALFA Continuum Transit Survey (GALFACTS) is a full-Stokes survey of the entire Arecibo-visible sky from 1225 to 1525 MHz using the multibeam Arecibo L-band Feed Array (ALFA). Using data from survey field N1, the first field covered by GALFACTS, we are searching for compact sources that vary in intensity and/or polarization. The multistep procedure for reducing the data includes radio frequency interference (RFI) removal, source detection, Gaussian fitting in multiple dimensions, polarization leakage calibration, and gain calibration. We have developed code to analyze and calculate the calibration parameters from the N1 calibration sources, and to apply these to the data of the main run. For detected compact sources, our goal is to compare results from multiple passes over a source to search for rapid variability, as well as to compare our flux densities with those from the NRAO VLA Sky Survey (NVSS) to search for longer time-scale variations.

  6. Start-up Characteristics of Swallow-tailed Axial-grooved Heat Pipe under the conditions of Multiple Heat Sources

    NASA Astrophysics Data System (ADS)

    Zhang, Renping

    2017-12-01

    A mathematical model was developed for predicting the start-up characteristics of a swallow-tailed axial-grooved heat pipe under the conditions of multiple heat sources. The effects of the heat capacitance of the heat source, liquid-vapour interfacial evaporation-condensation heat transfer, and shear stress at the interface were considered in the current model. The interfacial evaporating mass flow rate is based on kinetic analysis. Time variations of the evaporating mass rate, wall temperature and liquid velocity are studied from start-up to steady state. The calculated results show that the wall temperature exhibits a step transition at the junction between heated and unheated regions of the evaporator. The liquid velocity changes drastically in the heated part of the evaporator section, but varies only slightly in the part of the evaporator without a heat source. When the heat capacitance of the heat source is ignored, the numerical temperature demonstrates a quicker response. With the heat capacitance of the heat source considered, the data obtained from the proposed model agree well with the experimental results.

  7. Social inequalities in health information seeking among young adults in Montreal.

    PubMed

    Gagné, Thierry; Ghenadenik, Adrian E; Abel, Thomas; Frohlich, Katherine L

    2018-06-01

    Over their lifecourse, young adults develop different skills and preferences in relation to the information sources they seek when they have questions about health. Health information seeking behaviour (HISB) includes multiple, unequally accessed sources; yet most studies have focused on single sources and have not examined HISB's association with social inequalities. This study explores 'multiple-source' profiles and their association with socioeconomic characteristics. We analyzed cross-sectional data from the Interdisciplinary Study of Inequalities in Smoking involving 2093 young adults recruited in Montreal, Canada, in 2011-2012. We used latent class analysis to create profiles based on responses to questions regarding whether participants sought health professionals, family, friends or the Internet when they had questions about health. Using multinomial logistic regression, we examined the associations between profiles and economic, social and cultural capital indicators: financial difficulties and transportation means, friend satisfaction and network size, and individual, mother's, and father's education. Five profiles were found: 'all sources' (42%), 'health professional centred' (29%), 'family only' (14%), 'Internet centred' (14%) and 'no sources' (2%). Participants with a larger social network and higher friend satisfaction were more likely to be in the 'all sources' group. Participants who experienced financial difficulties and completed college/university were less likely to be in the 'family only' group; those whose mother had completed college/university were more likely to be in this group. Our findings point to the importance of considering multiple sources when studying HISB, especially when the capacity to seek multiple sources is unequally distributed. Scholars should acknowledge HISB's implications for health inequalities.

  8. A Versatile Integrated Ambient Ionization Source Platform.

    PubMed

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-30

    The pursuit of high-throughput sample analysis from complex matrix demands development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to the mass spectrometry imaging and the surface-assisted laser desorption (SALDI) under ambient condition. Compared with other individual or multi-mode ion source, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of the analyte detection, and facilitates the analysis of complex samples.

  9. A Versatile Integrated Ambient Ionization Source Platform

    NASA Astrophysics Data System (ADS)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
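The factorial-moment bookkeeping this record refers to can be sketched in a few lines: from a multiplicity histogram (how many trigger cycles registered n detected neutrons), compute the first three sampled factorial moments. The histogram below is invented for illustration, not measurement data.

```python
def factorial_moments(counts):
    """First three sampled factorial moments of a multiplicity histogram.

    counts[n] = number of trigger cycles in which n neutrons were detected.
    The k-th factorial moment is the sample mean of n(n-1)...(n-k+1).
    """
    total = sum(counts)
    moments = []
    for k in (1, 2, 3):
        s = 0.0
        for n, c in enumerate(counts):
            prod = 1
            for j in range(k):
                prod *= (n - j)   # falling factorial n(n-1)...(n-k+1)
            s += prod * c
        moments.append(s / total)
    return moments

# Hypothetical histogram: 50 cycles with 0 counts, 30 with 1, 15 with 2, 5 with 3
m1, m2, m3 = factorial_moments([50, 30, 15, 5])  # → 0.75, 0.6, 0.3
```

In the assay itself these three moments are inverted (via point-model equations not shown here) to obtain the effective mass, (α,n) ratio and multiplication.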

  11. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows for model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used; three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. 
This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
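The virtual-mixture test described above can be illustrated with a minimal sketch: invent mean tracer signatures for each source group, mix them in known proportions, and check whether an unmixing model recovers those proportions. All numbers are hypothetical, and plain unconstrained least squares stands in for the constrained mixing models SIFT actually applies.

```python
import numpy as np

# Hypothetical tracer signatures (rows = tracers, columns = source groups)
sources = np.array([
    [12.0, 3.0, 7.0],   # tracer A concentration in each source group
    [ 1.5, 9.0, 4.0],   # tracer B
    [ 0.2, 0.8, 5.0],   # tracer C
])

true_props = np.array([0.5, 0.3, 0.2])   # known mixing proportions
mixture = sources @ true_props           # composition of the virtual mixture

# Unmix: solve sources @ p = mixture in the least-squares sense
p, *_ = np.linalg.lstsq(sources, mixture, rcond=None)
```

A model configuration whose recovered `p` deviates strongly from `true_props` across many virtual mixtures would, in the spirit of the paper, be rejected as unreliable.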

  12. Assessment of port-related air quality impacts: geographic analysis of population

    EPA Science Inventory

    Increased global trade has led to greater transportation by rail, road and ships to move cargo. Based upon multiple near-road and near-source monitoring studies, the busy roadways and large emission sources at ports may impact local air quality within several hundred metres of th...

  13. LOGWAR 15: Analysis Report

    DTIC Science & Technology

    2016-04-01

    Sanitation, and Hygiene WFP World Food Programme WHO World Health Organization Unclassified Unclassified xii This page intentionally left blank...Insurgency Natural Disaster Contamination Visibility Disposition Distribution Sourcing Prioritization Security Financial U.S. Military Services Combatant...supply; restrictions on sourcing; contamination concerns (IV solutions) Small in size; multiple variants with limited interchangeability; requires

  14. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
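A minimal sketch of the logical combination rules ("AND", "OR", "at least n") behind the multiV-ROC idea, using invented binary decisions from two indices. It illustrates the usual trade-off: "AND" tightens specificity, "OR" tightens sensitivity.

```python
def combine(decisions, rule, n=None):
    """Combine per-index binary decisions (list of 0/1) into one call."""
    positives = sum(decisions)
    if rule == "AND":
        return int(positives == len(decisions))
    if rule == "OR":
        return int(positives >= 1)
    if rule == "at_least_n":
        return int(positives >= n)
    raise ValueError(rule)

def sens_spec(calls, labels):
    """Sensitivity and specificity of combined calls against true labels."""
    tp = sum(c and l for c, l in zip(calls, labels))
    tn = sum((not c) and (not l) for c, l in zip(calls, labels))
    p = sum(labels)
    return tp / p, tn / (len(labels) - p)

# Hypothetical thresholded decisions from two indices, plus true labels
idx1   = [1, 1, 0, 1, 0, 0]
idx2   = [1, 0, 1, 1, 0, 1]
labels = [1, 1, 1, 1, 0, 0]

and_calls = [combine([a, b], "AND") for a, b in zip(idx1, idx2)]
or_calls  = [combine([a, b], "OR")  for a, b in zip(idx1, idx2)]
```

On this toy data the "AND" rule yields sensitivity 0.5 at specificity 1.0, while "OR" yields sensitivity 1.0 at specificity 0.5; sweeping the underlying thresholds traces out the multiV-ROC.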

  15. The Receiver Operational Characteristic for Binary Classification with Multiple Indices and Its Application to the Neuroimaging Study of Alzheimer’s Disease

    PubMed Central

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2014-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis. PMID:23702553

  16. Time-dependent clustering analysis of the second BATSE gamma-ray burst catalog

    NASA Technical Reports Server (NTRS)

    Brainerd, J. J.; Meegan, C. A.; Briggs, Michael S.; Pendleton, G. N.; Brock, M. N.

    1995-01-01

    A time-dependent two-point correlation-function analysis of the Burst and Transient Source Experiment (BATSE) 2B catalog finds no evidence of burst repetition. As part of this analysis, we discuss the effects of sky exposure on the observability of burst repetition and present the equation describing the signature of burst repetition in the data. For a model of all burst repetition from a source occurring in less than five days we derive upper limits on the number of bursts in the catalog from repeaters and model-dependent upper limits on the fraction of burst sources that produce multiple outbursts.

  17. VAUD: A Visual Analysis Approach for Exploring Spatio-Temporal Urban Data.

    PubMed

    Chen, Wei; Huang, Zhaosong; Wu, Feiran; Zhu, Minfeng; Guan, Huihua; Maciejewski, Ross

    2017-10-02

    Urban data is massive, heterogeneous, and spatio-temporal, posing a substantial challenge for visualization and analysis. In this paper, we design and implement a novel visual analytics approach, Visual Analyzer for Urban Data (VAUD), that supports the visualization, querying, and exploration of urban data. Our approach allows for cross-domain correlation from multiple data sources by leveraging spatial-temporal and social inter-connectedness features. Through our approach, the analyst is able to select, filter, and aggregate data across multiple data sources and extract information that would be hidden to any single data subset. To illustrate the effectiveness of our approach, we provide case studies on a real urban dataset that contains the cyber-, physical-, and social information of 14 million citizens over 22 days.

  18. Prevalence of Multiply Controlled Problem Behavior

    ERIC Educational Resources Information Center

    Beavers, Gracie A.; Iwata, Brian A.

    2011-01-01

    We examined articles in the "Journal of Applied Behavior Analysis" in which results of functional analyses indicated that problem behavior was maintained by multiple sources of reinforcement. Data for 88 (16.9%) of 521 subjects reported in 168 studies met the criteria for multiple control. Data for 11 subjects (2.1%) involved a single response…

  19. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  20. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…

  1. Multiple window spatial registration error of a gamma camera: 133Ba point source as a replacement of the NEMA procedure.

    PubMed

    Bergmann, Helmar; Minear, Gregory; Raith, Maria; Schaffarich, Peter M

    2008-12-09

    The accuracy of multiple window spatial registration characterises the performance of a gamma camera for dual isotope imaging. In the present study we investigate an alternative to the standard NEMA procedure for measuring this performance parameter. A long-lived 133Ba point source with gamma energies close to those of 67Ga and a single-bore lead collimator were used to measure the multiple window spatial registration error. The positions of the point source in the images were calculated using the NEMA algorithm. The results were validated against the values obtained by the standard NEMA procedure, which uses a collimated liquid 67Ga source. Of the source-collimator configurations under investigation, an optimum collimator geometry, consisting of a 5 mm thick lead disk with a diameter of 46 mm and a 5 mm central bore, was selected. The multiple window spatial registration errors obtained by the 133Ba method showed excellent reproducibility (standard deviation < 0.07 mm). The values were compared with the results from the NEMA procedure obtained at the same locations and showed small differences, with a correlation coefficient of 0.51 (p < 0.05). In addition, the 133Ba point source method proved to be much easier to use. A Bland-Altman analysis showed that the 133Ba and 67Ga methods can be used interchangeably. The 133Ba point source method measures the multiple window spatial registration error with essentially the same accuracy as the NEMA-recommended procedure, but is easier and safer to use and has the potential to replace the current standard procedure.
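A Bland-Altman analysis of the kind reported here reduces to the mean of the paired differences (the bias) and its 95% limits of agreement. The paired registration errors below are invented for illustration; they are not the paper's measurements.

```python
from statistics import mean, stdev

# Hypothetical paired registration errors (mm) at the same detector locations
ba_133 = [0.42, 0.55, 0.31, 0.60, 0.48, 0.37]   # 133Ba point-source method
ga_67  = [0.45, 0.50, 0.35, 0.66, 0.44, 0.40]   # NEMA 67Ga method

diffs = [a - b for a, b in zip(ba_133, ga_67)]
bias = mean(diffs)                   # systematic offset between the methods
half_width = 1.96 * stdev(diffs)     # 95% limits of agreement half-width
lower, upper = bias - half_width, bias + half_width
```

Two methods are judged interchangeable when the bias is negligible and the limits of agreement are narrower than any clinically relevant difference.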

  2. Extent of Fecal Contamination of Household Drinking Water in Nepal: Further Analysis of Nepal Multiple Indicator Cluster Survey 2014.

    PubMed

    Kandel, Pragya; Kunwar, Ritu; Lamichhane, Prabhat; Karki, Surendra

    2017-02-08

    Water sources classified as "improved" may not necessarily provide safe drinking water for householders. We analyzed data from Nepal Multiple Indicator Cluster Survey 2014 to explore the extent of fecal contamination of household drinking water. Fecal contamination was detected in 81.2% (95% confidence interval [CI]: 77.9-84.2) of household drinking water samples from improved sources and 89.6% (95% CI: 80.4-94.7) of water samples from unimproved sources. In adjusted analysis, there was no difference in odds of fecal contamination of household drinking water between improved and unimproved sources. We observed significantly lower odds of fecal contamination of drinking water in households in higher wealth quintiles, where soap and water were available for handwashing and in households employing water treatment. The extent of contamination of drinking water observed in this study highlights the substantial effort required to ensure the provision of safely managed water in Nepal by 2030, as targeted in the Sustainable Development Goals. © The American Society of Tropical Medicine and Hygiene.

  3. Extent of Fecal Contamination of Household Drinking Water in Nepal: Further Analysis of Nepal Multiple Indicator Cluster Survey 2014

    PubMed Central

    Kandel, Pragya; Kunwar, Ritu; Lamichhane, Prabhat; Karki, Surendra

    2017-01-01

    Water sources classified as “improved” may not necessarily provide safe drinking water for householders. We analyzed data from Nepal Multiple Indicator Cluster Survey 2014 to explore the extent of fecal contamination of household drinking water. Fecal contamination was detected in 81.2% (95% confidence interval [CI]: 77.9–84.2) of household drinking water samples from improved sources and 89.6% (95% CI: 80.4–94.7) of water samples from unimproved sources. In adjusted analysis, there was no difference in odds of fecal contamination of household drinking water between improved and unimproved sources. We observed significantly lower odds of fecal contamination of drinking water in households in higher wealth quintiles, where soap and water were available for handwashing and in households employing water treatment. The extent of contamination of drinking water observed in this study highlights the substantial effort required to ensure the provision of safely managed water in Nepal by 2030, as targeted in the Sustainable Development Goals. PMID:27821687

  4. Visualization-based analysis of multiple response survey data

    NASA Astrophysics Data System (ADS)

    Timofeeva, Anastasiia

    2017-11-01

    In surveys, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because multiple response variables must be processed. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested: it is based on partitioning the consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
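The aggregate indicators proposed in this record (a determinant of a similarity matrix, a maximum eigenvalue of an association matrix) can be sketched as follows; the matrix entries are hypothetical pairwise audience overlaps, not data from the study.

```python
import numpy as np

# Hypothetical overlap of three advertising sources: entry (i, j) is a
# similarity (e.g. Jaccard index) between the audiences of sources i and j.
S = np.array([
    [1.00, 0.40, 0.10],
    [0.40, 1.00, 0.25],
    [0.10, 0.25, 1.00],
])

# Determinant near 1 => nearly disjoint audiences; near 0 => heavy overlap.
det_S = np.linalg.det(S)

# Largest eigenvalue of a symmetric association matrix grows with overlap;
# it equals 1 only when the sources reach completely disjoint audiences.
lam_max = np.linalg.eigvalsh(S).max()
```

Either scalar condenses the whole overlap structure into one number that can be tracked across survey waves or compared between target groups.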

  5. A control system for a powered prosthesis using positional and myoelectric inputs from the shoulder complex.

    PubMed

    Losier, Y; Englehart, K; Hudgins, B

    2007-01-01

    The integration of multiple input sources within a control strategy for powered upper limb prostheses could provide smoother, more intuitive multi-joint reaching movements based on the user's intended motion. This paper presents the results of using myoelectric signals (MES) from the shoulder area in combination with the position of the shoulder as input sources to multiple linear discriminant analysis classifiers. Such an approach may provide users with control signals capable of controlling three degrees of freedom (DOF). This work is another important step in the development of hybrid systems that will enable simultaneous control of multiple degrees of freedom used for reaching tasks in a prosthetic limb.

  6. Relevance of the hadronic interaction model in the interpretation of multiple muon data as detected with the MACRO experiment

    NASA Astrophysics Data System (ADS)

    Ambrosio, M.; Antolini, R.; Aramo, C.; Auriemma, G.; Baldini, A.; Barbarino, G. C.; Barish, B. C.; Battistoni, G.; Bellotti, R.; Bemporad, C.; Bernardini, P.; Bilokon, H.; Bisi, V.; Bloise, C.; Bower, C.; Bussino, S.; Cafagna, F.; Calicchio, M.; Campana, D.; Carboni, M.; Castellano, M.; Cecchini, S.; Cei, F.; Chiarella, V.; Coutu, S.; de Benedictis, L.; de Cataldo, G.; Dekhissi, H.; de Marzo, C.; de Mitri, I.; de Vincenzi, M.; di Credico, A.; Erriquez, O.; Favuzzi, C.; Forti, C.; Fusco, P.; Giacomelli, G.; Giannini, G.; Giglietto, N.; Grassi, M.; Gray, L.; Grillo, A.; Guarino, F.; Guarnaccia, P.; Gustavino, C.; Habig, A.; Hanson, K.; Hawthorne, A.; Heinz, R.; Iarocci, E.; Katsavounidis, E.; Kearns, E.; Kyriazopoulou, S.; Lamanna, E.; Lane, C.; Levin, D. S.; Lipari, P.; Longley, N. P.; Longo, M. J.; Maaroufi, F.; Mancarella, G.; Mandrioli, G.; Manzoor, S.; Margiotta Neri, A.; Marini, A.; Martello, D.; Marzari-Chiesa, A.; Mazziotta, M. N.; Mazzotta, C.; Michael, D. G.; Mikheyev, S.; Miller, L.; Monacelli, P.; Montaruli, T.; Monteno, M.; Mufson, S.; Musser, J.; Nicoló, D.; Nolty, R.; Okada, C.; Orth, C.; Osteria, G.; Palamara, O.; Patera, V.; Patrizii, L.; Pazzi, R.; Peck, C. W.; Petrera, S.; Pistilli, P.; Popa, V.; Rainó, A.; Rastelli, A.; Reynoldson, J.; Ronga, F.; Rubizzo, U.; Sanzgiri, A.; Satriano, C.; Satta, L.; Scapparone, E.; Scholberg, K.; Sciubba, A.; Serra-Lugaresi, P.; Severi, M.; Sioli, M.; Sitta, M.; Spinelli, P.; Spinetti, M.; Spurio, M.; Steinberg, R.; Stone, J. L.; Sulak, L. R.; Surdo, A.; Tarlé, G.; Togo, V.; Walter, C. W.; Webb, R.

    1999-03-01

    With the aim of discussing the effect of the possible sources of systematic uncertainties in simulation models, the analysis of multiple muon events from the MACRO experiment at Gran Sasso is reviewed. In particular, the predictions from different currently available hadronic interaction models are compared.

  7. Linking Teacher Education to Redesigned Systems of Accountability: A Call for Multiple Measures in Pre-Service Teacher Effectiveness

    ERIC Educational Resources Information Center

    Farley, Amy N.; Clayton, Grant; Kaka, Sarah J.

    2018-01-01

    In this written commentary for the special issue of "Education Policy Analysis Archives" focused on "Redesigning Assessment and Accountability," we call for teacher preparation to embrace a multiple measures philosophy by providing teacher candidates with rich opportunities to engage with data from a variety of sources, beyond…

  8. Sources and Nature of Cost Analysis Data Base Reference Manual.

    DTIC Science & Technology

    1983-07-01

    Interim Report (Update) Reference Manual, USAAVRADCOM TM 83-F-3. Section headings recoverable from the indexed snippet: Section 6, Data for Multiple Applications; Section 7, Glossary of Cost Analysis Terms; Section 8, References; Section 9, Bibliography.

  9. Source-space ICA for MEG source imaging.

    PubMed

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) to the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer high spatial resolution. However, sensor-space ICA + beamformer is not an ideal combination for obtaining both the high spatial resolution of the beamformer and the capacity to handle multiple concurrent sources. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA in both simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG recordings from two healthy subjects viewing visual stimuli were also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of the minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  10. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  11. Integrating data from multiple sources for data completeness in a web-based registry for pediatric renal transplantation--the CERTAIN Registry.

    PubMed

    Köster, Lennart; Krupka, Kai; Höcker, Britta; Rahmel, Axel; Samuel, Undine; Zanen, Wouter; Opelz, Gerhard; Süsal, Caner; Döhler, Bernd; Plotnicki, Lukasz; Kohl, Christian D; Knaup, Petra; Tönshoff, Burkhard

    2015-01-01

    Patient registries are a useful tool to measure outcomes and compare the effectiveness of therapies in a specific patient population. High data quality and completeness are therefore advantageous for registry analysis. Data integration from multiple sources may increase completeness of the data. The pediatric renal transplantation registry CERTAIN identified Eurotransplant (ET) and the Collaborative Transplant Study (CTS) as possible partners for data exchange. Import and export interfaces with CTS and ET were implemented. All parties reached their projected goals and benefit from the exchange.

  12. Qualitative case study data analysis: an example from practice.

    PubMed

    Houghton, Catherine; Murphy, Kathy; Shaw, David; Casey, Dympna

    2015-05-01

    The aim of this paper is to illustrate an approach to data analysis in qualitative case study methodology. There is often little detail in case study research about how data were analysed. However, it is important that comprehensive analysis procedures are used because there are often large sets of data from multiple sources of evidence. Furthermore, the ability to describe in detail how the analysis was conducted ensures rigour in reporting qualitative research. The research example used is a multiple case study that explored the role of the clinical skills laboratory in preparing students for the real world of practice. Data analysis was conducted using a framework guided by the four stages of analysis outlined by Morse (1994): comprehending, synthesising, theorising and recontextualising. The specific strategies for analysis in these stages centred on the work of Miles and Huberman (1994), which has been successfully used in case study research. The data were managed using NVivo software. Literature examining qualitative data analysis was reviewed and strategies illustrated by the case study example provided. Each stage of the analysis framework is described with illustration from the research example for the purpose of highlighting the benefits of a systematic approach to handling large data sets from multiple sources. By providing an example of how each stage of the analysis was conducted, it is hoped that researchers will be able to consider the benefits of such an approach to their own case study analysis. This paper illustrates specific strategies that can be employed when conducting data analysis in case study research and other qualitative research designs.

  13. Multiple-Star System Adaptive Vortex Coronagraphy Using a Liquid Crystal Light Valve

    NASA Astrophysics Data System (ADS)

    Aleksanyan, Artur; Kravets, Nina; Brasselet, Etienne

    2017-05-01

    We propose the development of a high-contrast imaging technique enabling the simultaneous and selective nulling of several light sources. This is done by realizing a reconfigurable multiple-vortex phase mask made of a liquid crystal thin film on which local topological features can be addressed electro-optically. The method is illustrated by reporting on a triple-star optical vortex coronagraphy laboratory demonstration, which can be easily extended to higher multiplicity. These results allow considering the direct observation and analysis of worlds with multiple suns and more complex extrasolar planetary systems.

  14. Enabling a systems biology knowledgebase with gaggle and firegoose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baliga, Nitin S.

    The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we have made substantial progress on development and application of the Gaggle integration framework. We added a workspace to the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an OpenCPU server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the OpenCPU server. The cloud-based framework facilitates collaboration between researchers from multiple organizations.
We have made a number of enhancements to the cmonkey2 application to enable and improve the integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.

  15. Multiple-locus variable-number tandem repeat analysis of Salmonella Enteritidis isolates from human and non-human sources using a single multiplex PCR

    PubMed Central

    Cho, Seongbeom; Boxrud, David J; Bartkus, Joanne M; Whittam, Thomas S; Saeed, Mahdi

    2007-01-01

    Simplified multiple-locus variable-number tandem repeat analysis (MLVA) was developed using one-shot multiplex PCR for seven variable-number tandem repeat (VNTR) markers with high diversity capacity. MLVA, phage typing, and pulsed-field gel electrophoresis (PFGE) were applied to 34 diverse Salmonella Enteritidis isolates from human and non-human sources. MLVA detected allelic variations that helped classify the S. Enteritidis isolates into more evenly distributed subtypes than the other methods. MLVA-based S. Enteritidis clonal groups were largely associated with the sources of the isolates. Nei's diversity indices for polymorphism ranged from 0.25 to 0.70 for the seven VNTR loci. Based on Simpson's and Shannon's diversity indices, MLVA had a higher discriminatory power than PFGE, phage typing, or multilocus enzyme electrophoresis. Therefore, MLVA may be used along with PFGE to enhance the effectiveness of molecular epidemiologic investigations of S. Enteritidis infections. PMID:17692097
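As an illustration of the diversity indices compared above, the following sketch computes Simpson's and Shannon's indices from hypothetical subtype counts; the isolate counts and subtype labels are invented for illustration, not taken from the study.

```python
from collections import Counter
from math import log

def simpson_index(counts):
    """Simpson's diversity index D = 1 - sum(p_i^2): the probability
    that two randomly drawn isolates belong to different subtypes."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def shannon_index(counts):
    """Shannon's diversity index H = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * log(c / n) for c in counts if c > 0)

# Hypothetical subtype assignments for 10 isolates under two typing methods.
mlva_types = ["A", "A", "B", "B", "C", "C", "D", "D", "E", "E"]
pfge_types = ["X", "X", "X", "X", "X", "X", "X", "Y", "Y", "Y"]

d_mlva = simpson_index(list(Counter(mlva_types).values()))
d_pfge = simpson_index(list(Counter(pfge_types).values()))
# The more evenly distributed MLVA subtypes yield the higher index,
# i.e. higher discriminatory power in this toy comparison.
print(d_mlva, d_pfge)
```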

  16. Detecting and accounting for multiple sources of positional variance in peak list registration analysis and spin system grouping.

    PubMed

    Smelter, Andrey; Rouchka, Eric C; Moseley, Hunter N B

    2017-08-01

    Peak lists derived from nuclear magnetic resonance (NMR) spectra are commonly used as input data for a variety of computer assisted and automated analyses. These include automated protein resonance assignment and protein structure calculation software tools. Prior to these analyses, peak lists must be aligned to each other and sets of related peaks must be grouped based on common chemical shift dimensions. Even when programs can perform peak grouping, they require the user to provide uniform match tolerances or use default values. However, peak grouping is further complicated by multiple sources of variance in peak position limiting the effectiveness of grouping methods that utilize uniform match tolerances. In addition, no method currently exists for deriving peak positional variances from single peak lists for grouping peaks into spin systems, i.e. spin system grouping within a single peak list. Therefore, we developed a complementary pair of peak list registration analysis and spin system grouping algorithms designed to overcome these limitations. We have implemented these algorithms into an approach that can identify multiple dimension-specific positional variances that exist in a single peak list and group peaks from a single peak list into spin systems. The resulting software tools generate a variety of useful statistics on both a single peak list and pairwise peak list alignment, especially for quality assessment of peak list datasets. We used a range of low and high quality experimental solution NMR and solid-state NMR peak lists to assess performance of our registration analysis and grouping algorithms. Analyses show that an algorithm using a single iteration and uniform match tolerances approach is only able to recover from 50 to 80% of the spin systems due to the presence of multiple sources of variance. Our algorithm recovers additional spin systems by reevaluating match tolerances in multiple iterations. 
To facilitate evaluation of the algorithms, we developed a peak list simulator within our nmrstarlib package that generates user-defined assigned peak lists from a given BMRB entry or database of entries. In addition, over 100,000 simulated peak lists with one or two sources of variance were generated to evaluate the performance and robustness of these new registration analysis and peak grouping algorithms.
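The uniform-tolerance grouping that the authors improve upon can be sketched as follows. This is a minimal single-pass illustration with hypothetical 15N chemical shifts, not the iterative, variance-reestimating algorithm the paper develops.

```python
def group_peaks(shifts, tol):
    """Group sorted 1-D chemical shifts: a peak closer than `tol` ppm
    to the running mean of the current group joins it (single pass)."""
    groups = []
    for s in sorted(shifts):
        if groups and abs(s - sum(groups[-1]) / len(groups[-1])) <= tol:
            groups[-1].append(s)
        else:
            groups.append([s])
    return groups

# Hypothetical 15N shifts (ppm) from three spin systems with jitter.
shifts = [118.02, 118.05, 118.11, 121.40, 121.44, 125.90]
print(group_peaks(shifts, tol=0.2))
```

A tolerance that is too tight splits real groups, and one that is too loose merges distinct ones; the paper's point is that no single uniform tolerance fits all dimension-specific sources of variance.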

  17. Catabolite regulation analysis of Escherichia coli for acetate overflow mechanism and co-consumption of multiple sugars based on systems biology approach using computer simulation.

    PubMed

    Matsuoka, Yu; Shimizu, Kazuyuki

    2013-10-20

    It is quite important to understand the basic principles embedded in the main metabolism for the interpretation of fermentation data. For this, it is useful to understand the regulation mechanisms through a systems biology approach. In the present study, we combined perturbation analysis with computer simulation based on models that include the effects of global regulators on pathway activation in the main metabolism of Escherichia coli. The main focus is the acetate overflow metabolism and the co-fermentation of multiple carbon sources. The perturbation analysis was first made to understand the nature of the feed-forward loop formed by the activation of Pyk by FDP (F1,6BP) and the feed-back loop formed by the inhibition of Pfk by PEP in the glycolysis. Those, together with the effect of the transcription factor Cra as modulated by the FDP level, affected the glycolysis activity. The PTS (phosphotransferase system) acts as a feed-back system that represses the glucose uptake rate when that rate increases. It was also shown that an increased PTS flux (or glucose consumption rate) causes the PEP/PYR ratio to decrease, with EIIA-P, Cya, and cAMP-Crp decreasing in turn; the lower cAMP-Crp represses the TCA cycle, and more acetate is formed. This was further verified by detailed computer simulation. In the case of multiple carbon sources such as glucose and xylose, computer simulation showed sequential utilization of the carbon sources for the wild type, but co-consumption of multiple carbon sources at slow consumption rates for the ptsG mutant, and this was verified by experiments. Moreover, the effect of a specific gene knockout such as Δpyk on the metabolic characteristics was also investigated based on the computer simulation. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Principal component analysis to separate deformation signals from multiple sources during a 2015 intrusive sequence at Kīlauea Volcano

    NASA Astrophysics Data System (ADS)

    Johanson, I. A.; Miklius, A.; Poland, M. P.

    2016-12-01

    A sequence of magmatic events in April-May 2015 at Kīlauea Volcano produced a complex deformation pattern that can be described by multiple deforming sources, active simultaneously. The 2015 intrusive sequence began with inflation in the volcano's summit caldera near Halema`uma`u (HMM) Crater, which continued over a few weeks, followed by rapid deflation of the HMM source and inflation of a source in the south caldera region during the next few days. In Kīlauea Volcano's summit area, multiple deformation centers are active at varying times, and all contribute to the overall pattern observed with GPS, tiltmeters, and InSAR. Isolating the contribution of different signals related to each source is a challenge and complicates the determination of optimal source geometry for the underlying magma bodies. We used principal component analysis of continuous GPS time series from the 2015 intrusion sequence to determine three basis vectors which together account for 83% of the variance in the data set. The three basis vectors are non-orthogonal and not strictly the principal components of the data set. In addition to separating deformation sources in the continuous GPS data, the basis vectors provide a means to scale the contribution of each source in a given interferogram. This provides an additional constraint in a joint model of GPS and InSAR data (COSMO-SkyMed and Sentinel-1A) to determine source geometry. The first basis vector corresponds with inflation in the south caldera region, an area long recognized as the location of a long-term storage reservoir. The second vector represents deformation of the HMM source, which is in the same location as a previously modeled shallow reservoir; however, InSAR data suggest a more complicated source. Preliminary modeling of the deformation attributed to the third basis vector shows that it is consistent with inflation of a steeply dipping ellipsoid centered below Keanakāko`i crater, southeast of HMM.
Keanakāko`i crater is the locus of a known, intermittently active deformation source, which was not previously recognized to have been active during the 2015 event.
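The subspace-separation idea can be illustrated with a toy PCA: two hypothetical sources with distinct time histories and spatial response patterns are mixed into synthetic "GPS" time series, and the SVD of the mean-centered data recovers a low-dimensional basis. All signals and patterns below are invented for illustration, not Kīlauea data.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
# Two hypothetical sources with distinct time histories...
inflation = np.tanh(10 * (t - 0.3))        # gradual inflation
deflation = -np.maximum(t - 0.7, 0) * 5    # late, rapid deflation
# ...and distinct spatial response patterns at 12 GPS components.
pattern1 = rng.normal(size=12)
pattern2 = rng.normal(size=12)
data = (np.outer(inflation, pattern1) + np.outer(deflation, pattern2)
        + 0.05 * rng.normal(size=(200, 12)))

# PCA via SVD of the mean-centered data matrix.
centered = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(f"variance explained by first two PCs: {explained[:2].sum():.1%}")
```

With two underlying sources, nearly all variance concentrates in the first two components; the rows of `vt` then play the role of the basis vectors discussed above.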

  19. Performance analyses of Z-source and quasi Z-source inverter for photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Himabind, S.; Priya, T. Hari; Manjeera, Ch.

    2018-04-01

    This paper presents a comparative analysis of the Z-source and quasi Z-source converters for renewable energy applications. Because renewable energy sources depend on external weather conditions, their output voltage and current vary accordingly, which affects the performance of the traditional voltage source inverter (VSI) and current source inverter (CSI) connected across them. To overcome the drawbacks of the VSI and CSI, the Z-source inverter (ZSI) and quasi Z-source inverter (QZSI) are used, which can perform multiple conversion tasks: ac-to-dc, dc-to-ac, ac-to-ac, and dc-to-dc. They can be used for both buck and boost operation by utilizing the shoot-through zero state. The QZSI is derived from the ZSI topology with a slight change in the impedance network, and it overcomes the drawbacks of the ZSI; in particular, the QZSI draws a constant current from the source, unlike the ZSI. A comparative analysis is performed between the Z-source and quasi Z-source inverters; simulations are performed in the MATLAB/Simulink environment.
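The shoot-through boost relation underlying both topologies can be sketched numerically. The formulas below are the standard textbook Z-source relations (boost factor B = 1/(1 - 2*D0) for shoot-through duty ratio D0, and peak phase voltage M*B*Vin/2 for modulation index M); they are generic, not taken from this paper's simulations.

```python
def boost_factor(d0):
    """Z-source boost factor B = T/(T - 2*T0) = 1/(1 - 2*D0),
    valid for shoot-through duty ratio D0 in [0, 0.5)."""
    if not 0 <= d0 < 0.5:
        raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d0)

def peak_ac_output(v_in, d0, m):
    """Peak phase output voltage: M * B * Vin / 2."""
    return m * boost_factor(d0) * v_in / 2.0

# A sagging 150 V PV input boosted so that, at M = 0.8, the inverter
# delivers the peak phase voltage a stiff 200 V dc link would give.
print(peak_ac_output(v_in=150, d0=0.125, m=0.8))  # about 80 V
```

A conventional VSI can only buck (peak output M*Vin/2), which is why the shoot-through state matters when the PV voltage sags.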

  20. Multiple parallel mass spectrometry for lipid and vitamin D analysis

    USDA-ARS?s Scientific Manuscript database

    Liquid chromatography (LC) coupled to mass spectrometry (MS) has become the method of choice for analysis of complex lipid samples. Two types of ionization sources have emerged as the most commonly used to couple LC to MS: atmospheric pressure chemical ionization (APCI) and electrospray ionization ...

  1. Subspace-based analysis of the ERT inverse problem

    NASA Astrophysics Data System (ADS)

    Ben Hadj Miled, Mohamed Khames; Miller, Eric L.

    2004-05-01

    In a previous work, we proposed a source-type formulation to the electrical resistance tomography (ERT) problem. Specifically, we showed that inhomogeneities in the medium can be viewed as secondary sources embedded in the homogeneous background medium and located at positions associated with variation in electrical conductivity. Assuming a piecewise constant conductivity distribution, the support of equivalent sources is equal to the boundary of the inhomogeneity. The estimation of the anomaly shape takes the form of an inverse source-type problem. In this paper, we explore the use of subspace methods to localize the secondary equivalent sources associated with discontinuities in the conductivity distribution. Our first alternative is the multiple signal classification (MUSIC) algorithm, which is commonly used in the localization of multiple sources. The idea is to project a finite collection of plausible pole (or dipole) sources onto an estimated signal subspace and select those with largest correlations. In ERT, secondary sources are excited simultaneously but in different ways, i.e. with distinct amplitude patterns, depending on the locations and amplitudes of primary sources. If the number of receivers is "large enough", different source configurations can lead to a set of observation vectors that span the data subspace. However, since sources that are spatially close to each other have highly correlated signatures, separation of such signals becomes very difficult in the presence of noise. To overcome this problem we consider iterative MUSIC algorithms like R-MUSIC and RAP-MUSIC. These recursive algorithms pose a computational burden as they require multiple large combinatorial searches. Results obtained with these algorithms using simulated data of different conductivity patterns are presented.
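The core MUSIC step, projecting candidate source signatures onto an estimated signal subspace and ranking by correlation, can be sketched as follows. The sensor geometry and candidate dictionary are random stand-ins, not an ERT forward model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_candidates, n_snapshots = 16, 40, 300
# Hypothetical dictionary of unit-norm candidate source signatures.
candidates = rng.normal(size=(n_sensors, n_candidates))
candidates /= np.linalg.norm(candidates, axis=0)

# Observations generated by two active candidates plus noise.
active = [5, 23]
amplitudes = rng.normal(size=(2, n_snapshots))
data = (candidates[:, active] @ amplitudes
        + 0.1 * rng.normal(size=(n_sensors, n_snapshots)))

# Signal subspace: leading eigenvectors of the sample covariance.
cov = data @ data.T / n_snapshots
eigvals, eigvecs = np.linalg.eigh(cov)
signal_space = eigvecs[:, -2:]          # rank-2 subspace (rank known here)

# MUSIC metric: norm of each candidate's projection onto the subspace.
corr = np.linalg.norm(signal_space.T @ candidates, axis=0)
found = set(np.argsort(corr)[-2:])
print(found)  # expected to recover the two active candidates
```

The correlation difficulty mentioned above appears here too: if two dictionary columns are nearly parallel, their projection norms are nearly equal and noise decides the ranking.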

  2. GSBPP CAPSTONE REVIEW

    DTIC Science & Technology

    2016-12-01

    ...analysis from multiple sources, including the GSBPP exit survey, archived GSBPP capstones, faculty advisement data, faculty interviews, and a new GSBPP student survey, in order to detail the capstone's process, content, and value to multiple stakeholders. The project team also employs the Plan-Do

  3. Statistical Analysis of a Class: Monte Carlo and Multiple Imputation Spreadsheet Methods for Estimation and Extrapolation

    ERIC Educational Resources Information Center

    Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael

    2017-01-01

    The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
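A spreadsheet-style version of the idea can be sketched in a few lines: missing exam scores are repeatedly filled by resampling the observed scores, and the completed-class statistic is averaged over draws. The scores and class size are hypothetical.

```python
import random

random.seed(42)
observed = [88, 92, 75, 81, 95, 67, 73, 85]   # hypothetical scores
n_missing = 4                                  # absent students

def impute_class_mean(observed, n_missing, n_draws=10_000):
    """Monte Carlo multiple imputation of the full-class mean: each
    draw fills the missing scores with resampled observed scores."""
    n = len(observed) + n_missing
    total = 0.0
    for _ in range(n_draws):
        filled = observed + random.choices(observed, k=n_missing)
        total += sum(filled) / n
    return total / n_draws

estimate = impute_class_mean(observed, n_missing)
# Resampling from the observed data centers the estimate on the observed
# mean (82.0 here); the spread across draws quantifies imputation
# uncertainty, which a single deterministic fill-in would hide.
print(round(estimate, 1))
```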

  4. Who do we think we are? Analysing the content and form of identity work in the English National Health Service.

    PubMed

    McDermott, Imelda; Checkland, Kath; Harrison, Stephen; Snow, Stephanie; Coleman, Anna

    2013-01-01

    The language used by National Health Service (NHS) "commissioning" managers when discussing their roles and responsibilities can be seen as a manifestation of "identity work", defined as a process of identifying. This paper aims to offer a novel approach to analysing "identity work" by triangulation of multiple analytical methods, combining analysis of the content of text with analysis of its form. Fairclough's discourse analytic methodology is used as a framework. Following Fairclough, the authors use analytical methods associated with Halliday's systemic functional linguistics. While analysis of the content of interviews provides some information about NHS Commissioners' perceptions of their roles and responsibilities, analysis of the form of discourse that they use provides a more detailed and nuanced view. Overall, the authors found that commissioning managers have a higher level of certainty about what commissioning is not rather than what commissioning is; GP managers have a high level of certainty of their identity as a GP rather than as a manager; and both GP managers and non-GP managers oscillate between multiple identities depending on the different situations they are in. This paper offers a novel approach to triangulation, based not on the usual comparison of multiple data sources, but rather based on the application of multiple analytical methods to a single source of data. This paper also shows the latent uncertainty about the nature of commissioning enterprise in the English NHS.

  5. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A problem with using conventional multivariate statistical approaches for classification of data of multiple types is in general that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability but most statistical classification methods do not have a mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution free since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of the problem involving how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
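The consensus-theoretic combination described above can be sketched as a linear opinion pool, in which each source's class posteriors are weighted by a reliability weight. The posteriors and weights below are invented for illustration.

```python
def linear_opinion_pool(posteriors, weights):
    """Combine per-source class posteriors with reliability weights
    (weights sum to 1): consensus_c = sum_i w_i * p_i(c)."""
    n_classes = len(posteriors[0])
    return [sum(w * p[c] for w, p in zip(weights, posteriors))
            for c in range(n_classes)]

# Hypothetical posteriors over 3 land-cover classes from two sources:
# a reliable multispectral classifier and a noisier topographic one.
spectral = [0.70, 0.20, 0.10]
topographic = [0.40, 0.50, 0.10]
consensus = linear_opinion_pool([spectral, topographic], weights=[0.8, 0.2])
print(consensus)                                   # still a distribution
print(max(range(3), key=lambda c: consensus[c]))   # winning class
```

Choosing the weights is exactly the reliability-measure problem the abstract raises; a neural network learns an implicit weighting instead.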

  6. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties, where the properties of sediment sources remain constant, or at the very least, any variation in these properties should occur in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to select conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
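The discrimination step can be sketched with a hand-rolled Kruskal-Wallis H statistic (no tie correction, so it assumes distinct values): a tracer whose concentrations separate the source groups scores high, while one whose values overlap scores near zero. The concentrations are hypothetical.

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic (assumes no tied values):
    H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), where R_i is the
    rank sum of group i over the pooled sample of size N."""
    pooled = sorted(x for g in groups for x in g)
    rank = {x: i + 1 for i, x in enumerate(pooled)}
    n = len(pooled)
    h = sum(sum(rank[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Hypothetical tracer concentrations in three sediment sources.
discriminating = [[1.1, 1.3, 1.2], [4.0, 4.2, 4.1], [9.0, 9.3, 9.1]]
overlapping = [[1.1, 4.2, 9.0], [1.3, 4.0, 9.3], [1.2, 4.1, 9.1]]
print(kruskal_h(discriminating))  # groups fully separated in rank
print(kruskal_h(overlapping))     # near zero: no discrimination
```

In practice one would use a library implementation with tie correction and a p-value; the point here is only the screening logic.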

  7. Atmospheric Multiple Scattering Effects on GLAS Altimetry. Part 2; Analysis of Expected Errors in Antarctic Altitude Measurements

    NASA Technical Reports Server (NTRS)

    Mahesh, Ashwin; Spinhirne, James D.; Duda, David P.; Eloranta, Edwin W.; Starr, David O'C (Technical Monitor)

    2001-01-01

    The altimetry bias in GLAS (Geoscience Laser Altimeter System) or other laser altimeters resulting from atmospheric multiple scattering is studied in relationship to current knowledge of cloud properties over the Antarctic Plateau. Estimates of seasonal and interannual changes in the bias are presented. Results show the bias in altitude from multiple scattering in clouds would be a significant error source without correction. The selective use of low optical depth clouds or cloudfree observations, as well as improved analysis of the return pulse such as by the Gaussian method used here, are necessary to minimize the surface altitude errors. The magnitude of the bias is affected by variations in cloud height, cloud effective particle size and optical depth. Interannual variations in these properties as well as in cloud cover fraction could lead to significant year-to-year variations in the altitude bias. Although cloud-free observations reduce biases in surface elevation measurements from space, over Antarctica these may often include near-surface blowing snow, also a source of scattering-induced delay. With careful selection and analysis of data, laser altimetry specifications can be met.
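The centroid-versus-Gaussian contrast can be sketched on a synthetic waveform: a scattering tail drags the intensity centroid late, while a Gaussian fit through the peak (three-point log-quadratic interpolation, one common "Gaussian method"; not necessarily the authors' exact implementation) stays near the true surface return.

```python
import math

def centroid(samples):
    """Intensity-weighted mean sample index (biased by any tail)."""
    return sum(i * s for i, s in enumerate(samples)) / sum(samples)

def gaussian_peak(samples):
    """Peak position from a log-quadratic (i.e. Gaussian) fit through
    the maximum sample and its two neighbours."""
    i = max(range(1, len(samples) - 1), key=lambda k: samples[k])
    la, lb, lc = (math.log(samples[k]) for k in (i - 1, i, i + 1))
    return i + 0.5 * (la - lc) / (la - 2 * lb + lc)

# Hypothetical return: Gaussian pulse centred at sample 20 plus a
# multiple-scattering tail arriving later.
pulse = [math.exp(-((i - 20.0) ** 2) / 8.0) for i in range(60)]
tail = [0.15 * math.exp(-(i - 20.0) / 10.0) if i >= 20 else 0.0
        for i in range(60)]
wave = [p + t for p, t in zip(pulse, tail)]

print(f"centroid: {centroid(wave):.2f}, gaussian: {gaussian_peak(wave):.2f}")
# The tail delays the centroid; the Gaussian fit stays near sample 20.
```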

  8. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  9. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  10. EEG and MEG source localization using recursively applied (RAP) MUSIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J.C.; Leahy, R.M.

    1996-12-31

    The multiple signal characterization (MUSIC) algorithm locates multiple asynchronous dipolar sources from electroencephalography (EEG) and magnetoencephalography (MEG) data. A signal subspace is estimated from the data, then the algorithm scans a single dipole model through a three-dimensional head volume and computes projections onto this subspace. To locate the sources, the user must search the head volume for local peaks in the projection metric. Here we describe a novel extension of this approach which we refer to as RAP (Recursively APplied) MUSIC. This new procedure automatically extracts the locations of the sources through a recursive use of subspace projections, which uses the metric of principal correlations as a multidimensional form of correlation analysis between the model subspace and the data subspace. The dipolar orientations, a form of "diverse polarization," are easily extracted using the associated principal vectors.
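The principal correlations used by RAP-MUSIC are the cosines of the principal angles between two subspaces, computable as the singular values of Qa^T Qb for orthonormal bases Qa and Qb. A minimal sketch with hand-picked subspaces (the vectors are illustrative, not lead fields):

```python
import numpy as np

def principal_correlations(a, b):
    """Cosines of the principal angles between the column spaces of
    a and b: singular values of Qa.T @ Qb, with Qa, Qb orthonormal
    bases obtained by (reduced) QR factorization."""
    qa, _ = np.linalg.qr(a)
    qb, _ = np.linalg.qr(b)
    return np.linalg.svd(qa.T @ qb, compute_uv=False)

# A model subspace sharing exactly one direction with the data subspace.
model = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e2}
data = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])   # span{e1, e3}
print(principal_correlations(model, data))  # one correlation of 1, one of 0
```

A maximal principal correlation near 1 means the scanned model subspace explains part of the data subspace, which is the peak RAP-MUSIC searches for before projecting the found source out and recursing.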

  11. Response of dissolved trace metals to land use/land cover and their source apportionment using a receptor model in a subtropic river, China.

    PubMed

    Li, Siyue; Zhang, Quanfa

    2011-06-15

    Water samples were collected for determination of dissolved trace metals in 56 sampling sites throughout the upper Han River, China. Multivariate statistical analyses including correlation analysis, stepwise multiple linear regression models, and principal component and factor analysis (PCA/FA) were employed to examine land use influences on trace metals, and a receptor model of factor analysis-multiple linear regression (FA-MLR) was used for source identification/apportionment of anthropogenic heavy metals in the surface water of the river. Our results revealed that land use was an important factor in water metals in the snowmelt flow period and that land use in the riparian zone was not a better predictor of metals than land use away from the river. Urbanization in a watershed and vegetation along river networks could better explain metals, whereas agriculture, regardless of its relative location, explained little of the variation in metals in the upper Han River. FA-MLR analysis identified five source types of metals; mining, fossil fuel combustion, and vehicle exhaust were the dominant pollution sources in the surface waters. The results demonstrated great impacts of human activities on metal concentrations in this subtropical river of China. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
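A minimal simulation of such a selection mechanism (an assumed two-stage filter, not the paper's model) shows how it biases the published evidence base: significant results are always disseminated, while non-significant ones survive the combined author and journal filters only some of the time.

```python
import random

random.seed(7)
true_effect, se, n_studies = 0.1, 0.2, 5000

published = []
for _ in range(n_studies):
    estimate = random.gauss(true_effect, se)
    significant = estimate / se > 1.96
    # Assumed selection mechanism: significant results always appear;
    # non-significant ones pass author + journal filters 30% of the time.
    if significant or random.random() < 0.3:
        published.append(estimate)

published_mean = sum(published) / len(published)
print(f"true effect {true_effect}, published mean {published_mean:.3f}")
# The published literature overstates the effect; a sensitivity analysis
# varies the assumed selection probabilities and re-estimates the mean.
```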

  13. Bi-level Multi-Source Learning for Heterogeneous Block-wise Missing Data

    PubMed Central

    Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M.; Ye, Jieping

    2013-01-01

    Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified “bi-level” learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. PMID:23988272

  14. Bi-level multi-source learning for heterogeneous block-wise missing data.

    PubMed

    Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M; Ye, Jieping

    2014-11-15

    Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer's Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified "bi-level" learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. © 2013 Elsevier Inc. All rights reserved.

  15. Application of Molecular Typing Results in Source Attribution Models: The Case of Multiple Locus Variable Number Tandem Repeat Analysis (MLVA) of Salmonella Isolates Obtained from Integrated Surveillance in Denmark.

    PubMed

    de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine

    2016-03-01

    Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for the Salmonella source attribution, and assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied often will be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.

  16. Cross-beam coherence of infrasonic signals at local and regional ranges.

    PubMed

    Alberts, W C Kirkpatrick; Tenney, Stephen M

    2017-11-01

    Signals collected by infrasound arrays require continuous analysis by skilled personnel or by automatic algorithms in order to extract usable information. Typical pieces of information gained by analysis of infrasonic signals collected by multiple sensor arrays are arrival time, line of bearing, amplitude, and duration. These can all be used, often with significant accuracy, to locate sources. A very important part of this chain is associating collected signals across multiple arrays. Here, a pairwise, cross-beam coherence method of signal association is described that allows rapid signal association for high signal-to-noise ratio events captured by multiple infrasound arrays at ranges exceeding 150 km. Methods, test cases, and results are described.
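    A toy sketch of the pairwise coherence-association idea can be built from standard spectral tools. Here two synthetic "beams" share a common 2 Hz tone while a third is pure noise; the sample rate, signal model, frequency band, and 0.7 decision threshold are all illustrative assumptions, not the authors' actual parameters.

```python
# Sketch: associate beams by their peak magnitude-squared coherence in a band.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 100.0                              # sample rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)

# A common "event" tone seen by two arrays, plus independent noise at each.
event = np.sin(2 * np.pi * 2.0 * t)     # 2 Hz tone standing in for the source
beam_a = event + 0.5 * rng.standard_normal(t.size)
beam_b = event + 0.5 * rng.standard_normal(t.size)
beam_c = rng.standard_normal(t.size)    # a third array that saw nothing

def peak_coherence(x, y, fs, fmin=1.0, fmax=5.0):
    """Peak magnitude-squared coherence of two beams within a band."""
    f, cxy = coherence(x, y, fs=fs, nperseg=1024)
    band = (f >= fmin) & (f <= fmax)
    return float(cxy[band].max())

def associated(x, y, fs, threshold=0.7):
    """Declare the beams associated if their peak band coherence is high."""
    return peak_coherence(x, y, fs) > threshold

print(associated(beam_a, beam_b, fs))   # beams that share the event
print(associated(beam_a, beam_c, fs))   # unrelated beams
```

    The Welch-averaged coherence of independent noise stays near 1/(number of segments), so a high peak in the band is a usable association statistic for high signal-to-noise events.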

  17. Socio-Economic Factors Affecting Adoption of Modern Information and Communication Technology by Farmers in India: Analysis Using Multivariate Probit Model

    ERIC Educational Resources Information Center

    Mittal, Surabhi; Mehar, Mamta

    2016-01-01

    Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…

  18. Apportioning Sources of Riverine Nitrogen at Multiple Watershed Scales

    NASA Astrophysics Data System (ADS)

    Boyer, E. W.; Alexander, R. B.; Sebestyen, S. D.

    2005-05-01

    Loadings of reactive nitrogen (N) entering terrestrial landscapes have increased in recent decades due to anthropogenic activities associated with food and energy production. In the northeastern USA, this enhanced supply of N has been linked to many environmental concerns in both terrestrial and aquatic ecosystems, such as forest decline, lake and stream acidification, human respiratory problems, and coastal eutrophication. Thus N is a priority pollutant with regard to a whole host of air, land, and water quality issues, highlighting the need for methods to identify and quantify various N sources. Further, understanding precursor sources of N is critical to current and proposed public policies targeted at the reduction of N inputs to the terrestrial landscape and receiving waters. We present results from published and ongoing studies using multiple approaches to fingerprint sources of N in the northeastern USA, at watershed scales ranging from the headwaters to the coastal zone. The approaches include: 1) a mass balance model with a nitrogen-budgeting approach for analyses of large watersheds; 2) a spatially-referenced regression model with an empirical modeling approach for analyses of water quality at regional scales; and 3) a meta-analysis of monitoring data with a chemical tracer approach, utilizing concentrations of multiple elements and isotopic composition of N from water samples collected in the streams and rivers. We discuss the successes and limitations of these various approaches for apportioning contributions of N from multiple sources to receiving waters at regional scales.

  19. Modal Analysis Using the Singular Value Decomposition and Rational Fraction Polynomials

    DTIC Science & Technology

    2017-04-06

    The programs are designed for experimental datasets with multiple drive and response points and have proven effective even for systems with numerous closely-spaced…

  20. Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.

    PubMed

    Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran

    2017-04-01

    Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). Therefore, we propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of and reporting on an event of interest. Simulations show that our technique regularly outperforms current strategies that either neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
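    The general capture-recapture logic behind such estimators can be sketched in a few lines. This is not the authors' estimator: the logistic true-event model, constant reporting rates, and no-false-positive assumption are simplifications chosen so the example is self-contained and identifiable with two sources.

```python
# Sketch: maximum-likelihood correction for underreported binary events
# observed by two overlapping sources.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)
n = 20000
x = rng.standard_normal(n)
b0, b1, r1, r2 = -1.0, 1.0, 0.7, 0.5      # ground truth for the simulation

y = rng.random(n) < expit(b0 + b1 * x)     # latent true events
s1 = y & (rng.random(n) < r1)              # source 1 reports (no false positives)
s2 = y & (rng.random(n) < r2)              # source 2 reports

def negloglik(theta):
    b0_, b1_, l1, l2 = theta
    p = expit(b0_ + b1_ * x)
    q1, q2 = expit(l1), expit(l2)          # reporting rates on the logit scale
    # P(s1, s2 | x): an event occurred and was (not) reported by each source,
    # or no event occurred at all (only compatible with both sources silent).
    lik = (p * q1**s1 * (1 - q1)**(1 - s1) * q2**s2 * (1 - q2)**(1 - s2)
           + (1 - p) * (s1 == 0) * (s2 == 0))
    return -np.log(lik).sum()

fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
b0_hat, b1_hat = fit.x[:2]
r1_hat, r2_hat = expit(fit.x[2]), expit(fit.x[3])
print(b0_hat, b1_hat, r1_hat, r2_hat)      # roughly recovers (-1.0, 1.0, 0.7, 0.5)
```

    The overlap between the two sources is what makes the reporting rates identifiable, mirroring the point in the abstract that multiple overlapping outlets carry information a single-source correction discards.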

  1. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  2. Understanding and Using the Fermi Science Tools

    NASA Astrophysics Data System (ADS)

    Asercion, Joseph

    2018-01-01

    The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website, and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and the interpretation of results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point source analysis (generating maps, spectra, and light curves), pulsar timing analysis, source identification, and the use of Python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.

  3. Mass Spectrometry Theatre: A Model for Big-Screen Instrumental Analysis

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    Teaching lecture or lab courses in instrumental analysis can be a source of frustration since one can only crowd a small number of students around a single instrument, typically leading to round-robin approaches. Round-robin labs can spread students into multiple labs and limit instructor-student interactions. We discuss "Mass Spectrometry…

  4. Analyzing Student and Employer Satisfaction with Cooperative Education through Multiple Data Sources

    ERIC Educational Resources Information Center

    Jiang, Yuheng Helen; Lee, Sally Wai Yin; Golab, Lukasz

    2015-01-01

    This paper reports on the analysis of three years research of undergraduate cooperative work term postings and employer and employee evaluations. The objective of the analysis was to determine the factors affecting student and employer success and satisfaction with the work-integrated learning experience. It was found that students performed…

  5. Enhanced thermal and structural properties of partially phosphorylated polyvinyl alcohol - aluminum phosphate (PPVA-AlPO4) nanocomposites with aluminum nitrate source

    NASA Astrophysics Data System (ADS)

    Saat, Asmalina Mohamed; Johan, Mohd Rafie

    2017-12-01

    Synthesis of AlPO4 nanocomposites depends on the ratio of aluminum to phosphate, the method of synthesis, and the aluminum and phosphate sources used. Varying the phosphate and aluminum sources gives rise to multiple equilibrium reactions that are affected by ion variability and concentration, stoichiometry, reaction temperature, and especially the precipitation pH. Aluminum nitrate was used to produce a partially phosphorylated polyvinyl alcohol-aluminum phosphate (PPVA-AlPO4) nanocomposite with various nanoparticle shapes, structures, and properties. The PPVA-AlPO4 nanocomposite synthesized with aluminum nitrate shows enhanced thermal and structural properties in comparison with pure PVA and modified PPVA. Thermogravimetric analysis (TGA) shows that the weight residue of the PPVA-AlPO4 composite was higher than that of PPVA and PVA. The X-ray diffraction (XRD) pattern of PVA shows a single peak broadening after the addition of phosphoric acid, while the XRD pattern of PPVA-AlPO4 demonstrates multiple phases of AlPO4 in the nanocomposite. Field emission scanning electron microscopy (FESEM) confirmed the existence of multiple geometrical phases and nanosized spherical particles.

  6. Source apportionment of trace metals in surface waters of a polluted stream using multivariate statistical analyses.

    PubMed

    Pekey, Hakan; Karakaş, Duran; Bakoğlu, Mithat

    2004-11-01

    Surface water samples were collected from ten previously selected sites of the polluted Dil Deresi stream during two field surveys, in December 2001 and April 2002. All samples were analyzed using ICP-AES, and the concentrations of trace metals (Al, As, Ba, Cd, Co, Cr, Cu, Fe, Pb, Sn and Zn) were determined. The results were compared with national and international water quality guidelines, as well as literature values reported for similar rivers. Factor analysis (FA) and a factor analysis-multiple regression (FA-MR) model were used for source apportionment and estimation of contributions from identified sources to the concentration of each parameter. By varimax-rotated factor analysis, four source types (paint industry, sewage, crustal material, and road traffic runoff) were identified for the trace metals, explaining about 83% of the total variance. FA-MR results showed that predicted concentrations were calculated with uncertainties lower than 15%.
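    The FA-MR idea can be sketched with a numpy-only toy: extract factors from standardized metal concentrations, then regress a metal on the factor scores to see how much of its variance the identified sources explain. Plain principal components stand in for rotated factor analysis here, and the two-source synthetic data are assumptions made so the example is self-contained.

```python
# Sketch: factor extraction + multiple regression for source apportionment.
import numpy as np

rng = np.random.default_rng(2)
n = 500
industry = rng.gamma(2.0, 1.0, n)   # hypothetical "paint industry" source
traffic = rng.gamma(2.0, 1.0, n)    # hypothetical "road traffic" source

# Six metals, each a mixture of the two sources plus measurement noise.
mixing = np.array([[1.0, 0.1], [0.9, 0.2], [0.8, 0.1],   # industry-dominated
                   [0.1, 1.0], [0.2, 0.9], [0.1, 0.8]])  # traffic-dominated
X = np.column_stack([industry, traffic]) @ mixing.T
X += 0.1 * rng.standard_normal(X.shape)

# Standardize, then take the two leading principal components as "factors".
Z = (X - X.mean(0)) / X.std(0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
factors = Z @ eigvecs[:, -2:]            # scores on the top two components

# FA-MR step: least-squares regression of one metal on the factor scores.
A = np.column_stack([np.ones(n), factors])
coef, *_ = np.linalg.lstsq(A, X[:, 0], rcond=None)
resid = X[:, 0] - A @ coef
r2 = 1 - resid.var() / X[:, 0].var()
print(round(r2, 3))                      # variance explained by the "sources"
```

    In a real FA-MR analysis the rotated loadings would also be inspected to label each factor with a physical source, which the synthetic mixing matrix here merely imitates.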

  7. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.
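    A minimal sketch of the database-table idea: one table that multiple collectors can populate, queried for poorly performing flows. The table layout, collector names, and 1 MB/s threshold are illustrative assumptions, not the paper's actual schema.

```python
# Sketch: a flow table fed by multiple sources, queried for slow flows.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE flows (
        source     TEXT,     -- which collector reported this flow
        src_host   TEXT,
        dst_host   TEXT,
        bytes      INTEGER,
        seconds    REAL
    )""")
rows = [
    ("netflow", "hostA", "hostB", 8_000_000_000, 80.0),   # ~100 MB/s
    ("tstat",   "hostA", "hostC",    40_000_000, 80.0),   # ~0.5 MB/s
]
con.executemany("INSERT INTO flows VALUES (?, ?, ?, ?, ?)", rows)

# Flag flows whose throughput falls below an assumed 1 MB/s threshold.
slow = con.execute(
    "SELECT src_host, dst_host, bytes / seconds AS bps "
    "FROM flows WHERE bytes / seconds < 1e6").fetchall()
print(slow)
```

    A front-end like the one the paper describes would sit on top of queries of this kind, joining records from the different collectors on host and time fields.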

  8. Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Zhan, Z.

    2017-12-01

    Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties in existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess the reliability and uncertainty of the obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate the rupture process of large earthquakes as a series of sub-events of varying location, timing, and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain first-order rupture complexity of large earthquakes robustly. Additionally, the relatively small number of parameters in the inverse problem yields improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and application of the MHS method to real earthquakes show that it can capture the major features of large earthquake rupture processes and provide information for more detailed rupture history analysis.

  9. Imaging System and Method for Biomedical Analysis

    DTIC Science & Technology

    2013-03-11

    biological particles and items of interest. Broadly, Padmanabhan et al. utilize the diffraction of a laser light source in flow cytometry to count...spread of light from multiple LED devices over the entire sample surface. Preferably, light source 308 projects a full spectrum white light. Light...for example, red blood cells, white blood cells (which may include lymphocytes which are relatively large and easily detectable), T-helper cells

  10. Effect of multiple-source entry on price competition after patent expiration in the pharmaceutical industry.

    PubMed Central

    Suh, D C; Manning, W G; Schondelmeyer, S; Hadsall, R S

    2000-01-01

    OBJECTIVE: To analyze the effect of multiple-source drug entry on price competition after patent expiration in the pharmaceutical industry. DATA SOURCES: Originators and their multiple-source drugs selected from the 35 chemical entities whose patents expired from 1984 through 1987. Data were obtained from various primary and secondary sources for the patents' expiration dates, sales volume and units sold, and characteristics of drugs in the sample markets. STUDY DESIGN: The study was designed to determine significant factors using the study model developed under the assumption that the off-patented market is an imperfectly segmented market. PRINCIPAL FINDINGS: After patent expiration, the originators' prices continued to increase, while the price of multiple-source drugs decreased significantly over time. By the fourth year after patent expiration, originators' sales had decreased 12 percent in dollars and 30 percent in quantity. Multiple-source drugs increased their sales twofold in dollars and threefold in quantity, and possessed about one-fourth (in dollars) and half (in quantity) of the total market three years after entry. CONCLUSION: After patent expiration, multiple-source drugs compete largely with other multiple-source drugs in the price-sensitive sector, but indirectly with the originator in the price-insensitive sector. Originators have first-mover advantages, and therefore have a market that is less price sensitive after multiple-source drugs enter. On the other hand, multiple-source drugs target the price-sensitive sector, using their lower-priced drugs. This trend may indicate that the off-patented market is imperfectly segmented between the price-sensitive and insensitive sector. Consumers as a whole can gain from the entry of multiple-source drugs because the average price of the market continually declines after patent expiration. PMID:10857475

  11. [Sources analysis and contribution identification of polycyclic aromatic hydrocarbons in indoor and outdoor air of Hangzhou].

    PubMed

    Liu, Y; Zhu, L; Wang, J; Shen, X; Chen, X

    2001-11-01

    Twelve polycyclic aromatic hydrocarbons (PAHs) were measured in eight homes in Hangzhou during the summer and autumn of 1999. The sources of PAHs and their contributions to the total concentration of PAHs in indoor air were identified by a combination of correlation analysis, factor analysis, and multiple regression, and equations relating the concentrations of PAHs in indoor and outdoor air to the factors were obtained. The results indicated that the sources of PAHs in indoor air were domestic cooking, mothball volatilization, cigarette smoke and heating, and vehicle exhaust. In smokers' homes, cigarette smoke was the most important factor, contributing 25.8% of the BaP in indoor air.

  12. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-02

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple "source-pathway-target" routes in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are adapted for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.

  13. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple “source-pathway-target” routes in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are adapted for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  14. AtomicJ: An open source software for analysis of force curves

    NASA Astrophysics Data System (ADS)

    Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina

    2014-06-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how a sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high analysis speed. It runs on all popular operating systems, including Windows, Linux, and Macintosh.
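    The core of such a force-curve analysis is fitting a contact-mechanics model to force-indentation data. As a hedged sketch (not AtomicJ's internals), the Hertz model for a spherical tip, F = (4/3)·E/(1-ν²)·√R·δ^(3/2), can be fitted to a synthetic curve; the tip radius, Poisson ratio, and simulated data are illustrative assumptions.

```python
# Sketch: extract Young's modulus from a force curve with a Hertz-model fit.
import numpy as np
from scipy.optimize import curve_fit

R = 5e-6          # tip radius (m), assumed
nu = 0.5          # Poisson's ratio for a soft sample, assumed
E_true = 10e3     # 10 kPa Young's modulus used to simulate the curve

def hertz(delta, E):
    """Hertz contact force for a spherical tip at indentation depth delta."""
    return (4.0 / 3.0) * (E / (1 - nu**2)) * np.sqrt(R) * delta**1.5

rng = np.random.default_rng(3)
delta = np.linspace(0, 500e-9, 100)                  # indentation depth (m)
force = hertz(delta, E_true) + 5e-12 * rng.standard_normal(delta.size)

(E_fit,), _ = curve_fit(hertz, delta, force, p0=[1e3])
print(round(E_fit / 1e3, 1))  # ≈ 10.0 kPa recovered from the noisy curve
```

    Real software additionally locates the contact point and corrects for cantilever deflection before this fitting step, which the synthetic data here sidesteps.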

  15. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
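    The two-mode idea can be illustrated with a deliberately simplified sketch: a learning mode accumulates an empirical background count distribution, and a detection mode scores new measurements against its upper tail. The Poisson background and the 1% decision threshold are illustrative assumptions, not the proposed system's algorithm.

```python
# Sketch: empirical-background "learning mode" plus tail-test "detection mode".
import numpy as np

rng = np.random.default_rng(4)

class BackgroundModel:
    def __init__(self):
        self.counts = []

    def learn(self, count):
        """Learning mode: fold one background measurement into the model."""
        self.counts.append(count)

    def tail_probability(self, count):
        """Fraction of learned background at or above this count."""
        bg = np.asarray(self.counts)
        return float((bg >= count).mean())

    def is_source(self, count, alpha=0.01):
        """Detection mode: flag counts in the upper alpha tail of background."""
        return self.tail_probability(count) < alpha

model = BackgroundModel()
for c in rng.poisson(lam=20.0, size=5000):   # background count-rate samples
    model.learn(c)

print(model.is_source(21))   # within normal background
print(model.is_source(45))   # well above anything the model has seen
```

    Because the distribution is learned rather than assumed, the same mechanism would absorb background drift with time, location, or detector, which is the abstract's central point.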

  16. Evaluating the Use of Existing Data Sources, Probabilistic Linkage, and Multiple Imputation to Build Population-based Injury Databases Across Phases of Trauma Care

    PubMed Central

    Newgard, Craig; Malveau, Susan; Staudenmayer, Kristan; Wang, N. Ewen; Hsia, Renee Y.; Mann, N. Clay; Holmes, James F.; Kuppermann, Nathan; Haukoos, Jason S.; Bulger, Eileen M.; Dai, Mengtao; Cook, Lawrence J.

    2012-01-01

    Objectives The objective was to evaluate the process of using existing data sources, probabilistic linkage, and multiple imputation to create large population-based injury databases matched to outcomes. Methods This was a retrospective cohort study of injured children and adults transported by 94 emergency medical systems (EMS) agencies to 122 hospitals in seven regions of the western United States over a 36-month period (2006 to 2008). All injured patients evaluated by EMS personnel within specific geographic catchment areas were included, regardless of field disposition or outcome. The authors performed probabilistic linkage of EMS records to four hospital and postdischarge data sources (emergency department [ED] data, patient discharge data, trauma registries, and vital statistics files) and then handled missing values using multiple imputation. The authors compare and evaluate matched records, match rates (proportion of matches among eligible patients), and injury outcomes within and across sites. Results There were 381,719 injured patients evaluated by EMS personnel in the seven regions. Among transported patients, match rates ranged from 14.9% to 87.5% and were directly affected by the availability of hospital data sources and proportion of missing values for key linkage variables. For vital statistics records (1-year mortality), estimated match rates ranged from 88.0% to 98.7%. Use of multiple imputation (compared to complete case analysis) reduced bias for injury outcomes, although sample size, percentage missing, type of variable, and combined-site versus single-site imputation models all affected the resulting estimates and variance. Conclusions This project demonstrates the feasibility and describes the process of constructing population-based injury databases across multiple phases of care using existing data sources and commonly available analytic methods. 
Attention to key linkage variables and decisions for handling missing values can be used to increase match rates between data sources, minimize bias, and preserve sampling design. PMID:22506952
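    Probabilistic linkage of the Fellegi-Sunter kind named above can be sketched compactly: agreement on a field adds log2(m/u) to a match weight, disagreement adds log2((1-m)/(1-u)), where m and u are the agreement probabilities among true matches and non-matches. The field list, m/u values, and record contents below are hypothetical, chosen only to make the sketch self-contained.

```python
# Sketch: Fellegi-Sunter-style match weights for linking an EMS record to
# hospital records on a few shared fields.
import math

FIELDS = {               # field name: (m, u) agreement probabilities (assumed)
    "dob":      (0.95, 0.01),
    "sex":      (0.98, 0.50),
    "zip":      (0.90, 0.05),
    "inc_date": (0.93, 0.02),
}

def match_weight(rec_a, rec_b):
    """Sum of log2 likelihood ratios over the compared fields."""
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if rec_a.get(field) == rec_b.get(field):
            w += math.log2(m / u)
        else:
            w += math.log2((1 - m) / (1 - u))
    return w

ems   = {"dob": "1970-03-02", "sex": "F", "zip": "97201", "inc_date": "2007-06-01"}
ed    = {"dob": "1970-03-02", "sex": "F", "zip": "97201", "inc_date": "2007-06-01"}
other = {"dob": "1981-11-20", "sex": "F", "zip": "97405", "inc_date": "2008-01-15"}

print(match_weight(ems, ed) > 10)      # strong candidate link
print(match_weight(ems, other) < 0)    # unlikely link
```

    In practice the weights are compared against two thresholds (link, possible link, non-link), and unlinked or partially observed records are then handled by multiple imputation, as the study describes.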

  17. Surveillance of traffic incident management-related occupational fatalities in Kentucky, 2005-2016.

    PubMed

    Bunn, T L; Slavova, S; Chandler, M; Hanner, N; Singleton, M

    2018-05-19

    Traffic incidents occurring on roadways require the coordinated effort of multiple responder and recovery entities, including communications, law enforcement, fire and rescue, emergency medical services, hazardous materials, transportation agencies, and towing and recovery. The objectives of this study were to (1) identify and characterize traffic incident management (TIM)-related occupational fatalities; (2) assess the concordance of surveillance data sources in identifying TIM occupations, driver vs. pedestrian status, and occupational fatality incident location; and (3) determine and compare U.S. occupational fatality rates for TIM industries. The Kentucky Fatality Assessment and Control Evaluation (FACE) program analyzed 2005-2016 TIM occupational fatality data using multiple data sources: death certificate data, Collision Report Analysis for Safer Highways (CRASH) data, and media reports, among others. Literal text analysis was performed on FACE data, and a multiple linear regression model and SAS proc sgpanel were used to estimate and visualize the U.S. TIM occupational mortality trend lines and confidence bounds. There were 29 TIM fatalities from 2005 to 2015 in Kentucky; 41% of decedents were in the police protection occupation, and 21% each were in the fire protection and motor vehicle towing industries. Over one half of the TIM decedents were performing work activities as pedestrians when they died. Media reports identified the majority of the occupational fatalities as TIM related (28 of 29 TIM-related deaths); death certificates used as the sole surveillance data source identified only 17 of the 29 deaths as TIM related, and CRASH data identified only 4 of the 29. Injury scenario text analysis showed that law enforcement vehicle pursuit, towing and recovery vehicle loading, and disabled vehicle response were particular high-risk activities that led to TIM deaths. Using U.S. data, the motor vehicle towing industry had a significantly higher risk of occupational mortality compared to the fire protection and police protection industries. Multiple data sources are needed to comprehensively identify TIM fatalities and to examine the circumstances surrounding them, because no single data source was adequate by itself and each undercounted the total number of TIM fatalities. The motor vehicle towing industry, in particular, is at elevated risk of occupational mortality, and targeted mandatory TIM training for this industry should be considered. In addition, enhanced law enforcement roadside safety training during vehicle pursuit and apprehension of suspects is recommended.

  18. Design Environment for Multifidelity and Multidisciplinary Components

    NASA Technical Reports Server (NTRS)

    Platt, Michael

    2014-01-01

    One of the greatest challenges when developing propulsion systems is predicting the interacting effects between the fluid loads, thermal loads, and structural deflection. The interactions between technical disciplines often are not fully analyzed, and the analysis in one discipline often uses a simplified representation of other disciplines as an input or boundary condition. For example, the fluid forces in an engine generate static and dynamic rotor deflection, but the forces themselves are dependent on the rotor position and its orbit. It is important to consider the interaction between the physical phenomena where the outcome of each analysis is heavily dependent on the inputs (e.g., changes in flow due to deflection, changes in deflection due to fluid forces). A rigid design process also lacks the flexibility to employ multiple levels of fidelity in the analysis of each of the components. This project developed and validated an innovative design environment that has the flexibility to simultaneously analyze multiple disciplines and multiple components with multiple levels of model fidelity. Using NASA's open-source multidisciplinary design analysis and optimization (OpenMDAO) framework, this multifaceted system will provide substantially superior capabilities to current design tools.

  19. A study of high-temperature heat pipes with multiple heat sources and sinks. I - Experimental methodology and frozen startup profiles. II - Analysis of continuum transient and steady-state experimental data with numerical predictions

    NASA Technical Reports Server (NTRS)

    Faghri, A.; Cao, Y.; Buchko, M.

    1991-01-01

Experimental profiles for heat pipe startup from the frozen state were obtained using a high-temperature sodium/stainless steel pipe with multiple heat sources and sinks, to investigate the startup behavior of the heat pipe for various heat loads and input locations, with both low and high heat rejection rates at the condenser. The experimental results for the performance characteristics of continuum transient and steady-state operation of the heat pipe were analyzed, and the performance limits for operation with varying heat fluxes and locations were determined.

  20. Water sources and mixing in riparian wetlands revealed by tracers and geospatial analysis.

    PubMed

    Lessels, Jason S; Tetzlaff, Doerthe; Birkel, Christian; Dick, Jonathan; Soulsby, Chris

    2016-01-01

Mixing of waters within riparian zones has been identified as an important influence on runoff generation and water quality. Improved understanding of the controls on the spatial and temporal variability of water sources and how they mix in riparian zones is therefore of both fundamental and applied interest. In this study, we have combined topographic indices derived from a high-resolution Digital Elevation Model (DEM) with repeated spatially high-resolution synoptic sampling of multiple tracers to investigate such dynamics of source water mixing. We use geostatistics to estimate concentrations of three different tracers (deuterium, alkalinity, and dissolved organic carbon) across an extended riparian zone in a headwater catchment in NE Scotland, to identify spatial and temporal influences on mixing of source waters. The various biogeochemical tracers and stable isotopes helped constrain the sources of runoff and their temporal dynamics. Results show that spatial variability in all three tracers was evident in all sampling campaigns, but more pronounced in warmer, drier periods. The extent of mixing areas within the riparian zone reflected strong hydroclimatic controls and showed large degrees of expansion and contraction that were not strongly related to topographic indices. The integrated approach of using multiple tracers, geospatial statistics, and topographic analysis allowed us to classify three main riparian source areas and mixing zones. This study underlines the importance of riparian zones for mixing soil water and groundwater and introduces a novel approach by which this mixing can be quantified and its effect on downstream chemistry assessed.
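The geostatistical step above estimates tracer concentrations between sampling points. As a simpler stand-in for the kriging-type interpolation the study draws on, an inverse-distance-weighting interpolator (the sample layout in the test is hypothetical):

```python
def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate of a tracer concentration at (x, y)
    from point samples [(xi, yi, ci), ...]. A simple stand-in for kriging:
    nearer samples dominate, controlled by the power exponent."""
    num = den = 0.0
    for xi, yi, ci in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return ci  # exactly at a sample point: return its value
        w = 1.0 / d2 ** (power / 2.0)
        num += w * ci
        den += w
    return num / den
```

Unlike kriging, IDW provides no estimate of interpolation uncertainty and ignores spatial correlation structure, which is why geostatistical methods are preferred for mapping tracers.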

  1. Development of Remote Sampling ESI Mass Spectrometry for the Rapid and Automatic Analysis of Multiple Samples

    PubMed Central

    Yamada, Yuki; Ninomiya, Satoshi; Hiraoka, Kenzo; Chen, Lee Chuin

    2016-01-01

    We report on combining a self-aspirated sampling probe and an ESI source using a single metal capillary which is electrically grounded and safe for use by the operator. To generate an electrospray, a negative H.V. is applied to the counter electrode of the ESI emitter to operate in positive ion mode. The sampling/ESI capillary is enclosed within another concentric capillary similar to the arrangement for a standard pneumatically assisted ESI source. The suction of the liquid sample is due to the Venturi effect created by the high-velocity gas flow near the ESI tip. In addition to serving as the mechanism for suction, the high-velocity gas flow also assists in the nebulization of charged droplets, thus producing a stable ion signal. Even though the potential of the ion source counter electrode is more negative than the mass spectrometer in the positive ion mode, the electric field effect is not significant if the ion source and the mass spectrometer are separated by a sufficient distance. Ion transmission is achieved by the viscous flow of the carrier gas. Using the present arrangement, the user can hold the ion source in a bare hand and the ion signal appears almost immediately when the sampling capillary is brought into contact with the liquid sample. The automated analysis of multiple samples can also be achieved by using motorized sample stage and an automated ion source holder. PMID:28616373

  2. Development of Remote Sampling ESI Mass Spectrometry for the Rapid and Automatic Analysis of Multiple Samples.

    PubMed

    Yamada, Yuki; Ninomiya, Satoshi; Hiraoka, Kenzo; Chen, Lee Chuin

    2016-01-01

    We report on combining a self-aspirated sampling probe and an ESI source using a single metal capillary which is electrically grounded and safe for use by the operator. To generate an electrospray, a negative H.V. is applied to the counter electrode of the ESI emitter to operate in positive ion mode. The sampling/ESI capillary is enclosed within another concentric capillary similar to the arrangement for a standard pneumatically assisted ESI source. The suction of the liquid sample is due to the Venturi effect created by the high-velocity gas flow near the ESI tip. In addition to serving as the mechanism for suction, the high-velocity gas flow also assists in the nebulization of charged droplets, thus producing a stable ion signal. Even though the potential of the ion source counter electrode is more negative than the mass spectrometer in the positive ion mode, the electric field effect is not significant if the ion source and the mass spectrometer are separated by a sufficient distance. Ion transmission is achieved by the viscous flow of the carrier gas. Using the present arrangement, the user can hold the ion source in a bare hand and the ion signal appears almost immediately when the sampling capillary is brought into contact with the liquid sample. The automated analysis of multiple samples can also be achieved by using motorized sample stage and an automated ion source holder.

  3. Detecting misinformation and knowledge conflicts in relational data

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian

    2014-06-01

Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commanders' information requests, integrate facts into summaries about current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open source data processing, where there is a much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against the data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modifications to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In the experimental study with the SYNCOIN dataset, our algorithms achieved perfect conflict detection in an ideal situation (no missing data) while producing 82% recall and 90% precision in a realistic noise situation (15% of attributes missing).
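The core conflict pattern described, different sources asserting contradictory attribute values for co-referent entities, can be sketched without graph machinery. A minimal illustration with hypothetical reports; a real system would add soft co-reference and uncertainty handling as the paper describes:

```python
from collections import defaultdict

def find_conflicts(reports):
    """Group (entity, attribute, value, source) reports and flag attributes
    where the same entity receives contradictory values from different sources."""
    seen = defaultdict(set)
    for entity, attr, value, source in reports:
        seen[(entity, attr)].add((value, source))
    conflicts = {}
    for key, pairs in seen.items():
        values = {v for v, _ in pairs}
        if len(values) > 1:
            conflicts[key] = sorted(values)  # conflicting values for review
    return conflicts

# Hypothetical multi-source reports about one entity.
reports = [
    ("person_7", "location", "Kabul", "report_A"),
    ("person_7", "location", "Herat", "report_B"),
    ("person_7", "role", "courier", "report_A"),
]
```

Here the two location reports conflict while the role attribute does not; the paper's graph-pattern approach generalizes this to relations and events under uncertain entity co-reference.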

  4. Advanced Optimal Extraction for the Spitzer/IRS

    NASA Astrophysics Data System (ADS)

    Lebouteiller, V.; Bernard-Salas, J.; Sloan, G. C.; Barry, D. J.

    2010-02-01

    We present new advances in the spectral extraction of pointlike sources adapted to the Infrared Spectrograph (IRS) on board the Spitzer Space Telescope. For the first time, we created a supersampled point-spread function of the low-resolution modules. We describe how to use the point-spread function to perform optimal extraction of a single source and of multiple sources within the slit. We also examine the case of the optimal extraction of one or several sources with a complex background. The new algorithms are gathered in a plug-in called AdOpt which is part of the SMART data analysis software.
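Optimal extraction of a point source weights each cross-dispersion pixel by its point-spread-function profile relative to its variance. A sketch of the standard Horne-style estimator for one wavelength row; this is the generic algorithm, not the internals of the AdOpt plug-in:

```python
def optimal_extract(counts, profile, variance):
    """Horne-style optimal extraction for one wavelength row:
    flux = sum(P * C / V) / sum(P**2 / V), with the spatial profile P
    normalized to sum to 1. Noisy wing pixels are downweighted."""
    psum = sum(profile)
    p = [pi / psum for pi in profile]
    num = sum(pi * ci / vi for pi, ci, vi in zip(p, counts, variance))
    den = sum(pi * pi / vi for pi, vi in zip(p, variance))
    return num / den
```

When the counts follow the profile exactly, the estimator returns the true flux; its advantage over simple summation appears when the variance differs across pixels, e.g. sky-dominated wings.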

  5. Acoustic Source Localization in Aircraft Interiors Using Microphone Array Technologies

    NASA Technical Reports Server (NTRS)

    Sklanka, Bernard J.; Tuss, Joel R.; Buehrle, Ralph D.; Klos, Jacob; Williams, Earl G.; Valdivia, Nicolas

    2006-01-01

Using three microphone array configurations at two aircraft body stations on a Boeing 777-300ER flight test, the acoustic radiation characteristics of the sidewall and outboard floor system are investigated by experimental measurement. Analysis of the experimental data is performed using sound intensity calculations for closely spaced microphones, PATCH Inverse Boundary Element Nearfield Acoustic Holography, and Spherical Nearfield Acoustic Holography. The methods are compared by assessing their strengths and weaknesses, evaluating source identification capability for both broadband and narrowband sources, evaluating sources during transient and steady-state conditions, and quantifying field reconstruction continuity using multiple array positions.

  6. Multiple Household Water Sources and Their Use in Remote Communities With Evidence From Pacific Island Countries

    NASA Astrophysics Data System (ADS)

    Elliott, Mark; MacDonald, Morgan C.; Chan, Terence; Kearton, Annika; Shields, Katherine F.; Bartram, Jamie K.; Hadwen, Wade L.

    2017-11-01

    Global water research and monitoring typically focus on the household's "main source of drinking-water." Use of multiple water sources to meet daily household needs has been noted in many developing countries but rarely quantified or reported in detail. We gathered self-reported data using a cross-sectional survey of 405 households in eight communities of the Republic of the Marshall Islands (RMI) and five Solomon Islands (SI) communities. Over 90% of households used multiple sources, with differences in sources and uses between wet and dry seasons. Most RMI households had large rainwater tanks and rationed stored rainwater for drinking throughout the dry season, whereas most SI households collected rainwater in small pots, precluding storage across seasons. Use of a source for cooking was strongly positively correlated with use for drinking, whereas use for cooking was negatively correlated or uncorrelated with nonconsumptive uses (e.g., bathing). Dry season water uses implied greater risk of water-borne disease, with fewer (frequently zero) handwashing sources reported and more unimproved sources consumed. Use of multiple sources is fundamental to household water management and feasible to monitor using electronic survey tools. We contend that recognizing multiple water sources can greatly improve understanding of household-level and community-level climate change resilience, that use of multiple sources confounds health impact studies of water interventions, and that incorporating multiple sources into water supply interventions can yield heretofore-unrealized benefits. We propose that failure to consider multiple sources undermines the design and effectiveness of global water monitoring, data interpretation, implementation, policy, and research.

  7. A Task-Based Needs Analysis for Australian Aboriginal Students: Going beyond the Target Situation to Address Cultural Issues

    ERIC Educational Resources Information Center

    Oliver, Rhonda; Grote, Ellen; Rochecouste, Judith; Exell, Michael

    2013-01-01

    While needs analyses underpin the design of second language analytic syllabi, the methodologies undertaken are rarely examined. This paper explores the value of multiple data sources and collection methods for developing a needs analysis model to enable vocational education and training teachers to address the needs of Australian Aboriginal…

  8. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  9. Accolades and Recommendations: A Longitudinal Analysis of Monitoring Reports for Two Charter Schools Serving Native American Students

    ERIC Educational Resources Information Center

    Anderson, Derek L.; Holder, K. C.

    2012-01-01

    This longitudinal case study examines 10 years' worth of annual monitoring reports for two rural Native American Charter Schools. Using data from multiple sources including interviews, site visits, and document analyses, the authors used provisional coding and constant comparison analysis to categorize the accolades and recommendations embedded in…

  10. Assessment of Innovation Competency: A Thematic Analysis of Upper Secondary School Teachers' Talk

    ERIC Educational Resources Information Center

    Nielsen, Jan Alexis

    2015-01-01

The author employed a 3-step qualitative research design with multiple instances of source validation to capture expert teachers' (n = 28) reflections on which manifest signs they would look for when they assess students' innovation competency. The author reports on the thematic analysis of the recorded talk in interaction that occurred in teacher…

  11. Sources of sport confidence, imagery type and performance among competitive athletes: the mediating role of sports confidence.

    PubMed

    Levy, A R; Perry, J; Nicholls, A R; Larkin, D; Davies, J

    2015-01-01

This study explored the mediating role of sport confidence upon (1) the sources of sport confidence-performance relationship and (2) the imagery-performance relationship. Participants were 157 competitive athletes who completed state measures of confidence level/sources, imagery type, and performance within one hour after competition. Among the current sample, confirmatory factor analysis revealed appropriate support for the nine-factor SSCQ and the five-factor SIQ. Mediational analysis revealed that sport confidence had a mediating influence upon the achievement source of confidence-performance relationship. In addition, both cognitive and motivational imagery types were found to be important sources of confidence, as sport confidence mediated the imagery type-performance relationship. Findings indicated that athletes who construe confidence from their own achievements and report multiple images on a more frequent basis are likely to benefit from enhanced levels of state sport confidence and subsequent performance.

  12. Predictors of Age of Diagnosis for Children with Autism Spectrum Disorder: The Role of a Consistent Source of Medical Care, Race, and Condition Severity

    ERIC Educational Resources Information Center

    Emerson, Natacha D.; Morrell, Holly E. R.; Neece, Cameron

    2016-01-01

    Having a consistent source of medical care may facilitate diagnosis of autism spectrum disorders (ASD). This study examined predictors of age of ASD diagnosis using data from the 2011-2012 National Survey of Children's Health. Using multiple linear regression analysis, age of diagnosis was predicted by race, ASD severity, having a consistent…

  13. Validation of a Sensor-Driven Modeling Paradigm for Multiple Source Reconstruction with FFT-07 Data

    DTIC Science & Technology

    2009-05-01

operational warning and reporting (information) systems that combine automated data acquisition, analysis, source reconstruction, display and distribution of...report and to incorporate this operational capability into the integrative multiscale urban modeling system implemented in the computational...Journal of Fluid Mechanics, 180, 529–556. [27] Flesch, T., Wilson, J. D., and Yee, E. (1995), Backward-time Lagrangian stochastic dispersion models

  14. Detection of multiple enteric virus strains within a foodborne outbreak of gastroenteritis: an indication of the source of contamination.

    PubMed Central

    Gallimore, C. I.; Pipkin, C.; Shrimpton, H.; Green, A. D.; Pickford, Y.; McCartney, C.; Sutherland, G.; Brown, D. W. G.; Gray, J. J.

    2005-01-01

An outbreak of acute gastroenteritis of suspected viral aetiology occurred in April 2003 on the British Royal Fleet Auxiliary (RFA) ship Argus, deployed in the Northern Arabian Gulf. There were 37 cases amongst a crew of 400 personnel. Of 13 samples examined from cases amongst the crew, six enteric viruses were detected by reverse transcriptase polymerase chain reaction (RT-PCR). Five different viruses were identified, including three norovirus genotypes, a sapovirus, and a rotavirus. No multiple infections were detected. A common food source was implicated in the outbreak, and epidemiological analysis showed a statistically significant association with salad as the source of the outbreak, with a relative risk of 3.41 (95% confidence interval 1.7-6.81) for eating salad on a particular date prior to the onset of symptoms. Faecal contamination of the salad at source was the most probable explanation for the diversity of viruses detected and characterized. PMID:15724709
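The reported relative risk of 3.41 comes from a standard 2x2 cohort-table calculation. A sketch of that calculation with a log-normal confidence interval, using the textbook formula; the counts in the test are hypothetical, since the outbreak's full table is not given here:

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk from a 2x2 cohort table:
    exposed: a ill, b well; unexposed: c ill, d well.
    Returns (RR, lower, upper) with a log-normal 95% CI (z = 1.96)."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR) by the delta method.
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)
```

A confidence interval excluding 1, as in the outbreak above, indicates a statistically significant association between the exposure and illness.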

  15. Misconceptions and biases in German students' perception of multiple energy sources: implications for science education

    NASA Astrophysics Data System (ADS)

    Lee, Roh Pin

    2016-04-01

    Misconceptions and biases in energy perception could influence people's support for developments integral to the success of restructuring a nation's energy system. Science education, in equipping young adults with the cognitive skills and knowledge necessary to navigate in the confusing energy environment, could play a key role in paving the way for informed decision-making. This study examined German students' knowledge of the contribution of diverse energy sources to their nation's energy mix as well as their affective energy responses so as to identify implications for science education. Specifically, the study investigated whether and to what extent students hold mistaken beliefs about the role of multiple energy sources in their nation's energy mix, and assessed how misconceptions could act as self-generated reference points to underpin support/resistance of proposed developments. An in-depth analysis of spontaneous affective associations with five key energy sources also enabled the identification of underlying concerns driving people's energy responses and facilitated an examination of how affective perception, in acting as a heuristic, could lead to biases in energy judgment and decision-making. Finally, subgroup analysis differentiated by education and gender supported insights into a 'two culture' effect on energy perception and the challenge it poses to science education.

  16. Dynamic Dependence Analysis : Modeling and Inference of Changing Dependence Among Multiple Time-Series

    DTIC Science & Technology

    2009-06-01

isolation. In addition to being inherently multi-modal, human perception takes advantage of multiple sources of information within a single modality...restriction was reasonable for the applications we looked at. However, consider using a TIM to model a teacher-student relationship among moving objects...That is, imagine one teacher object demonstrating a behavior for a student object. The student can observe the teacher and then recreate the behavior

  17. Validation of luminescent source reconstruction using spectrally resolved bioluminescence images

    NASA Astrophysics Data System (ADS)

    Virostko, John M.; Powers, Alvin C.; Jansen, E. D.

    2008-02-01

    This study examines the accuracy of the Living Image® Software 3D Analysis Package (Xenogen, Alameda, CA) in reconstruction of light source depth and intensity. Constant intensity light sources were placed in an optically homogeneous medium (chicken breast). Spectrally filtered images were taken at 560, 580, 600, 620, 640, and 660 nanometers. The Living Image® Software 3D Analysis Package was employed to reconstruct source depth and intensity using these spectrally filtered images. For sources shallower than the mean free path of light there was proportionally higher inaccuracy in reconstruction. For sources deeper than the mean free path, the average error in depth and intensity reconstruction was less than 4% and 12%, respectively. The ability to distinguish multiple sources decreased with increasing source depth and typically required a spatial separation of twice the depth. The constant intensity light sources were also implanted in mice to examine the effect of optical inhomogeneity. The reconstruction accuracy suffered in inhomogeneous tissue with accuracy influenced by the choice of optical properties used in reconstruction.

  18. An integrative framework to reevaluate the Neotropical catfish genus Guyanancistrus (Siluriformes: Loricariidae) with particular emphasis on the Guyanancistrus brevispinis complex.

    PubMed

    Fisch-Muller, Sonia; Mol, Jan H A; Covain, Raphaël

    2018-01-01

Characterizing and naming species is becoming more and more challenging due to the increasing difficulty of accurately delineating specific boundaries. In this context, integrative taxonomy aims to delimit taxonomic units by leveraging the complementarity of multiple data sources (geography, morphology, genetics, etc.). However, while the theoretical framework of integrative taxonomy has been explicitly stated, methods for the simultaneous analysis of multiple data sets are poorly developed, and in many cases different information sources are still explored successively. Multi-table methods developed in the field of community ecology provide such an integrative framework. In particular, multiple co-inertia analysis is flexible enough to allow the integration of morphological, distributional, and genetic data in the same analysis. We have applied this powerful approach to delimit species boundaries in a group of poorly differentiated catfishes belonging to the genus Guyanancistrus from the Guianas region of northeastern South America. Because the species G. brevispinis has been claimed to be a species complex consisting of five species, particular attention was paid to this taxon. Separate analyses indicated the presence of eight distinct species of Guyanancistrus, including five new species and one new genus. However, none of the preliminary analyses revealed different lineages within G. brevispinis, whereas the multi-table analysis revealed three intraspecific lineages. After taxonomic clarifications and description of the new genus, species, and subspecies, a reappraisal of the biogeography of Guyanancistrus was performed. This analysis revealed three distinct dispersals from the upper reaches of Amazonian tributaries toward coastal rivers of the Eastern Guianas Ecoregion. The central role played by the Maroni River as a gateway from the Amazon basin was confirmed. The Maroni River was also found to be a center of speciation for Guyanancistrus (with three species and two subspecies), as well as a source of dispersal of G. brevispinis toward the other main basins of the Eastern Guianas.

  19. An integrative framework to reevaluate the Neotropical catfish genus Guyanancistrus (Siluriformes: Loricariidae) with particular emphasis on the Guyanancistrus brevispinis complex

    PubMed Central

    Fisch-Muller, Sonia; Mol, Jan H. A.

    2018-01-01

Characterizing and naming species is becoming more and more challenging due to the increasing difficulty of accurately delineating specific boundaries. In this context, integrative taxonomy aims to delimit taxonomic units by leveraging the complementarity of multiple data sources (geography, morphology, genetics, etc.). However, while the theoretical framework of integrative taxonomy has been explicitly stated, methods for the simultaneous analysis of multiple data sets are poorly developed, and in many cases different information sources are still explored successively. Multi-table methods developed in the field of community ecology provide such an integrative framework. In particular, multiple co-inertia analysis is flexible enough to allow the integration of morphological, distributional, and genetic data in the same analysis. We have applied this powerful approach to delimit species boundaries in a group of poorly differentiated catfishes belonging to the genus Guyanancistrus from the Guianas region of northeastern South America. Because the species G. brevispinis has been claimed to be a species complex consisting of five species, particular attention was paid to this taxon. Separate analyses indicated the presence of eight distinct species of Guyanancistrus, including five new species and one new genus. However, none of the preliminary analyses revealed different lineages within G. brevispinis, whereas the multi-table analysis revealed three intraspecific lineages. After taxonomic clarifications and description of the new genus, species, and subspecies, a reappraisal of the biogeography of Guyanancistrus was performed. This analysis revealed three distinct dispersals from the upper reaches of Amazonian tributaries toward coastal rivers of the Eastern Guianas Ecoregion. The central role played by the Maroni River as a gateway from the Amazon basin was confirmed. The Maroni River was also found to be a center of speciation for Guyanancistrus (with three species and two subspecies), as well as a source of dispersal of G. brevispinis toward the other main basins of the Eastern Guianas. PMID:29298344

  20. Share Repository Framework: Component Specification and Ontology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools

  1. Comparison of two trajectory based models for locating particle sources for two rural New York sites

    NASA Astrophysics Data System (ADS)

    Zhou, Liming; Hopke, Philip K.; Liu, Wei

Two back trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), were compared for their ability to identify likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. QTBA attempts to take into account the distribution of concentrations around the directions of the back trajectories. In the full QTBA approach, deposition processes (wet and dry) are also considered; simplified QTBA omits deposition and is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models were used in combination with the source contribution values obtained by the previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six common sources for the two sites (sulfate, soil, zinc smelter, nitrate, wood smoke, and copper smelter) were analyzed. The results of the two methods are consistent and locate large, clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of the source locations.
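Residence-time weighting distributes each receptor measurement over the grid cells its back trajectory visited, so a cell's value becomes the average receptor concentration over all trajectory hours spent in it. A minimal sketch with hypothetical trajectories; real implementations track many trajectory endpoints and redistribute concentrations iteratively across sites:

```python
from collections import defaultdict

def rtwc(trajectories):
    """Residence-time weighted concentrations. Each trajectory is
    (concentration_at_receptor, [grid_cell, ...]), where each listed cell
    represents one hour of residence. A cell's value is the mean receptor
    concentration over all trajectory hours spent in that cell."""
    total = defaultdict(float)
    hours = defaultdict(int)
    for conc, cells in trajectories:
        for cell in cells:
            total[cell] += conc
            hours[cell] += 1
    return {cell: total[cell] / hours[cell] for cell in total}
```

Cells visited mostly by high-concentration trajectories stand out as likely source regions, which is the basic logic behind both RTWC and trajectory-ensemble methods generally.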

  2. The influence of the interactions between anthropogenic activities and multiple ecological factors on land surface temperatures of urban forests

    NASA Astrophysics Data System (ADS)

    Ren, Y.

    2017-12-01

Context: The spatio-temporal distribution patterns of land surface temperatures (LSTs) in urban forests are influenced by many ecological factors; identifying the interactions between these factors can improve simulations and predictions of the spatial patterns of urban cold islands. This quantitative research requires an integrated method that combines multiple data sources with spatial statistical analysis. Objectives: The purpose of this study was to clarify how interactions between anthropogenic activities and multiple ecological factors influence urban forest LST, using hot- and cold-spot cluster analysis and the GeoDetector model. We introduced the hypothesis that anthropogenic activity interacts with certain ecological factors and that their combination influences urban forest LST. We also assumed that the spatio-temporal distributions of urban forest LST are similar to those of the ecological factors and can be represented quantitatively. Methods: We used Jinjiang, a representative city in China, as a case study. Population density was employed to represent anthropogenic activity. We built a multi-source data set (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) on a unified urban scale to support research on the interactions influencing urban forest LST. By combining spatial statistical analysis, multi-source spatial data, and the GeoDetector model, the interaction mechanisms of urban forest LST were revealed. Results: Although different ecological factors have different influences on forest LST, in two periods with different hot and cold spots, patch area and dominant tree species were the main factors contributing to LST clustering in urban forests. The interaction between anthropogenic activity and multiple ecological factors increased LST in urban forest stands, both linearly and nonlinearly. Strong interactions between elevation and dominant species were generally observed and were prevalent in both hot- and cold-spot areas in different years. Conclusions: A combination of spatial statistics and GeoDetector models should be effective for quantitatively evaluating the interactive relationships among ecological factors, anthropogenic activity, and LST.
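GeoDetector's factor detector quantifies how much a categorical factor explains LST variance with the q-statistic, q = 1 - SSW/SST (within-strata sum of squares over total sum of squares). A minimal sketch, assuming the stratification of LST values by an ecological factor has been done upstream:

```python
def q_statistic(groups):
    """GeoDetector factor-detector q = 1 - SSW/SST, where groups is a list
    of lists of LST values stratified by one categorical ecological factor.
    q near 1: the factor explains most spatial variance; q near 0: none."""
    all_vals = [v for g in groups for v in g]
    n = len(all_vals)
    mean = sum(all_vals) / n
    sst = sum((v - mean) ** 2 for v in all_vals)   # total sum of squares
    ssw = 0.0                                      # within-strata sum of squares
    for g in groups:
        gm = sum(g) / len(g)
        ssw += sum((v - gm) ** 2 for v in g)
    return 1.0 - ssw / sst
```

The interaction detector used in the study compares q for the intersection of two factors' strata against each factor's individual q, which is how linear versus nonlinear enhancement is diagnosed.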

  3. Analysis of Discontinuity Induced Bifurcations in a Dual Input DC-DC Converter

    NASA Astrophysics Data System (ADS)

    Giaouris, Damian; Banerjee, Soumitro; Mandal, Kuntal; Al-Hindawi, Mohammed M.; Abusorrah, Abdullah; Al-Turki, Yusuf; El Aroudi, Abdelali

DC-DC power converters with multiple inputs and a single output are used in numerous applications where multiple sources, e.g. two or more renewable energy sources and/or a battery, feed a single load. In this work, a classical boost converter topology with two input branches connected to two different sources is chosen, with each branch independently controlled by a separate peak current mode controller. We demonstrate for the first time that even though this converter is similar to other well-known topologies that have been studied before, it exhibits many complex nonlinear behaviors that are not found in any other standard PWM-controlled power converter. The system undergoes a period-incrementing cascade as a parameter is varied, with discontinuous hard transitions between consecutive periodicities. We show that the system can be described by a discontinuous map, which explains the observed bifurcation phenomena. The results have been experimentally validated.
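The discontinuous-map behavior described above can be illustrated with a one-dimensional piecewise-linear map. The map below is hypothetical, not the converter's actual discrete-time model; the helper detects the periodicity of the attractor after discarding a transient, which is how period-incrementing cascades are traced numerically as a parameter is swept:

```python
def detect_period(f, x0, transient=1000, max_period=64, tol=1e-9):
    """Iterate the map f from x0, discard a transient, then return the
    smallest period p with |x_{n+p} - x_n| < tol (None if no period found)."""
    x = x0
    for _ in range(transient):
        x = f(x)
    orbit = [x]
    for _ in range(max_period):
        x = f(x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

# Hypothetical piecewise-linear map with a discontinuity at x = 0.5.
f = lambda x: 0.5 * x + 0.4 if x < 0.5 else 0.5 * x - 0.3
```

For this particular map the orbit settles onto an attracting period-3 cycle through the points 0, 0.4, and 0.6; shifting the branch offsets changes the number of iterates spent on each side of the discontinuity, producing hard jumps between consecutive periodicities.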

  4. Assessment of source-specific health effects associated with an unknown number of major sources of multiple air pollutants: a unified Bayesian approach.

    PubMed

    Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H

    2014-07-01

    There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions) as well as uncertainty in the number of major pollution sources and identifiability conditions have been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models while simultaneously accounting for parameter uncertainty in estimated source-specific exposures in assessment of source-specific health effects is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter ([Formula: see text]), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from the previously conducted workshop/studies on the source apportionment of PM health effects in terms of number of major contributing sources, estimated source profiles, and contributions. 
However, some of the adverse source-specific health effects identified in previous studies were not statistically significant in our analysis, most likely because we incorporated into the estimation of the health-effects parameters the uncertainty in the estimated source contributions, which previous studies had ignored. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
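    The paper's posterior model probabilities come from a full Bayesian treatment; as an illustrative stand-in, approximate model weights over candidate source counts can be computed from BIC scores of the competing receptor models. A sketch with entirely hypothetical residuals and dimensions:

```python
import numpy as np

def bic_weights(rss, ks, n, p):
    """Approximate model weights from BIC for candidate receptor models.
    rss: residual sum of squares per model; ks: number of sources per model;
    n: observations (e.g. days); p: measured species per observation."""
    rss, ks = np.asarray(rss, float), np.asarray(ks)
    n_obs = n * p                          # total data points in X (n x p)
    n_par = ks * (n + p)                   # contributions + profiles per model
    bic = n_obs * np.log(rss / n_obs) + n_par * np.log(n_obs)
    delta = bic - bic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical fits for 4, 5, and 6 sources; with this parameter-heavy
# parametrization the BIC penalty favors the most parsimonious model
weights = bic_weights(rss=[9000.0, 5200.0, 5150.0], ks=[4, 5, 6],
                      n=365, p=15)
best = weights.argmax()   # index of the favored source count
```

    This is only a rough frequentist approximation of the model-uncertainty idea; the cited approach additionally averages over identifiability conditions.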

  5. Atmospheric concentrations, sources and gas-particle partitioning of PAHs in Beijing after the 29th Olympic Games.

    PubMed

    Ma, Wan-Li; Sun, De-Zhi; Shen, Wei-Guo; Yang, Meng; Qi, Hong; Liu, Li-Yan; Shen, Ji-Min; Li, Yi-Fan

    2011-07-01

    A comprehensive sampling campaign was carried out to study atmospheric concentrations of polycyclic aromatic hydrocarbons (PAHs) in Beijing and to evaluate the effectiveness of source-control strategies in reducing PAH pollution after the 29th Olympic Games. The sub-cooled liquid vapor pressure (log P(L)(o))-based model and the octanol-air partition coefficient (K(oa))-based model were applied to each seasonal dataset. Regression analysis among log K(P), log P(L)(o) and log K(oa) showed highly significant correlations for all four seasons. Source factors were identified by principal component analysis and their contributions were then estimated by multiple linear regression. Pyrogenic sources and coke-oven emissions were identified as the major sources in both the non-heating and heating seasons. Compared with values reported in the literature, the mean PAH concentrations after the 29th Olympic Games were more than 60% lower than before, indicating that the source-control measures were effective in reducing PAH pollution in Beijing. Copyright © 2011 Elsevier Ltd. All rights reserved.
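    The PCA-plus-multiple-linear-regression step can be sketched generically: extract factor scores from the species matrix, then regress the total concentration on those scores. The data below are synthetic, not the study's measurements:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Synthetic species matrix: two hidden sources mixing six PAH-like species
profiles = np.array([[5, 4, 3, 0.5, 0.2, 0.1],
                     [0.2, 0.5, 1, 3, 4, 5]], dtype=float)
contrib = rng.lognormal(0.0, 0.5, size=(200, 2))   # daily source strengths
species = contrib @ profiles + rng.normal(0, 0.05, (200, 6))
total = species.sum(axis=1)                         # total PAH per sample

scores = PCA(n_components=2).fit_transform(species)  # factor scores
mlr = LinearRegression().fit(scores, total)
r2 = mlr.score(scores, total)   # variance of total PAHs explained by factors
```

    In practice the scores are converted to absolute principal component scores (APCS) so that the regression coefficients translate into percent contributions per source.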

  6. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between open source and commercial software for quantitative image acquisition, analysis and visualization. In addition to the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. One must also consider the reproducibility with which multiple people can generate results using the same software for the same analysis, how methods built on the software can be distributed to the community, and the potential for automation to improve productivity.

  7. Crawler Solids Unknown Analysis

    NASA Technical Reports Server (NTRS)

    Frandsen, Athela

    2016-01-01

    Crawler Transporter (CT) #2 has been undergoing refurbishment to carry the Space Launch System (SLS). After returning to normal operation, multiple filters of the gearbox lubrication system failed/clogged and went on bypass during a test run to the launch pad. Analysis of the filters was performed largely with polarized light microscopy (PLM) to identify the filter contaminants and their source of origin.

  8. Design analysis tracking and data relay satellite simulation system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design and development of the equipment necessary to simulate the S-band multiple access link between user spacecraft, the Tracking and Data Relay Satellite, and a ground control terminal are discussed. The core of the S-band multiple access concept is the use of an Adaptive Ground Implemented Phased Array. The array contains thirty channels and provides the multiplexing and demultiplexing equipment required to demonstrate the ground implemented beam forming feature. The system provided will make it possible to demonstrate the performance of a desired user and ten interfering sources attempting to pass data through the multiple access system.

  9. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  10. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  11. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  12. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  13. 46 CFR 111.10-5 - Multiple energy sources.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  14. Decoding emotional valence from electroencephalographic rhythmic activity.

    PubMed

    Celikkanat, Hande; Moriya, Hiroki; Ogawa, Takeshi; Kauppi, Jukka-Pekka; Kawanabe, Motoaki; Hyvarinen, Aapo

    2017-07-01

    We attempt to decode emotional valence from electroencephalographic rhythmic activity in a naturalistic setting. We employ a data-driven method developed in a previous study, Spectral Linear Discriminant Analysis, to discover the relationships between the classification task and independent neuronal sources, optimally utilizing multiple frequency bands. A detailed investigation of the classifier provides insight into the neuronal sources related to emotional valence and into individual differences among subjects in processing emotions. Our findings show that: (1) sources whose locations are similar across subjects are consistently involved in emotional responses, with the involvement of parietal sources being especially significant, and (2) even though the locations of the involved neuronal sources are consistent, subjects can display highly varying degrees of valence-related EEG activity in those sources.
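    Spectral LDA as described is specific to the cited study, but the underlying idea, classifying trials from band-limited power features with a linear discriminant, can be sketched on synthetic signals (all parameters here are illustrative):

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs, n_trials, n_samp = 250, 60, 500
rng = np.random.default_rng(2)

def trial(f_dominant):
    """One noisy EEG-like trial dominated by a single rhythm."""
    t = np.arange(n_samp) / fs
    return np.sin(2 * np.pi * f_dominant * t) + 0.5 * rng.standard_normal(n_samp)

# Class 0: alpha-dominated (10 Hz); class 1: beta-dominated (20 Hz)
X_raw = [trial(10) for _ in range(n_trials)] + [trial(20) for _ in range(n_trials)]
y = np.array([0] * n_trials + [1] * n_trials)

def band_power(sig, lo, hi):
    f, p = welch(sig, fs=fs, nperseg=256)
    return p[(f >= lo) & (f < hi)].sum()

# Two spectral features per trial: alpha (8-13 Hz) and beta (13-30 Hz) power
X = np.array([[band_power(s, 8, 13), band_power(s, 13, 30)] for s in X_raw])
acc = LinearDiscriminantAnalysis().fit(X, y).score(X, y)  # training accuracy
```

    The actual method learns spectral weights jointly with the discriminant rather than using fixed bands, which is what lets it utilize multiple frequency bands optimally.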

  15. MSAViewer: interactive JavaScript visualization of multiple sequence alignments.

    PubMed

    Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E; Rost, Burkhard; Goldberg, Tatyana

    2016-11-15

    The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is 'web ready': written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh. © The Author 2016. Published by Oxford University Press.

  16. MSAViewer: interactive JavaScript visualization of multiple sequence alignments

    PubMed Central

    Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E.; Rost, Burkhard; Goldberg, Tatyana

    2016-01-01

    Summary: The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is ‘web ready’: written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. Availability and Implementation: The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh PMID:27412096

  17. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems that involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations whose analyses are computationally expensive and span multiple disciplines. We propose a new testing procedure that provides a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  18. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that focus specifically on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits with multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was thus implemented and tested using simulations, and its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP .
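    ROCKETSHIP itself is MATLAB; the kind of kinetic-model fit it performs can be sketched in Python with the standard Tofts model, fitting Ktrans and ve to a tissue concentration curve. All parameter values below are hypothetical and the arterial input function coefficients are purely illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 101)            # time in minutes
dt = t[1] - t[0]
# Biexponential arterial input function (illustrative coefficients)
cp = 3.99 * np.exp(-0.144 * t) + 4.78 * np.exp(-0.011 * t)

def tofts(t, ktrans, ve):
    """Standard Tofts model: Ct(t) = Ktrans * (Cp conv exp(-Ktrans/ve * t))."""
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

ct = tofts(t, 0.25, 0.40)                      # synthetic tissue curve
(ktrans_fit, ve_fit), _ = curve_fit(tofts, t, ct, p0=[0.1, 0.2],
                                    bounds=([1e-4, 1e-3], [2.0, 1.0]))
```

    Nested-model analysis, as in the paper, would compare this fit against simpler models (e.g. fixing ve) with an F-test to pick the model the data actually support.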

  19. Source Apportionment and Risk Assessment of Emerging Contaminants: An Approach of Pharmaco-Signature in Water Systems

    PubMed Central

    Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.

    2015-01-01

    This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of data points has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zone. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375

  20. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed

    Stout, N; Bell, C

    1991-06-01

    The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports, 32%. Variations by state and value added through the use of multiple sources are presented and discussed. This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems.
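    When two sources overlap, their capture rates can be estimated against a total universe inferred by two-source capture-recapture. A minimal sketch using the Chapman estimator, with hypothetical counts rather than the studies' data:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased two-source capture-recapture estimator.
    n1, n2: cases found by each source; m: cases found by both sources."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical: death certificates find 81 cases, medical examiner
# records find 61, and 50 cases appear in both sources
n_hat = chapman_estimate(81, 61, 50)   # estimated universe of fatalities
capture_rate_source1 = 81 / n_hat      # cf. the ~81% rate for certificates
```

    Capture-recapture assumes the two sources ascertain cases independently; when that fails (as it often does for administrative records), the estimate of the universe is biased, which is one reason the studies compare against a multi-source universe instead.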

  1. Effectiveness of source documents for identifying fatal occupational injuries: a synthesis of studies.

    PubMed Central

    Stout, N; Bell, C

    1991-01-01

    BACKGROUND: The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. METHODS: At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. RESULTS: The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports, 32%. Variations by state and value added through the use of multiple sources are presented and discussed. CONCLUSIONS: This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems. PMID:1827569

  2. Apportionment of urban aerosol sources in Cork (Ireland) by synergistic measurement techniques.

    PubMed

    Dall'Osto, Manuel; Hellebust, Stig; Healy, Robert M; O'Connor, Ian P; Kourtchev, Ivan; Sodeau, John R; Ovadnevaite, Jurgita; Ceburnis, Darius; O'Dowd, Colin D; Wenger, John C

    2014-09-15

    The sources of ambient fine particulate matter (PM2.5) during wintertime at a background urban location in Cork city (Ireland) have been determined. Aerosol chemical analyses were performed by multiple techniques including on-line high resolution aerosol time-of-flight mass spectrometry (Aerodyne HR-ToF-AMS), on-line single particle aerosol time-of-flight mass spectrometry (TSI ATOFMS), on-line elemental carbon-organic carbon analysis (Sunset_EC-OC), and off-line gas chromatography/mass spectrometry and ion chromatography analysis of filter samples collected at 6-h resolution. Positive matrix factorization (PMF) has been carried out to better elucidate aerosol sources not clearly identified when analyzing results from individual aerosol techniques on their own. Two datasets have been considered: on-line measurements averaged over 2-h periods, and both on-line and off-line measurements averaged over 6-h periods. Five aerosol sources were identified by PMF in both datasets, with excellent agreement between the two solutions: (1) regional domestic solid fuel burning--"DSF_Regional," 24-27%; (2) local urban domestic solid fuel burning--"DSF_Urban," 22-23%; (3) road vehicle emissions--"Traffic," 15-20%; (4) secondary aerosols from regional anthropogenic sources--"SA_Regional" 9-13%; and (5) secondary aged/processed aerosols related to urban anthropogenic sources--"SA_Urban," 21-26%. The results indicate that, despite regulations for restricting the use of smoky fuels, solid fuel burning is the major source (46-50%) of PM2.5 in wintertime in Cork, and also likely other areas of Ireland. Whilst wood combustion is strongly associated with OC and EC, it was found that peat and coal combustion is linked mainly with OC and the aerosol from these latter sources appears to be more volatile than that produced by wood combustion. Ship emissions from the nearby port were found to be mixed with the SA_Regional factor. 
The PMF analysis allowed us to link the AMS cooking organic aerosol factor (AMS_PMF_COA) to oxidized organic aerosol, chloride and locally produced nitrate, indicating that AMS_PMF_COA cannot be attributed to primary cooking emissions only. Overall, there are clear benefits from factor analysis applied to results obtained from multiple techniques, which allows better association of aerosols with sources and atmospheric processes. Copyright © 2014 Elsevier B.V. All rights reserved.
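    PMF solves a weighted non-negative matrix factorization; as a rough, unweighted stand-in, scikit-learn's NMF splits a sample-by-species matrix into non-negative source contributions and source profiles. The data and factor labels below are synthetic, not the Cork measurements:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
true_profiles = np.array([[4.0, 1.0, 0.1, 0.1],    # e.g. solid-fuel-like
                          [0.1, 0.5, 3.0, 2.0]])   # e.g. traffic-like
true_contrib = rng.lognormal(0, 0.4, (150, 2))
X = true_contrib @ true_profiles + rng.uniform(0, 0.02, (150, 4))

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)          # source contributions per sample
F = model.components_               # source profiles (species signatures)
# Fraction of total measured mass attributed to each factor
share = (G * F.sum(axis=1)).sum(axis=0) / X.sum()
```

    Real PMF additionally weights each data point by its measurement uncertainty and enforces rotational constraints, which is what makes the retrieved factors physically interpretable as sources.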

  3. Observations of discrete magnetosonic waves off the magnetic equator

    DOE PAGES

    Zhima, Zeren; Chen, Lunjin; Fu, Huishan; ...

    2015-11-23

    Fast mode magnetosonic waves are typically confined close to the magnetic equator and exhibit harmonic structures at multiples of the local, equatorial proton cyclotron frequency. Here, we report observations of magnetosonic waves well off the equator at geomagnetic latitudes from −16.5° to −17.9° and L shells of ~2.7–4.6. The observed waves exhibit discrete spectral structures with multiple frequency spacings. The predominant frequency spacings are ~6 and 9 Hz, neither of which is equal to the local proton cyclotron frequency. Backward ray tracing simulations show that the feature of multiple frequency spacings is caused by propagation from two spatially narrow equatorial source regions located at L ≈ 4.2 and 3.7. The equatorial proton cyclotron frequencies at those two locations match the two observed frequency spacings. Finally, our analysis provides the first observations of the harmonic nature of magnetosonic waves well away from the equatorial region and suggests that propagation from multiple equatorial sources contributes to these off-equatorial magnetosonic emissions with varying frequency spacings.
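    The match between the observed frequency spacings and the equatorial proton cyclotron frequencies at the inferred source L shells can be checked with a simple dipole-field estimate, f_cp = qB/(2π m_p) with B = B0/L³ at the equator (B0 ≈ 3.11 × 10⁻⁵ T):

```python
import math

Q_P, M_P, B0 = 1.602e-19, 1.673e-27, 3.11e-5   # C, kg, T (Earth dipole)

def equatorial_proton_cyclotron_hz(L):
    """Proton cyclotron frequency at the dipole magnetic equator for L shell L."""
    B = B0 / L**3                      # equatorial dipole field strength
    return Q_P * B / (2 * math.pi * M_P)

f_outer = equatorial_proton_cyclotron_hz(4.2)   # source at L ~ 4.2
f_inner = equatorial_proton_cyclotron_hz(3.7)   # source at L ~ 3.7
```

    The estimate gives roughly 6 Hz at L ≈ 4.2 and 9 Hz at L ≈ 3.7, consistent with the two reported frequency spacings.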

  4. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    USGS Publications Warehouse

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality include a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCL), performing trends tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
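    The MCL-exceedance counting step can be sketched with pandas: flag each sample against its analyte's threshold, then count exceedances per site and analyte. The thresholds and sample values below are hypothetical, not the Idaho dataset:

```python
import pandas as pd

mcl = {"nitrate": 10.0, "arsenic": 0.010}   # mg/L, illustrative thresholds

samples = pd.DataFrame({
    "site":    ["A", "A", "A", "B", "B", "C"],
    "analyte": ["nitrate", "nitrate", "arsenic", "nitrate", "arsenic", "nitrate"],
    "value":   [12.3, 8.1, 0.004, 15.0, 0.02, 3.2],
})
# Flag each sample against the MCL for its analyte
samples["exceeds"] = samples["value"] > samples["analyte"].map(mcl)
# Count exceedances per site-analyte pair
exceed_counts = (samples.groupby(["site", "analyte"])["exceeds"]
                 .sum().reset_index(name="n_exceed"))
```

    Trend testing on the same long-format table would typically apply a Mann-Kendall test per site-analyte series rather than a simple count.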

  5. Open source tools for the information theoretic analysis of neural data.

    PubMed

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
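    The core quantity these toolboxes estimate is the mutual information between stimulus and response; for discrete data the plug-in estimate follows directly from the joint histogram. A minimal sketch (the toolboxes themselves add bias corrections that this omits):

```python
import numpy as np

def mutual_information_bits(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0                      # avoid log(0) on empty cells
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Stimulus perfectly determines a binary response: I = H(X) = 1 bit
stim = np.array([0, 0, 1, 1] * 50)
resp = stim.copy()
i_dep = mutual_information_bits(stim, resp)                 # 1.0 bit
i_ind = mutual_information_bits(stim, np.zeros_like(stim))  # 0.0 bits
```

    With limited trials this naive estimator is biased upward, which is exactly why the reviewed toolboxes implement shuffling and analytic bias corrections.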

  6. Design of a transportable high efficiency fast neutron spectrometer

    DOE PAGES

    Roecker, C.; Bernstein, A.; Bowden, N. S.; ...

    2016-04-12

    A transportable fast neutron detection system has been designed and constructed for measuring neutron energy spectra and flux ranging from tens to hundreds of MeV. The transportability of the spectrometer reduces the detector-related systematic bias between different neutron spectra and flux measurements, which allows for the comparison of measurements above or below ground. The spectrometer will measure neutron fluxes that are of prohibitively low intensity compared to the site-specific background rates targeted by other transportable fast neutron detection systems. To measure low intensity high-energy neutron fluxes, a conventional capture-gating technique is used for measuring neutron energies above 20 MeV and a novel multiplicity technique is used for measuring neutron energies above 100 MeV. The spectrometer is composed of two Gd-containing plastic scintillator detectors arranged around a lead spallation target. To calibrate and characterize the position-dependent response of the spectrometer, a Monte Carlo model was developed and used in conjunction with experimental data from gamma ray sources. Multiplicity event identification algorithms were developed and used with a Cf-252 neutron multiplicity source to validate the Monte Carlo model's Gd concentration and secondary neutron capture efficiency. The validated Monte Carlo model was used to predict an effective area for the multiplicity and capture-gating analyses. For incident neutron energies between 100 MeV and 1000 MeV with an isotropic angular distribution, the multiplicity analysis predicted an effective area rising from 500 cm2 to 5000 cm2. For neutron energies above 20 MeV, the capture-gating analysis predicted an effective area between 1800 cm2 and 2500 cm2. The multiplicity mode was found to be sensitive to the incident neutron angular distribution.
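    The multiplicity technique hinges on grouping capture pulses that arrive within a coincidence window and histogramming the resulting event sizes. A schematic sketch of that event-building step (the window length here is hypothetical, not the instrument's setting):

```python
from collections import Counter

def multiplicity_histogram(timestamps_us, window_us=30.0):
    """Group time-sorted pulses whose inter-arrival gaps are below the
    coincidence window and histogram the resulting event multiplicities."""
    ts = sorted(timestamps_us)
    if not ts:
        return Counter()
    sizes, current = [], 1
    for prev, nxt in zip(ts, ts[1:]):
        if nxt - prev <= window_us:
            current += 1            # same event: pulse joins the cluster
        else:
            sizes.append(current)   # gap too long: close the event
            current = 1
    sizes.append(current)
    return Counter(sizes)

# One triple, one double, two singles (microsecond timestamps)
hist = multiplicity_histogram([0.0, 10.0, 25.0, 500.0, 510.0, 2000.0, 9000.0])
```

    High-multiplicity events (many captures per spallation) tag the >100 MeV incident neutrons, which is what lets the instrument separate them from the lower-energy capture-gated population.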

  8. HerMES: ALMA Imaging of Herschel-selected Dusty Star-forming Galaxies

    NASA Astrophysics Data System (ADS)

    Bussmann, R. S.; Riechers, D.; Fialkov, A.; Scudder, J.; Hayward, C. C.; Cowley, W. I.; Bock, J.; Calanog, J.; Chapman, S. C.; Cooray, A.; De Bernardis, F.; Farrah, D.; Fu, Hai; Gavazzi, R.; Hopwood, R.; Ivison, R. J.; Jarvis, M.; Lacey, C.; Loeb, A.; Oliver, S. J.; Pérez-Fournon, I.; Rigopoulou, D.; Roseboom, I. G.; Scott, Douglas; Smith, A. J.; Vieira, J. D.; Wang, L.; Wardlow, J.

    2015-10-01

    The Herschel Multi-tiered Extragalactic Survey (HerMES) has identified large numbers of dusty star-forming galaxies (DSFGs) over a wide range in redshift. A detailed understanding of these DSFGs is hampered by the limited spatial resolution of Herschel. We present 870 μm imaging at 0.″45 resolution obtained with the Atacama Large Millimeter/submillimeter Array (ALMA) of a sample of 29 HerMES DSFGs whose far-infrared (FIR) flux densities lie between the brightest of the sources found by Herschel and the fainter DSFGs found via ground-based surveys in the submillimeter region. The ALMA imaging reveals that these DSFGs comprise a total of 62 sources (down to the 5σ point-source sensitivity limit in our ALMA sample; σ ≈ 0.2 mJy). Optical or near-infrared imaging indicates that 36 of the ALMA sources experience a significant flux boost from gravitational lensing (μ > 1.1), but only six are strongly lensed and show multiple images. We introduce and make use of uvmcmcfit, a general-purpose and publicly available Markov chain Monte Carlo visibility-plane analysis tool, to analyze the source properties. Combined with our previous work on brighter Herschel sources, the lens models presented here tentatively favor intrinsic number counts for DSFGs with a break near 8 mJy at 880 μm and a steep fall-off at higher flux densities. Nearly 70% of the Herschel sources break down into multiple ALMA counterparts, consistent with previous research indicating that the multiplicity rate is high in bright sources discovered in single-dish submillimeter or FIR surveys. The ALMA counterparts to our Herschel targets are located significantly closer to each other than the ALMA counterparts to sources found in the LABOCA ECDFS Submillimeter Survey. Theoretical models underpredict the excess number of sources with small separations seen in our ALMA sample. 
The high multiplicity rate and small projected separations between sources seen in our sample argue in favor of interactions and mergers plausibly driving both the prodigious emission from the brightest DSFGs and the sharp downturn above S880 = 8 mJy. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.

  9. Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows

    NASA Astrophysics Data System (ADS)

    Jittamai, Phongchai

    This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, along with a solution methodology that computes an input schedule minimizing the total violation of delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm runs in O(T·E) time. This dissertation also extends the study to examine operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm, modified from the one used for single-source pipeline problems, is introduced; it also runs in O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions.
Only 25% of the tested problems exceeded the optimal values by more than 30%, and approximately 40% were solved optimally by the algorithms.

  10. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques

    PubMed Central

    Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.

    2016-01-01

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk assessment by ICF showed significant potential mobility and bioavailability for Cu, Cr and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934
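The ICF and GCF indices used in this record are simple ratios over the fractionation results: the non-residual (mobile) fractions divided by the residual fraction per metal, summed across metals for the site. A minimal sketch, assuming a hypothetical three-step non-residual scheme and illustrative concentrations (fraction names and numbers are not the study's data):

```python
# Hypothetical fractionation results (mg/kg) per metal:
# (exchangeable, reducible, oxidizable, residual)
fractions = {
    "Cu": (0.8, 1.2, 1.5, 1.0),
    "Cr": (0.5, 0.9, 1.1, 1.2),
    "Ni": (0.3, 0.4, 0.6, 0.9),
}

def icf(exch, red, oxi, residual):
    """Individual contamination factor: non-residual over residual."""
    return (exch + red + oxi) / residual

# One ICF per metal; the global contamination factor sums them for a site.
icfs = {metal: icf(*vals) for metal, vals in fractions.items()}
gcf = sum(icfs.values())
```

Higher ICF values indicate greater potential mobility and bioavailability of a metal, which is how the abstract ranks Cu, Cr and Ni.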

  11. Research and Analysis on the Localization of a 3-D Single Source in Lossy Medium Using Uniform Circular Array

    PubMed Central

    Xue, Bing; Qu, Xiaodong; Fang, Guangyou; Ji, Yicai

    2017-01-01

    In this paper, methods and analysis for estimating the location of a three-dimensional (3-D) single source buried in a lossy medium are presented using a uniform circular array (UCA). A mathematical model of the signal in the lossy medium is proposed. Using information in the covariance matrix obtained from the sensors’ outputs, equations for the source location (azimuth angle, elevation angle, and range) are obtained. Then, the phase and amplitude of the covariance matrix function are used to perform source localization in the lossy medium. By analyzing the characteristics of the proposed methods and the multiple signal classification (MUSIC) method, the computational complexity and the valid scope of these methods are given. From the results, whether the loss is known or not, the best method can be chosen for each case (localization in a lossless or a lossy medium). PMID:28574467

  12. Methods for characterizing subsurface volatile contaminants using in-situ sensors

    DOEpatents

    Ho, Clifford K [Albuquerque, NM]

    2006-02-21

    An inverse analysis method for characterizing diffusion of vapor from an underground source of volatile contaminant using data taken by an in-situ sensor. The method uses one-dimensional solutions to the diffusion equation in Cartesian, cylindrical, or spherical coordinates for isotropic and homogeneous media. If the effective vapor diffusion coefficient is known, then the distance from the source to the in-situ sensor can be estimated by comparing the shape of the predicted time-dependent vapor concentration response curve to the measured response curve. Alternatively, if the source distance is known, then the effective vapor diffusion coefficient can be estimated using the same inverse analysis method. A triangulation technique can be used with multiple sensors to locate the source in two or three dimensions. The in-situ sensor can contain one or more chemiresistor elements housed in a waterproof enclosure with a gas-permeable membrane.
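The distance-estimation idea in this record can be sketched as a curve-shape fit. Assuming the simple 1-D Cartesian constant-source solution C(x,t) = C0·erfc(x / 2√(Dt)) (one of the geometries the patent mentions) and synthetic noiseless data, a grid search recovers the source-to-sensor distance:

```python
import math

def concentration(x, t, d_eff, c0=1.0):
    """1-D Cartesian diffusion from a constant source: C0 * erfc(x / 2*sqrt(D*t))."""
    return c0 * math.erfc(x / (2.0 * math.sqrt(d_eff * t)))

def estimate_distance(times, measured, d_eff, candidates):
    """Grid-search the distance whose predicted response curve best
    matches the measured curve in a least-squares sense."""
    def sse(x):
        return sum((concentration(x, t, d_eff) - m) ** 2
                   for t, m in zip(times, measured))
    return min(candidates, key=sse)

# Synthetic check: data generated at x = 2.0 m should be recovered.
d_eff = 1e-6                                   # m^2/s, assumed coefficient
times = [i * 86400.0 for i in range(1, 30)]    # daily samples over a month
measured = [concentration(2.0, t, d_eff) for t in times]
x_hat = estimate_distance(times, measured, d_eff,
                          [0.5 + 0.1 * i for i in range(40)])
```

The inverse direction described in the abstract (known distance, unknown diffusion coefficient) is the same search with the roles of `x` and `d_eff` swapped.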

  13. 'I feel like a salesperson': the effect of multiple-source care funding on the experiences and views of nursing home nurses in England.

    PubMed

    Thompson, Juliana; Cook, Glenda; Duschinsky, Robbie

    2015-06-01

    The difficulties faced in the recruitment and retention of nursing staff in nursing homes for older people are an international challenge. It is therefore essential that the causes of nurses' reluctance to work in these settings are determined. This paper considers the influence that multiple-source care funding issues have on nursing home nurses' experiences and views regarding the practice and appeal of the role. The methodology for this study was hermeneutic phenomenology. Thirteen nurses from seven nursing homes in the North East of England were interviewed in a sequence of up to five interviews and data were analysed using a literary analysis method. Findings indicate that participants are uncomfortable with the business aspects that funding issues bring to their role. The primary difficulties faced are: tensions between care issues and funding issues; challenges associated with 'selling beds'; and coping with self-funding residents' changing expectations of care. The findings of the study suggest that multiple-source care funding systems that operate in nursing homes for older people pose challenges to nursing home nurses. Some of these challenges may impact on their recruitment and retention. © 2014 John Wiley & Sons Ltd.

  14. An optimal merging technique for high-resolution precipitation products: OPTIMAL MERGING OF PRECIPITATION METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Roshan; Houser, Paul R.; Anantharaj, Valentine G.

    2011-04-01

    Precipitation products are currently available from various sources at higher spatial and temporal resolution than at any time in the past. Each of the precipitation products has its strengths and weaknesses in availability, accuracy, resolution, retrieval techniques and quality control. By merging the precipitation data obtained from multiple sources, the information content can be improved by minimizing these issues. However, precipitation data merging poses challenges of scale mismatch and of accurate error and bias assessment. In this paper we present Optimal Merging of Precipitation (OMP), a new method to merge precipitation data from multiple sources that are of different spatial and temporal resolutions and accuracies. This method is a combination of scale conversion and merging-weight optimization, involving performance tracing based on Bayesian statistics and trend analysis, which yields merging weights for each precipitation data source. The weights are optimized at multiple scales to facilitate multiscale merging and better precipitation downscaling. Precipitation data used in the experiment include products from the 12-km resolution North American Land Data Assimilation System (NLDAS), the 8-km resolution CMORPH and the 4-km resolution National Stage-IV QPE. The test cases demonstrate that the OMP method is capable of identifying better data sources and allocating them a higher priority in the merging procedure, dynamically over the region and time period. This method is also effective in filtering out poor-quality data introduced into the merging process.
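The weighting idea can be illustrated with a simplified stand-in for the OMP performance-tracing weights: inverse-error-variance weights over co-registered fields. The sources, grid, and error scores below are hypothetical, and the real method also handles scale conversion, which is not shown:

```python
import numpy as np

def merge_precip(estimates, errors):
    """Merge co-registered precipitation fields from several sources,
    weighting each source by the inverse of its squared error score."""
    w = 1.0 / np.asarray(errors, dtype=float) ** 2
    w /= w.sum()                       # normalized merging weights
    return np.tensordot(w, np.asarray(estimates, dtype=float), axes=1)

# Three hypothetical sources on a common 2x2 grid (mm/h), with RMSE scores.
fields = [[[1.0, 2.0], [3.0, 4.0]],
          [[1.2, 2.2], [2.8, 4.1]],
          [[0.9, 1.8], [3.3, 3.9]]]
merged = merge_precip(fields, errors=[0.5, 1.0, 2.0])
```

The lowest-error source dominates the merged field, which mirrors the abstract's point that better data sources receive higher priority.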

  15. Analysis of Critical Parts and Materials

    DTIC Science & Technology

    1980-12-01

    Recommended actions (partially recovered from the report's tables): large orders; long lead procurement funding (including raw materials and facility funding); manpower analysis and training; manual ordering of some critical parts; more active role in schedule negotiation; multiple source procurements; multi-year program funding; order spares with original order; incentives; better capital investment.

  16. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g., economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed.
Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  17. Evaluation of sensor, environment and operational factors impacting the use of multiple sensor constellations for long term resource monitoring

    NASA Astrophysics Data System (ADS)

    Rengarajan, Rajagopalan

    Moderate resolution remote sensing data offer the potential to monitor long- and short-term trends in the condition of the Earth's resources at finer spatial scales and over longer time periods. While improved calibration (radiometric and geometric), free access (Landsat, Sentinel, CBERS), and higher-level products in reflectance units have made it easier for the science community to derive biophysical parameters from these remotely sensed data, a number of issues still affect the analysis of multi-temporal datasets. These are primarily due to sources that are inherent in the process of imaging from single or multiple sensors. Some of these undesired or uncompensated sources of variation include variation in the view angles, illumination angles, atmospheric effects, and sensor effects such as Relative Spectral Response (RSR) variation between different sensors. The complex interaction of these sources of variation would make their study extremely difficult if not impossible with real data, and therefore a simulated-analysis approach is used in this study. A synthetic forest canopy is produced using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and its measured BRDFs are modeled using the Ross-Li canopy BRDF model. The simulated BRDF matches the real data to within 2% of the reflectance in the red and NIR spectral bands studied. The BRDF modeling process is extended to model and characterize the defoliation of a forest, which is used in factor sensitivity studies to estimate the effect of each factor for varying environment and sensor conditions. Finally, a factorial experiment is designed to understand the significance of the sources of variation, and regression-based analyses are performed to understand the relative importance of the factors.
The design of experiment and the sensitivity analysis conclude that the atmospheric attenuation and variations due to the illumination angles are the dominant sources impacting the at-sensor radiance.

  18. Multiple-generator errors are unavoidable under model misspecification.

    PubMed

    Jewett, D L; Zhang, Z

    1995-08-01

    Model misspecification poses a major problem for dipole source localization (DSL) because it causes insidious multiple-generator errors (MulGenErrs) to occur in the fitted dipole parameters. This paper describes how and why this occurs, based upon simple algebraic considerations. MulGenErrs must occur, to some degree, in any DSL analysis of real data because there is model misspecification and mathematically the equations used for the simultaneously active generators must be of a different form than the equations for each generator active alone.

  19. Combining results of multiple search engines in proteomics.

    PubMed

    Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W

    2013-09-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.
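One simple way to combine engine outputs, assuming independent error rates across engines, is a probabilistic union of per-engine peptide-spectrum-match probabilities (real tools reviewed in this area, such as iProphet, model inter-engine dependence more carefully; the scores and identifiers below are hypothetical):

```python
from collections import defaultdict

def combine_psm_probabilities(engine_results):
    """Combine PSM probabilities from several engines under a naive
    independence assumption: p_combined = 1 - prod(1 - p_i)."""
    miss_prob = defaultdict(lambda: 1.0)   # probability every engine is wrong
    for results in engine_results:
        for (spectrum, peptide), p in results.items():
            miss_prob[(spectrum, peptide)] *= (1.0 - p)
    return {key: 1.0 - q for key, q in miss_prob.items()}

# Hypothetical scores for one spectrum-peptide match from three engines.
engines = [
    {("scan_0421", "LVNEVTEFAK"): 0.90},
    {("scan_0421", "LVNEVTEFAK"): 0.80},
    {("scan_0421", "LVNEVTEFAK"): 0.50},
]
combined = combine_psm_probabilities(engines)
```

A match supported by several engines thus ends with a higher combined probability than any single engine assigned it, which is the intuition behind the improved analysis the abstract describes.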

  20. Combining Results of Multiple Search Engines in Proteomics*

    PubMed Central

    Shteynberg, David; Nesvizhskii, Alexey I.; Moritz, Robert L.; Deutsch, Eric W.

    2013-01-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques. PMID:23720762

  1. Valid Statistical Analysis for Logistic Regression with Multiple Sources

    NASA Astrophysics Data System (ADS)

    Fienberg, Stephen E.; Nardi, Yuval; Slavković, Aleksandra B.

    Considerable effort has gone into understanding issues of privacy protection of individual information in single databases, and various solutions have been proposed depending on the nature of the data, the ways in which the database will be used and the precise nature of the privacy protection being offered. Once data are merged across sources, however, the nature of the problem becomes far more complex and a number of privacy issues arise for the linked individual files that go well beyond those that are considered with regard to the data within individual sources. In this paper, we propose an approach that provides a full statistical analysis of the combined database without actually combining it. We focus mainly on logistic regression, but the method and tools described may be applied to essentially any other statistical model as well.
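The "analysis without combining" idea rests on the fact that the logistic score and information are sums over rows, so each source can contribute local aggregates to a shared Newton-Raphson step and the result equals the pooled fit. A plain, non-secure numpy sketch with simulated sources (the paper's actual protocol adds privacy-preserving summation, which is not shown here):

```python
import numpy as np

def newton_step_shares(X, y, beta):
    """One source's contribution to a Newton-Raphson step: its local
    gradient and Hessian of the logistic log-likelihood. Only these
    aggregates, not the individual rows, leave the source."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    return grad, hess

def distributed_logistic(sources, n_features, n_iter=25):
    beta = np.zeros(n_features)
    for _ in range(n_iter):
        grads, hessians = zip(*(newton_step_shares(X, y, beta)
                                for X, y in sources))
        beta = beta + np.linalg.solve(sum(hessians), sum(grads))
    return beta

# Two hypothetical sources holding disjoint rows of the same study.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * X[:, 1])))).astype(float)
parts = [(X[:100], y[:100]), (X[100:], y[100:])]
beta_hat = distributed_logistic(parts, n_features=2)
```

Because gradients and Hessians add, fitting on the two shares converges to the same coefficients as fitting on the pooled data.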

  2. A model-based analysis of extinction ratio effects on phase-OTDR distributed acoustic sensing system performance

    NASA Astrophysics Data System (ADS)

    Aktas, Metin; Maral, Hakan; Akgun, Toygar

    2018-02-01

    Extinction ratio is an inherent limiting factor that has a direct effect on the detection performance of phase-OTDR based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These signal acquisition scenarios are constructed to represent typically observed cases such as multiple vibration sources cluttered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths and varying ADC bit resolutions. Results show that an insufficient extinction ratio can result in a high optical noise floor and effectively mask the effects of elaborate system improvement efforts.

  3. Density estimation in tiger populations: combining information for strong inference

    USGS Publications Warehouse

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.

    2012-01-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.

  4. Density estimation in tiger populations: combining information for strong inference.

    PubMed

    Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W

    2012-07-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 +/- 1.95 tigers/100 km2 [posterior mean +/- SD]) relative to a model that utilized only one data source (photographic, 12.02 +/- 3.02 tigers/100 km2 and fecal DNA, 6.65 +/- 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
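A back-of-envelope analogue of the precision gain reported in these two records is an inverse-variance combination of the two independent estimates. This is not the paper's method (which fits a joint spatial capture-recapture likelihood), but it uses the abstract's own numbers and lands close to the joint estimate:

```python
import math

def inverse_variance_combine(estimates):
    """Precision-weighted combination of independent (mean, SD) estimates."""
    weights = [1.0 / sd ** 2 for _, sd in estimates]
    mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / sum(weights)
    sd = math.sqrt(1.0 / sum(weights))
    return mean, sd

photo = (12.02, 3.02)   # tigers/100 km^2, photographic data alone
fecal = (6.65, 2.37)    # tigers/100 km^2, fecal DNA data alone
mean, sd = inverse_variance_combine([photo, fecal])
```

The combined value is roughly 8.7 +/- 1.86 tigers/100 km2, near the joint model's 8.5 +/- 1.95, illustrating why accounting for both data sources tightens the estimate.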

  5. Global Situational Awareness with Free Tools

    DTIC Science & Technology

    2015-01-15

    Excerpt (recovered from briefing slides): gathers data from multiple data sources, including Snort (Snorby on Security Onion), Nagios, SharePoint RSS, and flow data, among others, and leverages standard data formats such as Keyhole Markup Language.

  6. Integration and Optimization of Alternative Sources of Energy in a Remote Region

    NASA Astrophysics Data System (ADS)

    Berberi, Pellumb; Inodnorjani, Spiro; Aleti, Riza

    2010-01-01

    In a remote coastal region, the supply of energy from the national grid is insufficient for sustainable development. Integration and optimization of local alternative renewable energy sources is one possible solution to the problem. In this paper we study the energetic potential of local sources of renewable energy (water, solar, wind and biomass). A bottom-up energy system optimization model is proposed in order to support planning policies for promoting the use of renewable energy sources. Software based on a multiple-factor and constraint analysis for optimizing energy flow is proposed, which provides detailed information on the exploitation of each source of energy, power and heat generation, GHG emissions and end-use sectors. Economic analysis shows that, with existing technologies, both stand-alone and regional facilities may be feasible. Improving specific legislation would foster investments from central or local governments as well as from individuals, private companies or small families. The study is carried out in the framework of the FP6 project "Integrated Renewable Energy System."

  7. Investigation of a complete sample of flat spectrum radio sources from the S5 survey

    NASA Astrophysics Data System (ADS)

    Eckart, A.; Witzel, A.; Biermann, P.; Johnston, K. J.; Simon, R.; Schalinski, C.; Kuhr, H.

    1986-11-01

    An analysis of 13 extragalactic sources of the S5 survey with flux densities greater than or equal to 1 Jy at 4990 MHz, mapped with milliarcsecond resolution at 1.6 and 5 GHz by means of VLBI, is presented. All sources appear to display multiple components dominated in flux density at 6 cm by a core component which is self-absorbed at 18 cm. Comparison of the measured to predicted X-ray flux density of the core radio components suggests that all sources should display bulk relativistic motion at small angles to the line of sight, and four sources show rapid changes in their radio structures which can be interpreted as apparent superluminal motion.

  8. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to measure the current at each coil in the two-coil experiment. Based on the results, we could establish the feasibility of multiple ICP sources, owing to the direct change of impedance with current and the saturation of impedance caused by the skin-depth effect. However, a helicon plasma source is difficult to adapt to multiple sources because of the continual change of real impedance due to mode transition and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  9. Effective learning strategies for real-time image-guided adaptive control of multiple-source hyperthermia applicators.

    PubMed

    Cheng, Kung-Shan; Dewhirst, Mark W; Stauffer, Paul R; Das, Shiva

    2010-03-01

    This paper investigates overall theoretical requirements for reducing the times required for the iterative learning of a real-time image-guided adaptive control routine for multiple-source heat applicators, as used in hyperthermia and thermal ablative therapy for cancer. Methods for partial reconstruction of the physical system, with and without model reduction, to find solutions within a clinically practical timeframe were analyzed. A mathematical analysis based on the Fredholm alternative theorem (FAT) was used to compactly analyze the existence and uniqueness of the optimal heating vector under two fundamental situations: (1) noiseless partial reconstruction and (2) noisy partial reconstruction. These results were coupled with a method for further acceleration of the solution using virtual source (VS) model reduction. The matrix approximation theorem (MAT) was used to choose the optimal vectors spanning the reduced-order subspace, to reduce the time for system reconstruction and to determine the associated approximation error. Numerical simulations of the adaptive control of hyperthermia using VS were also performed to test the predictions derived from the theoretical analysis. A thigh sarcoma patient model surrounded by a ten-antenna phased-array applicator was used for this purpose. The impacts of convective cooling from blood flow and of a sudden increase in perfusion in muscle and tumor were also simulated. By FAT, partial system reconstruction conducted directly in the full space of the physical variables, such as phases and magnitudes of the heat sources, cannot guarantee reconstructing the optimal system to determine the global optimal setting of the heat sources. A remedy for this limitation is to conduct the partial reconstruction within a reduced-order subspace spanned by the first few maximum eigenvectors of the true system matrix. By MAT, this VS subspace is the optimal one when the goal is to maximize the average tumor temperature.
When more than six sources are present, a nonlinear learning scheme theoretically requires fewer steps than a linear one; however, a finite number of iterative corrections is necessary within each step of a nonlinear algorithm, so the actual computational workload of a nonlinear algorithm is not necessarily less than that of a linear algorithm. Based on the analysis presented herein, obtaining a unique global optimal heating vector for a multiple-source applicator within the constraints of real-time clinical hyperthermia treatments and thermal ablative therapies appears attainable using partial reconstruction with a minimum-norm least-squares method and supplemental equations. One way to supplement equations is to include a method of model reduction.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keefer, Donald A.; Shaffer, Eric G.; Storsved, Brynne

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64-bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets.
Key functionality has been included to support a range of reservoir visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  11. Assessing the use of multiple sources in student essays.

    PubMed

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
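
    The LSA comparison described above can be sketched with a small term-sentence matrix and a truncated SVD. The sentences, the rank k=2, and the use of raw term counts below are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def build_vocab(sentences):
    # Sorted so row order is deterministic across runs.
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    return {w: i for i, w in enumerate(vocab)}

def term_matrix(sentences, vocab):
    # Raw term counts; real LSA systems typically apply tf-idf weighting.
    M = np.zeros((len(vocab), len(sentences)))
    for j, s in enumerate(sentences):
        for w in s.lower().split():
            if w in vocab:
                M[vocab[w], j] += 1
    return M

def lsa_similarities(source_sentences, student_sentence, k=2):
    vocab = build_vocab(source_sentences)
    M = term_matrix(source_sentences, vocab)
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    Uk = U[:, :k]                                  # truncated term space
    docs = Uk.T @ M                                # source sentences, reduced
    q = Uk.T @ term_matrix([student_sentence], vocab)[:, 0]
    sims = []
    for j in range(docs.shape[1]):
        denom = np.linalg.norm(q) * np.linalg.norm(docs[:, j])
        sims.append(float(q @ docs[:, j] / denom) if denom > 0 else 0.0)
    return sims

sources = [
    "the heart pumps blood through the body",
    "plants convert sunlight into chemical energy",
    "the sun heats the surface of the earth",
]
student = "blood moves through the body because the heart pumps it"
sims = lsa_similarities(sources, student)
```

    In a scoring system, the sentence with the highest cosine similarity above a threshold would be flagged as present in the essay.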

  12. Quantifying nutrient sources in an upland catchment using multiple chemical and isotopic tracers

    NASA Astrophysics Data System (ADS)

    Sebestyen, S. D.; Boyer, E. W.; Shanley, J. B.; Doctor, D. H.; Kendall, C.; Aiken, G. R.

    2006-12-01

    To explore processes that control the temporal variation of nutrients in surface waters, we measured multiple environmental tracers at the Sleepers River Research Watershed, an upland catchment in northeastern Vermont, USA. Using a set of high-frequency stream water samples, we quantified the variation of nutrients over a range of stream flow conditions with chemical and isotopic tracers of water, nitrate, and dissolved organic carbon (DOC). Stream water concentrations of nitrogen (predominantly in the forms of nitrate and dissolved organic nitrogen) and DOC reflected mixing of water contributed from distinct sources in the forested landscape. Water isotopic signatures and end-member mixing analysis revealed when solutes entered the stream from these sources and that the sources were linked to the stream by preferential shallow subsurface and overland flow paths. Results from the tracers indicated that freshly-leached, terrestrial organic matter was the overwhelming source of high DOC concentrations in stream water. In contrast, in this region where atmospheric nitrogen deposition is chronically elevated, the highest concentrations of stream nitrate were attributable to atmospheric sources that were transported via melting snow and rain fall. These findings are consistent with a conceptual model of the landscape in which coupled hydrological and biogeochemical processes interact to control stream solute variability over time.
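
    The end-member mixing analysis mentioned above amounts to solving a small linear system: each tracer contributes one mass-balance equation, and the source fractions must sum to one. The tracer values below are invented for illustration, not Sleepers River measurements:

```python
import numpy as np

# Rows: mass balance (fractions sum to 1), then one row per tracer.
# Columns: groundwater, soil water, snowmelt. All tracer values are
# hypothetical, chosen only to make the system solvable.
end_members = np.array([
    [1.0,   1.0,  1.0],    # f_gw + f_soil + f_snow = 1
    [-12.0, -8.0, -15.0],  # delta-18O of each end member (permil)
    [1.0,   8.0,  2.0],    # DOC of each end member (mg/L)
])
stream = np.array([1.0, -11.0, 3.0])  # 1, observed d18O, observed DOC

# Solve for the fraction of streamflow contributed by each source.
fractions = np.linalg.solve(end_members, stream)
```

    With these toy numbers the solve attributes most of the flow to the first end member; negative fractions would indicate that the chosen end members cannot explain the stream sample.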

  13. Methods for radiation detection and characterization using a multiple detector probe

    DOEpatents

    Akers, Douglas William; Roybal, Lyle Gene

    2014-11-04

    Apparatuses, methods, and systems relating to radiological characterization of environments are disclosed. Multi-detector probes with a plurality of detectors in a common housing may be used to substantially concurrently detect a plurality of different radiation activities and types. Multiple multi-detector probes may be used in a down-hole environment to substantially concurrently detect radioactive activity and contents of a buried waste container. Software may process, analyze, and integrate the data from the different multi-detector probes and the different detector types therein to provide source location and integrated analysis as to the source types and activity in the measured environment. Further, the integrated data may be used to compensate for differential density effects and the effects of radiation shielding materials within the volume being measured.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Potok, Thomas E

    Assessing the potential property and social impacts of an event, such as a tornado or wildfire, continues to be a challenging research area. From financial markets to disaster management to epidemiology, the importance of understanding the impacts that events create cannot be overstated. Our work describes an approach to fuse information from multiple sources, then to analyze the information cycles to identify prior temporal patterns related to the impact of an event. This approach is then applied to the analysis of news reports from multiple news sources pertaining to several different natural disasters. Results show that our approach can project the severity of the impacts of certain natural disasters, such as the effects of heat waves on droughts and wildfires. In addition, results show that specific types of disaster consistently produce similar impacts each time they occur.

  15. Evaluation of Electroencephalography Source Localization Algorithms with Multiple Cortical Sources.

    PubMed

    Bradley, Allison; Yao, Jun; Dewald, Jules; Richter, Claus-Peter

    2016-01-01

    Source localization algorithms often show multiple active cortical areas as the source of electroencephalography (EEG). Yet, there is little data quantifying the accuracy of these results. In this paper, the performance of current source density source localization algorithms for the detection of multiple cortical sources of EEG data has been characterized. EEG data were generated by simulating multiple cortical sources (2-4) with the same strength or two sources with relative strength ratios of 1:1 to 4:1, and adding noise. These data were used to reconstruct the cortical sources using current source density (CSD) algorithms: sLORETA, MNLS, and LORETA using a p-norm with p equal to 1, 1.5 and 2. Precision (percentage of the reconstructed activity corresponding to simulated activity) and Recall (percentage of the simulated sources reconstructed) of each of the CSD algorithms were calculated. While sLORETA has the best performance when only one source is present, when two or more sources are present LORETA with p equal to 1.5 performs better. When the relative strength of one of the sources is decreased, all algorithms have more difficulty reconstructing that source. However, LORETA 1.5 continues to outperform other algorithms. If only the strongest source is of interest sLORETA is recommended, while LORETA with p equal to 1.5 is recommended if two or more of the cortical sources are of interest. These results provide guidance for choosing a CSD algorithm to locate multiple cortical sources of EEG and for interpreting the results of these algorithms.
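
    The Precision and Recall measures defined in the abstract can be computed from matched source locations. The matching tolerance and the coordinates below are arbitrary choices for illustration, not values from the study:

```python
def precision_recall(simulated, reconstructed, tol=1.0):
    # simulated, reconstructed: lists of (x, y, z) source locations.
    def near(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5 <= tol
    # Recall: fraction of simulated sources matched by reconstructed activity.
    recalled = sum(any(near(s, r) for r in reconstructed) for s in simulated)
    # Precision: fraction of reconstructed activity near a simulated source.
    correct = sum(any(near(r, s) for s in simulated) for r in reconstructed)
    recall = recalled / len(simulated) if simulated else 0.0
    precision = correct / len(reconstructed) if reconstructed else 0.0
    return precision, recall

precision, recall = precision_recall(
    simulated=[(0.0, 0.0, 0.0), (5.0, 5.0, 5.0)],
    reconstructed=[(0.2, 0.0, 0.0), (9.0, 9.0, 9.0)])
```

    Here one of two reconstructed locations matches a simulated source and one of two simulated sources is recovered, so both measures are 0.5.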

  16. Evaluation of Electroencephalography Source Localization Algorithms with Multiple Cortical Sources

    PubMed Central

    Bradley, Allison; Yao, Jun; Dewald, Jules; Richter, Claus-Peter

    2016-01-01

    Background Source localization algorithms often show multiple active cortical areas as the source of electroencephalography (EEG). Yet, there is little data quantifying the accuracy of these results. In this paper, the performance of current source density source localization algorithms for the detection of multiple cortical sources of EEG data has been characterized. Methods EEG data were generated by simulating multiple cortical sources (2–4) with the same strength or two sources with relative strength ratios of 1:1 to 4:1, and adding noise. These data were used to reconstruct the cortical sources using current source density (CSD) algorithms: sLORETA, MNLS, and LORETA using a p-norm with p equal to 1, 1.5 and 2. Precision (percentage of the reconstructed activity corresponding to simulated activity) and Recall (percentage of the simulated sources reconstructed) of each of the CSD algorithms were calculated. Results While sLORETA has the best performance when only one source is present, when two or more sources are present LORETA with p equal to 1.5 performs better. When the relative strength of one of the sources is decreased, all algorithms have more difficulty reconstructing that source. However, LORETA 1.5 continues to outperform other algorithms. If only the strongest source is of interest sLORETA is recommended, while LORETA with p equal to 1.5 is recommended if two or more of the cortical sources are of interest. These results provide guidance for choosing a CSD algorithm to locate multiple cortical sources of EEG and for interpreting the results of these algorithms. PMID:26809000

  17. Multiple Sources of Prescription Payment and Risky Opioid Therapy Among Veterans.

    PubMed

    Becker, William C; Fenton, Brenda T; Brandt, Cynthia A; Doyle, Erin L; Francis, Joseph; Goulet, Joseph L; Moore, Brent A; Torrise, Virginia; Kerns, Robert D; Kreiner, Peter W

    2017-07-01

    Opioid overdose and other related harms are a major source of morbidity and mortality among US Veterans, in part due to high-risk opioid prescribing. We sought to determine whether having multiple sources of payment for opioids-as a marker for out-of-system access-is associated with risky opioid therapy among veterans. Cross-sectional study examining the association between multiple sources of payment and risky opioid therapy among all individuals with Veterans Health Administration (VHA) payment for opioid analgesic prescriptions in Kentucky during fiscal year 2014-2015. Sources of payment fell into three categories: (1) VHA as the only source of payment (sole source); (2) VHA plus at least one cash payment [VHA+cash payment(s)], whether or not there was a third source of payment; and (3) VHA plus at least one other noncash source: Medicare, Medicaid, or private insurance [VHA+noncash source(s)]. Our outcomes were 2 risky opioid therapies: combination opioid/benzodiazepine therapy and high-dose opioid therapy, defined as morphine equivalent daily dose ≥90 mg. Of the 14,795 individuals in the analytic sample, there were 81.9% in the sole source category, 6.6% in the VHA+cash payment(s) category, and 11.5% in the VHA+noncash source(s) category. In logistic regression, controlling for age and sex, persons with multiple payment sources had significantly higher odds of each risky opioid therapy, with those in the VHA+cash payment(s) group having significantly higher odds than those in the VHA+noncash source(s) group. Prescribers should examine the prescription monitoring program, as multiple payment sources increase the odds of risky opioid therapy.

  18. Broadband continuous wave source localization via pair-wise, cochleagram processing

    NASA Astrophysics Data System (ADS)

    Nosal, Eva-Marie; Frazer, L. Neil

    2005-04-01

    A pair-wise processor has been developed for the passive localization of broadband continuous-wave underwater sources. The algorithm uses sparse hydrophone arrays and does not require previous knowledge of the source signature. It is applicable in multiple source situations. A spectrogram/cochleagram version of the algorithm has been developed in order to utilize higher frequencies at longer ranges where signal incoherence, and limited computational resources, preclude the use of full waveforms. Simulations demonstrating the robustness of the algorithm with respect to noise and environmental mismatch will be presented, together with initial results from the analysis of humpback whale song recorded at the Pacific Missile Range Facility off Kauai. [Work supported by MHPCC and ONR.]

  19. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores.

    PubMed

    Chikkagoudar, Satish; Wang, Kai; Li, Mingyao

    2011-05-26

    Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
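
    GENIE's fragment scheme, as described, can be sketched as follows. The fragment size, the thread pool, and the placeholder interaction function are assumptions for illustration; the actual package computes an interaction test for binary traits on GPU or CPU cores:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def partition(snps, fragment_size):
    # Split the SNP list into non-overlapping fragments.
    return [snps[i:i + fragment_size] for i in range(0, len(snps), fragment_size)]

def interact(pair):
    # Placeholder: a real analysis would fit an interaction model for a
    # binary trait here; we simply return the pair itself.
    return pair

def all_pairs(fragments):
    # 1) pairs of SNPs within each fragment, then
    # 2) pairs between the current fragment and every later fragment.
    pairs = []
    for frag in fragments:
        pairs.extend(combinations(frag, 2))
    for i, fi in enumerate(fragments):
        for fj in fragments[i + 1:]:
            pairs.extend((a, b) for a in fi for b in fj)
    return pairs

snps = ["rs%d" % i for i in range(6)]
fragments = partition(snps, 3)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(interact, all_pairs(fragments)))
```

    Every unordered SNP pair is visited exactly once, so the within-fragment and cross-fragment work can be distributed across cores without duplication.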

  20. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    PubMed Central

    2011-01-01

    Background Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/. PMID:21615923

  1. Defining Adapted Physical Activity: International Perspectives

    ERIC Educational Resources Information Center

    Hutzler, Yeshayahu; Sherrill, Claudine

    2007-01-01

    The purpose of this study was to describe international perspectives concerning terms, definitions, and meanings of adapted physical activity (APA) as (a) activities or service delivery, (b) a profession, and (c) an academic field of study. Gergen's social constructionism, our theory, guided analysis of multiple sources of data via qualitative…

  2. A Common Framework for Multiple Sources of Bacterial Annotation

    ScienceCinema

    White, Owen

    2018-05-03

    Owen White, professor of epidemiology and preventive medicine at the University of Maryland School of Medicine and a researcher at the University of Maryland Institute for Genome Sciences, gives the May 29, 2009 keynote speech at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.

  3. Quantitation of mycotoxins using direct analysis in real time (DART)-mass spectrometry (MS)

    USDA-ARS?s Scientific Manuscript database

    Ambient ionization represents a new generation of mass spectrometry ion sources which is used for rapid ionization of small molecules under ambient conditions. The combination of ambient ionization and mass spectrometry allows analyzing multiple food samples with simple or no sample treatment, or in...

  4. Omics for aquatic ecotoxicology: Control of extraneous variability to enhance the analysis of environmental effects

    EPA Science Inventory

    There are multiple sources of biological and technical variation in a typical ecotoxicology study that may not be revealed by traditional endpoints but that become apparent in an omics dataset. As researchers increasingly apply omics technologies to environmental studies, it will...

  5. Finite Strain Analysis of Shock Compression of Brittle Solids Applied to Titanium Diboride

    DTIC Science & Technology

    2014-07-01

    dislocation motion [18,19] may take place at high pressures. Multiple investigations have discovered that titanium diboride demonstrates a rather unique...mean stress under shock compression. It has been suggested [5] that pore collapse may be an important source of inelasticity in titanium diboride

  6. Claiming Unclaimed Spaces: Virtual Spaces for Learning

    ERIC Educational Resources Information Center

    Miller, Nicole C.

    2016-01-01

    The purpose of this study was to describe and examine the environments used by teacher candidates in multi-user virtual environments. Secondary data analysis of a case study methodology was employed. Multiple data sources including interviews, surveys, observations, snapshots, course artifacts, and the researcher's journal were used in the initial…

  7. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from an idea to the point where synthesis can begin dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis.
Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
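
    The compute and storage affinity idea can be illustrated with a much simpler stand-in for a hierarchical triangular mesh: a coarse latitude/longitude cell index that lets two datasets be joined only where they share a cell. The flat one-degree grid and the toy observations below are simplifications; HTM cells are actually triangles on the sphere:

```python
def bin_index(lat, lon, deg=1.0):
    # Coarse lat/lon cell id: a flat-grid stand-in for a hierarchical
    # triangular mesh index.
    return (int((lat + 90.0) // deg), int((lon + 180.0) // deg))

def colocate(obs_a, obs_b, deg=1.0):
    # Group dataset B by cell, then join A only against matching cells,
    # so no full cross-product of the two datasets is ever formed.
    by_cell = {}
    for lat, lon, val in obs_b:
        by_cell.setdefault(bin_index(lat, lon, deg), []).append(val)
    joined = []
    for lat, lon, val in obs_a:
        for other in by_cell.get(bin_index(lat, lon, deg), []):
            joined.append((val, other))
    return joined

ship_obs = [(10.2, 20.3, "sst_ship"), (50.0, 60.0, "sst_ship2")]
sat_obs = [(10.4, 20.9, "sst_sat"), (-30.0, -40.0, "sst_sat2")]
joined = colocate(ship_obs, sat_obs)
```

    Because both datasets are keyed by the same spatial index, the join can be partitioned by cell and executed where each partition is stored, which is the affinity property the paragraph describes.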

  8. An evaluation of talker localization based on direction of arrival estimation and statistical sound source identification

    NASA Astrophysics Data System (ADS)

    Nishiura, Takanobu; Nakamura, Satoshi

    2002-11-01

    It is very important to capture distant-talking speech for a hands-free speech interface with high quality. A microphone array is an ideal candidate for this purpose. However, this approach requires localizing the target talker. Conventional talker localization algorithms in multiple sound source environments not only have difficulty localizing the multiple sound sources accurately, but also have difficulty localizing the target talker among known multiple sound source positions. To cope with these problems, we propose a new talker localization algorithm consisting of two algorithms. One is a DOA (direction of arrival) estimation algorithm for multiple sound source localization based on the CSP (cross-power spectrum phase) coefficient addition method. The other is a statistical sound source identification algorithm based on GMM (Gaussian mixture model) for localizing the target talker position among localized multiple sound sources. In this paper, we particularly focus on the talker localization performance based on the combination of these two algorithms with a microphone array. We conducted evaluation experiments in real noisy reverberant environments. As a result, we confirmed that multiple sound signals can be identified accurately as "speech" or "non-speech" by the proposed algorithm. [Work supported by ATR, and MEXT of Japan.]
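
    The CSP coefficients at the heart of the DOA step are the phase-transform-whitened cross-power spectrum between a microphone pair (often called GCC-PHAT). A minimal single-pair delay estimator, using a synthetic signal rather than array recordings, might look like:

```python
import numpy as np

def csp_delay(x, y):
    # CSP coefficients: cross-power spectrum normalized to unit magnitude
    # (the phase transform), then inverse-transformed to the lag domain.
    n = len(x) + len(y)
    X = np.fft.rfft(x, n=n)
    Y = np.fft.rfft(y, n=n)
    R = X * np.conj(Y)
    R /= np.maximum(np.abs(R), 1e-12)          # whiten: keep phase only
    cc = np.fft.irfft(R, n=n)
    # Reorder the circular correlation so lags run -(len(y)-1)..len(x)-1.
    cc = np.concatenate((cc[-(len(y) - 1):], cc[:len(x)]))
    return int(np.argmax(cc)) - (len(y) - 1)

rng = np.random.default_rng(0)
s = rng.standard_normal(1024)
true_delay = 7
x = np.concatenate((np.zeros(true_delay), s))  # delayed copy of the signal
y = np.concatenate((s, np.zeros(true_delay)))  # reference channel
estimated = csp_delay(x, y)
```

    The peak lag maps to a direction of arrival through the microphone spacing and the speed of sound; the coefficient addition method sums such CSP functions over multiple pairs before peak picking.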

  9. Modified two-sources quantum statistical model and multiplicity fluctuation in the finite rapidity region

    NASA Astrophysics Data System (ADS)

    Ghosh, Dipak; Sarkar, Sharmila; Sen, Sanjib; Roy, Jaya

    1995-06-01

    In this paper the behavior of factorial moments with rapidity window size, which is usually explained in terms of "intermittency," has been interpreted by simple quantum statistical properties of the emitting system using the concept of "modified two-source model" as recently proposed by Ghosh and Sarkar [Phys. Lett. B 278, 465 (1992)]. The analysis has been performed using our own data of 16Ag/Br and 24Ag/Br interactions at a few tens of GeV energy regime.

  10. [Impact of children with multiple disabilities on families in Abidjan].

    PubMed

    N Dri, Koumé Mathias; Yaya, Issifou; Zigoli, Robertine; Endemel Ayabakan, François; Ipou, Stéphane Yves; Lambert Moke, Botty

    A child's multiple disabilities have a major impact on families in both developed and developing countries. In Côte d'Ivoire, very few data are available concerning the real experiences of families of children with multiple disabilities. The objective of this study was to improve our knowledge of the impact of children with multiple disabilities on families in Côte d'Ivoire. A qualitative study was conducted among the families consulting the Child Guidance Centre of the National Institute of Public Health in Abidjan. Data were collected in May 2015 by semi-structured individual interviews with mothers of children with multiple disabilities. Twenty mothers of multiply disabled children between the ages of 2 and 14 years were interviewed. The child's multiple disability was found to have a negative impact on finances, health, and social life. Health check-ups, treatment and transport are the main additional costs. Mothers suffer from insomnia, fatigue, back pain and anxiety and were often held responsible for their child's disability. A disabled child was a source of discord in several couples and a cause of school drop-out in some families. This study partially addresses the experiences of families with children with multiple disabilities. It confirms the results of several other studies, highlighting the vulnerability and social dysfunction of these families. The presence of a child with multiple disabilities in a family is a source of psychological, financial and social upheaval. This study raises questions about the impact of multiple disabilities on the whole family and calls for a more detailed analysis of economic aspects.

  11. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    PubMed

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software for the analysis of multi-site neuronal spike signals. The software package, named SPICODYN, has been developed as a standalone Windows GUI application, using C# programming language with Microsoft Visual Studio based on .NET framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files, containing recorded or generated time series spike signals data. SPICODYN processes such electrophysiological signals focusing on: spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented dealing with multiple time delays (temporal extension) and with multiple binary patterns (high order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Arrays setups, guaranteeing, in those specific cases, automated processing. The optimized implementation of the Delayed Transfer Entropy and the High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
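
    The delayed transfer entropy used for connectivity inference can be sketched for binary spike trains. The toy coupling below (y copying x with a fixed lag) and the single-bin history are illustrative assumptions, not SPICODYN's implementation:

```python
import random
from collections import Counter
from math import log2

def delayed_te(x, y, delay=1):
    # TE(X -> Y) at a given source delay, for binary sequences:
    # sum p(y_t+1, y_t, x_t-d) * log2[ p(y_t+1 | y_t, x_t-d) / p(y_t+1 | y_t) ]
    triples = [(y[t + 1], y[t], x[t - delay]) for t in range(delay, len(y) - 1)]
    n = len(triples)
    c_full = Counter(triples)                        # (y_next, y_now, x_past)
    c_cond = Counter((b, c) for _, b, c in triples)  # (y_now, x_past)
    c_pair = Counter((a, b) for a, b, _ in triples)  # (y_next, y_now)
    c_now = Counter(b for _, b, _ in triples)        # y_now
    te = 0.0
    for (y1, y0, x0), c in c_full.items():
        p_cond = c / c_cond[(y0, x0)]
        p_marg = c_pair[(y1, y0)] / c_now[y0]
        te += (c / n) * log2(p_cond / p_marg)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0, 0] + x[:-2]                          # y copies x with a 2-step lag
te_right_delay = delayed_te(x, y, delay=1)   # x[t-1] determines y[t+1]
te_wrong_delay = delayed_te(x, y, delay=3)   # misaligned delay
```

    Scanning the delay parameter and keeping the maximum, as the temporal extension does, recovers both the existence and the latency of the coupling.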

  12. The Chandra Xbootes Survey - IV: Mid-Infrared and Submillimeter Counterparts

    NASA Astrophysics Data System (ADS)

    Brown, Arianna; Mitchell-Wynne, Ketron; Cooray, Asantha R.; Nayyeri, Hooshang

    2016-06-01

    In this work, we use a Bayesian technique to identify mid-IR and submillimeter counterparts for 3,213 X-ray point sources detected in the Chandra XBoötes Survey so as to characterize the relationship between black hole activity and star formation in the XBoötes region. The Chandra XBoötes Survey is a 5-ks X-ray survey of the 9.3 square degree Boötes Field of the NOAO Deep Wide-Field Survey (NDWFS), a survey imaged from the optical to the near-IR. We use a likelihood ratio analysis on Spitzer-IRAC data taken from The Spitzer Deep, Wide-Field Survey (SDWFS) to determine mid-IR counterparts, and a similar method on Herschel-SPIRE sources detected at 250 µm from The Herschel Multi-tiered Extragalactic Survey to determine the submillimeter counterparts. The likelihood ratio analysis (LRA) provides the probability that an IRAC or SPIRE point source is the true counterpart to a Chandra source. The analysis comprises three parts: the normalized magnitude distributions of counterparts and background sources, and the radial probability distribution of the separation distance between the IRAC or SPIRE source and the Chandra source. Many Chandra sources have multiple prospective counterparts in each band, so additional analysis is performed to determine the identification reliability of the candidates. Identification reliability values lie between 0 and 1, and sources with identification reliability values ≥0.8 are chosen to be the true counterparts. With these results, we will consider the statistical implications of the sample's redshifts, mid-IR and submillimeter luminosities, and star formation rates.
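
    The likelihood ratio analysis can be sketched as LR = q(m) f(r) / n(m), with a reliability for each candidate normalized over all candidates plus the chance that no counterpart exists. The Gaussian positional model, the flat magnitude distributions, and the numbers below are simplifying assumptions, not values from the survey:

```python
from math import exp, pi

def likelihood_ratio(mag, sep, q_m, n_m, sigma=1.0):
    # f(r): Gaussian positional-error model for the true counterpart offset.
    f_r = exp(-sep ** 2 / (2 * sigma ** 2)) / (2 * pi * sigma ** 2)
    return q_m(mag) * f_r / n_m(mag)

def reliabilities(candidates, q_m, n_m, Q=0.9, sigma=1.0):
    # R_i = LR_i / (sum_j LR_j + (1 - Q)); the (1 - Q) term is the prior
    # probability that the X-ray source has no counterpart in the catalog.
    lrs = [likelihood_ratio(m, r, q_m, n_m, sigma) for m, r in candidates]
    denom = sum(lrs) + (1 - Q)
    return [lr / denom for lr in lrs]

q_m = lambda m: 1.0    # counterpart magnitude distribution (flat toy model)
n_m = lambda m: 0.01   # background surface density per magnitude (toy value)
cands = [(18.0, 0.5), (19.0, 3.0)]   # (magnitude, separation in arcsec)
rel = reliabilities(cands, q_m, n_m)
```

    The close candidate dominates the normalization, so only it would clear the ≥0.8 reliability cut described in the abstract.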

  13. An Analysis of Business Professors' and Their Students' Perceptions of Excellence in Teaching at a Business School-Empirical Evidence from New England

    ERIC Educational Resources Information Center

    Anim, Stephen Kwasi

    2017-01-01

    This qualitative study used evidence gleaned from business professors and their students and compared it with the Measure of Effective Teaching model (MET, 2013). The research is based on a qualitative research design with the aim of collecting data from multiple sources such as interviews, focus group, document analysis and observation to develop…

  14. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  15. Presentation Extensions of the SOAP

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Stodden, David; Coggi, John

    2009-01-01

    A set of extensions of the Satellite Orbit Analysis Program (SOAP) enables simultaneous and/or sequential presentation of information from multiple sources. SOAP is used in the aerospace community as a means of collaborative visualization and analysis of data on planned spacecraft missions. The following definitions of terms also describe the display modalities of SOAP as now extended: In SOAP terminology, (a) "View" signifies an animated three-dimensional (3D) scene, two-dimensional still image, plot of numerical data, or any other visible display derived from a computational simulation or other data source; (b) "Viewport" signifies a rectangular portion of a computer-display window containing a view; (c) "Palette" signifies a collection of one or more viewports configured for simultaneous (split-screen) display in the same window; (d) "Slide" signifies a palette with a beginning and ending time and an animation time step; and (e) "Presentation" signifies a prescribed sequence of slides. For example, multiple 3D views from different locations can be crafted for simultaneous display and combined with numerical plots and other representations of data for both qualitative and quantitative analysis. The resulting sets of views can be temporally sequenced to convey visual impressions of a sequence of events for a planned mission.

  16. Prediction With Dimension Reduction of Multiple Molecular Data Sources for Patient Survival.

    PubMed

    Kaplan, Adam; Lock, Eric F

    2017-01-01

    Predictive modeling from high-dimensional genomic data is often preceded by a dimension reduction step, such as principal component analysis (PCA). However, the application of PCA is not straightforward for multisource data, wherein multiple sources of 'omics data measure different but related biological components. In this article, we use recent advances in the dimension reduction of multisource data for predictive modeling. In particular, we apply exploratory results from Joint and Individual Variation Explained (JIVE), an extension of PCA for multisource data, for prediction of differing response types. We conduct simulations to illustrate the practical advantages and interpretability of our approach. As an application example, we consider predicting survival for patients with glioblastoma multiforme from 3 data sources measuring messenger RNA expression, microRNA expression, and DNA methylation. We also introduce a method to estimate JIVE scores for new samples that were not used in the initial dimension reduction and study its theoretical properties; this method is implemented in the R package R.JIVE on CRAN, in the function jive.predict.

  17. Human Health Risk Implications of Multiple Sources of Faecal Indicator Bacteria in a Recreational Waterbody

    EPA Science Inventory

    We evaluate the influence of multiple sources of faecal indicator bacteria in recreational water bodies on potential human health risk by considering waters impacted by human and animal sources, human and non-pathogenic sources, and animal and non-pathogenic sources. We illustrat...

  18. Incomplete Multisource Transfer Learning.

    PubMed

    Ding, Zhengming; Shao, Ming; Fu, Yun

    2018-02-01

    Transfer learning is generally exploited to adapt well-established source knowledge to learning tasks in a weakly labeled or unlabeled target domain. Nowadays, it is common to see multiple sources available for knowledge transfer, each of which, however, may not include complete class information for the target domain. Naively merging multiple sources together would lead to inferior results due to the large divergence among them. In this paper, we attempt to utilize incomplete multiple sources for effective knowledge transfer to facilitate the learning task in the target domain. To this end, we propose an incomplete multisource transfer learning framework with two directions of knowledge transfer: cross-domain transfer from each source to the target, and cross-source transfer. In particular, in the cross-domain direction, we deploy latent low-rank transfer learning guided by iterative structure learning to transfer knowledge from each single source to the target domain. This practice compensates for missing data in each source using the complete target data. In the cross-source direction, an unsupervised manifold regularizer and effective multisource alignment are explored to jointly compensate for data missing from one source but present in another. In this way, both marginal and conditional distribution discrepancies in the two directions are mitigated. Experimental results on standard cross-domain benchmarks and synthetic data sets demonstrate the effectiveness of our proposed model in knowledge transfer from incomplete multiple sources.

  19. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing large numbers of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection, and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. MEGA-CC is available at http://www.megasoftware.net/.

  20. Improving sensor data analysis through diverse data source integration

    NASA Astrophysics Data System (ADS)

    Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry

    2009-05-01

    Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are mostly left to analyze individual data sources manually. This is both time consuming and mentally exhausting. Expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.

  1. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, in which different uncertainty components are represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty source formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and help decision-makers formulate policies and strategies.
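    The first-order, variance-based index that this kind of analysis builds on can be estimated with a simple "pick-freeze" Monte Carlo scheme: correlate the model output with a second output in which the parameter of interest is held fixed while the others are resampled. The two-parameter model below is invented for illustration; its analytic first-order index for x1 is 16/17 ≈ 0.94.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def model(x1, x2):
    # toy two-parameter model standing in for the groundwater simulator
    return 4.0 * x1 + 1.0 * x2

x1, x2 = rng.normal(size=N), rng.normal(size=N)
x2b = rng.normal(size=N)                  # independent resample of x2

y  = model(x1, x2)
yb = model(x1, x2b)                       # x1 frozen, x2 resampled

# First-order Sobol index of x1: Cov(y, yb) / Var(y)
var_y = y.var()
s1 = (np.mean(y * yb) - y.mean() * yb.mean()) / var_y
print(round(s1, 2))                       # analytic value: 16/17
```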

  2. A Multiple-Tracer Approach for Identifying Sewage Sources to an Urban Stream System

    USGS Publications Warehouse

    Hyer, Kenneth Edward

    2007-01-01

    The presence of human-derived fecal coliform bacteria (sewage) in streams and rivers is recognized as a human health hazard. The source of these human-derived bacteria, however, is often difficult to identify and eliminate, because sewage can be delivered to streams through a variety of mechanisms, such as leaking sanitary sewers or private lateral lines, cross-connected pipes, straight pipes, sewer-line overflows, illicit dumping of septic waste, and vagrancy. A multiple-tracer study was conducted to identify site-specific sources of sewage in Accotink Creek, an urban stream in Fairfax County, Virginia, that is listed on the Commonwealth's priority list of impaired streams for violations of the fecal coliform bacteria standard. Beyond developing this multiple-tracer approach for locating sources of sewage inputs to Accotink Creek, the second objective of the study was to demonstrate how the multiple-tracer approach can be applied to other streams affected by sewage sources. The tracers used in this study were separated into indicator tracers, which are relatively simple and inexpensive to apply, and confirmatory tracers, which are relatively difficult and expensive to analyze. Indicator tracers include fecal coliform bacteria, surfactants, boron, chloride, chloride/bromide ratio, specific conductance, dissolved oxygen, turbidity, and water temperature. Confirmatory tracers include 13 organic compounds that are associated with human waste, including caffeine, cotinine, triclosan, a number of detergent metabolites, several fragrances, and several plasticizers. To identify sources of sewage to Accotink Creek, a detailed investigation of the Accotink Creek main channel, tributaries, and flowing storm drains was undertaken from 2001 to 2004. Sampling was conducted in a series of eight synoptic sampling events, each of which began at the most downstream site and extended upstream through the watershed and into the headwaters of each tributary. 
Using the synoptic sampling approach, 149 sites were sampled at least once for indicator tracers; 52 of these sites were also sampled at least once for confirmatory tracers. Through the analysis of multiple-tracer levels in the synoptic samples, three major sewage sources to the Accotink Creek stream network were identified, and several other minor sewage sources likely deserve additional investigation. Near the end of the synoptic sampling activities, three additional sampling methods were used to better understand potential sewage sources in the watershed: optical brightener monitoring, intensive stream sampling using automated samplers, and additional sampling of several storm-drain networks. The samples obtained by these methods provided further understanding of possible sewage sources to the streams and of the variability in tracer concentrations at a given sampling site. Collectively, these additional sampling methods were a valuable complement to the synoptic sampling approach used for the bulk of this study. The study results provide an approach that local authorities can use to apply a relatively simple and inexpensive collection of tracers to locate sewage sources to streams. Although this multiple-tracer approach is effective in detecting sewage sources, additional research is needed to better detect extremely low-volume sources and to enable local authorities to identify the specific origin of the sewage once it is detected in a stream reach.

  3. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations.

    PubMed

    Dwivedi, Bhakti; Kowalski, Jeanne

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on the merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface that defines genes according to their similarity across several molecular changes driving a disease phenotype. This tool was developed to facilitate the use of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/.

  4. shinyGISPA: A web application for characterizing phenotype by gene sets using multiple omics data combinations

    PubMed Central

    Dwivedi, Bhakti

    2018-01-01

    While many methods exist for integrating multi-omics data or defining gene sets, no single tool defines gene sets based on the merging of multiple omics data sets. We present shinyGISPA, an open-source application with a user-friendly web-based interface that defines genes according to their similarity across several molecular changes driving a disease phenotype. This tool was developed to facilitate the use of a previously published method, Gene Integrated Set Profile Analysis (GISPA), among researchers with limited computer-programming skills. The GISPA method allows the identification of multiple gene sets that may play a role in the characterization, clinical application, or functional relevance of a disease phenotype. The tool provides an automated workflow that is highly scalable and adaptable to applications beyond genomic data merging analysis. It is available at http://shinygispa.winship.emory.edu/shinyGISPA/. PMID:29415010

  5. Analysis of Radiation Damage in Light Water Reactors: Comparison of Cluster Analysis Methods for the Analysis of Atom Probe Data.

    PubMed

    Hyde, Jonathan M; DaCosta, Gérald; Hatzoglou, Constantinos; Weekes, Hannah; Radiguet, Bertrand; Styman, Paul D; Vurpillot, Francois; Pareige, Cristelle; Etienne, Auriane; Bonny, Giovanni; Castin, Nicolas; Malerba, Lorenzo; Pareige, Philippe

    2017-04-01

    Irradiation of reactor pressure vessel (RPV) steels causes the formation of nanoscale microstructural features (termed radiation damage) that affect the mechanical properties of the vessel. A key tool for characterizing these nanoscale features is atom probe tomography (APT), owing to its high spatial resolution and its ability to identify different chemical species in three dimensions. Microstructural observations using APT can underpin the development of a mechanistic understanding of defect formation. However, multiple methods are currently in use for analyzing atom probe data. This can lead to inconsistencies between results obtained by different researchers and to unnecessary scatter when combining data from multiple sources, making interpretation of results more complex and calibration of radiation damage models challenging. In this work, simulations of a range of different microstructures are used to directly compare different cluster analysis algorithms and identify their strengths and weaknesses.
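    Cluster-search algorithms of the kind compared in such studies often start from a distance-linkage rule: solute atoms closer than some d_max belong to the same cluster. The sketch below is a minimal friends-of-friends version of that idea on invented 3D positions; real APT analyses add minimum-size, order, and erosion parameters, and this is not any specific algorithm from the paper.

```python
import numpy as np

def cluster(points, d_max):
    """Greedy single-linkage clustering; returns one integer label per point."""
    n = len(points)
    labels = [-1] * n
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack, labels[i] = [i], current
        while stack:                       # flood-fill the linked set
            j = stack.pop()
            for k in range(n):
                if labels[k] == -1 and np.linalg.norm(points[j] - points[k]) <= d_max:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels

# Two tight pairs of "solute atoms", far apart from each other (nm scale).
pts = np.array([[0, 0, 0], [0.4, 0, 0], [5, 5, 5], [5.3, 5, 5]])
print(cluster(pts, d_max=1.0))
```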

  6. Quantifying methane emission from fugitive sources by combining tracer release and downwind measurements - a sensitivity analysis based on multiple field surveys.

    PubMed

    Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte

    2014-08-01

    Using a dual-species methane/acetylene instrument based on cavity ring-down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) a landfill with unknown emission source locations, (3) a landfill with closely located emission sources, and (4) a comparison with a Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gases for source separation. The new real-time, high-precision instrument can measure methane plumes more than 1.2 km from small sources (about 5 kg h(-1)) in urban areas, with a measurement frequency that allows plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can quantify individual sources given a suitable wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification, and an uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement; model calculations showed an uncertainty of less than 5% in both urban and open-country settings when the trace gas was placed 100 m from the source and measurements were made more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared with the ratio of the highest concentrations in the plume, the direct concentration ratio, and a Gaussian plume model. Under suitable weather and road conditions, the CRDS system can quantify emissions from different closely spaced sources using only one trace gas, owing to its high time resolution, while the FTIR system can measure multiple trace gases but with a lower time resolution. Copyright © 2014 Elsevier Ltd. All rights reserved.
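    The tracer-ratio calculation underlying the method fits in a few lines: the methane emission rate follows from the known tracer release rate and the ratio of the integrated cross-plume concentrations, scaled by the molar masses to convert a mixing-ratio ratio into a mass ratio. The release rate and plume integrals below are invented for illustration.

```python
M_CH4, M_C2H2 = 16.04, 26.04          # g/mol for methane and acetylene

def emission_rate(q_tracer_kg_h, integ_ch4_ppb_m, integ_tracer_ppb_m):
    """Methane emission rate (kg/h) from integrated cross-plume concentrations."""
    return q_tracer_kg_h * (integ_ch4_ppb_m / integ_tracer_ppb_m) * (M_CH4 / M_C2H2)

# Example: 1.5 kg/h acetylene released; plume integrals of 820 ppb·m
# methane (above background) and 410 ppb·m acetylene.
q = emission_rate(1.5, 820.0, 410.0)
print(round(q, 2))
```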

  7. Development of an impulsive noise source to study the acoustic reflection characteristics of hard-walled wind tunnels

    NASA Technical Reports Server (NTRS)

    Salikuddin, M.; Burrin, R. H.; Ahuja, K. K.; Bartel, H. W.

    1986-01-01

    Two impulsive sound sources, one using multiple acoustic drivers and the other using a spark discharge, were developed to study the acoustic reflection characteristics of hard-walled wind tunnels, and the results of laboratory tests are presented. The analysis indicates that although the intensity of the pulse generated by the spark source was higher than that obtained from the acoustic source, the number of averages needed for a particular test may require an unacceptably long tunnel-run time because of the low spark repetition rate imposed by the capacitor charging time. This, together with the hardware problems associated with the longevity of electrodes and electrode holders under repetitive spark discharges, shows the multidriver acoustic source to be more suitable for this application.

  8. Production and use of estimates for monitoring progress in the health sector: the case of Bangladesh

    PubMed Central

    Ahsan, Karar Zunaid; Tahsina, Tazeen; Iqbal, Afrin; Ali, Nazia Binte; Chowdhury, Suman Kanti; Huda, Tanvir M.; Arifeen, Shams El

    2017-01-01

    ABSTRACT Background: In order to support progress towards the post-2015 development agenda for the health sector, the importance of high-quality and timely estimates has become evident both globally and at the country level. Objective and Methods: Based on a desk review, key informant interviews, and expert panel discussions, the paper critically reviews health estimates from both local sources (i.e. nationally generated information from the government and other agencies) and global sources (mostly modeled or interpolated estimates developed by international organizations from different sources of information), and assesses the country's capacity and monitoring strategies to meet the increasing data demand in the coming years. Primarily, this paper provides a situation analysis of Bangladesh in terms of the production and use of health estimates for monitoring progress towards the post-2015 development goals for the health sector. Results: The analysis reveals that Bangladesh is data rich, particularly from household surveys and health facility assessments. Practices of data utilization also exist, with wide acceptability of survey results for informing policy, programme review, and course corrections. Despite high data availability from multiple sources, the country's capacity for providing regular updates of major global health estimates/indicators remains low. Major challenges include limited human resources, limited capacity to generate quality data, and the multiplicity of data sources, where discrepancies and a lack of linkages among different data sources (among local sources, and between local and global estimates) complicate interpretation of the resulting estimates. Conclusion: To fulfill the increased data requirements of the post-2015 era, Bangladesh needs to invest more in electronic data capture and routine health information systems. Streamlining of data sources, integration of parallel information systems into a common platform, and capacity building for data generation and analysis are recommended as priority actions for Bangladesh in the coming years. In addition to automation of routine health information systems, establishing an Indicator Reference Group for Bangladesh to analyze data, building country capacity in data quality assessment and triangulation, and feeding into global inter-agency estimates for better reporting would address a number of the challenges mentioned, in both the short and the long run. PMID:28532305

  9. VideoHacking: Automated Tracking and Quantification of Locomotor Behavior with Open Source Software and Off-the-Shelf Video Equipment.

    PubMed

    Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G

    2015-01-01

    Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior on adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
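    The core of the background-subtraction tracking used by such systems can be sketched in a few lines of NumPy: threshold each grayscale frame against a background image and take the centroid of the changed pixels as the animal's position. This is a hedged toy version of the general technique, not the paper's actual pipeline, which adds filtering and per-animal region assignment.

```python
import numpy as np

def centroid(frame, background, thresh=30):
    """Return the (row, col) centroid of pixels differing from background, or None."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None                       # no animal detected in this frame
    return ys.mean(), xs.mean()

# Tiny synthetic example: a dark background with a small bright "animal".
bg = np.zeros((4, 4), dtype=np.uint8)
frame = bg.copy()
frame[1:3, 2] = 255
print(centroid(frame, bg))
```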

  10. Occupational therapy articles in serial publications: an analysis of sources.

    PubMed Central

    Reed, K L

    1988-01-01

    This study was designed to locate and document serial literature on occupational therapy published since 1900. Emphasis is placed on finding articles on occupational therapy, or by occupational therapists, from sources other than those normally associated with the professional journals. Multiple sources were used, including print indexes, online databases, occupational therapy bibliographies, and tables of contents or yearly indexes. Almost 7,000 articles were identified, not including those published in foreign journals. Occupational therapy publications have increased steadily since 1900, with the most rapid increase during the 1970s and 1980s, when five new occupational therapy journals were initiated. Suggestions for formulating search strategies are included. PMID:3285932

  11. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    NASA Astrophysics Data System (ADS)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions from scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We propose a novel feature extraction algorithm based on supervised factor analysis that models data from source-space EEG. To this end, we computed features from the source dipoles confined to the Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. The group analysis of the spectral characteristics of source-space EEG also indicates that slow cortical potentials from a set of cortical source dipoles carry discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and it may thus lead to new strategies for BCI-based neurorehabilitation.

  12. Single-visit or multiple-visit root canal treatment: systematic review, meta-analysis and trial sequential analysis

    PubMed Central

    Schwendicke, Falk; Göstemeyer, Gerd

    2017-01-01

    Objectives: Single-visit root canal treatment has some advantages over conventional multivisit treatment, but might increase the risk of complications. We systematically evaluated the risk of complications after single-visit or multiple-visit root canal treatment using meta-analysis and trial sequential analysis. Data: Controlled trials comparing single-visit versus multiple-visit root canal treatment of permanent teeth were included. Trials needed to assess the risk of long-term complications (pain, infection, new/persisting/increasing periapical lesions ≥1 year after treatment), short-term pain, or flare-up (acute exacerbation after initiation or continuation of root canal treatment). Sources: Electronic databases (PubMed, EMBASE, Cochrane Central) were screened, random-effects meta-analyses were performed, and trial sequential analysis was used to control for the risk of random errors. Evidence was graded according to GRADE. Study selection: 29 trials (4341 patients) were included, all but 6 showing high risk of bias. Based on 10 trials (1257 teeth), the risk of complications was not significantly different between single-visit and multiple-visit treatment (risk ratio (RR) 1.00 (95% CI 0.75 to 1.35); weak evidence). Based on 20 studies (3008 teeth), the risk of pain did not differ significantly between treatments (RR 0.99 (95% CI 0.76 to 1.30); moderate evidence). The risk of flare-up was recorded by 8 studies (1110 teeth) and was significantly higher after single-visit versus multiple-visit treatment (RR 2.13 (95% CI 1.16 to 3.89); very weak evidence). Trial sequential analysis revealed that firm evidence for benefit, harm, or futility was not reached for any of the outcomes. Conclusions: There is insufficient evidence to rule out whether important differences between the two strategies exist. Clinical significance: Dentists can provide root canal treatment in one or multiple visits. Given the possibly increased risk of flare-ups, multiple-visit treatment might be preferred for certain teeth (e.g., those with periapical lesions). PMID:28148534

  13. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    PubMed

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.
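    The PCA-with-MLR step can be illustrated with toy numbers: once factor scores are in hand, the total PAH level is regressed on the scores, and each factor's mean fitted term gives its percent contribution. The data below are synthetic and the two "factors" are invented; a real PMF comparison would additionally need a dedicated non-negative factorization solver.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy factor scores for 26 sediment samples and 2 pollution factors
# (squared normals keep them non-negative, loosely mimicking real scores).
scores = rng.normal(size=(26, 2)) ** 2
# Synthetic "total PAH" built so factor 1 contributes 3x per unit score.
total = 3.0 * scores[:, 0] + 1.0 * scores[:, 1]

# Multiple linear regression of total PAH on the factor scores (+ intercept).
A = np.column_stack([scores, np.ones(26)])
coef, *_ = np.linalg.lstsq(A, total, rcond=None)

# Mean mass assigned to each factor -> percent source contributions.
contrib = coef[:2] * scores.mean(axis=0)
pct = 100 * contrib / contrib.sum()
print(pct.round(1))
```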

  14. The use of coded PCR primers enables high-throughput sequencing of multiple homolog amplification products by 454 parallel sequencing.

    PubMed

    Binladen, Jonas; Gilbert, M Thomas P; Bollback, Jonathan P; Panitz, Frank; Bendixen, Christian; Nielsen, Rasmus; Willerslev, Eske

    2007-02-14

    The invention of the Genome Sequence 20 DNA Sequencing System (454 parallel sequencing platform) has enabled the rapid and high-volume production of sequence data. Until now, however, individual emulsion PCR (emPCR) reactions and subsequent sequencing runs have been unable to combine template DNA from multiple individuals, as homologous sequences could not be subsequently assigned to their original sources. We use conventional PCR with 5'-nucleotide-tagged primers to generate homologous DNA amplification products from multiple specimens, followed by sequencing on the high-throughput Genome Sequence 20 DNA Sequencing System (GS20, Roche/454 Life Sciences). Each DNA sequence is subsequently traced back to its individual source through analysis of the 5' tag. We demonstrate that this new approach enables the assignment of virtually all the generated DNA sequences to the correct source once sequencing anomalies are accounted for (mis-assignment rate <0.4%). The method therefore enables accurate sequencing and assignment of homologous DNA sequences from multiple sources in a single high-throughput GS20 run. We observe a bias in the distribution of the differently tagged primers that depends on the 5' nucleotide of the tag. In particular, primers 5'-labelled with a cytosine are heavily overrepresented among the final sequences, while those 5'-labelled with a thymine are strongly underrepresented. A weaker bias also exists in the distribution of the sequences as sorted by the second nucleotide of the dinucleotide tags. As the results are based on a single GS20 run, the general applicability of the approach requires confirmation. However, our experiments demonstrate that 5'-primer tagging is a useful method by which the sequencing power of the GS20 can be applied to PCR-based assays of multiple homologous PCR products. The new approach will be of value to a broad range of research areas, such as comparative genomics, complete mitochondrial analyses, population genetics, and phylogenetics.
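    The 5'-tag assignment step amounts to a prefix lookup on each read. The tags, specimen names, and reads below are invented for illustration; the real study used tagged primers and accounted for sequencing anomalies before assignment.

```python
# Hypothetical dinucleotide tags mapping reads back to their specimens.
TAGS = {"AC": "specimen_1", "GT": "specimen_2", "CA": "specimen_3"}

def assign(read, tag_len=2):
    """Return (source, insert_sequence) for a read, or None if the tag is unknown."""
    tag, insert = read[:tag_len], read[tag_len:]
    source = TAGS.get(tag)
    return (source, insert) if source else None

reads = ["ACTTGGA", "GTTTGGA", "NNTTGGA"]   # last read has an unreadable tag
assigned = [assign(r) for r in reads]
print(assigned)
```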

  15. Objective consensus from decision trees.

    PubMed

    Putora, Paul Martin; Panje, Cedric M; Papachristofilou, Alexandros; Dal Pra, Alan; Hundsberger, Thomas; Plasswilm, Ludwig

    2014-12-05

    Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information: objective consensus based on recommendations in decision tree format from multiple sources. Based on nine sample recommendations in decision tree format, a representative analysis was performed. The most common (mode) recommendation for each eventuality (each combination of parameters) was determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data were collected from 16 radiation oncology centres, converted into decision tree format and analyzed in order to determine the objective consensus. Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage), resulting in a total of 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
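The mode-recommendation idea can be illustrated with a small sketch. The three centre rules and treatment labels below are hypothetical inventions for illustration, not the study's actual recommendations; only the structure (three parameters, three ranges each, 3**3 = 27 combinations, mode per combination) follows the abstract:

```python
from collections import Counter
from itertools import product

# Hypothetical flattened decision trees: each centre's recommendation as a
# function of three discretised parameters (two cut-offs => three ranges each).
def centre_a(gleason, psa, t):
    return "RT+ADT" if gleason == "high" or t == "high" else "RT"

def centre_b(gleason, psa, t):
    return "RT+ADT" if gleason == "high" else "RT"

def centre_c(gleason, psa, t):
    return "RT" if psa == "low" and gleason == "low" else "RT+ADT"

def objective_consensus(centres, levels=("low", "mid", "high")):
    """Mode recommendation for every parameter combination; None on a tie."""
    consensus = {}
    for combo in product(levels, repeat=3):
        counts = Counter(centre(*combo) for centre in centres)
        (top, n), *rest = counts.most_common()
        consensus[combo] = top if not rest or rest[0][1] < n else None
    return consensus

consensus = objective_consensus([centre_a, centre_b, centre_c])
print(len(consensus))                      # 27 parameter combinations
print(consensus[("high", "low", "low")])   # mode recommendation for one combination
```

Combinations where the vote is tied carry no mode recommendation, matching the observation that a mode could be found only for specific parameter combinations.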

  16. A comprehensive analysis of heavy metals in urban road dust of Xi'an, China: Contamination, source apportionment and spatial distribution.

    PubMed

    Pan, Huiyun; Lu, Xinwei; Lei, Kai

    2017-12-31

    A detailed investigation was conducted to study heavy metal contamination in road dust from four regions of Xi'an, Northwest China. The concentrations of eight heavy metals (Co, Cr, Cu, Mn, Ni, Pb, Zn and V) were determined by X-ray fluorescence. The mean concentrations of these elements were: 30.9 mg kg⁻¹ Co, 145.0 mg kg⁻¹ Cr, 54.7 mg kg⁻¹ Cu, 510.5 mg kg⁻¹ Mn, 30.8 mg kg⁻¹ Ni, 124.5 mg kg⁻¹ Pb, 69.6 mg kg⁻¹ V and 268.6 mg kg⁻¹ Zn. There was significant enrichment of Pb, Zn, Co, Cu and Cr based on geo-accumulation index values. Multivariate statistical analysis showed that levels of Cu, Pb, Zn, Co and Cr were controlled by anthropogenic activities, while levels of Mn, Ni and V were associated with natural sources. Principal component analysis and multiple linear regression were applied to determine the source apportionment. The results showed that traffic was the main source, with a percent contribution of 53.4%. Natural sources contributed 26.5%, and other anthropogenic pollution sources contributed 20.1%. Clear heavy metal pollution hotspots were identified by GIS mapping. The locations of point pollution sources and the prevailing wind direction were found to be important factors in the spatial distribution of heavy metals. Copyright © 2017 Elsevier B.V. All rights reserved.
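The geo-accumulation index behind the enrichment assessment is conventionally Igeo = log2(C / (1.5·B)), where C is the measured concentration and B a geochemical background value. A minimal sketch follows; the background values are placeholders (the abstract does not give the study's own), while the measured means are taken from the abstract:

```python
import math

# Geo-accumulation index: Igeo = log2(C / (1.5 * B)).
BACKGROUND = {"Pb": 21.4, "Zn": 69.4, "Cu": 21.4}   # hypothetical backgrounds, mg/kg
MEASURED = {"Pb": 124.5, "Zn": 268.6, "Cu": 54.7}   # mean concentrations, mg/kg

def igeo(c, b):
    return math.log2(c / (1.5 * b))

def igeo_class(value):
    """Standard 7-class Igeo scale: 0 = unpolluted ... 6 = extremely polluted."""
    return min(6, max(0, math.ceil(value))) if value > 0 else 0

for metal in MEASURED:
    v = igeo(MEASURED[metal], BACKGROUND[metal])
    print(f"{metal}: Igeo = {v:.2f}, class {igeo_class(v)}")
```

With real regional background values substituted, the same two functions reproduce the kind of per-metal pollution classification reported in the abstract.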

  17. Spectral Analysis of Breast Cancer on Tissue Microarrays: Seeing Beyond Morphology

    DTIC Science & Technology

    2005-04-01

    Harvey N., Szymanski J.J., Bloch J.J., Mitchell M. Investigation of image feature extraction by a genetic algorithm. Proc. SPIE 1999;3812:24-31. … automated feature extraction using multiple data sources. Proc. SPIE 2003;5099:190-200. … Spectral-Spatial Analysis of Urine Cytology, Angeletti et al. … Appendix Contents: 1. Harvey, N.R., Levenson, R.M., Rimm, D.L. (2003) Investigation of Automated Feature Extraction Techniques for Applications in

  18. Ignition probability of polymer-bonded explosives accounting for multiple sources of material stochasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S.; Barua, A.; Zhou, M., E-mail: min.zhou@me.gatech.edu

    2014-05-07

    Accounting for the combined effect of multiple sources of stochasticity in material attributes, we develop an approach that computationally predicts the probability of ignition of polymer-bonded explosives (PBXs) under impact loading. The probabilistic nature of the specific ignition processes is assumed to arise from two sources of stochasticity. The first source involves random variations in material microstructural morphology; the second source involves random fluctuations in grain-binder interfacial bonding strength. The effect of the first source of stochasticity is analyzed with multiple sets of statistically similar microstructures and constant interfacial bonding strength. Subsequently, each of the microstructures in the multiple sets is assigned multiple instantiations of randomly varying grain-binder interfacial strengths to analyze the effect of the second source of stochasticity. Critical hotspot size-temperature states reaching the threshold for ignition are calculated through finite element simulations that explicitly account for microstructure and bulk and interfacial dissipation to quantify the time to criticality (t_c) of individual samples, allowing the probability distribution of the time to criticality that results from each source of stochastic variation for a material to be analyzed. Two probability superposition models are considered to combine the effects of the multiple sources of stochasticity. The first is a parallel and series combination model, and the second is a nested probability function model. Results show that the nested Weibull distribution provides an accurate description of the combined ignition probability. The approach developed here represents a general framework for analyzing the stochasticity in material behavior that arises out of multiple types of uncertainty associated with the structure, design, synthesis and processing of materials.
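As a rough illustration of the series-combination idea from the abstract, one can assume each source of stochasticity yields a Weibull-distributed time to criticality and combine survival probabilities. The parameters below are invented for the sketch; the paper's actual fitted distributions and its nested model are not reproduced here:

```python
import math

# Sketch of combining two stochastic ignition mechanisms in series:
# the sample ignites by time t if either mechanism reaches criticality by t.
def weibull_cdf(t, lam, k):
    """P(time-to-criticality <= t) for a Weibull(scale=lam, shape=k)."""
    return 1.0 - math.exp(-((t / lam) ** k)) if t > 0 else 0.0

def series_combination(t, params):
    """Combined ignition probability from independent mechanisms."""
    survive = 1.0
    for lam, k in params:
        survive *= 1.0 - weibull_cdf(t, lam, k)
    return 1.0 - survive

# Hypothetical parameters for the two sources of stochasticity.
p = series_combination(5.0, [(6.0, 2.0), (8.0, 1.5)])
print(f"P(ignition by t=5) = {p:.3f}")
```

The combined probability is always at least as large as either component's, since both mechanisms offer an independent path to ignition.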

  19. Generalisation of the identity method for determination of high-order moments of multiplicity distributions with a software implementation

    NASA Astrophysics Data System (ADS)

    Maćkowiak-Pawłowska, Maja; Przybyła, Piotr

    2018-05-01

    Incomplete particle identification limits the experimentally available phase-space region for identified-particle analysis. This problem affects ongoing fluctuation and correlation studies, including the search for the critical point of strongly interacting matter performed at the SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth-order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open-source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified-particle multiplicity distributions from the measured ones, provided the response function of the detector is known.

  20. The Co-Construction of Cooperative Learning in Physical Education with Elementary Classroom Teachers

    ERIC Educational Resources Information Center

    Dyson, Ben P.; Colby, Rachel; Barratt, Mark

    2016-01-01

    The purpose of this study was to investigate generalist classroom elementary teachers' implementation of the Cooperative Learning (CL) pedagogical model into their physical education classes. The study used multiple sources of data drawing on qualitative data collection and data analysis research traditions (Miles, Huberman, & Saldana, 2014).…

  1. Public Service Professionalism among State Administrators: A Multiple State Study. A Working Paper.

    ERIC Educational Resources Information Center

    Rose, Bruce J.; And Others

    This working paper, part of an ongoing national study, presents preliminary analysis of public service professionalism among state public administrators in many states on the basis of data already produced by a continuing survey research project. Information about the data source and sample profiles are provided. Additionally, the research…

  2. Incorporating Computer-Aided Language Sample Analysis into Clinical Practice

    ERIC Educational Resources Information Center

    Price, Lisa Hammett; Hendricks, Sean; Cook, Colleen

    2010-01-01

    Purpose: During the evaluation of language abilities, the needs of the child are best served when multiple types and sources of data are included in the evaluation process. Current educational policies and practice guidelines further dictate the use of authentic assessment data to inform diagnosis and treatment planning. Language sampling and…

  3. Exploring Teacher Leadership in a Rural, Secondary School: Reciprocal Learning Teams as a Catalyst for Emergent Leadership

    ERIC Educational Resources Information Center

    Cherkowski, Sabre; Schnellert, Leyton

    2017-01-01

    The purpose of this case study was to examine how teachers experienced professional development as collaborative inquiry, and how their experiences contributed to their development as teacher leaders. Three overarching themes were identified through iterative qualitative analysis of multiple data sources including interviews, observations,…

  4. Analysis of transmission through slit and multiple grooves structures for biosensors

    NASA Astrophysics Data System (ADS)

    Kim, Bong Ho; Nakarmi, Bikash; Won, Yong Hyub

    2015-03-01

    We analyze the transmission properties of nanostructures made on silver and gold for applications in optical biosensors. Various structures, such as slit only, slit-groove-slit, and multiple slit-and-groove structures, are taken into account to find the effect of physical parameters, such as the number of grooves and the number of slits, on the transmission of light of different wavelengths through the structure. A broad wavelength range of 400 nm to 900 nm is used to analyze the transmission through the structure. With these structures and a broadband light source, the change in transmission intensity is analyzed as the refractive index changes. A change in the refractive index of the analyte varies the transmission intensity and shifts the wavelength of the output beam, which can be used to sense the amount of an analyte, such as glucose in blood or saliva, or hydrogen peroxide. Detection of these analytes can in turn be used, non-invasively via saliva or blood, to detect diseases such as diabetes.

  5. A method for separation of heavy metal sources in urban groundwater using multiple lines of evidence.

    PubMed

    Hepburn, Emily; Northway, Anne; Bekele, Dawit; Liu, Gang-Jun; Currell, Matthew

    2018-06-11

    Determining the sources of heavy metals in soils, sediments and groundwater is important for understanding their fate and transport and for mitigating human and environmental exposures. Artificially imported fill, natural sediments and groundwater from 240 ha of reclaimed land at Fishermans Bend in Australia were analysed for heavy metals and other parameters to determine the relative contributions from different possible sources. Fishermans Bend is Australia's largest urban re-development project; however, a complicated land-use history, the geology, and multiple contamination sources pose challenges to successful re-development. We developed a method for heavy metal source separation in groundwater using statistical categorisation of the data, analysis of soil leaching values and fill/sediment XRF profiling. The method identified two major sources of heavy metals in groundwater: (1) point sources from local or up-gradient groundwater contaminated by industrial activities and/or legacy landfills; and (2) contaminated fill, where leaching of Cu, Mn, Pb and Zn was observed. Across the precinct, metals were most commonly sourced from a combination of these sources; however, eight locations indicated at least one metal sourced solely from fill leaching, and 23 locations indicated at least one metal sourced solely from impacted groundwater. Concentrations of heavy metals in groundwater ranged from 0.0001 to 0.003 mg/L (Cd), 0.001-0.1 mg/L (Cr), 0.001-0.2 mg/L (Cu), 0.001-0.5 mg/L (Ni), 0.001-0.01 mg/L (Pb), and 0.005-1.2 mg/L (Zn). Our method can determine the likely contribution of different metal sources to groundwater, helping inform more detailed contamination assessments and precinct-wide management and remediation strategies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. The foodscape: classification and field validation of secondary data sources.

    PubMed

    Lake, Amelia A; Burgoine, Thomas; Greenhalgh, Fiona; Stamp, Elaine; Tyrrell, Rachel

    2010-07-01

    The aims were to develop a food environment classification tool and to test the acceptability and validity of three secondary sources of food environment data within a defined urban area of Newcastle-Upon-Tyne, using a field validation method. A 21-point classification tool (with 77 sub-categories) was developed. The fieldwork recorded 617 establishments selling food and/or food products. In the sensitivity analysis of the secondary sources against fieldwork, the Newcastle City Council data performed well (83.6%), while Yell.com and the Yellow Pages performed poorly (51.2% and 50.9%, respectively). To improve the quality of secondary data, multiple sources should be used in order to achieve a realistic picture of the foodscape. 2010 Elsevier Ltd. All rights reserved.
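The sensitivity figures are simply the fraction of ground-truthed outlets that each secondary source also lists. A small sketch of that calculation; the per-source match counts below are hypothetical values chosen only to illustrate it against the fieldwork total of 617 from the abstract:

```python
# Field-validation sensitivity: share of outlets found on the ground that
# also appear in a secondary data source. Match counts are hypothetical.
def sensitivity(matched, fieldwork_total):
    return 100.0 * matched / fieldwork_total

FIELDWORK_TOTAL = 617
matches = {"council_register": 516, "commercial_directory": 316}

for source, matched in matches.items():
    print(f"{source}: {sensitivity(matched, FIELDWORK_TOTAL):.1f}%")
```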

  7. How organic carbon derived from multiple sources contributes to carbon sequestration processes in a shallow coastal system?

    PubMed Central

    Watanabe, Kenta; Kuwae, Tomohiro

    2015-01-01

    Carbon captured by marine organisms helps sequester atmospheric CO2, especially in shallow coastal ecosystems, where rates of primary production and burial of organic carbon (OC) from multiple sources are high. However, linkages between the dynamics of OC derived from multiple sources and carbon sequestration are poorly understood. We investigated the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of particulate OC (POC) and dissolved OC (DOC) in the water column and sedimentary OC using elemental, isotopic, and optical signatures in Furen Lagoon, Japan. Based on these data analysis, we explored how OC from multiple sources contributes to sequestration via storage in sediments, water column sequestration, and air–sea CO2 exchanges, and analyzed how the contributions vary with salinity in a shallow seagrass meadow as well. The relative contribution of terrestrial POC in the water column decreased with increasing salinity, whereas autochthonous POC increased in the salinity range 10–30. Phytoplankton-derived POC dominated the water column POC (65–95%) within this salinity range; however, it was minor in the sediments (3–29%). In contrast, terrestrial and phytobenthos-derived POC were relatively minor contributors in the water column but were major contributors in the sediments (49–78% and 19–36%, respectively), indicating that terrestrial and phytobenthos-derived POC were selectively stored in the sediments. Autochthonous DOC, part of which can contribute to long-term carbon sequestration in the water column, accounted for >25% of the total water column DOC pool in the salinity range 15–30. Autochthonous OC production decreased the concentration of dissolved inorganic carbon in the water column and thereby contributed to atmospheric CO2 uptake, except in the low-salinity zone. Our results indicate that shallow coastal ecosystems function not only as transition zones between land and ocean but also as carbon sequestration filters. 
They function at different timescales, depending on the salinity, and OC sources. PMID:25880367

  8. Electrophysiological correlates of cocktail-party listening.

    PubMed

    Lewald, Jörg; Getzmann, Stephan

    2015-10-01

    Detecting, localizing, and selectively attending to a particular sound source of interest in complex auditory scenes composed of multiple competing sources is a remarkable capacity of the human auditory system. The neural basis of this so-called "cocktail-party effect" has remained largely unknown. Here, we studied the cortical network engaged in solving the "cocktail-party" problem, using event-related potentials (ERPs) in combination with two tasks demanding horizontal localization of a naturalistic target sound presented either in silence or in the presence of multiple competing sound sources. Presentation of multiple sound sources, as compared to single sources, induced an increased P1 amplitude, a reduction in N1, and a strong N2 component, resulting in a pronounced negativity in the ERP difference waveform (N2d) around 260 ms after stimulus onset. About 100 ms later, the anterior contralateral N2 subcomponent (N2ac) occurred in the multiple-sources condition, as computed from the amplitude difference for targets in the left minus right hemispaces. Cortical source analyses of the ERP modulation, resulting from the contrast of multiple vs. single sources, generally revealed an initial enhancement of electrical activity in right temporo-parietal areas, including auditory cortex, by multiple sources (at P1) that is followed by a reduction, with the primary sources shifting from right inferior parietal lobule (at N1) to left dorso-frontal cortex (at N2d). Thus, cocktail-party listening, as compared to single-source localization, appears to be based on a complex chronology of successive electrical activities within a specific cortical network involved in spatial hearing in complex situations. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. A European model and case studies for aggregate exposure assessment of pesticides.

    PubMed

    Kennedy, Marc C; Glass, C Richard; Bokkers, Bas; Hart, Andy D M; Hamey, Paul Y; Kruisselbrink, Johannes W; de Boer, Waldo J; van der Voet, Hilko; Garthwaite, David G; van Klaveren, Jacob D

    2015-05-01

    Exposures to plant protection products (PPPs) are assessed using risk analysis methods to protect public health. Traditionally, single sources, such as food or individual occupational sources, have been addressed. In reality, individuals can be exposed simultaneously to multiple sources. Improved regulation therefore requires the development of new tools for estimating the population distribution of exposures aggregated within an individual. A new aggregate model is described, which allows individual users to include as much, or as little, information as is available or relevant for their particular scenario. Depending on the inputs provided by the user, the outputs can range from simple deterministic values through to probabilistic analyses including characterisations of variability and uncertainty. Exposures can be calculated for multiple compounds, routes and sources of exposure. The aggregate model links to the cumulative dietary exposure model developed in parallel and is implemented in the web-based software tool MCRA. Case studies are presented to illustrate the potential of this model, with inputs drawn from existing European data sources and models. These cover exposures to UK arable spray operators, Italian vineyard spray operators, Netherlands users of a consumer spray and UK bystanders/residents. The model could also be adapted to handle non-PPP compounds. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  10. Coupling detrended fluctuation analysis for multiple warehouse-out behavioral sequences

    NASA Astrophysics Data System (ADS)

    Yao, Can-Zhong; Lin, Ji-Nan; Zheng, Xu-Zhou

    2017-01-01

    Interaction patterns among different warehouses can make warehouse-out behavioral sequences less predictable. We first apply coupling detrended fluctuation analysis to the warehouse-out quantities and find that the multivariate sequences exhibit significant coupled multifractal characteristics regardless of the type of steel product. Second, we track the sources of the multifractality in the warehouse-out sequences by shuffling and surrogating the original ones, and we find that the fat-tailed distribution contributes more to the multifractal features than long-term memory, again regardless of product type. From the perspective of warehouse contribution, some warehouses steadily contribute more to the multifractality than others. Finally, based on multiscale multifractal analysis, we propose a Hurst surface structure to investigate coupled multifractality, and show that the multiple behavioral sequences exhibit significant coupled multifractal features that emerge and are usually restricted to relatively large time-scale intervals.
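The shuffle diagnostic used here rests on a simple property: shuffling a series destroys temporal ordering (and hence long-term memory) while preserving its value distribution, so any multifractality that survives shuffling is attributed to the fat-tailed distribution rather than to memory. A stdlib-only sketch of that property on a synthetic fat-tailed series (a stand-in for a warehouse-out quantity sequence, not the study's data):

```python
import random
import statistics

random.seed(42)

# Synthetic fat-tailed series (Pareto draws), smoothed to impose some memory.
series = [random.paretovariate(2.5) for _ in range(2000)]
smoothed = [statistics.fmean(series[max(0, i - 5):i + 1]) for i in range(len(series))]

def lag1_autocorr(x):
    """Lag-1 autocorrelation, a crude proxy for temporal memory."""
    mean = statistics.fmean(x)
    num = sum((a - mean) * (b - mean) for a, b in zip(x, x[1:]))
    den = sum((a - mean) ** 2 for a in x)
    return num / den

shuffled = smoothed[:]
random.shuffle(shuffled)

print(f"autocorr before shuffle: {lag1_autocorr(smoothed):.2f}")   # strong memory
print(f"autocorr after shuffle:  {lag1_autocorr(shuffled):.2f}")   # near zero
print(sorted(smoothed) == sorted(shuffled))                        # distribution preserved
```

In the full analysis one would rerun the (multifractal) DFA on the shuffled and surrogate series and compare spectra; the sketch only demonstrates what shuffling does and does not destroy.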

  11. Cold Atom Source Containing Multiple Magneto-Optical Traps

    NASA Technical Reports Server (NTRS)

    Ramirez-Serrano, Jaime; Kohel, James; Kellogg, James; Lim, Lawrence; Yu, Nan; Maleki, Lute

    2007-01-01

    An apparatus that serves as a source of a cold beam of atoms contains multiple two-dimensional (2D) magneto-optical traps (MOTs). (Cold beams of atoms are used in atomic clocks and in diverse scientific experiments and applications.) The multiple-2D-MOT design of this cold atom source stands in contrast to the single-2D-MOT designs of prior cold atom sources of the same type. The advantage afforded by the present design is that this apparatus is smaller than prior designs.

  12. Assessment of sediment quality in the Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia): GIS approach-based chemometric methods.

    PubMed

    Kharroubi, Adel; Gargouri, Dorra; Baati, Houda; Azri, Chafai

    2012-06-01

    Concentrations of selected heavy metals (Cd, Pb, Zn, Cu, Mn, and Fe) in surface sediments from 66 sites in both the northern and eastern Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia) were studied in order to understand current metal contamination due to the urbanization and economic development of several nearby coastal regions of the Gulf of Gabès. Multiple approaches were applied for the sediment quality assessment, based on GIS coupled with chemometric methods (enrichment factors, geoaccumulation index, principal component analysis, and cluster analysis). Enrichment factors and principal component analysis revealed two distinct groups of metals. The first group corresponded to Fe and Mn, derived from natural sources; the second group contained Cd, Pb, Zn, and Cu, originating from man-made sources. For these latter metals, cluster analysis showed two distinct distributions in the selected areas, attributed to temporal and spatial variations of contaminant source inputs. The geoaccumulation index (Igeo) values indicated that only Cd, Pb, and Cu can be considered moderate to extreme pollutants in the studied sediments.

  13. On Road Study of Colorado Front Range Greenhouse Gases Distribution and Sources

    NASA Astrophysics Data System (ADS)

    Petron, G.; Hirsch, A.; Trainer, M. K.; Karion, A.; Kofler, J.; Sweeney, C.; Andrews, A.; Kolodzey, W.; Miller, B. R.; Miller, L.; Montzka, S. A.; Kitzis, D. R.; Patrick, L.; Frost, G. J.; Ryerson, T. B.; Robers, J. M.; Tans, P.

    2008-12-01

    The Global Monitoring Division and Chemical Sciences Division of the NOAA Earth System Research Laboratory teamed up over the summer of 2008 to experiment with a new measurement strategy to characterize greenhouse gas distributions and sources in the Colorado Front Range. Combining expertise in greenhouse gas measurements and in local- to regional-scale air quality intensive campaigns, we built the 'Hybrid Lab'. A continuous CO2 and CH4 cavity ring-down spectroscopic analyzer (Picarro, Inc.), a CO gas-filter correlation instrument (Thermo Environmental, Inc.) and a continuous UV absorption ozone monitor (2B Technologies, Inc., model 202SC) were installed securely onboard a 2006 Toyota Prius hybrid vehicle, with an inlet bringing in outside air from a few meters above the ground. To better characterize point and distributed sources, air samples were taken with a Portable Flask Package (PFP) for later multiple-species analysis in the lab. A GPS unit hooked up to the ozone analyzer and another installed on the PFP kept track of our location, allowing us to map measured concentrations onto the driving route using Google Earth. The Hybrid Lab went out on several drives in the vicinity of the NOAA Boulder Atmospheric Observatory (BAO) tall tower located in Erie, CO, covering areas from Boulder, Denver, Longmont, Fort Collins and Greeley. Enhancements in CO2 and CO, and destruction of ozone, mainly reflect emissions from traffic. Methane enhancements, however, are clearly correlated with nearby point sources (landfill, feedlot, natural gas compressor ...) or with larger-scale air masses advected from NE Colorado, where oil and gas drilling operations are widespread. The multiple-species analysis (hydrocarbons, CFCs, HFCs) of the air samples collected along the way brings insightful information about the methane sources at play. We will present results of the analysis and interpretation of the Hybrid Lab Front Range Study and conclude with perspectives on how we will adapt the measurement strategy to study anthropogenic CO2 emissions in the Denver Basin.

  14. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    NASA Astrophysics Data System (ADS)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.

  15. Electrophysiology quantitative electroencephalography/low resolution brain electromagnetic tomography functional brain imaging (QEEG LORETA): Case report: Subjective idiopathic tinnitus - predominantly central type severe disabling tinnitus.

    PubMed

    Shulman, Abraham; Goldstein, Barbara

    2014-01-01

    The clinical significance of QEEG LORETA data analysis performed sequentially within 6 months is presented in a case report of predominantly central type severe disabling subjective idiopathic tinnitus (SIT), before and following treatment. The QEEG LORETA data are reported as Z-scores of z = ±2.54, p < 0.013. The focus is on demonstrating patterns of brain wave oscillations reflecting multiple brain functions in multiple regions of interest (ROIs) in the presence of the tinnitus signal (SIT). The patterns of brain activity at high, middle and low frequencies are hypothesized to reflect connectivities within and between multiple neuronal networks in the brain. The LORETA source-localization images of non-auditory ROIs at the maximal abnormality in the very narrow band frequency spectra (24.21 Hz) showed the mathematically most probable underlying sources of the scalp-recorded data to be greatest in the mid-cingulate, bilateral precuneus, cingulate and bilateral caudate nucleus. Clinical correlation of the data with the history and course of the SIT is considered an objective demonstration of the affective, behavioral, and emotional components of the SIT. The correlation of the caudate activity and of SIT as the traumatic event with the clinical course of PTSD, and with the clinical diagnosis of PTSD, is discussed.
The clinical translation for patient care is highlighted in a SIT patient with multiple comorbidities by translation of QEEG/LORETA electrophysiologic data, as an adjunct to: 1) provide objective patterns of brain wave activity in multiple ROIs, reflecting multiple brain functions, in response to and in the presence of the tinnitus signal, recorded from the scalp and analyzed with the metrics of absolute power, relative power, asymmetry, and coherence, for the subjective tinnitus complaint (SIT); 2) provide an increase in the accuracy of the tinnitus diagnosis; 3) assess/monitor treatment efficacy; 4) provide a rationale for selection of a combined tinnitus-targeted therapy of behavioral, pharmacologic, and sound therapy modalities of treatment attempting tinnitus relief; 5) provide insight into the medical significance of the SIT; 6) attempt discriminant function analysis for identification of a particular diagnostic clinical category of CNS neuropsychiatric disease; and 7) attempt to translate what is known of the neuroscience of sensation, brain function, and QEEG/LORETA source localization for the etiology and prognosis of the individual SIT patient.

  16. Under-reporting of pertussis in Ontario: A Canadian Immunization Research Network (CIRN) study using capture-recapture

    PubMed Central

    Crowcroft, Natasha S.; Johnson, Caitlin; Chen, Cynthia; Li, Ye; Marchand-Austin, Alex; Bolotin, Shelly; Schwartz, Kevin; Deeks, Shelley L.; Jamieson, Frances; Drews, Steven; Russell, Margaret L.; Svenson, Lawrence W.; Simmonds, Kimberley; Mahmud, Salaheddin M.; Kwong, Jeffrey C.

    2018-01-01

    Introduction Under-reporting of pertussis cases is a longstanding challenge. We estimated the true number of pertussis cases in Ontario using multiple data sources, and evaluated the completeness of each source. Methods We linked data from multiple sources for the period 2009 to 2015: public health reportable disease surveillance data, public health laboratory data, and health administrative data (hospitalizations, emergency department visits, and physician office visits). To estimate the total number of pertussis cases in Ontario, we used a three-source capture-recapture analysis stratified by age (infants, or aged one year and older) and adjusted for dependency between sources. We used the Bayesian Information Criterion to compare models. Results Using probable and confirmed reported cases, laboratory data, and combined hospitalizations/emergency department visits, the estimated total number of cases during the six-year period amongst infants was 924, compared with 545 unique observed cases from all sources. Using the same sources, the estimated total for those aged 1 year and older was 12,883, compared with 3,304 observed cases from all sources. Only 37% of infants and 11% of those aged 1 year and over who were admitted to hospital or seen in an emergency department for pertussis were reported to public health. The sensitivity of public health reporting varied from 2% to 68% depending on the age group and the combination of data sources included. The sensitivity of combined hospitalizations and emergency department visits varied from 37% to 49%, and that of laboratory data from 1% to 50%. Conclusions All data sources contribute cases and are complementary, suggesting that the incidence of pertussis is substantially higher than routine reports suggest. The sensitivity of the different data sources varies. Better case identification is required to improve pertussis control in Ontario. PMID:29718945
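The study's three-source capture-recapture with dependency adjustment is more involved, but the underlying idea can be sketched with the classical two-source Chapman estimator, which assumes the sources list cases independently. The counts below are hypothetical, not the study's figures:

```python
# Two-source capture-recapture sketch (Chapman estimator): estimate the total
# case count from two overlapping case lists. Counts are hypothetical:
# n1 cases in surveillance, n2 in hospital records, m linked in both.
def chapman_estimate(n1, n2, m):
    """Nearly unbiased estimate of the total number of cases from two lists."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

n1, n2, m = 545, 340, 210
total = chapman_estimate(n1, n2, m)
print(f"estimated true cases: {total:.0f}")
print(f"observed in either source: {n1 + n2 - m}")
```

Because under-ascertainment makes the estimated total exceed the observed union, the gap between the two is the estimated number of cases missed by every source; the three-source log-linear approach in the study additionally models dependence between lists, which the Chapman estimator cannot.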

  17. Use of ultrasonic array method for positioning multiple partial discharge sources in transformer oil.

    PubMed

    Xie, Qing; Tao, Junhan; Wang, Yongqiang; Geng, Jianghai; Cheng, Shuyi; Lü, Fangcheng

    2014-08-01

    Fast and accurate positioning of partial discharge (PD) sources in transformer oil is very important for the safe, stable operation of power systems because it allows timely elimination of insulation faults. There is usually more than one PD source once an insulation fault occurs in the transformer oil. This study, which has both theoretical and practical significance, proposes a method of identifying multiple PD sources in transformer oil. The method combines the two-sided correlation transformation algorithm for broadband signal focusing with the modified Gerschgorin disk estimator. The multiple signal classification (MUSIC) method is then used to determine the directions of arrival of signals from the multiple PD sources. The ultrasonic array positioning method is based on multi-platform direction finding and global optimization search. Both the 4 × 4 square planar ultrasonic sensor array and the ultrasonic array detection platform are built to test the method of identifying and positioning multiple PD sources. The obtained results verify the validity and the engineering practicability of this method.

  18. Common source-multiple load vs. separate source-individual load photovoltaic system

    NASA Technical Reports Server (NTRS)

    Appelbaum, Joseph

    1989-01-01

    A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.

  19. Employment-based health insurance is failing: now what?

    PubMed

    Enthoven, Alain C

    2003-01-01

    Employment-based health insurance is failing. Costs are out of control. Employers have no effective strategy to deal with this. They must think strategically about fundamental change. This analysis explains how employers' purchasing policies contribute to rising costs and block growth of economical care. Single-source managed care is ineffective, and effective managed care cannot be a single source. Employers should create exchanges through which they can offer employees wide, responsible, individual, multiple choices among health care delivery systems and create serious competition based on value for money. Recently introduced technology can assist this process.

  20. Diagnosing turbulence for research aircraft safety using open source toolkits

    NASA Astrophysics Data System (ADS)

    Lang, T. J.; Guy, N.

    Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well prior to cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists turbulence estimates in near real time during an actual field campaign, and thus these toolkits are recommended for usage in future cloud-penetrating aircraft field campaigns.

  1. Modified two-sources quantum statistical model and multiplicity fluctuation in the finite rapidity region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, D.; Sarkar, S.; Sen, S.

    1995-06-01

    In this paper the behavior of factorial moments with rapidity window size, which is usually explained in terms of "intermittency," has been interpreted by simple quantum statistical properties of the emitting system using the concept of the "modified two-source model" as recently proposed by Ghosh and Sarkar [Phys. Lett. B 278, 465 (1992)]. The analysis has been performed using our own data on ¹⁶O-Ag/Br and ²⁴Mg-Ag/Br interactions in the few-tens-of-GeV energy regime.

  2. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    PubMed

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R, and integrating the results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets, such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that an entire analysis can be saved, shared, and reproduced. PIVOT will allow researchers from a broad range of backgrounds to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  3. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  4. Real-world persistence with fingolimod for the treatment of multiple sclerosis: A systematic review and meta-analysis.

    PubMed

    Kantor, Daniel; Johnson, Kristen; Vieira, Maria Cecilia; Signorovitch, James; Li, Nanxin; Gao, Wei; Koo, Valerie; Duchesneau, Emilie; Herrera, Vivian

    2018-05-15

    To systematically review reports of fingolimod persistence in the treatment of relapsing-remitting multiple sclerosis (RRMS) across data sources and practice settings, and to develop a consensus estimate of the 1-year real-world persistence rate. A systematic literature review was conducted (MEDLINE, EMBASE, and abstracts from selected conferences [2013-2015]) to identify observational studies reporting 1-year fingolimod persistence among adult patients with RRMS (sample size ≥50). A random-effects meta-analysis was performed to estimate a synthesized 1-year persistence rate and to assess heterogeneity across studies. Of 527 publications identified, 25 real-world studies reporting 1-year fingolimod persistence rates were included. The studies included patients from different data sources (e.g., administrative claims, electronic medical records, or registries), used different definitions of persistence (e.g., based on prescription refills, patient report, or prescription orders), and spanned multiple geographic regions. Reported 1-year persistence rates ranged from 72% to 100%, and exhibited statistical evidence of heterogeneity (I² = 93% of the variability due to heterogeneity across studies). The consensus estimate of the 1-year persistence rate was 82% (95% confidence interval: 79%-85%). Across the heterogeneous study designs and patient populations found in real-world studies, the consensus 1-year fingolimod persistence rate exceeded 80%, consistent with persistence rates identified in the recently completed trial, PREFERMS. Copyright © 2018. Published by Elsevier B.V.
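The abstract does not specify which random-effects estimator was used; a common choice is DerSimonian-Laird pooling, sketched below with invented study rates and binomial variances (not the 25 studies actually meta-analysed).

```python
def dl_meta_analysis(estimates, variances):
    """DerSimonian-Laird random-effects pooling.
    Returns (pooled estimate, I^2 heterogeneity in percent)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sw
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    C = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (Q - df) / C)          # between-study variance
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(wr, estimates)) / sum(wr)
    i2 = 100.0 * max(0.0, (Q - df) / Q) if Q > 0 else 0.0
    return pooled, i2

# Hypothetical 1-year persistence rates from four studies, with
# binomial variances p*(1-p)/n for assumed sample sizes n:
rates = [0.72, 0.80, 0.85, 0.92]
ns = [100, 200, 150, 120]
variances = [p * (1 - p) / n for p, n in zip(rates, ns)]
pooled, i2 = dl_meta_analysis(rates, variances)
```

The I² statistic returned here is the same quantity the abstract reports at 93%: the share of between-study variability attributable to heterogeneity rather than sampling error.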

  5. A Semantic Web Management Model for Integrative Biomedical Informatics

    PubMed Central

    Deus, Helena F.; Stanislaus, Romesh; Veiga, Diogo F.; Behrens, Carmen; Wistuba, Ignacio I.; Minna, John D.; Garner, Harold R.; Swisher, Stephen G.; Roth, Jack A.; Correa, Arlene M.; Broom, Bradley; Coombes, Kevin; Chang, Allen; Vogel, Lynn H.; Almeida, Jonas S.

    2008-01-01

    Background Data, data everywhere. The diversity and magnitude of the data generated in the Life Sciences defies automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high throughput molecular data. Methodology/Principal Findings The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical Knowledge Engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available as open source at www.s3db.org, was developed and its proposed design has been made publicly available as an open source instrument for shared, distributed data management. Conclusions/Significance The Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development we can expect that both general purpose productivity software and domain specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis. PMID:18698353

  6. Combining Multiple Rupture Models in Real-Time for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Wu, S.; Beck, J. L.; Heaton, T. H.

    2015-12-01

    The ShakeAlert earthquake early warning system for the west coast of the United States is designed to combine information from multiple independent earthquake analysis algorithms in order to provide the public with robust predictions of shaking intensity at each user's location before they are affected by strong shaking. The current contributing analyses come from algorithms that determine the origin time, epicenter, and magnitude of an earthquake (On-site, ElarmS, and Virtual Seismologist). A second generation of algorithms will provide seismic line source information (FinDer), as well as geodetically-constrained slip models (BEFORES, GPSlip, G-larmS, G-FAST). These new algorithms will provide more information about the spatial extent of the earthquake rupture and thus improve the quality of the resulting shaking forecasts. Each of the contributing algorithms exploits different features of the observed seismic and geodetic data, and thus each algorithm may perform differently for different data availability and earthquake source characteristics. Thus the ShakeAlert system requires a central mediator, called the Central Decision Module (CDM). The CDM acts to combine disparate earthquake source information into one unified shaking forecast. Here we will present a new design for the CDM that uses a Bayesian framework to combine earthquake reports from multiple analysis algorithms and compares them to observed shaking information in order to both assess the relative plausibility of each earthquake report and to create an improved unified shaking forecast complete with appropriate uncertainties. We will describe how these probabilistic shaking forecasts can be used to provide each user with a personalized decision-making tool that can help decide whether or not to take a protective action (such as opening fire house doors or stopping trains) based on that user's distance to the earthquake, vulnerability to shaking, false alarm tolerance, and time required to act.

  7. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the world, which necessitate scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered as a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectra analysis, parameters retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  8. Profiling Students' Multiple Source Use by Question Type

    ERIC Educational Resources Information Center

    List, Alexandra; Grossnickle, Emily M.; Alexander, Patricia A.

    2016-01-01

    The present study examined undergraduate students' multiple source use in response to two different types of academic questions, one discrete and one open-ended. Participants (N = 240) responded to two questions using a library of eight digital sources, varying in source type (e.g., newspaper article) and reliability (e.g., authors' credentials).…

  9. The Multiple Source Effect and Synthesized Speech: Doubly-Disembodied Language as a Conceptual Framework

    ERIC Educational Resources Information Center

    Lee, Kwan Min; Nass, Clifford

    2004-01-01

    Two experiments examine the effect of multiple synthetic voices in an e-commerce context. In Study 1, participants (N=40) heard five positive reviews about a book from five different synthetic voices or from a single synthetic voice. Consistent with the multiple source effect, results showed that participants hearing multiple synthetic voices…

  10. Whole Genome Sequence Typing to Investigate the Apophysomyces Outbreak following a Tornado in Joplin, Missouri, 2011

    PubMed Central

    Etienne, Kizee A.; Gillece, John; Hilsabeck, Remy; Schupp, Jim M.; Colman, Rebecca; Lockhart, Shawn R.; Gade, Lalitha; Thompson, Elizabeth H.; Sutton, Deanna A.; Neblett-Fanfair, Robyn; Park, Benjamin J.; Turabelidze, George; Keim, Paul; Brandt, Mary E.; Deak, Eszter; Engelthaler, David M.

    2012-01-01

    Case reports of Apophysomyces spp. in immunocompetent hosts have been a result of traumatic deep implantation of Apophysomyces spp. spore-contaminated soil or debris. On May 22, 2011 a tornado occurred in Joplin, MO, leaving 13 tornado victims with Apophysomyces trapeziformis infections as a result of lacerations from airborne material. We used whole genome sequence typing (WGST) for high-resolution phylogenetic SNP analysis of 17 outbreak Apophysomyces isolates and five additional temporally and spatially diverse Apophysomyces control isolates (three A. trapeziformis and two A. variabilis isolates). Whole genome SNP phylogenetic analysis revealed three clusters of genotypically related or identical A. trapeziformis isolates and multiple distinct isolates among the Joplin group; this indicated multiple genotypes from a single or multiple sources. Though no linkage between genotype and location of exposure was observed, WGST analysis determined that the Joplin isolates were more closely related to each other than to the control isolates, suggesting local population structure. Additionally, species delineation based on WGST demonstrated the need to reassess currently accepted taxonomic classifications of phylogenetic species within the genus Apophysomyces. PMID:23209631

  11. Whole genome sequence typing to investigate the Apophysomyces outbreak following a tornado in Joplin, Missouri, 2011.

    PubMed

    Etienne, Kizee A; Gillece, John; Hilsabeck, Remy; Schupp, Jim M; Colman, Rebecca; Lockhart, Shawn R; Gade, Lalitha; Thompson, Elizabeth H; Sutton, Deanna A; Neblett-Fanfair, Robyn; Park, Benjamin J; Turabelidze, George; Keim, Paul; Brandt, Mary E; Deak, Eszter; Engelthaler, David M

    2012-01-01

    Case reports of Apophysomyces spp. in immunocompetent hosts have been a result of traumatic deep implantation of Apophysomyces spp. spore-contaminated soil or debris. On May 22, 2011 a tornado occurred in Joplin, MO, leaving 13 tornado victims with Apophysomyces trapeziformis infections as a result of lacerations from airborne material. We used whole genome sequence typing (WGST) for high-resolution phylogenetic SNP analysis of 17 outbreak Apophysomyces isolates and five additional temporally and spatially diverse Apophysomyces control isolates (three A. trapeziformis and two A. variabilis isolates). Whole genome SNP phylogenetic analysis revealed three clusters of genotypically related or identical A. trapeziformis isolates and multiple distinct isolates among the Joplin group; this indicated multiple genotypes from a single or multiple sources. Though no linkage between genotype and location of exposure was observed, WGST analysis determined that the Joplin isolates were more closely related to each other than to the control isolates, suggesting local population structure. Additionally, species delineation based on WGST demonstrated the need to reassess currently accepted taxonomic classifications of phylogenetic species within the genus Apophysomyces.

  12. Spherical loudspeaker array for local active control of sound.

    PubMed

    Rafaely, Boaz

    2009-05-01

    Active control of sound has been employed to reduce noise levels around listeners' heads using destructive interference from noise-canceling sound sources. Recently, spherical loudspeaker arrays have been studied as multiple-channel sound sources, capable of generating sound fields with high complexity. In this paper, the potential use of a spherical loudspeaker array for local active control of sound is investigated. A theoretical analysis of the primary and secondary sound fields around a spherical sound source reveals that the natural quiet zones for the spherical source have a shell shape. Using numerical optimization, quiet zones with other shapes are designed, showing potential for quiet zones with extents that are significantly larger than the well-known limit of a tenth of a wavelength for monopole sources. The paper presents several simulation examples showing quiet zones in various configurations.

  13. Principal opium alkaloids as possible biochemical markers for the source identification of Indian opium.

    PubMed

    Mohana, Mudiam; Reddy, Krishna; Jayshanker, Gurumurthy; Suresh, Velayudhan; Sarin, Rajendra Kumar; Sashidhar, R B

    2005-08-01

    A total of 124 opium samples originating from different licit opium growing divisions of India were analyzed for their principal alkaloid (thebaine, codeine, morphine, papaverine, and narcotine) content by capillary zone electrophoresis (CZE) without derivatization or purification. The absence of papaverine in Bareilly, Tilhar, and most of the samples originating from Kota is a significant observation in relation to the source of Indian opium. Multiple discriminant analysis was applied to the quantitative principal alkaloid data to determine an optimal classifier in order to evaluate the source of Indian opium. The predictive value based on the discriminant analysis was found to be 85% in relation to the source of opium, and the study also revealed that all the principal alkaloids have to be analyzed for source identification of Indian opium. Chemometrics performed with the principal alkaloid analytical data was used successfully in discriminating the licit opium growing divisions of India into three major groups, viz., groups I, II, and III. The methodology developed may find wide forensic application in identifying the source of licit or illicit opium originating from India, and in differentiating it from opium originating from other opium producing countries.
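Multiple discriminant analysis proper requires an eigen-solver; as a deliberately simplified stand-in, the sketch below assigns a five-alkaloid profile to the nearest class centroid, which conveys the same group-assignment idea. All profiles and group labels here are invented, not the study's data.

```python
def centroid(rows):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(train, sample):
    """Assign `sample` to the class whose centroid is nearest (Euclidean).
    `train` maps class label -> list of alkaloid profiles ordered as
    (thebaine, codeine, morphine, papaverine, narcotine)."""
    best, best_d = None, float("inf")
    for label, rows in train.items():
        c = centroid(rows)
        d = sum((a - b) ** 2 for a, b in zip(sample, c)) ** 0.5
        if d < best_d:
            best, best_d = label, d
    return best

# Invented alkaloid percentages for two hypothetical growing divisions.
train = {
    "group_I":  [[0.8, 2.1, 11.0, 0.9, 5.5], [0.9, 2.3, 10.5, 1.0, 5.8]],
    "group_II": [[1.6, 3.0, 14.2, 0.0, 4.1], [1.5, 2.8, 14.8, 0.0, 4.4]],
}
label = classify(train, [1.5, 2.9, 14.5, 0.0, 4.2])  # near group_II centroid
```

Note how a zero papaverine value, as reported for Bareilly, Tilhar, and most Kota samples, naturally pulls a sample toward the group whose centroid also lacks papaverine.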

  14. X-ray imaging using avalanche multiplication in amorphous selenium: investigation of intrinsic avalanche noise.

    PubMed

    Hunt, D C; Tanioka, Kenkichi; Rowlands, J A

    2007-12-01

    The flat-panel detector (FPD) is the state-of-the-art detector for digital radiography. The FPD can acquire images in real-time, has superior spatial resolution, and is free of the problems of x-ray image intensifiers: veiling glare, pin-cushion and magnetic distortion. However, FPDs suffer from poor signal-to-noise ratio performance at typical fluoroscopic exposure rates, where the quantum noise is reduced to the point that it becomes comparable to the fixed electronic noise. It has been shown previously that avalanche multiplication gain in amorphous selenium (a-Se) can provide the necessary amplification to overcome the electronic noise of the FPD. Avalanche multiplication, however, comes with its own intrinsic contribution to the noise in the form of gain fluctuation noise. In this article a cascaded systems analysis is used to present a modified metric related to the detective quantum efficiency. The modified metric is used to study a diagnostic x-ray imaging system in the presence of intrinsic avalanche multiplication noise independently from other noise sources, such as electronic noise. An indirect conversion imaging system is considered to make the study independent of other avalanche multiplication related noise sources, such as the fluctuations arising from the depth of x-ray absorption. In this case all the avalanche events are initiated at the surface of the avalanche layer, and there are no fluctuations in the depth of absorption. Experiments on an indirect conversion x-ray imaging system using avalanche multiplication in a layer of a-Se are also presented. The cascaded systems analysis shows that the intrinsic noise of avalanche multiplication will not have any deleterious influence on detector performance at zero spatial frequency in x-ray imaging provided the product of conversion gain, coupling efficiency, and optical quantum efficiency is much greater than a factor of 2. The experimental results show that avalanche multiplication in a-Se behaves as intrinsically noise-free avalanche multiplication, in accordance with our theory. Provided good coupling efficiency and high optical quantum efficiency are maintained, avalanche multiplication in a-Se has the potential to increase the gain and make a negligible contribution to the noise, thereby improving the performance of indirect FPDs in fluoroscopy.

  15. X-ray imaging using avalanche multiplication in amorphous selenium: Investigation of intrinsic avalanche noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, D. C.; Tanioka, Kenkichi; Rowlands, J. A.

    2007-12-15

    The flat-panel detector (FPD) is the state-of-the-art detector for digital radiography. The FPD can acquire images in real-time, has superior spatial resolution, and is free of the problems of x-ray image intensifiers: veiling glare, pin-cushion and magnetic distortion. However, FPDs suffer from poor signal-to-noise ratio performance at typical fluoroscopic exposure rates, where the quantum noise is reduced to the point that it becomes comparable to the fixed electronic noise. It has been shown previously that avalanche multiplication gain in amorphous selenium (a-Se) can provide the necessary amplification to overcome the electronic noise of the FPD. Avalanche multiplication, however, comes with its own intrinsic contribution to the noise in the form of gain fluctuation noise. In this article a cascaded systems analysis is used to present a modified metric related to the detective quantum efficiency. The modified metric is used to study a diagnostic x-ray imaging system in the presence of intrinsic avalanche multiplication noise independently from other noise sources, such as electronic noise. An indirect conversion imaging system is considered to make the study independent of other avalanche multiplication related noise sources, such as the fluctuations arising from the depth of x-ray absorption. In this case all the avalanche events are initiated at the surface of the avalanche layer, and there are no fluctuations in the depth of absorption. Experiments on an indirect conversion x-ray imaging system using avalanche multiplication in a layer of a-Se are also presented. The cascaded systems analysis shows that the intrinsic noise of avalanche multiplication will not have any deleterious influence on detector performance at zero spatial frequency in x-ray imaging provided the product of conversion gain, coupling efficiency, and optical quantum efficiency is much greater than a factor of 2. The experimental results show that avalanche multiplication in a-Se behaves as intrinsically noise-free avalanche multiplication, in accordance with our theory. Provided good coupling efficiency and high optical quantum efficiency are maintained, avalanche multiplication in a-Se has the potential to increase the gain and make a negligible contribution to the noise, thereby improving the performance of indirect FPDs in fluoroscopy.

  16. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  17. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike trains (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input and the direct functional connection of two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, the visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections, such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying the functional connectivity of multiple spike trains. This method can identify accurately all the direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
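The ACG's first step, the pairwise CCF with its peak and time delay, can be sketched for binned spike trains as below; this is a minimal illustration, not the full ACG pipeline, and the toy trains are constructed so that the second simply repeats the first's spikes two bins later.

```python
def cross_correlogram(a, b, max_lag):
    """Cross-correlation counts between two binned spike trains, for
    lags in [-max_lag, max_lag]: ccf[lag] = sum over t of a[t] * b[t + lag]."""
    n = len(a)
    ccf = {}
    for lag in range(-max_lag, max_lag + 1):
        total = 0
        for t in range(n):
            u = t + lag
            if 0 <= u < n:  # count only coincidences inside the record
                total += a[t] * b[u]
        ccf[lag] = total
    return ccf

# Toy binned trains: b fires two bins after each spike in a.
a = [0, 1, 0, 0, 1, 0, 1, 0, 0, 0]
b = [0, 0, 0, 1, 0, 0, 1, 0, 1, 0]
ccf = cross_correlogram(a, b, max_lag=4)
peak_lag = max(ccf, key=ccf.get)  # lag of the correlogram peak
```

A peak at a positive lag suggests influence from the first train toward the second with that delay; the ACG then combines peak significance and delay across all pairs to separate direct from common-source and indirect connections.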

  18. Source facies and oil families of the Malay Basin, Malaysia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Creaney, S.; Hussein, A.H.; Curry, D.J.

    1994-07-01

    The Malay Basin consists of a number of separate petroleum systems, driven exclusively by nonmarine source rocks. These systems range from lower Oligocene to middle Miocene and show a progression from lacustrine-dominated source facies in the lower Oligocene to lower Miocene section to coastal plain/delta plain coal-related sources in the lower to middle Miocene section. Two lacustrine sources are recognized in the older section, and multiple source/reservoir pairs are recognized in the younger coaly section. The lacustrine sources can be recognized using well-log analysis combined with detailed core and sidewall core sampling. Chemically, they are characterized by low pristane/phytane ratios, low oleanane contents, and a general absence of resin-derived terpanes. These sources have TOCs in the 1.0-4.0% range and hydrogen indices of up to 750. In contrast, the coal-related sources are chemically distinct, with pristane/phytane ratios of up to 8, very high oleanane contents, and often abundant resinous compounds. All these sources are generally overmature in the basin center and immature toward the basin margin. The oils sourced from all sources in the Malay Basin are generally low in sulfur and of very high economic value. Detailed biomarker analysis of the oils in the Malay Basin has allowed the recognition of families associated with the above sources and demonstrated that oil migration has been largely strata parallel with little cross-stratal mixing of families.

  19. Primary sources and toxicity of PAHs in Milwaukee-area streambed sediment

    USGS Publications Warehouse

    Baldwin, Austin K.; Corsi, Steven R.; Lutz, Michelle A.; Ingersoll, Christopher G.; Dorman, Rebecca A.; Magruder, Christopher; Magruder, Matthew

    2017-01-01

    High concentrations of polycyclic aromatic hydrocarbons (PAHs) in streams can be a significant stressor to aquatic organisms. To understand the likely sources and toxicity of PAHs in Milwaukee-area streams, streambed sediment samples from 40 sites and parking lot dust samples from 6 sites were analyzed for 38 parent PAHs and 25 alkylated PAHs. Diagnostic ratios, profile correlations, principal components analysis, source-receptor modeling, and mass fractions analysis were used to identify potential PAH sources to streambed sediment samples, and land-use analysis was used to relate streambed sediment PAH concentrations to different urban-related land uses. On the basis of this multiple lines-of-evidence approach, coal-tar pavement sealant was indicated as the primary source of PAHs in a majority of streambed sediment samples, contributing an estimated 77% of total PAHs to samples, on average. Comparison to the Probable Effect Concentrations and (or) the Equilibrium Partitioning Sediment Benchmark indicates that 78% of stream sediment samples are likely to cause adverse effects to benthic organisms. Laboratory toxicity tests on a 16-sample subset of the streambed sites using the amphipod Hyalella azteca (28-day) and the midge Chironomus dilutus (10-day) measured significant reductions in one or more biological endpoints, including survival, in 75% of samples, with H. azteca more responsive than C. dilutus.
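
    The diagnostic-ratio line of evidence can be illustrated with the fluoranthene/(fluoranthene + pyrene) ratio. The thresholds below are commonly cited in the PAH source-apportionment literature and are used here only as an assumed example, not the study's actual classification scheme.

```python
def flt_pyr_ratio(fluoranthene, pyrene):
    """Diagnostic ratio Flt/(Flt + Pyr) from two measured concentrations."""
    return fluoranthene / (fluoranthene + pyrene)

def classify_source(ratio):
    """Map the ratio to a likely PAH source class.

    Thresholds are assumed for illustration (commonly cited values):
    < 0.4 petrogenic; 0.4-0.5 petroleum combustion; > 0.5 combustion
    of grass, wood, or coal.
    """
    if ratio < 0.4:
        return "petrogenic"
    if ratio <= 0.5:
        return "petroleum combustion"
    return "combustion (grass/wood/coal)"
```

    In practice such ratios are only one line of evidence, which is why the study combines them with profile correlations, PCA, and source-receptor modeling.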

  20. Mapping of chlorophyll a distributions in coastal zones

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1978-01-01

    It is pointed out that chlorophyll a is an important environmental parameter for monitoring water quality, nutrient loads, and pollution effects in coastal zones. High chlorophyll a concentrations occur in areas which have high nutrient inflows from sources such as sewage treatment plants and industrial wastes. Low chlorophyll a concentrations may be due to the addition of toxic substances from industrial wastes or other sources. Remote sensing provides an opportunity to assess distributions of water quality parameters, such as chlorophyll a. A description is presented of the chlorophyll a analysis and a quantitative mapping of the James River, Virginia. An approach considered by Johnson (1977) was used in the analysis. An application of the multiple regression analysis technique to a data set collected over the New York Bight, an environmentally different area of the coastal zone, is also discussed.
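
    A minimal sketch of the multiple-regression mapping idea, using synthetic radiances in three hypothetical spectral bands: chlorophyll a is regressed on band radiances by least squares, and the fitted coefficients can then be applied pixel by pixel to map concentrations. The band count and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
bands = rng.random((n, 3))                 # radiances in three hypothetical bands
true_coef = np.array([2.0, -1.0, 0.5])     # assumed band weights
chl = 4.0 + bands @ true_coef + rng.normal(0, 0.01, n)  # "measured" chlorophyll a

# least-squares fit with an intercept column
X = np.column_stack([np.ones(n), bands])
coef, *_ = np.linalg.lstsq(X, chl, rcond=None)
# coef[0] is the intercept, coef[1:] the band weights; applying them to
# every pixel's radiances yields a quantitative chlorophyll a map
```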

  1. WATER QUALITY IN SOURCE WATER, TREATMENT, AND DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Most drinking water utilities practice the multiple-barrier concept as the guiding principle for providing safe water. This chapter discusses multiple barriers as they relate to the basic criteria for selecting and protecting source waters, including known and potential sources ...

  2. iGC-an integrated analysis package of gene expression and copy number alteration.

    PubMed

    Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y

    2017-01-14

    With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package, iGC. Multiple input formats are supported, and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients between their gene expression levels and copy numbers than other genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide a better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html.
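
    The correlation check described above amounts to a Pearson coefficient between a gene's expression vector and its copy-number vector across the same samples. The sketch below is a plain illustration of that statistic, not the iGC package's actual code; the sample values are invented.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation between paired vectors (e.g., one gene's
    expression and copy number measured across the same samples)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(xm @ ym / np.sqrt((xm @ xm) * (ym @ ym)))

# a CNA-driven gene: expression closely tracks copy number across samples
r_driven = pearson([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.0])
```

    CNA-driven genes would be expected to show coefficients near 1, while genes merely located inside an altered region would show weaker correlations.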

  3. Spatial assessment of air quality patterns in Malaysia using multivariate analysis

    NASA Astrophysics Data System (ADS)

    Dominick, Doreena; Juahir, Hafizan; Latif, Mohd Talib; Zain, Sharifuddin M.; Aris, Ahmad Zaharin

    2012-12-01

    This study aims to investigate possible sources of air pollutants and the spatial patterns within the eight selected Malaysian air monitoring stations based on a two-year database (2008-2009). Multivariate analysis was applied to the dataset. It incorporated Hierarchical Agglomerative Cluster Analysis (HACA) to assess the spatial patterns, Principal Component Analysis (PCA) to determine the major sources of the air pollution and Multiple Linear Regression (MLR) to assess the percentage contribution of each air pollutant. The HACA results grouped the eight monitoring stations into three different clusters, based on the characteristics of the air pollutants and meteorological parameters. The PCA results showed that the major sources of air pollution were emissions from motor vehicles, aircraft, industries and areas of high population density. The MLR analysis demonstrated that the main pollutant contributing to variability in the Air Pollutant Index (API) at all stations was particulate matter with a diameter of less than 10 μm (PM10). Further MLR analysis showed that the main air pollutant influencing the high concentration of PM10 was carbon monoxide (CO). This was due to combustion processes, particularly those originating from motor vehicles. Meteorological factors such as ambient temperature, wind speed and humidity were also noted to influence the concentration of PM10.
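
    The PCA step can be sketched as follows, assuming a standardized pollutant matrix. With simulated data in which PM10 and CO share a common combustion factor (as the study's MLR results suggest), the first component captures most of the variance. The variable names and factor structure are hypothetical.

```python
import numpy as np

def pca_explained(data):
    """Explained-variance ratios from PCA via SVD of standardized data."""
    z = (data - data.mean(axis=0)) / data.std(axis=0)
    s = np.linalg.svd(z, compute_uv=False)
    return s**2 / (s**2).sum()

rng = np.random.default_rng(0)
combustion = rng.normal(size=500)                 # shared (traffic) factor
pm10 = combustion + 0.1 * rng.normal(size=500)    # PM10 loads on the factor
co = combustion + 0.1 * rng.normal(size=500)      # CO loads on the same factor
o3 = rng.normal(size=500)                         # an unrelated pollutant
ratios = pca_explained(np.column_stack([pm10, co, o3]))
# ratios[0] is large because one "source" drives two of the three pollutants
```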

  4. High Hopes--Few Opportunities: The Status of Elementary Science Education in California. Summary Report & Recommendations. Strengthening Science Education in California

    ERIC Educational Resources Information Center

    Center for the Future of Teaching and Learning at WestEd, 2011

    2011-01-01

    This report summarizes research findings on science education in California's elementary schools from multiple sources of data collected during 2010-11, specifically, surveys of district administrators, elementary school principals, and elementary school teachers; case studies of elementary schools; analysis of statewide secondary data sets; and…

  5. Islamic Education and Individual Requirements in Interaction and Media Use

    ERIC Educational Resources Information Center

    Khashab, Hamdollah Karimi; Vaezi, Seyed Hossein; Golestani, Seyed Hashem; Taghipour, Faezeh

    2016-01-01

    This article aims to analyze the views and teachings of Islam and the Islamic religion in order to determine the requirements of interaction and media use. This article is of qualitative kind and content analysis approach and has done based on the study of Islamic texts and sources associated with the media. Because of the multiplicity and…

  6. Energy Security of Army Installations and Islanding Methodologies: A Multiple Criteria Decision Aid to Innovation with Emergent Conditions of the Energy Environment

    DTIC Science & Technology

    2010-06-16

    Risk analysis: Clemen and Reilly (2001); Haimes (2009); Kaplan et al. (2001); Lowrance (1976); Kaplan and Garrick (1981). Source: The US Army Energy... Solar collectors collect solar energy and convert it to heat (NREL presentation); wind turbines capture energy in the wind and convert it into electricity (NREL).

  7. Milwaukee Area Technical College's Fiscal Condition: Growing Demand, Shrinking Resources. An Independent Third-Party Analysis

    ERIC Educational Resources Information Center

    Day, Douglass; Allen, Vanessa; Henken, Rob

    2010-01-01

    The Milwaukee Area Technical College (MATC) is one of the largest local taxpayer-funded entities in southeastern Wisconsin, ranking fourth in assets and budget behind Milwaukee County, the Milwaukee Public Schools, and the City of Milwaukee. The college's fiscal operations are complex and draw on multiple revenue sources, including nearly $150…

  8. Action Research as Primary Vehicle for Inquiry in the Professional Development School

    ERIC Educational Resources Information Center

    Tunks, Jeanne L.

    2011-01-01

    This Yearbook chapter, a compilation of multiple sources, presents both the history of action research and an analysis of reported action research in the professional development school (PDS) between 1992 and 2010. The history begins prior to the inception of the PDS and provides a theoretical premise for action research in the PDS in subsequent…

  9. Multiple Pathways to College: A Secondary Analysis of the 2004 College Applicant Survey. ACAATO Document

    ERIC Educational Resources Information Center

    Colleges Ontario, 2005

    2005-01-01

    The 2004 College Applicant Survey (CAS) describes the college-bound applicant pool by covering a broad range of areas including key demographics, factors influencing college selection, academic background and financial preparedness. It is the most comprehensive and the richest source of survey data to date on applicants to the Ontario Colleges of…

  10. Using Additional Analyses to Clarify the Functions of Problem Behavior: An Analysis of Two Cases

    ERIC Educational Resources Information Center

    Payne, Steven W.; Dozier, Claudia L.; Neidert, Pamela L.; Jowett, Erica S.; Newquist, Matthew H.

    2014-01-01

    Functional analyses (FA) have proven useful for identifying contingencies that influence problem behavior. Research has shown that some problem behavior may only occur in specific contexts or be influenced by multiple or idiosyncratic variables. When these contexts or sources of influence are not assessed in an FA, further assessment may be…

  11. Making Sense of Assessment Feedback in Higher Education

    ERIC Educational Resources Information Center

    Evans, Carol

    2013-01-01

    This article presents a thematic analysis of the research evidence on assessment feedback in higher education (HE) from 2000 to 2012. The focus of the review is on the feedback that students receive within their coursework from multiple sources. The aims of this study are to (a) examine the nature of assessment feedback in HE through the…

  12. Public voices in pharmaceutical deliberations: negotiating "clinical benefit" in the FDA's Avastin Hearing.

    PubMed

    Teston, Christa B; Graham, S Scott; Baldwinson, Raquel; Li, Andria; Swift, Jessamyn

    2014-06-01

    This article offers a hybrid rhetorical-qualitative discourse analysis of the FDA's 2011 Avastin Hearing, which considered the revocation of the breast cancer indication for the popular cancer drug Avastin. We explore the multiplicity of stakeholders, the questions that motivated deliberations, and the kinds of evidence presented during the hearing. Pairing our findings with contemporary scholarship in rhetorical stasis theory, Mol's (2002) construct of multiple ontologies, and Callon, Lascoumes, and Barthe's (2011) "hybrid forums," we demonstrate that the FDA's deliberative procedures elide various sources of evidence and the potential multiplicity of definitions of "clinical benefit." Our findings suggest that while the FDA invited multiple stakeholders to offer testimony, there are ways that the FDA might have more meaningfully incorporated public voices in the deliberative process. We conclude with suggestions for how a true hybrid forum might be deployed.

  13. The discrimination of man-made explosions from earthquakes using seismo-acoustic analysis in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Che, Il-Young; Jeon, Jeong-Soo

    2010-05-01

    Korea Institute of Geoscience and Mineral Resources (KIGAM) operates an infrasound network consisting of seven seismo-acoustic arrays in South Korea. Development of the arrays began in 1999, partially in collaboration with Southern Methodist University, with the goal of detecting distant infrasound signals from natural and anthropogenic phenomena in and around the Korean Peninsula. The main operational purpose of this network is to discriminate man-made seismic events from the regional seismicity, which includes thousands of seismic events per year. Man-made seismic events are a major source of error in estimating natural seismicity, especially where seismic activity is weak or moderate, as in the Korean Peninsula. In order to discriminate man-made explosions from earthquakes, we have applied seismo-acoustic analysis, associating the seismic and infrasonic signals generated by surface explosions. Observations of infrasound at multiple arrays make it possible to discriminate surface explosions, because small or moderate earthquakes generally do not generate detectable infrasound. To date we have annually identified hundreds of seismic events in the seismological catalog as surface explosions by this seismo-acoustic analysis. Besides surface explosions, the network has also detected infrasound signals from other sources, such as bolides, typhoons, rocket launches, and an underground nuclear test in and around the Korean Peninsula. In this study, ten years of seismo-acoustic data are reviewed with a recent infrasonic detection algorithm and an association method that is linked to the seismic monitoring system of KIGAM to increase the detection rate of surface explosions. We present the long-term results of the seismo-acoustic analysis, the detection capability of the multiple arrays, and implications for seismic source location. Since seismo-acoustic analysis has proven to be a reliable method for discriminating surface explosions, it will continue to be used for estimating natural seismicity and understanding infrasonic sources.

  14. Multilevel linear modelling of the response-contingent learning of young children with significant developmental delays.

    PubMed

    Raab, Melinda; Dunst, Carl J; Hamby, Deborah W

    2018-02-27

    The purpose of the study was to isolate the sources of variation in the rates of response-contingent learning among young children with multiple disabilities and significant developmental delays randomly assigned to contrasting types of early childhood intervention. Multilevel, hierarchical linear growth curve modelling was used to analyze four different measures of child response-contingent learning where repeated child learning measures were nested within individual children (Level-1), children were nested within practitioners (Level-2), and practitioners were nested within the contrasting types of intervention (Level-3). Findings showed that sources of variation in rates of child response-contingent learning were associated almost entirely with type of intervention after the variance associated with differences in practitioners nested within groups was accounted for. Rates of child learning were greater among children whose existing behaviours were used as the building blocks for promoting child competence (asset-based practices) compared to children for whom the focus of intervention was promoting the acquisition of missing skills (needs-based practices). The methods of analysis illustrate a practical approach to clustered data analysis and the presentation of results in ways that highlight sources of variation in the rates of response-contingent learning among young children with multiple developmental disabilities and significant developmental delays. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  15. The Effect of DEM Source and Grid Size on the Index of Connectivity in Savanna Catchments

    NASA Astrophysics Data System (ADS)

    Jarihani, Ben; Sidle, Roy; Bartley, Rebecca; Roth, Christian

    2017-04-01

    The term "hydrological connectivity" is increasingly used instead of sediment delivery ratio to describe the linkage between the sources of water and sediment within a catchment and the catchment outlet. The sediment delivery ratio is an empirical parameter that is highly site-specific and tends to lump all processes together, whilst hydrological connectivity focuses on the spatially explicit hydrologic drivers of surficial processes. Detailed topographic information plays a fundamental role in geomorphological interpretations as well as in quantitative modelling of sediment fluxes and connectivity. Geomorphometric analysis permits a detailed characterization of drainage area and drainage pattern together with the possibility of characterizing surface roughness. High-resolution topographic data (i.e., LiDAR) are not available for all areas; however, remotely sensed topographic data from multiple sources with different grid sizes are used to undertake geomorphologic analysis in data-sparse regions. The Index of Connectivity (IC), a geomorphometric model based only on DEM data, is applied in two small savanna catchments in Queensland, Australia. The influence of the scale of the topographic data is explored by using DEMs from LiDAR (~1 m), WorldDEM (~10 m), and raw SRTM and hydrologically corrected SRTM-derived data (~30 m) to calculate the index of connectivity. The effect of the grid size is also investigated by resampling the high-resolution LiDAR DEM to multiple grid sizes (e.g., 5, 10, 20 m) and comparing the extracted IC.
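
    The grid-size comparison described above requires aggregating the fine LiDAR DEM to coarser cells. A simple block-mean resampler is sketched below as an assumed illustration; actual IC workflows would typically use GIS resampling tools.

```python
import numpy as np

def resample_mean(dem, factor):
    """Aggregate a fine DEM to a coarser grid by averaging square blocks.

    `factor` is the ratio of coarse to fine cell size (e.g., 5 to go
    from a 1 m LiDAR grid to a 5 m grid); trailing rows/columns that do
    not fill a whole block are dropped.
    """
    r, c = dem.shape
    r2, c2 = r // factor * factor, c // factor * factor
    blocks = dem[:r2, :c2].reshape(r2 // factor, factor, c2 // factor, factor)
    return blocks.mean(axis=(1, 3))
```

    The resampled surfaces can then each be fed to the IC calculation to quantify the sensitivity of connectivity estimates to grid size.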

  16. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high-throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.

  17. Integration of Multiple Data Sources to Simulate the Dynamics of Land Systems

    PubMed Central

    Deng, Xiangzheng; Su, Hongbo; Zhan, Jinyan

    2008-01-01

    In this paper we present and develop a new model, which we have called Dynamics of Land Systems (DLS). The DLS model is capable of integrating multiple data sources to simulate the dynamics of a land system. Three main modules are incorporated in DLS: a spatial regression module, to explore the relationship between land uses and influencing factors, a scenario analysis module of the land uses of a region during the simulation period and a spatial disaggregation module, to allocate land use changes from a regional level to disaggregated grid cells. A case study on Taips County in North China is incorporated in this paper to test the functionality of DLS. The simulation results under the baseline, economic priority and environmental scenarios help to understand the land system dynamics and project near future land-use trajectories of a region, in order to focus management decisions on land uses and land use planning. PMID:27879726
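
    The spatial disaggregation module's core idea, allocating a regional land-use change total to grid cells, can be sketched as a proportional allocation by cell suitability. This is an assumed simplification for illustration, not the DLS implementation.

```python
import numpy as np

def disaggregate(regional_change, suitability):
    """Allocate a regional land-use change total to grid cells in
    proportion to non-negative suitability weights (e.g., scores from
    a spatial regression of land use on influencing factors)."""
    w = np.clip(np.asarray(suitability, float), 0.0, None)
    return regional_change * w / w.sum()
```

    By construction the cell allocations sum exactly to the regional total, which is the consistency constraint a disaggregation module must preserve.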

  18. Establishing a link between vehicular PM sources and PM measurements in urban street canyons.

    PubMed

    Eisner, Alfred D; Richmond-Bryant, Jennifer; Wiener, Russell W; Hahn, Intaek; Drake-Richman, Zora E; Ellenson, William D

    2009-12-01

    The Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study, conducted in Brooklyn, NY, USA, in 2005, was designed with multiple goals in mind, two of which were contaminant source characterization and street canyon transport and dispersion monitoring. In the portion of the study described here, synchronized wind velocity and azimuth as well as particulate matter (PM) concentrations at multiple locations along 33rd Street were used to determine the feasibility of using traffic emissions in a complex urban topography as a sole tracer for studying urban contaminant transport. We demonstrate in this paper that it is possible to link downwind concentrations of contaminants in an urban street canyon to the vehicular traffic cycle using Eigen-frequency analysis. In addition, multivariable circular histograms are used to establish directional frequency maxima for wind velocity and contaminant concentration.

  19. Simulation and analysis of support hardware for multiple instruction rollback

    NASA Technical Reports Server (NTRS)

    Alewine, Neil J.

    1992-01-01

    Recently, a compiler-assisted approach to multiple instruction retry was developed. In this scheme, a read buffer of size 2N, where N represents the maximum instruction rollback distance, is used to resolve one type of data hazard. This hardware support helps to reduce code growth, compilation time, and some of the performance impacts associated with hazard resolution. The 2N read buffer size requirement of the compiler-assisted approach is worst case, ensuring data redundancy for all data required but also providing some unnecessary redundancy. By adding extra bits in the operand field for source 1 and source 2, it becomes possible to design the read buffer to save only those values required, thus reducing the read buffer size requirement. This study measures the effect on performance of a DECstation 3100 running 10 application programs using 6 read buffer configurations at varying read buffer sizes.
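
    The read-buffer idea, saving only the values actually overwritten within the rollback distance N, can be modelled in software. The sketch below is a behavioural analogy (a bounded log of old register values), not the paper's hardware design.

```python
class RollbackBuffer:
    """Log overwritten register values to support rollback of up to n
    instructions (a software analogy of the read buffer)."""

    def __init__(self, n):
        self.n = n
        self.log = []          # one (register, old value) entry per write

    def write(self, regs, reg, value):
        """Perform a register write, saving the overwritten value first."""
        self.log.append((reg, regs.get(reg)))
        if len(self.log) > self.n:
            self.log.pop(0)    # beyond the rollback distance, drop old values
        regs[reg] = value

    def rollback(self, regs, k):
        """Undo the last k register writes, most recent first."""
        assert k <= len(self.log)
        for _ in range(k):
            reg, old = self.log.pop()
            if old is None:
                del regs[reg]  # the register did not exist before the write
            else:
                regs[reg] = old
```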

  20. Resolving a z ~ 2 galaxy using adaptive coadded source-plane reconstruction

    NASA Astrophysics Data System (ADS)

    Sharma, Soniya; Richard, Johan; Kewley, Lisa; Yuan, Tiantian

    2018-06-01

    Natural magnification provided by gravitational lensing, coupled with integral field spectrographic (IFS) observations and adaptive optics (AO) imaging techniques, has become the frontier of spatially resolved studies of high-redshift galaxies (z > 1). Mass models of gravitational lenses hold the key to understanding the spatially resolved source-plane (unlensed) physical properties of the background lensed galaxies. Lensing mass models very sensitively control the accuracy and precision of source-plane reconstructions of the observed lensed arcs. The effective source-plane resolution defined by the image-plane (observed) point spread function (PSF) makes it challenging to recover the unlensed (source-plane) surface brightness distribution. We conduct a detailed study to recover the source-plane physical properties of a z = 2 lensed galaxy using spatially resolved observations from two different multiple images of the lensed target. To deal with PSFs from two data sets on different multiple images of the galaxy, we employ a forward (source-to-image) approach to merge these independent observations. Using our novel technique, we are able to present a detailed analysis of the source-plane dynamics at scales much better than previously attainable through traditional image inversion methods. Moreover, our technique is adapted to magnification, thus allowing us to achieve higher resolution in highly magnified regions of the source. We find that this lensed system shows strong evidence of a minor merger. In my talk, I present this case study of a z = 2 lensed galaxy and also discuss the applications of our algorithm to the study of a plethora of lensed systems, which will become available through future telescopes like JWST and GMT.

  1. Einstein Observatory survey of X-ray emission from solar-type stars - The late F and G dwarf stars

    NASA Technical Reports Server (NTRS)

    Maggio, A.; Sciortino, S.; Vaiana, G. S.; Majer, P.; Bookbinder, J.

    1987-01-01

    Results of a volume-limited X-ray survey of stars of luminosity classes IV and V in the spectral range F7-G9 observed with the Einstein Observatory are presented. Using survival analysis techniques, the stellar X-ray luminosity function in the 0.15-4.0 keV energy band is constructed for both single and multiple sources. It is shown that the difference in X-ray luminosity between these two classes of sources is consistent with the superposition of individual components in multiple-component systems, whose X-ray properties are similar to those of the single-component sources. The X-ray emission of the stars in our sample is well correlated with their chromospheric Ca II H-K line emission and with their projected equatorial rotational velocity. Comparison of the X-ray luminosity function constructed for the sample of the dG stars of the local population with the corresponding functions derived elsewhere for the Hyades, the Pleiades, and the Orion Ic open cluster confirms that the level of X-ray emission decreases with stellar age.

  2. A Systems Biology Approach for Identifying Hepatotoxicant Groups Based on Similarity in Mechanisms of Action and Chemical Structure.

    PubMed

    Hebels, Dennie G A J; Rasche, Axel; Herwig, Ralf; van Westen, Gerard J P; Jennen, Danyel G J; Kleinjans, Jos C S

    2016-01-01

    When evaluating compound similarity, addressing multiple sources of information to reach conclusions about common pharmaceutical and/or toxicological mechanisms of action is a crucial strategy. In this chapter, we describe a systems biology approach that incorporates analyses of hepatotoxicant data for 33 compounds from three different sources: a chemical structure similarity analysis based on the 3D Tanimoto coefficient, a chemical structure-based protein target prediction analysis, and a cross-study/cross-platform meta-analysis of in vitro and in vivo human and rat transcriptomics data derived from public resources (i.e., the diXa data warehouse). Hierarchical clustering of the outcome scores of the separate analyses did not result in a satisfactory grouping of compounds considering their known toxic mechanisms as described in the literature. However, a combined analysis of multiple data types may hypothetically compensate for missing or unreliable information in any of the single data types. We therefore performed an integrated clustering analysis of all three data sets using the R-based tool iClusterPlus. This indeed improved the grouping results. The compound clusters that were formed by means of iClusterPlus represent groups that show similar gene expression while simultaneously integrating a similarity in structure and protein targets, which corresponds much better with the known mechanisms of action of these toxicants. Using an integrative systems biology approach may thus overcome the limitations of the separate analyses when grouping liver toxicants sharing a similar mechanism of toxicity.

  3. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

    Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of the noise multiple-harmonic. The suppression of such noise has been a focus of interest in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with a feed-forward structure is proposed based on reference amplitude rectification and the conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. It can be seen from the numerical studies that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with a unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulations follow a procedure similar to real-life control and can be easily extended to a physical model platform.
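
    A minimal feedforward sketch of multi-harmonic cancellation, using per-harmonic sine/cosine reference signals and a plain LMS weight update: each harmonic of the disturbance is suppressed by its own pair of adaptive weights. This ignores the secondary path and the paper's CGM and amplitude-rectification details, so it only illustrates the general principle under assumed parameters.

```python
import numpy as np

fs = 1000.0                                    # sample rate, Hz (assumed)
t = np.arange(5000) / fs
freqs = [50.0, 100.0]                          # rotational source and its 2nd harmonic
# disturbance with two harmonics of unknown amplitude and phase
d = 1.0 * np.sin(2 * np.pi * 50 * t + 0.3) + 0.5 * np.sin(2 * np.pi * 100 * t - 1.0)

w = np.zeros((len(freqs), 2))                  # one (sin, cos) weight pair per harmonic
mu = 0.01                                      # LMS step size
err = np.empty_like(t)
for i, ti in enumerate(t):
    refs = np.array([[np.sin(2 * np.pi * f * ti),
                      np.cos(2 * np.pi * f * ti)] for f in freqs])
    y = float((w * refs).sum())                # anti-vibration output
    e = d[i] - y                               # residual at the error sensor
    w += mu * e * refs                         # per-harmonic LMS update
    err[i] = e
```

    Each weight pair converges toward the amplitude and phase of its harmonic, so the residual decays toward zero; in the paper's setting, the CGM-based update plays the role of this LMS step with a faster, unified convergence rate.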

  4. Searching Information Sources in Networks

    DTIC Science & Technology

    2017-06-14

    During the course of this project, we made significant progress in multiple directions of the information detection...result on information source detection on non-tree networks; (2) The development of information source localization algorithms to detect multiple... information sources. The algorithms have provable performance guarantees and outperform existing algorithms in

  5. Improved Multiple-Species Cyclotron Ion Source

    NASA Technical Reports Server (NTRS)

    Soli, George A.; Nichols, Donald K.

    1990-01-01

    Use of pure isotope 86Kr instead of natural krypton in the multiple-species ion source enables the source to produce krypton ions separated from argon ions by tuning the cyclotron with which the source is used. The added capability to produce and separate krypton ions at kinetic energies of 150 to 400 MeV is necessary for simulation of the worst-case ions occurring in outer space.

  6. Integrative data analysis in clinical psychology research.

    PubMed

    Hussong, Andrea M; Curran, Patrick J; Bauer, Daniel J

    2013-01-01

    Integrative data analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology.

  7. Integrative Data Analysis in Clinical Psychology Research

    PubMed Central

    Hussong, Andrea M.; Curran, Patrick J.; Bauer, Daniel J.

    2013-01-01

    Integrative Data Analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology. PMID:23394226

  8. Early Parallel Activation of Semantics and Phonology in Picture Naming: Evidence from a Multiple Linear Regression MEG Study

    PubMed Central

    Miozzo, Michele; Pulvermüller, Friedemann; Hauk, Olaf

    2015-01-01

    The time course of brain activation during word production has become an area of increasingly intense investigation in cognitive neuroscience. The predominant view has been that semantic and phonological processes are activated sequentially, at about 150 and 200–400 ms after picture onset. Although evidence from prior studies has been interpreted as supporting this view, these studies were arguably not ideally suited to detect early brain activation of semantic and phonological processes. We here used a multiple linear regression approach to magnetoencephalography (MEG) analysis of picture naming in order to investigate early effects of variables specifically related to visual, semantic, and phonological processing. This was combined with distributed minimum-norm source estimation and region-of-interest analysis. Brain activation associated with visual image complexity appeared in occipital cortex at about 100 ms after picture presentation onset. At about 150 ms, semantic variables became physiologically manifest in left frontotemporal regions. In the same latency range, we found an effect of phonological variables in the left middle temporal gyrus. Our results demonstrate that multiple linear regression analysis is sensitive to early effects of multiple psycholinguistic variables in picture naming. Crucially, our results suggest that access to phonological information might begin in parallel with semantic processing around 150 ms after picture onset. PMID:25005037

  9. Theory and investigation of acoustic multiple-input multiple-output systems based on spherical arrays in a room.

    PubMed

    Morgenstern, Hai; Rafaely, Boaz; Zotter, Franz

    2015-11-01

    Spatial attributes of room acoustics have been widely studied using microphone and loudspeaker arrays. However, systems that combine both arrays, referred to as multiple-input multiple-output (MIMO) systems, have only been studied to a limited degree in this context. These systems can potentially provide a powerful tool for room acoustics analysis due to the ability to simultaneously control both arrays. This paper offers a theoretical framework for the spatial analysis of enclosed sound fields using a MIMO system comprising spherical loudspeaker and microphone arrays. A system transfer function is formulated in matrix form for free-field conditions, and its properties are studied using tools from linear algebra. The system is shown to have unit-rank, regardless of the array types, and its singular vectors are related to the directions of arrival and radiation at the microphone and loudspeaker arrays, respectively. The formulation is then generalized to apply to rooms, using an image source method. In this case, the rank of the system is related to the number of significant reflections. The paper ends with simulation studies, which support the developed theory, and with an extensive reflection analysis of a room impulse response, using the platform of a MIMO system.
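The unit-rank claim for the free-field MIMO transfer matrix can be checked numerically: for a single propagation path the matrix is the outer product of the arrival pattern at the microphone array and the radiation pattern at the loudspeaker array, so its rank is 1, while summing image-source contributions raises the rank with the number of reflections. A minimal sketch, with random vectors standing in for the actual spherical-array steering vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical steering vectors: arrival pattern at the microphone
# array (u) and radiation pattern at the loudspeaker array (v) for a
# single free-field propagation path.
u = rng.standard_normal(16) + 1j * rng.standard_normal(16)   # 16 microphones
v = rng.standard_normal(9) + 1j * rng.standard_normal(9)     # 9 loudspeaker drivers

# Free-field MIMO transfer matrix: an outer product, hence rank 1.
H_free = np.outer(u, v.conj())
print(np.linalg.matrix_rank(H_free))

# In a room, each significant image source adds another rank-1 term,
# so the rank grows with the number of significant reflections.
H_room = sum(np.outer(rng.standard_normal(16), rng.standard_normal(9))
             for _ in range(4))
print(np.linalg.matrix_rank(H_room))
```

The singular vectors of each rank-1 term correspond to the directions of arrival and radiation, which is what makes the decomposition useful for reflection analysis.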

  10. Spatial distribution and source apportionment of water pollution in different administrative zones of Wen-Rui-Tang (WRT) river watershed, China.

    PubMed

    Yang, Liping; Mei, Kun; Liu, Xingmei; Wu, Laosheng; Zhang, Minghua; Xu, Jianming; Wang, Fan

    2013-08-01

Water quality degradation in river systems has caused great concern all over the world. Identifying the spatial distribution and sources of water pollutants is the first step toward efficient water quality management. A set of water samples collected bimonthly at 12 monitoring sites in 2009 and 2010 were analyzed to determine the spatial distribution of critical parameters and to apportion the sources of pollutants in the Wen-Rui-Tang (WRT) river watershed, near the East China Sea. The 12 monitoring sites were divided into urban, suburban, and rural administrative zones based on differences in land use and population density. Multivariate statistical methods [one-way analysis of variance, principal component analysis (PCA), and absolute principal component score-multiple linear regression (APCS-MLR)] were used to investigate the spatial distribution of water quality and to apportion the pollution sources. Results showed that most water quality parameters did not differ significantly between the urban and suburban zones, whereas both zones showed worse water quality than the rural zone. Based on the PCA and APCS-MLR analysis, the main pollution sources were identified as domestic sewage and commercial/service pollution in the urban zone, domestic sewage along with fluorine point-source pollution in the suburban zone, and agricultural nonpoint-source pollution with domestic sewage in the rural zone. Understanding the water pollution characteristics of different administrative zones can provide insights for effective water management policy-making, especially in areas spanning multiple administrative zones.
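The APCS-MLR receptor-modelling step can be sketched on synthetic data: PCA identifies the source factors, the scores are re-referenced to an artificial "true zero" sample to obtain absolute principal component scores, and multiple linear regression on those scores apportions each measured parameter among the sources. The two source profiles below are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic water-quality data: 200 samples x 6 parameters, driven by
# two hypothetical sources (e.g. domestic sewage, agricultural runoff).
profiles = np.array([[1.0, 0.8, 0.2, 0.1, 0.6, 0.3],
                     [0.1, 0.2, 0.9, 1.0, 0.3, 0.7]])
strengths = rng.gamma(2.0, 1.0, size=(200, 2))
X = strengths @ profiles + 0.05 * rng.standard_normal((200, 6))

# Step 1: PCA on standardized data to extract the source factors.
mu, sigma = X.mean(0), X.std(0)
Z = (X - mu) / sigma
pca = PCA(n_components=2)
scores = pca.fit_transform(Z)

# Step 2: absolute principal component scores (APCS) -- subtract the
# score of an artificial "true zero" concentration sample.
z0 = (0.0 - mu) / sigma
apcs = scores - pca.transform(z0[None, :])

# Step 3: regress the measured parameters on the APCS; the regression
# coefficients estimate each source's contribution to each parameter.
contrib = LinearRegression().fit(apcs, X)
r2 = contrib.score(apcs, X)
print(round(r2, 3))
```

With nearly noise-free synthetic data the regression recovers the two-source structure almost exactly (R² close to 1); real monitoring data are far messier.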

  11. Indoor source apportionment in urban communities near industrial sites

    NASA Astrophysics Data System (ADS)

    Tunno, Brett J.; Dalton, Rebecca; Cambal, Leah; Holguin, Fernando; Lioy, Paul; Clougherty, Jane E.

    2016-08-01

Because fine particulate matter (PM2.5) differs in chemical composition across sources, source apportionment is frequently used to identify the relative contributions of multiple sources to outdoor concentrations. Indoor air pollution and its source apportionment are often overlooked, though people in northern climates may spend up to 90% of their time indoors. We selected 21 homes for a 1-week indoor sampling session during summer (July to September 2011), repeated in winter (January to March 2012). Elemental analysis was performed using inductively-coupled plasma mass spectrometry (ICP-MS), and factor analysis was used to determine constituent groupings. Multivariate modeling was run on the factor scores to corroborate interpretations of source factors based on a literature review. For each season, a 5-factor solution explained 86-88% of the variability in constituent concentrations. Indoor sources (i.e. cooking, smoking) explained greater variability than did outdoor sources in these industrial communities. A smoking factor was identified in each season, predicted by the number of cigarettes smoked. Cooking factors were also identified in each season, explained by the frequency of stove cooking and stovetop frying. Significant contributions from outdoor sources, including coal and motor vehicles, were also identified. Higher coal- and secondary-related elemental concentrations were detected during summer than winter. Our findings suggest that source contributions to indoor concentrations can be identified and should be examined in relation to health effects.
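The factor-analysis step (deriving constituent groupings from elemental concentrations, then interpreting the factors as sources) can be sketched as follows. The element loadings and the two hypothetical sources are invented for illustration, and scikit-learn's FactorAnalysis stands in for whatever extraction the study actually used:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)

# Synthetic indoor elemental concentrations: 42 home-weeks x 8 elements,
# driven by two hypothetical sources (say, smoking and cooking).
loadings = np.array([[1.0, 0.9, 0.1, 0.0, 0.5, 0.0, 0.2, 0.1],
                     [0.0, 0.1, 1.0, 0.8, 0.0, 0.6, 0.3, 0.0]])
activity = rng.gamma(2.0, 1.0, size=(42, 2))
X = activity @ loadings + 0.05 * rng.standard_normal((42, 8))

# Fit a 2-factor solution; each score column tracks one source's activity.
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)

# Share of total variance captured by the factors (cf. the 86-88%
# reported for the study's 5-factor solutions).
explained = 1 - fa.noise_variance_.sum() / X.var(axis=0).sum()
print(scores.shape, round(explained, 2))
```

In practice the factor scores would then be regressed on covariates (cigarettes smoked, frequency of frying) to corroborate the source interpretations, as the abstract describes.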

  12. Single-channel mixed signal blind source separation algorithm based on multiple ICA processing

    NASA Astrophysics Data System (ADS)

    Cheng, Xiefeng; Li, Ji

    2017-01-01

Taking as its motivating problem the separation of the fetal heart sound from the mixed signal recorded by an electronic stethoscope, this paper proposes a single-channel blind source separation algorithm based on multiple rounds of ICA processing. First, empirical mode decomposition (EMD) splits the single-channel mixed signal into multiple orthogonal signal components, which are then processed by ICA; the resulting independent components are called the independent sub-components of the mixed signal. Next, the independent sub-components are combined with the single-channel mixed signal, expanding the single channel into multiple channels and turning the under-determined blind source separation problem into a well-posed one. ICA processing of the expanded signal then yields an estimate of the source signal. Finally, if the separation is unsatisfactory, the previous separation result is combined with the single-channel mixed signal and the ICA processing is repeated until the desired estimate of the source signal is obtained. Simulation results show that the algorithm separates single-channel mixed physiological signals effectively.
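The channel-expansion idea (decompose the single channel into components, stack them into a pseudo-multichannel signal, then run ICA) can be sketched as follows. Note the stand-ins: simple band-pass filters replace the EMD step, scikit-learn's FastICA replaces the paper's ICA implementation, and the two synthetic sources only loosely mimic heart sounds:

```python
import numpy as np
from scipy.signal import butter, sosfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
fs = 1000.0
t = np.arange(0.0, 4.0, 1 / fs)

# Single-channel mixture of a slow and a faster source (loose stand-ins
# for maternal and fetal heart sounds) plus measurement noise.
s1 = np.sign(np.sin(2 * np.pi * 1.2 * t))
s2 = np.sin(2 * np.pi * 25.0 * t)
x = s1 + 0.7 * s2 + 0.02 * rng.standard_normal(t.size)

# Stand-in for the EMD step: split the single channel into sub-band
# components, turning the under-determined single-channel problem into
# a multichannel one.
bands = [(0.5, 5.0), (5.0, 60.0), (60.0, 200.0)]
channels = []
for lo, hi in bands:
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    channels.append(sosfilt(sos, x))
X = np.column_stack(channels)

# ICA on the expanded multichannel signal estimates the source signals.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)
print(S_est.shape)
```

The paper's iterative refinement would feed an unsatisfactory estimate back in with the original mixture and repeat the ICA step.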

  13. Airborne Dioxins, Furans and Polycyclic Aromatic Hydrocarbons Exposure to Military Personnel in Iraq

    PubMed Central

    Masiol, Mauro; Mallon, Timothy; Haines, Kevin M.; Utell, Mark J.; Hopke, Philip K.

    2016-01-01

Objectives The objective was to use ambient polycyclic aromatic hydrocarbon (PAH), polychlorinated dibenzo-p-dioxin (PCDD) and polychlorinated dibenzofuran (PCDF) concentrations measured at Joint Base Balad in Iraq in 2007 to identify the sources of these species and their spatial patterns. Methods The ratios of the measured species were compared to literature data for likely emission sources. Using the multiple site measurements on specific days, contour maps were drawn using inverse distance weighting (IDW). Results These analyses suggest multiple sources including the burn pit (primarily a source of PCDD/PCDFs), the transportation field (primarily a source of PAHs) and other sources of PAHs that include aircraft, space heating, and diesel power generation. Conclusions The nature and locations of the sources were identified. PCDD/PCDFs were emitted by the burn pit. Multiple PAH sources exist across the base. PMID:27501100

  14. Factors influencing health information system adoption in American hospitals.

    PubMed

    Wang, Bill B; Wan, Thomas T H; Burke, Darrell E; Bazzoli, Gloria J; Lin, Blossom Y J

    2005-01-01

To study the number of health information systems (HISs), applicable to administrative, clinical, and executive decision support functionalities, adopted by acute care hospitals and to examine how hospital market, organizational, and financial factors influence HIS adoption. A cross-sectional analysis was performed with 1441 hospitals selected from metropolitan statistical areas in the United States. Multiple data sources were merged. Six hypotheses were empirically tested by multiple regression analysis. HIS adoption was influenced by hospital market, organizational, and financial factors. Larger, system-affiliated, and for-profit hospitals with more preferred provider organization contracts are more likely to adopt managerial information systems than their counterparts. Operating revenue is positively associated with HIS adoption. The study concludes that hospital organizational and financial factors influence hospitals' strategic adoption of clinical, administrative, and managerial information systems.

  15. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  16. Analysis of the Performance of Heat Pipes and Phase-Change Materials with Multiple Localized Heat Sources for Space Applications

    DTIC Science & Technology

    1989-05-01

NUMERICAL ANALYSIS OF STEFAN PROBLEMS FOR GENERALIZED MULTI-DIMENSIONAL PHASE-CHANGE STRUCTURES USING THE ENTHALPY TRANSFORMING MODEL (4.1 Summary). [Fragment of the report's nomenclature list: St, Stefan number, c_s(T_m - T_w)/H or c_s(T_m - T_i)/H; s, circumferential distance coordinate (m, Section III) and dimensionless interface position; ρ, fluid density (kg/m³); Φ, viscous dissipation term in the energy equation (1.4, Section I) and dummy variable (Section IV); τ, dimensionless time, tα/L²; σ, Stefan-Boltzmann constant.]

  17. Approach to identifying pollutant source and matching flow field

    NASA Astrophysics Data System (ADS)

    Liping, Pang; Yu, Zhang; Hongquan, Qu; Tao, Hu; Wei, Wang

    2013-07-01

Accidental pollution events often threaten people's health and lives, so rapid identification of the pollutant source is necessary to enable prompt action against the spread of pollution. This identification, however, is a difficult inverse problem, and this paper presents some studies on the issue. An approach using noisy single-sensor information was developed to identify a suddenly emitting, continuous trace pollutant source in a steady velocity field. The approach first compares the characteristic distance of the measured concentration sequence against multiple hypothetical concentration sequences at the sensor position, obtained from multiple hypotheses over the three source parameters. Source identification is then realized by a global search for the optimal values, with the maximum location probability as the objective function. Because this global search is computationally expensive, a local fine-mesh source search based on a priori coarse-mesh location probabilities is further used to improve the efficiency of identification. The studies show that the flow field has a very important influence on source identification, so we also discuss the impact of non-matching flow fields with estimation deviation on identification and, based on this analysis, present a method for matching an accurate flow field to improve accuracy. To verify the practical application of the method, an experimental system simulating a sudden pollution process in a steady flow field was set up and experiments were conducted with a known diffusion coefficient. The studies showed that the three parameters of the pollutant source (position, emission strength, and initial emission time) can be estimated using the flow-field matching and source identification method.
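A minimal version of the multiple-hypothesis search can be sketched for a 1-D steady flow: a continuous point source is modelled as a superposition of advected-diffused puffs, and a coarse grid search over the three source parameters (position, emission strength, initial emission time) minimizes the misfit to the noisy sensor sequence. All numbers here (velocity, diffusion coefficient, grids) are invented for illustration:

```python
import numpy as np

def conc(t, x_s, x0, q, t0, u=0.5, D=0.1):
    # 1-D concentration at sensor x_s from a continuous point source at
    # x0 emitting at rate q since t0: superposition of Gaussian puffs.
    tau = np.linspace(t0, t, 120)[:-1]
    dt = tau[1] - tau[0]
    age = t - tau
    puff = q / np.sqrt(4 * np.pi * D * age) * \
        np.exp(-(x_s - x0 - u * age) ** 2 / (4 * D * age))
    return puff.sum() * dt

rng = np.random.default_rng(6)
x_s = 5.0
times = np.linspace(1.0, 8.0, 12)

# "Measured" sequence from a hidden true source, plus sensor noise.
true = dict(x0=1.0, q=2.0, t0=0.5)
measured = np.array([conc(t, x_s, **true) for t in times])
measured += 0.01 * rng.standard_normal(measured.size)

# Coarse-mesh multiple-hypothesis search over the three parameters.
best, best_err = None, np.inf
for x0 in np.linspace(0.0, 2.0, 11):
    for q in np.linspace(0.5, 4.0, 8):
        for t0 in np.linspace(0.1, 0.9, 5):
            model = np.array([conc(t, x_s, x0, q, t0) for t in times])
            err = ((model - measured) ** 2).sum()
            if err < best_err:
                best, best_err = (x0, q, t0), err
print(best)
```

The paper's refinement would follow this coarse search with a fine mesh around the most probable locations; here the coarse grid already lands on the true parameters.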

  18. The Chandra Source Catalog: Algorithms

    NASA Astrophysics Data System (ADS)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created, including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified, and a set of post-processing scripts was used to correct the results. To analyse the source properties, we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  19. Integrating multiple immunogenetic data sources for feature extraction and mining somatic hypermutation patterns: the case of "towards analysis" in chronic lymphocytic leukaemia.

    PubMed

    Kavakiotis, Ioannis; Xochelli, Aliki; Agathangelidis, Andreas; Tsoumakas, Grigorios; Maglaveras, Nicos; Stamatopoulos, Kostas; Hadzidimitriou, Anastasia; Vlahavas, Ioannis; Chouvarda, Ioanna

    2016-06-06

Somatic Hypermutation (SHM) refers to the introduction of mutations within rearranged V(D)J genes, a process that increases the diversity of Immunoglobulins (IGs). The analysis of SHM has offered critical insight into the physiology and pathology of B cells, leading to strong prognostic markers of clinical outcome in chronic lymphocytic leukaemia (CLL), the most frequent adult B-cell malignancy. In this paper we present a methodology for integrating multiple immunogenetic and clinicobiological data sources in order to extract features and create high-quality datasets for SHM analysis in IG receptors of CLL patients. This dataset is used as the basis for a higher-level integration procedure, inspired by social choice theory. This is applied in the Towards Analysis, our attempt to investigate the potential ontogenetic transformation of genes belonging to specific stereotyped CLL subsets towards other genes or gene families through SHM. The data integration process, followed by feature extraction, resulted in the generation of a dataset containing information about mutations occurring through SHM. The Towards analysis performed on the integrated dataset, applying voting techniques, revealed the distinct behaviour of subset #201 compared to other subsets as regards SHM-related movements among gene clans, both in allele-conserved and non-conserved gene areas. With respect to movement between genes, a high percentage of movements towards pseudogenes was found in all CLL subsets. This data integration and feature extraction process can set the basis for exploratory analysis or a fully automated computational data mining approach to many as yet unanswered, clinically relevant biological questions.

  20. Content Integration across Multiple Documents Reduces Memory for Sources

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; McCabe, Rebecca M.; Daniel, Frances

    2016-01-01

    The current experiments systematically examined semantic content integration as a mechanism for explaining source inattention and forgetting when reading-to-remember multiple texts. For all 3 experiments, degree of semantic overlap was manipulated amongst messages provided by various information sources. In Experiment 1, readers' source…

  1. Trait and State Variance in Oppositional Defiant Disorder Symptoms: A Multi-Source Investigation with Spanish Children

    PubMed Central

    Preszler, Jonathan; Burns, G. Leonard; Litson, Kaylee; Geiser, Christian; Servera, Mateu

    2016-01-01

The objective was to determine and compare the trait and state components of oppositional defiant disorder (ODD) symptom reports across multiple informants. Mothers, fathers, primary teachers, and secondary teachers rated the occurrence of the ODD symptoms in 810 Spanish children (55% boys) on two occasions (at the end of the first and second grades). Single-source latent state-trait (LST) analyses revealed that ODD symptom ratings from all four sources showed more trait (M = 63%) than state residual (M = 37%) variance. A multiple-source LST analysis revealed substantial convergent validity of mothers' and fathers' trait variance components (M = 68%) and modest convergent validity of state residual variance components (M = 35%). In contrast, primary and secondary teachers showed low convergent validity relative to mothers for trait variance (Ms = 31%, 32%, respectively) and essentially zero convergent validity relative to mothers for state residual variance (Ms = 1%, 3%, respectively). Although ODD symptom ratings reflected slightly more trait- than state-like constructs within each of the four sources separately across occasions, strong convergent validity for the trait variance only occurred within settings (i.e., mothers with fathers; primary with secondary teachers), with the convergent validity of the trait and state residual variance components being low to non-existent across settings. These results suggest that ODD symptom reports are trait-like across time for individual sources, with this trait variance having convergent validity only within settings. Implications for the assessment of ODD are discussed. PMID:27148784

  2. Alternative source models of very low frequency events

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Agnew, D. C.; Schwartz, S. Y.

    2016-09-01

    We present alternative source models for very low frequency (VLF) events, previously inferred to be radiation from individual slow earthquakes that partly fill the period range between slow slip events lasting thousands of seconds and low-frequency earthquakes (LFE) with durations of tenths of a second. We show that VLF events may emerge from bandpass filtering a sum of clustered, shorter duration, LFE signals, believed to be the components of tectonic tremor. Most published studies show VLF events occurring concurrently with tremor bursts and LFE signals. Our analysis of continuous data from Costa Rica detected VLF events only when tremor was also occurring, which was only 7% of the total time examined. Using analytic and synthetic models, we show that a cluster of LFE signals produces the distinguishing characteristics of VLF events, which may be determined by the cluster envelope. The envelope may be diagnostic of a single, dynamic, slowly slipping event that propagates coherently over kilometers or represents a narrowly band-passed version of nearly simultaneous arrivals of radiation from slip on multiple higher stress drop and/or faster propagating slip patches with dimensions of tens of meters (i.e., LFE sources). Temporally clustered LFE sources may be triggered by single or multiple distinct aseismic slip events or represent the nearly simultaneous chance occurrence of background LFEs. Given the nonuniqueness in possible source durations, we suggest it is premature to draw conclusions about VLF event sources or how they scale.

  3. Alternative source models of very low frequency events

    USGS Publications Warehouse

    Gomberg, Joan S.; Agnew, D.C.; Schwartz, S.Y.

    2016-01-01

    We present alternative source models for very low frequency (VLF) events, previously inferred to be radiation from individual slow earthquakes that partly fill the period range between slow slip events lasting thousands of seconds and low-frequency earthquakes (LFE) with durations of tenths of a second. We show that VLF events may emerge from bandpass filtering a sum of clustered, shorter duration, LFE signals, believed to be the components of tectonic tremor. Most published studies show VLF events occurring concurrently with tremor bursts and LFE signals. Our analysis of continuous data from Costa Rica detected VLF events only when tremor was also occurring, which was only 7% of the total time examined. Using analytic and synthetic models, we show that a cluster of LFE signals produces the distinguishing characteristics of VLF events, which may be determined by the cluster envelope. The envelope may be diagnostic of a single, dynamic, slowly slipping event that propagates coherently over kilometers or represents a narrowly band-passed version of nearly simultaneous arrivals of radiation from slip on multiple higher stress drop and/or faster propagating slip patches with dimensions of tens of meters (i.e., LFE sources). Temporally clustered LFE sources may be triggered by single or multiple distinct aseismic slip events or represent the nearly simultaneous chance occurrence of background LFEs. Given the nonuniqueness in possible source durations, we suggest it is premature to draw conclusions about VLF event sources or how they scale.

  4. Observational constraints on the physical nature of submillimetre source multiplicity: chance projections are common

    NASA Astrophysics Data System (ADS)

    Hayward, Christopher C.; Chapman, Scott C.; Steidel, Charles C.; Golob, Anneya; Casey, Caitlin M.; Smith, Daniel J. B.; Zitrin, Adi; Blain, Andrew W.; Bremer, Malcolm N.; Chen, Chian-Chou; Coppin, Kristen E. K.; Farrah, Duncan; Ibar, Eduardo; Michałowski, Michał J.; Sawicki, Marcin; Scott, Douglas; van der Werf, Paul; Fazio, Giovanni G.; Geach, James E.; Gurwell, Mark; Petitpas, Glen; Wilner, David J.

    2018-05-01

Interferometric observations have demonstrated that a significant fraction of single-dish submillimetre (submm) sources are blends of multiple submm galaxies (SMGs), but the nature of this multiplicity, i.e. whether the galaxies are physically associated or chance projections, has not been determined. We performed spectroscopy of 11 SMGs in six multicomponent submm sources, obtaining spectroscopic redshifts for nine of them. For two additional component SMGs, we detected continuum emission but no obvious features. We supplement our observed sources with four single-dish submm sources from the literature. This sample allows us to statistically constrain the physical nature of single-dish submm source multiplicity for the first time. In three of the seven single-dish sources for which the nature of the blending is unambiguous (3/7, or 43 (+39/-33) per cent at 95 per cent confidence), the components for which spectroscopic redshifts are available are physically associated, whereas 4/7 (57 (+33/-39) per cent) have at least one unassociated component. When we also consider components whose spectra exhibit continuum but no features and whose photometric redshifts differ significantly from the spectroscopic redshift of the other component, 6/9 (67 (+26/-37) per cent) of the single-dish sources contain at least one unassociated component SMG. The nature of the multiplicity of one single-dish source is ambiguous. We conclude that physically associated systems and chance projections both contribute to the multicomponent single-dish submm source population. This result contradicts the conventional wisdom that bright submm sources are solely a result of merger-induced starbursts, as blending of unassociated galaxies is also important.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Jeff; Cornish, Neil J.; Reddinger, J. Lucas

This work presents the first application of the method of genetic algorithms (GAs) to data analysis for the Laser Interferometer Space Antenna (LISA). In the low frequency regime of the LISA band there are expected to be tens of thousands of galactic binary systems that will be emitting gravitational waves detectable by LISA. The challenge of parameter extraction of such a large number of sources in the LISA data stream requires a search method that can efficiently explore the large parameter spaces involved. As signals of many of these sources will overlap, a global search method is desired. GAs represent such a global search method for parameter extraction of multiple overlapping sources in the LISA data stream. We find that GAs are able to correctly extract source parameters for overlapping sources. Several optimizations of a basic GA are presented with results derived from applications of the GA searches to simulated LISA data.
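A toy version of the GA search can be sketched for a single monochromatic source: candidate (amplitude, frequency) pairs evolve by selection, crossover, and mutation toward the parameters that best match the noisy data. This is a bare-bones GA for illustration, not one of the optimized variants the paper develops:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 1000)

# Simulated data: one monochromatic signal (a crude stand-in for a
# galactic binary) buried in noise; true parameters to recover.
A_true, f_true = 1.5, 2.0
data = A_true * np.sin(2 * np.pi * f_true * t) + 0.3 * rng.standard_normal(t.size)

def fitness(pop):
    # Negative sum-squared residual between each candidate model and data.
    model = pop[:, :1] * np.sin(2 * np.pi * pop[:, 1:2] * t)
    return -((model - data) ** 2).sum(axis=1)

# Initial population of candidate (amplitude, frequency) pairs.
pop = np.column_stack([rng.uniform(0.1, 3.0, 200), rng.uniform(0.1, 5.0, 200)])

for _ in range(150):
    fit = fitness(pop)
    # Selection: keep the fitter half as parents (elitism).
    parents = pop[np.argsort(fit)[-100:]]
    # Crossover: children take amplitude and frequency from two parents.
    i, j = rng.integers(0, 100, 100), rng.integers(0, 100, 100)
    children = np.column_stack([parents[i, 0], parents[j, 1]])
    # Mutation: small Gaussian perturbations keep the search exploring.
    children += 0.02 * rng.standard_normal(children.shape)
    pop = np.vstack([parents, children])

best = pop[np.argmax(fitness(pop))]
print(best.round(2))
```

The real LISA problem searches a much larger parameter space (sky position, phase, inclination, and thousands of overlapping sources), which is what motivates a global stochastic search like a GA.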

  6. Detection, localization and classification of multiple dipole-like magnetic sources using magnetic gradient tensor data

    NASA Astrophysics Data System (ADS)

    Gang, Yin; Yingtang, Zhang; Hongbo, Fan; Zhining, Li; Guoquan, Ren

    2016-05-01

We have developed a method for the automatic detection, localization and classification (DLC) of multiple dipole sources using magnetic gradient tensor data. First, we define modified tilt angles to estimate the approximate horizontal locations of the multiple dipole-like magnetic sources simultaneously and detect the number of magnetic sources using a fixed threshold. Secondly, based on the isotropy of the normalized source strength (NSS) response of a dipole, we obtain accurate horizontal locations of the dipoles. The vertical locations are then calculated using magnitude magnetic transforms of the magnetic gradient tensor data. Finally, we invert for the magnetic moments of the sources using the measured magnetic gradient tensor data and the forward model. Synthetic and field data sets demonstrate the effectiveness and practicality of the proposed method.
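The inversion step relies on a point-dipole forward model for the magnetic field. A minimal sketch of that model, verifying the characteristic 1/r³ fall-off (doubling the distance reduces the field magnitude by a factor of 8):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(m, r):
    """Magnetic field B (tesla) of a point dipole with moment m (A*m^2)
    at observation offset r (metres): the standard forward model
    B = mu0/(4*pi) * (3(m.rhat)rhat - m) / |r|^3."""
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0 / (4 * np.pi) * (3 * np.dot(m, rhat) * rhat - m) / rn ** 3

m = np.array([0.0, 0.0, 100.0])                    # moment along z
B1 = dipole_field(m, np.array([0.0, 0.0, 2.0]))    # on-axis, 2 m away
B2 = dipole_field(m, np.array([0.0, 0.0, 4.0]))    # twice the distance

# 1/r^3 decay: the field magnitude drops by a factor of 8.
print(np.linalg.norm(B1) / np.linalg.norm(B2))
```

The gradient tensor measured in the paper is the spatial derivative of this field, which decays as 1/r⁴ and is what makes the NSS-based localization sensitive to nearby sources.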

  7. ZnO-based multiple channel and multiple gate FinMOSFETs

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Ting; Huang, Hung-Lin; Tseng, Chun-Yen; Lee, Hsin-Ying

    2016-02-01

In recent years, zinc oxide (ZnO)-based metal-oxide-semiconductor field-effect transistors (MOSFETs) have attracted much attention because ZnO-based semiconductors possess several advantages, including a large exciton binding energy, nontoxicity, biocompatibility, low material cost, and a wide direct bandgap. Moreover, the ZnO-based MOSFET is one of the most promising devices for applications in microwave power amplifiers, logic circuits, large-scale integrated circuits, and logic swing. In this study, to enhance the performance of ZnO-based MOSFETs, ZnO-based multiple-channel and multiple-gate FinMOSFETs were fabricated using a simple laser interference photolithography method and a self-aligned photolithography method. The multiple-channel structure provided additional control over the sidewall depletion width and thereby improved channel controllability, because the sidewall portions of the multiple channels were surrounded by the gate electrode. Furthermore, the multiple-gate structure had a shorter distance between source and gate and a shorter gate length between two gates, which enhanced gate operating performance; the shorter source-gate distance also enhanced the electron velocity in the channel fin structure. In this work, ninety-one channels and four gates were used in the FinMOSFETs. Consequently, the drain-source saturation current (IDSS) and maximum transconductance (gm) of the ZnO-based multiple-channel and multiple-gate FinFETs, operated at a drain-source voltage (VDS) of 10 V and a gate-source voltage (VGS) of 0 V, improved from 11.5 mA/mm to 13.7 mA/mm and from 4.1 mS/mm to 6.9 mS/mm, respectively, in comparison with the conventional ZnO-based single-channel, single-gate MOSFETs.

  8. Which form of assessment provides the best information about student performance in chemistry examinations?

    NASA Astrophysics Data System (ADS)

    Hudson, Ross D.; Treagust, David F.

    2013-04-01

Background. This study developed from observations of apparent achievement differences between male and female chemistry performances in a state university entrance examination. Male students performed more strongly than female students, especially at higher scores. Apart from the gender of the students, two other important factors that might influence student performance were the format of questions (short-answer or multiple-choice) and the type of questions (recall or application). Purpose The research question addressed in this study was: Is there a relationship between performance in state university entrance examinations in chemistry and school chemistry examinations and student gender, format of questions (multiple-choice or short-answer), and conceptual level (recall or application)? Sample The two sources of data were: (1) secondary analyses of five consecutive years' data published by the examining authority of chemistry examinations, and (2) tests conducted with 192 students which provided information about all aspects of the three variables (question format, question type and gender) under consideration. Design and methods Both sources of data were analysed using ANOVA to compare means for the variables under consideration and the statistical significance of any differences. The data from the tests were also analysed using Rasch analysis to determine differences in gender performance. Results When overall mean data are considered, both male and female students performed better on multiple-choice questions and recall questions than on short-answer questions and application questions, respectively. Male students also outperformed female students in both the university entrance and school tests, particularly at higher scores. When the data were analysed with Rasch, there was no statistically significant difference in performance between males and females of equal ability.
    Conclusions: Both male and female students generally perform better on multiple-choice questions than on short-answer questions. However, when the questions are matched in terms of difficulty (using Rasch analysis), the differences in performance between multiple-choice and short-answer questions are quite small. Rasch analysis showed that there was little difference in performance between males and females of equal ability. This study shows that a simple face-value score analysis of relative student performance - in this case, in chemistry - can be deceptive unless the actual abilities of the students concerned, as measured by a tool such as Rasch analysis, are taken into consideration before reaching any conclusion.
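
    The Rasch analysis the abstract relies on models the probability of a correct response as a logistic function of the gap between a person's ability and an item's difficulty, which is what allows performances to be compared on a common scale. A minimal sketch of the dichotomous Rasch model, using hypothetical ability and difficulty values (not estimates from the study):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person of ability theta
    answers an item of difficulty b correctly (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# The same student facing items of different difficulty: raw scores can differ
# across test forms even though the underlying ability is identical.
theta = 0.5                      # hypothetical student ability
p_easy = rasch_p(theta, -1.0)    # easy item -> high success probability
p_hard = rasch_p(theta, 1.5)     # hard item -> lower success probability
```

    Because raw scores mix ability with item difficulty, two groups can show different face-value scores even when their Rasch-estimated abilities coincide, which is the point the study makes.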

  9. The Deficit Representation of Youth at Different Levels of Curriculum-Making: A Case Study on the Liberal Studies Curriculum in Hong Kong

    ERIC Educational Resources Information Center

    Chan, Chitat; Ting, Wai-Fong

    2012-01-01

    This study explores whether the deficit approach to understanding youth, which has been widely critiqued in contemporary youth studies, could still be a dominant paradigm in an emerging curriculum which emphasises multiple-perspective thinking. The analysis compares the representations of youth in selected reference sources at different levels of…

  10. Case Studies of Successful Schoolwide Enrichment Model-Reading (SEM-R) Classroom Implementations. Research Monograph Series. RM10204

    ERIC Educational Resources Information Center

    Reis, Sally M.; Little, Catherine A.; Fogarty, Elizabeth; Housand, Angela M.; Housand, Brian C.; Sweeny, Sheelah M.; Eckert, Rebecca D.; Muller, Lisa M.

    2010-01-01

    The purpose of this qualitative study was to examine the scaling up of the Schoolwide Enrichment Model in Reading (SEM-R) in 11 elementary and middle schools in geographically diverse sites across the country. Qualitative comparative analysis was used in this study, with multiple data sources compiled into 11 in-depth school case studies…

  11. Decision Aids Using Heterogeneous Intelligence Analysis

    DTIC Science & Technology

    2010-08-20

    developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and includes multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...Interface; Application Program Interface; Application Programmer Interface CAF Common Application Framework EDA Event Driven Architecture

  12. Improved security monitoring method for network boundary

    NASA Astrophysics Data System (ADS)

    Gao, Liting; Wang, Lixia; Wang, Zhenyan; Qi, Aihua

    2013-03-01

    This paper proposes a network boundary security monitoring system based on PKI. The design uses multiple security technologies and performs a deep analysis of the association between network data flow and system logs, so that it can detect intrusion activities and accurately locate the source of an invasion in time. The experimental results show that it effectively reduces the rate of false alarms and missed alarms for security incidents.

  13. Analysis of javelin throwing by high-speed photography

    NASA Astrophysics Data System (ADS)

    Yamamoto, Yoshitaka; Matsuoka, Rutsu; Ishida, Yoshihisa; Seki, Kazuichi

    1999-06-01

    A xenon multiple-exposure light source device was manufactured to record the trajectory of a flying javelin, and a wind tunnel experiment was performed with some javelin models to analyze the flying characteristics of the javelin. Furthermore, the throwing form of athletes was recorded with a high-speed camera to estimate the characteristics of each athlete's form.

  14. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  15. Error Analysis of CM Data Products Sources of Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  16. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    PubMed

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA. By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.

  17. Integrated system for automated financial document processing

    NASA Astrophysics Data System (ADS)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
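
    The blackboard architecture described above can be reduced to a small sketch: independent knowledge sources post hypotheses for a field to a shared blackboard, and a controller selects the best-supported reading. The recognition engines and confidence values below are hypothetical, not the system's actual components:

```python
class Blackboard:
    """Shared workspace: knowledge sources post (value, confidence) hypotheses."""

    def __init__(self):
        self.hypotheses = {}  # field name -> list of (value, confidence)

    def post(self, field, value, confidence):
        self.hypotheses.setdefault(field, []).append((value, confidence))

    def best(self, field):
        # Controller policy: keep the highest-confidence hypothesis.
        return max(self.hypotheses[field], key=lambda h: h[1])

def courtesy_amount_engine(board):
    board.post("amount", "125.00", 0.92)   # hypothetical numeric-field recognizer

def legal_amount_engine(board):
    board.post("amount", "125.00", 0.88)   # hypothetical worded-amount recognizer

board = Blackboard()
for knowledge_source in (courtesy_amount_engine, legal_amount_engine):
    knowledge_source(board)                # each source contributes opportunistically

value, confidence = board.best("amount")
```

    The pattern matters because agreement between independent recognizers (courtesy vs. legal amount) can raise confidence incrementally, which is the behavior the abstract attributes to its blackboard design.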

  18. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    PubMed

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models in simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression will fail to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, in order to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, a multiple linear regression analysis that was being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flow while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of the prediction accuracy of the neuro-fuzzy and linear regression methods indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics. The neuro-fuzzy model improved the root mean square error (RMSE) and mean absolute percentage error (MAPE) values of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling flow dynamics in the study area.
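
    The error metrics used in the comparison above are standard and straightforward to reproduce. A minimal sketch of an RMSE and MAPE comparison between two forecast sets; the observed flows and predictions below are illustrative numbers, not the Citarum data:

```python
import numpy as np

def rmse(obs, pred):
    """Root mean square error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def mape(obs, pred):
    """Mean absolute percentage error (in percent)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.mean(np.abs((obs - pred) / obs)) * 100.0)

# Hypothetical observed flows and two sets of forecasts (e.g., regression vs. neuro-fuzzy)
obs      = np.array([10.0, 12.0, 15.0, 40.0, 22.0])
mlr_pred = np.array([11.0, 10.5, 16.0, 30.0, 24.0])
nf_pred  = np.array([10.5, 11.5, 15.5, 33.0, 22.5])

# Percentage RMSE improvement of the second model over the first
improvement = 100.0 * (rmse(obs, mlr_pred) - rmse(obs, nf_pred)) / rmse(obs, mlr_pred)
```

    Reporting the relative improvement of one model's RMSE/MAPE over another, as the abstract does (13.52% and 10.73%), follows exactly this computation.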

  19. On the source of cross-grain lineations in the central Pacific gravity field

    NASA Technical Reports Server (NTRS)

    Mcadoo, David C.; Sandwell, David T.

    1989-01-01

    The source of the cross-grain lineations in the marine gravity field observed in the central Pacific was investigated by comparing multiple collinear gravity profiles from Geosat data with coincident bathymetry profiles in the Fourier transform domain. Bathymetric data were collected by multibeam sonar systems operating from two research vessels, one in June-August 1985, the other in February and March 1987. The results of this analysis indicate that the lineations are superficial features that appear to result from a combination of subsurface and surface loads supported by a thin (2 km to 5 km) lithosphere.

  20. PyPanda: a Python package for gene regulatory network reconstruction

    PubMed Central

    van IJzendoorn, David G.P.; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L.

    2016-01-01

    Summary: PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of ‘omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. Availability and implementation: The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl PMID:27402905

  1. PyPanda: a Python package for gene regulatory network reconstruction.

    PubMed

    van IJzendoorn, David G P; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L

    2016-11-01

    PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of 'omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl. © The Author 2016. Published by Oxford University Press.

  2. Multiple-reflection optical gas cell

    DOEpatents

    Matthews, Thomas G.

    1983-01-01

    A multiple-reflection optical cell for Raman or fluorescence gas analysis consists of two spherical mirrors positioned transverse to a multiple-pass laser cell in a confronting plane-parallel alignment. The two mirrors are of equal diameter but possess different radii of curvature. The spacing between the mirrors is uniform and less than half of the radius of curvature of either mirror. The mirror of greater curvature possesses a small circular portal in its center which is the effective point source for conventional F1 double lens collection optics of a monochromator-detection system. Gas to be analyzed is flowed into the cell and irradiated by a multiply-reflected composite laser beam centered between the mirrors of the cell. Raman or fluorescence radiation originating from a large volume within the cell is (1) collected via multiple reflections with the cell mirrors, (2) partially collimated and (3) directed through the cell portal in a geometric array compatible with F1 collection optics.

  3. Transplantation of epiphytic bioaccumulators (Tillandsia capillaris) for high spatial resolution biomonitoring of trace elements and point sources deconvolution in a complex mining/smelting urban context

    NASA Astrophysics Data System (ADS)

    Goix, Sylvaine; Resongles, Eléonore; Point, David; Oliva, Priscia; Duprey, Jean Louis; de la Galvez, Erika; Ugarte, Lincy; Huayta, Carlos; Prunier, Jonathan; Zouiten, Cyril; Gardon, Jacques

    2013-12-01

    Monitoring atmospheric trace element (TE) levels and tracing their source origin is essential for exposure assessment and human health studies. Epiphytic Tillandsia capillaris plants were used as bioaccumulators of TE in a complex polymetallic mining/smelting urban context (Oruro, Bolivia). Specimens collected from a pristine reference site were transplanted at a high spatial resolution (˜1 sample/km2) throughout the urban area. Twenty-seven elements were measured after a 4-month exposure, also providing new information values for reference material BCR482. Statistical power analysis for this biomonitoring mapping approach against classical aerosol surveys performed on the same site showed the better aptitude of T. capillaris to detect geographical trends and to deconvolute multiple contamination sources using geostatistical principal component analysis. Transplanted specimens in the vicinity of the mining and smelting areas were characterized by extreme TE accumulation (Sn > Ag > Sb > Pb > Cd > As > W > Cu > Zn). Three contamination sources were identified: mining (Ag, Pb, Sb), smelting (As, Sn) and road traffic (Zn) emissions, confirming the results of a previous aerosol survey.

  4. Rhythmic entrainment source separation: Optimizing analyses of neural responses to rhythmic sensory stimulation.

    PubMed

    Cohen, Michael X; Gulbinaite, Rasa

    2017-02-15

    Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in the absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
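
    The core of denoising source separation approaches like RESS is a spatial filter obtained from a generalized eigendecomposition of two channel covariance matrices: one computed where the rhythmic signal is present and one reference. A minimal simulated sketch; using a noise-only segment as the reference covariance is a simplification of the narrowband vs. neighboring-band covariances used in practice, and none of the numbers correspond to the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000
t = np.arange(0, 2, 1 / fs)
source = np.sin(2 * np.pi * 10 * t)            # simulated 10 Hz steady-state response
mixing = np.array([0.9, 0.5, -0.3, 0.1])       # projection of the source to 4 sensors
noise = 0.3 * rng.standard_normal((4, t.size))
data = np.outer(mixing, source) + noise        # sensors x time

S = np.cov(data)    # covariance with the rhythmic signal present
R = np.cov(noise)   # reference covariance (noise-only, a simplification)

# Generalized eigendecomposition: find the filter w maximizing (w'Sw) / (w'Rw)
evals, evecs = np.linalg.eig(np.linalg.solve(R, S))
w = np.real(evecs[:, np.argmax(np.real(evals))])

component = w @ data                           # single time series with maximized SNR
corr = abs(np.corrcoef(component, source)[0, 1])
```

    The recovered component correlates far more strongly with the simulated rhythm than any single sensor does, which is the advantage of a multivariate filter over picking the electrode with the strongest response.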

  5. Concentration, distribution and source apportionment of atmospheric polycyclic aromatic hydrocarbons in the southeast suburb of Beijing, China.

    PubMed

    Zhang, Shucai; Zhang, Wei; Wang, Kaiyan; Shen, Yating; Hu, Lianwu; Wang, Xuejun

    2009-04-01

    Total suspended particle samples and gas phase samples were collected at three representative sampling sites in the southeastern suburb of Beijing from March 2005 to January 2006. The samples were analyzed for 16 US EPA priority PAHs using GC/MS. Concentrations of Sigma PAHs in the particle and gas phases were 0.21-1.18 x 10(3) ng m(-3) and 9.5 x 10(2)-1.03 x 10(5) ng m(-3), respectively. PAH concentrations displayed seasonal variation in the order of winter>spring>autumn>summer for the particle phase, and winter>autumn>summer>spring for the gas phase. Partial correlation analysis indicates that PAH concentrations in the particle phase are negatively correlated with temperature and positively correlated with the air pollution index of SO(2). No significant correlation is observed between gas phase PAHs and the auxiliary parameters. Sources of PAHs are identified through principal component analysis, and source contributions are estimated through multiple linear regression. Major sources of atmospheric PAHs in the study area include coal combustion, the coke industry, vehicular emissions and natural gas combustion.
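
    The principal component analysis plus multiple linear regression chain used for source apportionment above can be sketched generically: extract component scores from the standardized species matrix, then regress the total concentration on those scores. A minimal sketch with synthetic data; the two latent "sources", their profiles, and all numbers are hypothetical, not the Beijing measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: 50 samples x 6 PAH species driven by two latent sources
s1, s2 = rng.lognormal(size=(2, 50))                 # source strengths per sample
profiles = np.array([[0.40, 0.30, 0.20, 0.05, 0.03, 0.02],
                     [0.05, 0.10, 0.15, 0.30, 0.25, 0.15]])
X = (np.outer(s1, profiles[0]) + np.outer(s2, profiles[1])
     + 0.01 * rng.standard_normal((50, 6)))          # species concentration matrix

# PCA via SVD on the standardized species matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, sing, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :2] * sing[:2]                         # leading two component scores

# Regress total concentration on the scores to apportion contributions
total = X.sum(axis=1)
A = np.column_stack([scores, np.ones(len(total))])   # scores + intercept
coef, *_ = np.linalg.lstsq(A, total, rcond=None)
predicted = A @ coef

ss_res = np.sum((total - predicted) ** 2)
ss_tot = np.sum((total - total.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                           # fit quality of the apportionment
</n```

    In a real apportionment the regression coefficients (combined with the component loadings) are interpreted as per-source contributions; here the two retained components recover nearly all of the variance because the synthetic data contain exactly two sources.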

  6. Cognitive Affective Engagement Model of Multiple Source Use

    ERIC Educational Resources Information Center

    List, Alexandra; Alexander, Patricia A.

    2017-01-01

    This article introduces the cognitive affective engagement model (CAEM) of multiple source use. The CAEM is presented as a way of unifying cognitive and behaviorally focused models of multiple text engagement with research on the role of affective factors (e.g., interest) in text processing. The CAEM proposes that students' engagement with…

  7. The Chandra Source Catalog 2.0

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Miller, Joseph; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula

    2018-01-01

    The current version of the Chandra Source Catalog (CSC) continues to be well utilized by the astronomical community. Usage over the past year has continued to average more than 15,000 searches per month. Version 1.1 of the CSC, released in 2010, includes properties and data for 158,071 detections, corresponding to 106,586 distinct X-ray sources on the sky. The second major release of the catalog, CSC 2.0, will be made available to the user community in early 2018, and preliminary lists of detections and sources are available now. Release 2.0 will roughly triple the size of the current version of the catalog to an estimated 375,000 detections, corresponding to ~315,000 unique X-ray sources. Compared to release 1.1, the limiting sensitivity for compact sources in CSC 2.0 is significantly enhanced. This improvement is achieved by using a two-stage approach that involves stacking (co-adding) multiple observations of the same field prior to source detection, and then using an improved source detection approach that enables us to detect point sources down to ~5 net counts on-axis for exposures shorter than ~15 ks. In addition to enhanced source detection capabilities, improvements to the Bayesian aperture photometry code included in release 2.0 provide robust photometric probability density functions (PDFs) in crowded fields even for low count detections. All post-aperture photometry properties (e.g., hardness ratios, source variability) work directly from the PDFs in release 2.0.
    CSC 2.0 also adds a Bayesian Blocks analysis of the multi-band aperture photometry PDFs to identify multiple observations of the same source that have similar photometric properties, and therefore can be analyzed simultaneously to improve S/N. We briefly describe these and other updates that significantly enhance the scientific utility of CSC 2.0 when compared to the earlier catalog release. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.

  8. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales.

    PubMed

    Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.

  9. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales

    PubMed Central

    Rueckl, Martin; Lenzi, Stephen C.; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W.

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales. PMID:28706482

  10. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server-side for the storage of microarray datasets collected from various resources. The client-side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework that offers other developers the flexibility to plug in additional statistical algorithms.
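
    The abstract does not specify MAMA's algorithms, but the standard building block of cross-experiment meta-analysis is an inverse-variance weighted combination of per-study effect sizes. A minimal fixed-effect sketch; the effect sizes and variances below are hypothetical:

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect inverse-variance meta-analysis: combine per-study effect
    sizes into a pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical log-fold-change estimates for one gene across three experiments
pooled, se = fixed_effect_meta([0.8, 1.1, 0.9], [0.04, 0.09, 0.06])
```

    Precision-weighting the studies means the pooled estimate leans toward the experiments with the smallest variance, and the pooled standard error is smaller than any single study's, which is the statistical payoff of analyzing across experiments.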

  11. Active Vibration Control for Helicopter Interior Noise Reduction Using Power Minimization

    NASA Technical Reports Server (NTRS)

    Mendoza, J.; Chevva, K.; Sun, F.; Blanc, A.; Kim, S. B.

    2014-01-01

    This report describes work performed by United Technologies Research Center (UTRC) for NASA Langley Research Center (LaRC) under Contract NNL11AA06C. The objective of this program is to develop technology to reduce helicopter interior noise resulting from multiple gear meshing frequencies. A novel active vibration control approach called Minimum Actuation Power (MAP) is developed. MAP is an optimal control strategy that minimizes the total input power into a structure by monitoring and varying the input power of controlling sources. MAP control was implemented without explicit knowledge of the phasing and magnitude of the excitation sources by driving the real part of the input power from the controlling sources to zero. It is shown that this occurs when the total mechanical input power from the excitation and controlling sources is a minimum. MAP theory is developed for multiple excitation sources with arbitrary relative phasing, at single or multiple discrete frequencies, controlled by one or more controlling sources. Simulations and experimental results demonstrate the feasibility of MAP for structural vibration reduction of a realistic rotorcraft interior structure. MAP control resulted in significant average global vibration reduction for single-frequency and multiple-frequency excitations with one controlling actuator. Simulations also demonstrate the potential effectiveness of the observed vibration reductions on interior radiated noise.

  12. Analyzing the contribution of climate change to long-term variations in sediment nitrogen sources for reservoirs/lakes.

    PubMed

    Xia, Xinghui; Wu, Qiong; Zhu, Baotong; Zhao, Pujun; Zhang, Shangwei; Yang, Lingyan

    2015-08-01

    We applied a mixing model based on stable isotopic δ(13)C, δ(15)N, and C:N ratios to estimate the contributions of multiple sources to sediment nitrogen. We also developed a conceptual model describing and analyzing the impacts of climate change on nitrogen enrichment. These two models were conducted in Miyun Reservoir to analyze the contribution of climate change to the variations in sediment nitrogen sources based on two (210)Pb- and (137)Cs-dated sediment cores. The results showed that during the past 50 years, average contributions of soil and fertilizer, submerged macrophytes, N2-fixing phytoplankton, and non-N2-fixing phytoplankton were 40.7%, 40.3%, 11.8%, and 7.2%, respectively. In addition, total nitrogen (TN) contents in sediment showed significant increasing trends from 1960 to 2010, and sediment nitrogen of both submerged macrophyte and phytoplankton sources exhibited significant increasing trends during the past 50 years. In contrast, soil and fertilizer sources showed a significant decreasing trend from 1990 to 2010. According to the changing trend of N2-fixing phytoplankton, changes of temperature and sunshine duration accounted for at least 43% of the trend in the sediment nitrogen enrichment over the past 50 years. Regression analysis of the climatic factors on nitrogen sources showed that the contributions of precipitation, temperature, and sunshine duration to the variations in sediment nitrogen sources ranged from 18.5% to 60.3%. The study demonstrates that the mixing model provides a robust method for calculating the contribution of multiple nitrogen sources in sediment, and this study also suggests that N2-fixing phytoplankton could be regarded as an important response factor for assessing the impacts of climate change on nitrogen enrichment. Copyright © 2015 Elsevier B.V. All rights reserved.
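
    A mixing model of the kind described above amounts to a mass-balance system: each tracer value of the mixture is a fraction-weighted average of the source end-member signatures, with the fractions summing to one. A minimal three-tracer, three-source sketch; the end-member values and mixture below are hypothetical, not the Miyun Reservoir data:

```python
import numpy as np

# Hypothetical end-member signatures: rows are tracers (delta13C, delta15N, C:N),
# columns are sources (soil/fertilizer, macrophytes, phytoplankton).
sources = np.array([
    [-26.0, -21.0, -14.0],   # delta13C
    [  4.0,   8.0,  -1.0],   # delta15N
    [ 12.0,  15.0,   7.0],   # C:N
])
mixture = np.array([-22.1, 4.2, 11.9])   # measured signature of the sediment sample

# Mass balance: sources @ f = mixture, with the fractions f summing to 1.
A = np.vstack([sources, np.ones(3)])     # append the sum-to-one constraint row
b = np.append(mixture, 1.0)
fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
```

    With three tracers and three sources the system is (over)determined and the least-squares solution is unique; real applications add uncertainty propagation (e.g., the Monte Carlo style approaches used in isotope mixing software) on top of this linear core.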

  13. Chandra follow up analysis on HESS J1841-055

    NASA Astrophysics Data System (ADS)

    Wilbert, Sven

    2012-07-01

    State-of-the-art Imaging Atmospheric Cherenkov Telescopes (IACTs) such as the Very Energetic Radiation Imaging Telescope Array System (VERITAS) and the High Energy Stereoscopic System (H.E.S.S.) have surveyed the sky in order to discover new sources. The first and most famous is the H.E.S.S. survey of the inner Galactic plane. So far more than 50 Galactic TeV gamma-ray sources have been detected, a large number of which remain unidentified. HESS J1841-055 is one of the largest and most complex of these unidentified sources, with an extension of approximately 1°. Follow-up observations of the HESS J1841-055 region with Chandra, which owing to its high resolution is well suited to searching for X-ray counterparts, together with additional analysis, have revealed several X-ray sources spatially coincident with the multiple TeV emission peaks. The search for counterparts indicates that not a single source but rather a collection of sources of different natures could be the origin of this complex diffuse emission region; among them are the SNR Kes 73, the pulsar within Kes 73, 1E 1841-45, and also the high-mass X-ray binary AX 184100.4-0536, and others.

  14. MEG and EEG data analysis with MNE-Python.

    PubMed

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-12-26

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne.

  15. MEG and EEG data analysis with MNE-Python

    PubMed Central

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A.; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne. PMID:24431986

  16. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions.
Many software tools could be improved by enabling user-defined exposure and vulnerability; without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is potential for valuable synergy between existing software: a number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has attempted to provide a platform for dialogue among open source and open access software packages and, hopefully, to inspire collaboration between their developers, given the great work they have already done.
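
    The multi-criteria comparison described in this record can be illustrated with a minimal weighted-sum scoring sketch (the criteria, weights, packages, and scores below are hypothetical stand-ins, not the review's actual 100+ criteria):

    ```python
    # Weighted-sum multi-criteria scoring: each package gets one aggregate
    # score from normalized per-criterion scores (all values illustrative).
    criteria_weights = {"open_source": 0.3, "documentation": 0.2,
                        "global_coverage": 0.3, "active_support": 0.2}

    packages = {
        "model_a": {"open_source": 1.0, "documentation": 0.8,
                    "global_coverage": 0.6, "active_support": 0.9},
        "model_b": {"open_source": 1.0, "documentation": 0.5,
                    "global_coverage": 0.9, "active_support": 0.4},
    }

    def weighted_score(scores, weights):
        """Aggregate normalized criterion scores into a single ranking score."""
        return sum(weights[c] * scores[c] for c in weights)

    ranking = sorted(packages,
                     key=lambda p: weighted_score(packages[p], criteria_weights),
                     reverse=True)
    print(ranking)   # → ['model_a', 'model_b']
    ```

    Real MCDA evaluations typically also normalize raw criterion values and test the sensitivity of the ranking to the chosen weights; a weighted sum is only the simplest aggregation rule.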

  17. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript presents an analysis of a Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) system. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). System performance in terms of bit error rate (BER) degrades as MAI increases. The number of users and the type of codes used directly determine the performance of an optical system. MAI can be restricted by efficiently designing optical codes and implementing them with a unique architecture to accommodate more users. Hence, it is necessary to design a technique such as the spectral direct detection (SDD) technique with a modified double weight code, which can provide better cardinality and good correlation properties.
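
    The MAI-driven BER degradation described above can be illustrated with a generic Gaussian-approximation model. This is a sketch only: the `snr_single_user` value and the assumption that interference power grows linearly with the number of active users are illustrative, not the paper's SDD/modified-double-weight analysis.

    ```python
    import math

    def ber_gaussian(num_users, snr_single_user=25.0):
        """Illustrative BER under a Gaussian approximation: each additional
        active user adds one unit of interference power, lowering the
        effective SNR (hypothetical parameters)."""
        interferers = max(num_users - 1, 0)
        effective_snr = snr_single_user / (1.0 + interferers)
        # Gaussian-approximation BER for binary detection: Q(sqrt(SNR))
        return 0.5 * math.erfc(math.sqrt(effective_snr / 2.0))

    ber_few, ber_many = ber_gaussian(2), ber_gaussian(30)
    ```

    Even in this toy model, BER rises by orders of magnitude as the user count grows, which is why code design (low cross-correlation) and detection schemes such as SDD matter for cardinality.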

  18. Source Evaluation, Comprehension, and Learning in Internet Science Inquiry Tasks

    ERIC Educational Resources Information Center

    Wiley, Jennifer; Goldman, Susan R.; Graesser, Arthur C.; Sanchez, Christopher A.; Ash, Ivan K.; Hemmerich, Joshua A.

    2009-01-01

    In two experiments, undergraduates' evaluation and use of multiple Internet sources during a science inquiry task were examined. In Experiment 1, undergraduates had the task of explaining what caused the eruption of Mt. St. Helens using the results of an Internet search. Multiple regression analyses indicated that source evaluation significantly…

  19. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    PubMed

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
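
    The ROI signal-extraction step that SIMA automates can be illustrated in a few lines: for each frame of the time series, average the pixels inside each ROI mask. This is a conceptual sketch with made-up data, not the actual SIMA API.

    ```python
    # Sketch of ROI signal extraction from an imaging time series
    # (illustrative only; SIMA's real interface differs).
    def extract_signals(frames, roi_masks):
        """frames: list of 2-D pixel grids; roi_masks: list of sets of (row, col).
        Returns one mean-intensity trace per ROI."""
        traces = []
        for mask in roi_masks:
            trace = [sum(frame[r][c] for r, c in mask) / len(mask)
                     for frame in frames]
            traces.append(trace)
        return traces

    frames = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]   # two 2x2 frames
    rois = [{(0, 0), (0, 1)}, {(1, 1)}]             # two hypothetical ROIs
    signals = extract_signals(frames, rois)
    print(signals)   # → [[1.5, 5.5], [4.0, 8.0]]
    ```

    In practice this step follows motion correction and segmentation, and the raw traces are usually converted to ΔF/F before further analysis.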

  20. Effects of land cover, topography, and built structure on seasonal water quality at multiple spatial scales.

    PubMed

    Pratt, Bethany; Chang, Heejun

    2012-03-30

    The relationship among land cover, topography, built structure, and stream water quality in the Portland Metro region of Oregon and Clark County, Washington, USA, is analyzed using ordinary least squares (OLS) and geographically weighted regression (GWR) models. Two scales of analysis, a sectional watershed and a buffer, offered a local and a global investigation of the sources of stream pollutants. Model accuracy, measured by R(2) values, fluctuated according to the scale, season, and regression method used. While most wet season water quality parameters are associated with urban land covers, most dry season water quality parameters are related to topographic features such as elevation and slope. GWR models, which take into consideration local relations of spatial autocorrelation, had stronger results than OLS regression models. In the multiple regression models, sectioned watershed results were consistently better than the sectioned buffer results, except for dry season pH and stream temperature parameters. This suggests that while riparian land cover does have an effect on water quality, a wider contributing area needs to be included in order to account for distant sources of pollutants. Copyright © 2012 Elsevier B.V. All rights reserved.
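
    As a sketch of the OLS side of such an analysis, the normal equations (X'X)b = X'y can be solved directly. The predictors and response below are fabricated for illustration; a real watershed study would use a statistics package and many more observations.

    ```python
    # Minimal multiple regression via the normal equations (X'X)b = X'y.
    def ols(X, y):
        """X: list of rows [1, x1, x2, ...] (leading 1 = intercept)."""
        k = len(X[0])
        XtX = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)]
               for a in range(k)]
        Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
        # Gauss-Jordan elimination on the augmented system
        M = [row[:] + [v] for row, v in zip(XtX, Xty)]
        for col in range(k):
            piv = max(range(col, k), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            pivval = M[col][col]
            M[col] = [v / pivval for v in M[col]]
            for r in range(k):
                if r != col:
                    f = M[r][col]
                    M[r] = [vr - f * vc for vr, vc in zip(M[r], M[col])]
        return [M[r][k] for r in range(k)]

    # Noiseless toy data generated from y = 1 + 2*x1 - 0.5*x2,
    # so the coefficients are recovered exactly.
    X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]]
    y = [1 + 2 * x1 - 0.5 * x2 for _, x1, x2 in X]
    coeffs = ols(X, y)
    print([round(b, 6) for b in coeffs])   # → [1.0, 2.0, -0.5]
    ```

    GWR extends this by re-fitting the same equations at each location with distance-based weights, which is how it captures the local spatial relationships the abstract refers to.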

  1. Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan

    2015-10-01

    Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.

  2. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731

  3. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    PubMed

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
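
    The truth-table construction at the heart of crisp-set QCA can be sketched simply: group cases by their binary condition combinations and compute how consistently each combination produces the outcome. The conditions and cases below are hypothetical stand-ins, not the 19 systems studied.

    ```python
    # Minimal crisp-set QCA truth table (illustrative cases only).
    from collections import defaultdict

    # Each case: condition set -> outcome (1 = timely situational awareness).
    cases = [
        ({"non_clinical": 1, "automated": 1, "multi_syndrome": 1}, 1),
        ({"non_clinical": 1, "automated": 1, "multi_syndrome": 1}, 1),
        ({"non_clinical": 0, "automated": 1, "multi_syndrome": 0}, 0),
        ({"non_clinical": 1, "automated": 0, "multi_syndrome": 0}, 0),
    ]

    rows = defaultdict(list)
    for conditions, outcome in cases:
        key = tuple(sorted(conditions.items()))   # one truth-table row per combination
        rows[key].append(outcome)

    # Consistency: share of cases in each row that show the outcome.
    truth_table = {key: sum(v) / len(v) for key, v in rows.items()}
    ```

    Full csQCA then applies Boolean minimization to the consistent rows to derive the simplest sufficient condition combinations; the truth table above is the input to that step.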

  4. Time-Frequency Analysis of the Dispersion of Lamb Modes

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo-Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A0, A1, S0, and S2 Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source-to-detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.
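
    The pseudo-Wigner-Ville distribution used above can be sketched in a few lines. This minimal version uses a rectangular lag window and a pure tone as input, both simplifying assumptions; real implementations add smoothing windows and analytic-signal preprocessing.

    ```python
    import cmath

    def pseudo_wvd(x, half_window):
        """Discrete pseudo-Wigner-Ville distribution (rectangular-window sketch).
        Returns W[n][k]: energy of complex signal x around time n, frequency bin k."""
        N = len(x)
        W = []
        for n in range(N):
            L = min(half_window, n, N - 1 - n)   # keep the lag window inside the signal
            row = []
            for k in range(N // 2):
                # Instantaneous autocorrelation x[n+m]*conj(x[n-m]), Fourier-
                # transformed over the lag m (note the doubled frequency factor).
                acc = sum(x[n + m] * x[n - m].conjugate()
                          * cmath.exp(-4j * cmath.pi * m * k / N)
                          for m in range(-L, L + 1))
                row.append(abs(acc))
            W.append(row)
        return W

    # A pure tone at frequency bin 5 concentrates its energy in that bin.
    N, f = 64, 5
    x = [cmath.exp(2j * cmath.pi * f * t / N) for t in range(N)]
    W = pseudo_wvd(x, half_window=8)
    mid = W[N // 2]
    print(mid.index(max(mid)))   # → 5
    ```

    For a dispersive multi-mode signal, the ridge of W in the time-frequency plane traces arrival time versus frequency, which is exactly what yields the group velocity dispersion curves described in the abstract.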

  5. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. 
This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
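
    The statistical enrichment measurement that pipelines such as PPEP build on is commonly a one-sided hypergeometric (Fisher-type) test: given how many selected genes fall in a pathway, how surprising is that overlap by chance? A minimal sketch with hypothetical gene counts:

    ```python
    # One-sided hypergeometric enrichment test (illustrative counts).
    from math import comb

    def enrichment_pvalue(total_genes, pathway_genes, selected, overlap):
        """Probability of observing at least `overlap` pathway genes in a
        random selection of `selected` genes out of `total_genes`."""
        return sum(
            comb(pathway_genes, k) * comb(total_genes - pathway_genes, selected - k)
            for k in range(overlap, min(pathway_genes, selected) + 1)
        ) / comb(total_genes, selected)

    # Hypothetical example: 10 of 200 selected genes hit a 100-gene pathway
    # in a 20,000-gene genome (expected overlap by chance: 1).
    p = enrichment_pvalue(total_genes=20000, pathway_genes=100,
                          selected=200, overlap=10)
    ```

    Because many pathways are tested at once, real analyses follow this with multiple-testing correction (e.g. Benjamini-Hochberg) before declaring a pathway enriched.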

  6. Passive radio frequency peak power multiplier

    DOEpatents

    Farkas, Zoltan D.; Wilson, Perry B.

    1977-01-01

    Peak power multiplication of a radio frequency source is achieved by simultaneously charging two high-Q resonant microwave cavities: the source output is applied through a directional coupler to the cavities, and the phase of the source power delivered to the coupler is then reversed, permitting the power in the cavities to discharge simultaneously through the coupler to the load, in combination with power from the source, so that the peak power applied to the load is a multiple of the source peak power.

  7. Estimation of splitting functions from Earth's normal mode spectra using the neighbourhood algorithm

    NASA Astrophysics Data System (ADS)

    Pachhai, Surya; Tkalčić, Hrvoje; Masters, Guy

    2016-01-01

    The inverse problem for Earth structure from normal mode data is strongly non-linear and can be inherently non-unique. Traditionally, the inversion is linearized by taking partial derivatives of the complex spectra with respect to the model parameters (i.e. structure coefficients), and solved in an iterative fashion. This method requires that the earthquake source model is known. However, the release of energy in large earthquakes used for the analysis of Earth's normal modes is not simple. A point source approximation is often inadequate, and a more complete account of energy release at the source is required. In addition, many earthquakes are required for the solution to be insensitive to the initial constraints and regularization. In contrast to an iterative approach, the autoregressive linear inversion technique conveniently avoids the need for earthquake source parameters, but it also requires a number of events to achieve full convergence when a single event does not excite all singlets well. To build on previous improvements, we develop a technique to estimate structure coefficients (and consequently, the splitting functions) using a derivative-free parameter search known as the neighbourhood algorithm (NA). We implement an efficient forward method derived using the autoregression of receiver strips, and this allows us to search over a multiplicity of structure coefficients in a relatively short time. After demonstrating the feasibility of NA in synthetic cases, we apply it to observations of the inner-core-sensitive mode 13S2. The splitting function of this mode is dominated by spherical harmonic degree 2 axisymmetric structure and is consistent with the results obtained from the autoregressive linear inversion. The sensitivity analysis of multiple events confirms the importance of the 1994 Bolivia earthquake. When this event is used in the analysis, as few as two events are sufficient to constrain the splitting functions of the 13S2 mode.
Apart from not requiring knowledge of the earthquake source, the newly developed technique provides an approximate uncertainty measure for the structure coefficients and allows us to control the type of structure solved for, for example to establish whether elastic structure is sufficient.
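
    A greatly simplified derivative-free search in the spirit of the neighbourhood algorithm can be sketched as follows. This toy version resamples inside shrinking boxes around the current best models rather than true Voronoi cells, and the misfit function and parameter bounds are hypothetical, so it illustrates the idea only.

    ```python
    import random

    def na_style_search(misfit, bounds, n_init=64, n_resample=16, n_iter=8, seed=1):
        """Simplified neighbourhood-algorithm-style direct search: sample the
        parameter space uniformly, then repeatedly resample near the best
        models while shrinking the neighbourhoods (a sketch, not Sambridge's
        full Voronoi-cell algorithm)."""
        rng = random.Random(seed)
        pop = [tuple(rng.uniform(lo, hi) for lo, hi in bounds)
               for _ in range(n_init)]
        for it in range(1, n_iter + 1):
            pop.sort(key=misfit)
            best = pop[:4]                 # keep the best models (cells)
            shrink = 0.5 ** it             # narrow the neighbourhoods each pass
            pop = best + [tuple(b + shrink * (hi - lo) * rng.uniform(-0.5, 0.5)
                                for b, (lo, hi) in zip(m, bounds))
                          for m in best for _ in range(n_resample // 4)]
        return min(pop, key=misfit)

    # Toy two-parameter "structure coefficient" problem with optimum (0.3, -0.2).
    target = (0.3, -0.2)
    misfit = lambda m: sum((a - b) ** 2 for a, b in zip(m, target))
    best = na_style_search(misfit, bounds=[(-1, 1), (-1, 1)])
    ```

    The appeal for normal-mode inversion is that the forward misfit can be evaluated without derivatives or a known source model, and the ensemble of sampled models gives an approximate picture of parameter uncertainty.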

  8. Characteristics of violence among high-risk adolescent girls.

    PubMed

    Secor-Turner, Molly; Garwick, Ann; Sieving, Renee; Seppelt, Ann

    2014-01-01

    Recent evidence demonstrates increasing rates of involvement with violence among adolescent girls. The objective of this study was to describe the types and sources of violence experienced within social contexts of adolescent girls at high risk for pregnancy. Qualitative data for this analysis are drawn from intervention summary reports of 116 girls participating in Prime Time, a youth development intervention for adolescent girls. Descriptive content analysis techniques were used to identify types and sources of violence experienced by girls within their daily contexts. Types of violence included physical fighting, witnessing violence, physical abuse, gang-related violence, verbal fighting, verbal abuse, and sexual abuse. Sources of violence included family, peers and friends, romantic partners, community violence, and self-perpetrated violence. Many girls in this study experienced violence in multiple contexts. It is imperative that efforts to assess and prevent violence among adolescent girls include paying attention to the social contexts in which these adolescents live. Copyright © 2014 National Association of Pediatric Nurse Practitioners. Published by Mosby, Inc. All rights reserved.

  9. [Global Atmospheric Chemistry/Transport Modeling and Data-Analysis

    NASA Technical Reports Server (NTRS)

    Prinn, Ronald G.

    1999-01-01

    This grant supported a global atmospheric chemistry/transport modeling and data-analysis project devoted to: (a) development, testing, and refining of inverse methods for determining regional and global transient source and sink strengths for trace gases; (b) utilization of these inverse methods, which use either the Model for Atmospheric Chemistry and Transport (MATCH), based on analyzed observed winds, or back-trajectories calculated from these same winds, for determining regional and global source and sink strengths for long-lived trace gases important in ozone depletion and the greenhouse effect; (c) determination of global (and perhaps regional) average hydroxyl radical concentrations using inverse methods with multiple "titrating" gases; and (d) computation of the lifetimes and spatially resolved destruction rates of trace gases using 3D models. Important ultimate goals included determination of regional source strengths of important biogenic/anthropogenic trace gases and also of halocarbons restricted by the Montreal Protocol and its follow-on agreements, and hydrohalocarbons now used as alternatives to the above restricted halocarbons.

  10. CellProfiler and KNIME: open source tools for high content screening.

    PubMed

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two limitations to the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  11. Lifting degeneracy in holographic characterization of colloidal particles using multi-color imaging.

    PubMed

    Ruffner, David B; Cheong, Fook Chiong; Blusewicz, Jaroslaw M; Philips, Laura A

    2018-05-14

    Micrometer-sized particles can be accurately characterized using holographic video microscopy and Lorenz-Mie fitting. In this work, we explore some of the limitations in holographic microscopy and introduce methods for increasing the accuracy of this technique through the use of multiple wavelengths of laser illumination. Holograms of large, high-index particles have near-degenerate solutions that can confuse standard fitting algorithms. Using a model based on diffraction from a phase disk, we explain the source of these degeneracies. We introduce multiple color holography as an effective approach to distinguish between degenerate solutions and provide improved accuracy for the holographic analysis of sub-visible colloidal particles.

  12. Disentangling formation of multiple-core holes in aminophenol molecules exposed to bright X-FEL radiation

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Kamińska, M.; Mucke, M.; Squibb, R. J.; Eland, J. H. D.; Piancastelli, M. N.; Frasinski, L. J.; Grilj, J.; Koch, M.; McFarland, B. K.; Sistrunk, E.; Gühr, M.; Coffee, R. N.; Bostedt, C.; Bozek, J. D.; Salén, P.; Meulen, P. v. d.; Linusson, P.; Thomas, R. D.; Larsson, M.; Foucar, L.; Ullrich, J.; Motomura, K.; Mondal, S.; Ueda, K.; Richter, R.; Prince, K. C.; Takahashi, O.; Osipov, T.; Fang, L.; Murphy, B. F.; Berrah, N.; Feifel, R.

    2015-12-01

    Competing multi-photon ionization processes, some leading to the formation of double core hole states, have been examined in 4-aminophenol. The experiments used the linac coherent light source (LCLS) x-ray free electron laser, in combination with a time-of-flight magnetic bottle electron spectrometer and the correlation analysis method of covariance mapping. The results imply that 4-aminophenol molecules exposed to the focused x-ray pulses of the LCLS sequentially absorb more than two x-ray photons, resulting in the formation of multiple core holes as well as in the sequential removal of photoelectrons and Auger electrons (so-called PAPA sequences).
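
    Covariance mapping, the correlation-analysis method mentioned above, reduces to a simple statistic over many shots: channels whose signals fluctuate together (because they come from the same event) show positive covariance. A sketch with synthetic two-state data, not LCLS spectra:

    ```python
    # Covariance mapping sketch: cov[i][j] = <S_i S_j> - <S_i><S_j> over shots.
    def covariance_map(shots):
        """shots: list of spectra (equal-length lists of channel intensities).
        Returns the full channel-vs-channel covariance matrix."""
        n_shots, n_ch = len(shots), len(shots[0])
        mean = [sum(s[c] for s in shots) / n_shots for c in range(n_ch)]
        return [[sum(s[i] * s[j] for s in shots) / n_shots - mean[i] * mean[j]
                 for j in range(n_ch)] for i in range(n_ch)]

    # Synthetic data: channels 0 and 1 always fire together (same event);
    # channel 2 fluctuates independently of them.
    shots = [[1, 1, 0], [0, 0, 1], [1, 1, 1], [0, 0, 0]]
    cmap = covariance_map(shots)
    print(round(cmap[0][1], 3), round(cmap[0][2], 3))   # → 0.25 0.0
    ```

    In the experiment the "channels" are electron time-of-flight bins, and off-diagonal islands in the map identify photoelectron/Auger pairs emitted in the same multi-photon sequence.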

  13. Disentangling formation of multiple-core holes in aminophenol molecules exposed to bright X-FEL radiation

    DOE PAGES

    Zhaunerchyk, V.; Kaminska, M.; Mucke, M.; ...

    2015-10-28

    Competing multi-photon ionization processes, some leading to the formation of double core hole states, have been examined in 4-aminophenol. The experiments used the linac coherent light source (LCLS) x-ray free electron laser, in combination with a time-of-flight magnetic bottle electron spectrometer and the correlation analysis method of covariance mapping. The results imply that 4-aminophenol molecules exposed to the focused x-ray pulses of the LCLS sequentially absorb more than two x-ray photons, resulting in the formation of multiple core holes as well as in the sequential removal of photoelectrons and Auger electrons (so-called PAPA sequences).

  14. Groundwater arsenic and education attainment in Bangladesh.

    PubMed

    Murray, Michael P; Sharmin, Raisa

    2015-10-26

    Thousands of groundwater tube wells serving millions of Bangladeshis are arsenic-contaminated. This study investigates the effect of these wells on the education attainment and school attendance of youths who rely on those wells for drinking water. The analysis combines data from the 2006 Bangladesh Multiple Indicator Cluster Survey (2006 MICS) and the National Hydrochemical Survey (NHS) of Bangladeshi tube wells' contamination conducted between 1998 and 2000. The study uses multiple regression analysis to estimate the differences in education attainment and school attendance among the following: (i) youths who live where tube wells are safe, (ii) youths who live where tube wells are unsafe but who report drinking from an arsenic-free source, and (iii) youths who live where tube wells are unsafe but who do not report drinking from an arsenic-free source. Controlling for other determinants of education attainment and school attendance, young Bangladeshi males who live where tube wells are unsafe (by Bangladeshi standards) but who report drinking from arsenic-free sources are found to have the same education attainment (among 19- to 21-year-olds) and school attendance (among 6- to 10-year-olds), on average, as corresponding young Bangladeshi males who live where wells are safe. But young Bangladeshi males who live where tube wells are unsafe and who do not report drinking from an arsenic-free source attain, on average, a half-year less education (among 19- to 21-year-olds) and attend school, on average, five to seven fewer days a year (among 6- to 10-year-olds) than do other Bangladeshi males of those ages. The estimated effects for females are of the same sign but much smaller in magnitude. Bangladeshi public health measures to shift drinking from unsafe to safe wells not only advance good health but also increase males' education attainment.
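    The group comparison described above is a standard dummy-variable regression, with safe-well areas as the reference category. A minimal sketch on synthetic data (the sample size, covariate, noise level, and effect sizes are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: regress years of schooling on indicators for the three
# exposure groups (group 0, safe wells, is the reference), plus a control.
n = 300
group = rng.integers(0, 3, n)         # 0: safe; 1: unsafe, arsenic-free source; 2: unsafe, exposed
age = rng.uniform(19, 21, n)          # illustrative control covariate
effects = np.array([0.0, 0.0, -0.5])  # exposed group attains ~half a year less
school = 10 + effects[group] + 0.1 * age + rng.normal(0, 0.3, n)

# OLS via least squares: intercept, two group dummies, control
X = np.column_stack([np.ones(n), group == 1, group == 2, age]).astype(float)
beta, *_ = np.linalg.lstsq(X, school, rcond=None)
# beta[2] estimates the education gap for the exposed group (~ -0.5 here)
```

    The coefficient on the second dummy plays the role of the half-year attainment gap reported in the abstract.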

  15. sLORETA current source density analysis of evoked potentials for spatial updating in a virtual navigation task

    PubMed Central

    Nguyen, Hai M.; Matsumoto, Jumpei; Tran, Anh H.; Ono, Taketoshi; Nishijo, Hisao

    2014-01-01

    Previous studies have reported that multiple brain regions are activated during spatial navigation. However, it is unclear whether these activated brain regions are specifically associated with spatial updating or whether some regions are recruited for parallel cognitive processes. The present study aimed to localize current sources of event-related potentials (ERPs) specifically associated with spatial updating. In the control phase of the experiment, electroencephalograms (EEGs) were recorded while subjects sequentially traced 10 blue checkpoints on the streets of a virtual town, which were sequentially connected by a green line, by manipulating a joystick. In the test phase of the experiment, the checkpoints and green line were not indicated. Instead, a tone was presented when the subjects entered the reference points, where they were then required to trace the 10 invisible spatial reference points corresponding to the checkpoints. The vertex-positive ERPs with latencies of approximately 340 ms from the moment when the subjects entered the unmarked reference points were significantly larger in the test phase than in the control phase. Current source density analysis of the ERPs by standardized low-resolution brain electromagnetic tomography (sLORETA) indicated activation of brain regions in the test phase that are associated with place and landmark recognition (entorhinal cortex/hippocampus, parahippocampal and retrosplenial cortices, fusiform, and lingual gyri), detecting self-motion (posterior cingulate and posterior insular cortices), motor planning (superior frontal gyrus, including the medial frontal cortex), and regions that process spatial attention (inferior parietal lobule). The present results provide the first identification of the current sources of ERPs associated with spatial updating, and suggest that multiple systems are active in parallel during spatial updating. PMID:24624067

  16. A HIGHLY ELONGATED PROMINENT LENS AT z = 0.87: FIRST STRONG-LENSING ANALYSIS OF EL GORDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitrin, Adi; Menanteau, Felipe; Hughes, John P.

    We present the first strong-lensing (SL) analysis of the galaxy cluster ACT-CL J0102-4915 (El Gordo), in recent HST/ACS images, revealing a prominent strong lens at a redshift of z = 0.87. This finding adds to the already-established unique properties of El Gordo: it is the most massive, hot, X-ray luminous, and bright Sunyaev-Zeldovich effect cluster at z ≳ 0.6, and the only "bullet"-like merging cluster known at these redshifts. The lens consists of two merging massive clumps, where, for a source redshift of z_s ≈ 2, each clump exhibits only a small, separate critical area, with a total area of 0.69 ± 0.11 arcmin² over the two clumps. For a higher source redshift, z_s ≈ 4, the critical curves of the two clumps merge together into one bigger and very elongated lens (axis ratio ≈ 5.5), enclosing an effective area of 1.44 ± 0.22 arcmin². The critical curves continue expanding with increasing redshift so that for high-redshift sources (z_s ≳ 9) they enclose an area of ≈1.91 ± 0.30 arcmin² (effective θ_e ≈ 46.8″ ± 3.7″) and a mass of 6.09 ± 1.04 × 10^14 M_⊙. According to our model, the area of high magnification (μ > 10) for such high-redshift sources is ≈1.2 arcmin², and the area with μ > 5 is ≈2.3 arcmin², making El Gordo a compelling target for studying the high-redshift universe. We obtain a strong lower limit on the total mass of El Gordo, ≳1.7 × 10^15 M_⊙, from the SL regime alone, suggesting a total mass of roughly M_200 ≈ 2.3 × 10^15 M_⊙. Our results should be revisited when additional spectroscopic and HST imaging data are available.
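    The effective Einstein radius quoted alongside the critical area follows from treating the enclosed area as an equivalent circle, θ_e = √(A/π). A quick consistency check of the abstract's high-redshift numbers:

```python
import math

def effective_einstein_radius(area_sq_arcmin):
    """Effective Einstein radius (in arcsec) of a critical curve enclosing
    a given area (in arcmin^2), treating the enclosed region as an
    equivalent circle: theta_e = sqrt(A / pi)."""
    theta_arcmin = math.sqrt(area_sq_arcmin / math.pi)
    return theta_arcmin * 60.0  # arcmin -> arcsec

# The ~1.91 arcmin^2 critical area quoted for high-redshift sources
theta_e = effective_einstein_radius(1.91)  # consistent with ~46.8 arcsec
```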

  17. A Bayesian Analysis of Scale-Invariant Processes

    DTIC Science & Technology

    2012-01-01

    Earth Grid (EASE-Grid). The NED raster elevation data of one arc-second resolution (30 m) over the continental US are derived from multiple satellites ... empirical and ME distributions, yet ensuring computational efficiency. Instead of computing empirical histograms from large amounts of data, only some

  18. Game Design and Learning: A Conjectural Analysis of How Massively Multiple Online Role-Playing Games (MMORPGs) Foster Intrinsic Motivation

    ERIC Educational Resources Information Center

    Dickey, Michele D.

    2007-01-01

    During the past two decades, the popularity of computer and video games has prompted games to become a source of study for educational researchers and instructional designers investigating how various aspects of game design might be appropriated, borrowed, and re-purposed for the design of educational materials. The purpose of this paper is to…

  19. Genetic variation and seed transfer guidelines for ponderosa pine in the Ochoco and Malheur National Forests of central Oregon.

    Treesearch

    Frank C. Sorensen; John C. Weber

    1994-01-01

    Adaptive genetic variation in seed and seedling traits was evaluated for 280 families from 220 locations. Factor scores from three principal components were related by multiple regression to latitude, longitude, elevation, slope, and aspect of the seed source, and by classification analysis to seed zone and elevation band in seed zone. Location variance was significant...

  20. Safe, Multiphase Bounds Check Elimination in Java

    DTIC Science & Technology

    2010-01-28

    production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses...invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual...unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds

  1. Fast and Efficient Feature Engineering for Multi-Cohort Analysis of EHR Data.

    PubMed

    Ozery-Flato, Michal; Yanover, Chen; Gottlieb, Assaf; Weissbrod, Omer; Parush Shear-Yashuv, Naama; Goldschmidt, Yaara

    2017-01-01

    We present a framework for feature engineering, tailored for longitudinal structured data, such as electronic health records (EHRs). To fast-track feature engineering and extraction, the framework combines general-use plug-in extractors, a multi-cohort management mechanism, and modular memoization. Using this framework, we rapidly extracted thousands of features from diverse and large healthcare data sources in multiple projects.

  2. Multiple-rule bias in the comparison of classification rules

    PubMed Central

    Yousefi, Mohammadmahdi R.; Hua, Jianping; Dougherty, Edward R.

    2011-01-01

    Motivation: There is growing discussion in the bioinformatics community concerning overoptimism of reported results. Two approaches contributing to overoptimism in classification are (i) the reporting of results on datasets for which a proposed classification rule performs well and (ii) the comparison of multiple classification rules on a single dataset that purports to show the advantage of a certain rule. Results: This article provides a careful probabilistic analysis of the second issue and the ‘multiple-rule bias’, resulting from choosing a classification rule having minimum estimated error on the dataset. It quantifies this bias corresponding to estimating the expected true error of the classification rule possessing minimum estimated error and it characterizes the bias from estimating the true comparative advantage of the chosen classification rule relative to the others by the estimated comparative advantage on the dataset. The analysis is applied to both synthetic and real data using a number of classification rules and error estimators. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routines and error estimation methods. The code for multiple-rule analysis is implemented in MATLAB. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi11a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:21546390
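    The multiple-rule bias can be illustrated with a small Monte Carlo sketch (a deliberately simplified noise model, not the paper's analysis): even when all rules have identical true error, selecting the rule with minimum estimated error on one dataset yields an optimistically biased error estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def multiple_rule_bias(true_errors, est_std, n_trials=10000):
    """Monte Carlo sketch of multiple-rule bias: each trial draws noisy
    error estimates for several rules on one dataset, selects the rule
    with minimum estimated error, and compares that rule's true error
    with its (minimum) estimated error."""
    true_errors = np.asarray(true_errors)
    est = true_errors + rng.normal(0.0, est_std, size=(n_trials, true_errors.size))
    chosen = est.argmin(axis=1)        # rule picked on each "dataset"
    min_est = est.min(axis=1)          # its reported (minimum) error estimate
    return true_errors[chosen].mean() - min_est.mean()  # positive => optimism

# Five rules with identical true error 0.25 and estimate noise sd = 0.05:
bias = multiple_rule_bias([0.25] * 5, est_std=0.05)
```

    Here the reported minimum estimate understates the chosen rule's true error by roughly the expected minimum of five noise draws, even though no rule is actually better than another.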

  3. Lisbon 1755, a multiple-rupture earthquake

    NASA Astrophysics Data System (ADS)

    Fonseca, J. F. B. D.

    2017-12-01

    The Lisbon earthquake of 1755 poses a challenge to seismic hazard assessment. Reports pointing to MMI 8 or above at distances of the order of 500 km led to magnitude estimates near M9 in classic studies. A refined analysis of the coeval sources lowered the estimates to 8.7 (Johnston, 1998) and 8.5 (Martinez-Solares, 2004). I posit that even these lower magnitude values reflect the combined effect of multiple ruptures. Attempts to identify a single source capable of explaining the damage reports with published ground motion models did not gather consensus and, compounding the challenge, the analysis of tsunami traveltimes has led to disparate source models, sometimes separated by a few hundred kilometers. From this viewpoint, the most credible source would combine a sub-set of the multiple active structures identifiable in SW Iberia. No individual moment magnitude needs to be above M8.1, thus rendering the search for candidate structures less challenging. The possible combinations of active structures should be ranked as a function of their explaining power, for macroseismic intensities and tsunami traveltimes taken together. I argue that the Lisbon 1755 earthquake is an example of a distinct class of intraplate earthquake previously unrecognized, of which the Indian Ocean earthquake of 2012 is the first instrumentally recorded example, showing space and time correlation over scales of the order of a few hundred km and a few minutes. Other examples may exist in the historical record, such as the M8 1556 Shaanxi earthquake, with an unusually large damage footprint (MMI equal to or above 6 in 10 provinces; 830,000 fatalities).
The ability to trigger seismicity globally, observed after the 2012 Indian Ocean earthquake, may be a characteristic of this type of event: occurrences in Massachusetts (M5.9 Cape Ann earthquake on 18/11/1755), Morocco (M6.5 Fez earthquake on 27/11/1755) and Germany (M6.1 Düren earthquake on 18/02/1756) had in all likelihood a causal link to the Lisbon earthquake. This may reflect the very long-period surface waves generated by the combined sources as a result of the delays between ruptures. Recognition of this new class of large intraplate earthquakes may pave the way to a better understanding of the mechanisms driving intraplate deformation.

  4. MEG source imaging method using fast L1 minimum-norm and its applications to signals with brain noise and human resting-state source amplitude images.

    PubMed

    Huang, Ming-Xiong; Huang, Charles W; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L; Baker, Dewleen G; Song, Tao; Harrington, Deborah L; Theilmann, Rebecca J; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M; Edgar, J Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T; Drake, Angela; Lee, Roland R

    2014-01-01

    The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1-minimum-norm (Fast-VESTAL) and then used the method to obtain the source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions including SNR with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with MEG human responses, the results obtained from using the conventional beamformer technique were compared with those from Fast-VESTAL, which highlighted the beamformer's problems of signal leaking and distorted source time-courses. © 2013.
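    The L1-minimum-norm idea underlying Fast-VESTAL can be illustrated with a generic iterative soft-thresholding (ISTA) solver for an L1-penalized least-squares inverse problem. This is a sketch of the general sparse-inverse technique only, not the authors' algorithm; the toy lead field and source configuration below are assumptions:

```python
import numpy as np

def ista(A, b, lam=0.01, n_iter=1000):
    """Iterative soft-thresholding for min_x ||Ax - b||^2 + lam*||x||_1,
    the kind of sparse (minimum-L1) inverse problem that underlies
    L1-minimum-norm source imaging. Illustrative only."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant for 0.5*||Ax-b||^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)              # gradient of 0.5*||Ax-b||^2
        z = x - g / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / (2 * L), 0.0)  # soft threshold
    return x

# Toy "lead field": 3 sensors, 6 candidate sources, one active source
rng = np.random.default_rng(1)
A = rng.normal(size=(3, 6))
x_true = np.zeros(6)
x_true[2] = 2.0
b = A @ x_true
x_hat = ista(A, b)
```

    The underdetermined 3x6 system has infinitely many exact solutions; the L1 penalty selects a sparse one, which is the motivation for L1 minimum-norm imaging when sources are focal.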

  5. MEG Source Imaging Method using Fast L1 Minimum-norm and its Applications to Signals with Brain Noise and Human Resting-state Source Amplitude Images

    PubMed Central

    Huang, Ming-Xiong; Huang, Charles W.; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L.; Baker, Dewleen G.; Song, Tao; Harrington, Deborah L.; Theilmann, Rebecca J.; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M.; Edgar, J. Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T.; Drake, Angela; Lee, Roland R.

    2014-01-01

    The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1-minimum-norm (Fast-VESTAL) and then used the method to obtain the source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions including SNR with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with MEG human responses, the results obtained from using the conventional beamformer technique were compared with those from Fast-VESTAL, which highlighted the beamformer's problems of signal leaking and distorted source time-courses. PMID:24055704

  6. Concepts, Methods, and Data Sources for Cumulative Health Risk Assessment of Multiple Chemicals, Exposures and Effects: A Resource Document (Final Report, 2008)

    EPA Science Inventory

    EPA announced the availability of the final report, Concepts, Methods, and Data Sources for Cumulative Health Risk Assessment of Multiple Chemicals, Exposures and Effects: A Resource Document. This report provides the concepts, methods and data sources needed to assist in...

  7. Reading Multiple Texts about Climate Change: The Relationship between Memory for Sources and Text Comprehension

    ERIC Educational Resources Information Center

    Stromso, Helge I.; Braten, Ivar; Britt, M. Anne

    2010-01-01

    In many situations, readers are asked to learn from multiple documents. Many studies have found that evaluating the trustworthiness and usefulness of document sources is an important skill in such learning situations. There has been, however, no direct evidence that attending to source information helps readers learn from and interpret a…

  8. The Use of Source-Related Strategies in Evaluating Multiple Psychology Texts: A Student-Scientist Comparison

    ERIC Educational Resources Information Center

    von der Mühlen, Sarah; Richter, Tobias; Schmid, Sebastian; Schmidt, Elisabeth Marie; Berthold, Kirsten

    2016-01-01

    Multiple text comprehension can greatly benefit from paying attention to sources and from using this information for evaluating text information. Previous research based on texts from the domain of history suggests that source-related strategies are acquired as part of the discipline expertise as opposed to the spontaneous use of these strategies…

  9. Using δ15N in Fish Larvae as an Indicator of Watershed Sources of Anthropogenic Nitrogen: Response at Multiple Spatial Scales

    EPA Science Inventory

    The nitrogen stable isotope, 15N, is an effective tool to track anthropogenic N sources to aquatic ecosystems. It may be difficult to identify potential N sources, however, where 15N responds similarly to multiple, concurrent activities in the watershed that cause higher nutrient...

  10. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

    This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of GW Sensitivity Space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports use of full current LISA Testbeds, Synthetic data systems, and Simulators already in existence through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.

  11. Steady-State Ion Beam Modeling with MICHELLE

    NASA Astrophysics Data System (ADS)

    Petillo, John

    2003-10-01

    There is a need to efficiently model ion beam physics for ion implantation, chemical vapor deposition, and ion thrusters. Common to all is the need for three-dimensional (3D) simulation of volumetric ion sources, ion acceleration, and optics, with the ability to model charge exchange of the ion beam with a background neutral gas. Two pieces of physics stand out as significant: the modeling of the volumetric source and charge exchange. In the MICHELLE code, the method for modeling the plasma sheath in ion sources assumes that the electron distribution function is a Maxwellian function of electrostatic potential over electron temperature. Charge exchange is the process by which a neutral background gas atom exchanges an electron with a "fast" charged particle streaming through it. An efficient method for capturing this is essential, and the model presented is based on semi-empirical collision cross section functions. This appears to be the first steady-state 3D algorithm of its type to contain multiple generations of charge exchange, work with multiple species and multiple charge state beam/source particles simultaneously, take into account the self-consistent space charge effects, and track the subsequent fast neutral particles. The solution used by MICHELLE is to combine finite element analysis with particle-in-cell (PIC) methods. The basic physics model is based on the equilibrium steady-state application of the electrostatic PIC approximation employing a conformal computational mesh. The foundation stems from the same basic model introduced in codes such as EGUN. Here, Poisson's equation is used to self-consistently include the effects of space charge on the fields, and the relativistic Lorentz equation is used to integrate the particle trajectories through those fields. The presentation will consider the complexity of modeling ion thrusters.
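    The cross-section-based charge-exchange model amounts to an exponential attenuation law along each trajectory step: the probability of an exchange over path length Δs in a gas of density n is P = 1 − exp(−n σ Δs). A minimal sketch (the density, cross section, and step length below are assumed illustrative values, not MICHELLE parameters):

```python
import math

def charge_exchange_prob(n_gas, sigma, ds):
    """Probability that a fast ion undergoes charge exchange with a
    background neutral gas while traversing path length ds (m), given
    gas number density n_gas (m^-3) and a semi-empirical cross section
    sigma (m^2): P = 1 - exp(-n_gas * sigma * ds)."""
    return 1.0 - math.exp(-n_gas * sigma * ds)

# Illustrative numbers only: a tenuous neutral background, one 1 cm step
p = charge_exchange_prob(n_gas=1e18, sigma=3e-19, ds=0.01)
```

    In a PIC step loop, drawing a uniform random number against this probability decides whether the macro-particle swaps charge state and spawns a fast neutral.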

  12. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is related to the fact that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron multiplicity-sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu eq mass. The presented theory, which will be referred to as Time Interval Analysis (TIA), is complementary to Time Correlation Analysis (TCA) theories which were developed in the past, but is from the theoretical point of view much simpler and allows a straightforward calculation of dead-time corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
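    A one-dimensional Rossi-alpha distribution can be sketched directly from a recorded pulse train: every pulse serves as a trigger, and the arrival times of subsequent pulses within a fixed window are histogrammed. For a purely random (Poisson) train only accidental coincidences remain and the distribution is flat; correlated fission chains would add a decaying excess at short times. A simplified illustration (not the authors' PC-based time interval analyser):

```python
import numpy as np

def rossi_alpha(times, window, n_bins):
    """One-dimensional Rossi-alpha distribution: for each pulse taken as
    a trigger, histogram the delays of all later pulses falling within
    `window` of the trigger."""
    times = np.sort(np.asarray(times))
    dts = []
    for i, t0 in enumerate(times):
        j = i + 1
        while j < times.size and times[j] - t0 <= window:
            dts.append(times[j] - t0)
            j += 1
    hist, edges = np.histogram(dts, bins=n_bins, range=(0.0, window))
    return hist, edges

# Purely random pulse train at ~10 kHz mean rate: expect a flat distribution
rng = np.random.default_rng(0)
train = np.cumsum(rng.exponential(1e-4, size=5000))
hist, edges = rossi_alpha(train, window=5e-4, n_bins=10)
```

    Fitting the short-time excess above this flat accidental level is what yields the correlated (real) coincidence information.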

  13. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. 
Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak Ground Accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
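    The Monte Carlo PSHA approach described above can be sketched in a few lines: simulate many multi-year catalogues with Poisson occurrence and truncated Gutenberg-Richter magnitudes, convert each event to ground motion at the site, and count the catalogues in which a target PGA is exceeded. The recurrence parameters and the deterministic toy attenuation function below are illustrative assumptions, not the paper's Aegean models:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_psha(rate_per_yr, b_value, m_min, m_max, years, n_sims, gm_at_site, target):
    """Monte Carlo PSHA sketch: fraction of simulated `years`-long
    catalogues in which the target ground motion is exceeded at least once."""
    beta = b_value * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))  # truncated G-R normalisation

    def sample_magnitudes(n):
        # inverse-CDF sampling of the truncated Gutenberg-Richter distribution
        u = rng.random(n)
        return m_min - np.log(1.0 - u * c) / beta

    exceed = 0
    for _ in range(n_sims):
        n_ev = rng.poisson(rate_per_yr * years)
        if n_ev and (gm_at_site(sample_magnitudes(n_ev)) > target).any():
            exceed += 1
    return exceed / n_sims

# Toy log-linear attenuation in g (assumed, with no aleatory scatter):
prob = mc_psha(rate_per_yr=0.2, b_value=1.0, m_min=5.0, m_max=7.5,
               years=50, n_sims=2000,
               gm_at_site=lambda m: 10 ** (0.3 * m - 2.6), target=0.3)
```

    Epistemic uncertainty is handled in the same framework by drawing the source model and attenuation relation per simulation; aleatory scatter would be added as a lognormal term on the ground motion.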

  14. Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, L.G.; Norman, P.I.; Leadbeater, T.W.

    Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event-by-event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
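    The two ideal dead-time models differ in what resets the dead period: the last *recorded* event (non-paralysable) or the last *arriving* event (paralysable, since every arrival extends the dead period). A minimal sketch applying both to a simulated Poisson pulse train (rates and the dead-time parameter are illustrative):

```python
import numpy as np

def non_paralysable(times, tau):
    """Keep an event only if at least tau has elapsed since the last
    *recorded* event (non-extending dead-time)."""
    kept, last = [], -np.inf
    for t in times:
        if t - last >= tau:
            kept.append(t)
            last = t
    return np.array(kept)

def paralysable(times, tau):
    """Keep an event only if at least tau has elapsed since the previous
    *arrival*: every arrival extends the dead period (extending dead-time)."""
    times = np.asarray(times)
    gaps = np.diff(times, prepend=-np.inf)
    return times[gaps >= tau]

# ~100 kHz Poisson pulse train, 2 microsecond dead-time
rng = np.random.default_rng(0)
train = np.cumsum(rng.exponential(1e-5, size=20000))
n_np = non_paralysable(train, tau=2e-6).size
n_p = paralysable(train, tau=2e-6).size
```

    For a Poisson train of rate λ the recorded fractions are λ/(1+λτ) and λe^(−λτ) respectively, so for the same τ the paralysable model always records no more events than the non-paralysable one.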

  15. Modeling Photo-multiplier Gain and Regenerating Pulse Height Data for Application Development

    NASA Astrophysics Data System (ADS)

    Aspinall, Michael D.; Jones, Ashley R.

    2018-01-01

    Systems that adopt organic scintillation detector arrays often require a calibration process prior to the intended measurement campaign to correct for significant performance variances between detectors within the array. These differences exist because of low tolerances associated with photo-multiplier tube technology and environmental influences. Differences in detector response can be corrected for by adjusting the supplied photo-multiplier tube voltage to control its gain and the effect that this has on the pulse height spectra from a gamma-only calibration source with a defined photo-peak. Automated methods that analyze these spectra and adjust the photo-multiplier tube bias accordingly are emerging for hardware that integrates acquisition electronics and high-voltage control. However, development of such algorithms requires access to the hardware, multiple detectors, and a calibration source for prolonged periods, all with associated constraints and risks. In this work, we report on a software function and related models developed to rescale and regenerate pulse height data acquired from a single scintillation detector. Such a function could be used to generate significant and varied pulse height data for integration-testing algorithms that automatically response-match multiple detectors using pulse height spectrum analysis. Furthermore, a function of this sort removes the dependence on multiple detectors, digital analyzers and a calibration source. Results show a good match between the real and regenerated pulse height data. The function has also been used successfully to develop auto-calibration algorithms.
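    At its core, regenerating pulse height data for a different gain is a rescale-and-rehistogram step: scale each event's pulse height by the gain ratio, clip at the digitiser full scale, and rebuild the spectrum. A minimal sketch (the spectrum shape, bin count, and full-scale voltage are illustrative assumptions; the paper's gain model is more detailed):

```python
import numpy as np

def rescale_pulse_heights(heights, gain_ratio, n_bins=128, v_max=2.0):
    """Regenerate a pulse height spectrum as if the photo-multiplier gain
    had changed by `gain_ratio`: scale per-event pulse heights, clip at
    the digitiser full scale `v_max`, and re-histogram."""
    scaled = np.clip(np.asarray(heights) * gain_ratio, 0.0, v_max)
    hist, edges = np.histogram(scaled, bins=n_bins, range=(0.0, v_max))
    return hist, edges

# Illustrative continuum spectrum from one detector, regenerated at two gains
rng = np.random.default_rng(0)
heights = rng.exponential(0.3, size=50000)
hist_lo, _ = rescale_pulse_heights(heights, gain_ratio=0.8)
hist_hi, _ = rescale_pulse_heights(heights, gain_ratio=1.2)
```

    Feeding many such regenerated spectra, at gains spanning the expected detector-to-detector spread, gives an auto-calibration algorithm varied test input without any hardware.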

  16. Particle Beam Radiography

    NASA Astrophysics Data System (ADS)

    Peach, Ken; Ekdahl, Carl

    2014-02-01

    Particle beam radiography, which uses a variety of particle probes (neutrons, protons, electrons, gammas and potentially other particles) to study the structure of materials and objects noninvasively, is reviewed, largely from an accelerator perspective, although the use of cosmic rays (mainly muons but potentially also high-energy neutrinos) is briefly reviewed. Tomography is a form of radiography which uses multiple views to reconstruct a three-dimensional density map of an object. There is a very wide range of applications of radiography and tomography, from medicine to engineering and security, and advances in instrumentation, specifically the development of electronic detectors, allow rapid analysis of the resultant radiographs. Flash radiography is a diagnostic technique for large high-explosive-driven hydrodynamic experiments that is used at many laboratories. The bremsstrahlung radiation pulse from an intense relativistic electron beam incident onto a high-Z target is the source of these radiographs. The challenge is to provide radiation sources intense enough to penetrate hundreds of g/cm2 of material, in pulses short enough to stop the motion of high-speed hydrodynamic shocks, and with source spots small enough to resolve fine details. The challenge has been met with a wide variety of accelerator technologies, including pulsed-power-driven diodes, air-core pulsed betatrons and high-current linear induction accelerators. Accelerator technology has also evolved to accommodate the experimenters' continuing quest for multiple images in time and space. Linear induction accelerators have had a major role in these advances, especially in providing multiple-time radiographs of the largest hydrodynamic experiments.

  17. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    NASA Astrophysics Data System (ADS)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along-trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multiple sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
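
    A minimal stand-in for the along-trajectory adjustment step is linear interpolation of a time-stamped trajectory to an arbitrary query time; the real TTA additionally blends spatial interpolation across neighbouring scatter points. The function name and toy trajectory below are illustrative.

```python
import numpy as np

def position_at(track_t, track_xy, t):
    """Linearly interpolate a trajectory (sample times plus 2-D positions)
    to time t, reconstructing where an advected feature sits at that time."""
    x = np.interp(t, track_t, track_xy[:, 0])
    y = np.interp(t, track_t, track_xy[:, 1])
    return np.array([x, y])
```

    The same interpolation, applied per scatter point, lets attribute data from multiple sensors be joined at a common reconstruction time.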

  18. Simultaneous Quantification of Free Cholesterol, Cholesteryl Esters, and Triglycerides without Ester Hydrolysis by UHPLC Separation and In-Source Collision Induced Dissociation Coupled MS/MS

    NASA Astrophysics Data System (ADS)

    Gardner, Michael S.; McWilliams, Lisa G.; Jones, Jeffrey I.; Kuklenyik, Zsuzsanna; Pirkle, James L.; Barr, John R.

    2017-08-01

    We demonstrate the application of in-source nitrogen collision-induced dissociation (CID) that eliminates the need for ester hydrolysis before simultaneous analysis of esterified cholesterol (EC) and triglycerides (TG) along with free cholesterol (FC) from human serum, using normal phase liquid chromatography (LC) coupled to atmospheric pressure chemical ionization (APCI) tandem mass spectrometry (MS/MS). The analysis requires only 50 μL of 1:100 diluted serum with a high-throughput, one-pot precipitation/evaporation/extraction protocol. Known representative mixtures of EC and TG species were used as calibrators with stable isotope labeled analogs as internal standards. The APCI MS source was operated with nitrogen source gas. Reproducible in-source CID was achieved with the use of optimal cone voltage (declustering potential), generating FC, EC, and TG lipid class-specific precursor fragment ions for multiple reaction monitoring (MRM). Using a representative mixture of purified FC, EC, and TG species as calibrators, the method accuracy was assessed with analysis of five inter-laboratory standardization materials, showing -10% bias for Total-C and -3% for Total-TG. Repeated duplicate analysis of a quality control pool showed intra-day and inter-day variation of 5% and 5.8% for FC, 5.2% and 8.5% for Total-C, and 4.1% and 7.7% for Total-TG. The applicability of the method was demonstrated on 32 serum samples and corresponding lipoprotein sub-fractions collected from normolipidemic, hypercholesterolemic, hypertriglyceridemic, and hyperlipidemic donors. The results show that in-source CID coupled with isotope dilution UHPLC-MS/MS is a viable high precision approach for translational research studies where samples are substantially diluted or the amounts of archived samples are limited.

  19. Analysis of XMM-Newton Data from Extended Sources and the Diffuse X-Ray Background

    NASA Technical Reports Server (NTRS)

    Snowden, Steven

    2011-01-01

    Reduction of X-ray data from extended objects and the diffuse background is a complicated process that requires attention to the details of the instrumental response as well as an understanding of the multiple background components. We present methods and software that we have developed to reduce data from XMM-Newton EPIC imaging observations for both the MOS and PN instruments. The software has now been included in the Science Analysis System (SAS) package available through the XMM-Newton Science Operations Center (SOC).

  20. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  1. Multilevel analysis of the influence of patients' and general practitioners' characteristics on patented versus multiple-sourced statin prescribing in France.

    PubMed

    Pichetti, Sylvain; Sermet, Catherine; Godman, Brian; Campbell, Stephen M; Gustafsson, Lars L

    2013-06-01

    The French National Health Insurance and the Ministry of Health have introduced multiple reforms in recent years to increase prescribing efficiency. These include guidelines, academic detailing, financial incentives for the prescribing and dispensing of generic drugs, as well as a voluntary pay-for-performance programme. However, the quality and efficiency of prescribing could potentially be enhanced if there was a better understanding of the dynamics of prescribing behaviour in France. To analyse the patient and general practitioner characteristics that influence patented versus multiple-sourced statin prescribing in France. Statistical analysis was performed on the statin prescribing habits of 341 general practitioners (GPs) included in the IMS-Health Permanent Survey on Medical Prescription in France, which was conducted between 2009 and 2010 and involved 14,360 patients. Patient characteristics included their age and gender as well as five medical profiles that were constructed from the diagnoses obtained during consultations. These were (1) disorders of lipoprotein metabolism, (2) heart disease, (3) diabetes, (4) complex profiles and (5) profiles based on other diagnoses. Physician characteristics included their age, gender, solo or group practice, weekly workload and payment scheme. Patient age had a statistically significant impact on statin prescribing for patients in profile 1 (disorders of lipoprotein metabolism) and profile 3 (complex profiles), with a greater number of patented statins being prescribed for the youngest patients. For instance, patients older than 76 years with a complex profile were prescribed fewer patented statins than patients aged 68-76 years old with the same medical profile (coefficient: -0.225; p = 0.0008). 
By contrast, regardless of the patient's age, the medical profile did not affect the probability of prescribing a patented statin, except in young patients with heart disease, who were prescribed a greater number of patented statins (coefficient: 0.3992; p = 0.0007). Prescribing was also statistically influenced by physician features: older male physicians were more likely to prescribe patented statins (coefficient: 0.245; p = 0.0417), and GPs practicing in groups were more likely to prescribe multiple-sourced statins (coefficient: -0.178; p = 0.0338), an important finding of the study. GPs with a lower workload prescribed a greater number of patented statins. There is significant variability in the prescribing of different statins among patient and physician profiles as well as between solo and group practices. Consequently, there are opportunities to target demand-side measures to enhance the prescribing of multiple-sourced statins. Further studies are warranted, in particular in other therapeutic classes, to provide a counter-balance to the considerable marketing activities of pharmaceutical companies.

  2. VizieR Online Data Catalog: GUViCS. Ultraviolet Source Catalogs (Voyer+, 2014)

    NASA Astrophysics Data System (ADS)

    Voyer, E. N.; Boselli, A.; Boissier, S.; Heinis, S.; Cortese, L.; Ferrarese, L.; Cote, P.; Cuillandre, J.-C.; Gwyn, S. D. J.; Peng, E. W.; Zhang, H.; Liu, C.

    2014-07-01

    These catalogs are based on GALEX NUV and FUV source detections in and behind the Virgo Cluster. The detections are split into catalogs of extended sources and point-like sources. The UV Virgo Cluster Extended Source catalog (UV_VES.fit) provides the deepest and most extensive UV photometric data of extended galaxies in Virgo to date. If certain data is not available for a given source then a null value is entered (e.g. -999, -99). UV point-like sources are matched with SDSS, NGVS, and NED and the relevant photometry and further data from these databases/catalogs are provided in this compilation of catalogs. The primary GUViCS UV Virgo Cluster Point-Like Source catalog is UV_VPS.fit. This catalog provides the most useful GALEX pipeline NUV and FUV photometric parameters, and categorizes sources as stars, Virgo members, and background sources, when possible. It also provides identifiers for optical matches in the SDSS and NED, and indicates if a match exists in the NGVS, only if GUViCS-optical matches are one-to-one. NED spectroscopic redshifts are also listed for GUViCS-NED one-to-one matches. If certain data is not available for a given source a null value is entered. Additionally, the catalog is useful for quick access to optical data on one-to-one GUViCS-SDSS matches. The only parameter available in the catalog for UV sources that have multiple SDSS matches is the total number of multiple matches, i.e. SDSSNUMMTCHS. Multiple GUViCS sources matched to the same SDSS source are also flagged, given a total number of matches, SDSSNUMMTCHS, of one. All other fields for multiple matches are set to a null value of -99. In order to obtain full optical SDSS data for multiply matched UV sources in both scenarios, the user can cross-correlate the GUViCS ID of the sources of interest with the full GUViCS-SDSS matched catalog in GUV_SDSS.fit. 
The GUViCS-SDSS matched catalog, GUV_SDSS.fit, provides the most relevant SDSS data on all GUViCS-SDSS matches, including one-to-one matches and multiply matched sources. The catalog gives full SDSS identification information, complete SDSS photometric measurements in multiple aperture types, and complete redshift information (photometric and spectroscopic). It is ideal for large statistical studies of galaxy populations at multiple wavelengths in the background of the Virgo Cluster. The catalog can also be used as a starting point to study and search for previously unknown UV-bright point-like objects within the Virgo Cluster. If certain data is not available for a given source that field is given a null value. (6 data files).

  3. Stochastic description of geometric phase for polarized waves in random media

    NASA Astrophysics Data System (ADS)

    Boulanger, Jérémie; Le Bihan, Nicolas; Rossetto, Vincent

    2013-01-01

    We present a stochastic description of multiple scattering of polarized waves in the regime of forward scattering. In this regime, if the source is polarized, polarization survives along a few transport mean free paths, making it possible to measure an outgoing polarization distribution. We consider thin scattering media illuminated by a polarized source and compute the probability distribution function of the polarization on the exit surface. We solve the direct problem using compound Poisson processes on the rotation group SO(3) and non-commutative harmonic analysis. We obtain an exact expression for the polarization distribution which generalizes previous works and design an algorithm solving the inverse problem of estimating the scattering properties of the medium from the measured polarization distribution. This technique applies to thin disordered layers, spatially fluctuating media and multiple scattering systems and is based on the polarization but not on the signal amplitude. We suggest that it can be used as a non-invasive testing method.
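
    The compound Poisson picture can be illustrated with a toy Monte Carlo: each path undergoes a Poisson number of scatterings, and each scattering perturbs the polarization. The paper works on the rotation group SO(3); reducing the state to a single polarization angle with Gaussian kicks, as below, is a deliberate simplification for illustration only, and all names and parameters are assumptions.

```python
import numpy as np

def polarization_angle_samples(n_paths, mean_scatterings, kick_std, rng):
    """Toy compound Poisson model: each path suffers a Poisson-distributed
    number of scatterings, each adding an independent Gaussian kick to the
    polarization angle. Conditional on n kicks, the total angle is
    N(0, n * kick_std**2), so we sample it directly."""
    n = rng.poisson(mean_scatterings, size=n_paths)
    return rng.normal(0.0, kick_std * np.sqrt(n))
```

    The variance of the resulting angle distribution grows linearly with the mean number of scatterings, which is the kind of relation the inverse problem exploits to estimate scattering properties from a measured polarization distribution.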

  4. A novel algorithm for solving the true coincident counting issues in Monte Carlo simulations for radiation spectroscopy.

    PubMed

    Guan, Fada; Johns, Jesse M; Vasudevan, Latha; Zhang, Guoqing; Tang, Xiaobin; Poston, John W; Braby, Leslie A

    2015-06-01

    Coincident counts can be observed in experimental radiation spectroscopy. Accurate quantification of the radiation source requires the detection efficiency of the spectrometer, which is often experimentally determined. However, Monte Carlo analysis can be used to supplement experimental approaches to determine the detection efficiency a priori. The traditional Monte Carlo method overestimates the detection efficiency as a result of omitting coincident counts caused mainly by multiple cascade source particles. In this study, a novel "multi-primary coincident counting" algorithm was developed using the Geant4 Monte Carlo simulation toolkit. A high-purity Germanium detector for ⁶⁰Co gamma-ray spectroscopy problems was accurately modeled to validate the developed algorithm. The simulated pulse height spectrum agreed well qualitatively with the measured spectrum obtained using the high-purity Germanium detector. The developed algorithm can be extended to other applications, with a particular emphasis on challenging radiation fields, such as counting multiple types of coincident radiations released from nuclear fission or used nuclear fuel.
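
    The essence of coincident counting can be sketched by summing energy deposits that arrive within one resolving window into a single recorded pulse, which is how cascade gammas (e.g. the 1.17 and 1.33 MeV lines of ⁶⁰Co) produce a sum peak. This is a toy illustration of the effect the algorithm models, not the Geant4 implementation; the window value is an assumption.

```python
import numpy as np

def build_pulses(times, energies, window):
    """Merge energy deposits falling within one coincidence resolving
    window into single recorded pulses; deposits separated by more than
    the window start a new pulse."""
    order = np.argsort(times)
    times, energies = np.asarray(times)[order], np.asarray(energies)[order]
    pulses, start, acc = [], None, 0.0
    for t, e in zip(times, energies):
        if start is None or t - start > window:
            if start is not None:
                pulses.append(acc)  # close the previous pulse
            start, acc = t, e
        else:
            acc += e  # coincident deposit: summed into the current pulse
    if start is not None:
        pulses.append(acc)
    return pulses
```

    A detector-efficiency estimate that histograms deposits individually, ignoring this summation, overestimates the full-energy peak counts, which is the bias the multi-primary algorithm corrects.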

  5. Systematic Review of International Colposcopy Quality Improvement Guidelines.

    PubMed

    Mayeaux, Edward J; Novetsky, Akiva P; Chelmow, David; Choma, Kim; Garcia, Francisco; Liu, Angela H; Papasozomenos, Theognosia; Einstein, Mark H

    2017-10-01

    The American Society for Colposcopy and Cervical Pathology Colposcopy Standards Committee organized multiple working groups to draft colposcopy standards for the United States. As part of this project, international quality assurance and improvement measures were examined. The quality improvement working group performed a systematic review of the literature to collate international guidelines related to quality improvement. Source guidelines were collected using searches in Medline, Google Scholar, the International Federation of Cervical Pathology and Colposcopy Web site, other regional colposcopy groups' Web sites, and communications with International Federation of Cervical Pathology and Colposcopy board of directors' members and other expert members of various national groups. Once identified, the sources were reviewed by multiple workgroup members for potential guideline materials. Fifty-six unique documents were identified, of which 18 met inclusion criteria and contributed data to the analysis. Information was abstracted and grouped by related subject. Wide variation exists in colposcopy guidance and quality indicators from regional and national colposcopy societies. Abstracted international guidelines are presented.

  6. The Chandra Source Catalog : Automated Source Correlation

    NASA Astrophysics Data System (ADS)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed different numbers of times, at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
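
    The simplest building block of such correlation is positional matching of new detections against cataloged positions within a maximum separation. The sketch below is a greedy nearest-neighbour match on flat-sky coordinates, purely illustrative; the actual CSC pipeline also weighs source extent, position errors, and off-axis PSF size.

```python
import numpy as np

def match_detections(master, new, max_sep):
    """Greedily match each new detection (N x 2 positions) to its nearest
    master-catalog position, returning (new_index, master_index or None)
    pairs; None marks a detection with no counterpart within max_sep."""
    master = np.atleast_2d(master)
    pairs = []
    for i, p in enumerate(np.atleast_2d(new)):
        d = np.hypot(*(master - p).T)       # distances to all master sources
        j = int(np.argmin(d))
        pairs.append((i, j if d[j] <= max_sep else None))
    return pairs
```

    Unmatched detections would become new master-source entries; the harder cases the abstract describes (one large source versus several small ones) need logic beyond a single distance threshold.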

  7. Functional Interaction Network Construction and Analysis for Disease Discovery.

    PubMed

    Wu, Guanming; Haw, Robin

    2017-01-01

    Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of the data by using network modules, and increasing statistical power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of human genes, together with an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human curated pathway databases, training a Naïve Bayesian classifier, predicting interactions with the trained classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz to perform network-based data analysis for a list of genes.

  8. Cluster Analysis of Campylobacter jejuni Genotypes Isolated from Small and Medium-Sized Mammalian Wildlife and Bovine Livestock from Ontario Farms.

    PubMed

    Viswanathan, M; Pearl, D L; Taboada, E N; Parmley, E J; Mutschall, S K; Jardine, C M

    2017-05-01

    Using data collected from a cross-sectional study of 25 farms (eight beef, eight swine and nine dairy) in 2010, we assessed clustering of molecular subtypes of C. jejuni based on a Campylobacter-specific 40-gene comparative genomic fingerprinting assay (CGF40), using unweighted pair-group method with arithmetic mean (UPGMA) analysis and multiple correspondence analysis. Exact logistic regression was used to determine which genes differentiate wildlife and livestock subtypes in our study population. A total of 33 bovine livestock (17 beef and 16 dairy) and 26 wildlife (20 raccoon (Procyon lotor), five skunk (Mephitis mephitis) and one mouse (Peromyscus spp.)) C. jejuni isolates were subtyped using CGF40. Dendrogram analysis, based on UPGMA, showed distinct branches separating bovine livestock and mammalian wildlife isolates. Furthermore, two-dimensional multiple correspondence analysis was highly concordant with dendrogram analysis, showing clear differentiation between livestock and wildlife CGF40 subtypes. Based on multilevel logistic regression models with a random intercept for farm of origin, we found that wildlife isolates in general, and raccoon isolates more specifically, were significantly more likely to be part of the wildlife branch. Exact logistic regression conducted gene by gene revealed 15 genes that were predictive of whether an isolate was of wildlife or bovine livestock origin. Both multiple correspondence analysis and exact logistic regression revealed that in most cases the presence of a particular gene (13 of 15) was associated with an isolate being of livestock rather than wildlife origin. In conclusion, the evidence gained from dendrogram analysis, multiple correspondence analysis and exact logistic regression indicates that mammalian wildlife carry CGF40 subtypes of C. jejuni distinct from those carried by bovine livestock. Future studies focused on source attribution of C. jejuni in human infections will help determine whether wildlife transmit Campylobacter jejuni directly to humans. © 2016 Blackwell Verlag GmbH.
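
    UPGMA dendrogram construction from binary gene-presence fingerprints can be sketched with SciPy's average-linkage clustering, which is exactly UPGMA. The toy four-isolate fingerprints below are invented for illustration; real CGF40 profiles have 40 loci.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = isolates, columns = presence/absence of fingerprint genes.
fingerprints = np.array([
    [1, 1, 1, 0, 0],   # livestock-like profile (invented)
    [1, 1, 0, 0, 0],   # livestock-like profile (invented)
    [0, 0, 1, 1, 1],   # wildlife-like profile (invented)
    [0, 0, 0, 1, 1],   # wildlife-like profile (invented)
])

# Hamming distance between profiles; method="average" is UPGMA.
Z = linkage(pdist(fingerprints, metric="hamming"), method="average")
clusters = fcluster(Z, t=2, criterion="maxclust")
```

    Cutting the dendrogram into two groups recovers the livestock-like and wildlife-like branches, mirroring the separation the study reports.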

  9. Gpufit: An open-source toolkit for GPU-accelerated curve fitting.

    PubMed

    Przybylski, Adrian; Thiel, Björn; Keller-Findeisen, Jan; Stock, Bernd; Bates, Mark

    2017-11-16

    We present a general purpose, open-source software library for estimation of non-linear parameters by the Levenberg-Marquardt algorithm. The software, Gpufit, runs on a Graphics Processing Unit (GPU) and executes computations in parallel, resulting in a significant gain in performance. We measured a speed increase of up to 42 times when comparing Gpufit with an identical CPU-based algorithm, with no loss of precision or accuracy. Gpufit is designed such that it is easily incorporated into existing applications or adapted for new ones. Multiple software interfaces, including C, Python, and Matlab bindings, ensure that Gpufit is accessible from most programming environments. The full source code is published as an open source software repository, making its function transparent to the user and facilitating future improvements and extensions. As a demonstration, we used Gpufit to accelerate an existing scientific image analysis package, yielding significantly improved processing times for super-resolution fluorescence microscopy datasets.
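
    The class of problem Gpufit solves, Levenberg-Marquardt estimation of non-linear model parameters, can be shown with a CPU reference fit using SciPy's LM implementation; this is the kind of baseline against which a GPU result would be compared, not Gpufit's own API, and the Gaussian model and starting values are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def gauss(p, x):
    """Gaussian peak with baseline: amplitude, center, width, offset."""
    a, mu, sigma, b = p
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b

x = np.linspace(-5, 5, 101)
true = np.array([2.0, 0.5, 1.2, 0.1])
y = gauss(true, x)  # noise-free synthetic data for clarity

# Levenberg-Marquardt minimization of the residuals.
fit = least_squares(lambda p: gauss(p, x) - y, x0=[1.0, 0.0, 1.0, 0.0],
                    method="lm")
```

    Gpufit's advantage comes from running many such independent fits in parallel (e.g. one per localization in a super-resolution dataset) rather than from changing the algorithm itself.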

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinker, Alexander

    Here, we study control of the angular-velocity actuated nonholonomic unicycle via a simple, bounded extremum seeking controller which is robust to external disturbances and measurement noise. The vehicle performs source seeking despite not having any position information about itself or the source, able only to sense a noise-corrupted scalar value whose extremum coincides with the unknown source location. In order to control the angular velocity, rather than the angular heading directly, a controller is developed such that the closed loop system exhibits multiple time scales and requires an analysis approach expanding the previous work of Kurzweil, Jarnik, Sussmann, and Liu, utilizing weak limits. We provide analytic proof of stability and demonstrate how this simple scheme can be extended to include position-independent source seeking, tracking, and collision avoidance of groups of autonomous vehicles in GPS-denied environments, based only on a measure of distance to an obstacle, which is an especially important feature for an autonomous agent.

  11. Using Generalizability Theory to Examine Sources of Variance in Observed Behaviors within High School Classrooms

    ERIC Educational Resources Information Center

    Abry, Tashia; Cash, Anne H.; Bradshaw, Catherine P.

    2014-01-01

    Generalizability theory (GT) offers a useful framework for estimating the reliability of a measure while accounting for multiple sources of error variance. The purpose of this study was to use GT to examine multiple sources of variance in and the reliability of school-level teacher and high school student behaviors as observed using the tool,…

  12. Recent Developments of an Opto-Electronic THz Spectrometer for High-Resolution Spectroscopy

    PubMed Central

    Hindle, Francis; Yang, Chun; Mouret, Gael; Cuisset, Arnaud; Bocquet, Robin; Lampin, Jean-François; Blary, Karine; Peytavit, Emilien; Akalin, Tahsin; Ducournau, Guillaume

    2009-01-01

    A review is provided of sources and detectors that can be employed in the THz range before the description of an opto-electronic source of monochromatic THz radiation. The realized spectrometer has been applied to gas phase spectroscopy. Air-broadening coefficients of HCN are determined and the insensitivity of this technique to aerosols is demonstrated by the analysis of cigarette smoke. A multiple pass sample cell has been used to obtain a sensitivity improvement allowing transitions of the volatile organic compounds to be observed. A solution to the frequency metrology is presented and promises to yield accurate molecular line center measurements. PMID:22291552

  13. Investigation of automated feature extraction using multiple data sources

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Perkins, Simon J.; Pope, Paul A.; Theiler, James P.; David, Nancy A.; Porter, Reid B.

    2003-04-01

    An increasing number and variety of platforms are now capable of collecting remote sensing data over a particular scene. For many applications, the information available from any individual sensor may be incomplete, inconsistent or imprecise. However, other sources may provide complementary and/or additional data. Thus, for an application such as image feature extraction or classification, fusing the multiple data sources can lead to more consistent and reliable results. Unfortunately, with the increased complexity of the fused data, the search space of feature-extraction or classification algorithms also greatly increases. With a single data source, the determination of a suitable algorithm may be a significant challenge for an image analyst. With the fused data, the search for suitable algorithms can go far beyond the capabilities of a human in a realistic time frame, and becomes the realm of machine learning, where the computational power of modern computers can be harnessed to the task at hand. We describe experiments in which we investigate the ability of a suite of automated feature extraction tools developed at Los Alamos National Laboratory to make use of multiple data sources for various feature extraction tasks. We compare and contrast this software's capabilities on 1) individual data sets from different data sources, 2) fused data sets from multiple data sources, and 3) fusion of results from multiple individual data sources.

  14. Energetic Phenomena on the Sun: The Solar Maximum Mission Flare Workshop. Proceedings

    NASA Technical Reports Server (NTRS)

    Kundu, Mukul (Editor); Woodgate, Bruce (Editor)

    1986-01-01

    The general objectives of the conference were as follows: (1) Synthesize flare studies after three years of Solar Maximum Mission (SMM) data analysis. Encourage a broader participation in the SMM data analysis and combine this more fully with theory and other data sources: data obtained with other spacecraft such as HINOTORI, P78-1, and ISEE-3, and with the Very Large Array (VLA) and many other ground-based instruments. Many coordinated data sets, unprecedented in their breadth of coverage and multiplicity of sources, had been obtained within the structure of the Solar Maximum Year (SMY). (2) Stimulate joint studies and publication in the general scientific literature. The intended primary benefit was for informal collaborations to be started or broadened at the Workshops, with subsequent publications. (3) Provide a special publication resulting from the Workshop.

  15. Automatic identification of resting state networks: an extended version of multiple template-matching

    NASA Astrophysics Data System (ADS)

    Guaje, Javier; Molina, Juan; Rudas, Jorge; Demertzi, Athena; Heine, Lizette; Tshibanda, Luaba; Soddu, Andrea; Laureys, Steven; Gómez, Francisco

    2015-12-01

    Functional magnetic resonance imaging in resting state (fMRI-RS) constitutes an informative protocol to investigate several pathological and pharmacological conditions. A common approach to study this data source is through the analysis of changes in the so-called resting state networks (RSNs). These networks correspond to well-defined functional entities that have been associated with different low and high brain order functions. RSNs may be characterized by using Independent Component Analysis (ICA). ICA provides a decomposition of the fMRI-RS signal into sources of brain activity, but it lacks information about the nature of the signal, i.e., whether the source is artifactual or not. Recently, a multiple template-matching (MTM) approach was proposed to automatically recognize RSNs in a set of Independent Components (ICs). This method provides valuable information to assess subjects at the individual level. Nevertheless, it lacks a mechanism to quantify how much certainty there is about the existence/absence of each network. This information may be important for the assessment of patients with severely damaged brains, in which RSNs may be greatly affected as a result of the pathological condition. In this work we propose a set of changes to the original MTM that improve the RSN recognition task and also extend the functionality of the method. The key points of this improvement are a standardization strategy and a modification of the method's constraints that adds flexibility to the approach. Additionally, we also introduce an analysis of the trustworthiness measurement of each RSN obtained by the template-matching approach. This analysis consists of a thresholding strategy applied over the computed Goodness-of-Fit (GOF) between the set of templates and the ICs. The proposed method was validated on two independent studies (Baltimore, 23 healthy subjects and Liege, 27 healthy subjects) with different configurations of MTM. 
Results suggest that the method will provide complementary information for characterization of RSNs at individual level.
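    The GOF thresholding idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the GOF score here (mean activity inside the template mask minus mean outside) and the threshold value are assumptions chosen for the example.

```python
import numpy as np

def goodness_of_fit(ic_map, template):
    """Illustrative GOF: mean IC activity inside the template mask
    minus mean activity outside it (a common template-matching score)."""
    inside = ic_map[template > 0].mean()
    outside = ic_map[template == 0].mean()
    return inside - outside

def label_networks(ic_maps, templates, threshold=0.5):
    """Assign each template the best-matching IC, keeping it only if the
    GOF clears the trust threshold; otherwise report the RSN as absent."""
    labels = {}
    for name, tpl in templates.items():
        scores = [goodness_of_fit(ic, tpl) for ic in ic_maps]
        best = int(np.argmax(scores))
        labels[name] = best if scores[best] >= threshold else None
    return labels
```

    Reporting `None` when no IC clears the threshold is what provides the existence/absence certainty the abstract describes: a severely damaged brain would simply yield no IC above the trust cutoff for that network.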

  16. Source apportionment of PM2.5 at the Lin'an regional background site in China with three receptor models

    NASA Astrophysics Data System (ADS)

    Deng, Junjun; Zhang, Yanru; Qiu, Yuqing; Zhang, Hongliang; Du, Wenjiao; Xu, Lingling; Hong, Youwei; Chen, Yanting; Chen, Jinsheng

    2018-04-01

    Source apportionment of fine particulate matter (PM2.5) was conducted at the Lin'an Regional Atmospheric Background Station (LA) in the Yangtze River Delta (YRD) region of China from July 2014 to April 2015 with three receptor models: principal component analysis combined with multiple linear regression (PCA-MLR), UNMIX, and Positive Matrix Factorization (PMF). The model performance, source identification, and source contributions of the three models were analyzed and inter-compared. Good correlations between the reconstructed and measured concentrations of PM2.5 and its major chemical species were obtained for all models. PMF resolved almost all of the PM2.5 mass, while PCA-MLR and UNMIX explained about 80%. Five, four, and seven sources were identified by PCA-MLR, UNMIX, and PMF, respectively. Combustion, secondary sources, marine sources, dust, and industrial activities were identified by all three receptor models. Combustion and secondary sources were the major contributors, together accounting for over 60% of PM2.5. The PMF model performed better at separating the different combustion sources. These findings improve the understanding of PM2.5 sources in background regions.
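    The PCA-MLR receptor-model idea can be sketched roughly: principal components of the standardized species concentrations stand in for sources, and PM2.5 mass is regressed on the factor scores. This is an illustration of the general technique on synthetic data, not the authors' implementation; the factor count and the contribution formula are assumptions.

```python
import numpy as np

def pca_mlr(species, pm25, n_factors):
    """Sketch of PCA-MLR: PCA (via SVD) on standardized species
    concentrations, then multiple linear regression of PM2.5 on the
    retained factor scores."""
    X = (species - species.mean(0)) / species.std(0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_factors] * s[:n_factors]           # factor scores
    A = np.column_stack([np.ones(len(pm25)), scores])   # intercept + scores
    coef, *_ = np.linalg.lstsq(A, pm25, rcond=None)
    reconstructed = A @ coef
    contrib = coef[1:] * scores.mean(0)  # mean factor contribution to PM2.5
    return coef, contrib, reconstructed
```

    The correlation between `reconstructed` and measured PM2.5 is the kind of diagnostic the abstract reports when comparing the three models.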

  17. The BL LAC phenomenon: X-ray observations of transition objects and determination of the x-ray spectrum of a complete sample of flat-spectrum radio sources

    NASA Technical Reports Server (NTRS)

    Worrall, Diana M.

    1994-01-01

    This report summarizes the activities related to two ROSAT investigations: (1) x-ray properties of radio galaxies thought to contain BL Lac type nuclei; and (2) x-ray spectra of a complete sample of flat-spectrum radio sources. The following papers describing the research are provided as attachments: Multiple X-ray Emission Components in Low Power Radio Galaxies; New X-ray Results on Radio Galaxies; Analysis Techniques for a Multiwavelength Study of Radio Galaxies; Separation of X-ray Emission Components in Radio Galaxies; X-ray Emission in Powerful Radio Galaxies and Quasars; Extended and Compact X-ray Emission in Powerful Radio Galaxies; and X-ray Spectra of a Complete Sample of Extragalactic Core-dominated Radio Sources.

  18. Detection of ventricular fibrillation from multiple sensors

    NASA Astrophysics Data System (ADS)

    Lindsley, Stephanie A.; Ludeman, Lonnie C.

    1992-07-01

    Ventricular fibrillation is a potentially fatal medical condition in which the flow of blood through the body is terminated due to the lack of an organized electric potential in the heart. Automatic implantable defibrillators are becoming common as a means of helping patients confronted with repeated episodes of ventricular fibrillation. Defibrillators must first accurately detect ventricular fibrillation and then provide an electric shock to the heart to allow a normal sinus rhythm to resume. The detection of ventricular fibrillation by using an array of multiple sensors to distinguish between signals recorded from single (normal sinus rhythm) or multiple (ventricular fibrillation) sources is presented. An idealized model is presented, and analysis of data generated by this model suggests that the approach is promising for accurately and quickly detecting ventricular fibrillation from signals recorded by sensors placed on the epicardium.

  19. The Multiple Doppler Radar Workshop, November 1979.

    NASA Astrophysics Data System (ADS)

    Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.

    1980-10-01

    The findings of the Multiple Doppler Radar Workshop are summarized by a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and the organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized.

    Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized.

    In Part III, new technology to eliminate various sampling limitations is cited as an eventual solution to many current problems. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility.

    Part IV deals with synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified, and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting.

    Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined: direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air-motion-sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology.

    Nine stages of data processing and display are identified in Part VI: field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered to be a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and four-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space as well as for synthesized Cartesian output.

  20. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
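    The core exceedance-probability calculation behind a hazard curve can be sketched under the common assumption that each source is an independent Poisson process, in which case independent sources combine by summing their mean rates. The rate values below are hypothetical, chosen only to illustrate the arithmetic.

```python
import math

def prob_exceedance(rates_per_year, exposure_years):
    """Probability that at least one tsunami exceeds the chosen intensity
    threshold at the target site within the exposure time, assuming each
    source is an independent Poisson process with the given mean annual
    exceedance rate."""
    total_rate = sum(rates_per_year)
    return 1.0 - math.exp(-total_rate * exposure_years)

# Hypothetical mean annual exceedance rates for three source types
# (earthquake, landslide, volcanic) at one site, 50-year exposure:
p = prob_exceedance([1 / 500, 1 / 2000, 1 / 10000], 50)
```

    Repeating this for a range of intensity thresholds traces out the hazard curve at the target site; the abstract's emphasis on multiple sources corresponds to the summed rates above.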

  1. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  2. The Chandra Source Catalog 2.0: Spectral Properties

    NASA Astrophysics Data System (ADS)

    McCollough, Michael L.; Siemiginowska, Aneta; Burke, Douglas; Nowak, Michael A.; Primini, Francis Anthony; Laurino, Omar; Nguyen, Dan T.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Paxson, Charles; Plummer, David A.; Rots, Arnold H.; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula; Chandra Source Catalog Team

    2018-01-01

    The second release of the Chandra Source Catalog (CSC) contains all sources identified from sixteen years' worth of publicly accessible observations. The vast majority of these sources have been observed with the ACIS detector and have spectral information in the 0.5-7 keV energy range. Here we describe the methods used to automatically derive spectral properties for each source detected by the standard processing pipeline and included in the final CSC. Sources with a high signal-to-noise ratio (exceeding 150 net counts) were fit in Sherpa (the modeling and fitting application from the Chandra Interactive Analysis of Observations package) using wstat as the fit statistic and the Bayesian draws method to determine errors. Three models were fit to each source: absorbed power-law, blackbody, and Bremsstrahlung emission. The fitted parameter values for the power-law, blackbody, and Bremsstrahlung models were included in the catalog with the calculated flux for each model. The CSC also provides the source energy fluxes computed from the normalizations of predefined absorbed power-law, blackbody, Bremsstrahlung, and APEC models needed to match the observed net X-ray counts. For sources that have been observed multiple times, a Bayesian Blocks analysis will have been performed (see the Primini et al. poster), and a joint fit of the aforementioned spectral models will have been performed for the most significant block. In addition, we provide access to data products for each source: a file with the source spectrum, the background spectrum, and the spectral response of the detector. Hardness ratios were calculated for each source between pairs of energy bands (soft, medium, and hard). This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.

  3. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology combining video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracking, and rupture of high-performance structural materials. A videotape recording from a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology extracted the relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  4. The multiple infrared source GL 437

    NASA Technical Reports Server (NTRS)

    Wynn-Williams, C. G.; Becklin, E. E.; Beichman, C. A.; Capps, R.; Shakeshaft, J. R.

    1981-01-01

    Infrared and radio continuum observations of the multiple infrared source GL 437 show that it consists of a compact H II region plus two objects which are probably early B stars undergoing rapid mass loss. The group of sources appears to be a multiple system of young stars that have recently emerged from the near side of a molecular cloud. Emission in the unidentified 3.3 micron feature is associated with, but more extended than, the emission from the compact H II region; it probably arises from hot dust grains at the interface between the H II region and the molecular cloud.

  5. Functional principal component analysis of glomerular filtration rate curves after kidney transplant.

    PubMed

    Dong, Jianghu J; Wang, Liangliang; Gill, Jagbir; Cao, Jiguo

    2017-01-01

    This article is motivated by longitudinal clinical data on kidney transplant recipients, in which kidney function progression is recorded as the estimated glomerular filtration rate at multiple time points after kidney transplantation. We propose using the functional principal component analysis method to explore the major sources of variation in glomerular filtration rate curves. We find that the estimated functional principal component scores can be used to cluster glomerular filtration rate curves, and that ordering the scores can detect abnormal curves. Finally, functional principal component analysis can effectively estimate missing glomerular filtration rate values and predict future values.
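    On a regular time grid, functional PCA reduces to an eigen-decomposition of the centered curve matrix. The sketch below is a generic discretized FPCA, not the authors' estimator (which must also handle irregular and missing observations).

```python
import numpy as np

def fpca(curves, n_components=2):
    """Discretized functional PCA on a (subjects x time points) matrix:
    SVD of the centered curves yields principal component functions
    (rows of Vt), per-subject scores, and explained-variance fractions."""
    mean_curve = curves.mean(axis=0)
    centered = curves - mean_curve
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    components = Vt[:n_components]
    scores = centered @ components.T
    explained = s[:n_components] ** 2 / (s ** 2).sum()
    return mean_curve, components, scores, explained
```

    The `scores` matrix is what the abstract clusters and orders: each subject's curve is summarized by a few numbers, and a curve can be reconstructed (or a gap filled) as `mean_curve + scores[i] @ components`.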

  6. Seismology of the Oso-Steelhead landslide

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekström, G.

    2014-12-01

    We carry out a combined analysis of the short- and long-period seismic signals generated by the devastating Oso-Steelhead landslide that occurred on 22 March 2014. The seismic records show that the Oso-Steelhead landslide was not a single slope failure, but a succession of multiple failures distinguished by two major collapses that occurred approximately three minutes apart. The first generated long-period surface waves that were recorded at several proximal stations. We invert these long-period signals for the forces acting at the source, and obtain estimates of the first failure runout and kinematics, as well as its mass after calibration against the mass-center displacement estimated from remote-sensing imagery. Short-period analysis of both events suggests that the source dynamics of the second are more complex than the first. No distinct long-period surface waves were recorded for the second failure, which prevents inversion for its source parameters. However, by comparing the seismic energy of the short-period waves generated by both events we are able to estimate the volume of the second. Our analysis suggests that the volume of the second failure is about 15-30% of the total landslide volume, which is in agreement with ground observations.

  7. Multiple correspondence analysis and random amplified polymorphic DNA molecular typing to assess the sources of Staphylococcus aureus contamination in alheira production lines.

    PubMed

    Esteves, A; Patarata, L; Aymerich, T; Garriga, M; Martins, C

    2007-03-01

    Sources and tracing of Staphylococcus aureus in alheira (garlic sausage) production were evaluated by multifactorial correspondence analysis (MCA) of occurrence data and random amplified polymorphic DNA (RAPD) typing of S. aureus isolates. Samples from four production lines, four different production batches, and 14 different sampling sites (including raw material, different contact surfaces, and several stages of alheira manufacturing) were analyzed at four sampling times. From the 896 microbial analyses completed, a collection of 170 S. aureus isolates was obtained. Although analysis of the occurrence data alone was not sufficiently informative, MCA and RAPD-PCR were able to identify the sources of contamination and to trace the spread of this microorganism along the production lines. MCA results indicated that the presence of S. aureus in alheira was related to its presence in the intermediate manufacturing stages after heat treatment but before stuffing into the casings. It was also possible to identify a cross-contamination path related to handler procedures. RAPD-PCR typing, in accordance with the MCA results, confirmed the cross-contamination path between the raw material and casings and the role of handlers as an important cross-contamination vehicle.

  8. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  9. Grant-Writing Courses in the United States: A Descriptive Review of Syllabi and Factors That Influence Instructor Choice of Course Texts

    ERIC Educational Resources Information Center

    Walsh, Bridget A.; Bonner, Dave; Springer, Victoria; Lalasz, Camille B.; Ives, Bob

    2013-01-01

    Little information exists about the structure and content of grant writing courses offered in the United States. To fill this gap, we used multiple data sources, including a content analysis of syllabi from 93 graduate-level grant writing courses in the United States, and an online survey that sought insight into (a) the ways in which textbooks…

  10. Spatial and Temporal Evolution of Earthquake Dynamics: Case Study of the Mw 8.3 Illapel Earthquake, Chile

    NASA Astrophysics Data System (ADS)

    Yin, Jiuxun; Denolle, Marine A.; Yao, Huajian

    2018-01-01

    We develop a methodology that combines compressive sensing backprojection (CS-BP) and source spectral analysis of teleseismic P waves to provide metrics relevant to earthquake dynamics of large events. We improve the CS-BP method by an autoadaptive source grid refinement as well as a reference source adjustment technique to gain better spatial and temporal resolution of the locations of the radiated bursts. We also use a two-step source spectral analysis based on (i) simple theoretical Green's functions that include depth phases and water reverberations and on (ii) empirical P wave Green's functions. Furthermore, we propose a source spectrogram methodology that provides the temporal evolution of dynamic parameters such as radiated energy and falloff rates. Bridging backprojection and spectrogram analysis provides a spatial and temporal evolution of these dynamic source parameters. We apply our technique to the recent 2015 Mw 8.3 megathrust Illapel earthquake (Chile). The results from both techniques are consistent and reveal a depth-varying seismic radiation that is also found in other megathrust earthquakes. The low-frequency content of the seismic radiation is located in the shallow part of the megathrust, propagating unilaterally from the hypocenter toward the trench while most of the high-frequency content comes from the downdip part of the fault. Interpretation of multiple rupture stages in the radiation is also supported by the temporal variations of radiated energy and falloff rates. Finally, we discuss the possible mechanisms, either from prestress, fault geometry, and/or frictional properties to explain our observables. Our methodology is an attempt to bridge kinematic observations with earthquake dynamics.

  11. Multi Dimensional Honey Bee Foraging Algorithm Based on Optimal Energy Consumption

    NASA Astrophysics Data System (ADS)

    Saritha, R.; Vinod Chandra, S. S.

    2017-10-01

    In this paper a new nature-inspired algorithm is proposed, based on the natural foraging behavior of multi-dimensional honey bee colonies. This method handles issues that arise when food is shared from multiple sources by multiple swarms at multiple destinations. The self-organizing behavior of natural honey bee swarms across multiple colonies is based on the principle of energy consumption. Swarms of multiple colonies select a food source that optimally fulfills the requirements of their colonies, based on the energy required to transport food between a source and a destination. Minimizing energy use maximizes profit in each colony, and the mathematical model proposed here is based on this principle. It has been successfully evaluated by applying it to a multi-objective transportation problem, optimizing cost and time. The algorithm optimizes the needs at each destination in linear time.
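    The source-selection-by-energy idea maps naturally onto the classical transportation problem. As a rough sketch only (the paper's actual algorithm is not reproduced here), a greedy heuristic that repeatedly ships along the cheapest remaining source-to-destination edge captures the "choose the lowest-energy source" behavior:

```python
import numpy as np

def greedy_transport(supply, demand, cost):
    """Greedy energy-minimizing allocation: ship along the cheapest
    remaining (source, destination) edge until all demand is met.
    An illustrative heuristic, not the authors' algorithm."""
    supply, demand = supply.copy(), demand.copy()
    plan = np.zeros_like(cost, dtype=float)
    for flat in np.argsort(cost, axis=None):      # edges by ascending cost
        i, j = divmod(int(flat), cost.shape[1])
        qty = min(supply[i], demand[j])
        if qty > 0:
            plan[i, j] = qty
            supply[i] -= qty
            demand[j] -= qty
    return plan, float((plan * cost).sum())
```

    A multi-objective version would replace `cost` with a weighted combination of cost and time, which is the kind of trade-off the abstract's evaluation optimizes.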

  12. Multiple Food-Animal-Borne Route in Transmission of Antibiotic-Resistant Salmonella Newport to Humans

    PubMed Central

    Pan, Hang; Paudyal, Narayan; Li, Xiaoliang; Fang, Weihuan; Yue, Min

    2018-01-01

    Characterization of the transmission routes of Salmonella among various food-animal reservoirs, and of their antibiograms, is crucial for appropriate intervention and medical treatment. Here, we analyzed 3728 Salmonella enterica serovar Newport (S. Newport) isolates collected from various food animals, retail meats, and humans in the United States between 1996 and 2015, based on their minimum inhibitory concentrations (MICs) toward 27 antibiotics. Random Forest and Hierarchical Clustering statistics were used to group the isolates according to their MICs. Classification and Regression Tree (CART) analysis was used to identify the appropriate antibiotic and its cutoff value for distinguishing the human and animal populations. Both methods revealed two distinct populations based on the MICs of individual strains, with the animal population having significantly higher MICs, which correlate with an antibiotic-resistance (AR) phenotype. Only ~9.7% (267/2763) of human isolates could be attributed to food-animal origins. Furthermore, the isolates of animal origin had less diverse antibiograms than the human isolates (P < 0.001), suggesting multiple sources are involved in human infections. CART identified trimethoprim-sulfamethoxazole as the best classifier for differentiating the animal and human isolates. Additionally, two typical AR patterns, MDR-Amp and Tet-SDR, dominant in the bovine or turkey populations, were identified, indicating that distinct food-animal sources could be involved in human infections. The AR analysis suggested that fluoroquinolones (i.e., ciprofloxacin), but not extended-spectrum cephalosporins (i.e., ceftriaxone, cefoxitin), are the appropriate choice for empirical therapy. Antibiotic-resistant S. Newport from humans has multiple origins, with a distinct food-animal-borne route contributing a significant proportion of the heterogeneous isolates. PMID:29410657
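    The CART cutoff-finding step reduces, for one antibiotic, to a one-level decision stump: scan candidate MIC thresholds and keep the one minimizing weighted Gini impurity. The sketch below shows that idea on made-up data; it is not the published model, and the example values are hypothetical.

```python
import numpy as np

def best_cutoff(values, labels):
    """One-level CART split on a single feature (e.g. one antibiotic's
    MICs): return the threshold minimizing weighted Gini impurity for
    binary labels (0/1, e.g. human/animal origin)."""
    def gini(y):
        if len(y) == 0:
            return 0.0
        p = np.mean(y)
        return 2 * p * (1 - p)
    best_t, best_score = None, np.inf
    for t in np.unique(values):
        left, right = labels[values <= t], labels[values > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```

    A full CART grows a tree of such splits over all 27 antibiotics; the abstract reports that the trimethoprim-sulfamethoxazole split alone separated the two populations best.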

  13. Analysis of the resilience of team performance during a nuclear emergency response exercise.

    PubMed

    Gomes, José Orlando; Borges, Marcos R S; Huber, Gilbert J; Carvalho, Paulo Victor R

    2014-05-01

    The current work presents results from a cognitive task analysis (CTA) of a nuclear disaster simulation. Audio-visual records were collected from an emergency room team composed of individuals from 26 different agencies as they responded to multiple scenarios in a simulated nuclear disaster. This simulation was part of a national emergency response training activity for a nuclear power plant located in a developing country. The objectives of this paper are to describe sources of resilience and brittleness in these activities, identify cues for potential improvements to future emergency simulations, and leverage the resilience of the emergency response system in case of a real disaster. Multiple CTA techniques were used to gain a better understanding of the cognitive dimensions of the activity and to identify team coordination and crisis management patterns that emerged from the simulation exercises. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  14. Three-dimensional finite elements for the analysis of soil contamination using a multiple-porosity approach

    NASA Astrophysics Data System (ADS)

    El-Zein, Abbas; Carter, John P.; Airey, David W.

    2006-06-01

    A three-dimensional finite-element model of contaminant migration in fissured clays or contaminated sand which includes multiple sources of non-equilibrium processes is proposed. The conceptual framework can accommodate a regular network of fissures in 1D, 2D or 3D and immobile solutions in the macro-pores of aggregated topsoils, as well as non-equilibrium sorption. A Galerkin weighted-residual statement for the three-dimensional form of the equations in the Laplace domain is formulated. Equations are discretized using linear and quadratic prism elements. The system of algebraic equations is solved in the Laplace domain and solution is inverted to the time domain numerically. The model is validated and its scope is illustrated through the analysis of three problems: a waste repository deeply buried in fissured clay, a storage tank leaking into sand and a sanitary landfill leaching into fissured clay over a sand aquifer.
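    Solving in the Laplace domain and numerically inverting to the time domain, as this model does, requires an inversion algorithm. A commonly used option for this class of problem is the Stehfest algorithm, sketched below; the abstract does not name its inversion scheme, so this is an illustration of the general step, not the authors' method.

```python
import math

def stehfest_coefficients(N):
    """Stehfest weights V_k (N must be even)."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s):
    f(t) ~ (ln 2 / t) * sum_k V_k * F(k ln 2 / t)."""
    V = stehfest_coefficients(N)
    a = math.log(2.0) / t
    return a * sum(V[k - 1] * F(k * a) for k in range(1, N + 1))
```

    In a Laplace-domain finite-element solver, `F` would be the nodal concentration evaluated by re-solving the algebraic system at each required value of s.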

  15. Laser-assisted atom probe tomography of four paired poly-Si/SiO2 multiple-stacks with each thickness of 10 nm

    NASA Astrophysics Data System (ADS)

    Kwak, C.-M.; Seol, J.-B.; Kim, Y.-T.; Park, C.-G.

    2017-02-01

    For the past 10 years, laser-assisted atom probe tomography (APT) has been used to quantify the near-atomic-scale distribution of elements and their local chemical compositions within the interfaces that determine the design, processing, and properties of virtually all materials. However, the laser-induced emission occurring at the surface of a needle-shaped sample is highly complex, and it has been an ongoing challenge to understand the surface-related interactions between laser sources and tips containing non-conductive oxides well enough for robust and reliable analysis of multiple-stacked devices. Here, we find that APT analysis of four paired poly-Si/SiO2 (conductive/non-conductive) multiple stacks, each layer 10 nm thick, is governed by three experimental conditions: laser-beam energies ranging from 30 to 200 nJ, analysis temperatures varying from 30 to 100 K, and the inclination of the aligned interfaces within a given tip relative to the analysis direction. The compositional ratio of doubly to singly charged Si ions within the conductive poly-Si layers changes drastically with laser energy and analysis temperature, compared with that detected in the non-conductive layers. Severely distorted APT images of the multiple stacks are also inevitable, especially at the conductive layers, lowering the successful analysis yield. This lower throughput was overcome by changing the inclination of the interfaces within a given tip relative to the analysis direction (planar interfaces parallel to the tip axis), but significant deviations in the chemical compositions of a conductive layer from those of tips containing planar interfaces perpendicular to the tip axis are unavoidable, owing to the Si2, SiH2O, and Si2O ions detected, for the first time, within the poly-Si layers.

  16. Model-free data analysis for source separation based on Non-Negative Matrix Factorization and k-means clustering (NMFk)

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Alexandrov, B.

    2014-12-01

    The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualizing the hydrological conditions and characterizing system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, or the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each microphone records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the sources as barometric-pressure and water-supply pumping effects and estimate their respective impacts.
We also estimate the location of the water-supply pumping wells based on the available data. The possible applications of the NMFk algorithm are not limited to hydrology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
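    As a rough illustration of the NMFk idea (a minimal sketch, not the authors' code; the synthetic transients and the use of scikit-learn are assumptions), NMF unmixes non-negative observed signals into candidate sources, and k-means clustering of sources pooled from repeated random-restart runs gauges how reproducible those sources are:

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
# Two hypothetical non-negative source transients (e.g. pumping, barometric).
sources = np.vstack([np.abs(np.sin(0.8 * t)), np.exp(-0.3 * t)])
mixing = rng.uniform(0.2, 1.0, size=(6, 2))  # 6 observation points, 2 sources
X = mixing @ sources                          # mixed signals, shape (6, 500)

# Run NMF several times with random restarts and pool the recovered sources.
runs = []
for seed in range(10):
    model = NMF(n_components=2, init="random", random_state=seed, max_iter=1000)
    W = model.fit_transform(X)                # mixing-coefficient estimates
    H = model.components_                     # source-transient estimates
    H = H / np.linalg.norm(H, axis=1, keepdims=True)
    runs.append(H)
pooled = np.vstack(runs)                      # (10 runs * 2 sources, 500)

# Tight, well-populated k-means clusters indicate reproducible sources.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pooled)
print("cluster sizes:", np.bincount(km.labels_))
```

    In NMFk this clustering quality is what selects the number of sources r itself: r is increased until the clusters stop being tight and well separated.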

  17. Method of analyzing multiple sample simultaneously by detecting absorption and systems for use in such a method

    DOEpatents

    Yeung, Edward S.; Gong, Xiaoyi

    2004-09-07

    The present invention provides a method of analyzing multiple samples simultaneously by absorption detection. The method comprises: (i) providing a planar array of multiple containers, each of which contains a sample comprising at least one absorbing species, (ii) irradiating the planar array of multiple containers with a light source and (iii) detecting absorption of light with a detection means that is in line with the light source at a distance of at least about 10 times a cross-sectional distance of a container in the planar array of multiple containers. The absorption of light by a sample indicates the presence of an absorbing species in it. The method can further comprise: (iv) measuring the amount of absorption of light detected in (iii), indicating the amount of the absorbing species in the sample. Also provided by the present invention is a system for use in the above method. The system comprises: (i) a light source comprising or consisting essentially of at least one wavelength of light, the absorption of which is to be detected, (ii) a planar array of multiple containers, and (iii) a detection means that is in line with the light source and is positioned in line with and parallel to the planar array of multiple containers at a distance of at least about 10 times a cross-sectional distance of a container.
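    Step (iv), quantifying the absorbing species from detected absorption, is in essence a Beer-Lambert calculation. A minimal sketch (the intensities and coefficients below are hypothetical, not from the patent):

```python
import numpy as np

def absorbance(I_sample, I_reference):
    """Absorbance A = -log10(I / I0) for each container in the array."""
    return -np.log10(np.asarray(I_sample) / np.asarray(I_reference))

def concentration(A, epsilon, path_cm):
    """Beer-Lambert law A = epsilon * c * l, solved for concentration c."""
    return A / (epsilon * path_cm)

# Hypothetical detector readings for a planar array of four containers.
I0 = np.array([1000.0, 1000.0, 1000.0, 1000.0])  # blank (no absorber)
I = np.array([1000.0, 500.0, 100.0, 10.0])       # with samples present

A = absorbance(I, I0)
print(A)                                          # ≈ [0, 0.301, 1, 2]
c = concentration(A, epsilon=1.0e4, path_cm=1.0)  # illustrative units
```

    In the patented geometry the reference intensity would come from the same detector with an empty or blank container, so each container in the array gets its own absorbance reading in one exposure.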

  18. Light scattering and transmission measurement using digital imaging for online analysis of constituents in milk

    NASA Astrophysics Data System (ADS)

    Jain, Pranay; Sarma, Sanjay E.

    2015-05-01

    Milk is an emulsion of fat globules and casein micelles dispersed in an aqueous medium with dissolved lactose, whey proteins and minerals. Quantification of constituents in milk is important at various stages of the dairy supply chain for proper process control and quality assurance. In field-level applications, spectrophotometric analysis is an economical option due to the low cost of silicon photodetectors, which are sensitive to UV/Vis radiation with wavelengths between 300 and 1100 nm. Both absorption and scattering occur as incident UV/Vis radiation interacts with dissolved and dispersed constituents in milk. These effects can in turn be used to characterize the chemical and physical composition of a milk sample. However, in order to simplify analysis, most existing instruments require dilution of samples to avoid the effects of multiple scattering. The sample preparation steps are usually expensive, prone to human error and unsuitable for field-level and online analysis. This paper introduces a novel digital-imaging-based method for online spectrophotometric measurements on raw milk without any sample preparation. Multiple LEDs of different emission spectra are used as discrete light sources and a digital CMOS camera is used as an image sensor. The extinction characteristic of samples is derived from captured images. The dependence of multiple scattering on the power of incident radiation is exploited to quantify scattering. The method has been validated with experiments for response to varying fat concentrations and fat globule sizes. Despite the presence of multiple scattering, the method is able to unequivocally quantify the extinction of incident radiation and relate it to the fat concentrations and globule sizes of samples.

  19. Source Apportionment and Influencing Factor Analysis of Residential Indoor PM2.5 in Beijing

    PubMed Central

    Yang, Yibing; Liu, Liu; Xu, Chunyu; Li, Na; Liu, Zhe; Wang, Qin; Xu, Dongqun

    2018-01-01

    In order to identify the sources of indoor PM2.5 and to determine which factors influence the concentrations of indoor PM2.5 and its chemical elements, indoor concentrations of PM2.5 and its related elements in residential houses in Beijing were explored. Indoor and outdoor PM2.5 samples monitored continuously for one week were collected. Indoor and outdoor concentrations of PM2.5 and 15 elements (Al, As, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, Pb, Se, Tl, V, Zn) were calculated and compared. The median indoor concentration of PM2.5 was 57.64 μg/m3. Among elements in indoor PM2.5, Cd and As may be sensitive to indoor smoking; Zn, Ca and Al may be related to indoor sources other than smoking; and Pb, V and Se may mainly come from outdoors. Five factors were extracted for indoor PM2.5 by factor analysis, explaining 76.8% of the total variance; outdoor sources contributed more than indoor sources. Multiple linear regression analysis for indoor PM2.5, Cd and Pb was performed. Indoor PM2.5 was influenced by factors including outdoor PM2.5, smoking during sampling, outdoor temperature and time of air-conditioner use. Indoor Cd was affected by factors including smoking during sampling, outdoor Cd and building age. Indoor Pb concentration was associated with factors including outdoor Pb, time of window opening per day, building age and RH. In conclusion, indoor PM2.5 mainly comes from outdoor sources, but the contributions of indoor sources cannot be ignored. Factors associated with indoor-outdoor air exchange can influence the concentrations of indoor PM2.5 and its constituents. PMID:29621164

  20. Source Apportionment and Influencing Factor Analysis of Residential Indoor PM2.5 in Beijing.

    PubMed

    Yang, Yibing; Liu, Liu; Xu, Chunyu; Li, Na; Liu, Zhe; Wang, Qin; Xu, Dongqun

    2018-04-05

    In order to identify the sources of indoor PM2.5 and to determine which factors influence the concentrations of indoor PM2.5 and its chemical elements, indoor concentrations of PM2.5 and its related elements in residential houses in Beijing were explored. Indoor and outdoor PM2.5 samples monitored continuously for one week were collected. Indoor and outdoor concentrations of PM2.5 and 15 elements (Al, As, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, Pb, Se, Tl, V, Zn) were calculated and compared. The median indoor concentration of PM2.5 was 57.64 μg/m³. Among elements in indoor PM2.5, Cd and As may be sensitive to indoor smoking; Zn, Ca and Al may be related to indoor sources other than smoking; and Pb, V and Se may mainly come from outdoors. Five factors were extracted for indoor PM2.5 by factor analysis, explaining 76.8% of the total variance; outdoor sources contributed more than indoor sources. Multiple linear regression analysis for indoor PM2.5, Cd and Pb was performed. Indoor PM2.5 was influenced by factors including outdoor PM2.5, smoking during sampling, outdoor temperature and time of air-conditioner use. Indoor Cd was affected by factors including smoking during sampling, outdoor Cd and building age. Indoor Pb concentration was associated with factors including outdoor Pb, time of window opening per day, building age and RH. In conclusion, indoor PM2.5 mainly comes from outdoor sources, but the contributions of indoor sources cannot be ignored. Factors associated with indoor-outdoor air exchange can influence the concentrations of indoor PM2.5 and its constituents.
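    The multiple linear regression step can be sketched with scikit-learn on synthetic data (the variable names, coefficients, and sample size below are illustrative assumptions, not the study's data):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
outdoor_pm25 = rng.uniform(10, 150, n)   # outdoor PM2.5, µg/m³
smoking = rng.integers(0, 2, n)          # smoking during sampling (0/1)
temp = rng.uniform(-5, 30, n)            # outdoor temperature, °C

# Synthetic indoor PM2.5 driven mainly by infiltration of outdoor PM2.5,
# with a smoking contribution and a weak temperature effect.
indoor_pm25 = (5 + 0.6 * outdoor_pm25 + 20 * smoking
               - 0.2 * temp + rng.normal(0, 5, n))

X = np.column_stack([outdoor_pm25, smoking, temp])
model = LinearRegression().fit(X, indoor_pm25)
print("coefficients:", model.coef_)      # ≈ [0.6, 20, -0.2]
print("R^2:", model.score(X, indoor_pm25))
```

    The fitted coefficients recover the simulated effects; in the study the analogous coefficients quantify how much outdoor concentration, smoking, and ventilation-related factors each contribute to indoor levels.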

  1. Genetic source tracking of an anthrax outbreak in Shaanxi province, China.

    PubMed

    Liu, Dong-Li; Wei, Jian-Chun; Chen, Qiu-Lan; Guo, Xue-Jun; Zhang, En-Min; He, Li; Liang, Xu-Dong; Ma, Guo-Zhu; Zhou, Ti-Cao; Yin, Wen-Wu; Liu, Wei; Liu, Kai; Shi, Yi; Ji, Jian-Jun; Zhang, Hui-Juan; Ma, Lin; Zhang, Fa-Xin; Zhang, Zhi-Kai; Zhou, Hang; Yu, Hong-Jie; Kan, Biao; Xu, Jian-Guo; Liu, Feng; Li, Wei

    2017-01-17

    Anthrax is an acute zoonotic infectious disease caused by the bacterium Bacillus anthracis. From 26 July to 8 August 2015, an outbreak with 20 suspected cutaneous anthrax cases was reported in Ganquan County, Shaanxi province, China. In this study, genetic source tracking of the anthrax outbreak was performed by molecular epidemiological methods. Three molecular typing methods, namely canonical single nucleotide polymorphism (canSNP) typing, multiple-locus variable-number tandem repeat analysis (MLVA), and single nucleotide repeat (SNR) analysis, were used to investigate the possible source of transmission and to identify the genetic relationships among the strains isolated from human cases and diseased animals during the outbreak. Five strains isolated from diseased mules clustered together with the patients' isolates by canSNP typing and MLVA. The causative B. anthracis lineage in this outbreak belonged to the A.Br.001/002 canSNP subgroup and the MLVA15-31 genotype (the 31 genotype in the MLVA15 scheme). Because nine isolates from another four provinces in China clustered together with the outbreak-related strains by canSNP (A.Br.001/002 subgroup) and MLVA15 (MLVA15-31 genotype), an additional SNR analysis (CL10, CL12, CL33, and CL35) was used to trace the source of the outbreak; the results suggest that the patients in this outbreak were probably infected by the same pathogen clone. It was deduced that the anthrax outbreak that occurred in Shaanxi province, China in 2015 was a local occurrence.

  2. GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.

    PubMed

    Zheng, Qi; Wang, Xiu-Jie

    2008-07-01

    Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools with GO-related analysis functions exist, new tools are still needed to meet the requirements of data generated by newly developed technologies or of advanced analysis purposes. Here, we present the Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis of data from various sources (probe or probe-set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross-comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
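    GO term overrepresentation of the kind GOEAST reports is typically assessed with a hypergeometric test. A minimal sketch with scipy (the gene counts are made up, and GOEAST's exact statistics and multiple-testing corrections may differ):

```python
from scipy.stats import hypergeom

# Hypothetical counts: a genome of 20,000 genes, 400 of them annotated
# with some GO term; a study set of 500 genes contains 30 annotated genes.
M, K, n, k = 20000, 400, 500, 30

# Expected annotated genes in the study set under random sampling:
expected = n * K / M                  # 10.0

# P(X >= k): chance of seeing at least k annotated genes by luck alone.
# scipy's parametrization: hypergeom(M=population, n=successes, N=draws).
p_value = hypergeom.sf(k - 1, M, K, n)
print(f"expected {expected}, observed {k}, p = {p_value:.3g}")
```

    A small p-value marks the GO term as enriched in the study set; a real tool would repeat this for every term and correct for multiple testing.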

  3. Simultaneous Exposure to Multiple Air Pollutants Influences Alveolar Epithelial Cell Ion Transport

    EPA Science Inventory

    Purpose. Air pollution sources generally release multiple pollutants simultaneously and yet, research has historically focused on the source-to-health linkages of individual air pollutants. We recently showed that exposure of alveolar epithelial cells to a combination of particul...

  4. Time-Frequency Analysis of the Dispersion of Lamb Modes

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A(sub 0), A(sub 1), S(sub 0), and S(sub 2) Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source to detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.
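    The paper uses the pseudo Wigner-Ville distribution; as an illustrative stand-in (not the authors' method), a short-time Fourier spectrogram of a synthetic dispersive arrival shows the basic idea of reading arrival time off per frequency. The signal below is a simple chirp, not real Lamb-mode physics:

```python
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 1_000_000                  # 1 MHz sampling rate
t = np.arange(0, 0.001, 1 / fs)
# Synthetic dispersive arrival: high frequencies arrive first, roughly how
# the S0 mode behaves at low frequency-thickness (illustrative only).
sig = chirp(t, f0=400e3, f1=50e3, t1=t[-1], method="linear")

f, tau, Sxx = spectrogram(sig, fs=fs, nperseg=128, noverlap=96)
# Ridge of the time-frequency image: dominant frequency per time slice.
# Arrival time vs. frequency along this ridge gives group-velocity
# dispersion once divided into the propagation distance.
ridge = f[np.argmax(Sxx, axis=0)]
print(ridge[0] > ridge[-1])     # True: frequency decreases with arrival time
```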

  5. Power Watch: Increasing Transparency and Accessibility of Data in the Global Power Sector to Accelerate the Transition to a Lower Carbon Economy

    NASA Astrophysics Data System (ADS)

    Hennig, R. J.; Friedrich, J.; Malaguzzi Valeri, L.; McCormick, C.; Lebling, K.; Kressig, A.

    2016-12-01

    The Power Watch project will offer open data on the global electricity sector, starting with power plants and their impacts on climate and water systems; it will also offer visualizations and decision-making tools. Power Watch will create the first comprehensive, open database of power plants globally by compiling data from national governments, public and private utilities, transmission grid operators, and other data providers to create a core dataset covering over 80% of global installed capacity for electrical generation. Power plant data will at a minimum include latitude and longitude, capacity, fuel type, emissions, water usage, ownership, and annual generation. By providing data that is both comprehensive and publicly available, this project will support decision making and analysis by actors across the economy and in the research community. The Power Watch research effort focuses on creating a global standard for power plant information, gathering and standardizing data from multiple sources, matching information from multiple sources at the plant level, testing cross-validation approaches (regional statistics, crowdsourcing, satellite data, and others), and developing estimation methodologies for generation, emissions, and water usage. When not available from official reports, emissions, annual generation, and water usage will be estimated. Water use estimates of power plants will be based on capacity, fuel type and satellite imagery to identify cooling types. This analysis is being piloted in several states in India and will then be scaled up to a global level. Other planned applications of the Power Watch data include improving understanding of energy access, air pollution, emissions estimation, stranded asset analysis, life cycle analysis, tracking of proposed plants and curtailment analysis.

  6. Determinants of Internet use as a preferred source of information on personal health.

    PubMed

    Lemire, Marc; Paré, Guy; Sicotte, Claude; Harvey, Charmian

    2008-11-01

    To understand the personal, social and cultural factors likely to explain recourse to the Internet as a preferred source of personal health information. A cross-sectional survey was conducted among a population of 2923 Internet users visiting a firmly established website that offers information on personal health. Multiple regression analysis was performed to identify the determinants of site use. The analysis template comprised four classes of determinants likely to explain Internet use: beliefs, intentions, user satisfaction and socio-demographic characteristics. Seven-point Likert scales were used. An analysis of the psychometric qualities of the variables provided compelling evidence of the construct's validity and reliability. A confirmatory factor analysis confirmed the correspondence with the factors predicted by the theoretical model. The regression analysis explained 35% of the variance in Internet use. Use was directly associated with five factors: perceived usefulness, importance given to written media in searches for health information, concern for personal health, importance given to the opinions of physicians and other health professionals, and the trust placed in the information available on the site itself. This study confirms the importance of the credibility of information on the frequency of Internet use as a preferred source of information on personal health. It also shows the potentially influential role of the Internet in the development of personal knowledge of health issues.

  7. Using data sources beyond PubMed has a modest impact on the results of systematic reviews of therapeutic interventions.

    PubMed

    Halladay, Christopher W; Trikalinos, Thomas A; Schmid, Ian T; Schmid, Christopher H; Dahabreh, Issa J

    2015-09-01

    Searching multiple sources when conducting systematic reviews is considered good practice. We aimed to investigate the impact of using sources beyond PubMed in systematic reviews of therapeutic interventions. We randomly selected 50 Cochrane reviews that searched the PubMed (or MEDLINE) and EMBASE databases and included a meta-analysis of ≥10 studies. We checked whether each eligible record in each review (n = 2,700) was retrievable in PubMed and EMBASE. For the first-listed meta-analysis of ≥10 studies in each review, we examined whether excluding studies not found in PubMed affected results. A median of one record per review was indexed in EMBASE but not in PubMed; a median of four records per review was not indexed in PubMed or EMBASE. Meta-analyses included a median of 13.5 studies; a median of zero studies per meta-analysis was indexed in EMBASE but not in PubMed; a median of one study per meta-analysis was not indexed in PubMed or EMBASE. Meta-analysis using only PubMed-indexed vs. all available studies led to a different conclusion in a single case (on the basis of conventional criteria for statistical significance). In meta-regression analyses, effects in PubMed- vs. non-PubMed-indexed studies were statistically significantly different in a single data set. For systematic reviews of the effects of therapeutic interventions, gains from searching sources beyond PubMed, and from searching EMBASE in particular are modest. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Multi-diversity combining and selection for relay-assisted mixed RF/FSO system

    NASA Astrophysics Data System (ADS)

    Chen, Li; Wang, Weidong

    2017-12-01

    We propose and analyze multi-diversity combining and selection to enhance the performance of a relay-assisted mixed radio frequency/free-space optics (RF/FSO) system. We focus on a practical cellular-network scenario where a single-antenna source communicates with a multi-aperture destination through a relay equipped with multiple receive antennas and multiple transmit apertures. The RF single input multiple output (SIMO) links employ either maximal-ratio combining (MRC) or receive antenna selection (RAS), and the FSO multiple input multiple output (MIMO) links adopt either repetition coding (RC) or transmit laser selection (TLS). The performance is evaluated via an outage probability analysis over Rayleigh fading RF links and Gamma-Gamma atmospheric turbulence FSO links with pointing errors, where a channel state information (CSI) assisted amplify-and-forward (AF) scheme is considered. Asymptotic closed-form expressions at high signal-to-noise ratio (SNR) are also derived. Coding gain and diversity order for different combining and selection schemes are further discussed. Numerical results are provided to verify and illustrate the analytical results.
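    As a toy illustration of why combining and selection schemes differ in outage performance (RF side only, i.i.d. Rayleigh fading; this is not the paper's mixed RF/FSO analysis with pointing errors), a Monte Carlo comparison of MRC, antenna selection, and no diversity:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 4, 200_000          # 4 receive antennas
snr_avg = 10.0                  # average per-branch SNR (linear scale)
threshold = 2.0                 # outage SNR threshold

# Under Rayleigh fading, per-branch instantaneous SNRs are exponential.
g = rng.exponential(snr_avg, size=(trials, N))

p_out_mrc = np.mean(g.sum(axis=1) < threshold)   # maximal-ratio combining
p_out_sel = np.mean(g.max(axis=1) < threshold)   # receive antenna selection
p_out_single = np.mean(g[:, 0] < threshold)      # no diversity

print(f"MRC: {p_out_mrc:.2e}  selection: {p_out_sel:.2e}  "
      f"single: {p_out_single:.2e}")
```

    MRC sums the branch SNRs while selection keeps only the best branch, so MRC's outage is never worse (a sum below threshold forces the maximum below it too); both schemes achieve full diversity order N, differing only in coding gain, which mirrors the trade-off the paper quantifies for its MRC/RAS and RC/TLS combinations.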

  9. Rg-Lg coupling as a Lg-wave excitation mechanism

    NASA Astrophysics Data System (ADS)

    Ge, Z.; Xie, X.

    2003-12-01

    Regional phase Lg is predominantly comprised of shear wave energy trapped in the crust. Explosion sources are expected to be less efficient than earthquakes at exciting Lg phases, to the extent that the source can be approximated as isotropic. Shallow explosions generate relatively large surface wave Rg compared to deeper earthquakes, and Rg is readily disrupted by crustal heterogeneity. Rg energy may thus scatter into trapped crustal S-waves near the source region and contribute to the low-frequency Lg wave. In this study, finite-difference modeling combined with slowness analysis is used to investigate this Lg-wave excitation mechanism. The method allows us to investigate near-source energy partitioning in multiple domains, including frequency, slowness and time. Its main advantage is that it can be applied at close range, before Lg is actually formed, which allows us to use a very fine near-source velocity model to simulate the energy partitioning process. We use a layered velocity structure as the background model and add small near-source random velocity patches to the model to generate the Rg to Lg coupling. Two types of simulations are conducted: (1) a fixed shallow explosion source vs. randomness at different depths, and (2) a fixed shallow randomness vs. explosion sources at different depths. The results show apparent coupling between the Rg and Lg waves at lower frequencies (0.3-1.5 Hz). A shallow source combined with shallow randomness generates the maximum Lg-wave, which is consistent with the Rg energy distribution of a shallow explosion source. The Rg energy and excited Lg energy show a near-linear relationship. The numerical simulation and slowness analysis suggest that Rg to Lg coupling is an effective excitation mechanism for low-frequency Lg-waves from a shallow explosion source.

  10. An Analysis of the Relationship Between Atmospheric Heat Transport and the Position of the ITCZ in NASA NEWS products, CMIP5 GCMs, and Multiple Reanalyses

    NASA Astrophysics Data System (ADS)

    Stanfield, R.; Dong, X.; Su, H.; Xi, B.; Jiang, J. H.

    2016-12-01

    In the past few years, studies have found a strong connection between atmospheric heat transport across the equator (AHTEQ) and the position of the ITCZ. This study investigates the seasonal, annual-mean and interannual variability of the ITCZ position and explores the relationships between the ITCZ position and inter-hemispheric energy transport in NASA NEWS products, multiple reanalysis datasets, and CMIP5 simulations. We find that large discrepancies exist in the ITCZ-AHTEQ relationships across these datasets and model simulations. The components of the energy fluxes are examined to identify the primary sources of the discrepancies among the datasets and model results.

  11. Laser Spiderweb Sensor Used with Portable Handheld Devices

    NASA Technical Reports Server (NTRS)

    Scott, David C. (Inventor); Ksendzov, Alexander (Inventor); George, Warren P. (Inventor); Smith, James A. (Inventor); Steinkraus, Joel M. (Inventor); Hofmann, Douglas C. (Inventor); Aljabri, Abdullah S. (Inventor); Bendig, Rudi M. (Inventor)

    2017-01-01

    A portable spectrometer, including a smart phone case storing a portable spectrometer, wherein the portable spectrometer includes a cavity; a source for emitting electromagnetic radiation that is directed on a sample in the cavity, wherein the electromagnetic radiation is reflected within the cavity to form multiple passes of the electromagnetic radiation through the sample; a detector for detecting the electromagnetic radiation after the electromagnetic radiation has made the multiple passes through the sample in the cavity, the detector outputting a signal in response to the detecting; and a device for communicating the signal to a smart phone, wherein the smart phone executes an application that performs a spectral analysis of the signal.

  12. Computational overlay metrology with adaptive data analytics

    NASA Astrophysics Data System (ADS)

    Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris

    2017-03-01

    With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab passes through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and a lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, known as adaptive analytics, can use data collected before the wafer is exposed to detect small process-dependent wafer-to-wafer changes in overlay.

  13. PSAT: A web tool to compare genomic neighborhoods of multiple prokaryotic genomes

    PubMed Central

    Fong, Christine; Rohmer, Laurence; Radey, Matthew; Wasnick, Michael; Brittnacher, Mitchell J

    2008-01-01

    Background The conservation of gene order among prokaryotic genomes can provide valuable insight into gene function, protein interactions, or events by which genomes have evolved. Although some tools are available for visualizing and comparing the order of genes between genomes of study, few support an efficient and organized analysis between large numbers of genomes. The Prokaryotic Sequence homology Analysis Tool (PSAT) is a web tool for comparing gene neighborhoods among multiple prokaryotic genomes. Results PSAT utilizes a database that is preloaded with gene annotation, BLAST hit results, and gene-clustering scores designed to help identify regions of conserved gene order. Researchers use the PSAT web interface to find a gene of interest in a reference genome and efficiently retrieve the sequence homologs found in other bacterial genomes. The tool generates a graphic of the genomic neighborhood surrounding the selected gene and the corresponding regions for its homologs in each comparison genome. Homologs in each region are color coded to assist users with analyzing gene order among various genomes. In contrast to common comparative analysis methods that filter sequence homolog data based on alignment score cutoffs, PSAT leverages gene context information for homologs, including those with weak alignment scores, enabling a more sensitive analysis. Features for constraining or ordering results are designed to help researchers browse results from large numbers of comparison genomes in an organized manner. PSAT has been demonstrated to be useful for helping to identify gene orthologs and potential functional gene clusters, and detecting genome modifications that may result in loss of function. Conclusion PSAT allows researchers to investigate the order of genes within local genomic neighborhoods of multiple genomes. 
A PSAT web server for public use is available for performing analyses on a growing set of reference genomes through any web browser, with no client-side software setup or installation required. Source code is freely available to researchers interested in setting up a local version of PSAT for analysis of genomes not available through the public server. Access to the public web server and instructions for obtaining source code can be found at . PMID:18366802

  14. A Comparison of Mathematical Models of Fish Mercury Concentration as a Function of Atmospheric Mercury Deposition Rate and Watershed Characteristics

    NASA Astrophysics Data System (ADS)

    Smith, R. A.; Moore, R. B.; Shanley, J. B.; Miller, E. K.; Kamman, N. C.; Nacci, D.

    2009-12-01

    Mercury (Hg) concentrations in fish and aquatic wildlife are complex functions of atmospheric Hg deposition rate, terrestrial and aquatic watershed characteristics that influence Hg methylation and export, and food chain characteristics determining Hg bioaccumulation. Because of the complexity and incomplete understanding of these processes, regional-scale models of fish tissue Hg concentration are necessarily empirical in nature, typically constructed through regression analysis of fish tissue Hg concentration data from many sampling locations on a set of potential explanatory variables. Unless the data sets are unusually long and show clear time trends, model building must rely solely on spatial correlation. Predictive regional-scale models are highly useful for improving understanding of the relevant biogeochemical processes, as well as for practical fish and wildlife management and human health protection. Mechanistically, the logical arrangement of explanatory variables is to multiply each of the individual Hg source terms (e.g. dry, wet, and gaseous deposition rates, and residual watershed Hg) for a given fish sampling location by source-specific terms pertaining to methylation, watershed transport, and biological uptake for that location (e.g. SO4 availability, hill slope, lake size). This mathematical form has the desirable property that predicted tissue concentration will approach zero as all individual source terms approach zero. One complication with this form, however, is that it is inconsistent with the standard linear multiple regression equation, in which all terms (including those for sources and physical conditions) are additive. An important practical disadvantage of a model in which the Hg source terms are additive (rather than multiplicative) with their modifying factors is that predicted concentration is not zero when all sources are zero, making it unreliable for predicting the effects of large future reductions in Hg deposition. In this paper we compare the results of using several different linear and non-linear models in an analysis of watershed and fish Hg data for 450 New England lakes. The differences in model results pertain both to their utility in interpreting methylation and export processes and to their use in fisheries management.
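The contrast between the two model forms can be sketched numerically; the function names, coefficients, and modifier values below are invented for illustration, not taken from the New England analysis.

```python
# Hypothetical illustration of the two regression forms discussed above:
# a multiplicative form whose prediction vanishes with the sources, and a
# standard additive linear form whose prediction does not.

def multiplicative_model(sources, modifiers, beta):
    # Each Hg source term is scaled by location-specific modifier terms;
    # the prediction goes to zero as all source terms go to zero.
    return sum(b * s * m for b, s, m in zip(beta, sources, modifiers))

def additive_model(sources, modifiers, beta, gamma, intercept=0.05):
    # Standard linear regression: sources and modifiers enter additively,
    # so the prediction need not vanish when all sources are zero.
    return (intercept
            + sum(b * s for b, s in zip(beta, sources))
            + sum(g * m for g, m in zip(gamma, modifiers)))

beta = [0.8, 0.5]       # e.g. wet and dry deposition coefficients (invented)
gamma = [0.3, 0.2]      # modifier coefficients for the additive form (invented)
modifiers = [1.2, 0.9]  # e.g. SO4 availability, lake size effects (invented)

zero_sources = [0.0, 0.0]
print(multiplicative_model(zero_sources, modifiers, beta))   # 0.0
print(additive_model(zero_sources, modifiers, beta, gamma))  # nonzero residual
```

With zero deposition the multiplicative form predicts zero tissue concentration, while the additive form retains a spurious nonzero baseline, which is the practical disadvantage noted in the abstract.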

  15. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, which provides fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters has not yet been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of HPLF, a quantitative statistical analysis was carried out for five different error sources. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis suggest directions for further improving the accuracy of parameter diagnostics, which is critically important to its formal application in the daily routines of HPLF.
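The role of quantization error can be illustrated with a toy simulation; the bit depths and the random stand-in for a normalized diffraction pattern below are assumptions for illustration, not the paper's actual beam data.

```python
import numpy as np

# Hypothetical sketch (not the authors' simulation): quantize an ideal
# normalized intensity pattern to a given detector bit depth and measure
# the RMS error introduced, illustrating why low bit depth (high
# quantization error) degrades recovered beam parameters.

def quantize(signal, bits):
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

rng = np.random.default_rng(0)
pattern = rng.random(10000)  # stand-in for a normalized intensity pattern

for bits in (4, 8, 12):
    rms = np.sqrt(np.mean((quantize(pattern, bits) - pattern) ** 2))
    print(f"{bits}-bit detector: RMS quantization error = {rms:.2e}")
```

The RMS error shrinks roughly by a factor of 16 per 4 extra bits, which is why detector bit depth sets a floor on the accuracy of the recovered parameters.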

  16. A supertree of early tetrapods.

    PubMed Central

    Ruta, Marcello; Jeffery, Jonathan E; Coates, Michael I

    2003-01-01

    A genus-level supertree for early tetrapods is built using a matrix representation of 50 source trees. The analysis of all combined trees delivers a long-stemmed topology in which most taxonomic groups are assigned to the tetrapod stem. A second analysis, which excludes source trees superseded by more comprehensive studies, supports a deep phylogenetic split between lissamphibian and amniote total groups. Instances of spurious groups are rare in both analyses. The results of the pruned second analysis are mostly comparable with those of a recent, character-based and large-scale phylogeny of Palaeozoic tetrapods. Outstanding areas of disagreement include the branching sequence of lepospondyls and the content of the amniote crown group, in particular the placement of diadectomorphs as stem diapsids. Supertrees are unsurpassed in their ability to summarize relationship patterns from multiple independent topologies. Therefore, they might be used as a simple test of the degree of corroboration of nodes in the contributory analyses. However, we urge caution in using them as a replacement for character-based cladograms and for inferring macroevolutionary patterns. PMID:14667343
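The matrix representation step can be sketched as follows; this is the standard MRP-style encoding, shown on two invented toy trees rather than the 50 source trees of the study.

```python
# Sketch of matrix representation (the standard MRP encoding used to build
# supertrees): each clade in each source tree becomes a binary character;
# taxa absent from that source tree are scored '?'.

def mrp_matrix(source_trees, all_taxa):
    """source_trees: list of (taxon_set, clades), where clades is a list of
    frozensets of taxa forming the internal nodes of that tree."""
    matrix = {t: [] for t in all_taxa}
    for taxa, clades in source_trees:
        for clade in clades:
            for t in all_taxa:
                if t not in taxa:
                    matrix[t].append("?")  # taxon missing from this source tree
                else:
                    matrix[t].append("1" if t in clade else "0")
    return {t: "".join(chars) for t, chars in matrix.items()}

# Two toy source trees: ((A,B),C) and ((B,C),D)
trees = [
    ({"A", "B", "C"}, [frozenset({"A", "B"})]),
    ({"B", "C", "D"}, [frozenset({"B", "C"})]),
]
print(mrp_matrix(trees, ["A", "B", "C", "D"]))
# {'A': '1?', 'B': '11', 'C': '01', 'D': '?0'}
```

The resulting matrix is then analyzed with parsimony like ordinary character data, which is how conflicting source topologies get combined into one supertree.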

  17. Systemic Analysis of Foodborne Disease Outbreak in Korea.

    PubMed

    Lee, Jong-Kyung; Kwak, No-Seong; Kim, Hyun Jung

    2016-02-01

    This study systemically analyzed data on the prevalence of foodborne pathogens and foodborne disease outbreaks to identify the priorities of foodborne infection risk management in Korea. Multiple correspondence analysis was applied to three variables: origin of food source, phase of food supply chain, and 12 pathogens using 358 cases from 76 original papers and official reports published in 1998-2012. In addition, correspondence analysis of two variables--place and pathogen--was conducted based on epidemiological data of 2357 foodborne outbreaks in 2002-2011 provided by the Korean Ministry of Food and Drug Safety. The results of this study revealed three distinct areas of food monitoring: (1) livestock-derived raw food contaminated with Campylobacter spp., pathogenic Escherichia coli, Salmonella spp., and Listeria monocytogenes; (2) multi-ingredient and ready-to-eat food related to Staphylococcus aureus; and (3) water associated with norovirus. Our findings emphasize the need to track the sources and contamination pathways of foodborne pathogens for more effective risk management.
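A minimal correspondence analysis of a place-by-pathogen table can be sketched via the SVD of standardized residuals; the contingency counts below are invented toy data, not the Korean surveillance data.

```python
import numpy as np

# Minimal correspondence analysis sketch: SVD of the standardized residuals
# of a place-by-pathogen contingency table (all counts invented).

N = np.array([[30.0, 5.0, 2.0],    # rows: e.g. restaurant, school, home
              [10.0, 25.0, 3.0],   # cols: e.g. Salmonella, S. aureus, norovirus
              [4.0,  6.0, 20.0]])
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = (U * sv) / np.sqrt(r)[:, None]         # principal row coordinates
inertia = sv ** 2
print("share of inertia on axis 1:", inertia[0] / inertia.sum())
```

Rows and columns that plot close together in the principal coordinates correspond to place-pathogen associations of the kind the abstract groups into its three monitoring areas.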

  18. Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations

    NASA Astrophysics Data System (ADS)

    Schott, Katharina; Beck, Roman; Gregory, Robert Wayne

    Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. IS-related services are increasingly provided not only from different geographical sites simultaneously but also by multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory in-depth single-case study design as the research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and analysis we adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the effort required to manage global outsourcing projects with multiple vendors depends, among other things, on the maturity of the cooperation within the vendor portfolio. Furthermore, our data indicate that this interplay maturity is positively influenced by knowledge about the client derived from already existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.

  19. Organic molecules in the Sheepbed Mudstone, Gale Crater, Mars

    PubMed Central

    Freissinet, C; Glavin, D P; Mahaffy, P R; Miller, K E; Eigenbrode, J L; Summons, R E; Brunner, A E; Buch, A; Szopa, C; Archer, P D; Franz, H B; Atreya, S K; Brinckerhoff, W B; Cabane, M; Coll, P; Conrad, P G; Des Marais, D J; Dworkin, J P; Fairén, A G; François, P; Grotzinger, J P; Kashyap, S; ten Kate, I L; Leshin, L A; Malespin, C A; Martin, M G; Martin-Torres, F J; McAdam, A C; Ming, D W; Navarro-González, R; Pavlov, A A; Prats, B D; Squyres, S W; Steele, A; Stern, J C; Sumner, D Y; Sutter, B; Zorzano, M-P

    2015-01-01

    The Sample Analysis at Mars (SAM) instrument on board the Mars Science Laboratory Curiosity rover is designed to conduct inorganic and organic chemical analyses of the atmosphere and the surface regolith and rocks to help evaluate the past and present habitability potential of Mars at Gale Crater. Central to this task is the development of an inventory of any organic molecules present to elucidate processes associated with their origin, diagenesis, concentration, and long-term preservation. This will guide the future search for biosignatures. Here we report the definitive identification of chlorobenzene (150–300 parts per billion by weight (ppbw)) and C2 to C4 dichloroalkanes (up to 70 ppbw) with the SAM gas chromatograph mass spectrometer (GCMS) and detection of chlorobenzene in the direct evolved gas analysis (EGA) mode, in multiple portions of the fines from the Cumberland drill hole in the Sheepbed mudstone at Yellowknife Bay. When combined with GCMS and EGA data from multiple scooped and drilled samples, blank runs, and supporting laboratory analog studies, the elevated levels of chlorobenzene and the dichloroalkanes cannot be solely explained by instrument background sources known to be present in SAM. We conclude that these chlorinated hydrocarbons are the reaction products of Martian chlorine and organic carbon derived from Martian sources (e.g., igneous, hydrothermal, atmospheric, or biological) or exogenous sources such as meteorites, comets, or interplanetary dust particles. Key Points: first in situ evidence of nonterrestrial organics in Martian surface sediments; chlorinated hydrocarbons identified in the Sheepbed mudstone by SAM; organics preserved in a sample exposed to ionizing radiation and oxidative conditions. PMID:26690960

  20. Organic molecules in the Sheepbed Mudstone, Gale Crater, Mars.

    PubMed

    Freissinet, C; Glavin, D P; Mahaffy, P R; Miller, K E; Eigenbrode, J L; Summons, R E; Brunner, A E; Buch, A; Szopa, C; Archer, P D; Franz, H B; Atreya, S K; Brinckerhoff, W B; Cabane, M; Coll, P; Conrad, P G; Des Marais, D J; Dworkin, J P; Fairén, A G; François, P; Grotzinger, J P; Kashyap, S; Ten Kate, I L; Leshin, L A; Malespin, C A; Martin, M G; Martin-Torres, F J; McAdam, A C; Ming, D W; Navarro-González, R; Pavlov, A A; Prats, B D; Squyres, S W; Steele, A; Stern, J C; Sumner, D Y; Sutter, B; Zorzano, M-P

    2015-03-01

    The Sample Analysis at Mars (SAM) instrument on board the Mars Science Laboratory Curiosity rover is designed to conduct inorganic and organic chemical analyses of the atmosphere and the surface regolith and rocks to help evaluate the past and present habitability potential of Mars at Gale Crater. Central to this task is the development of an inventory of any organic molecules present to elucidate processes associated with their origin, diagenesis, concentration, and long-term preservation. This will guide the future search for biosignatures. Here we report the definitive identification of chlorobenzene (150-300 parts per billion by weight (ppbw)) and C2 to C4 dichloroalkanes (up to 70 ppbw) with the SAM gas chromatograph mass spectrometer (GCMS) and detection of chlorobenzene in the direct evolved gas analysis (EGA) mode, in multiple portions of the fines from the Cumberland drill hole in the Sheepbed mudstone at Yellowknife Bay. When combined with GCMS and EGA data from multiple scooped and drilled samples, blank runs, and supporting laboratory analog studies, the elevated levels of chlorobenzene and the dichloroalkanes cannot be solely explained by instrument background sources known to be present in SAM. We conclude that these chlorinated hydrocarbons are the reaction products of Martian chlorine and organic carbon derived from Martian sources (e.g., igneous, hydrothermal, atmospheric, or biological) or exogenous sources such as meteorites, comets, or interplanetary dust particles. Key points: first in situ evidence of nonterrestrial organics in Martian surface sediments; chlorinated hydrocarbons identified in the Sheepbed mudstone by SAM; organics preserved in a sample exposed to ionizing radiation and oxidative conditions.

  1. A solution to the water resources crisis in wetlands: development of a scenario-based modeling approach with uncertain features.

    PubMed

    Lv, Ying; Huang, Guohe; Sun, Wei

    2013-01-01

    A scenario-based interval two-phase fuzzy programming (SITF) method was developed for water resources planning in a wetland ecosystem. The SITF approach incorporates two-phase fuzzy programming, interval mathematical programming, and scenario analysis within a general framework. It can tackle fuzzy and interval uncertainties in terms of cost coefficients, resource availabilities, water demands, hydrological conditions, and other parameters within a multi-source supply and multi-sector consumption context. The SITF method has the advantage of effectively improving the membership degrees of the system objective and all fuzzy constraints, so that both a higher satisfactory grade of the objective and more efficient utilization of system resources can be guaranteed. Under systematic consideration of water demands by the ecosystem, the SITF method was successfully applied to Baiyangdian Lake, the largest wetland in North China. Multi-source supplies (including the inter-basin water sources of Yuecheng Reservoir and the Yellow River) and multiple water users (including agricultural, industrial and domestic sectors) were taken into account. The results indicated that the SITF approach would generate useful solutions to identify long-term water allocation and transfer schemes under multiple economic, environmental, ecological, and system-security targets. It supports a comparative analysis of the satisfactory degrees of decisions under various policy scenarios. Moreover, it is of significance in quantifying the relationship between hydrological change and human activities, such that a scheme for ecologically sustainable water supply to Baiyangdian Lake can be achieved. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. The Toxicological Evaluation of Realistic Emissions of Source Aerosols Study: Statistical Methods

    PubMed Central

    Coull, Brent A.; Wellenius, Gregory A.; Gonzalez-Flecha, Beatriz; Diaz, Edgar; Koutrakis, Petros; Godleski, John J.

    2013-01-01

    The Toxicological Evaluation of Realistic Emissions of Source Aerosols (TERESA) study involved withdrawal, aging, and atmospheric transformation of emissions of three coal-fired power plants. Toxicological evaluations were carried out in rats exposed to different emission scenarios with extensive exposure characterization. Data generated had multiple levels of resolution: exposure, scenario and constituent chemical composition. Here, we outline a multilayered approach to analyze the associations between exposure and health effects beginning with standard ANOVA models that treat exposure as a categorical variable. The model assessed differences in exposure effects across scenarios (by plant). To assess unadjusted associations between pollutant concentrations and health, univariate analyses were conducted using the difference between the response means under exposed and control conditions and a single constituent concentration as the predictor. Then, a novel multivariate analysis of exposure composition and health was used based on random forests, a recent extension of classification and regression trees that were applied to the outcome differences. For each exposure constituent, this approach yielded a nonparametric measure of the importance of that constituent in predicting differences in response on a given day, controlling for the other measured constituent concentrations in the model. Finally, an R2 analysis compared the relative importance of exposure scenario, plant, and constituent concentrations on each outcome. Peak expiratory flow is used to demonstrate how the multiple levels of the analysis complement each other to assess constituents most strongly associated with health effects. PMID:21913820
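The random-forest importance step can be sketched with scikit-learn on synthetic data; the constituent names and the dependence of the response on the first constituent are assumptions for illustration, not TERESA results.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hedged sketch (synthetic data, not TERESA data): random-forest variable
# importance of the kind described, where constituent concentrations predict
# an exposed-minus-control response difference.
rng = np.random.default_rng(42)
n = 200
X = rng.random((n, 4))                        # 4 hypothetical constituents
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)  # response driven by constituent 0

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["SO4", "EC", "OC", "metals"], rf.feature_importances_):
    print(f"{name}: importance = {imp:.3f}")
```

As in the study, each importance value is a nonparametric measure of a constituent's contribution to predicting the outcome difference, controlling for the other constituents in the model.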

  3. The toxicological evaluation of realistic emissions of source aerosols study: statistical methods.

    PubMed

    Coull, Brent A; Wellenius, Gregory A; Gonzalez-Flecha, Beatriz; Diaz, Edgar; Koutrakis, Petros; Godleski, John J

    2011-08-01

    The Toxicological Evaluation of Realistic Emissions of Source Aerosols (TERESA) study involved withdrawal, aging, and atmospheric transformation of emissions of three coal-fired power plants. Toxicological evaluations were carried out in rats exposed to different emission scenarios with extensive exposure characterization. Data generated had multiple levels of resolution: exposure, scenario, and constituent chemical composition. Here, we outline a multilayered approach to analyze the associations between exposure and health effects beginning with standard ANOVA models that treat exposure as a categorical variable. The model assessed differences in exposure effects across scenarios (by plant). To assess unadjusted associations between pollutant concentrations and health, univariate analyses were conducted using the difference between the response means under exposed and control conditions and a single constituent concentration as the predictor. Then, a novel multivariate analysis of exposure composition and health was used based on Random Forests™, a recent extension of classification and regression trees that were applied to the outcome differences. For each exposure constituent, this approach yielded a nonparametric measure of the importance of that constituent in predicting differences in response on a given day, controlling for the other measured constituent concentrations in the model. Finally, an R2 analysis compared the relative importance of exposure scenario, plant, and constituent concentrations on each outcome. Peak expiratory flow (PEF) is used to demonstrate how the multiple levels of the analysis complement each other to assess constituents most strongly associated with health effects.

  4. [Work-family conflict in call center].

    PubMed

    Ghislieri, Chiara; Ricotta, Simona; Colombo, Lara

    2012-01-01

    The working environment of call centers, which have seen significant growth in recent years, has been the subject of several studies aiming at understanding its specific dynamics, with particular attention to the possible causes of stress and discomfort. Although work-family conflict is considered a source of stress responsible for undermining workers' well-being, and as such has been explored in many work environments, there is still very little research specific to call centers. This study had the following aims: to explore work-family conflict as perceived by call-center operators, taking account of any differences related to respondents' professional and personal characteristics; and to understand which demands and resources can have an impact on work-family conflict in this context. The study was carried out on a sample of 898 call center operators in a telecommunications company through the administration of a self-reporting questionnaire. Data analysis included: t-test, one-way analysis of variance, linear correlations and multiple regressions. A higher perception of work-family conflict was observed among workers having a full-time contract compared to those having part-time contracts. Multiple regression analysis identified as sources of influence on work-family conflict: emotional dissonance, uneasiness due to customer dissatisfaction, workload, avoidance coping and working hours. Work-family conflict in the context studied is not particularly critical: it is in part influenced by the professional and personal characteristics of respondents and primarily caused by work demands. Managerial implications are discussed, especially with reference to training activities.

  5. Diagnosis of potential stressors adversely affecting benthic invertebrate communities in Greenwich Bay, Rhode Island, USA.

    PubMed

    Pelletier, Marguerite; Ho, Kay; Cantwell, Mark; Perron, Monique; Rocha, Kenneth; Burgess, Robert M; Johnson, Roxanne; Perez, Kenneth; Cardin, John; Charpentier, Michael A

    2017-02-01

    Greenwich Bay is an urbanized embayment of Narragansett Bay potentially impacted by multiple stressors. The present study identified the important stressors affecting Greenwich Bay benthic fauna. First, existing data and information were used to confirm that the waterbody was impaired. Second, the presence of source, stressor, and effect were established. Then linkages between source, stressor, and effect were developed. This allows identification of probable stressors adversely affecting the waterbody. Three pollutant categories were assessed: chemicals, nutrients, and suspended sediments. This weight of evidence approach indicated that Greenwich Bay was primarily impacted by eutrophication-related stressors. The sediments of Greenwich Bay were carbon enriched and low dissolved oxygen concentrations were commonly seen, especially in the western portions of Greenwich Bay. The benthic community was depauperate, as would be expected under oxygen stress. Although our analysis indicated that contaminant loads in Greenwich Bay were at concentrations where adverse effects might be expected, no toxicity was observed, as a result of high levels of organic carbon in these sediments reducing contaminant bioavailability. Our analysis also indicated that suspended sediment impacts were likely nonexistent for much of the Bay. This analysis demonstrates that the diagnostic procedure was useful to organize and assess the potential stressors impacting the ecological well-being of Greenwich Bay. This diagnostic procedure is useful for management of waterbodies impacted by multiple stressors. Environ Toxicol Chem 2017;36:449-462. © 2016 SETAC.

  6. Aspiring to Spectral Ignorance in Earth Observation

    NASA Astrophysics Data System (ADS)

    Oliver, S. A.

    2016-12-01

    Enabling robust, defensible and integrated decision making in the Era of Big Earth Data requires the fusion of data from multiple and diverse sensor platforms and networks. While the application of standardised global grid systems provides a common spatial analytics framework that facilitates the computationally efficient and statistically valid integration and analysis of these various data sources across multiple scales, there remains the challenge of sensor equivalency, particularly when combining data from different earth observation satellite sensors (e.g. combining Landsat and Sentinel-2 observations). To realise the vision of a sensor-ignorant analytics platform for earth observation, we require automation of spectral matching across the available sensors. Ultimately, the aim is to remove the requirement for the user to possess any sensor knowledge in order to undertake analysis. This paper introduces the concept of spectral equivalence and proposes a methodology through which equivalent bands may be sourced from a set of potential target sensors through application of equivalence metrics and thresholds. A number of parameters can be used to determine whether a pair of spectra are equivalent for the purposes of analysis. A baseline set of thresholds for these parameters, and a systematic way to apply them to relate spectral bands amongst numerous different sensors, is proposed. The base unit for comparison in this work is the relative spectral response. From this input, a determination of what may constitute equivalence can be made by a user, based on their own conceptualisation of equivalence.
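One plausible equivalence metric over relative spectral responses is an area-overlap ratio with a threshold; the metric, the 0.8 threshold, and the Gaussian band shapes below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

# Hypothetical equivalence metric: normalized overlap of two relative
# spectral response (RSR) curves on a common wavelength grid, compared
# against an assumed threshold. Band shapes below are invented.

def trapezoid(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def rsr_overlap(wavelengths, rsr_a, rsr_b):
    inter = trapezoid(np.minimum(rsr_a, rsr_b), wavelengths)
    union = trapezoid(np.maximum(rsr_a, rsr_b), wavelengths)
    return inter / union

wl = np.linspace(600.0, 700.0, 201)  # wavelength grid in nm

def gauss(mu, sig):
    return np.exp(-0.5 * ((wl - mu) / sig) ** 2)

landsat_like = gauss(655.0, 15.0)    # invented band shapes, not real RSRs
sentinel_like = gauss(658.0, 15.0)

score = rsr_overlap(wl, landsat_like, sentinel_like)
print(f"RSR overlap = {score:.2f}; treat as equivalent (threshold 0.8): {score >= 0.8}")
```

A platform built on such a metric could match bands across sensors automatically, which is what would let a user run analysis without any per-sensor knowledge.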

  7. Analysis and Testing of Mobile Wireless Networks

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are increasingly used to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless network segments can increase performance. Wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one segment to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used to design more robust networks which have multiple layers of wireless data communication paths and provide increased throughput overall.
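The RF-domain side of such an analysis can be sketched with a free-space link budget; the transmit power, antenna gains, and receiver sensitivity below are typical 802.11b-era values assumed for illustration, not the paper's measurements.

```python
import math

# Back-of-envelope RF-domain range estimate (all values assumed, not
# measured): increase distance until free-space received power falls
# below an assumed receiver sensitivity.

def free_space_path_loss_db(distance_m, freq_hz):
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_power_dbm = 15.0         # assumed transmit power
antenna_gains_db = 4.0      # assumed combined antenna gain
rx_sensitivity_dbm = -83.0  # assumed receiver sensitivity at 11 Mb/s

d = 1.0
while (tx_power_dbm + antenna_gains_db
       - free_space_path_loss_db(d, 2.437e9)) >= rx_sensitivity_dbm:
    d *= 1.1
print(f"estimated free-space range ≈ {d:.0f} m")
```

Field-test range under real multipath and interference would be shorter than this free-space figure, which is why the paper compares RF predictions against outdoor measurements.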

  8. Measurement and Analysis of Multiple Output Transient Propagation in BJT Analog Circuits

    NASA Astrophysics Data System (ADS)

    Roche, Nicolas J.-H.; Khachatrian, A.; Warner, J. H.; Buchner, S. P.; McMorrow, D.; Clymer, D. A.

    2016-08-01

    The propagation of Analog Single Event Transients (ASETs) to multiple outputs of Bipolar Junction Transistor (BJT) Integrated Circuits (ICs) is reported for the first time. The results demonstrate that ASETs can appear at several outputs of a BJT amplifier or comparator as a result of a single ion or single laser pulse strike at a single physical location on the chip of a large-scale integrated BJT analog circuit. This is independent of interconnect cross-talk or charge-sharing effects. Laser experiments, together with SPICE simulations and analysis of the ASET's propagation in the s-domain, are used to explain how multiple-output transients (MOTs) are generated and propagate in the device. This study demonstrates that both the charge collection associated with an ASET and the ASET's shape, commonly used to characterize the propagation of SETs in devices and systems, are unable to explain quantitatively how MOTs propagate through an integrated analog circuit. The analysis methodology adopted here involves combining the Fourier transform of the propagating signal and the current-source transfer function in the s-domain. This approach reveals the mechanisms involved in the transient signal propagation from its point of generation to one or more outputs without the signal following a continuous interconnect path.

  9. Combinatorial Fusion Analysis for Meta Search Information Retrieval

    NASA Astrophysics Data System (ADS)

    Hsu, D. Frank; Taksa, Isak

    Leading commercial search engines are built as single event systems. In response to a particular search query, the search engine returns a single list of ranked search results. To find more relevant results the user must frequently try several other search engines. A meta search engine was developed to enhance the process of multi-engine querying. The meta search engine queries several engines at the same time and fuses individual engine results into a single search results list. The fusion of multiple search results has been shown (mostly experimentally) to be highly effective. However, the question of why and how the fusion should be done still remains largely unanswered. In this chapter, we utilize the combinatorial fusion analysis proposed by Hsu et al. to analyze combination and fusion of multiple sources of information. A rank/score function is used in the design and analysis of our framework. The framework provides a better understanding of the fusion phenomenon in information retrieval. For example, to improve the performance of the combined multiple scoring systems, it is necessary that each of the individual scoring systems has relatively high performance and the individual scoring systems are diverse. Additionally, we illustrate various applications of the framework using two examples from the information retrieval domain.
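The rank/score-function idea can be sketched on toy data: the same two scoring systems can be fused by averaging scores or by averaging ranks, and the two combinations need not agree. All scores below are invented.

```python
# Toy sketch (invented scores) of combining two scoring systems by score and
# by rank, the rank/score-function view used in combinatorial fusion analysis.

def rank_of(scores):
    # rank/score function: map each document to its rank under one scoring system
    order = sorted(scores, key=scores.get, reverse=True)
    return {doc: r for r, doc in enumerate(order, start=1)}

engine_a = {"d1": 0.9, "d2": 0.8, "d3": 0.1}
engine_b = {"d1": 0.6, "d2": 0.1, "d3": 0.7}

# score combination: average the (assumed comparable) scores
score_fused = {d: (engine_a[d] + engine_b[d]) / 2 for d in engine_a}
# rank combination: average the ranks
ra, rb = rank_of(engine_a), rank_of(engine_b)
rank_fused = {d: (ra[d] + rb[d]) / 2 for d in engine_a}

print(sorted(score_fused, key=score_fused.get, reverse=True))  # ['d1', 'd2', 'd3']
print(sorted(rank_fused, key=rank_fused.get))                  # ['d1', 'd3', 'd2']
```

The divergence between the two fused orderings on the same inputs is exactly the kind of behavior the rank/score framework is designed to analyze, alongside the requirement that the component systems be individually strong and mutually diverse.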

  10. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provides a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pollutants.

  11. Spatio-Temporal Data Model for Integrating Evolving Nation-Level Datasets

    NASA Astrophysics Data System (ADS)

    Sorokine, A.; Stewart, R. N.

    2017-10-01

    Ability to easily combine the data from diverse sources in a single analytical workflow is one of the greatest promises of Big Data technologies. However, such integration is often challenging as datasets originate from different vendors, governments, and research communities, which results in multiple incompatibilities of data representations, formats, and semantics. Semantic differences are the hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by differences in how boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving the identity of geographic entities over time by defining criteria for an entity's existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). The proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. A practical implementation of our model is demonstrated using a PostgreSQL object-relational database with temporal, geospatial, and NoSQL database extensions.
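An event-based entity model of this kind can be sketched in a few lines; the class, event vocabulary, and alias mapping below are hypothetical simplifications, not the schema of the PostgreSQL implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an event-based model for evolving geographic units:
# an entity's identity persists through a history of events, and per-dataset
# aliases map it onto each source's representation. Event names are invented.

@dataclass
class GeoEntity:
    name: str
    events: list = field(default_factory=list)   # (year, event_type, detail)
    aliases: dict = field(default_factory=dict)  # dataset -> code used there

    def record(self, year, event_type, detail=""):
        self.events.append((year, event_type, detail))

    def exists_in(self, year):
        # entity exists between its 'created' and (optional) 'dissolved' events
        created = min(y for y, e, _ in self.events if e == "created")
        ended = [y for y, e, _ in self.events if e == "dissolved"]
        return created <= year and (not ended or year < min(ended))

de = GeoEntity("Germany", aliases={"WB": "DEU", "ISO": "DE"})
de.record(1990, "created", "unification of FRG and GDR")
print(de.exists_in(1995), de.aliases["ISO"])
```

Existence criteria and alias mappings of this shape are what allow one analytical workflow to join records that different datasets attach to different incarnations of the same country.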

  12. Single photon source with individualized single photon certifications

    NASA Astrophysics Data System (ADS)

    Migdall, Alan L.; Branning, David A.; Castelletto, Stefania; Ware, M.

    2002-12-01

    As currently implemented, single-photon sources cannot be made to produce single photons with high probability, while simultaneously suppressing the probability of yielding two or more photons. Because of this, single photon sources cannot really produce single photons on demand. We describe a multiplexed system that allows the probabilities of producing one and more photons to be adjusted independently, enabling a much better approximation of a source of single photons on demand. The scheme uses a heralded photon source based on parametric downconversion, but by effectively breaking the trigger detector area into multiple regions, we are able to extract more information about a heralded photon than is possible with a conventional arrangement. This scheme allows photons to be produced along with a quantitative 'certification' that they are single photons. Some of the single-photon certifications can be significantly better than what is possible with conventional downconversion sources, as well as being better than faint laser sources. With such a source of more tightly certified single photons, it should be possible to improve the maximum secure bit rate possible over a quantum cryptographic link. We present an analysis of the relative merits of this method over the conventional arrangement.
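Why a faint laser cannot certify single photons can be made concrete with Poisson statistics: suppressing multi-photon emission forces the mean photon number, and hence the one-photon probability, to be small. The numbers below are illustrative only.

```python
import math

# Illustrative numbers only (not the paper's analysis): for a faint laser with
# Poissonian statistics, the probability of emitting more than one photon,
# conditioned on emitting at least one, is fixed by the mean photon number mu.
# Heralding and multiplexing aim to break exactly this trade-off.

def p_multi_given_some(mu):
    p0 = math.exp(-mu)       # probability of zero photons
    p1 = mu * math.exp(-mu)  # probability of exactly one photon
    return (1 - p0 - p1) / (1 - p0)

for mu in (0.1, 0.5, 1.0):
    print(f"mu={mu}: P(n>1 | n>0) = {p_multi_given_some(mu):.3f}")
```

The only way a Poissonian source suppresses multi-photon events is to reduce mu, which also reduces the chance of getting any photon at all; the multiplexed heralded scheme instead adjusts the one-photon and multi-photon probabilities independently.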

  13. PARALLAX AND ORBITAL EFFECTS IN ASTROMETRIC MICROLENSING WITH BINARY SOURCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nucita, A. A.; Paolis, F. De; Ingrosso, G.

    2016-06-01

    In gravitational microlensing, binary systems may act as lenses or sources. Identifying lens binarity is generally easy, in particular in events characterized by caustic crossing, since the resulting light curve exhibits strong deviations from a smooth single-lensing light curve. In contrast, light curves with minor deviations from a Paczyński behavior do not allow one to identify the source binarity. A consequence of gravitational microlensing is the shift of the position of the multiple image centroid with respect to the source star location — the so-called astrometric microlensing signal. When the astrometric signal is considered, the presence of a binary source manifests with a path that largely differs from that expected for single source events. Here, we investigate the astrometric signatures of binary sources taking into account their orbital motion and the parallax effect due to the Earth’s motion, which turn out not to be negligible in most cases. We also show that considering the above-mentioned effects is important in the analysis of astrometric data in order to correctly estimate the lens-event parameters.
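For orientation, the textbook astrometric shift of a point-source, point-lens event (which the paper generalizes to binary sources with parallax and orbital motion) is:

```latex
% Standard point-lens centroid shift, with u the lens-source separation in
% units of the Einstein radius \theta_{\mathrm{E}}:
\[
  \delta\theta(u) \;=\; \frac{u}{u^{2}+2}\,\theta_{\mathrm{E}},
  \qquad
  \delta\theta_{\max} \;=\; \frac{\theta_{\mathrm{E}}}{2\sqrt{2}}
  \quad\text{at}\quad u=\sqrt{2}.
\]
```

For a binary source, the observed centroid is the flux-weighted combination of the shifted centroids of the two components, which is what produces astrometric paths that differ markedly from the single-source case.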

  14. Quantum Theory of Superresolution for Incoherent Optical Imaging

    NASA Astrophysics Data System (ADS)

    Tsang, Mankei

    Rayleigh's criterion for resolving two incoherent point sources has been the most influential measure of optical imaging resolution for over a century. In the context of statistical image processing, violation of the criterion is especially detrimental to the estimation of the separation between the sources, and modern far-field superresolution techniques rely on suppressing the emission of close sources to enhance the localization precision. Using quantum optics, quantum metrology, and statistical analysis, here we show that, even if two close incoherent sources emit simultaneously, measurements with linear optics and photon counting can estimate their separation from the far field almost as precisely as conventional methods do for isolated sources, rendering Rayleigh's criterion irrelevant to the problem. Our results demonstrate that superresolution can be achieved not only for fluorophores but also for stars. Recent progress in generalizing our theory for multiple sources and spectroscopy will also be discussed. This work is supported by the Singapore National Research Foundation under NRF Grant No. NRF-NRFF2011-07 and the Singapore Ministry of Education Academic Research Fund Tier 1 Project R-263-000-C06-112.

  15. Tunable optical frequency comb enabled scalable and cost-effective multiuser orthogonal frequency-division multiple access passive optical network with source-free optical network units.

    PubMed

    Chen, Chen; Zhang, Chongfu; Liu, Deming; Qiu, Kun; Liu, Shuang

    2012-10-01

    We propose and experimentally demonstrate a multiuser orthogonal frequency-division multiple access passive optical network (OFDMA-PON) with source-free optical network units (ONUs), enabled by tunable optical frequency comb generation technology. By cascading a phase modulator (PM) and an intensity modulator and dynamically controlling the peak-to-peak voltage of the PM driving signal, a tunable optical frequency comb source can be generated. It is used to configure an OFDMA-PON enhanced with multiple source-free ONUs, in which simultaneous and interference-free multiuser upstream transmission over a single wavelength can be efficiently supported. The proposed multiuser OFDMA-PON is scalable and cost effective, and its feasibility is successfully verified by experiment.

  16. COHeRE: Cross-Ontology Hierarchical Relation Examination for Ontology Quality Assurance.

    PubMed

    Cui, Licong

    Biomedical ontologies play a vital role in healthcare information management, data integration, and decision support. Ontology quality assurance (OQA) is an indispensable part of the ontology engineering cycle. Most existing OQA methods are based on the knowledge provided within the targeted ontology. This paper proposes a novel cross-ontology analysis method, Cross-Ontology Hierarchical Relation Examination (COHeRE), to detect inconsistencies and possible errors in hierarchical relations across multiple ontologies. COHeRE leverages the Unified Medical Language System (UMLS) knowledge source and the MapReduce cloud computing technique for systematic, large-scale ontology quality assurance work. COHeRE consists of three main steps with the UMLS concepts and relations as the input. First, the relations claimed in source vocabularies are filtered and aggregated for each pair of concepts. Second, inconsistent relations are detected if a concept pair is related by different types of relations in different source vocabularies. Finally, the uncovered inconsistent relations are put to a vote according to their number of occurrences across the source vocabularies. The voting result, together with the inconsistent relations, serves as the output of COHeRE for possible ontological change, and the highest votes provide an initial suggestion on how such inconsistencies might be fixed. In UMLS, 138,987 concept pairs were found to have inconsistent relationships across multiple source vocabularies. Forty inconsistent concept pairs involving hierarchical relationships were randomly selected and manually reviewed by a human expert. 95.8% of the inconsistent relations involved in these concept pairs indeed exist in their source vocabularies rather than being introduced by mistake in the UMLS integration process, and the expert agreed with the suggested relationship for 73.7% of the concept pairs. The effectiveness of COHeRE indicates that UMLS provides a promising environment for enhancing the quality of biomedical ontologies through cross-ontology examination.
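    The aggregate/detect/vote pipeline described above can be sketched in a few lines (toy concept pairs and vocabulary names are invented; the real system operates on UMLS data via MapReduce):

```python
from collections import Counter, defaultdict

# (concept_1, concept_2, claimed_relation, source_vocabulary) -- invented data
claims = [
    ("C1", "C2", "parent", "VOCAB_A"),
    ("C1", "C2", "parent", "VOCAB_B"),
    ("C1", "C2", "sibling", "VOCAB_C"),   # disagrees with A and B
    ("C3", "C4", "parent", "VOCAB_A"),
]

# Step 1: aggregate the claimed relations for each concept pair
by_pair = defaultdict(Counter)
for c1, c2, rel, src in claims:
    by_pair[(c1, c2)][rel] += 1

# Steps 2-3: flag pairs whose sources disagree, then vote by occurrence count
for pair, votes in by_pair.items():
    if len(votes) > 1:                    # inconsistent across vocabularies
        suggested, count = votes.most_common(1)[0]
        print(pair, dict(votes), "suggest:", suggested)
```

    The highest vote count plays the role of the paper's "initial suggestion" for how an inconsistency might be resolved.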

  17. Characterization and quantification of suspended sediment sources to the Manawatu River, New Zealand.

    PubMed

    Vale, S S; Fuller, I C; Procter, J N; Basher, L R; Smith, I E

    2016-02-01

    Knowledge of sediment movement throughout a catchment is essential because of its influence on the character and form of the landscape, with consequences for agricultural productivity and ecological health. Sediment fingerprinting is a well-used tool for evaluating sediment sources within a fluvial catchment, but it still faces uncertainty when applied to large catchments with a complex arrangement of sources. Sediment fingerprinting was applied to the Manawatu River catchment to differentiate 8 geological and geomorphological sources: Mudstone, Hill Subsurface, Hill Surface, Channel Bank, Mountain Range, Gravel Terrace, Loess and Limestone. Geochemical analysis was conducted using XRF and LA-ICP-MS. Geochemical concentrations were analysed using Discriminant Function Analysis and sediment un-mixing models. Two mixing models were used in conjunction with GRG non-linear and Evolutionary optimization methods for comparison. Discriminant Function Analysis required 16 variables to correctly classify 92.6% of sediment sources. Geological explanations were found for some of the variables selected, although mineralogical information is needed to confirm the causes of the geochemical signatures. Consistent source estimates were achieved between models, with the optimization techniques providing globally optimal solutions for sediment quantification. Sediment was attributed primarily to Mudstone, ≈38-46%; followed by the Mountain Range, ≈15-18%; Hill Surface, ≈12-16%; Hill Subsurface, ≈9-11%; Loess, ≈9-15%; Gravel Terrace, ≈0-4%; Channel Bank, ≈0-5%; and Limestone, ≈0%. This source apportionment fits the conceptual understanding of the catchment, in which soft sedimentary mudstone is recognized as highly susceptible to erosion. The processes responsible for sediment generation can be inferred where they have a clear relationship with the geomorphology, but inference is problematic for processes that occur across multiple terrains. Copyright © 2015 Elsevier B.V. All rights reserved.
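    An un-mixing model of this kind can be sketched as a constrained least-squares problem (tracer values and the three-source reduction below are invented for illustration; a coarse grid search over the proportion simplex stands in for the study's GRG non-linear and Evolutionary optimizers):

```python
# tracer concentrations (two tracers) per source -- invented values
sources = {
    "Mudstone":       (120.0, 8.0),
    "Mountain Range": (60.0, 20.0),
    "Loess":          (90.0, 15.0),
}
mixture = (100.0, 12.0)        # observed downstream sediment (invented)

names = list(sources)
best = None
steps = 100                    # grid resolution over the proportion simplex
for i in range(steps + 1):
    for j in range(steps + 1 - i):
        p = (i / steps, j / steps, (steps - i - j) / steps)
        err = sum(
            (mixture[t] - sum(p[s] * sources[names[s]][t] for s in range(3))) ** 2
            for t in range(2)
        )
        if best is None or err < best[0]:
            best = (err, p)

err, props = best
print({name: round(v, 2) for name, v in zip(names, props)}, "misfit:", round(err, 3))
```

    The non-negativity and sum-to-one constraints are enforced by construction of the grid, which is what distinguishes an un-mixing model from an unconstrained regression.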

  18. Tracking iron in multiple sclerosis: a combined imaging and histopathological study at 7 Tesla

    PubMed Central

    Hametner, Simon; Yao, Bing; van Gelderen, Peter; Merkle, Hellmut; Cantor, Fredric K.; Lassmann, Hans; Duyn, Jeff H.

    2011-01-01

    Previous authors have shown that the transverse relaxivity R2* and frequency shifts that characterize gradient echo signal decay in magnetic resonance imaging are closely associated with the distribution of iron and myelin in the brain's white matter. In multiple sclerosis, iron accumulation in brain tissue may reflect a multiplicity of pathological processes. Hence, iron may have the unique potential to serve as an in vivo magnetic resonance imaging tracer of disease pathology. To investigate the ability of iron in tracking multiple sclerosis-induced pathology by magnetic resonance imaging, we performed qualitative histopathological analysis of white matter lesions and normal-appearing white matter regions with variable appearance on gradient echo magnetic resonance imaging at 7 Tesla. The samples used for this study derive from two patients with multiple sclerosis and one non-multiple sclerosis donor. Magnetic resonance images were acquired using a whole body 7 Tesla magnetic resonance imaging scanner equipped with a 24-channel receive-only array designed for tissue imaging. A 3D multi-gradient echo sequence was obtained and quantitative R2* and phase maps were reconstructed. Immunohistochemical stainings for myelin and oligodendrocytes, microglia and macrophages, ferritin and ferritin light polypeptide were performed on 3- to 5-µm thick paraffin sections. Iron was detected with Perl's staining and 3,3′-diaminobenzidine-tetrahydrochloride enhanced Turnbull blue staining. In multiple sclerosis tissue, iron presence invariably matched with an increase in R2*. Conversely, R2* increase was not always associated with the presence of iron on histochemical staining. We interpret this finding as the effect of embedding, sectioning and staining procedures. These processes likely affected the histopathological analysis results but not the magnetic resonance imaging that was obtained before tissue manipulations. Several cellular sources of iron were identified. 
These sources included oligodendrocytes in normal-appearing white matter and activated macrophages/microglia at the edges of white matter lesions. Additionally, in white matter lesions, iron precipitation in aggregates typical of microbleeds was shown by the Perl's staining. Our combined imaging and pathological study shows that multi-gradient echo magnetic resonance imaging is a sensitive technique for the identification of iron in the brain tissue of patients with multiple sclerosis. However, magnetic resonance imaging-identified iron does not necessarily reflect pathology and may also be seen in apparently normal tissue. Iron identification by multi-gradient echo magnetic resonance imaging in diseased tissues can shed light on the pathological processes when coupled with topographical information and patient disease history. PMID:22171355

  19. Engaging in science inquiry: Prospective elementary teachers' learning in an innovative life science course

    NASA Astrophysics Data System (ADS)

    Haefner, Leigh Boardman

    2001-10-01

    This study examined prospective elementary teachers' learning about science inquiry in the context of an innovative life science course that engaged them in an original science investigation. Eleven elementary education majors participated in the study. A multiple case study approach that was descriptive, interpretive, and framed by grounded theory was employed. Primary data sources included transcripts of semi-structured interviews, text associated with online threaded discussions, and course project documents, such as lesson plans and written reflections. Secondary data sources included videotaped class sessions and field notes. Data were analyzed using analytical induction techniques, and trustworthiness was developed through the use of multiple data sources, triangulation of data, and the use of counterexamples to the assertions. Three major findings emerged from the cross-case analysis. First, engaging in an original science investigation assisted prospective teachers in becoming more attentive to the processes of science and developing more elaborated and data-driven explanations of how science is practiced. Second, when prospective teachers struggled with particular aspects of their investigations, those aspects became foci of change in their thinking about science and doing science. Third, as prospective teachers came to place a greater emphasis on questions, observations, and experimentation as fundamental aspects of doing science, they became more accepting of approaches to teaching science that encourage children's questions about science phenomena. Implications include the need to re-conceptualize teacher preparation programs to include multiple opportunities to engage prospective teachers in learning science as inquiry, and attend to connections among subject matter knowledge, subject-specific pedagogy and experiences with children.

  20. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases

    PubMed Central

    Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-01-01

    Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. The patients and observations excluded reflected identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria using the protocol, and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757

  1. Stability of Language in Childhood: A Multi-Age, -Domain, -Measure, and -Source Study

    PubMed Central

    Bornstein, Marc H.; Putnick, Diane L.

    2011-01-01

    The stability of language across childhood is traditionally assessed by exploring longitudinal relations between individual language measures. However, language encompasses many domains and varies with different sources (child speech, parental report, experimenter assessment). This study evaluated individual variation in multiple age-appropriate measures of child language derived from multiple sources and stability between their latent variables in 192 young children across more than 2 years. Structural equation modeling demonstrated the loading of multiple measures of child language from different sources on single latent variables of language at ages 20 and 48 months. A large stability coefficient (r = .84) obtained between the 2 language latent variables. This stability obtained even when accounting for family socioeconomic status, maternal verbal intelligence, education, speech, and tendency to respond in a socially desirable fashion, and child social competence. Stability was also equivalent for children in diverse childcare situations and for girls and boys. Across age, from the beginning of language acquisition to just before school entry, aggregating multiple age-appropriate methods and measures at each age and multiple reporters, children show strong stability of individual differences in general language development. PMID:22004343

  2. Transfer functions of double- and multiple-cavity Fabry-Perot filters driven by Lorentzian sources.

    PubMed

    Marti, J; Capmany, J

    1996-12-20

    We derive expressions for the transfer functions of double- and multiple-cavity Fabry-Perot filters driven by laser sources with Lorentzian spectrum. These are of interest because of their applications in sensing and channel filtering in optical frequency-division multiplexing networks.
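    The effect described can be illustrated numerically (our sketch, not the paper's closed-form derivation; the contrast term and linewidth below are assumed values): the effective transfer function of a single-cavity Fabry-Perot probed by a Lorentzian-linewidth source is the ideal Airy transmission convolved with the source lineshape, which lowers and broadens the resonances.

```python
import numpy as np

F = 50.0                                # Airy contrast term (assumed)
nu = np.linspace(0, 2, 4001)            # frequency in units of the free spectral range
airy = 1.0 / (1.0 + F * np.sin(np.pi * nu) ** 2)   # ideal single-cavity transmission

gamma = 0.02                            # source half-width (HWHM) in FSR units (assumed)
dnu = nu[1] - nu[0]
x = np.linspace(-1, 1, 4001)
kernel = (gamma / np.pi) / (x ** 2 + gamma ** 2) * dnu
kernel /= kernel.sum()                  # normalize the truncated Lorentzian lineshape

eff = np.convolve(airy, kernel, mode="same")       # effective transfer function
print(round(airy.max(), 3), round(eff.max(), 3))   # peak lowered by broadening
```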

  3. Transfer functions of double- and multiple-cavity Fabry Perot filters driven by Lorentzian sources

    NASA Astrophysics Data System (ADS)

    Marti, Javier; Capmany, Jose

    1996-12-01

    We derive expressions for the transfer functions of double- and multiple-cavity Fabry Perot filters driven by laser sources with Lorentzian spectrum. These are of interest because of their applications in sensing and channel filtering in optical frequency-division multiplexing networks.

  4. Qualitative Evaluation Methods in Ethics Education: A Systematic Review and Analysis of Best Practices.

    PubMed

    Watts, Logan L; Todd, E Michelle; Mulhearn, Tyler J; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane

    2017-01-01

    Although qualitative research offers some unique advantages over quantitative research, qualitative methods are rarely employed in the evaluation of ethics education programs and are often criticized for a lack of rigor. This systematic review investigated the use of qualitative methods in studies of ethics education. Following a review of the literature in which 24 studies were identified, each study was coded based on 16 best practices characteristics in qualitative research. General thematic analysis and grounded theory were found to be the dominant approaches used. Researchers are effectively executing a number of best practices, such as using direct data sources, structured data collection instruments, non-leading questioning, and expert raters. However, other best practices were rarely present in the courses reviewed, such as collecting data using multiple sources, methods, raters, and timepoints, evaluating reliability, and employing triangulation analyses to assess convergence. Recommendations are presented for improving future qualitative research studies in ethics education.

  5. Volatile, isotope, and organic analysis of martian fines with the Mars Curiosity rover.

    PubMed

    Leshin, L A; Mahaffy, P R; Webster, C R; Cabane, M; Coll, P; Conrad, P G; Archer, P D; Atreya, S K; Brunner, A E; Buch, A; Eigenbrode, J L; Flesch, G J; Franz, H B; Freissinet, C; Glavin, D P; McAdam, A C; Miller, K E; Ming, D W; Morris, R V; Navarro-González, R; Niles, P B; Owen, T; Pepin, R O; Squyres, S; Steele, A; Stern, J C; Summons, R E; Sumner, D Y; Sutter, B; Szopa, C; Teinturier, S; Trainer, M G; Wray, J J; Grotzinger, J P

    2013-09-27

    Samples from the Rocknest aeolian deposit were heated to ~835°C under helium flow and evolved gases analyzed by Curiosity's Sample Analysis at Mars instrument suite. H2O, SO2, CO2, and O2 were the major gases released. Water abundance (1.5 to 3 weight percent) and release temperature suggest that H2O is bound within an amorphous component of the sample. Decomposition of fine-grained Fe or Mg carbonate is the likely source of much of the evolved CO2. Evolved O2 is coincident with the release of Cl, suggesting that oxygen is produced from thermal decomposition of an oxychloride compound. Elevated δD values are consistent with recent atmospheric exchange. Carbon isotopes indicate multiple carbon sources in the fines. Several simple organic compounds were detected, but they are not definitively martian in origin.

  6. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    PubMed

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature, and no generic approaches have been published on how to link heterogeneous health data. Our approach was a literature review, followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  7. In Silico Gene Prioritization by Integrating Multiple Data Sources

    PubMed Central

    Zhou, Yingyao; Shields, Robert; Chanda, Sumit K.; Elston, Robert C.; Li, Jing

    2011-01-01

    Identifying disease genes is crucial to the understanding of disease pathogenesis, and to the improvement of disease diagnosis and treatment. In recent years, many researchers have proposed approaches to prioritize candidate genes by considering the relationship of candidate genes and existing known disease genes, reflected in other data sources. In this paper, we propose an expandable framework for gene prioritization that can integrate multiple heterogeneous data sources by taking advantage of a unified graphic representation. Gene-gene relationships and gene-disease relationships are then defined based on the overall topology of each network using a diffusion kernel measure. These relationship measures are in turn normalized to derive an overall measure across all networks, which is utilized to rank all candidate genes. Based on the informativeness of available data sources with respect to each specific disease, we also propose an adaptive threshold score to select a small subset of candidate genes for further validation studies. We performed large-scale cross-validation analysis on 110 disease families using three data sources. Results show that our approach consistently outperforms two other state-of-the-art programs. A case study using Parkinson disease (PD) has identified four candidate genes (UBB, SEPT5, GPR37 and TH) that ranked higher than our adaptive threshold, all of which are involved in the PD pathway. In particular, a very recent study has observed a deletion of TH in a patient with PD, which supports the importance of the TH gene in PD pathogenesis. A web tool has been implemented to assist scientists in their genetic studies. PMID:21731658
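    The diffusion kernel measure mentioned above can be illustrated on a toy network (the graph, seed gene, and beta value are invented; the framework itself operates across several real networks): compute K = exp(-beta * L) for graph Laplacian L, then score each candidate by its kernel similarity to the known disease genes.

```python
# toy gene-gene network: nodes and undirected edges (all invented)
genes = ["g0", "g1", "g2", "g3", "g4"]
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2)]
n = len(genes)

# adjacency matrix and graph Laplacian L = D - A
A = [[0.0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = A[j][i] = 1.0
L = [[(sum(A[i]) if i == j else 0.0) - A[i][j] for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# diffusion kernel K = exp(-beta * L), via a truncated Taylor series
beta, terms = 0.5, 20
M = [[-beta * L[i][j] for j in range(n)] for i in range(n)]
K = [[float(i == j) for j in range(n)] for i in range(n)]    # M^0 = I
term = [row[:] for row in K]
for t in range(1, terms):
    term = [[v / t for v in row] for row in matmul(term, M)]
    K = [[K[i][j] + term[i][j] for j in range(n)] for i in range(n)]

seeds = [0]                       # index of the known disease gene
scores = {genes[i]: sum(K[i][s] for s in seeds) for i in range(1, n)}
print(sorted(scores, key=scores.get, reverse=True))
```

    Genes closer to the seed in the network diffuse more "heat" from it and therefore rank higher, which is the intuition behind kernel-based prioritization.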

  8. Pitching Flexible Propulsors: Experimental Assessment of Performance Characteristics

    DTIC Science & Technology

    2014-05-09

    velocities pointing in this direction contribute to an overall momentum deficit in the wake, which may be quantitatively related to the drag force on...and explained the source of some of the additional vorticity in the wake of the foil that may have otherwise been ignored or treated as noise in the...is conducted through reduction of the measured force and torque data and multiple wake flow analysis techniques, including particle image

  9. Preliminary Analysis of Fluctuations in the Received Uplink-Beacon-Power Data Obtained From the GOLD Experiments

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Wilson, K. E.; Lesh, J. R.

    1996-01-01

    Uplink data from recent free-space optical communication experiments carried out between the Table Mountain Facility and the Japanese Engineering Test Satellite are used to study fluctuations caused by beam propagation through the atmosphere. The influence of atmospheric scintillation, beam wander and jitter, and multiple uplink beams on the statistics of power received by the satellite is analyzed and compared to experimental data. Preliminary analysis indicates the received signal obeys an approximate lognormal distribution, as predicted by the weak-turbulence model, but further characterization of other sources of fluctuations is necessary for accurate link predictions.

  10. Joint principal trend analysis for longitudinal high-dimensional data.

    PubMed

    Zhang, Yuping; Ouyang, Zhengqing

    2018-06-01

    We consider a research scenario motivated by integrating multiple sources of information for better knowledge discovery in diverse dynamic biological processes. Given two longitudinal high-dimensional datasets for a group of subjects, we want to extract shared latent trends and identify relevant features. To solve this problem, we present a new statistical method named joint principal trend analysis (JPTA). We demonstrate the utility of JPTA through simulations and applications to gene expression data of the mammalian cell cycle and longitudinal transcriptional profiling data in response to influenza viral infections. © 2017, The International Biometric Society.

  11. Inter-noise 89 - Engineering for environmental noise control; Proceedings of the International Conference on Noise Control Engineering, Newport Beach, CA, Dec. 4-6, 1989. Vols. 1 & 2

    NASA Astrophysics Data System (ADS)

    Maling, George C., Jr.

    Recent advances in noise analysis and control theory and technology are discussed in reviews and reports. Topics addressed include noise generation; sound-wave propagation; noise control by external treatments; vibration and shock generation, transmission, isolation, and reduction; multiple sources and paths of environmental noise; noise perception and the physiological and psychological effects of noise; instrumentation, signal processing, and analysis techniques; and noise standards and legal aspects. Diagrams, drawings, graphs, photographs, and tables of numerical data are provided.

  12. Preliminary analysis of fluctuations in the received uplink-beacon-power data obtained from the GOLD experiments

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Wilson, K. E.; Lesh, J. R.

    1996-01-01

    Uplink data from recent free-space optical communication experiments carried out between the Table Mountain Facility and the Japanese Engineering Test Satellite are used to study fluctuations caused by beam propagation through the atmosphere. The influence of atmospheric scintillation, beam wander and jitter, and multiple uplink beams on the statistics of power received by the satellite is analyzed and compared to experimental data. Preliminary analysis indicates the received signal obeys an approximate lognormal distribution, as predicted by the weak-turbulence model, but further characterization of other sources of fluctuations is necessary for accurate link predictions.

  13. Large-region acoustic source mapping using a movable array and sparse covariance fitting.

    PubMed

    Zhao, Shengkui; Tuna, Cagdas; Nguyen, Thi Ngoc Tho; Jones, Douglas L

    2017-01-01

    Large-region acoustic source mapping is important for city-scale noise monitoring. Approaches using a single-position measurement scheme to scan large regions using small arrays cannot provide clean acoustic source maps, while deploying large arrays spanning the entire region of interest is prohibitively expensive. A multiple-position measurement scheme is applied to scan large regions at multiple spatial positions using a movable array of small size. Based on the multiple-position measurement scheme, a sparse-constrained multiple-position vectorized covariance matrix fitting approach is presented. In the proposed approach, the overall sample covariance matrix of the incoherent virtual array is first estimated using the multiple-position array data and then vectorized using the Khatri-Rao (KR) product. A linear model is then constructed for fitting the vectorized covariance matrix and a sparse-constrained reconstruction algorithm is proposed for recovering source powers from the model. The user parameter settings are discussed. The proposed approach is tested on a 30 m × 40 m region and a 60 m × 40 m region using simulated and measured data. Much cleaner acoustic source maps and lower sound pressure level errors are obtained compared to the beamforming approaches and the previous sparse approach [Zhao, Tuna, Nguyen, and Jones, Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Processing (ICASSP) (2016)].
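    The covariance-fitting model can be sketched numerically (our construction with an assumed half-wavelength uniform linear array and plain least squares on a small grid; the paper's sparsity constraint and multiple-position virtual array are omitted): the sample covariance is vectorized and fitted to a Khatri-Rao product model, vec(R) = (conj(A) ⊙ A) p + σ² vec(I).

```python
import numpy as np

m, grid = 8, 12                                  # sensors, candidate directions
angles = np.linspace(-60, 60, grid) * np.pi / 180
pos = np.arange(m)[:, None]                      # half-wavelength ULA positions
A = np.exp(1j * np.pi * pos * np.sin(angles))    # m x grid steering matrix

p_true = np.zeros(grid)
p_true[[3, 9]] = [1.0, 0.5]                      # two on-grid source powers
sigma2 = 0.1
R = (A * p_true) @ A.conj().T + sigma2 * np.eye(m)   # exact covariance

# column-wise Khatri-Rao product: KR[:, k] = kron(conj(a_k), a_k)
KR = np.stack([np.kron(A[:, k].conj(), A[:, k]) for k in range(grid)], axis=1)
Mfit = np.hstack([KR, np.eye(m).reshape(-1, 1)]) # last column models sigma^2
r = R.reshape(-1, order="F")                     # vec(R), column stacking

x = np.linalg.lstsq(Mfit, r, rcond=None)[0]
p_hat, s2_hat = x[:grid].real, x[grid].real
print(np.flatnonzero(p_hat > 0.05), round(s2_hat, 3))
```

    With a larger grid the Khatri-Rao columns become rank-deficient, which is why the paper replaces plain least squares with a sparse-constrained reconstruction.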

  14. Deep-level stereoscopic multiple traps of acoustic vortices

    NASA Astrophysics Data System (ADS)

    Li, Yuzhi; Guo, Gepu; Ma, Qingyu; Tu, Juan; Zhang, Dong

    2017-04-01

    Based on the radiation pattern of a planar piston transducer, the mechanisms underlying the generation of axially controllable deep-level stereoscopic multiple traps of acoustic vortices (AV) using sparse directional sources were proposed with explicit formulae. Numerical simulations for the axial and cross-sectional distributions of acoustic pressure and phase were conducted for various ka (product of the wave number and the radius of transducer) values at the frequency of 1 MHz. It was demonstrated that, for bigger ka, besides the main-AV (M-AV) generated by the main lobes of the sources, cone-shaped side-AV (S-AV) produced by the side lobes were closer to the source plane at a relatively lower pressure. Corresponding to the radiation angles of pressure nulls between the main lobe and the side lobes of the sources, vortex valleys with nearly pressure zero could be generated on the central axis to form multiple traps, based on Gor'kov potential theory. The number and locations of vortex valleys could be controlled accurately by the adjustment of ka. With the established eight-source AV generation system, the existence of the axially controllable multiple traps was verified by the measured M-AV and S-AVs as well as the corresponding vortex valleys. The favorable results provided the feasibility of deep-level stereoscopic control of AV and suggested potential application of multiple traps for particle manipulation in the area of biomedical engineering.

  15. Reactive nitrogen oxides in the southeast United States national parks: source identification, origin, and process budget

    NASA Astrophysics Data System (ADS)

    Tong, Daniel Quansong; Kang, Daiwen; Aneja, Viney P.; Ray, John D.

    2005-01-01

    We present in this study both measurement-based and modeling analyses for elucidation of source attribution, influence areas, and process budget of reactive nitrogen oxides at two rural southeast United States sites (Great Smoky Mountains national park (GRSM) and Mammoth Cave national park (MACA)). Availability of nitrogen oxides is considered as the limiting factor to ozone production in these areas and the relative source contribution of reactive nitrogen oxides from point or mobile sources is important in understanding why these areas have high ozone. Using two independent observation-based techniques, multiple linear regression analysis and emission inventory analysis, we demonstrate that point sources contribute a minimum of 23% of total NOy at GRSM and 27% at MACA. The influence areas for these two sites, or origins of nitrogen oxides, are investigated using trajectory-cluster analysis. The result shows that air masses from the West and Southwest sweep over GRSM most frequently, while pollutants transported from the eastern half (i.e., East, Northeast, and Southeast) have limited influence (<10% out of all air masses) on air quality at GRSM. The processes responsible for formation and removal of reactive nitrogen oxides are investigated using a comprehensive 3-D air quality model (Multiscale Air Quality SImulation Platform (MAQSIP)). The NOy contribution associated with chemical transformations to NOz and O3, based on process budget analysis, is as follows: 32% and 84% for NOz, and 26% and 80% for O3 at GRSM and MACA, respectively. The similarity between NOz and O3 process budgets suggests a close association between nitrogen oxides and effective O3 production at these rural locations.
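    The observation-based attribution step can be sketched with a regression on tracer species (all data below are synthetic, and the choice of SO2 as a point-source tracer and CO as a mobile-source tracer is our illustrative assumption, not the paper's exact regression): each source's share of NOy is estimated from the mean fitted contribution of its tracer.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
so2 = rng.gamma(2.0, 1.5, n)          # ppb; assumed point-source (coal) tracer
co = rng.gamma(3.0, 60.0, n)          # ppb; assumed mobile-source tracer
noy = 0.5 + 0.9 * so2 + 0.012 * co + rng.normal(0.0, 0.3, n)   # synthetic NOy

X = np.column_stack([np.ones(n), so2, co])
b = np.linalg.lstsq(X, noy, rcond=None)[0]    # [intercept, SO2 slope, CO slope]

point = b[1] * so2.mean()                     # mean fitted point-source NOy
mobile = b[2] * co.mean()                     # mean fitted mobile-source NOy
total = noy.mean()
print(f"point {100 * point / total:.0f}%  mobile {100 * mobile / total:.0f}%")
```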

  16. Development of a novel method for unraveling the origin of natron flux used in Roman glass production based on B isotopic analysis via multicollector inductively coupled plasma mass spectrometry.

    PubMed

    Devulder, Veerle; Degryse, Patrick; Vanhaecke, Frank

    2013-12-17

    The provenance of the flux raw material used in the manufacturing of Roman glass is an understudied topic in archaeology. Whether one or multiple sources of natron mineral salts were exploited during this period is still open for debate, largely because of the lack of a good provenance indicator. The flux is the major source of B in Roman glass. Therefore, B isotopic analysis of a sufficiently large collection and variety (origin and age) of such glass samples might give an indication of the number of flux sources used. For this purpose, a method based on acid digestion, chromatographic B isolation and B isotopic analysis using multicollector inductively coupled plasma mass spectrometry was developed. B isolation was accomplished using a combination of strong cation exchange and strong anion exchange chromatography. Although the B fraction was not completely matrix-free, the remaining Sb was shown not to affect the δ¹¹B result. The method was validated using obsidian and archaeological glass samples that were stripped of their B content, after which an isotopic reference material with known B isotopic composition was added. Absence of artificial B isotope fractionation was demonstrated, and the total uncertainty was shown to be <2‰. A proof-of-concept application to natron glass samples showed a narrow range of δ¹¹B, whereas first results for natron salt samples do show a larger difference in δ¹¹B. These results suggest the use of only one natron source or of several sources with similar δ¹¹B. This indicates that B isotopic analysis is a promising tool for the provenance determination of this flux raw material.
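For readers unfamiliar with the delta notation used in this abstract, a minimal sketch follows. The 11B/10B ratio of the NIST SRM 951 boric acid standard (approximately 4.0436) is the conventional reference for boron; the exact value used and the sample ratios are illustrative.

```python
# Delta notation for boron isotope ratios: per-mil deviation of a sample's
# 11B/10B ratio from a reference standard. The SRM 951 ratio here (~4.0436)
# is the commonly cited value; sample ratios below are hypothetical.
R_SRM951 = 4.0436

def delta_11B(r_sample, r_standard=R_SRM951):
    """Per-mil (per thousand) deviation of 11B/10B from the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

print(round(delta_11B(4.1040), 2))  # a hypothetical natron-glass sample
```

A "narrow range" of delta values across samples, as reported above, is then consistent with a single flux source (or several sources of similar isotopic composition).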

  17. Ephus: Multipurpose Data Acquisition Software for Neuroscience Experiments

    PubMed Central

    Suter, Benjamin A.; O'Connor, Timothy; Iyer, Vijay; Petreanu, Leopoldo T.; Hooks, Bryan M.; Kiritani, Taro; Svoboda, Karel; Shepherd, Gordon M. G.

    2010-01-01

    Physiological measurements in neuroscience experiments often involve complex stimulus paradigms and multiple data channels. Ephus (http://www.ephus.org) is an open-source software package designed for general-purpose data acquisition and instrument control. Ephus operates as a collection of modular programs, including an ephys program for standard whole-cell recording with single or multiple electrodes in typical electrophysiological experiments, and a mapper program for synaptic circuit mapping experiments involving laser scanning photostimulation based on glutamate uncaging or channelrhodopsin-2 excitation. Custom user functions allow user-extensibility at multiple levels, including on-line analysis and closed-loop experiments, where experimental parameters can be changed based on recently acquired data, such as during in vivo behavioral experiments. Ephus is compatible with a variety of data acquisition and imaging hardware. This paper describes the main features and modules of Ephus and their use in representative experimental applications. PMID:21960959
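Ephus itself is MATLAB software; the closed-loop pattern its user functions enable (adjusting an experimental parameter based on recently acquired data) can be sketched language-agnostically, here in Python. The proportional update rule, target, and numbers are invented for illustration and are not Ephus APIs.

```python
# Illustrative closed-loop acquisition pattern: after each acquired sweep,
# inspect the data and scale the stimulus amplitude toward a target peak
# response. The rule and constants are hypothetical.

def closed_loop(sweeps, target_peak=5.0, gain=0.25, stim=1.0):
    """Return the stimulus amplitude after each sweep, updated proportionally
    to the miss between the sweep's peak response and the target."""
    history = []
    for data in sweeps:  # each sweep: a list of acquired samples
        peak = max(data)
        stim *= 1.0 + gain * (target_peak - peak) / target_peak
        history.append(round(stim, 3))
    return history

# A response below target drives the stimulus up; above target, down;
# on target, the stimulus is left unchanged.
print(closed_loop([[2.0], [6.0], [5.0]]))
```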

  18. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
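The G-theory quantity the toolbox estimates can be illustrated with the textbook generalizability coefficient for a persons-by-trials design, where error variance shrinks with the number of trials retained for averaging. The variance components below are invented for illustration, not toolbox output.

```python
# Generalizability coefficient for a persons-by-trials design:
# E(rho^2) = var_person / (var_person + var_residual / n_trials),
# where var_residual pools trial-by-person interaction and error variance.

def g_coefficient(var_person, var_residual, n_trials):
    """Proportion of observed-score variance attributable to true
    between-person differences, for an average over n_trials trials."""
    return var_person / (var_person + var_residual / n_trials)

# Reliability improves as more trials are averaged (hypothetical components):
print(round(g_coefficient(4.0, 16.0, 4), 3))   # 0.5
print(round(g_coefficient(4.0, 16.0, 16), 3))  # 0.8
```

This is the sense in which, as the abstract notes, the number of trials retained for averaging directly shapes ERP score reliability.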

  19. Filling Terrorism Gaps: VEOs, Evaluating Databases, and Applying Risk Terrain Modeling to Terrorism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, Ross F.

    2016-08-29

    This paper aims to address three issues: the lack of literature differentiating terrorism and violent extremist organizations (VEOs), terrorism incident databases, and the applicability of Risk Terrain Modeling (RTM) to terrorism. Current open source literature and publicly available government sources do not differentiate between terrorism and VEOs; furthermore, they fail to define them. Addressing the lack of a comprehensive comparison of existing terrorism data sources, a matrix comparing a dozen terrorism databases is constructed, providing insight toward the array of data available. RTM, a method for spatial risk analysis at a micro level, has some applicability to terrorism research, particularly for studies looking at risk indicators of terrorism. Leveraging attack data from multiple databases, combined with RTM, offers one avenue for closing existing research gaps in terrorism literature.
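RTM's core operation, combining rasterized risk-factor layers into a composite risk surface, can be sketched as follows. The layer names, weights, and grid are hypothetical and not drawn from the paper.

```python
# Illustrative risk terrain modeling step: rasterize a study area into cells,
# score each cell on several risk-factor layers (e.g. proximity to transit
# hubs or symbolic targets), and combine them into a composite risk surface.

def composite_risk(layers, weights):
    """Weighted sum of equally-shaped risk layers (lists of cell scores)."""
    n = len(layers[0])
    assert all(len(layer) == n for layer in layers)
    return [sum(w * layer[i] for w, layer in zip(weights, layers))
            for i in range(n)]

transit   = [0.9, 0.2, 0.1, 0.7]  # proximity-to-transit score per cell
landmarks = [0.8, 0.1, 0.0, 0.3]  # proximity-to-landmark score per cell
risk = composite_risk([transit, landmarks], [0.6, 0.4])
hotspot = max(range(len(risk)), key=risk.__getitem__)  # highest-risk cell
```

In practice the layers would come from geocoded incident and infrastructure data, and layer weights would be fit against historical attack locations rather than assigned by hand.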

  20. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on popular open-source projects Docker and rkt frameworks, that allow software to be installed and executed under an isolated and controlled environment. Also, it provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
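As a rough sketch of how a containerized tool of this kind might be driven from a script, the helper below assembles a `docker run` invocation that mounts a host data directory. The image name and tag are illustrative examples, not prescribed by the framework.

```python
import shlex

# Hypothetical wrapper around a containerized bioinformatics tool: build a
# `docker run` command that mounts the host data directory into the container
# and runs the tool there. Image name/tag below are illustrative.

def docker_cmd(image, tool_args, workdir="/data", host_dir="."):
    """Assemble a docker invocation for a containerized tool."""
    return (["docker", "run", "--rm",
             "-v", f"{host_dir}:{workdir}", "-w", workdir, image]
            + shlex.split(tool_args))

cmd = docker_cmd("biocontainers/blast:v2.2.31_cv2", "blastp -version")
print(" ".join(cmd))
```

Composing several such invocations, each tool pinned to a specific image tag, is one way the isolated, versioned environments described above combine into reproducible analysis pipelines.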
