López-Pacheco, María G; Sánchez-Fernández, Luis P; Molina-Lozano, Herón
2014-01-15
Noise levels of common sources such as vehicles, whistles, sirens, car horns and crowd sounds are mixed in urban soundscapes. Nowadays, environmental acoustic analysis is performed on mixture signals recorded by monitoring systems. These mixed signals make individual analysis difficult, even though such analysis is useful for taking actions to reduce and control environmental noise. This paper aims to separate individual noise sources from recorded mixtures in order to evaluate the noise level of each estimated source. A method based on blind deconvolution and blind source separation in the wavelet domain is proposed. This approach provides a basis for improving the results obtained in the monitoring and analysis of common noise sources in urban areas. The method is validated through experiments based on knowledge of the predominant noise sources in urban soundscapes. Actual recordings of common noise sources are used to acquire mixture signals with a microphone array in semi-controlled environments. The developed method demonstrates substantial performance improvements in the identification, analysis and evaluation of common urban noise sources. © 2013 Elsevier B.V. All rights reserved.
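A minimal sketch of the wavelet-domain separation idea (not the authors' exact blind-deconvolution pipeline): estimate an unmixing matrix with FastICA on discrete wavelet coefficients, where acoustic sources tend to be sparser, then apply it to the time-domain mixtures. The toy signals and mixing matrix are illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

def wavelet_domain_bss(mixtures, wavelet="db4", level=4):
    """mixtures: (n_channels, n_samples) array of microphone signals."""
    # Represent each channel by its concatenated DWT coefficients.
    coeffs = np.vstack([np.concatenate(pywt.wavedec(ch, wavelet, level=level))
                        for ch in mixtures])
    # Estimate the unmixing matrix in the wavelet domain ...
    ica = FastICA(n_components=mixtures.shape[0], random_state=0)
    ica.fit(coeffs.T)
    # ... and apply it to the (centered) time-domain mixtures.
    centered = mixtures - mixtures.mean(axis=1, keepdims=True)
    return ica.components_ @ centered

# Toy usage: two synthetic sources observed by two microphones.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
sources = np.vstack([np.sign(np.sin(2 * np.pi * 13 * t)),
                     rng.laplace(size=t.size)])
mixtures = np.array([[1.0, 0.6], [0.4, 1.0]]) @ sources
estimated = wavelet_domain_bss(mixtures)
```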
A Method for the Analysis of Information Use in Source-Based Writing
ERIC Educational Resources Information Center
Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto
2012-01-01
Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to composed texts. The method aims to serve scholars in building a more detailed understanding of how…
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty sources can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, in which the different uncertainty components are represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the reactive flow and transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty source formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental managers and decision-makers when formulating policies and strategies.
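For reference, the conventional single-parameter variance-based (Sobol') analysis that this framework extends can be run with the SALib package; the parameter names, bounds, and model function below are illustrative placeholders, not the paper's groundwater model.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["recharge", "conductivity", "porosity"],
    "bounds": [[0.1, 1.0], [1e-6, 1e-4], [0.2, 0.4]],
}

def model(x):
    # Stand-in for a groundwater reactive transport model output.
    return x[0] * np.log10(x[1]) + x[2] ** 2

X = saltelli.sample(problem, 1024)      # Saltelli sampling design
Y = np.apply_along_axis(model, 1, X)
Si = sobol.analyze(problem, Y)          # first-order and total-order indices
print(Si["S1"], Si["ST"])
```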
NASA Technical Reports Server (NTRS)
Cunefare, K. A.; Koopmann, G. H.
1991-01-01
This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total power radiated from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation; the optimization analysis therefore applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
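The underlying optimization is a standard Hermitian quadratic minimization: writing the total radiated power as P(q) = q^H A q + 2 Re(b^H q) + c in the complex secondary-source strengths q, the optimum is q* = -A^{-1} b. A numerical sketch with placeholder matrices (in practice A, b, and c would come from the boundary element model):

```python
import numpy as np

# Placeholder radiation operators (A Hermitian positive definite).
A = np.array([[2.0, 0.3 + 0.1j],
              [0.3 - 0.1j, 1.5]])       # control-source self/mutual terms
b = np.array([0.5 - 0.3j, 0.2 + 0.4j])  # coupling to the primary field
c = 1.0                                 # power of the primary source alone

q_opt = -np.linalg.solve(A, b)          # optimal complex source strengths
P_min = c + (q_opt.conj() @ A @ q_opt + 2 * (b.conj() @ q_opt)).real
print("reduction:", 10 * np.log10(c / P_min), "dB")
```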
Zou, Yonghong; Wang, Lixia; Christensen, Erik R
2015-10-01
This work aimed to explain the challenges of the fingerprint-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF)-determined sources are distinguishable, owing to the variability of source fingerprints; however, they provide useful candidate inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.
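A minimal sketch of the ordinary (non-Bayesian) CMB step that the Bayesian analysis builds on: solve c ≈ Fs for nonnegative source contributions s by nonnegative least squares. The fingerprint values below are illustrative placeholders, not the paper's profiles.

```python
import numpy as np
from scipy.optimize import nnls

# Rows = PAH species, columns = candidate source fingerprints
# (mass fractions; placeholder values).
F = np.array([[0.30, 0.10, 0.05],
              [0.20, 0.30, 0.10],
              [0.25, 0.20, 0.40],
              [0.25, 0.40, 0.45]])
c = np.array([0.21, 0.22, 0.27, 0.30])  # measured sample composition

s, _ = nnls(F, c)                       # nonnegative contributions
shares = s / s.sum()
print(dict(zip(["traffic", "coke oven", "coal combustion"], shares.round(2))))
```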
Rifai Chai; Naik, Ganesh R; Tran, Yvonne; Sai Ho Ling; Craig, Ashley; Nguyen, Hung T
2015-08-01
An electroencephalography (EEG)-based countermeasure device could be used for fatigue detection during driving. This paper explores the classification of fatigue and alert states using power spectral density (PSD) as a feature extractor and a fuzzy-swarm-based artificial neural network (ANN) as a classifier. Independent component analysis by entropy rate bound minimization (ICA-ERBM) is investigated as a novel source separation technique for fatigue classification using EEG analysis. A comparison of classification accuracy with and without the source separator is presented. Classification based on 43 participants without the source separator resulted in an overall sensitivity of 71.67%, a specificity of 75.63% and an accuracy of 73.65%. These results improved after the inclusion of a source separator module, resulting in an overall sensitivity of 78.16%, a specificity of 79.60% and an accuracy of 78.88% (p < 0.05).
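A sketch of the PSD feature-extraction stage, assuming Welch's method and standard EEG frequency bands (the paper's exact band definitions and windowing may differ):

```python
import numpy as np
from scipy.signal import welch

def band_power_features(eeg, fs=256):
    """eeg: (n_channels, n_samples) -> per-channel mean PSD in four bands."""
    f, pxx = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
    bands = {"delta": (1, 4), "theta": (4, 8),
             "alpha": (8, 13), "beta": (13, 30)}
    return np.hstack([pxx[:, (f >= lo) & (f < hi)].mean(axis=-1)
                      for lo, hi in bands.values()])

features = band_power_features(np.random.randn(32, 30 * 256))  # 30 s of EEG
```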
Coal-tar-based sealcoated pavement: a major PAH source to urban stream sediments.
Witter, Amy E; Nguyen, Minh H; Baidar, Sunil; Sak, Peter B
2014-02-01
We used land-use analysis, PAH concentrations and assemblages, and multivariate statistics to identify sediment PAH sources in a small (~1303 km²) urbanizing watershed located in South-Central Pennsylvania, USA. A geographic information system (GIS) was employed to quantify land-use features that may serve as PAH sources. Urban PAH concentrations were three times higher than rural levels, and were significantly and highly correlated with combined residential/commercial/industrial land use. Principal components analysis (PCA) was used to group sediments with similar PAH assemblages, and correlation analysis compared PAH sediment assemblages to common PAH sources. The strongest correlations were observed between rural sediments (n = 7) and coke-oven emissions sources (r = 0.69-0.78, n = 5), and between urban sediments (n = 22) and coal-tar-based sealcoat dust (r = 0.94, n = 47), suggesting that coal-tar-based sealcoat is an important urban PAH source in this watershed linked to residential and commercial/industrial land use. Copyright © 2013 Elsevier Ltd. All rights reserved.
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large-format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
NASA Astrophysics Data System (ADS)
Koutsodendris, Andreas; Papatheodorou, George; Kougiourouki, Ourania; Georgiadis, Michalis
2008-04-01
The types, abundance, distribution and sources of benthic marine litter found in four Greek gulfs (Patras, Corinth, Echinades and Lakonikos) were studied using bottom trawl nets. Mean number and weight densities range between 72-437 items/km² and 6.7-47.4 kg/km², respectively. Litter items were sorted into material and usage categories. Plastic litter (56%) is the dominant material category, followed by metal (17%) and glass (11%). Beverage packaging (32%) is the dominant usage category, followed by general packaging (28%) and food packaging (21%). Based on the typological results, three dominant litter sources were identified: land-based, vessel-based and fishery-based. Application of factor analysis (R- and Q-mode) conducted on both the material and usage litter datasets confirmed the existence of the three dominant litter sources. Q-mode analysis further resulted in the quantification of the litter sources: land-based sources provided the majority (69%) of the total litter items, followed by vessel-based (26%) and fishery-based (5%) sources. Diverse environmental parameters significantly influence these amounts among the four gulfs.
Java Source Code Analysis for API Migration to Embedded Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, Victor; McCoy, James A.; Guerrero, Jonathan
Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
Joint source based analysis of multiple brain structures in studying major depressive disorder
NASA Astrophysics Data System (ADS)
Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang
2014-03-01
We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
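A minimal sketch of the classification experiment as described, using scikit-learn's linear (Fisher) discriminant with leave-one-out cross-validation; the feature matrix here is a random placeholder for the joint pose/shape/tissue sources.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 10))         # one feature row per subject
y = np.r_[np.ones(15), np.zeros(15)]  # 1 = MDD, 0 = healthy control

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                      cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```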
NASA Astrophysics Data System (ADS)
Anggit Maulana, Hiska; Haris, Abdul
2018-05-01
Reservoir and source rock identification has been performed to delineate the reservoir distribution of the Talangakar Formation, South Sumatra Basin. This study is based on integrated geophysical, geological and petrophysical data. The aims of the study are to determine the characteristics of the reservoir and source rock, to differentiate reservoir from source rock within the same Talangakar Formation, and to find out the distribution of net-pay reservoir and source rock layers. The geophysical methods included seismic data interpretation using time and depth structure maps, post-stack inversion, and interval velocity; the geological interpretation included the analysis of structures and faults; and the petrophysical processing involved interpreting log data from wells that penetrate the hydrocarbon-bearing (oil and gas) Talangakar Formation. Based on the seismic interpretation, subsurface mapping was performed on Layer A and Layer I to determine the development of structures in the study area. Based on the geological interpretation, trapping in the study area takes the form of an anticline structure trending southwest-northeast, bounded by normal faults to the southwest-southeast of the structure. Based on the petrophysical analysis, the main reservoir in the study area is a layer at a depth of 1,375 m with a thickness of 2 to 8.3 meters.
CognitionMaster: an object-based image analysis framework
2013-01-01
Background Automated image analysis methods are becoming more and more important for extracting and quantifying image features in microscopy-based biomedical studies, and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and when user-interactivity on the object level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as the basic processing unit instead of individual pixels. Our approach enables even users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for immunohistochemistry-based cell proliferation quantification in breast cancer and for two-photon fluorescence microscopy data on bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open source software system that offers object-based image analysis. The object-based concept allows for a straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
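A minimal illustration of the object-based (rather than pixel-wise) concept using scikit-image, not the CognitionMaster API itself: segment an image into labeled objects, then compute and compare per-object features.

```python
import numpy as np
from skimage import measure

img = np.zeros((64, 64), dtype=bool)
img[10:20, 10:20] = True          # toy "cell" 1
img[40:55, 30:50] = True          # toy "cell" 2

labels = measure.label(img)       # objects become the processing unit
for obj in measure.regionprops(labels):
    print(obj.label, obj.area, obj.centroid, round(obj.eccentricity, 2))
```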
ERIC Educational Resources Information Center
Chatzarakis, G. E.
2009-01-01
This paper presents a new pedagogical method for nodal analysis optimization based on the use of virtual current sources, applicable to any linear electric circuit (LEC), regardless of its complexity. The proposed method leads to straightforward solutions, mostly arrived at by inspection. Furthermore, the method is easily adapted to computer…
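For context, the standard nodal-analysis formulation that the pedagogical method streamlines reduces to solving Gv = i, where G is the conductance matrix and i the vector of injected node currents. A small worked example (component values are arbitrary):

```python
import numpy as np

# Two-node circuit: R1 = 2 ohm from node 1 to ground, R2 = 4 ohm between
# nodes 1 and 2, R3 = 8 ohm from node 2 to ground, 1 A injected at node 1.
G = np.array([[1/2 + 1/4, -1/4],
              [-1/4, 1/4 + 1/8]])
i = np.array([1.0, 0.0])
v = np.linalg.solve(G, i)   # node voltages, here ~[1.714, 1.143] V
print(v)
```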
Ueguchi, Takashi; Ogihara, Ryota; Yamada, Sachiko
2018-03-21
To investigate the accuracy of dual-energy virtual monochromatic computed tomography (CT) numbers obtained by two typical hardware and software implementations: the single-source projection-based method and the dual-source image-based method. A phantom with different tissue-equivalent inserts was scanned with both single-source and dual-source scanners. A fast kVp-switching feature was used on the single-source scanner, whereas a tin filter was used on the dual-source scanner. Virtual monochromatic CT images of the phantom at energy levels of 60, 100, and 140 keV were obtained by both the projection-based (on the single-source scanner) and image-based (on the dual-source scanner) methods. The accuracy of the virtual monochromatic CT numbers for all inserts was assessed by comparing measured values to their corresponding true values. Linear regression analysis was performed to evaluate the dependency of the measured CT numbers on tissue attenuation, method, and their interaction. Root mean square values of the systematic error over all inserts at 60, 100, and 140 keV were approximately 53, 21, and 29 Hounsfield units (HU) with the single-source projection-based method, and 46, 7, and 6 HU with the dual-source image-based method, respectively. Linear regression analysis revealed that the interaction between attenuation and method had a statistically significant effect on the measured CT numbers at 100 and 140 keV. There were attenuation-, method-, and energy-level-dependent systematic errors in the measured virtual monochromatic CT numbers. CT number reproducibility was comparable between the two scanners, and CT numbers had better accuracy with the dual-source image-based method at 100 and 140 keV. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Evans, Ian; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula
2009-09-01
The first release of the Chandra Source Catalog (CSC) was published in 2009 March, and includes information about 94,676 X-ray sources detected in a subset of public ACIS imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The CSC is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports medium-sophistication scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools; and (4) includes real X-ray sources detected with flux significance greater than a predefined threshold, while maintaining the number of spurious sources at an acceptable level. For each detected X-ray source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a source is detected.
ERIC Educational Resources Information Center
Vlas, Radu Eduard
2012-01-01
Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…
Ma, Wan-Li; Sun, De-Zhi; Shen, Wei-Guo; Yang, Meng; Qi, Hong; Liu, Li-Yan; Shen, Ji-Min; Li, Yi-Fan
2011-07-01
A comprehensive sampling campaign was carried out to study the atmospheric concentrations of polycyclic aromatic hydrocarbons (PAHs) in Beijing and to evaluate the effectiveness of source control strategies in reducing PAH pollution after the 29th Olympic Games. The sub-cooled liquid vapor pressure (log P°L)-based model and the octanol-air partition coefficient (Koa)-based model were applied to each seasonal dataset. Regression analysis among log KP, log P°L and log Koa exhibited highly significant correlations for all four seasons. Source factors were identified by principal component analysis and their contributions were further estimated by multiple linear regression. Pyrogenic sources and coke oven emissions were identified as major sources for both the non-heating and heating seasons. Compared with the literature, the mean PAH concentrations before and after the 29th Olympic Games were reduced by more than 60%, indicating that the source control measures were effective in reducing PAH pollution in Beijing. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai
2017-08-01
Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions from scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and may thus lead to new strategies for BCI-based neurorehabilitation.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new technique based on hierarchical Bayesian modeling is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
Li, Shanlan; Kim, Jooil; Park, Sunyoung; Kim, Seung-Kyu; Park, Mi-Kyung; Mühle, Jens; Lee, Gangwoong; Lee, Meehye; Jo, Chun Ok; Kim, Kyung-Ryul
2014-01-01
The sources of halogenated compounds in East Asia associated with stratospheric ozone depletion and climate change are relatively poorly understood. High-precision in situ measurements of 18 halogenated compounds and carbonyl sulfide (COS) made at Gosan, Jeju Island, Korea, from November 2007 to December 2011 were analyzed by positive matrix factorization (PMF). Seven major industrial sources were identified from the enhanced concentrations of halogenated compounds observed at Gosan, and the corresponding concentration-based source contributions were estimated: primary aluminum production (37% of total concentration enhancements), solvent usage (25%), fugitive emissions from HCFC/HFC production (11%), refrigerant replacements (9%), the semiconductor/electronics industry (9%), foam-blowing agents (6%), and fumigation (3%). Statistical trajectory analysis using back trajectories was applied to specify the potential emission regions for the seven sources. Primary aluminum production, solvent usage and fugitive emission sources were attributed mainly to China. Semiconductor/electronics sources were located predominantly in Korea. Refrigerant replacement, fumigation and foam-blowing agent sources were spread throughout East Asian countries. The specified potential source regions are consistent with country-based consumption and emission patterns, verifying the PMF analysis results. The industry-based emission sources of halogenated compounds identified in this study help improve our understanding of East Asian countries' industrial contributions to halogenated compound emissions.
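PMF is closely related to nonnegative matrix factorization; a simplified stand-in using scikit-learn is sketched below (true PMF additionally weights the residuals by per-measurement uncertainty, which is omitted here), with placeholder data.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((500, 19))            # placeholder: samples x compounds

model = NMF(n_components=7, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)           # source contributions per sample
F = model.components_                # source profiles ("fingerprints")
```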
Smolinski, Tomasz G; Buchanan, Roger; Boratyn, Grzegorz M; Milanova, Mariofanna; Prinz, Astrid A
2006-01-01
Background Independent Component Analysis (ICA) proves to be useful in the analysis of neural activity, as it allows for the identification of distinct sources of activity. Applied to measurements registered in a controlled setting and under exposure to an external stimulus, it can facilitate analysis of the impact of the stimulus on those sources. The link between the stimulus and a given source can be verified by a classifier that is able to "predict" the condition a given signal was registered under, solely based on the components. However, the ICA assumption of statistical independence of sources is often unrealistic and turns out to be insufficient to build an accurate classifier. Therefore, we propose to utilize a novel method, based on a hybridization of ICA, multi-objective evolutionary algorithms (MOEA), and rough sets (RS), that attempts to improve the effectiveness of signal decomposition techniques by providing them with "classification-awareness." Results The preliminary results described here are very promising, and further investigation of other MOEAs and/or RS-based classification accuracy measures should be pursued. Even a quick visual analysis of those results can provide an interesting insight into the problem of neural activity analysis. Conclusion We present a methodology for the classificatory decomposition of signals. One of the main advantages of our approach is the fact that, rather than solely relying on often unrealistic assumptions about the statistical independence of sources, components are generated in the light of the underlying classification problem itself. PMID:17118151
Dierdorff, Erich C; Ellington, J Kemp
2008-07-01
The consequences of work-family conflict for both individuals and organizations have been well documented, and the various sources of such conflict have received substantial attention. However, the vast majority of extant research has focused on only time- and strain-based sources, largely neglecting behavior-based sources. Integrating two nationally representative databases, the authors examine 3 behavior-based antecedents of work-family conflict linked specifically to occupational work role requirements (interdependence, responsibility for others, and interpersonal conflict). Results from multilevel analysis indicate that significant variance in work-family conflict is attributable to the occupation in which someone works. Interdependence and responsibility for others predict work-family conflict, even after controlling for several time- and strain-based sources.
MOLECULAR MARKER ANALYSIS OF DEARS SAMPLES
Source apportionment based on organic molecular markers provides a promising approach for meeting the Detroit Exposure and Aerosol Research Study (DEARS) objective of comparing source contributions between community air monitoring stations and various neighborhoods. Source appor...
Radley, Ian [Glenmont, NY; Bievenue, Thomas J [Delmar, NY; Burdett, John H [Charlton, NY; Gallagher, Brian W [Guilderland, NY; Shakshober, Stuart M [Hudson, NY; Chen, Zewu [Schenectady, NY; Moore, Michael D [Alplaus, NY
2008-06-08
An x-ray source assembly and method of operation are provided having enhanced output stability. The assembly includes an anode having a source spot upon which electrons impinge and a control system for controlling position of the anode source spot relative to an output structure. The control system can maintain the anode source spot location relative to the output structure notwithstanding a change in one or more operating conditions of the x-ray source assembly. One aspect of the disclosed invention is most amenable to the analysis of sulfur in petroleum-based fuels.
How Many Separable Sources? Model Selection In Independent Components Analysis
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive, alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. The failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
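A rough sketch of the underlying idea, not the authors' information-theoretic model selection: unmix with ICA, flag components with clearly non-Gaussian statistics (a crude excess-kurtosis threshold here), and leave the remainder to a PCA-style Gaussian subspace.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
# Three latent sources: two non-Gaussian, one Gaussian, mixed linearly.
S = np.column_stack([rng.laplace(size=2000),
                     np.sign(np.sin(np.arange(2000) / 7.0)),
                     rng.normal(size=2000)])
X = S @ rng.normal(size=(3, 3))

est = FastICA(n_components=3, random_state=0).fit_transform(X)
nongauss = np.abs(kurtosis(est, axis=0)) > 1.0
print(f"{nongauss.sum()} apparently non-Gaussian components, "
      f"{(~nongauss).sum()} left to the Gaussian subspace")
```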
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula
2010-07-01
The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a source is detected.
Paper focuses on trading schemes in which regulated point sources are allowed to avoid upgrading their pollution control technology to meet water quality-based effluent limits if they pay for equivalent (or greater) reductions in nonpoint source pollution.
Sources and Nature of Cost Analysis Data Base Reference Manual.
1983-07-01
Comparison of two trajectory based models for locating particle sources for two rural New York sites
NASA Astrophysics Data System (ADS)
Zhou, Liming; Hopke, Philip K.; Liu, Wei
Two back-trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their capabilities of identifying likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. QTBA attempts to take into account the distribution of concentrations around the directions of the back trajectories. The full QTBA approach also considers deposition processes (wet and dry); simplified QTBA omits the consideration of deposition and is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained by a previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six common sources for the two sites (sulfate, soil, zinc smelter, nitrate, wood smoke and copper smelter) were analyzed. The results of the two methods are consistent and locate large, clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of the source locations.
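A minimal sketch of the residence-time weighting idea in its common concentration-weighted form: grid the back-trajectory endpoints and weight each cell's residence time by the concentration measured at the receptor when that trajectory arrived. This is a simplification of the full RTWC redistribution scheme.

```python
import numpy as np

def weighted_trajectory_field(trajs, conc, lon_edges, lat_edges):
    """trajs: list of (lons, lats) endpoint arrays, one per sample;
    conc: concentration measured at the receptor for each sample."""
    num = np.zeros((len(lon_edges) - 1, len(lat_edges) - 1))
    den = np.zeros_like(num)
    for (lons, lats), c in zip(trajs, conc):
        h, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
        num += c * h            # concentration-weighted residence time
        den += h                # total residence time
    return np.divide(num, den, out=np.zeros_like(num), where=den > 0)
```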
Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi
2014-01-01
The in vivo measurement of metabolic flux by 13C-based metabolic flux analysis (13C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a 13C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas 13C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary 13C metabolic flux analysis (INST-13C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms in stationary phase). Here, the development of novel open source software for INST-13C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-13C-MFA. Confidence intervals determined by INST-13C-MFA were smaller than those determined by conventional methods, indicating the potential of INST-13C-MFA for precise metabolic flux analysis. OpenMebius is open source software for the general application of INST-13C-MFA.
DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis
NASA Astrophysics Data System (ADS)
Pernigotti, D.; Belis, C. A.
2018-05-01
DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification), and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation punc gives the best choice for the model performance evaluation when a conservative approach is adopted.
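A minimal sketch of the factor-to-source assignment step, assuming Pearson distance as the similarity measure (DeltaSA's actual indicators may differ); the profile library is a placeholder.

```python
import numpy as np

def pearson_distance(factor, profile):
    return 1.0 - np.corrcoef(factor, profile)[0, 1]

def assign_source(factor, library):
    """library: mapping of source name -> chemical profile array."""
    return min(library, key=lambda name: pearson_distance(factor, library[name]))

library = {"traffic": np.array([0.40, 0.30, 0.20, 0.10]),
           "biomass burning": np.array([0.10, 0.20, 0.30, 0.40])}
print(assign_source(np.array([0.35, 0.33, 0.22, 0.10]), library))
```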
Mestdagh, Inge; Bonicelli, Bernard; Laplana, Ramon; Roettele, Manfred
2009-01-01
Based on the results and lessons learned from the TOPPS project (Training the Operators to prevent Pollution from Point Sources), a proposal for a sustainable strategy to avoid point source pollution from Plant Protection Products (PPPs) was made. Within the TOPPS project (2005-2008), stakeholders were interviewed and research and analysis were carried out in 6 pilot catchment areas (BE, FR, DE, DK, IT, PL). A repeated survey of operators' perceptions and opinions measured changes resulting from TOPPS activities, and good and bad practices were defined based on the Best Management Practices (risk analysis). The aim of the proposal is to suggest a strategy, considering the differences between countries, which can be implemented at Member State level in order to avoid PPP pollution of water through point sources. The methodology used for the up-scaling proposal consists of an analysis of the current situation, a gap analysis, a consistency analysis and organisational structures for implementation. The up-scaling proposal focuses on the behaviour of the operators and on the equipment and infrastructure available to them. The proposal defines implementation structures to support correct behaviour through the development and updating of Best Management Practices (BMPs) and through the transfer and implementation of these BMPs. The proposal also defines requirements for the improvement of equipment and infrastructure based on the defined key factors related to point source pollution, and contains cost estimates for technical and infrastructure upgrades to comply with BMPs.
The Iterative Reweighted Mixed-Norm Estimate for Spatio-Temporal MEG/EEG Source Reconstruction.
Strohmeier, Daniel; Bekhti, Yousra; Haueisen, Jens; Gramfort, Alexandre
2016-10-01
Source imaging based on magnetoencephalography (MEG) and electroencephalography (EEG) allows for the non-invasive analysis of brain activity with high temporal and good spatial resolution. As the bioelectromagnetic inverse problem is ill-posed, constraints are required. For the analysis of evoked brain activity, spatial sparsity of the neuronal activation is a common assumption. It is often taken into account using convex constraints based on the l1-norm. The resulting source estimates are, however, biased in amplitude and often suboptimal in terms of source selection due to high correlations in the forward model. In this work, we demonstrate that an inverse solver based on a block-separable penalty, with a Frobenius norm per block and an l0.5-quasinorm over blocks, addresses both of these issues. For solving the resulting non-convex optimization problem, we propose the iterative reweighted Mixed Norm Estimate (irMxNE), an optimization scheme based on iterative reweighted convex surrogate optimization problems, which are solved efficiently using a block coordinate descent scheme and an active set strategy. We compare the proposed sparse imaging method to the dSPM and RAP-MUSIC approaches based on two MEG data sets. We provide empirical evidence, based on simulations and analysis of MEG data, that the proposed method improves on the standard Mixed Norm Estimate (MxNE) in terms of amplitude bias, support recovery, and stability.
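An implementation of this solver is distributed with MNE-Python; a usage sketch on the MNE sample dataset follows (the regularization alpha and the iteration count are illustrative, and n_mxne_iter > 1 selects the iterative reweighted variant).

```python
import mne
from mne.inverse_sparse import mixed_norm

data_path = mne.datasets.sample.data_path()
evoked = mne.read_evokeds(f"{data_path}/MEG/sample/sample_audvis-ave.fif",
                          condition="Left Auditory", baseline=(None, 0))
forward = mne.read_forward_solution(
    f"{data_path}/MEG/sample/sample_audvis-meg-eeg-oct-6-fwd.fif")
noise_cov = mne.read_cov(f"{data_path}/MEG/sample/sample_audvis-cov.fif")

# n_mxne_iter=1 gives plain MxNE; >1 gives irMxNE.
stc = mixed_norm(evoked, forward, noise_cov, alpha=50.0, n_mxne_iter=10)
```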
Radley, Ian; Bievenue, Thomas J.; Burdett Jr., John H.; Gallagher, Brian W.; Shakshober, Stuart M.; Chen, Zewu; Moore, Michael D.
2007-04-24
An x-ray source assembly (2700) and method of operation are provided having enhanced output stability. The assembly includes an anode (2125) having a source spot upon which electrons (2120) impinge and a control system (2715/2720) for controlling position of the anode source spot relative to an output structure. The control system can maintain the anode source spot location relative to the output structure (2710) notwithstanding a change in one or more operating conditions of the x-ray source assembly. One aspect of the disclosed invention is most amenable to the analysis of sulfur in petroleum-based fuels.
Gollob, Stephan; Kocur, Georg Karl; Schumacher, Thomas; Mhamdi, Lassaad; Vogel, Thomas
2017-02-01
In acoustic emission analysis, common source location algorithms assume, independently of the nature of the propagation medium, a straight (shortest) wave path between the source and the sensors. For heterogeneous media such as concrete, the wave travels along complex paths owing to its interaction with the dissimilar material contents and with any geometrical and material irregularities present in the medium. For instance, cracks and large air voids present in concrete significantly influence the way the wave travels by causing wave path deviations. Neglecting these deviations by assuming straight paths can introduce significant errors into the source location results. In this paper, a novel source localization method called FastWay is proposed. Contrary to most available shortest-path-based methods, it accounts for the different effects of material discontinuities (cracks and voids). FastWay, based on a heterogeneous velocity model, uses the fastest rather than the shortest travel paths between the source and each sensor. The method was evaluated both numerically and experimentally, and the results from both evaluation tests show that, in general, FastWay was able to locate the sources of acoustic emissions more accurately and reliably than traditional source localization methods. Copyright © 2016 Elsevier B.V. All rights reserved.
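A minimal sketch of computing fastest (rather than straight-path) arrival times through a heterogeneous velocity model, in the spirit of FastWay but using a simple 4-connected grid graph and Dijkstra's algorithm; the geometry and velocities are placeholders.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

n, h = 50, 0.01                         # 50x50 grid, 1 cm spacing
vel = np.full((n, n), 4000.0)           # P-wave speed in concrete, m/s
vel[20:30, 10:40] = 340.0               # slow inclusion, e.g. an air void

W = lil_matrix((n * n, n * n))
for r in range(n):
    for c in range(n):
        for dr, dc in ((0, 1), (1, 0)):          # 4-connected neighbours
            r2, c2 = r + dr, c + dc
            if r2 < n and c2 < n:
                t = h / (0.5 * (vel[r, c] + vel[r2, c2]))  # edge travel time
                W[r * n + c, r2 * n + c2] = t
                W[r2 * n + c2, r * n + c] = t

source = 25 * n + 25
arrival = dijkstra(W.tocsr(), indices=source).reshape(n, n)  # seconds
```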
NASA Astrophysics Data System (ADS)
Saad, S. M.; Shakaff, A. Y. M.; Saad, A. R. M.; Yusof, A. M.; Andrew, A. M.; Zakaria, A.; Adom, A. H.
2017-03-01
There are various sources influencing indoor air quality (IAQ) that can emit dangerous gases such as carbon monoxide (CO), carbon dioxide (CO2), ozone (O3) and particulate matter. These gases are usually safe to breathe if they are emitted in safe quantities, but if their amounts exceed the safe level, they can be hazardous to human beings, especially children and people with asthma. Therefore, a smart indoor air quality monitoring system (IAQMS) is needed that is able to tell the occupants which sources are triggering the indoor air pollution. In this project, an IAQMS that is able to classify sources influencing IAQ has been developed. This IAQMS applies a classification method based on a Probabilistic Neural Network (PNN). It is used to classify the sources of indoor air pollution based on five conditions: ambient air, human activity, presence of chemical products, presence of food and beverages, and presence of fragrance. In order to obtain the best classification accuracy, several data pre-processing-based feature selection methods were analysed to discriminate among the sources, and the output of each data pre-processing method was used as the input to the neural network. The results show that PNN analysis with data pre-processing gives a good classification accuracy of 99.89% and is able to classify the sources influencing IAQ at a high classification rate.
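A minimal sketch of the PNN decision rule (essentially a Parzen-window classifier): each class is scored by the average Gaussian kernel response of its training patterns at the query point; the smoothing parameter sigma is an assumption.

```python
import numpy as np

def pnn_predict(X_train, y_train, x, sigma=0.1):
    """Return the class whose training patterns give the highest
    average Gaussian kernel response at the query point x."""
    scores = {}
    for cls in np.unique(y_train):
        d = X_train[y_train == cls] - x
        scores[cls] = np.exp(-np.sum(d * d, axis=1) / (2 * sigma**2)).mean()
    return max(scores, key=scores.get)
```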
Functional connectivity analysis in EEG source space: The choice of method
Knyazeva, Maria G.
2017-01-01
Functional connectivity (FC) is among the most informative features derived from EEG. However, the most straightforward sensor-space analysis of FC is unreliable owing to volume conduction effects. An alternative—source-space analysis of FC—is optimal for high- and mid-density EEG (hdEEG, mdEEG); however, it is questionable for the widely used low-density EEG (ldEEG) because of inadequate surface sampling. Here, using simulations, we investigate the performance of two source FC methods, inverse-based source FC (ISFC) and cortical partial coherence (CPC). To examine the effects of the localization errors of the inverse method on the FC estimation, we simulated an oscillatory source with varying locations and SNRs. To compare the FC estimations of the two methods, we simulated two synchronized sources with varying between-source distance and SNR. The simulations were implemented for hdEEG, mdEEG, and ldEEG. We showed that the performance of both methods deteriorates for deep sources owing to their inaccurate localization and smoothing. The accuracy of both methods improves with increasing between-source distance. The best ISFC performance was achieved using hd/mdEEG, while the best CPC performance was observed with ldEEG. In conclusion, with hdEEG, ISFC outperforms CPC and therefore should be the preferred method. In studies based on ldEEG, CPC is the method of choice. PMID:28727750
Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun
2018-09-01
The toxicity of heavy metals from industrialization poses a critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of the various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source in each sub-region. The results of the case study show that, for children (a sensitive population, with schools and residential areas as their major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions and agricultural activity. The models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
Self-constrained inversion of potential fields
NASA Astrophysics Data System (ADS)
Paoletti, V.; Ialongo, S.; Florio, G.; Fedi, M.; Cella, F.
2013-11-01
We present a potential-field-constrained inversion procedure based on a priori information derived exclusively from the analysis of the gravity and magnetic data (self-constrained inversion). The procedure is designed to be applied to underdetermined problems and involves scenarios where the source distribution can be assumed to be of simple character. To set up effective constraints, we first estimate through the analysis of the gravity or magnetic field some or all of the following source parameters: the source depth-to-the-top, the structural index, the horizontal position of the source body edges and their dip. The second step is incorporating the information related to these constraints in the objective function as depth and spatial weighting functions. We show, through 2-D and 3-D synthetic and real data examples, that potential field-based constraints, for example, structural index, source boundaries and others, are usually enough to obtain substantial improvement in the density and magnetization models.
Joint Blind Source Separation by Multi-set Canonical Correlation Analysis
Li, Yi-Ou; Adalı, Tülay; Wang, Wei; Calhoun, Vince D
2009-01-01
In this work, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multi-set canonical correlation analysis (M-CCA) [1]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions, and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by M-CCA through maximization of correlation among the extracted sources. We compare source separation performance of the M-CCA scheme with other joint BSS methods and demonstrate the superior performance of the M-CCA scheme in achieving joint BSS for a large number of datasets, group of corresponding sources with heterogeneous correlation values, and complex-valued sources with circular and non-circular distributions. We apply M-CCA to analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. PMID:20221319
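For intuition, the two-dataset special case of M-CCA is classical CCA, which extracts maximally correlated projection pairs; a sketch with synthetic shared latent sources (scikit-learn's CCA, not the paper's M-CCA algorithm):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
z = rng.normal(size=(500, 2))                        # shared latent sources
X1 = z @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(500, 8))
X2 = z @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(500, 8))

U, V = CCA(n_components=2).fit_transform(X1, X2)     # correlated pairs
print([round(np.corrcoef(U[:, k], V[:, k])[0, 1], 2) for k in range(2)])
```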
NASA Astrophysics Data System (ADS)
Aktas, Metin; Maral, Hakan; Akgun, Toygar
2018-02-01
The extinction ratio (ER) is an inherent limiting factor that has a direct effect on the detection performance of phase-OTDR based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These signal acquisition scenarios are constructed to represent typically observed cases, such as multiple vibration sources cluttered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths and varying ADC bit resolutions. Results show that an insufficient ER can result in a high optical noise floor and effectively hide the effects of elaborate system improvement efforts.
Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M
2007-01-01
We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite its increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that the spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to that of an existing model, based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and the model, and scatter plots. The performance of our models and previous models is compared in source analysis of a large number of single-dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
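A small numerical illustration of why the Kronecker structure keeps the inverse manageable: the inverse of a Kronecker product factorizes into the Kronecker product of the inverses, so only the small spatial and temporal factors ever need to be inverted. Matrix sizes here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
Cs = np.cov(rng.normal(size=(5, 200)))     # spatial covariance (5 x 5)
Ct = np.cov(rng.normal(size=(20, 200)))    # temporal covariance (20 x 20)

C = np.kron(Cs, Ct)                        # full 100 x 100 covariance
C_inv = np.kron(np.linalg.inv(Cs), np.linalg.inv(Ct))
assert np.allclose(C_inv, np.linalg.inv(C))
```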
An integrated approach to assess heavy metal source apportionment in peri-urban agricultural soils.
Huang, Ying; Li, Tingqiang; Wu, Chengxian; He, Zhenli; Japenga, Jan; Deng, Meihua; Yang, Xiaoe
2015-12-15
Three techniques (isotope ratio analysis, GIS mapping, and multivariate statistical analysis) were integrated to assess heavy metal pollution and source apportionment in peri-urban agricultural soils. The soils in the study area were moderately polluted with cadmium (Cd) and mercury (Hg), and lightly polluted with lead (Pb) and chromium (Cr). GIS mapping suggested that Cd pollution originates from point sources, whereas Hg, Pb, and Cr could be traced back to both point and non-point sources. Principal component analysis (PCA) indicated that aluminum (Al), manganese (Mn), and nickel (Ni) were mainly inherited from natural sources, while Hg, Pb, and Cd were associated with two different kinds of anthropogenic sources. Cluster analysis (CA) further identified fertilizers, waste water, industrial solid wastes, road dust, and atmospheric deposition as potential sources. Based on isotope ratio analysis (IRA), organic fertilizers and road dusts accounted for 74-100% and 0-24% of the total Hg input, respectively, while road dusts and solid wastes contributed 0-80% and 19-100% of the Pb input. This study provides a reliable approach for heavy metal source apportionment in this particular peri-urban area, with a clear potential for future application in other regions. Copyright © 2015 Elsevier B.V. All rights reserved.
Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.
2015-01-01
This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of the data has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and a receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zones. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
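A hedged sketch of the PCA-MLR recipe on synthetic numbers, assuming the usual absolute-principal-component-scores (APCS) formulation; the data, factor count, and resulting percentages are illustrative only, not this study's values:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
conc = rng.lognormal(size=(26, 12))          # 26 samples x 12 EC concentrations

mu, sd = conc.mean(0), conc.std(0)
z = (conc - mu) / sd
pca = PCA(n_components=3).fit(z)
scores = pca.transform(z)
z0 = pca.transform(((np.zeros(12) - mu) / sd)[None, :])
apcs = scores - z0                           # absolute principal component scores

total = conc.sum(axis=1)                     # summed ECs per sample
reg = LinearRegression().fit(apcs, total)

mean_contrib = reg.coef_ * apcs.mean(axis=0) # mean mass attributed to each factor
shares = 100 * mean_contrib / total.mean()   # intercept = unapportioned remainder
print(np.round(shares, 1))                   # percent of mean summed ECs per source
```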
Roy, Debananda; Singh, Gurdeep; Yadav, Pankaj
2016-10-01
A source apportionment study of PM10 (particulate matter) in a critically polluted area of the Jharia coalfield, India has been carried out using dispersion modeling, Principal Component Analysis (PCA) and Chemical Mass Balance (CMB) techniques. The dispersion model AERMOD was introduced to simplify the complexity of sources in the Jharia coalfield. PCA and CMB analysis indicate that monitoring stations near the mining area were mainly affected by emissions from open coal mining and its associated activities, such as coal transportation and the loading and unloading of coal. Mine fire emissions also contributed a considerable amount of particulate matter at the monitoring stations. Locations in the city area were mostly affected by vehicular, Liquefied Petroleum Gas (LPG) and Diesel Generator (DG) set emissions, and residential and commercial activities. The experimental data sampling and analysis could aid in understanding how dispersion-based modeling along with receptor model based concepts can be strategically used for quantitative analysis of natural and anthropogenic sources of PM10. Copyright © 2016. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Lee, Jangho; Kim, Kwang-Yul
2018-02-01
CSEOF analysis is applied to the springtime (March, April, May) daily PM10 concentrations measured at 23 Ministry of Environment stations in Seoul, Korea for the period 2003-2012. Six meteorological variables at 12 pressure levels are also acquired from the ERA-Interim reanalysis datasets. CSEOF analysis is conducted for each meteorological variable over East Asia. Regression analysis is conducted in CSEOF space between the PM10 concentrations and individual meteorological variables to identify the associated atmospheric conditions for each CSEOF mode. By adding the regressed loading vectors to the mean meteorological fields, the daily atmospheric conditions are obtained for the first five CSEOF modes. Then, the HYSPLIT model is run with the atmospheric conditions for each CSEOF mode in order to back-trace the air parcels and dust reaching Seoul. The K-means clustering algorithm is applied to identify major source regions for each CSEOF mode of the PM10 concentrations in Seoul. Three main source regions identified based on the mean fields are: (1) the northern Taklamakan Desert (NTD), (2) the Gobi Desert (GD), and (3) the East China industrial area (ECI). The main source regions for the mean meteorological fields are consistent with those of a previous study; 41% of the source locations are located in GD, followed by ECI (37%) and NTD (21%). Back-trajectory calculations based on CSEOF analysis of meteorological variables identify distinct source characteristics associated with each CSEOF mode and greatly facilitate the interpretation of the PM10 variability in Seoul in terms of transportation routes and meteorological conditions, including the source area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohd, Shukri; Holford, Karen M.; Pullin, Rhys
2014-02-12
Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.
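For intuition, here is a minimal sketch of the classical time-of-arrival idea that WTML refines, reduced to 1-D location between two sensors on a pipe. The numbers are illustrative; in practice the wave speed is dispersive, which is exactly why the wavelet-based modal analysis helps.

```python
def locate_toa(dt_us, sensor_gap_mm, wave_speed_mm_per_us):
    """Distance of the source from sensor 1 for a source between two
    sensors: d1 - d2 = v * (t1 - t2) and d1 + d2 = gap, so
    d1 = (gap + v * dt) / 2, with dt_us = t1 - t2."""
    return 0.5 * (sensor_gap_mm + wave_speed_mm_per_us * dt_us)

# source 400 mm from sensor 1, sensors 1500 mm apart, 5 mm/us wave speed
d1, d2 = 400.0, 1100.0
dt_us = (d1 - d2) / 5.0                  # arrival-time difference = -140 us
print(locate_toa(dt_us, 1500.0, 5.0))    # -> 400.0
```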
USDA-ARS's Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
NASA Astrophysics Data System (ADS)
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations, and in either case a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep rock wall more than 200 m high, located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables, as sketched below. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar and wedge sliding, as well as toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as the source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas. The approach also reduces the processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where large groups of people are exposed to mass movements from steep slopes.
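A minimal Monte Carlo sketch of such a stochastic kinematic test for planar sliding; the Markland-style conditions and all distribution parameters below are illustrative assumptions, not the study's fitted PDFs.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 10_000
# Sample discontinuity orientation and friction angle (degrees) from
# hypothetical distributions standing in for the fitted best-fit PDFs.
dip = rng.normal(55, 8, n)
dip_dir = rng.normal(180, 15, n)
phi = rng.uniform(28, 34, n)

slope_dip, slope_dir = 70.0, 175.0   # local slope geometry, e.g. from a DEM cell

# Markland-style test for planar sliding: the plane daylights in the slope
# face (dips less steeply than the face), strikes within +/-20 deg of the
# slope aspect, and dips more steeply than the friction angle.
aspect_diff = np.abs((dip_dir - slope_dir + 180) % 360 - 180)
feasible = (aspect_diff < 20) & (dip < slope_dip) & (dip > phi)
print(feasible.mean())               # probability of planar-sliding feasibility
```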
Pulley, Simon; Foster, Ian; Collins, Adrian L
2017-06-01
The objective classification of sediment source groups is at present an under-investigated aspect of source tracing studies, which has the potential to statistically improve discrimination between sediment sources and reduce uncertainty. This paper investigates this potential using three different source group classification schemes. The first classification scheme used simple surface and subsurface groupings (Scheme 1). The tracer signatures were then used in a two-step cluster analysis to identify the sediment source groupings naturally defined by the tracer signatures (Scheme 2). The cluster source groups were then modified by splitting each one into a surface and a subsurface component to suit catchment management goals (Scheme 3). The schemes were tested using artificial mixtures of sediment source samples. Controlled corruptions were made to some of the mixtures to mimic the potential causes of tracer non-conservatism present when using tracers in natural fluvial environments. It was determined how accurately the known proportions of sediment sources in the mixtures were identified after unmixing modelling using the three classification schemes. The cluster analysis derived source groups (Scheme 2) significantly increased tracer variability ratios (inter-/intra-source group variability) (by up to 2122%, median 194%) compared to the surface and subsurface groupings (Scheme 1). As a result, the composition of the artificial mixtures was identified an average of 9.8% more accurately on the 0-100% contribution scale. It was found that the cluster groups could be reclassified into surface and subsurface components (Scheme 3) with no significant increase in composite uncertainty (a 0.1% increase over Scheme 2). The far smaller effects of simulated tracer non-conservatism for the cluster analysis based schemes (2 and 3) were primarily attributed to the increased inter-group variability producing a far larger sediment source signal than the non-conservatism noise. Modified cluster analysis based classification methods have the potential to reduce composite uncertainty significantly in future source tracing studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
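For concreteness, a small sketch of the kind of constrained unmixing model used with such tracer signatures (illustrative numbers; real studies weight tracers and propagate uncertainty, for example via Monte Carlo over the source distributions):

```python
import numpy as np
from scipy.optimize import minimize

def unmix(mixture, sources):
    """Estimate source proportions for one sediment mixture by minimizing
    relative tracer misfit subject to p >= 0 and sum(p) = 1.
    mixture: (n_tracers,); sources: (n_sources, n_tracers) mean signatures."""
    k = sources.shape[0]
    objective = lambda p: np.sum(((mixture - p @ sources) / mixture) ** 2)
    cons = ({'type': 'eq', 'fun': lambda p: p.sum() - 1},)
    res = minimize(objective, np.full(k, 1 / k), bounds=[(0, 1)] * k,
                   constraints=cons, method='SLSQP')
    return res.x

sources = np.array([[12., 40., 3.5],     # source A tracer means
                    [30., 18., 9.0]])    # source B tracer means
true_p = np.array([0.7, 0.3])
print(unmix(true_p @ sources, sources))  # ~[0.7, 0.3]
```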
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numerical results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
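As a baseline illustration, a standard Cox-Ross-Rubinstein tree for a single uncertainty source (not the authors' modified multi-source method) can value an option to defer an investment:

```python
import numpy as np

def binomial_option(V0, K, r, sigma, T, steps):
    """CRR binomial tree for a real option on project value V0: the right
    to invest K at expiry T (call-like payoff max(V - K, 0))."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1 / u
    p = (np.exp(r * dt) - d) / (u - d)            # risk-neutral probability
    V = V0 * u ** np.arange(steps, -1, -1) * d ** np.arange(0, steps + 1)
    value = np.maximum(V - K, 0.0)                # payoff at expiry
    for _ in range(steps):                        # backward induction
        value = np.exp(-r * dt) * (p * value[:-1] + (1 - p) * value[1:])
    return value[0]

# option to defer a 100 M investment in a project worth 95 M today:
# the flexibility has positive value even though V0 < K
print(binomial_option(95.0, 100.0, r=0.05, sigma=0.30, T=2.0, steps=200))
```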
Signal Processing and Interpretation Using Multilevel Signal Abstractions.
1986-06-01
mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data ... can be processed either as individual one-dimensional waveforms or as multichannel data for source detection and direction ... microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves ...
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site and identify the dominant sources as barometric pressure and water-supply pumping effects, estimating the impact of each. We also estimate the location of the water-supply pumping wells based on the available data. The possible applications of the NMFk algorithm are not limited to hydrology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
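A rough sketch of the NMF-plus-clustering idea on synthetic mixtures (this is the general recipe, not the authors' NMFk implementation; their robustness criteria for selecting the number of sources r are more elaborate):

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
W_true = rng.random((20, 3))          # 20 observation points, 3 hidden sources
H_true = rng.random((3, 200))         # 3 source transients, 200 time steps
X = W_true @ H_true                   # mixed, non-negative records

def nmf_cluster(X, r, n_runs=10):
    """Run NMF from several random starts and cluster the resulting source
    signatures; tight clusters indicate a reproducible r-source solution."""
    signatures = []
    for seed in range(n_runs):
        model = NMF(n_components=r, init='random', random_state=seed,
                    max_iter=2000).fit(X)
        H = model.components_
        signatures.append(H / np.linalg.norm(H, axis=1, keepdims=True))
    stacked = np.vstack(signatures)                  # (n_runs * r, n_times)
    km = KMeans(n_clusters=r, n_init=10, random_state=0).fit(stacked)
    spread = np.mean(np.linalg.norm(
        stacked - km.cluster_centers_[km.labels_], axis=1))
    return km.cluster_centers_, spread

for r in (2, 3, 4):
    _, spread = nmf_cluster(X, r)
    print(r, round(spread, 3))        # small spread suggests a stable choice of r
```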
Better Assessment Science Integrating Point and Non-point Sources (BASINS)
Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.
[Sources and potential risk of heavy metals in roadside soils of Xi'an City].
Chen, Jing-hui; Lu, Xin-wei; Zhai, Meng
2011-07-01
Based on X-ray fluorescence spectroscopic measurements of heavy metal concentrations in roadside soil samples from Xi'an City, and using principal component analysis, cluster analysis, and correlation analysis, this paper examines the possible sources of heavy metals in the roadside soils of the city. In addition, a potential ecological risk index was used to assess the ecological risk of the heavy metals. In the roadside soils, the mean concentrations of Co, Cr, Cu, Mn, Ni, Pb, and Zn were higher than the Shaanxi soil background values. The As, Mn, and Ni in the roadside soils mainly came from natural and transportation sources; the Cu, Pb, and Zn mainly came from transportation sources; and the Co and Cr mainly came from industrial sources. These heavy metals in the roadside soils represented medium pollution and posed a medium potential ecological risk.
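A minimal sketch of a Hakanson-style potential ecological risk index computation. The toxic-response factors below are the commonly cited values and all concentrations are hypothetical; the paper's exact factors and the Shaanxi background values should be substituted.

```python
# Potential ecological risk: E_r = T_r * C_measured / C_background, RI = sum(E_r).
toxic_factor = {'Cr': 2, 'Cu': 5, 'Ni': 5, 'Pb': 5, 'Zn': 1}   # common T_r values
background = {'Cr': 62.5, 'Cu': 21.4, 'Ni': 28.8,
              'Pb': 21.4, 'Zn': 69.4}                          # hypothetical, mg/kg
measured = {'Cr': 90.0, 'Cu': 55.0, 'Ni': 35.0,
            'Pb': 80.0, 'Zn': 190.0}                           # hypothetical sample

E = {m: toxic_factor[m] * measured[m] / background[m] for m in measured}
RI = sum(E.values())            # overall potential ecological risk index
print({m: round(v, 1) for m, v in E.items()}, round(RI, 1))
```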
Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J
2013-05-01
Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
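As an illustration of the "model-free" branch, the following sketch performs regularized numerical deconvolution of a tissue curve with an arterial input function using synthetic curves and a simple truncated-SVD regularizer; this is the generic technique, not the QUASAR implementation.

```python
import numpy as np

def svd_deconvolve(tissue, aif, dt, rel_thresh=0.2):
    """Model-free residue-function estimation: solve tissue = A @ r, where
    A is the lower-triangular convolution matrix of the AIF, via truncated
    SVD; the peak of r scales with perfusion."""
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_thresh * s.max(), 1.0 / s, 0.0)   # truncation
    return Vt.T @ (s_inv * (U.T @ tissue))

dt = 0.3                                            # s, sampling interval
t = np.arange(0, 12, dt)
aif = np.exp(-(t - 2) ** 2 / 0.5) * (t > 0)         # synthetic arterial input
residue = 0.6 * np.exp(-t / 1.5)                    # flow-scaled residue function
tissue = dt * np.convolve(aif, residue)[:len(t)]    # forward convolution model
r_est = svd_deconvolve(tissue, aif, dt)
print(r_est.max())   # approaches 0.6 as the truncation threshold is lowered
```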
Paladino, Ombretta; Moranda, Arianna; Seyedsalehi, Mahdi
2017-01-01
A procedure for assessing harbour pollution by heavy metals and PAHs and the possible sources of contamination is proposed. The procedure is based on a ratio-matching method applied to the results of principal component analysis (PCA), and it allows discrimination between point and nonpoint sources. The approach can be adopted when many sources of pollution, both internal and external but close to the harbour, can contribute in a very narrow coastal ecosystem, and it was used to identify the possible point sources of contamination in a Mediterranean harbour (Port of Vado, Savona, Italy). 235 sediment samples were collected at 81 sampling points during four monitoring campaigns, and 28 chemicals were searched for within the collected samples. PCA of the total samples allowed the assessment of 8 main possible point sources, while the refining ratio-matching step identified 1 sampling point as a possible PAH source, 2 sampling points as Cd point sources, and 3 sampling points as C > 12 point sources. Through map analysis it was possible to identify two internal sources of pollution directly related to terminal activities. The study is the continuation of a previous work aimed at assessing Savona-Vado Harbour pollution levels, and it suggests strategies to regulate the harbour activities.
NASA Astrophysics Data System (ADS)
Zhou, Yatong; Han, Chunying; Chi, Yue
2018-06-01
In a simultaneous source survey, no limitation is imposed on the shot scheduling of nearby sources, so a huge gain in acquisition efficiency can be obtained; at the same time, however, the recorded seismic data are contaminated by strong blending interference. In this paper, we propose a multi-dip seislet frame based sparse inversion algorithm to iteratively separate simultaneous sources. We overcome two inherent drawbacks of the traditional seislet transform. For the multi-dip problem, we propose to apply a multi-dip seislet frame thresholding strategy instead of the traditional seislet transform for deblending simultaneous-source data that contain multiple dips, e.g., multiple reflections. The multi-dip seislet frame strategy solves the conflicting-dip problem that degrades the performance of the traditional seislet transform. For the noise issue, we propose a robust dip estimation algorithm based on velocity-slope transformation. Instead of calculating the local slope directly using the plane-wave destruction (PWD) based method, we first apply NMO-based velocity analysis and obtain NMO velocities for multi-dip components that correspond to multiples of different orders; a fairly accurate slope estimation can then be obtained using the velocity-slope conversion equation. An iterative deblending framework is given and validated through a comprehensive analysis of both numerical synthetic and field data examples.
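A toy sketch of the iterative thresholding engine behind such deblending schemes, with a plain DCT standing in for the multi-dip seislet frame (illustrative only; the real algorithm thresholds in the seislet domain using slopes from the NMO-based estimate):

```python
import numpy as np
from scipy.fft import dct, idct

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def deblend(observed, n_iter=10, step=0.5, floor=0.15):
    """Iterative soft-thresholding in a sparsifying transform: coherent
    events are kept, incoherent blending interference is rejected."""
    est = np.zeros_like(observed)
    lam0 = np.abs(dct(observed, norm='ortho')).max()
    for k in range(n_iter):
        update = est + step * (observed - est)          # step toward the data
        lam = max(lam0 * 0.7 ** k, floor * lam0)        # relaxing threshold
        est = idct(soft(dct(update, norm='ortho'), lam), norm='ortho')
    return est

t = np.linspace(0, 1, 512)
signal = np.cos(2 * np.pi * 15 * t) * np.exp(-3 * t)    # coherent event
rng = np.random.default_rng(4)
blend = np.zeros_like(t)
blend[rng.integers(0, 512, 8)] = 2.0                    # spiky interference
est = deblend(signal + blend)
err = lambda x: np.linalg.norm(x - signal) / np.linalg.norm(signal)
print(err(signal + blend), err(est))                    # error drops after deblending
```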
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Davis, Harley T.; Aelion, C. Marjorie; McDermott, Suzanne; Lawson, Andrew B.
2009-01-01
Determining sources of neurotoxic metals in rural and urban soils is important for mitigating human exposure. Surface soil from four areas with significant clusters of mental retardation and developmental delay (MR/DD) in children, and one control site, were analyzed for nine metals and characterized by soil type, climate, ecological region, land use and industrial facilities using readily available GIS-based data. Kriging, principal component analysis (PCA) and cluster analysis (CA) were used to identify commonalities of metal distribution. Three MR/DD areas (one rural and two urban) had similar soil types and significantly higher soil metal concentrations. PCA and CA results suggested that Ba, Be and Mn were consistently from natural sources; Pb and Hg from anthropogenic sources; and As, Cr, Cu, and Ni from both sources. Arsenic had low commonality estimates, was highly associated with a third PCA factor, and had a complex distribution, complicating mitigation strategies to minimize concentrations and exposures. PMID:19361902
Seasonal behavior of carbonyls and source characterization of formaldehyde (HCHO) in ambient air
NASA Astrophysics Data System (ADS)
Lui, K. H.; Ho, Steven Sai Hang; Louie, Peter K. K.; Chan, C. S.; Lee, S. C.; Hu, Di; Chan, P. W.; Lee, Jeffrey Chi Wai; Ho, K. F.
2017-03-01
Gas-phase formaldehyde (HCHO) is an intermediate and a sensitive indicator for volatile organic compound (VOC) oxidation, which drives tropospheric ozone production. Effective photochemical pollution control strategies demand a thorough understanding of photochemical oxidation precursors, making differentiation between sources of primary and secondary generated HCHO essential. Spatial and seasonal variations of airborne carbonyls based on two years of measurements (2012-2013), coupled with a correlation-based HCHO source apportionment analysis, were determined for three sampling locations in Hong Kong (denoted HT, TC, and YL). Formaldehyde and acetaldehyde were the two most abundant compounds among the total quantified carbonyls. Pearson's correlation analysis (r > 0.7) implies that formaldehyde and acetaldehyde possibly share similar sources. The total carbonyl concentration trends (HT < TC < YL) reflect location characteristics (urban > rural). A regression analysis further quantifies the relative primary HCHO source contributions at HT (∼13%), TC (∼21%), and YL (∼40%), showing more direct vehicular emissions in urban than in rural areas. Relative secondary source contributions at YL (∼36%) and TC (∼31%) resemble each other, implying similar urban source contributions. Relative background source contributions at TC could be due to a closed-structure microenvironment that favors the trapping of HCHO. Comparable seasonal differences are observed at all stations. The results of this study will aid in the development of a new regional ozone (O3) control policy, as ambient HCHO can enhance O3 production and can also be produced from atmospheric VOC oxidation (secondary HCHO).
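A hedged sketch of one common regression-based primary/secondary apportionment: regress HCHO on a primary combustion tracer and a photochemical tracer, then attribute mean concentration shares. The data are synthetic and the tracer choice is an assumption rather than this study's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 60
co = rng.uniform(0.2, 1.5, n)            # ppm, primary (combustion) tracer
o3 = rng.uniform(10, 80, n)              # ppb, secondary (photochemical) tracer
hcho = 2.0 * co + 0.05 * o3 + 1.0 + rng.normal(0, 0.2, n)   # ppb, synthetic

A = np.column_stack([co, o3, np.ones(n)])
beta, *_ = np.linalg.lstsq(A, hcho, rcond=None)

# evaluate each regression term at the mean tracer level
parts = beta * np.array([co.mean(), o3.mean(), 1.0])
shares = 100 * parts / parts.sum()
print(dict(zip(['primary', 'secondary', 'background'], np.round(shares, 1))))
```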
Time-integrated (typically 24-hr) filter-based methods (historical methods) form the underpinning of our understanding of the fate, impact of source emissions at receptor locations (source impacts), and potential health and welfare effects of particulate matter (PM) in air. Over...
PANDORA: keyword-based analysis of protein sets by integration of annotation sources.
Kaplan, Noam; Vaaknin, Avishay; Linial, Michal
2003-10-01
Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of the biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
Automatic control of a negative ion source
NASA Astrophysics Data System (ADS)
Saadatmand, K.; Sredniawski, J.; Solensten, L.
1989-04-01
A CAMAC-based control architecture is devised for a Berkeley-type H- volume ion source [1]. The architecture employs three 80386 PCs. One PC is dedicated to the control and monitoring of source operation. Another PC works with digitizers to provide data acquisition of waveforms. The third PC is used for off-line analysis. Initially, operation of the source was put under remote (supervisory) computer control. This was followed by the development of an automated startup procedure. Finally, a study of the physics of operation is now underway to establish a database from which automatic beam optimization can be derived.
Analysis and Augmentation of Timing Advance-Based Geolocation in LTE Cellular Networks
Roth, John D.
2016-12-01
Hiding the Source Based on Limited Flooding for Sensor Networks.
Chen, Juan; Lin, Zhengkui; Hu, Ying; Wang, Bailing
2015-11-17
Wireless sensor networks are widely used to monitor valuable objects such as rare animals or armies. Once an object is detected, the source, i.e., the sensor nearest to the object, generates and periodically sends a packet about the object to the base station. Since attackers can capture the object by localizing the source, many protocols have been proposed to protect the source location. Instead of transmitting the packet to the base station directly, typical source location protection protocols first transmit packets randomly for a few hops to a phantom location, and then forward the packets to the base station. The problem with these protocols is that the generated phantom locations are usually not only near the true source but also close to each other. As a result, attackers can easily trace a route back to the source from the phantom locations. To address this problem, we propose a new protocol for source location protection based on limited flooding, named SLP. Compared with existing protocols, SLP can generate phantom locations that are not only far away from the source but also widely distributed. It improves source location security significantly at low communication cost. We further propose a protocol, named SLP-E, to protect the source location against more powerful attackers with wider fields of vision. The performance of SLP and SLP-E is validated by both theoretical analysis and simulation results.
Research of mine water source identification based on LIF technology
NASA Astrophysics Data System (ADS)
Zhou, Mengran; Yan, Pengcheng
2016-09-01
Traditional chemical methods for mine water source identification are time-consuming, so a rapid source identification system for mine water inrush based on laser-induced fluorescence (LIF) technology is proposed. The basic principle of LIF technology is analyzed in detail. The hardware composition of the LIF system is described and the related modules were selected. Through fluorescence experiments on coal mine water samples in the LIF system, fluorescence spectra of the water samples were obtained. Traditional water source identification relies mainly on the ion concentrations that characterize the water, but it is hard to derive ion concentrations from fluorescence spectra. This paper therefore proposes a simple and practical method for rapid identification of water from its fluorescence spectrum: the spectral distance between an unknown water sample and the standard samples is measured, and then, based on clustering analysis, the category of the unknown water sample is obtained. Source identification of unknown samples verified the reliability of the LIF system. The approach addresses the current lack of real-time, online monitoring of water inrush in coal mines, which is of great significance for safety in coal mine production.
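A minimal sketch of the spectral-distance classification step; the Gaussian-shaped spectra and water-type labels below are hypothetical stand-ins for the standard samples.

```python
import numpy as np

def classify_spectrum(unknown, standards):
    """Assign an unknown fluorescence spectrum to the nearest standard
    water sample by Euclidean distance between peak-normalized spectra."""
    dists = {name: np.linalg.norm(unknown / unknown.max() - s / s.max())
             for name, s in standards.items()}
    return min(dists, key=dists.get), dists

wavelengths = np.linspace(300, 600, 200)
def peak(center, width):                 # synthetic Gaussian-shaped spectrum
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

standards = {'sandstone water': peak(420, 30),   # hypothetical standards
             'limestone water': peak(480, 25),
             'goaf water': peak(530, 40)}
unknown = peak(478, 26) + 0.05 * np.random.default_rng(6).standard_normal(200)
label, dists = classify_spectrum(unknown, standards)
print(label)                             # -> 'limestone water'
```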
NASA Astrophysics Data System (ADS)
Sheesley, Rebecca J.; Nallathamby, Punith Dev; Surratt, Jason D.; Lee, Anita; Lewandowski, Michael; Offenberg, John H.; Jaoui, Mohammed; Kleindienst, Tadeusz E.
2017-10-01
The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling that is designed to allow for detailed and specific chemical analysis of particulate matter (PM) in the Bakersfield airshed. To achieve this objective, filter samples were taken during thirty-four 23-hr periods between 19 May and 26 June 2010 and analyzed for organic tracers by gas chromatography - mass spectrometry (GC-MS). Contributions to organic carbon (OC) were determined by two organic tracer-based techniques: primary OC by chemical mass balance and secondary OC by a mass fraction method. Radiocarbon (14C) measurements of the total organic carbon were also made to determine the split between modern and fossil carbon and thereby constrain unknown sources of OC not accounted for by either tracer-based attribution technique. From the analysis, OC contributions from four primary sources and four secondary sources were determined, which comprised three sources of modern carbon and five sources of fossil carbon. The major primary sources of OC were vegetative detritus (9.8%), diesel (2.3%), gasoline (<1.0%), and lubricating-oil-impacted motor vehicle exhaust (30%); measured secondary sources comprised isoprene (1.5%), α-pinene (<1.0%), toluene (<1.0%), and naphthalene (<1.0%, as an upper limit) contributions. The average observed organic carbon (OC) was 6.42 ± 2.33 μgC m-3. The 14C-derived apportionment indicated that modern and fossil components were nearly equivalent on average; however, the fossil contribution ranged from 32 to 66% over the five-week campaign. With the fossil primary and secondary sources aggregated, only 25% of the fossil organic carbon could not be attributed. In contrast, nearly 80% of the modern carbon could not be attributed to primary and secondary sources accessible to this analysis, which included tracers of biomass burning, vegetative detritus and secondary biogenic carbon. The results of the current study contribute a source-based evaluation of the carbonaceous aerosol at CalNex Bakersfield.
Analysis on the inbound tourist source market in Fujian Province
NASA Astrophysics Data System (ADS)
YU, Tong
2017-06-01
This paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market using SPSS 23.0, based on inbound tourism data for Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province is developing rapidly and that the diversified inbound tourist source countries indicate the stability of the inbound tourism market. According to the cluster analysis, the inbound tourist source market in Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key to inbound tourism in Fujian Province.
NASA Astrophysics Data System (ADS)
Yu, Mingjing; Rhoads, Bruce L.
2018-05-01
The flux of fine sediment within agricultural watersheds is an important factor determining the environmental quality of streams and rivers. Despite this importance, the contributions of sediment sources to suspended sediment loads within intensively managed agricultural watersheds remain poorly understood. This study assesses the provenance of fine suspended sediment in the headwater portion of a river flowing through an agricultural landscape in Illinois. Sediment source samples were collected from five sources: croplands, forested floodplains, grasslands, upper grazed floodplains, and lower grazed floodplains. Event-based and aggregated suspended sediment samples were collected from the stream at the watershed outlet. Quantitative geochemical fingerprinting techniques and a mixing model were employed to estimate the relative contributions of sediment from the five sources to the suspended sediment loads. To account for possible effects of small sample sizes, the analysis was repeated with only two sources: grazed floodplains and croplands/grasslands/forested floodplains. Results based on mean values of tracers indicate that the vast majority of suspended sediment within the stream (>95%) is derived from erosion of channel banks and the soil surface within areas of grazed floodplains. Uncertainty analysis based on Monte Carlo simulations indicates that mean values of tracer properties, which do not account for sampling variability in these properties, probably overestimate contributions from the two major sources. Nevertheless, this analysis still supports the conclusion that floodplain erosion accounts for the largest percentage of instream sediment (≈55-75%). Although grazing occurs over only a small portion of the total watershed area, grazed floodplains, which lie in close proximity to the stream channel, are an important source of sediment in this headwater stream system. Efforts to reduce fluxes of fine sediment in this intensively managed landscape should focus on eroding floodplain surfaces and channel banks within heavily grazed reaches of the stream.
Evidence-based management - healthcare manager viewpoints.
Janati, Ali; Hasanpoor, Edris; Hajebrahimi, Sakineh; Sadeghi-Bazargani, Homayoun
2018-06-11
Purpose Hospital manager decisions can have a significant impact on service effectiveness and hospital success, so using an evidence-based approach can improve hospital management. The purpose of this paper is to identify evidence-based management (EBMgt) components and challenges. Consequently, the authors provide an improving evidence-based decision-making framework. Design/methodology/approach A total of 45 semi-structured interviews were conducted in 2016. The authors also established three focus group discussions with health service managers. Data analysis followed deductive qualitative analysis guidelines. Findings Four basic themes emerged from the interviews, including EBMgt evidence sources (including sub-themes: scientific and research evidence, facts and information, political-social development plans, managers' professional expertise and ethical-moral evidence); predictors (sub-themes: stakeholder values and expectations, functional behavior, knowledge, key competencies and skill, evidence sources, evidence levels, uses and benefits and government programs); EBMgt barriers (sub-themes: managers' personal characteristics, decision-making environment, training and research system and organizational issues); and evidence-based hospital management processes (sub-themes: asking, acquiring, appraising, aggregating, applying and assessing). Originality/value Findings suggest that most participants have positive EBMgt attitudes. A full evidence-based hospital manager is a person who uses all evidence sources in a six-step decision-making process. EBMgt frameworks are a good tool to manage healthcare organizations. The authors found factors affecting hospital EBMgt and identified six evidence sources that healthcare managers can use in evidence-based decision-making processes.
NASA Astrophysics Data System (ADS)
Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.
2009-12-01
This paper addresses easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine, without much effort, their own data with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software with web-service support for OGC-based web mapping and processing. The core objective of the ILWIS open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software under the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins that convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on his machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of plug-ins, and we unfold our plans to implement other OGC standards, such as WCS and WPS, in the same context. Especially the latter can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in spatial data infrastructures or even its execution in scalable, on-demand cloud computing environments.
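For readers unfamiliar with the OGC side, here is a minimal sketch of the kind of WMS GetMap request such a plug-in issues; the endpoint URL and layer name are hypothetical, while the parameter names are the standard WMS 1.1.1 ones.

```python
import requests

params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "geonetcast:msg_rgb",        # hypothetical layer name
    "bbox": "-3.5,4.5,1.5,11.5",           # lon/lat box over Ghana
    "srs": "EPSG:4326", "width": 512, "height": 512,
    "format": "image/png",
}
resp = requests.get("https://example.org/wms", params=params)  # hypothetical host
with open("ghana_overlay.png", "wb") as f:
    f.write(resp.content)                  # rendered map ready for local overlay
```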
Functional Interaction Network Construction and Analysis for Disease Discovery.
Wu, Guanming; Haw, Robin
2017-01-01
Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, thereby providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of data via network modules and increasing statistical power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of total human genes, and an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human-curated pathway databases, building and training a machine learning classifier (a Naïve Bayesian classifier), predicting interactions with the trained classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz to perform network-based data analysis for a list of genes.
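A hedged sketch of the classifier step, using a Gaussian Naive Bayes on synthetic pairwise-evidence features; Reactome's actual classifier, features, and training data differ.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in: each protein pair is described by evidence features
# (e.g. co-expression score, domain co-occurrence, shared GO terms) and
# labeled as functionally interacting (1) or not (0).
rng = np.random.default_rng(8)
n = 1000
X = rng.random((n, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]
     + 0.1 * rng.standard_normal(n) > 0.55).astype(int)

clf = GaussianNB().fit(X[:800], y[:800])
proba = clf.predict_proba(X[800:])[:, 1]     # predicted interaction probability
print(clf.score(X[800:], y[800:]))           # held-out accuracy
```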
Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy.
Görlitz, Frederik; Kelly, Douglas J; Warren, Sean C; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J; Stuhmeier, Frank; Neil, Mark A A; Tate, Edward W; Dunsby, Christopher; French, Paul M W
2017-01-18
We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set.
ERIC Educational Resources Information Center
Litman, Cindy; Marple, Stacy; Greenleaf, Cynthia; Charney-Sirott, Irisa; Bolz, Michael J.; Richardson, Lisa K.; Hall, Allison H.; George, MariAnne; Goldman, Susan R.
2017-01-01
This study presents a descriptive analysis of 71 videotaped lessons taught by 34 highly regarded secondary English language arts, history, and science teachers, collected to inform an intervention focused on evidence-based argumentation from multiple text sources. Studying the practices of highly regarded teachers is valuable for identifying…
This work reports the results of a regional receptor-based source apportionment analysis using the Positive Matrix Factorization (PMF) model on chemically speciated PM2.5 data from 36 urban and rural monitoring sites within the U.S. Pacific Northwest. The approach taken is to mo...
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
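A minimal usage sketch following the API described for the library around the time of publication; module and function names changed in later releases (e.g. audioFeatureExtraction became ShortTermFeatures), so adjust to the installed version. "example.wav" is a placeholder file.

```python
from pyAudioAnalysis import audioBasicIO, audioFeatureExtraction

fs, x = audioBasicIO.readAudioFile("example.wav")   # sample rate and signal
# 50 ms analysis windows with a 25 ms step; returns a matrix of short-term
# features (rows: features such as ZCR, energy, MFCCs; columns: frames).
# Newer releases also return the feature names alongside the matrix.
features = audioFeatureExtraction.stFeatureExtraction(
    x, fs, int(0.050 * fs), int(0.025 * fs))
print(getattr(features, "shape", None))
```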
Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao
2015-12-01
In this study, positive matrix factorization (PMF) and principal component analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments of the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: mixed agricultural/industrial sewage (18.6%), mining wastewater (15.9%), agricultural fertilizer (14.5%), atmospheric deposition (12.8%), agricultural nonpoint (10.6%), industrial wastewater (9.8%), marine activity (9.0%), and the nickel plating industry (8.8%). Overall, the hazardous element content appears to be connected more to anthropogenic activity than to natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources. PMF resolves more factors with a higher explained variance than PCA, and it provided both internal and quantitative analyses. The combination of the two methods can provide more reasonable and reliable results.
The Admissions Office Goes Scientific.
ERIC Educational Resources Information Center
Bryant, Peter; Crockett, Kevin
1993-01-01
Data-based planning and management is revolutionizing college student recruitment. Data analysis focuses on historical trends, marketing and recruiting strategies, cost-effectiveness strategy, and markets. Data sources include primary market demographics, geo-demographics, secondary sources, student price response information, and institutional…
Guan, Qingyu; Wang, Feifei; Xu, Chuanqi; Pan, Ninghui; Lin, Jinkuo; Zhao, Rui; Yang, Yanyan; Luo, Haiping
2018-02-01
Hexi Corridor is the most important base for commodity grain and cash crop production. However, the rapid development of agriculture and industry has inevitably led to heavy metal contamination of the soils. Multivariate statistical analysis, GIS-based geostatistical methods and Positive Matrix Factorization (PMF) receptor modeling techniques were used to understand the levels of heavy metals and their source apportionment for agricultural soil in Hexi Corridor. The results showed that the average concentrations of Cr, Cu, Ni, Pb and Zn were lower than the secondary standard of soil environmental quality; however, the concentrations of eight metals (Cr, Cu, Mn, Ni, Pb, Ti, V and Zn) were higher than background values, and their corresponding enrichment factor values were significantly greater than 1. Different degrees of heavy metal pollution occurred in the agricultural soils; specifically, Ni had the most potential for impacting human health. The results from the multivariate statistical analysis and GIS-based geostatistical methods indicated both natural sources (Co and W) and anthropogenic sources (Cr, Cu, Mn, Ni, Pb, Ti, V and Zn). To better identify pollution sources of heavy metals in the agricultural soils, the PMF model was applied. Further source apportionment revealed that enrichments of Pb and Zn were attributed to traffic sources; Cr and Ni were closely related to industrial activities, including mining, smelting, coal combustion, iron and steel production and metal processing; Zn and Cu originated from agricultural activities; and V, Ti and Mn were derived from oil- and coal-related activities. Copyright © 2017 Elsevier Ltd. All rights reserved.
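The enrichment factor cited above normalizes each metal against a conservative reference element; a minimal sketch, assuming Al as the reference and using hypothetical concentrations:

def enrichment_factor(c_x, c_ref, bg_x, bg_ref):
    """EF = (Cx/Cref)_sample / (Cx/Cref)_background; values well above 1
    suggest enrichment beyond the natural background."""
    return (c_x / c_ref) / (bg_x / bg_ref)

# hypothetical Ni sample against an Al reference (all values in mg/kg)
print(enrichment_factor(c_x=35.2, c_ref=62000.0, bg_x=26.6, bg_ref=58000.0))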
Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm.
Stropahl, Maren; Bauer, Anna-Katharina R; Debener, Stefan; Bleichner, Martin G
2018-01-01
Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is also a powerful tool for attenuating stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach to component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing at the single-subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model based on the symmetric Boundary Element Method (BEM). We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain physiologically plausible EEG source estimates. Finally, we show how to perform group-level analysis in the time domain on anatomically defined regions of interest (auditory scout). The proposed pipeline needs to be tailored to the specific datasets and paradigms. However, the straightforward combination of EEGLAB and Brainstorm analysis tools may be of interest to others performing EEG source localization.
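The pipeline itself is built on the MATLAB tools EEGLAB and Brainstorm; purely as an analogous open-source sketch of the same sequence (ICA artifact attenuation, a template anatomy in place of individual MRIs, a BEM forward model, and a dSPM inverse), the steps can be expressed with MNE-Python. The file name and the excluded component index below are hypothetical placeholders.

import os.path as op
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("auditory-raw.fif", preload=True)  # hypothetical 64-channel EEG file
raw.filter(1.0, 40.0)
raw.set_eeg_reference("average", projection=True)

# ICA-based attenuation of stereotypical artifacts (eye movements, heartbeat)
ica = ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.exclude = [0]  # artifact components, normally chosen by inspection or CORRMAP-like matching
ica.apply(raw)

events = mne.find_events(raw)  # assumes a stim channel in the recording
epochs = mne.Epochs(raw, events, tmin=-0.2, tmax=0.5, baseline=(None, 0), preload=True)
evoked = epochs.average()
noise_cov = mne.compute_covariance(epochs, tmax=0.0)  # noise model from the pre-stimulus baseline

# template anatomy (fsaverage) stands in for ICBM152
fs_dir = mne.datasets.fetch_fsaverage()
src = op.join(fs_dir, "bem", "fsaverage-ico-5-src.fif")
bem = op.join(fs_dir, "bem", "fsaverage-5120-5120-5120-bem-sol.fif")

fwd = mne.make_forward_solution(evoked.info, trans="fsaverage", src=src, bem=bem,
                                eeg=True, meg=False)
inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
stc = mne.minimum_norm.apply_inverse(evoked, inv, method="dSPM")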
Cost Analysis Sources and Documents Data Base Reference Manual (Update)
1989-06-01
M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986
ERIC Educational Resources Information Center
Grantham, Susan; Vieira, Edward T., Jr.
2014-01-01
This project examined the focus of environmental news frames used in seven American newspapers between 1970 and 2010. During this time newspapers were a primary source of news. Based on gatekeeping and agenda-setting theory, as well as source credibility, the content analysis of 2,123 articles examined the environmental topics within the articles,…
NASA Astrophysics Data System (ADS)
Knapik, Maciej
2018-02-01
The article presents an economic analysis and comparison of selected methods (district heating, natural gas, a heat pump with renewable energy sources) for the preparation of domestic hot water in a building with low energy demand. In buildings of this type, an increased share of domestic hot water preparation in the total energy demand can be observed. As a result, the proposed solutions make it possible to further lower energy demand by using renewable energy sources. This article presents the results of numerical analysis and calculations performed mainly in MATLAB software, based on typical meteorological years. The results showed that the system with a heat pump and renewable energy sources is comparable with the district heating system.
Open source tools for fluorescent imaging.
Hamilton, Nicholas A
2012-01-01
As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Lee, Gang-San; Kim, Pyung-Rae; Han, Young-Ji; Holsen, Thomas M.; Seo, Yong-Seok; Yi, Seung-Muk
2016-03-01
As a global pollutant, mercury (Hg) is of particular concern in East Asia, where anthropogenic emissions are the largest. In this study, speciated Hg concentrations were measured on Yongheung Island, the westernmost island in Korea, located between China and the Korean mainland to identify the importance of local and regional Hg sources. Various tools including correlations with other pollutants, conditional probability function, and back-trajectory-based analysis consistently indicated that Korean sources were important for gaseous oxidized mercury (GOM) whereas, for total gaseous mercury (TGM) and particulate bound mercury (PBM), regional transport was also important. A trajectory cluster based approach, considering both Hg concentration and the fraction of time each cluster was impacting the site, was developed to quantify the effect of Korean sources and out-of-Korean sources. This analysis suggests that contributions from out-of-Korean sources were similar to Korean sources for TGM whereas Korean sources contributed slightly more to the concentration variations of GOM and PBM compared to out-of-Korean sources. The ratio of GOM/PBM decreased when the site was impacted by regional transport, suggesting that this ratio may be a useful tool for identifying the relative significance of local sources vs. regional transport. The secondary formation of PBM through gas-particle partitioning with GOM was found to be important at low temperatures and high relative humidity.
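The conditional probability function used above is simple to compute from paired concentration and wind-direction records; a minimal sketch with synthetic data, where the 30-degree sector width and the 75th-percentile threshold are common but arbitrary choices:

import numpy as np
import pandas as pd

def conditional_probability_function(conc, wind_dir, threshold_q=0.75, sector=30):
    """CPF(theta) = m(theta) / n(theta): the fraction of samples arriving from
    each wind sector whose concentration exceeds a high-concentration threshold."""
    df = pd.DataFrame({"c": conc, "wd": np.asarray(wind_dir) % 360})
    thresh = df["c"].quantile(threshold_q)
    df["sector"] = (df["wd"] // sector) * sector
    n = df.groupby("sector")["c"].size()
    m = df[df["c"] >= thresh].groupby("sector")["c"].size()
    return (m / n).fillna(0.0)

rng = np.random.default_rng(0)
cpf = conditional_probability_function(rng.lognormal(size=500), rng.uniform(0, 360, 500))
print(cpf.idxmax())  # wind sector most associated with high concentrations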
The Environment Friendly Power Source for Power Supply of Mobile Communication Base Stations
NASA Astrophysics Data System (ADS)
Rudenko, N. V.; Ershov, V. V.; Evstafiev, V. V.
2017-05-01
The article describes the technical proposals to improve environmental and resource characteristics of the autonomous power supply systems of mobile communication base stations based on renewable energy sources, while ensuring the required reliability and security of power supply. These include: the replacement of the diesel generator with a clean energy source, an electrochemical generator based on hydrogen fuel cells; the use of wind turbines with a vertical axis; and the use of specialized batteries. Based on the analysis of the known technical solutions, the structural circuit diagram of the hybrid solar-wind-hydrogen power plant and the basic principles of the algorithm of its work were proposed. The implementation of these proposals will improve the environmental and resource characteristics.
Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S
2018-04-30
Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near real time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better than, the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use in mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All codes were based on scikit-learn, an open source machine learning library in the Python language, and were processed in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
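In the same scikit-learn spirit as the paper's workflow, the sketch below trains a support vector classifier on impedance-derived feature vectors; the data are synthetic placeholders rather than the acetone/chemosensory-protein measurements.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 40))          # e.g., |Z| sampled at 40 frequencies per spectrum
y = rng.integers(0, 2, size=200)        # analyte present / absent labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))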
NASA Astrophysics Data System (ADS)
Wang, Guoqiang; A, Yinglan; Jiang, Hong; Fu, Qing; Zheng, Binghui
2015-01-01
Increasing water pollution in developing countries poses a significant threat to environmental health and human welfare. Understanding the spatial distribution and apportioning the sources of pollution are important for the efficient management of water resources. In this study, ten types of heavy metals were detected during 2010-2013 for all ambient samples and point-source samples. A pollution assessment of the surficial sediment dataset based on the Enrichment Factor (EF) showed the surficial sediment was moderately contaminated. A comparison of the multivariate approach (principal component analysis/absolute principal component score, PCA/APCS) and the chemical mass balance model (CMB) shows that the identification of sources and calculation of source contributions based on the CMB were more objective and acceptable when source profiles were known and source composition was complex. The results of source apportionment for surficial heavy metals, from both the PCA/APCS and CMB models, showed that the natural background (30%) was the most dominant contributor to the surficial heavy metals, followed by mining activities (29%). The contribution percentage of the natural background was negatively related to the degree of contamination. The peak concentrations of many heavy metals (Cu, Ba, Fe, As and Hg) were found in the middle layer of sediment, which is most likely the result of the development of industry beginning in the 1970s. However, the highest concentration of Pb appeared in the surficial sediment layer, which was most likely due to the sharp increase in traffic volume. The historical analysis of the sources based on the CMB showed that mining and the chemical industry are stable sources for all of the sections. A comparison of the change rates of source contributions over the years indicated that the composition of the materials at the estuary site (HF1) is sensitive to input from the land, whereas the center site (HF4) has a buffering effect on the materials from the land through a series of complex movements. These results provide information for the development of improved pollution control strategies for lakes and reservoirs.
NASA Astrophysics Data System (ADS)
Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki
2015-10-01
For detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combined with a D-D fusion pulsed neutron source and a neutron detection system. In the detection scheme, we have adopted two new measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method has been experimentally studied in the Kyoto University Critical Assembly (KUCA), and it was applied to a cargo container inspection system by simulation.
Suneel, V; Vethamony, P; Zakaria, M P; Naik, B G; Prasad, K V S R
2013-05-15
Deposition of tar balls along the coast of Goa, India is a common phenomenon during the southwest monsoon. Representative tar ball samples collected from various beaches of Goa and one Bombay High (BH) crude oil sample were subjected to fingerprint analysis based on diagnostic ratios of n-alkane, biomarkers of pentacyclic tri-terpanes and compound specific stable carbon isotope (δ¹³C) analysis to confirm the source. The results were compared with the published data of Middle East Crude Oil (MECO) and South East Asian Crude Oil (SEACO). The results revealed that the tar balls were from tanker-wash derived spills. The study also confirmed that the source is not the BH, but SEACO. The present study suggests that the biomarkers of alkanes and hopanes coupled with stable carbon isotope analysis act as a powerful tool for tracing the source of tar balls, particularly when the source specific biomarkers fail to distinguish the source. Copyright © 2013 Elsevier Ltd. All rights reserved.
de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine
2016-03-01
Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for the Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied will often be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.
2017-05-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
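Conventional variance-based (Sobol) indices, the starting point that the hierarchical method extends, can be computed with the open-source SALib package; the three grouped inputs, their bounds, and the toy model below are all illustrative assumptions, not the Hanford model.

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["recharge", "permeability", "boundary_head"],  # hypothetical grouped inputs
    "bounds": [[0.0, 1.0]] * 3,
}
X = saltelli.sample(problem, 1024)
Y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2]   # toy stand-in for the flow/transport model

Si = sobol.analyze(problem, Y)
print("first-order:", np.round(Si["S1"], 3), "total-order:", np.round(Si["ST"], 3))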
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2017-12-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.
A GIS-based time-dependent seismic source modeling of Northern Iran
NASA Astrophysics Data System (ADS)
Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza
2017-01-01
The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear or fault sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous studies and reports are reviewed to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. The completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
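The time-independent recurrence for area sources rests on the Gutenberg-Richter relation log10 N(>=M) = a - b*M; a minimal sketch of estimating a and b from a (synthetic) declustered catalog:

import numpy as np

def gutenberg_richter_fit(mags, m_min, dm=0.1):
    """Least-squares fit of log10 N(>=M) = a - b*M above the completeness magnitude m_min."""
    mags = np.asarray(mags)
    bins = np.arange(m_min, mags.max(), dm)
    n_cum = np.array([(mags >= m).sum() for m in bins])
    keep = n_cum > 0
    slope, intercept = np.polyfit(bins[keep], np.log10(n_cum[keep]), 1)
    return intercept, -slope  # a, b

rng = np.random.default_rng(1)
mags = 4.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)  # synthetic catalog, true b = 1
a, b = gutenberg_richter_fit(mags, m_min=4.0)
print(f"a = {a:.2f}, b = {b:.2f}")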
A two-step super-Gaussian independent component analysis approach for fMRI data.
Ge, Ruiyang; Yao, Li; Zhang, Hang; Long, Zhiying
2015-09-01
Independent component analysis (ICA) has been widely applied to functional magnetic resonance imaging (fMRI) data analysis. Although ICA assumes that the sources underlying data are statistically independent, it usually ignores sources' additional properties, such as sparsity. In this study, we propose a two-step super-Gaussian ICA (2SGICA) method that incorporates the sparse prior of the sources into the ICA model. 2SGICA uses the super-Gaussian ICA (SGICA) algorithm, which is based on a simplified Lewicki-Sejnowski model, to obtain the initial source estimate in the first step. Using a kernel estimator technique, the source density is acquired and fitted to the Laplacian function based on the initial source estimates. The fitted Laplacian prior is used for each source in the second SGICA step. Moreover, the automatic target generation process for initial value generation is used in 2SGICA to guarantee the stability of the algorithm. An adaptive step size selection criterion is also implemented in the proposed algorithm. We performed experimental tests on both simulated data and real fMRI data to investigate the feasibility and robustness of 2SGICA and made a performance comparison between InfomaxICA, FastICA, mean field ICA (MFICA) with Laplacian prior, sparse online dictionary learning (ODL), SGICA and 2SGICA. Both the simulated and real fMRI experiments showed that 2SGICA was the most robust to noise and had the best spatial detection power and time course estimation among the six methods. Copyright © 2015. Published by Elsevier Inc.
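2SGICA itself is not packaged in common libraries, but the flavor of its two steps (an initial ICA estimate, then fitting a Laplacian sparse prior to each recovered source) can be sketched with scikit-learn and SciPy on synthetic mixtures:

import numpy as np
from scipy.stats import laplace
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
S = rng.laplace(size=(1000, 3))   # sparse (super-Gaussian) ground-truth sources
A = rng.normal(size=(3, 3))       # mixing matrix
X = S @ A.T

ica = FastICA(n_components=3, random_state=0)
S_hat = ica.fit_transform(X)      # first-pass source estimates

# fit a Laplacian density to each estimated source, standing in for the
# sparse prior used in the second SGICA step
for k in range(S_hat.shape[1]):
    loc, scale = laplace.fit(S_hat[:, k])
    print(f"source {k}: Laplace(loc={loc:.3f}, scale={scale:.3f})")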
Head movement compensation in real-time magnetoencephalographic recordings.
Little, Graham; Boe, Shaun; Bardouille, Timothy
2014-01-01
Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement. Effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real-time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed for accurate estimation of current source activity at the source level in real-time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps: data acquisition, head position estimation, source localization, and real-time source estimation. This work explains the technical details and validates each of these steps.
NASA Astrophysics Data System (ADS)
Tong, Daniel Quansong; Kang, Daiwen; Aneja, Viney P.; Ray, John D.
2005-01-01
We present in this study both measurement-based and modeling analyses for elucidation of source attribution, influence areas, and process budget of reactive nitrogen oxides at two rural southeast United States sites (Great Smoky Mountains national park (GRSM) and Mammoth Cave national park (MACA)). Availability of nitrogen oxides is considered as the limiting factor to ozone production in these areas and the relative source contribution of reactive nitrogen oxides from point or mobile sources is important in understanding why these areas have high ozone. Using two independent observation-based techniques, multiple linear regression analysis and emission inventory analysis, we demonstrate that point sources contribute a minimum of 23% of total NOy at GRSM and 27% at MACA. The influence areas for these two sites, or origins of nitrogen oxides, are investigated using trajectory-cluster analysis. The result shows that air masses from the West and Southwest sweep over GRSM most frequently, while pollutants transported from the eastern half (i.e., East, Northeast, and Southeast) have limited influence (<10% out of all air masses) on air quality at GRSM. The processes responsible for formation and removal of reactive nitrogen oxides are investigated using a comprehensive 3-D air quality model (Multiscale Air Quality SImulation Platform (MAQSIP)). The NOy contribution associated with chemical transformations to NOz and O3, based on process budget analysis, is as follows: 32% and 84% for NOz, and 26% and 80% for O3 at GRSM and MACA, respectively. The similarity between NOz and O3 process budgets suggests a close association between nitrogen oxides and effective O3 production at these rural locations.
Selective Listening Point Audio Based on Blind Signal Separation and Stereophonic Technology
NASA Astrophysics Data System (ADS)
Niwa, Kenta; Nishino, Takanori; Takeda, Kazuya
A sound field reproduction method is proposed that uses blind source separation and a head-related transfer function. In the proposed system, multichannel acoustic signals captured at distant microphones are decomposed into a set of location/signal pairs of virtual sound sources based on frequency-domain independent component analysis. After estimating the locations and the signals of the virtual sources, by convolving the controlled acoustic transfer functions with each signal, the spatial sound is constructed at the selected point. In experiments, a sound field made by six sound sources is captured using 48 distant microphones and decomposed into sets of virtual sound sources. Since subjective evaluation shows no significant difference between natural and reconstructed sound when six virtual sources are used, the effectiveness of the decomposing algorithm as well as the virtual source representation is confirmed.
2013-01-01
Background: Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results: Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions: SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing all unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both web and standalone application and the instructions for installation can be downloaded from our web site. PMID:23497726
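The core idea behind SVA, and hence SVAw, can be sketched in a few lines: regress out the modeled primary variable, then take the leading singular vectors of the residual expression matrix as surrogate variables to include as covariates downstream. This is a conceptual sketch only; the Bioconductor package adds, among other refinements, permutation-based selection of the number of surrogate variables.

import numpy as np

def surrogate_variables(expr, design, n_sv=2):
    """expr: genes x samples expression matrix; design: samples x covariates
    model matrix for the primary variable(s) of interest."""
    beta, *_ = np.linalg.lstsq(design, expr.T, rcond=None)
    resid = expr.T - design @ beta        # samples x genes residual matrix
    u, s, vt = np.linalg.svd(resid, full_matrices=False)
    return u[:, :n_sv]                    # columns are the surrogate variables

rng = np.random.default_rng(2)
expr = rng.normal(size=(1000, 20))        # synthetic 1000-gene, 20-sample study
design = np.column_stack([np.ones(20), rng.integers(0, 2, 20)])  # intercept + group label
print(surrogate_variables(expr, design).shape)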
Sources of referral information: a marketing analysis of physician behavior.
Powers, T L; Swan, J E; Taylor, J A; Bendall, D
1998-01-01
The referral process is an important means of obtaining patients and it is necessary to determine ways of influencing the referral process to increase the patient base. This article reports research based on a survey of the referral habits of 806 primary care physicians. The results are examined in the context of physician receptivity to marketer-controlled versus health services sources of referral information.
Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mcmanus, John William
1992-01-01
Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
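A blackboard system reduces to three pieces: a shared data store, knowledge sources defined by precondition/action pairs, and a control loop that schedules whichever sources are ready. A minimal serial sketch follows; a concurrent design, as in the thesis, would instead dispatch ready knowledge sources in parallel.

class Blackboard:
    """Shared store that all knowledge sources read from and write to."""
    def __init__(self):
        self.data = {}

class KnowledgeSource:
    def can_contribute(self, bb):   # precondition test
        raise NotImplementedError
    def contribute(self, bb):       # action
        raise NotImplementedError

class Tokenizer(KnowledgeSource):
    def can_contribute(self, bb):
        return "text" in bb.data and "tokens" not in bb.data
    def contribute(self, bb):
        bb.data["tokens"] = bb.data["text"].split()

class Counter(KnowledgeSource):
    def can_contribute(self, bb):
        return "tokens" in bb.data and "count" not in bb.data
    def contribute(self, bb):
        bb.data["count"] = len(bb.data["tokens"])

def control_loop(bb, sources):
    progress = True
    while progress:                 # run until no source can contribute
        progress = False
        for ks in sources:
            if ks.can_contribute(bb):
                ks.contribute(bb)
                progress = True

bb = Blackboard()
bb.data["text"] = "blackboard systems coordinate specialized knowledge sources"
control_loop(bb, [Counter(), Tokenizer()])  # order does not matter: control is opportunistic
print(bb.data["count"])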
Bai, Mingsian R; Li, Yi; Chiang, Yi-Hao
2017-10-01
A unified framework is proposed for analysis and synthesis of two-dimensional spatial sound fields in reverberant environments. In the sound field analysis (SFA) phase, an unbaffled 24-element circular microphone array is utilized to encode the sound field based on plane-wave decomposition. Depending on the sparsity of the sound sources, the SFA stage can be implemented in two manners. For sparse-source scenarios, a one-stage algorithm based on compressive sensing is utilized. Alternatively, a two-stage algorithm can be used, where the minimum power distortionless response beamformer is used to localize the sources and a Tikhonov regularization algorithm is used to extract the source amplitudes. In the sound field synthesis (SFS) phase, a 32-element rectangular loudspeaker array is employed to decode the target sound field using the pressure matching technique. To establish the room response model, as required in the pressure matching step of the SFS phase, an SFA technique for nonsparse-source scenarios is utilized. The choice of regularization parameters is vital to the reproduced sound field. In the SFS phase, three SFS approaches are compared in terms of localization performance and voice reproduction quality. Experimental results obtained in a reverberant room are presented and reveal that an accurate room response model is vital to immersive rendering of the reproduced sound field.
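The pressure-matching decode step amounts to a regularized least-squares solve per frequency bin; a minimal sketch with a synthetic single-frequency transfer matrix, where the regularization weight lam is the tuning parameter whose choice the paper highlights:

import numpy as np

def pressure_matching_weights(G, p, lam=1e-2):
    """Tikhonov-regularized least squares: minimize ||G w - p||^2 + lam * ||w||^2.
    G: (control points x loudspeakers) room transfer matrix at one frequency;
    p: target pressures at the control points."""
    n = G.shape[1]
    return np.linalg.solve(G.conj().T @ G + lam * np.eye(n), G.conj().T @ p)

rng = np.random.default_rng(4)
G = rng.normal(size=(64, 32)) + 1j * rng.normal(size=(64, 32))  # hypothetical: 64 points, 32 speakers
p = rng.normal(size=64) + 1j * rng.normal(size=64)
w = pressure_matching_weights(G, p)
print("relative matching error:", np.linalg.norm(G @ w - p) / np.linalg.norm(p))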
Atayero, Aderemi A; Popoola, Segun I; Egeonu, Jesse; Oludayo, Olumuyiwa
2018-08-01
Citation is one of the important metrics used in measuring the relevance and impact of research publications. The potential of citation analytics may be exploited to understand the gains of publishing scholarly peer-reviewed research outputs in either Open Access (OA) sources or Subscription-Based (SB) sources in the bid to increase citation impact. However, relevant data required for such comparative analysis must be freely accessible for evidence-based findings and conclusions. In this data article, citation scores (CiteScores) of 2542 OA sources and 15,040 SB sources indexed in Scopus from 2014 to 2016 were presented and analyzed based on a set of five inclusion criteria. A robust dataset, which contains the CiteScores of the OA and SB publication sources included, is attached as supplementary material to this data article to facilitate further reuse. Descriptive statistics and frequency distributions of OA CiteScores and SB CiteScores are presented in tables. Boxplot representations and scatter plots are provided to show the statistical distributions of OA CiteScores and SB CiteScores across the three sub-categories (Book Series, Journal, and Trade Journal). Correlation coefficient and p-value matrices are made available within the data article. In addition, Probability Density Functions (PDFs) and Cumulative Distribution Functions (CDFs) of OA CiteScores and SB CiteScores are computed and the results are presented using tables and graphs. Furthermore, Analysis of Variance (ANOVA) and multiple comparison post-hoc tests are conducted to understand the statistical difference (and its significance, if any) in the citation impact of OA publication sources and SB publication sources based on CiteScore. In the long run, the data provided in this article will help policy makers and researchers in Higher Education Institutions (HEIs) to identify the appropriate publication source type and category for dissemination of scholarly research findings with maximum citation impact.
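The group comparison described, e.g., a one-way ANOVA on CiteScores across source types, takes a few lines with SciPy; the two samples below are synthetic placeholders matched only in size to the article's 2542 OA and 15,040 SB sources.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
oa = rng.lognormal(mean=0.2, sigma=0.6, size=2542)    # hypothetical OA CiteScores
sb = rng.lognormal(mean=0.3, sigma=0.6, size=15040)   # hypothetical SB CiteScores

f_stat, p_val = stats.f_oneway(oa, sb)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.3g}")
print("medians (OA, SB):", np.median(oa), np.median(sb))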
Spatial assessment of air quality patterns in Malaysia using multivariate analysis
NASA Astrophysics Data System (ADS)
Dominick, Doreena; Juahir, Hafizan; Latif, Mohd Talib; Zain, Sharifuddin M.; Aris, Ahmad Zaharin
2012-12-01
This study aims to investigate possible sources of air pollutants and the spatial patterns within the eight selected Malaysian air monitoring stations based on a two-year database (2008-2009). The multivariate analysis was applied on the dataset. It incorporated Hierarchical Agglomerative Cluster Analysis (HACA) to assess the spatial patterns, Principal Component Analysis (PCA) to determine the major sources of the air pollution and Multiple Linear Regression (MLR) to assess the percentage contribution of each air pollutant. The HACA results grouped the eight monitoring stations into three different clusters, based on the characteristics of the air pollutants and meteorological parameters. The PCA analysis showed that the major sources of air pollution were emissions from motor vehicles, aircraft, industries and areas of high population density. The MLR analysis demonstrated that the main pollutant contributing to variability in the Air Pollutant Index (API) at all stations was particulate matter with a diameter of less than 10 μm (PM10). Further MLR analysis showed that the main air pollutant influencing the high concentration of PM10 was carbon monoxide (CO). This was due to combustion processes, particularly originating from motor vehicles. Meteorological factors such as ambient temperature, wind speed and humidity were also noted to influence the concentration of PM10.
Analysis and Simulation of Far-Field Seismic Data from the Source Physics Experiment
Pitarka, Arben; Mellors, Robert J.; Rodgers, Arthur J.; Sean...
2012-09-01
...Security Site (NNSS) provides new data for investigating the excitation and propagation of seismic waves generated by buried explosions. A particular... seismic model. The 3D seismic model includes surface topography. It is based on regional geological data, with material properties constrained by shallow...
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
Analysis of jet-airfoil interaction noise sources by using a microphone array technique
NASA Astrophysics Data System (ADS)
Fleury, Vincent; Davy, Renaud
2016-03-01
The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought. It is defined as the optimal solution to a minimal error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit both in supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For high Strouhal numbers, higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear-layer near the flap is observed, too. Indications of acoustic reflections on the airfoil are also discerned.
EEG and MEG data analysis in SPM8.
Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl
2011-01-01
SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.
Kharroubi, Adel; Gargouri, Dorra; Baati, Houda; Azri, Chafai
2012-06-01
Concentrations of selected heavy metals (Cd, Pb, Zn, Cu, Mn, and Fe) in surface sediments from 66 sites in both northern and eastern Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia) were studied in order to understand current metal contamination due to the urbanization and economic development of several nearby coastal regions of the Gulf of Gabès. Multiple approaches were applied for the sediment quality assessment. These approaches were based on GIS coupled with chemometric methods (enrichment factors, geoaccumulation index, principal component analysis, and cluster analysis). Enrichment factors and principal component analysis revealed two distinct groups of metals. The first group corresponded to Fe and Mn derived from natural sources, and the second group contained Cd, Pb, Zn, and Cu originating from man-made sources. For these latter metals, cluster analysis showed two distinct distributions in the selected areas. They were attributed to temporal and spatial variations of contaminant source inputs. The geoaccumulation index (Igeo) values indicated that only Cd, Pb, and Cu can be considered moderate to extreme pollutants in the studied sediments.
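The geoaccumulation index used above has a one-line definition; a sketch with hypothetical concentrations:

import numpy as np

def igeo(c_n, b_n):
    """Igeo = log2(Cn / (1.5 * Bn)): Cn is the measured concentration, Bn the
    geochemical background, and the factor 1.5 absorbs natural background variability."""
    return np.log2(c_n / (1.5 * b_n))

print(igeo(c_n=2.4, b_n=0.3))  # hypothetical Cd values (mg/kg)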
Spectroscopic characterization of low dose rate brachytherapy sources
NASA Astrophysics Data System (ADS)
Beach, Stephen M.
The low dose rate (LDR) brachytherapy seeds employed in permanent radioactive-source implant treatments usually use one of two radionuclides, 125I or 103Pd. The theoretically expected spectroscopic output from these sources can be obtained via Monte Carlo calculation based upon seed dimensions and materials as well as the bare-source photon emissions for that specific radionuclide. However, the discrepancies resulting from inconsistent manufacturing of sources within model groups, together with simplified Monte Carlo calculational geometries, ultimately result in undesirably large uncertainties in the Monte Carlo calculated values. This dissertation describes experimentally attained spectroscopic outputs of the clinically used brachytherapy sources in air and in liquid water. Such knowledge can then be applied to characterize these sources by a more fundamental and metrologically pure classification, that of energy-based dosimetry. The spectroscopic results contained within this dissertation can be utilized in the verification and benchmarking of Monte Carlo calculational models of these brachytherapy sources. This body of work was undertaken to establish a usable spectroscopy system and analysis methods for the meaningful study of LDR brachytherapy seeds. The development of a correction algorithm and the analysis of the resultant spectroscopic measurements are presented. The characterization of the spectrometer, and the subsequent deconvolution of the measured spectrum to obtain the true spectrum free of any perturbations caused by the spectrometer itself, is an important contribution of this work. The approach of spectroscopic deconvolution that was applied in this work is derived in detail and applied to the physical measurements. In addition, the spectroscopically based analogs to the LDR dosimetry parameters that are currently employed are detailed, as well as the development of the theory and measurement methods used to arrive at these analogs. Several dosimetrically relevant water-equivalent plastics were also investigated for their transmission properties within a liquid water environment, as well as in air. The framework for the accurate spectrometry of LDR sources is established as a result of this dissertation work. In addition to the measurement and analysis methods, this work presents the basic measured spectroscopic characteristics of each LDR seed currently in use in the clinic today.
In Search of a Pony: Sources, Methods, Outcomes, and Motivated Reasoning.
Stone, Marc B
2018-05-01
It is highly desirable to be able to evaluate the effect of policy interventions. Such evaluations should have expected outcomes based upon sound theory and be carefully planned, objectively evaluated and prospectively executed. In many cases, however, assessments originate with investigators' poorly substantiated beliefs about the effects of a policy. Instead of designing studies that test falsifiable hypotheses, these investigators adopt methods and data sources that serve as little more than descriptions of these beliefs in the guise of analysis. Interrupted time series analysis is one of the most popular forms of analysis used to present these beliefs. It is intuitively appealing but, in most cases, it is based upon false analogies, fallacious assumptions and analytical errors.
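For readers unfamiliar with the technique being critiqued: interrupted time series analysis is typically fit as a segmented regression with terms for the baseline trend, a level change, and a slope change at the intervention. The synthetic-data sketch below shows the mechanics; the article's caution is precisely that such fits inherit strong, often unexamined assumptions about trend form and confounding.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
t = np.arange(60)                 # e.g., 60 monthly observations
t0 = 30                           # hypothetical intervention time
y = 10 + 0.05 * t - 1.5 * (t >= t0) + rng.normal(scale=0.8, size=60)

X = np.column_stack([
    t,                            # baseline trend
    (t >= t0).astype(float),      # level change at the intervention
    np.clip(t - t0, 0, None),     # slope change after the intervention
])
fit = sm.OLS(y, sm.add_constant(X)).fit()
print(fit.params)                 # intercept, trend, level change, slope change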
NASA Astrophysics Data System (ADS)
Yu, Tong; Ye, Yue-li
2018-05-01
Based on data from the Statistical Yearbook of Jiangxi (2007-2016), this study analyzes the development of the inbound tourist source market in Jiangxi using the geographic concentration index and market competition state analysis. The results show that the geographic concentration index of the inbound tourism market in Jiangxi exhibits a continuing downward trend: the source countries of Jiangxi's inbound tourism are becoming more dispersed, the tourist markets are diversifying, and the inbound tourism market is tending toward stability. The analysis of the market competition state further shows that inbound tourism development in Jiangxi has shifted from rapid development to stable development.
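The geographic concentration index referenced here is conventionally G = 100 * sqrt(sum_i (x_i / T)^2), where x_i is the number of arrivals from source country i and T the total; a sketch with hypothetical arrival counts:

import numpy as np

def geographic_concentration_index(arrivals):
    """Higher G means inbound tourism depends on fewer source countries."""
    x = np.asarray(arrivals, dtype=float)
    return 100.0 * np.sqrt(np.sum((x / x.sum()) ** 2))

print(geographic_concentration_index([120000, 80000, 40000, 10000]))  # hypothetical counts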
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apte, A; Veeraraghavan, H; Oh, J
Purpose: To present an open source and free platform to facilitate radiomics research: the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various kinds of image modalities like CT, PET, MR, SPECT, US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features like 1st order statistics, gray-scale co-occurrence and zone-size matrix based texture features and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of code and features for a wide audience. Open-source software developed with other programming languages is integrated to enhance various components of this toolbox, for example, the Java-based DCM4CHE for import of DICOM and R for statistical analysis. Results: The Radiomics toolbox will be distributed as an open source, GNU copyrighted software. The toolbox was prototyped for modeling an Oropharyngeal PET dataset at MSKCC. The analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
Bronchart, Filip; De Paepe, Michel; Dewulf, Jo; Schrevens, Eddie; Demeyer, Peter
2013-04-15
In Flanders and the Netherlands greenhouse production systems produce economically important quantities of vegetables, fruit and ornamentals. Indoor environmental control has resulted in high primary energy use. Until now, the research on saving primary energy in greenhouse systems has been mainly based on analysis of energy balances. However, according to thermodynamic theory, an analysis based on the concept of exergy (free energy) as well as energy can result in new insights and primary energy savings. Therefore in this paper, we analyse the exergy and energy of various processes, inputs and outputs of a general greenhouse system. A total system analysis is then performed by linking the exergy analysis with a dynamic greenhouse climate growth simulation model. The exergy analysis indicates that some processes ("Sources") lie at the origin of several other processes, both destroying the exergy of primary energy inputs. The exergy destruction of these Sources is caused primarily by heat and vapour loss. Their impact can be compensated by exergy input from heating, solar radiation, or both. If the exergy destruction of these Sources is reduced, the necessary compensation can also be reduced. This can be accomplished through insulating the greenhouse and making the building more airtight. Other necessary Sources, namely transpiration and loss of CO2, have a low exergy destruction compared to the other Sources. They are therefore the best candidates for "pump" technologies ("vapour heat pump" and "CO2 pump") designed to have a low primary energy use. The combination of these proposed technologies results in an exergy-efficient greenhouse with the highest primary energy savings. It can be concluded that exergy analyses add information beyond energy analyses alone, and they support the development of primary-energy-efficient greenhouse systems. Copyright © 2013 Elsevier Ltd. All rights reserved.
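The central quantity in such an analysis is the exergy of a heat flow, Ex = Q * (1 - T0/T): heat delivered near the dead-state (ambient) temperature carries little exergy even when the energy amount is large, which is why greenhouse-temperature heat is cheap in exergy terms. A minimal sketch with an assumed dead-state temperature:

def heat_exergy(q_joules, t_kelvin, t0_kelvin=283.0):
    """Exergy of heat Q delivered at temperature T against dead state T0."""
    return q_joules * (1.0 - t0_kelvin / t_kelvin)

# the same 1 MJ of heat at greenhouse temperature vs. boiler temperature
print(heat_exergy(1e6, t_kelvin=293.0), heat_exergy(1e6, t_kelvin=773.0))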
Vakorin, Vasily A.; Mišić, Bratislav; Krakovska, Olga; McIntosh, Anthony Randal
2011-01-01
Variability in source dynamics across the sources in an activated network may be indicative of how the information is processed within a network. Information-theoretic tools allow one not only to characterize local brain dynamics but also to describe interactions between distributed brain activity. This study follows such a framework and explores the relations between signal variability and asymmetry in mutual interdependencies in a data-driven pipeline of non-linear analysis of neuromagnetic sources reconstructed from human magnetoencephalographic (MEG) data collected as a reaction to a face recognition task. Asymmetry in non-linear interdependencies in the network was analyzed using transfer entropy, which quantifies predictive information transfer between the sources. Variability of the source activity was estimated using multi-scale entropy, quantifying the rate of which information is generated. The empirical results are supported by an analysis of synthetic data based on the dynamics of coupled systems with time delay in coupling. We found that the amount of information transferred from one source to another was correlated with the difference in variability between the dynamics of these two sources, with the directionality of net information transfer depending on the time scale at which the sample entropy was computed. The results based on synthetic data suggest that both time delay and strength of coupling can contribute to the relations between variability of brain signals and information transfer between them. Our findings support the previous attempts to characterize functional organization of the activated brain, based on a combination of non-linear dynamics and temporal features of brain connectivity, such as time delay. PMID:22131968
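Multi-scale entropy is sample entropy recomputed on coarse-grained copies of a signal; a compact, illustration-only sample entropy sketch (O(n^2) memory, so only for short series):

import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts matching template pairs of length m,
    A of length m+1, with tolerance r times the signal SD (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def pair_matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return ((d <= tol).sum() - len(t)) / 2.0   # exclude self-matches
    return -np.log(pair_matches(m + 1) / pair_matches(m))

rng = np.random.default_rng(4)
print(sample_entropy(rng.normal(size=500)))
# for multi-scale entropy, coarse-grain x by non-overlapping means at each scale and recompute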
NASA Astrophysics Data System (ADS)
Zheng, Huang; Kong, Shaofei; Xing, Xinli; Mao, Yao; Hu, Tianpeng; Ding, Yang; Li, Gang; Liu, Dantong; Li, Shuanglin; Qi, Shihua
2018-04-01
Oil and natural gas are important for energy supply around the world. The exploration, drilling, transportation and processing in oil and gas regions can release large quantities of volatile organic compounds (VOCs). To understand the VOC levels, compositions and sources in such regions, an oil and gas station in northwest China was chosen as the research site and 57 VOCs designated as photochemical precursors were continuously measured for an entire year (September 2014-August 2015) using an online monitoring system. The average concentration of total VOCs was 297 ± 372 ppbv and the main contributor was alkanes, accounting for 87.5 % of the total VOCs. According to the propylene-equivalent concentration and maximum incremental reactivity methods, alkanes were identified as the most important VOC group for ozone formation potential. Positive matrix factorization (PMF) analysis showed that the annual average contributions from natural gas, fuel evaporation, combustion sources, oil refining processes and asphalt (anthropogenic and natural sources) to the total VOCs were 62.6 ± 3.04, 21.5 ± .99, 10.9 ± 1.57, 3.8 ± 0.50 and 1.3 ± 0.69 %, respectively. The five identified VOC sources exhibited various diurnal patterns due to their different emission patterns and the impact of meteorological parameters. Potential source contribution function (PSCF) and concentration-weighted trajectory (CWT) models based on backward trajectory analysis indicated that the five identified sources had similar geographic origins. Raster analysis based on the CWT analysis indicated that local emissions contributed 48.4-74.6 % of the total VOCs. Based on the high-resolution observation data, this study clearly described and analyzed the temporal variation in VOC emission characteristics at a typical oil and gas field, which exhibited different VOC levels, compositions and origins compared with those in urban and industrial areas.
EGRET/COMPTEL Observations of an Unusual, Steep-Spectrum Gamma-Ray Source
NASA Technical Reports Server (NTRS)
Thompson, D. J.; Bertsch, D. L.; Hartman, R. C.; Collmar, W.; Johnson, W. N.
1999-01-01
During analysis of sources below the threshold of the third EGRET catalog, we have discovered a source, named GRO J1400-3956 based on the best position, with a remarkably steep spectrum. Archival analysis of COMPTEL data shows that the spectrum must have a strong turn-over in the energy range between COMPTEL and EGRET. The EGRET data show some evidence of time variability, suggesting an AGN, but the spectral change of slope is larger than that seen for most gamma-ray blazars. The sharp cutoff resembles the high-energy spectral breaks seen in some gamma-ray pulsars. There have as yet been no OSSE observations of this source.
Prioritizing Genes Related to Nicotine Addiction Via a Multi-source-Based Approach.
Liu, Xinhua; Liu, Meng; Li, Xia; Zhang, Lihua; Fan, Rui; Wang, Ju
2015-08-01
Nicotine has a broad impact on both the central and peripheral nervous systems. Over the past decades, an increasing number of genes potentially involved in nicotine addiction have been identified by different technical approaches. However, the molecular mechanisms underlying nicotine addiction remain largely unknown. In this situation, prioritizing the candidate genes for further investigation is becoming increasingly important. In this study, we present a multi-source-based gene prioritization approach for nicotine addiction that utilizes the vast amount of information generated by nicotine addiction studies over the past years. In this approach, we first collected and curated genes from studies in four categories, i.e., genetic association analysis, genetic linkage analysis, high-throughput gene/protein expression analysis, and literature search of single gene/protein-based studies. Based on these resources, the genes were scored and a weight value was determined for each category. Finally, the genes were ranked by their combined scores, and 220 genes were selected as the prioritized nicotine addiction-related genes. Evaluation suggested that the prioritized genes were promising targets for further analysis and replication study.
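The combined score described above is, in essence, a weighted sum of per-category evidence scores. The sketch below illustrates the idea with made-up genes, scores and category weights; the paper's actual scoring and weighting scheme is more elaborate.

```python
# Hypothetical genes, per-category scores and category weights for illustration.
evidence = {
    "association": {"CHRNA4": 0.9, "DRD2": 0.7, "CHRNB2": 0.8},
    "linkage":     {"CHRNA4": 0.5, "DRD2": 0.6},
    "expression":  {"DRD2": 0.8, "CHRNB2": 0.4},
    "literature":  {"CHRNA4": 0.7, "CHRNB2": 0.6},
}
weights = {"association": 0.35, "linkage": 0.15, "expression": 0.25, "literature": 0.25}

# Combined score: weighted sum of category scores; missing evidence counts as 0.
genes = {g for cat in evidence.values() for g in cat}
combined = {g: sum(weights[c] * evidence[c].get(g, 0.0) for c in evidence) for g in genes}
for gene, score in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(f"{gene}\t{score:.3f}")
```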
A Wavelet-Based Algorithm for the Spatial Analysis of Poisson Data
NASA Astrophysics Data System (ADS)
Freeman, P. E.; Kashyap, V.; Rosner, R.; Lamb, D. Q.
2002-01-01
Wavelets are scalable, oscillatory functions that deviate from zero only within a limited spatial regime and have average value zero, and thus may be used to simultaneously characterize the shape, location, and strength of astronomical sources. But in addition to their use as source characterizers, wavelet functions are rapidly gaining currency within the source detection field. Wavelet-based source detection involves the correlation of scaled wavelet functions with binned, two-dimensional image data. If the chosen wavelet function exhibits the property of vanishing moments, significantly nonzero correlation coefficients will be observed only where there are high-order variations in the data; e.g., they will be observed in the vicinity of sources. Source pixels are identified by comparing each correlation coefficient with its probability sampling distribution, which is a function of the (estimated or a priori known) background amplitude. In this paper, we describe the mission-independent, wavelet-based source detection algorithm ``WAVDETECT,'' part of the freely available Chandra Interactive Analysis of Observations (CIAO) software package. Our algorithm uses the Marr, or ``Mexican Hat'' wavelet function, but may be adapted for use with other wavelet functions. Aspects of our algorithm include: (1) the computation of local, exposure-corrected normalized (i.e., flat-fielded) background maps; (2) the correction for exposure variations within the field of view (due to, e.g., telescope support ribs or the edge of the field); (3) its applicability within the low-counts regime, as it does not require a minimum number of background counts per pixel for the accurate computation of source detection thresholds; (4) the generation of a source list in a manner that does not depend upon a detailed knowledge of the point spread function (PSF) shape; and (5) error analysis. These features make our algorithm considerably more general than previous methods developed for the analysis of X-ray image data, especially in the low-counts regime. We demonstrate the robustness of WAVDETECT by applying it to an image from an idealized detector with a spatially invariant Gaussian PSF and an exposure map similar to that of the Einstein IPC; to Pleiades Cluster data collected by the ROSAT PSPC; and to a simulated Chandra ACIS-I image of the Lockman Hole region.
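A minimal version of the core correlation step might look as follows: build a zero-mean Marr ('Mexican hat') kernel, correlate it with a binned Poisson image, and flag pixels with implausibly large coefficients. WAVDETECT derives thresholds from the exact sampling distribution of the coefficients given the background map; the z-score cut below is only a crude stand-in, and all data are synthetic.

```python
import numpy as np
from scipy.signal import fftconvolve

def mexican_hat_2d(sigma, size):
    """Zero-mean 2D Marr ('Mexican hat') kernel, the negative Laplacian of a
    Gaussian; a flat background therefore correlates to roughly zero."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = (xx**2 + yy**2) / (2.0 * sigma**2)
    return (1.0 - r2) * np.exp(-r2)

rng = np.random.default_rng(1)
img = rng.poisson(2.0, size=(128, 128)).astype(float)  # flat Poisson background
img[64, 64] += 50.0                                    # one point-like source

coeffs = fftconvolve(img, mexican_hat_2d(sigma=2.0, size=17), mode="same")
z = (coeffs - coeffs.mean()) / coeffs.std()            # crude threshold stand-in
print(np.argwhere(z > 5.0))                            # pixels near (64, 64)
```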
Detector location selection based on VIP analysis in near-infrared detection of dural hematoma.
Sun, Qiuming; Zhang, Yanjun; Ma, Jun; Tian, Feng; Wang, Huiquan; Liu, Dongyuan
2018-03-01
Detection of dural hematoma based on multi-channel near-infrared differential absorbance has the advantages of being rapid and non-invasive. The location and number of detectors around the light source are critical to the predictive accuracy of the model for the degree of dural hematoma; rational selection of detector numbers and their distances from the light source is therefore very important. In this paper, a detector position screening method based on Variable Importance in the Projection (VIP) analysis is proposed. A preliminary model based on the Partial Least Squares (PLS) method for the prediction of the dural position μa was established using light absorbance information from 30 detectors located 2.0-5.0 cm from the light source with a 0.1 cm interval. The mean relative error (MRE) of the dural position μa prediction model was 4.08%. After VIP analysis, the number of detectors was reduced from 30 to 4, and the MRE of the dural position μa prediction decreased from 4.08% to 2.06%. The prediction model after VIP detector screening still showed good prediction of the epidural position μa. This study provides a new approach and an important reference for the selection of detector locations in near-infrared dural hematoma detection.
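VIP scores can be computed directly from a fitted PLS model: each variable's weight in every latent component is aggregated in proportion to the variance in y explained by that component. The sketch below, with synthetic absorbance data standing in for the 30 detector positions, assumes scikit-learn's PLSRegression attribute names (x_scores_, x_weights_, y_loadings_); it illustrates the screening idea, not the authors' code.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """Variable Importance in Projection for a fitted PLSRegression model."""
    t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
    p = w.shape[0]
    # Variance in y explained by each latent component (t columns orthogonal).
    s = np.diag(t.T @ t @ q.T @ q)
    wnorm2 = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(p * (wnorm2 @ s) / s.sum())

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 30))   # absorbance at 30 source-detector distances, synthetic
y = X[:, [5, 12, 20, 27]] @ np.array([1.0, 0.8, 0.6, 0.4]) + 0.1 * rng.normal(size=100)

pls = PLSRegression(n_components=4).fit(X, y)
vip = vip_scores(pls)
print(np.argsort(vip)[::-1][:4])  # detector positions to retain
```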
NASA Astrophysics Data System (ADS)
Loye, A.; Jaboyedoff, M.; Pedrazzini, A.
2009-10-01
The availability of high resolution Digital Elevation Models (DEM) at a regional scale enables the analysis of topography with high levels of detail. Hence, a DEM-based geomorphometric approach becomes more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified according to the slope angle distribution deduced from the high resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered as characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. Terrain is considered a potential rockfall source when its slope angle lies above an angle threshold, which is defined where the Gaussian distribution of the morphological unit "rock cliffs" becomes dominant over that of "steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. However, these contain "flat areas", so only the slope angle values above the mode of the Gaussian distribution of the morphological unit "steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3200 km2), Switzerland. The results were compared with rockfall sources observed in the field and through orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology over six different DEM resolutions.
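The decomposition step can be sketched with a one-dimensional Gaussian mixture fitted to the slope-angle sample: the threshold is the first angle at which the component with the highest mean ('rock cliffs') becomes more probable than the next one ('steep slopes'). The slope populations below are synthetic stand-ins, not DEM-derived values.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
slopes = np.concatenate([
    rng.normal(3, 2, 4000),    # plains
    rng.normal(15, 5, 3000),   # footslopes
    rng.normal(32, 6, 2000),   # steep slopes
    rng.normal(55, 7, 1000),   # rock cliffs
]).reshape(-1, 1)              # slope angles (degrees), synthetic

gmm = GaussianMixture(n_components=4, random_state=0).fit(slopes)
order = np.argsort(gmm.means_.ravel())
steep, cliff = order[-2], order[-1]

# Threshold: first angle at which the 'rock cliffs' component is more
# probable than the 'steep slopes' component.
angles = np.linspace(0.0, 90.0, 901).reshape(-1, 1)
resp = gmm.predict_proba(angles)
threshold = angles[np.argmax(resp[:, cliff] > resp[:, steep]), 0]
print(f"potential rockfall sources: slope > {threshold:.1f} deg")
```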
NASA Astrophysics Data System (ADS)
Aksenov, V. N.; Angeluts, A. A.; Balakin, A. V.; Maksimov, E. M.; Ozheredov, I. A.; Shkurinov, A. P.
2018-05-01
We demonstrate the possibility of using a multi-frequency terahertz source to identify substances based on the analysis of the relative amplitudes of the terahertz waves scattered by the object. We present experimental results on the scattering of quasi-monochromatic radiation, generated by a two-frequency terahertz quantum-cascade laser, from the surface of samples containing inclusions of absorbing substances. It is shown that the spectral features of absorption of these substances within the terahertz frequency range manifest themselves in variations of the amplitudes of the waves at frequencies of 3.0 and 3.7 THz, which are scattered by the samples under consideration.
Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area
NASA Astrophysics Data System (ADS)
Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan
2018-01-01
The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in a storage tank area directly or indirectly lead to the occurrence of large safety accidents. According to the three kinds of hazard source theory and consequence-cause analysis of large safety accidents, the dangerous sources of large safety accidents in tank areas are analyzed from four aspects: energy sources, direct accident causes, missing management, and environmental impact. Based on the analysis of the three kinds of hazard sources and the environmental analysis, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four kinds of risk factors and of the factors within each kind are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, the direct causes and energy sources. It should be noted that although the direct causes have relatively low overall importance, within them the failure of emergency measures and the failure of prevention and control facilities carry greater weight.
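For reference, the AHP weight computation reduces to the principal eigenvector of a pairwise-comparison matrix, with a consistency check on the corresponding eigenvalue. The comparison matrix below is a hypothetical illustration, not the paper's elicited judgments.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix (Saaty's 1-9 scale) over the four
# factor groups: management, environment, direct cause, energy source.
A = np.array([
    [1.0, 3.0, 5.0, 5.0],
    [1/3, 1.0, 3.0, 3.0],
    [1/5, 1/3, 1.0, 2.0],
    [1/5, 1/3, 1/2, 1.0],
])

# AHP weights: normalized principal right eigenvector of A.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with random index RI = 0.9 for n = 4;
# CR < 0.1 is the usual acceptability criterion.
ci = (vals.real[k] - len(A)) / (len(A) - 1)
print("weights:", w.round(3), "CR:", round(ci / 0.9, 3))
```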
Research on response spectrum of dam based on scenario earthquake
NASA Astrophysics Data System (ADS)
Zhang, Xiaoliang; Zhang, Yushan
2017-10-01
Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential seismic source zone with the greatest contribution to the site hazard is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario earthquake is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and the structural factors, and combines the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easier to accept and provides a basis for the seismic design of hydraulic engineering.
Ueda, Masanori; Iwaki, Masafumi; Nishihara, Tokihiro; Satoh, Yoshio; Hashimoto, Ken-ya
2008-04-01
This paper describes a circuit model for the analysis of nonlinearity in filters based on radio-frequency (RF) bulk acoustic wave (BAW) resonators. The nonlinear output is expressed by a current source connected in parallel with the linear resonator. The amplitude of the nonlinear current source is set proportional to the product of the linear currents flowing in the resonator. Thus, the nonlinear analysis can be performed with common linear analysis tools, even for complex device structures. The analysis is applied to a ladder-type RF BAW filter, and the frequency dependence of the nonlinear output is discussed. Furthermore, the analysis is verified through comparison with experiments.
Feng, Jingjing; Chen, Xiaolin; Jia, Lei; Liu, Qizhen; Chen, Xiaojia; Han, Deming; Cheng, Jinping
2018-04-10
Wastewater treatment plants (WWTPs) are the most common form of industrial and municipal wastewater control. To evaluate the performance of wastewater treatment and the potential risk of treated wastewater to aquatic life and human health, the influent and effluent concentrations of nine toxic metals were determined in 12 full-scale WWTPs in Shanghai, China. The performance was evaluated based on national standards for reclamation and aquatic criteria published by the US EPA, and by comparison with other full-scale WWTPs in different countries. Potential sources of heavy metals were recognized using partial correlation analysis, hierarchical clustering, and principal component analysis (PCA). Results indicated a significant treatment effect on As, Cd, Cr, Cu, Hg, Mn, Pb, and Zn. The removal efficiencies ranged from 92% (Cr) to 16.7% (Hg). The results indicated a potential acute and/or chronic effect of Cu, Ni, Pb, and Zn on aquatic life and a potential harmful effect of As and Mn on human health through the consumption of water and/or organisms. The results of partial correlation analysis, hierarchical clustering based on cosine distance, and PCA, which were consistent with each other, suggested a common source of Cd, Cr, Cu, and Pb and a common source of As, Hg, Mn, Ni, and Zn. Hierarchical clustering based on Jaccard similarity suggested a common source of Cd, Hg, and Ni, which was statistically supported by Fisher's exact test.
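The multivariate source-identification step can be sketched as follows: PCA on standardized concentrations groups metals that load on a common component, and hierarchical clustering of the metal profiles under cosine distance gives an independent grouping. The 12 x 9 concentration matrix below is a random stand-in for the plant data.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

metals = ["As", "Cd", "Cr", "Cu", "Hg", "Mn", "Ni", "Pb", "Zn"]
rng = np.random.default_rng(0)
X = np.abs(rng.normal(1.0, 0.5, size=(12, 9)))  # 12 WWTPs x 9 metals, stand-in

# PCA: metals with large loadings on the same component suggest a common source.
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))
print(dict(zip(metals, pca.components_[0].round(2))))

# Hierarchical clustering of the metal profiles (columns) under cosine distance.
Z = linkage(X.T, method="average", metric="cosine")
print(dict(zip(metals, fcluster(Z, t=2, criterion="maxclust"))))
```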
NASA Astrophysics Data System (ADS)
Weise, Sebastian; Steinbach, Bastian; Biermann, Steffen
2016-03-01
The JSIR350 series sources are MEMS-based infrared emitters. These IR sources are characterized by a high radiation output; thus, they are excellent for NDIR gas analysis and ideally suited for use with our pyroelectric or thermopile detectors. The MEMS chips used in Micro-Hybrid's infrared emitters consist of nano-amorphous carbon (NAC) and are produced in the USA. All Micro-Hybrid emitters are designed and specified to operate up to 850°C. The improvements we have made in the source's packaging enable us to provide IR sources with the best performance on the market. This new technology enables us to seal the housings of infrared radiation sources with soldered infrared filters or windows, making the parts impenetrable to gases. Micro-Hybrid provides various ways of adapting the MEMS-based infrared emitter JSIR350 to customer specifications, such as specific burn-in parameters/characteristics and different industry-standard housings, producible with a customized cap, reflector or pin-out.
Spatiotemporal source tuning filter bank for multiclass EEG based brain computer interfaces.
Acharya, Soumyadipta; Mollazadeh, Moshen; Murari, Kartikeya; Thakor, Nitish
2006-01-01
Non-invasive brain-computer interfaces (BCI) allow people to communicate by modulating features of their electroencephalogram (EEG). Spatiotemporal filtering has a vital role in multi-class, EEG-based BCI. In this study, we used a novel combination of principal component analysis, independent component analysis and dipole source localization to design a spatiotemporal multiple source tuning (SPAMSORT) filter bank, each channel of which was tuned to the activity of an underlying dipole source. Changes in the event-related spectral perturbation (ERSP) were measured and used to train a linear support vector machine to classify between four classes of motor imagery tasks (left hand, right hand, foot and tongue) for one subject. ERSP values were significantly (p<0.01) different across tasks and discriminated between tasks better (p<0.01) than conventional spatial filtering methods (large Laplacian and common average reference). Classification resulted in an average accuracy of 82.5%. This approach could lead to promising BCI applications such as control of a prosthesis with multiple degrees of freedom.
Tunno, Brett J; Dalton, Rebecca; Michanowicz, Drew R; Shmool, Jessie L C; Kinnee, Ellen; Tripathy, Sheila; Cambal, Leah; Clougherty, Jane E
2016-01-01
Health effects of fine particulate matter (PM2.5) vary by chemical composition, and composition can help to identify key PM2.5 sources across urban areas. Further, this intra-urban spatial variation in concentrations and composition may vary with meteorological conditions (e.g., mixing height). Accordingly, we hypothesized that spatial sampling during atmospheric inversions would help to better identify localized source effects, and reveal more distinct spatial patterns in key constituents. We designed a 2-year monitoring campaign to capture fine-scale intra-urban variability in PM2.5 composition across Pittsburgh, PA, and compared both spatial patterns and source effects during “frequent inversion” hours vs 24-h weeklong averages. Using spatially distributed programmable monitors, and a geographic information systems (GIS)-based design, we collected PM2.5 samples across 37 sampling locations per year to capture variation in local pollution sources (e.g., proximity to industry, traffic density) and terrain (e.g., elevation). We used inductively coupled plasma mass spectrometry (ICP-MS) to determine elemental composition, and unconstrained factor analysis to identify source suites by sampling scheme and season. We examined spatial patterning in source factors using land use regression (LUR), wherein GIS-based source indicators served to corroborate factor interpretations. Under both summer sampling regimes, and for winter inversion-focused sampling, we identified six source factors, characterized by tracers associated with brake and tire wear, steel-making, soil and road dust, coal, diesel exhaust, and vehicular emissions. For winter 24-h samples, four factors suggested traffic/fuel oil, traffic emissions, coal/industry, and steel-making sources. In LURs, as hypothesized, GIS-based source terms better explained spatial variability in inversion-focused samples, including a greater contribution from roadway, steel, and coal-related sources. Factor analysis produced source-related constituent suites under both sampling designs, though factors were more distinct under inversion-focused sampling. PMID:26507005
Ion diffusion may introduce spurious current sources in current-source density (CSD) analysis.
Halnes, Geir; Mäki-Marttunen, Tuomo; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T
2017-07-01
Current-source density (CSD) analysis is a well-established method for analyzing recorded local field potentials (LFPs), that is, the low-frequency part of extracellular potentials. Standard CSD theory is based on the assumption that all extracellular currents are purely ohmic, and thus neglects the possible impact from ionic diffusion on recorded potentials. However, it has previously been shown that in physiological conditions with large ion-concentration gradients, diffusive currents can evoke slow shifts in extracellular potentials. Using computer simulations, we here show that diffusion-evoked potential shifts can introduce errors in standard CSD analysis, and can lead to prediction of spurious current sources. Further, we here show that the diffusion-evoked prediction errors can be removed by using an improved CSD estimator which accounts for concentration-dependent effects. NEW & NOTEWORTHY Standard CSD analysis does not account for ionic diffusion. Using biophysically realistic computer simulations, we show that unaccounted-for diffusive currents can lead to the prediction of spurious current sources. This finding may be of strong interest for in vivo electrophysiologists doing extracellular recordings in general, and CSD analysis in particular. Copyright © 2017 the American Physiological Society.
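For context, the standard (ohmic) CSD estimator that the paper examines is essentially a second spatial derivative of the laminar LFP profile scaled by the tissue conductivity. A minimal sketch, with an assumed electrode spacing h and conductivity sigma:

```python
import numpy as np

def standard_csd(lfp, h=1e-4, sigma=0.3):
    """Standard 1D CSD: -sigma * d2(phi)/dz2 via the second spatial difference
    over a laminar probe. Purely ohmic currents are assumed -- exactly the
    assumption shown above to yield spurious sources when diffusion matters.
    lfp: (channels, time) in volts; h: contact spacing in m; sigma: S/m."""
    lfp = np.asarray(lfp, dtype=float)
    d2 = lfp[:-2] - 2.0 * lfp[1:-1] + lfp[2:]
    return -sigma * d2 / h**2   # A/m^3

rng = np.random.default_rng(0)
lfp = rng.normal(scale=1e-4, size=(16, 100))  # toy 16-channel recording
print(standard_csd(lfp).shape)                # (14, 100): edge channels lost
```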
SensA: web-based sensitivity analysis of SBML models.
Floettmann, Max; Uhlendorf, Jannis; Scharp, Till; Klipp, Edda; Spiesser, Thomas W
2014-10-01
SensA is a web-based application for sensitivity analysis of mathematical models. The sensitivity analysis is based on metabolic control analysis, computing the local, global and time-dependent properties of model components. Interactive visualization facilitates interpretation of usually complex results. SensA can contribute to the analysis, adjustment and understanding of mathematical models for dynamic systems. SensA is available at http://gofid.biologie.hu-berlin.de/ and can be used with any modern browser. The source code can be found at https://bitbucket.org/floettma/sensa/ (MIT license) © The Author 2014. Published by Oxford University Press.
A Concept Analysis of Self-Care Based on Islamic Sources.
Marzband, Rahmatollah; Zakavi, Ali Asghar
2017-07-01
This article describes the concept of self-care from Islamic texts. Rodgers' evolutionary model of concept analysis was used in this study. Self-care is a series of activities, for which one is responsible before God, undertaken for health promotion, disease prevention and remedy. It encompasses physical, mental, spiritual, and social dimensions. A comprehensive definition of the concept of self-care ensued from a review of Islamic literature. Since nurses instruct and assist individuals as they engage in self-care, using a comprehensive definition of self-care based on Islamic sources would provide an anchor for them as they interact with Muslim patients. © 2015 NANDA International, Inc.
pyOpenMS: a Python-based interface to the OpenMS mass-spectrometry algorithm library.
Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars
2014-01-01
pyOpenMS is an open-source, Python-based interface to the C++ OpenMS library, providing facile access to a feature-rich, open-source algorithm library for MS-based proteomics analysis. It contains Python bindings that allow raw access to the data structures and algorithms implemented in OpenMS, specifically those for file access (mzXML, mzML, TraML, mzIdentML among others), basic signal processing (smoothing, filtering, de-isotoping, and peak-picking) and complex data analysis (including label-free, SILAC, iTRAQ, and SWATH analysis tools). pyOpenMS thus allows fast prototyping and efficient workflow development in a fully interactive manner (using the interactive Python interpreter) and is also ideally suited for researchers not proficient in C++. In addition, our code to wrap a complex C++ library is completely open-source, allowing other projects to create similar bindings with ease. The pyOpenMS framework is freely available at https://pypi.python.org/pypi/pyopenms while the autowrap tool to create Cython code automatically is available at https://pypi.python.org/pypi/autowrap (both released under the 3-clause BSD licence). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
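A minimal usage example, assuming a local mzML file at the placeholder path 'data.mzML'; the calls shown (MSExperiment, MzMLFile, GaussFilter) follow the pyOpenMS documentation, though exact signatures may differ between releases:

```python
import pyopenms

# Load raw spectra from an mzML file (placeholder path), smooth them in place,
# and print basic per-spectrum information.
exp = pyopenms.MSExperiment()
pyopenms.MzMLFile().load("data.mzML", exp)

gauss = pyopenms.GaussFilter()
gauss.filterExperiment(exp)                  # Gaussian smoothing, in place

for spectrum in exp:
    mz, intensity = spectrum.get_peaks()
    print(spectrum.getMSLevel(), spectrum.getRT(), len(mz))
```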
NASA Astrophysics Data System (ADS)
Lee, G.-S.; Kim, P.-R.; Han, Y.-J.; Holsen, T. M.; Seo, Y.-S.; Yi, S.-M.
2015-11-01
As a global pollutant, mercury (Hg) is of particular concern in East Asia, where anthropogenic emissions are the largest. In this study, speciated Hg concentrations were measured on the westernmost island in Korea, located between China and the Korean mainland, to identify the importance of local, regional and distant Hg sources. Various tools including correlations with other pollutants, conditional probability function, and back-trajectory based analysis consistently indicated that Korean sources were important for gaseous oxidized mercury (GOM), whereas for total gaseous mercury (TGM) and particulate bound mercury (PBM), long-range and regional transport were also important. A trajectory cluster based approach considering both Hg concentration and the fraction of time each cluster was impacting the site was developed to quantify the effect of Korean and out-of-Korea sources. This analysis suggests that Korean sources contributed approximately 55 % of the GOM and PBM, while there were approximately equal contributions from Korean and out-of-Korea sources for the TGM measured at the site. The ratio of GOM / PBM decreased when the site was impacted by long-range transport, suggesting that this ratio may be a useful tool for identifying the relative significance of local sources vs. long-range transport. The secondary formation of PBM through gas-particle partitioning with GOM was found to be important at low temperatures and high relative humidity.
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Knevels, Raphael; Leopold, Philip; Petschko, Helene
2017-04-01
With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the process of generating landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good success rates in objectively detecting landslides in high-resolution topography data by GEOBIA.
Relationship of oil seep in Kudat Peninsula with surrounding rocks based on geochemical analysis
NASA Astrophysics Data System (ADS)
Izzati Azman, Nurul; Nur Fathiyah Jamaludin, Siti
2017-10-01
This study aims to investigate the relation of the oil seepage at the Sikuati area with the structural and petroleum system of the Kudat Peninsula. The abundance of highly carbonaceous rocks with laminations in the Sikuati Member outcrop at the Kudat Peninsula may explain the presence of oil seepage in this area. A detailed geochemical analysis of a source rock sample and oil seepage from the Sikuati area was carried out for their characterization and correlation. The hydrocarbon prospectivity of the Sikuati Member source rock is poor to good, with Total Organic Carbon (TOC) values of 0.11% to 1.48%, and it is categorized as immature to early mature (oil window) with Vitrinite Reflectance (VRo) values of 0.43% to 0.50 %Ro. Based on the biomarker distributions from Gas Chromatography (GC) and Gas Chromatography-Mass Spectrometry (GC-MS) analysis, the source rock sample shows Pr/Ph, CPI and WI values of 2.22 to 2.68, 2.17 to 2.19 and 2.46 to 2.74, respectively, indicating that the source rock is immature and derived from a terrestrial environment. The source rock may be rich in carbonaceous organic matter resulting from planktonic/bacterial activity in a fluvial to fluvio-deltaic environment. Overall, the source rock at outcrop level in the Kudat Peninsula is moderately prolific in terms of prospectivity and maturity. However, at greater depths, more mature source rock can be expected to generate and expel hydrocarbons that migrate through deep-seated faults beneath the Sikuati area.
Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael
2015-05-15
In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damages in the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were additionally significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. The reduction in fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa. Copyright © 2015 Elsevier B.V. All rights reserved.
Huang, Ming-Xiong; Huang, Charles W; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L; Baker, Dewleen G; Song, Tao; Harrington, Deborah L; Theilmann, Rebecca J; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M; Edgar, J Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T; Drake, Angela; Lee, Roland R
2014-01-01
The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using a L1-minimum-norm (Fast-VESTAL) and then used the method to obtain the source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions including SNR with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG sources images and known neurophysiology were provided. Additionally, in simulations and cases with MEG human responses, the results obtained from using conventional beamformer technique were compared with those from Fast-VESTAL, which highlighted the beamformer's problems of signal leaking and distorted source time-courses. © 2013.
NASA Astrophysics Data System (ADS)
Chiaro, G.; Salvetti, D.; La Mura, G.; Giroletti, M.; Thompson, D. J.; Bastieri, D.
2016-11-01
The Fermi-Large Area Telescope (LAT) is currently the most important facility for investigating the GeV γ-ray sky. With Fermi-LAT, more than three thousand γ-ray sources have been discovered so far. Of these, 1144 (˜40 per cent) are active galaxies of the blazar class, and 573 (˜20 per cent) are listed as blazar candidates of uncertain type (BCUs), i.e. sources without a conclusive classification. We use empirical cumulative distribution functions and artificial neural networks as a fast method of screening and classifying BCUs based on data collected at γ-ray energies only, when rigorous multiwavelength analysis is not available. Based on our method, we classify 342 BCUs as BL Lacs and 154 as flat-spectrum radio quasars, while 77 objects remain uncertain. Moreover, radio analysis and direct observations at ground-based optical observatories are used as counterparts to the statistical classifications to validate the method. This approach is of interest because of the increasing number of unclassified sources in Fermi catalogues and because blazars, and in particular their high-synchrotron-peaked subclass, are the main targets of atmospheric Cherenkov telescopes.
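The screening step can be sketched as a supervised classifier trained on γ-ray-only features of the already-classified blazars and applied to the BCUs, with a low-confidence band left as 'uncertain'. Feature choices, thresholds and all data below are illustrative stand-ins, not the paper's network or catalogue values.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_known = rng.normal(size=(1144, 4))   # γ-ray features of classified blazars (stand-in)
y_known = rng.integers(0, 2, 1144)     # 0 = FSRQ, 1 = BL Lac (stand-in labels)
X_bcu = rng.normal(size=(573, 4))      # features of the unclassified BCUs

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0),
).fit(X_known, y_known)

# Keep a label only when the network is confident, mirroring the paper's
# three-way outcome (BL Lac / FSRQ / uncertain); the 0.2/0.8 cuts are arbitrary.
proba = clf.predict_proba(X_bcu)[:, 1]
label = np.where(proba > 0.8, "BL Lac", np.where(proba < 0.2, "FSRQ", "uncertain"))
print(dict(zip(*np.unique(label, return_counts=True))))
```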
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas face many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where the Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern parts of the region of Shiyan, close to the main stream and the reservoir, were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV) and three at the highest risk level (level V). The risks of industrial discharge were also higher than those of the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
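The entropy weight method used above assigns larger weights to indicators that vary more across the alternatives being ranked. A minimal sketch with made-up sources and indicators:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values differ more across the
    alternatives carry more information and receive larger weights."""
    P = X / X.sum(axis=0)                                   # column-normalize
    e = -np.nansum(P * np.log(P), axis=0) / np.log(len(X))  # entropy per indicator
    d = 1.0 - e                                             # degree of divergence
    return d / d.sum()

# Rows: candidate pollution sources; columns: risk indicators (made-up values).
X = np.array([
    [0.3, 120.0, 2.0],
    [0.8,  40.0, 5.0],
    [0.5,  95.0, 1.0],
    [0.9,  60.0, 4.0],
])
w = entropy_weights(X)
print("weights:", w.round(3))
print("risk scores:", ((X / X.max(axis=0)) @ w).round(3))
```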
Open Source GIS based integrated watershed management
NASA Astrophysics Data System (ADS)
Byrne, J. M.; Lindsay, J.; Berg, A. A.
2013-12-01
Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modelling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate greatly increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.
Characteristics of Ampel bamboo as a biomass energy source potential in Bali
NASA Astrophysics Data System (ADS)
Sucipta, M.; Putra Negara, D. N. K.; Tirta Nindhia, T. G.; Surata, I. W.
2017-05-01
Currently, non-renewable fossil energy dominates the supply of the world's energy needs for many applications. Because the availability of fossil energy is diminishing, efforts have been made to find alternative, renewable energy sources; biomass is one of them. The aim of this research is to determine the characteristics of Ampel bamboo (Bambusa vulgaris) as a potential biomass energy source. The characteristics of Ampel bamboo are evaluated based on its chemical composition; on moisture, volatile matter, ash, and fixed carbon through proximate analysis; and on carbon, hydrogen and nitrogen content through ultimate analysis. Thermo-gravimetric analysis (TGA) indicates that Ampel bamboo contains about 18.10% hemicelluloses, 47.75% cellulose and 18.86% lignin, while the ultimate analysis gives carbon, hydrogen, and nitrogen contents of 39.75%, 5.75% and 0%, respectively. With such characteristics, Ampel bamboo has attractive potential as a renewable energy source.
Long-term variability in bright hard X-ray sources: 5+ years of BATSE data
NASA Technical Reports Server (NTRS)
Robinson, C. R.; Harmon, B. A.; McCollough, M. L.; Paciesas, W. S.; Sahi, M.; Scott, D. M.; Wilson, C. A.; Zhang, S. N.; Deal, K. J.
1997-01-01
The operation of the Compton Gamma Ray Observatory (CGRO)/Burst and Transient Source Experiment (BATSE) continues to provide data for inclusion into a database for the analysis of long-term variability in bright, hard X-ray sources. The all-sky capability of BATSE provides up to 30 flux measurements per day for each source. The long baseline and the various rising and setting occultation flux measurements allow searches for periodic and quasi-periodic signals with periods from several hours to hundreds of days. The preliminary results from an analysis of the hard X-ray variability in 24 of the brightest BATSE sources are presented. Power density spectra are computed for each source, and profiles are presented of the hard X-ray orbital modulations in some X-ray binaries, together with amplitude modulations and variations in outburst durations and intensities in recurrent X-ray transients.
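Since occultation flux measurements are unevenly spaced in time, period searches of this kind are commonly done with the Lomb-Scargle periodogram rather than an FFT. A hedged sketch on a synthetic light curve (the 33.5 d period and noise level are arbitrary, not BATSE values):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 2000.0, 5000))   # irregular sample times (days)
flux = 1.0 + 0.3 * np.sin(2 * np.pi * t / 33.5) + 0.2 * rng.normal(size=t.size)

periods = np.linspace(2.0, 300.0, 3000)       # trial periods (days)
freqs = 2.0 * np.pi / periods                 # angular frequencies for lombscargle
power = lombscargle(t, flux - flux.mean(), freqs, normalize=True)
print("best period: %.1f d" % periods[np.argmax(power)])
```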
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.
Desai, Trunil S; Srivastava, Shireesh
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
Studies of EGRET sources with a novel image restoration technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tajima, Hiroyasu; Cohen-Tanugi, Johann; Kamae, Tuneyoshi
2007-07-12
We have developed an image restoration technique based on the Richardson-Lucy algorithm optimized for GLAST-LAT image analysis. Our algorithm is original since it utilizes the PSF (point spread function) that is calculated for each event. This is critical for EGRET and GLAST-LAT image analysis since the PSF depends on the energy and angle of incident gamma-rays and varies by more than one order of magnitude. EGRET and GLAST-LAT image analysis also faces Poisson noise due to low photon statistics. Our technique incorporates wavelet filtering to minimize noise effects. We present studies of EGRET sources using this novel image restoration technique for possible identification of extended gamma-ray sources.
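For orientation, the basic Richardson-Lucy iteration the authors build on is shown below for a single fixed PSF; their variant evaluates a PSF per event and interleaves wavelet filtering to suppress Poisson noise, neither of which is reproduced here.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Basic Richardson-Lucy deconvolution for Poisson data: multiplicative
    updates that preserve non-negativity of the estimate."""
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_flip = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate

# Toy example: a point source blurred by a broad Gaussian PSF plus background.
ax = np.arange(-10, 11)
psf = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * 4.0 ** 2))
psf /= psf.sum()
truth = np.zeros((64, 64))
truth[32, 32] = 100.0
rng = np.random.default_rng(0)
observed = rng.poisson(fftconvolve(truth, psf, mode="same") + 0.5).astype(float)
print(np.unravel_index(richardson_lucy(observed, psf).argmax(), (64, 64)))
```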
PIVOT: platform for interactive analysis and visualization of transcriptomics data.
Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong
2018-01-05
Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R and integrating results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analysis can be saved, shared and reproduced. PIVOT will allow researchers with broad background to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.
Cai, Hao; Long, Weiding; Li, Xianting; Kong, Lingjuan; Xiong, Shuang
2010-06-15
When hazardous contaminants are suddenly released indoors, prompt and proper emergency responses are critical to protect occupants. This paper aims to provide a framework for determining the optimal combination of ventilation and evacuation strategies by considering the uncertainty of source locations. The certainty of source locations is classified as complete certainty, incomplete certainty, and complete uncertainty to cover all possible situations. According to this classification, three types of decision analysis models are presented. A new concept, the efficiency factor of contaminant source (EFCS), is incorporated in these models to evaluate the payoffs of the ventilation and evacuation strategies. A procedure of decision-making based on these models is proposed and demonstrated by numerical studies of one hundred scenarios with ten ventilation modes, two evacuation modes, and five source locations. The results show that the models can be useful in directing the decision analysis of both ventilation and evacuation strategies. In addition, the certainty of the source locations has an important effect on the outcomes of the decision-making. Copyright 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.
2017-06-01
Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total losses and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches using additional data sources, or that combine sources from both data types, tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data is acquired from the U.S. Geological Survey (USGS) seismic network in California and the social sensor data is based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The second implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake. Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher resolution over a larger area than either data source alone.
Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape
NASA Astrophysics Data System (ADS)
Yu, T.; Xu, K.; Yuan, Z.
2017-09-01
Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the analysis of the correlation between landscape indices and AOD. Next, to make the subsequent analysis more efficient, the selected indices are screened according to the correlation coefficients between them. Finally, due to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is the building class, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing these models, the fit of SLM, SEM and GWR is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the fit of the GWR model is poorer than that of SLM, it expresses more clearly how strongly the influencing factors affect atmospheric haze in different places. From the results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could lower aerosol optical thickness; distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze pollution; and for Wuhan City, slightly adjusting the built-up areas and planning the non-built-up areas reasonably can reduce atmospheric haze pollution.
ERIC Educational Resources Information Center
Meerbaum-Salant, Orni; Hazzan, Orit
2009-01-01
This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
NASA Astrophysics Data System (ADS)
Sun, X.; Cheng, S.
2017-12-01
This paper presents the first attempt to investigate emission source control in the Middle Reaches of the Yangtze River Urban Agglomerations (MRYRUA), one of the national urban agglomerations in China. An emission inventory of the MRYRUA was developed for the first time as input to the CAMx model, based on county-level activity data obtained by full-coverage investigation and on source-based spatial surrogates. The emission inventory was shown to be acceptable through atmospheric modeling verification. A classification method for prioritizing the control of atmospheric pollution sources was introduced and applied in the MRYRUA for the first time to evaluate emission source control at the regional and city scales. MICAPS (Meteorological Information Comprehensive Analysis and Processing System) was applied for the regional meteorological condition and sensitivity analysis. The results demonstrated that emission sources in the Hefei-centered urban agglomeration contributed most to the mean PM2.5 concentrations of the MRYRUA and should be given priority for control. Among the cities, emission sources in Ma'anshan, Xiangtan, Hefei and Wuhan were the largest contributors to the mean PM2.5 concentrations of the MRYRUA and should likewise be prioritized. In addition, special attention should be given to the cities along the Yangtze River and its tributaries for regional air quality target attainment. This study provides a valuable reference for policy makers in developing effective air pollution control strategies.
Joint source based morphometry identifies linked gray and white matter group differences.
Xu, Lai; Pearlson, Godfrey; Calhoun, Vince D
2009-02-01
We present a multivariate approach called joint source based morphometry (jSBM), to identify linked gray and white matter regions which differ between groups. In jSBM, joint independent component analysis (jICA) is used to decompose preprocessed gray and white matter images into joint sources and statistical analysis is used to determine the significant joint sources showing group differences and their relationship to other variables of interest (e.g. age or sex). The identified joint sources are groupings of linked gray and white matter regions with common covariation among subjects. In this study, we first provide a simulation to validate the jSBM approach. To illustrate our method on real data, jSBM is then applied to structural magnetic resonance imaging (sMRI) data obtained from 120 chronic schizophrenia patients and 120 healthy controls to identify group differences. JSBM identified four joint sources as significantly associated with schizophrenia. Linked gray-white matter regions identified in each of the joint sources included: 1) temporal--corpus callosum, 2) occipital/frontal--inferior fronto-occipital fasciculus, 3) frontal/parietal/occipital/temporal--superior longitudinal fasciculus and 4) parietal/frontal--thalamus. Age effects on all four joint sources were significant, but sex effects were significant only for the third joint source. Our findings demonstrate that jSBM can exploit the natural linkage between gray and white matter by incorporating them into a unified framework. This approach is applicable to a wide variety of problems to study linked gray and white matter group differences.
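A minimal sketch of the joint ICA idea underlying jSBM, with scikit-learn's FastICA standing in for the jICA implementation used in the paper; the input shapes and component count are assumptions.

```python
# Link gray matter (GM) and white matter (WM) by concatenating their voxel
# features per subject, then decompose into joint sources with ICA.
import numpy as np
from sklearn.decomposition import FastICA

def joint_ica(gm: np.ndarray, wm: np.ndarray, n_sources: int = 4):
    """gm, wm: (n_subjects, n_voxels) matrices of preprocessed images."""
    joint = np.hstack([gm, wm])            # modalities linked along the feature axis
    ica = FastICA(n_components=n_sources, random_state=0)
    loadings = ica.fit_transform(joint)    # (n_subjects, n_sources) subject weights
    maps = ica.components_                 # joint spatial maps; split per modality
    gm_maps, wm_maps = maps[:, :gm.shape[1]], maps[:, gm.shape[1]:]
    return loadings, gm_maps, wm_maps

# The subject-wise loadings can then be tested for group differences
# (patients vs. controls) and for age/sex effects, as in the paper.
```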
Rostad, C.E.
2006-01-01
Polar components in fuels may enable differentiation between fuel types or commercial fuel sources. A range of commercial fuels from numerous sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at parts-per-million levels in commercial hydrocarbon products, including a range of products from a variety of commercial sources and locations. Because these polar compounds are unique in different fuels, their presence may provide source information on hydrocarbons released into the environment. This analysis was then applied to mixtures of various products, as might be found in accidental releases into the environment. Copyright © Taylor & Francis Group, LLC.
Experimental Analysis of Pseudospark Sourced Electron Beam
NASA Astrophysics Data System (ADS)
Kumar, Niraj; Pal, U. N.; Verma, D. K.; Prajapati, J.; Kumar, M.; Meena, B. L.; Tyagi, M. S.; Srivastava, V.
2011-12-01
The pseudospark (PS) discharge has been shown to be a promising source of high-brightness, high-intensity electron beam pulses. The PS-sourced electron beam has potential applications in plasma-filled microwave sources where a normal material cathode cannot be used. The electron beam profile has been analyzed experimentally for different applied voltages. The investigation has been carried out at different axial and radial locations inside the drift space in an argon atmosphere. This paper presents the experimentally determined axial and radial variation of the beam current inside the drift tube of a PS-discharge-based plasma cathode electron (PCE) gun. With the help of current-density estimation, the focusing and defocusing points of the electron beam in the axial direction can be identified.
Accuracy analysis and design of A3 parallel spindle head
NASA Astrophysics Data System (ADS)
Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan
2016-03-01
As functional components of machine tools, parallel mechanisms are widely used in high-efficiency machining of aviation components, and accuracy is one of their critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy at the design and manufacturing stage, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrix, the compensatable and uncompensatable error sources that affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a larger effect on the end-effector accuracy. Based on the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. Using a genetic algorithm, the allocation of tolerances to each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.
Assessment of port-related air quality impacts: geographic analysis of population
Increased global trade has led to greater transportation by rail, road and ships to move cargo. Based upon multiple near-road and near-source monitoring studies, the busy roadways and large emission sources at ports may impact local air quality within several hundred metres of th...
Which Refrigeration System is Best for Your School?
ERIC Educational Resources Information Center
Little, Philip F.
1963-01-01
Several types of refrigeration systems available to the consulting engineer are discussed. The engineer should analyze all energy sources and base his recommendations on comparative costs and availability of sources, keeping in mind that operating costs are of primary importance to schools. The analysis begins with a careful appraisal of the…
Screening Methodologies to Support Risk and Technology Reviews (RTR): A Case Study Analysis
The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have lar...
Analysis (Simulation) of Ni-63 beta-voltaic cells based on silicon solar cells
NASA Astrophysics Data System (ADS)
Gorbatsevich, A. A.; Danilin, A. B.; Korneev, V. I.; Magomedbekov, E. P.; Molin, A. A.
2016-07-01
Beta-voltaic cells based on standard silicon solar cells with bilateral coating with beta-radiation sources in the form of the 63Ni isotope have been studied experimentally and by numerical simulation. The optimal parameters of the cell, including its thickness, the doping level of the substrate, the depth of the p-n junction on its front side, and the p+ layer on the back side, as well as the activity of the source material, have been calculated. The limiting theoretical values of the open-circuit voltage (0.26 V), short-circuit current (2.1 μA), output power of the cell (0.39 μW), and efficiency of conversion of the radioactive energy into electric energy (4.8%) have been determined for a beta-source activity of 40 mCi. The results of the numerical analysis have been compared with the experimental data.
NASA Astrophysics Data System (ADS)
Li, Zhixiong; Yan, Xinping; Wang, Xuping; Peng, Zhongxiao
2016-06-01
In the complex gear transmission systems of wind turbines, a crack is one of the most common failure modes and can be fatal to the wind turbine power system. A single sensor may suffer from issues relating to its installation position and direction, resulting in the collection of weak dynamic responses of the cracked gear. A multi-channel sensor system is hence applied in signal acquisition, and blind source separation (BSS) technologies are employed to optimally process the information collected from multiple sensors. However, a literature review finds that most BSS-based fault detectors have not addressed the dependence/correlation between different moving components in gear systems; in particular, the popular independent component analysis (ICA) assumes mutual independence of the vibration sources. The fault detection performance may be significantly influenced by the dependence/correlation between vibration sources. In order to address this issue, this paper presents a new method based on supervised order tracking bounded component analysis (SOTBCA) for gear crack detection in wind turbines. Bounded component analysis (BCA) is a state-of-the-art technology for dependent source separation that has so far been applied mainly to communication signals. To make it applicable to vibration analysis, order tracking has been incorporated into the BCA framework in this work to eliminate noise and disturbance signal components. An autoregressive (AR) model built with prior knowledge about the crack fault is then employed to supervise the reconstruction of the crack vibration source signature. The SOTBCA outputs only the one source signal that has the closest distance to the AR model. Owing to the dependence-tolerant nature of the BCA framework, interfering vibration sources that are dependent on or correlated with the crack vibration source can be recognized by the SOTBCA, so that only useful fault information is preserved in the reconstructed signal. The crack failure can thus be precisely identified by cyclic spectral correlation analysis. A series of numerical simulations and experimental tests have been conducted to illustrate the advantages of the proposed SOTBCA method for fatigue crack detection. Comparisons to three representative techniques, i.e. Erdogan's BCA (E-BCA), joint approximate diagonalization of eigen-matrices (JADE), and FastICA, have demonstrated the effectiveness of the SOTBCA. Hence the proposed approach is suitable for accurate gear crack detection in practical applications.
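A minimal sketch of the AR-model-supervised selection step described above: among candidate separated sources, the one whose AR coefficients are closest to those of a known crack signature is kept. The separation stage itself (BCA with order tracking) is outside this sketch, and the AR order and distance metric are illustrative assumptions.

```python
# Fit an AR model to each candidate source and select the candidate whose
# coefficients are nearest to an AR template of the crack signature.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def ar_coeffs(x: np.ndarray, order: int = 8) -> np.ndarray:
    # params[0] is the intercept; the remaining entries are the lag coefficients
    return AutoReg(x, lags=order).fit().params[1:]

def select_crack_source(candidates: np.ndarray, crack_template: np.ndarray,
                        order: int = 8):
    """candidates: (n_sources, n_samples) outputs of a BSS/BCA separation stage."""
    ref = ar_coeffs(crack_template, order)
    dists = [np.linalg.norm(ar_coeffs(s, order) - ref) for s in candidates]
    return candidates[int(np.argmin(dists))], dists
```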
Employment-based health insurance is failing: now what?
Enthoven, Alain C
2003-01-01
Employment-based health insurance is failing. Costs are out of control. Employers have no effective strategy to deal with this. They must think strategically about fundamental change. This analysis explains how employers' purchasing policies contribute to rising costs and block growth of economical care. Single-source managed care is ineffective, and effective managed care cannot be a single source. Employers should create exchanges through which they can offer employees wide, responsible, individual, multiple choices among health care delivery systems and create serious competition based on value for money. Recently introduced technology can assist this process.
2016-06-01
On the basis of thorough market research, acquisition professionals must decide at an early stage which source selection strategy (lowest price technically... minimizing risk and ensuring best value for all stakeholders. ...price-based, market-driven environment from requirements development through proper disposal. Source selection must be made on a 'best value'...
NASA Astrophysics Data System (ADS)
Peretyagin, Vladimir S.; Korolev, Timofey K.; Chertov, Aleksandr N.
2017-02-01
The problem of assessing the dressability of solid minerals attracts the attention of specialists wherever the extraction of mineral raw materials is a significant sector of the economy. A significant number of mineral ore dressability methods exist; at present, the radiometric methods are considered the most promising. One such radiometric method is photoluminescence. This method is based on spectral analysis of the amplitude and kinetic parameters of mineral luminescence (under UV radiation), as well as on the color parameters of the radiation. The absence of developed scientific and methodological approaches to analyzing the area irradiated by UV radiation, as well as the absence of relevant radiation sources, are factors that hinder the development and use of the photoluminescence method. The present work is devoted to the development of a multi-element UV radiation source designed to solve the problem of analyzing and sorting minerals by their selective luminescence. This article presents a method for the theoretical modeling of radiation devices based on UV LEDs. The models consider factors such as the spectral composition and the spatial and energy parameters of the LEDs. The article also presents the results of experimental studies of several mineral samples.
Performance of the New Los Alamos UCN Source and Implications for Future Experiments
NASA Astrophysics Data System (ADS)
Makela, Mark; LANL UCN Team
2017-01-01
The Los Alamos Ultracold Neutron (UCN) source was replaced during this past summer and has been commissioned during the last few months. The new source is the result of lessons learned during the 10-year operation of the first UCN source and of extensive Monte Carlo analysis. The new source is a spallation-driven source based on a solid deuterium UCN moderator similar to the previous one. This talk will present an overview of the new source design and the results of commissioning tests. The talk will conclude with a brief overview of the implications of source performance for the neutron lifetime and LANL nEDM experiments. This work was funded by LANL LDRD.
Khan, Shaheer; Liu, Jenkuei; Szabo, Zoltan; Kunnummal, Baburaj; Han, Xiaorui; Ouyang, Yilan; Linhardt, Robert J; Xia, Qiangwei
2018-06-15
N-linked glycan analysis of recombinant therapeutic proteins, such as monoclonal antibodies, Fc-fusion proteins, and antibody-drug conjugates, provides valuable information regarding the glycosylation profile of protein therapeutics. Both qualitative identification and quantitative analysis of N-linked glycans on recombinant therapeutic proteins are critical analytical tasks in the biopharma industry during the development of a biotherapeutic. Currently, such analyses are mainly carried out using capillary electrophoresis/laser-induced fluorescence (CE/LIF), liquid chromatography/fluorescence (LC/FLR), and liquid chromatography/fluorescence/mass spectrometry (LC/FLR/MS) technologies. N-linked glycans are first released from glycoproteins by enzymatic digestion, then labeled with fluorescent dyes for subsequent CE or LC separation and LIF or MS detection. Here we present an on-line CE/LIF/MS N-glycan analysis workflow that incorporates the fluorescent Teal™ dye and an electrokinetic pump-based nanospray sheath liquid capillary electrophoresis/mass spectrometry (CE/MS) ion source. Electrophoresis running buffer systems using ammonium acetate and ammonium hydroxide were developed for negative ion mode CE/MS analysis of fluorescence-labeled N-linked glycans. Results show that on-line CE/LIF/MS analysis can be readily achieved using this versatile CE/MS ion source on common CE/MS instrument platforms. This on-line CE/LIF/MS method using the Teal™ fluorescent dye and electrokinetic pump-based nanospray sheath liquid CE/MS coupling technology holds promise for on-line quantitation and identification of N-linked glycans on recombinant therapeutic proteins. Copyright © 2018 John Wiley & Sons, Ltd.
Algorithmic localisation of noise sources in the tip region of a low-speed axial flow fan
NASA Astrophysics Data System (ADS)
Tóth, Bence; Vad, János
2017-04-01
An objective and algorithmised methodology is proposed to analyse beamform data obtained for axial fans. Its application is demonstrated in a case study regarding the tip region of a low-speed cooling fan. First, beamforming is carried out in a co-rotating frame of reference. Then, a distribution of source strength is extracted along the circumference of the rotor at the blade tip radius in each analysed third-octave band. The circumferential distributions are expanded into Fourier series, which allows for filtering out the effects of perturbations, on the basis of an objective criterion. The remaining Fourier components are then considered as base sources to determine the blade-passage-periodic flow mechanisms responsible for the broadband noise. Based on their frequency and angular location, the base sources are grouped together. This is done using the fuzzy c-means clustering method to allow the overlap of the source mechanisms. The number of clusters is determined in a validity analysis. Finally, the obtained clusters are assigned to source mechanisms based on the literature. Thus, turbulent boundary layer - trailing edge interaction noise, tip leakage flow noise, and double leakage flow noise are identified.
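A minimal sketch of the Fourier filtering step described above, assuming the circumferential source-strength distribution is sampled at equally spaced angles; retaining only harmonics of the blade count is one plausible reading of the blade-passage-periodic criterion, not the paper's exact rule.

```python
# Expand a circumferential source-strength distribution into Fourier components
# and keep only harmonics of the blade count, treating the rest as perturbations.
import numpy as np

def blade_periodic_part(strength: np.ndarray, n_blades: int) -> np.ndarray:
    """strength: source strength sampled at equally spaced circumferential angles."""
    coeffs = np.fft.rfft(strength)
    keep = np.zeros_like(coeffs)
    idx = np.arange(0, len(coeffs), n_blades)   # DC plus blade-order harmonics
    keep[idx] = coeffs[idx]
    return np.fft.irfft(keep, n=len(strength))
```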
Training Needs Analysis and Evaluation for New Technologies through the Use of Problem-Based Inquiry
ERIC Educational Resources Information Center
Casey, Matthew Scott; Doverspike, Dennis
2005-01-01
The analysis of calls to a help desk, in this case calls to a computer help desk, can serve as a rich source of information on the real world problems that individuals are having with the implementation of a new technology. Thus, we propose that an analysis of help desk calls, a form of problem-based inquiry, can serve as a fast and low cost means…
Spatiotemporal patterns of ERP based on combined ICA-LORETA analysis
NASA Astrophysics Data System (ADS)
Zhang, Jiacai; Guo, Taomei; Xu, Yaqin; Zhao, Xiaojie; Yao, Li
2007-03-01
In contrast to the fMRI methods widely used up to now, this method tries to map accurately the spatiotemporal patterns of activity of large neuronal populations in the human brain from the analysis of ERP data recorded on the scalp, in order to understand more profoundly how brain systems work during a sentence-processing task. In this study, an event-related brain potential (ERP) paradigm recording the on-line responses to the processing of sentences is chosen as an example. In order both to exploit the millisecond temporal resolution of ERPs and to overcome the insensitivity of ERP source localization to cerebral location, we separate these sources in space and time with a combined method based on independent component analysis (ICA) and low-resolution tomography (LORETA) algorithms. ICA blindly separates the input ERP data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. The spatial map associated with each ICA component is then analyzed with LORETA to locate its cerebral sources throughout the full brain, under the assumption that neighboring neurons are simultaneously and synchronously activated. Our results show that the cerebral computation mechanism underlying content-word reading is mediated by the orchestrated activity of several spatially distributed brain sources located in the temporal, frontal, and parietal areas, which activate at distinct time intervals and are grouped into different statistically independent components. Thus ICA-LORETA analysis provides an encouraging and effective method to study brain dynamics from ERP.
Hopkins, Jesse Bennett; Gillilan, Richard E; Skou, Soren
2017-10-01
BioXTAS RAW is a graphical-user-interface-based free open-source Python program for reduction and analysis of small-angle X-ray solution scattering (SAXS) data. The software is designed for biological SAXS data and enables creation and plotting of one-dimensional scattering profiles from two-dimensional detector images, standard data operations such as averaging and subtraction and analysis of radius of gyration and molecular weight, and advanced analysis such as calculation of inverse Fourier transforms and envelopes. It also allows easy processing of inline size-exclusion chromatography coupled SAXS data and data deconvolution using the evolving factor analysis method. It provides an alternative to closed-source programs such as Primus and ScÅtter for primary data analysis. Because it can calibrate, mask and integrate images it also provides an alternative to synchrotron beamline pipelines that scientists can install on their own computers and use both at home and at the beamline.
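A minimal sketch of one of the primary analyses RAW performs, a Guinier fit for the radius of gyration; the iteration scheme and q-range rule (q·Rg < 1.3) are common SAXS conventions, not RAW's internal API.

```python
# Guinier analysis: ln I(q) = ln I(0) - (Rg^2 / 3) q^2 over the low-q region.
import numpy as np

def guinier_fit(q: np.ndarray, intensity: np.ndarray, qmax_rg: float = 1.3):
    rg = 0.0
    for _ in range(5):                    # iterate: the valid q-range depends on Rg
        mask = (q * rg < qmax_rg) if rg > 0 else (q < q[len(q) // 10])
        slope, intercept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)
        rg = np.sqrt(-3.0 * slope)        # slope is negative for a valid fit
        i0 = np.exp(intercept)
    return rg, i0
```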
Relationship between mass-flux reduction and source-zone mass removal: analysis of field data.
Difilippo, Erica L; Brusseau, Mark L
2008-05-26
The magnitude of contaminant mass-flux reduction associated with a specific amount of contaminant mass removed is a key consideration for evaluating the effectiveness of a source-zone remediation effort. Thus, there is great interest in characterizing, estimating, and predicting relationships between mass-flux reduction and mass removal. Published data collected for several field studies were examined to evaluate relationships between mass-flux reduction and source-zone mass removal. The studies analyzed herein represent a variety of source-zone architectures, immiscible-liquid compositions, and implemented remediation technologies. There are two general approaches to characterizing the mass-flux-reduction/mass-removal relationship, end-point analysis and time-continuous analysis. End-point analysis, based on comparing masses and mass fluxes measured before and after a source-zone remediation effort, was conducted for 21 remediation projects. Mass removals were greater than 60% for all but three of the studies. Mass-flux reductions ranging from slightly less than to slightly greater than one-to-one were observed for the majority of the sites. However, these single-snapshot characterizations are limited in that the antecedent behavior is indeterminate. Time-continuous analysis, based on continuous monitoring of mass removal and mass flux, was performed for two sites, for both of which data were obtained under water-flushing conditions. The reductions in mass flux were significantly different for the two sites (90% vs. approximately 8%) for similar mass removals (approximately 40%). These results illustrate the dependence of the mass-flux-reduction/mass-removal relationship on source-zone architecture and associated mass-transfer processes. Minimal mass-flux reduction was observed for a system wherein mass removal was relatively efficient (ideal mass-transfer and displacement). Conversely, a significant degree of mass-flux reduction was observed for a site wherein mass removal was inefficient (non-ideal mass-transfer and displacement). The mass-flux-reduction/mass-removal relationship for the latter site exhibited a multi-step behavior, which cannot be predicted using some of the available simple estimation functions.
...room) or while being on the mobile (agents in action). While desktop-based applications can be used to monitor, but also to process and analyse, surveillance data coming from a variety of sources, mobile-based techniques ... Keywords: Digital forensics analysis; Visualization techniques for surveillance; Mobile-based surveillance
Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states
Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.
2012-01-01
Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984
Visualization-based analysis of multiple response survey data
NASA Astrophysics Data System (ADS)
Timofeeva, Anastasiia
2017-11-01
During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Several aggregate indicators of dissimilarity (similarity) are proposed, based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are also of interest; to identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested, based on partitioning a consumer audience into pairwise disjoint subsets, which includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
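A minimal sketch of the similarity-matrix representation described above, for a 0/1 multiple-response table; the determinant-based scalar summary follows the abstract only loosely and is an illustrative assumption.

```python
# Build a pairwise Jaccard similarity matrix between answer options and a rough
# scalar summary of how distinct the options' audiences are.
import numpy as np

def option_similarity(responses: np.ndarray) -> np.ndarray:
    """responses: (n_respondents, n_options) binary indicator matrix."""
    inter = responses.T @ responses                  # co-selection counts
    counts = np.diag(inter)
    union = counts[:, None] + counts[None, :] - inter
    return inter / np.maximum(union, 1)              # Jaccard similarity

def aggregate_dissimilarity(sim: np.ndarray) -> float:
    # det -> 1 when options are never co-selected (identity matrix) and -> 0
    # when options fully overlap: a rough scalar summary of distinctness.
    return float(np.linalg.det(sim))
```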
Examining Returned Samples in their Collection Tubes Using Synchrotron Radiation-Based Techniques
NASA Astrophysics Data System (ADS)
Schoonen, M. A.; Hurowitz, J. A.; Thieme, J.; Dooryhee, E.; Fogelqvist, E.; Gregerson, J.; Farley, K. A.; Sherman, S.; Hill, J.
2018-04-01
Synchrotron radiation-based techniques can be leveraged for triaging and analysis of returned samples before unsealing the collection tubes. Proof-of-concept measurements were conducted at Brookhaven National Lab's National Synchrotron Light Source-II.
The Orbital precession around oblate spheroids
NASA Astrophysics Data System (ADS)
Montanus, J. M. C.
2006-07-01
An exact series will be given for the gravitational potential generated by an oblate gravitating source. To this end the corresponding Epstein-Hubbell type elliptic integral is evaluated. The procedure is based on the Legendre polynomial expansion method and on combinatorial techniques. The result is of interest for gravitational models based on the linearity of the gravitational potential. The series approximation for such potentials is of use for the analysis of orbital motions around a nonspherical source. It can be considered advantageous that the analysis is purely algebraic. Numerical approximations are not required. As an important example, the expression for the orbital precession will be derived for an object orbiting around an oblate homogeneous spheroid.
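For orientation, the standard first-order expressions that such an expansion reduces to are sketched below; these are the textbook J2 results for the potential of an oblate source (with theta the colatitude) and the secular precession rates, not necessarily the exact series derived in the paper.

```latex
% Oblate-source multipole expansion and first-order secular precession rates.
\begin{align}
V(r,\theta) &= -\frac{GM}{r}\left[1-\sum_{n\ge 1} J_{2n}\left(\frac{R}{r}\right)^{2n} P_{2n}(\cos\theta)\right],\\
\dot{\Omega} &= -\frac{3}{2}\, n\, J_2 \left(\frac{R}{a(1-e^2)}\right)^{2}\cos i,\\
\dot{\omega} &= \frac{3}{4}\, n\, J_2 \left(\frac{R}{a(1-e^2)}\right)^{2}\left(5\cos^2 i-1\right).
\end{align}
```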
Johnson, LeeAnn K; Brown, Mary B; Carruthers, Ethan A; Ferguson, John A; Dombek, Priscilla E; Sadowsky, Michael J
2004-08-01
A horizontal, fluorophore-enhanced, repetitive extragenic palindromic-PCR (rep-PCR) DNA fingerprinting technique (HFERP) was developed and evaluated as a means to differentiate human from animal sources of Escherichia coli. Box A1R primers and PCR were used to generate 2,466 rep-PCR and 1,531 HFERP DNA fingerprints from E. coli strains isolated from fecal material from known human and 12 animal sources: dogs, cats, horses, deer, geese, ducks, chickens, turkeys, cows, pigs, goats, and sheep. HFERP DNA fingerprinting reduced within-gel grouping of DNA fingerprints and improved alignment of DNA fingerprints between gels, relative to that achieved using rep-PCR DNA fingerprinting. Jackknife analysis of the complete rep-PCR DNA fingerprint library, done using Pearson's product-moment correlation coefficient, indicated that animal and human isolates were assigned to the correct source groups with an 82.2% average rate of correct classification. However, when only unique isolates were examined, isolates from a single animal having a unique DNA fingerprint, Jackknife analysis showed that isolates were assigned to the correct source groups with a 60.5% average rate of correct classification. The percentages of correctly classified isolates were about 15 and 17% greater for rep-PCR and HFERP, respectively, when analyses were done using the curve-based Pearson's product-moment correlation coefficient, rather than the band-based Jaccard algorithm. Rarefaction analysis indicated that, despite the relatively large size of the known-source database, genetic diversity in E. coli was very great and is most likely accounting for our inability to correctly classify many environmental E. coli isolates. Our data indicate that removal of duplicate genotypes within DNA fingerprint libraries, increased database size, proper methods of statistical analysis, and correct alignment of band data within and between gels improve the accuracy of microbial source tracking methods.
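A minimal sketch contrasting the two similarity measures compared above: band-based Jaccard on presence/absence band calls versus curve-based Pearson on densitometric traces; the toy inputs are placeholders, not the study's fingerprint pipeline.

```python
# Compare band-based (Jaccard) and curve-based (Pearson) fingerprint similarity.
import numpy as np
from scipy.spatial.distance import jaccard
from scipy.stats import pearsonr

bands_a = np.array([1, 0, 1, 1, 0, 1], dtype=bool)    # band present/absent calls
bands_b = np.array([1, 1, 1, 0, 0, 1], dtype=bool)
jaccard_similarity = 1.0 - jaccard(bands_a, bands_b)  # scipy returns dissimilarity

curve_a = np.random.rand(500)                         # densitometric traces
curve_b = curve_a + 0.1 * np.random.rand(500)
pearson_similarity, _ = pearsonr(curve_a, curve_b)
print(jaccard_similarity, pearson_similarity)
```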
2017-06-09
structures constantly arise in firefights and skirmishes on the battlefield. Source: Andrew Ilachinski, Artificial War: Multiagent-Based Simulation of... “Alternative Methods of Analysis and Innovative Organizational Structures.” Conference, Rome, Italy, March 31-April 2. ... Keywords: Intelligence Analysis, Joint Operational Planning, Cellular Automata, Agent-Based Modeling
Frøkjær, Jens B; Graversen, Carina; Brock, Christina; Khodayari-Rostamabad, Ahmad; Olesen, Søren S; Hansen, Tine M; Søfteland, Eirik; Simrén, Magnus; Drewes, Asbjørn M
2017-02-01
Diabetes mellitus (DM) is associated with structural and functional changes of the central nervous system. We used electroencephalography (EEG) to assess resting state cortical activity and explored associations to relevant clinical features. Multichannel resting state EEG was recorded in 27 healthy controls and 24 patients with longstanding DM and signs of autonomic dysfunction. The power distribution based on wavelet analysis was summarized into frequency bands with corresponding topographic mapping. Source localization analysis was applied to explore the electrical cortical sources underlying the EEG. Compared to controls, DM patients had an overall decreased EEG power in the delta (1-4Hz) and gamma (30-45Hz) bands. Topographic analysis revealed that these changes were confined to the frontal region for the delta band and to central cortical areas for the gamma band. Source localization analysis identified sources with reduced activity in the left postcentral gyrus for the gamma band and in right superior parietal lobule for the alpha1 (8-10Hz) band. DM patients with clinical signs of autonomic dysfunction and gastrointestinal symptoms had evidence of altered resting state cortical processing. This may reflect metabolic, vascular or neuronal changes associated with diabetes. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
NASA Astrophysics Data System (ADS)
Long, Tao; Clement, Stephen W. J.; Bao, Zemin; Wang, Peizhi; Tian, Di; Liu, Dunyi
2018-03-01
A high-spatial-resolution, high-brightness ion beam from a cold cathode duoplasmatron source and primary ion optics are presented and applied to in-situ analysis of micro-scale geological materials with complex structural and chemical features. The magnetic field in the source, as well as the influence of the relative permeability of magnetic materials on source performance, was simulated using COMSOL to confirm the magnetic field strength of the source. Based on SIMION simulation, a high-brightness, high-spatial-resolution negative ion optical system has been developed to achieve the critical (Gaussian) illumination mode. The ion source and primary column are installed on a new time-of-flight secondary ion mass spectrometer for analysis of geological samples. The diameter of the ion beam was measured by the knife-edge method and with a scanning electron microscope (SEM). Results show that an O2- beam of ca. 5 μm diameter with a beam intensity of ∼5 nA and an O- beam of ca. 5 μm diameter with a beam intensity of ∼50 nA were obtained. This design will open new possibilities for in-situ elemental and isotopic analysis in geological studies.
NASA Astrophysics Data System (ADS)
Lister, M. L.; Tingay, S. J.; Preston, R. A.
2001-06-01
We have performed a multidimensional correlation analysis on the observed properties of a statistically complete core-selected sample of compact radio-loud active galactic nuclei based on data from the VLBI Space Observing Programme (Paper I) and previously published studies. Our sample is drawn from the well-studied Pearson-Readhead (PR) survey and is ideally suited for investigating the general effects of relativistic beaming in compact radio sources. In addition to confirming many previously known correlations, we have discovered several new trends that lend additional support to the beaming model. These trends suggest that the most highly beamed sources in core-selected samples tend to have (1) high optical polarizations; (2) large parsec- to kiloparsec-scale jet misalignments; (3) prominent VLBI core components; (4) one-sided, core, or halo radio morphology on kiloparsec scales; (5) narrow emission line equivalent widths; and (6) a strong tendency for intraday variability at radio wavelengths. We have used higher resolution space and ground-based VLBI maps to confirm the bimodality of the jet misalignment distribution for the PR survey and find that the sources with aligned parsec- and kiloparsec-scale jets generally have arcsecond-scale radio emission on both sides of the core. The aligned sources also have broader emission line widths. We find evidence that the BL Lacertae objects in the PR survey are all highly beamed and have very similar properties to the high optically polarized quasars, with the exception of smaller redshifts. A cluster analysis on our data shows that after partialing out the effects of redshift, the luminosities of our sample objects in various wave bands are generally well correlated with each other but not with other source properties.
Chen, Hao; Lu, Xinwei; Li, Loretta Y; Gao, Tianning; Chang, Yuyu
2014-06-15
The concentrations of As, Ba, Co, Cr, Cu, Mn, Ni, Pb, V and Zn in campus dust from kindergartens, elementary schools, middle schools and universities of Xi'an, China were determined by X-ray fluorescence spectrometry. Correlation coefficient analysis, principal component analysis (PCA) and cluster analysis (CA) were used to analyze the data and to identify possible sources of these metals in the dust. The spatial distributions of metals in urban dust of Xi'an were analyzed based on the metal concentrations in campus dusts using the geostatistics method. The results indicate that dust samples from campuses have elevated metal concentrations, especially for Pb, Zn, Co, Cu, Cr and Ba, with the mean values of 7.1, 5.6, 3.7, 2.9, 2.5 and 1.9 times the background values for Shaanxi soil, respectively. The enrichment factor results indicate that Mn, Ni, V, As and Ba in the campus dust were deficiently to minimally enriched, mainly affected by nature and partly by anthropogenic sources, while Co, Cr, Cu, Pb and Zn in the campus dust and especially Pb and Zn were mostly affected by human activities. As and Cu, Mn and Ni, Ba and V, and Pb and Zn had similar distribution patterns. The southwest high-tech industrial area and south commercial and residential areas have relatively high levels of most metals. Three main sources were identified based on correlation coefficient analysis, PCA, CA, as well as spatial distribution characteristics. As, Ni, Cu, Mn, Pb, Zn and Cr have mixed sources - nature, traffic, as well as fossil fuel combustion and weathering of materials. Ba and V are mainly derived from nature, but partly also from industrial emissions, as well as construction sources, while Co principally originates from construction. Copyright © 2014 Elsevier B.V. All rights reserved.
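A minimal sketch of the enrichment factor (EF) screening mentioned above, EF = (Cx/Cref)sample / (Cx/Cref)background; the reference element and the numeric values are placeholders, not the study's data.

```python
# Enrichment factors relative to a crustal reference element (Fe here, as an
# assumed choice); EF well above ~2 suggests anthropogenic input.
import pandas as pd

background = pd.Series({"Pb": 21.4, "Zn": 69.4, "Fe": 29_000.0})  # hypothetical mg/kg
sample = pd.Series({"Pb": 152.0, "Zn": 389.0, "Fe": 31_000.0})    # hypothetical dust

def enrichment_factor(sample: pd.Series, background: pd.Series,
                      ref: str = "Fe") -> pd.Series:
    ratio_sample = sample / sample[ref]
    ratio_background = background / background[ref]
    return (ratio_sample / ratio_background).drop(ref)

print(enrichment_factor(sample, background))
```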
Effective temperature of an ultracold electron source based on near-threshold photoionization.
Engelen, W J; Smakman, E P; Bakker, D J; Luiten, O J; Vredenbregt, E J D
2014-01-01
We present a detailed description of measurements of the effective temperature of a pulsed electron source, based on near-threshold photoionization of laser-cooled atoms. The temperature is determined by electron beam waist scans, source size measurements with ion beams, and analysis with an accurate beam line model. Experimental data is presented for the source temperature as a function of the wavelength of the photoionization laser, for both nanosecond and femtosecond ionization pulses. For the nanosecond laser, temperatures as low as 14 ± 3 K were found; for femtosecond photoionization, 30 ± 5 K is possible. With a typical source size of 25 μm, this results in electron bunches with a relative transverse coherence length in the 10⁻⁴ range and an emittance of a few nm rad. © 2013 Elsevier B.V. All rights reserved.
Martian methane plume models for defining Mars rover methane source search strategies
NASA Astrophysics Data System (ADS)
Nicol, Christopher; Ellery, Alex; Lynch, Brian; Cloutis, Ed
2018-07-01
The detection of atmospheric methane on Mars implies an active methane source. This introduces the possibility of a biotic source with the implied need to determine whether the methane is indeed biotic in nature or geologically generated. There is a clear need for robotic algorithms which are capable of manoeuvring a rover through a methane plume on Mars to locate its source. We explore aspects of Mars methane plume modelling to reveal complex dynamics characterized by advection and diffusion. A statistical analysis of the plume model has been performed and compared to analyses of terrestrial plume models. Finally, we consider a robotic search strategy to find a methane plume source. We find that gradient-based techniques are ineffective, but that more sophisticated model-based search strategies are unlikely to be available in near-term rover missions.
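A minimal sketch of the kind of advection-diffusion plume such search strategies are tested against: a steady point source in a uniform wind, with lateral and vertical spread growing downwind; all parameter values are placeholders, not the paper's Mars model.

```python
# Steady-state Gaussian plume from a point source of strength q in a uniform
# wind u along x; spread grows with travel time x/u through the diffusivities.
import numpy as np

def plume_concentration(x, y, z, q=1.0, u=5.0, ky=0.5, kz=0.3, h=0.0):
    """Concentration downwind (x > 0) of a source at the origin at height h."""
    x = np.maximum(x, 1e-6)                 # model is only valid downwind
    sigma_y = np.sqrt(2.0 * ky * x / u)     # lateral spread
    sigma_z = np.sqrt(2.0 * kz * x / u)     # vertical spread
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = np.exp(-(z - h)**2 / (2 * sigma_z**2))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical
```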
Numerical models analysis of energy conversion process in air-breathing laser propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong Yanji; Song Junling; Cui Cunyan
In this paper, the energy source term is treated as the key element in describing the energy conversion process in air-breathing laser propulsion. Ignoring some secondary factors, three independent modules (a ray transmission module, an energy source term module and a fluid dynamic module) were established by coupling the laser radiation transport equation with the fluid mechanics equations. The incident laser beam was simulated based on a ray-tracing method. The calculated results were in good agreement with those of theoretical analysis and experiments.
Shi, Wei; Wei, Si; Hu, Xin-xin; Hu, Guan-jiu; Chen, Cu-lan; Wang, Xin-ru; Giesy, John P.; Yu, Hong-xia
2013-01-01
Some synthetic chemicals that have been shown to disrupt thyroid hormone (TH) function have been detected in surface waters, and people can potentially be exposed through drinking water. Here, the presence of thyroid-active chemicals and their toxic potential in drinking water sources in the Yangtze River Delta were investigated by instrumental analysis combined with a cell-based reporter gene assay. A novel approach using Monte Carlo simulation was developed to evaluate the potential risks of the measured concentrations of TH agonists and antagonists and to determine the major contributors to the observed thyroid receptor (TR) antagonist potency. None of the extracts exhibited TR agonist potency, while 12 of 14 water samples exhibited TR antagonist potency. The most probable observed antagonist equivalents ranged from 1.4 to 5.6 µg di-n-butyl phthalate (DNBP)/L, which posed a potential risk in the water sources. Based on Monte Carlo simulation related mass-balance analysis, DNBP accounted for 64.4% of the entire observed antagonist toxic unit in the water sources, while diisobutyl phthalate (DIBP), di-n-octyl phthalate (DNOP) and di-2-ethylhexyl phthalate (DEHP) also contributed. The most probable observed equivalent and most probable relative potency (REP) derived from Monte Carlo simulation are useful for potency comparison and for screening responsible chemicals. PMID:24204563
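A minimal sketch of the Monte Carlo mass-balance idea described above: sample concentrations and relative potencies from assumed distributions, form DNBP-equivalents, and attribute shares of the total; every distribution below is an illustrative placeholder.

```python
# Monte Carlo attribution of antagonist equivalents to candidate chemicals.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# hypothetical lognormal concentration (ug/L) and REP distributions
chem = {
    "DNBP": (rng.lognormal(0.5, 0.4, n), 1.0),                    # reference, REP = 1
    "DIBP": (rng.lognormal(0.0, 0.5, n), rng.lognormal(-1.2, 0.3, n)),
    "DEHP": (rng.lognormal(0.2, 0.5, n), rng.lognormal(-1.8, 0.3, n)),
}

eq = {name: conc * rep for name, (conc, rep) in chem.items()}     # DNBP-equivalents
total = sum(eq.values())
for name, contrib in eq.items():
    print(f"{name}: median share {np.median(contrib / total):.1%}")
```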
NASA Astrophysics Data System (ADS)
Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle
2018-05-01
Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
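A minimal sketch of the proposed transmissibility definition, the ratio of the PSD of a measured (mixed) response to the PSD of a recovered single-source signal, estimated here with Welch periodograms; the BSS recovery step is outside this sketch, and the window length is an illustrative choice.

```python
# Estimate a PSD-ratio transmissibility between a mixed response and a
# single-source signal recovered by a BSS stage.
import numpy as np
from scipy.signal import welch

def psd_transmissibility(mixed: np.ndarray, source: np.ndarray,
                         fs: float, nperseg: int = 1024):
    f, p_mixed = welch(mixed, fs=fs, nperseg=nperseg)
    _, p_source = welch(source, fs=fs, nperseg=nperseg)
    return f, p_mixed / p_source
```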
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In the implementation of nuclear safeguards, many different techniques are used to monitor the operation of nuclear facilities and to safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers and digital seals to open-source searches and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or only loose correlations, it could be beneficial to analyze them together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning-based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential value in nuclear safeguards.
Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li
2015-09-01
The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H](-) . Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
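A minimal sketch of the chemometric step described above, hierarchical clustering and PCA on a samples-by-components peak-area matrix (7 samples by 20 monitored components); the data matrix is a random placeholder.

```python
# Hierarchical clustering analysis and PCA on a fingerprint peak-area matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

peak_areas = np.random.rand(7, 20)         # placeholder fingerprint matrix

z = linkage(peak_areas, method="ward")     # hierarchical clustering
clusters = fcluster(z, t=2, criterion="maxclust")   # cut into two clusters

scores = PCA(n_components=2).fit_transform(peak_areas)  # PCA score plot input
print(clusters, scores[:, 0])
```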
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Primini, F. A.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R. M.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Kashyap, V. L.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Mossman, A. E.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2010-03-01
The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public ACIS imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents < 30". The catalog (1) provides access to estimates of the X-ray source properties for detected sources with good scientific fidelity; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of < 1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics. In addition, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra. Support for development of the CSC is provided by the National Aeronautics and Space Administration through the Chandra X-ray Center, which is operated by the Smithsonian Astrophysical Observatory for and on behalf of the National Aeronautics and Space Administration under contract NAS 8-03060.
Zhang, Rudong; Wang, Hailong; Hegg, D. A.; ...
2015-11-18
The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source–receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013, when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. Furthermore, while CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.
Contrastive Analysis and the Translation of Idioms: Some Remarks on Contrasting Idioms.
ERIC Educational Resources Information Center
Roos, Eckhard
Contrastive analysis can help solve certain problems in translation, for example, that of idioms. A contrastive analysis of source language (SL) and target language (TL) might have as its theoretical framework a contrastive lexical analysis based on generative semantics. In this approach both SL and TL idioms are broken down into their semantic…
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow that meta-analyzes both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses.
Modeling and performance analysis of QoS data
NASA Astrophysics Data System (ADS)
Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.
2016-09-01
The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e., one supporting the transmission of data belonging to different traffic classes. Traffic-shaping mechanisms based on priority queuing were studied, with both an integrated data source and various generated data sources. The basic problems of QoS-supporting architectures and of queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in modeling and analyzing the performance of computer networks.
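A minimal sketch of the priority queuing discipline analyzed above: the scheduler always serves the highest-priority non-empty queue, which is what lets high-priority traffic delay lower classes under load; the traffic pattern is an illustrative assumption.

```python
# Strict priority queuing: lower priority number is served first; FIFO within
# a class is preserved by a monotonically increasing sequence counter.
import heapq

class PriorityScheduler:
    def __init__(self):
        self._heap = []      # (priority, seq, packet)
        self._seq = 0        # tie-breaker preserving FIFO within a class

    def enqueue(self, priority: int, packet: str) -> None:
        heapq.heappush(self._heap, (priority, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

pq = PriorityScheduler()
for pkt, prio in [("voice-1", 0), ("data-1", 2), ("voice-2", 0), ("video-1", 1)]:
    pq.enqueue(prio, pkt)
print([pq.dequeue() for _ in range(4)])  # ['voice-1', 'voice-2', 'video-1', 'data-1']
```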
ERIC Educational Resources Information Center
Tolu, Hüseyin
2018-01-01
Investigating the sociology of educational technology can be approached through a series of deliberations based on the interaction between Free/Libre Open Source Software (FLOSS) and Proprietary Close Source Software (PCSS). This article consults public policy discourses of the Fatih project, which is the current educational technology project in…
Using Citation Analysis Heuristics to Prepare TAs across the Disciplines as Teachers and Writers
ERIC Educational Resources Information Center
Serviss, Tricia
2016-01-01
National discussions about source-based, academic writing in higher education have been and are increasingly tied to concerns about citation proficiency, plagiarism, and academic integrity. In response to these discussions, scholars have argued for better pedagogical strategies to teach students how to work with sources in effective and ethical…
Yacoub, Haitham A.; Sadek, Mahmoud A.; Uversky, Vladimir N.
2017-01-01
ABSTRACT This study was conducted to identify the source of animal meat based on the peculiarities of protein intrinsic disorder distribution in mitochondrial cytochrome b (mtCyt-b). The analysis revealed that animal and avian species can be discriminated based on the proportions of two groups of residues, Leu+Ile and Ser+Pro+Ala, in the amino acid sequences of their mtCyt-b. Although the level of overall intrinsic disorder in mtCyt-b is not very high, the distribution of disorder within the sequences of mtCyt-b from different species varies in a rather specific way. In fact, the positions and intensities of disorder/flexibility “signals” in the corresponding disorder profiles are relatively unique to avian and animal species. Therefore, it is possible to devise a set of simple rules, based on the peculiarities of the disorder profiles of their mtCyt-b proteins, to discriminate among species. This intrinsic disorder-based analysis represents a new technique that could provide a promising solution for identifying the source of meats. PMID:28331777
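The residue-group proportions named in the abstract are easy to compute; a minimal Python sketch follows, where the sequence is a made-up placeholder rather than real mtCyt-b data:

    # Compute the proportions of the two residue groups (Leu+Ile vs.
    # Ser+Pro+Ala) for an amino-acid sequence in one-letter code.
    def residue_group_fractions(seq: str) -> dict:
        seq = seq.upper()
        n = len(seq)
        return {
            "Leu+Ile": sum(seq.count(a) for a in "LI") / n,
            "Ser+Pro+Ala": sum(seq.count(a) for a in "SPA") / n,
        }

    # placeholder sequence, not an actual mtCyt-b entry
    print(residue_group_fractions("MTNIRKSHPLMKIINNAFIDLPAPSNISSWWNFGSLLGIC"))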
Cohen, Michael X; Gulbinaite, Rasa
2017-02-15
Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
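The core of a RESS-style filter is a generalized eigendecomposition that contrasts a narrow-band covariance matrix against a broadband reference covariance. The Python sketch below (NumPy/SciPy) illustrates that step on simulated data; the crude FFT band-pass stands in for the Gaussian filtering described in the paper, and all parameters are illustrative assumptions:

    import numpy as np
    from scipy.linalg import eigh

    fs, n_ch, n_s, f0 = 250, 16, 250 * 60, 12.0   # 60 s, 16 channels, 12 Hz SSEP
    rng = np.random.default_rng(0)
    t = np.arange(n_s) / fs
    mix = rng.standard_normal(n_ch)               # random projection of the SSEP source
    data = rng.standard_normal((n_ch, n_s)) \
         + np.outer(mix, 0.5 * np.sin(2 * np.pi * f0 * t))

    def narrowband(x, f, bw=1.0):
        # crude FFT-domain band-pass around f (Hz); a real pipeline would
        # use the Gaussian filter of the RESS paper
        X = np.fft.rfft(x, axis=1)
        freqs = np.fft.rfftfreq(x.shape[1], 1 / fs)
        X[:, np.abs(freqs - f) > bw] = 0
        return np.fft.irfft(X, n=x.shape[1], axis=1)

    S = np.cov(narrowband(data, f0))   # covariance at the stimulation frequency
    R = np.cov(data)                   # broadband reference covariance
    evals, evecs = eigh(S, R)          # generalized eigendecomposition
    w = evecs[:, -1]                   # filter maximizing the S-to-R power ratio
    component = w @ data               # single RESS-style component time series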
Assessment of macroseismic intensity in the Nile basin, Egypt
NASA Astrophysics Data System (ADS)
Fergany, Elsayed
2018-01-01
This work assesses deterministic seismic hazard and risk in terms of the maximum expected intensity map of the Egyptian Nile basin sector. A seismic source zone model of Egypt was delineated based on an updated, compatible earthquake catalog (2015), focal mechanisms, and the common tectonic elements. Four effective seismic source zones were identified along the Nile basin. The observed macroseismic intensity data along the basin were used to develop an intensity prediction equation defined in terms of moment magnitude. A maximum expected intensity map was then derived from the developed intensity prediction equation, the identified effective seismic source zones, and the maximum expected magnitude for each zone along the basin. The earthquake hazard and risk were discussed and analyzed in view of the maximum expected moment magnitude and the maximum expected intensity values for each effective source zone. Moderate expected magnitudes could pose high risk to the Cairo and Aswan regions. The results of this study could serve as recommendations for the planners charged with mitigating seismic risk in these strategic zones of Egypt.
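An intensity prediction equation of the common form I = a + b*Mw + c*log10(R) can be fitted by least squares. The Python sketch below uses invented data points, not the study's Nile-basin observations, and the functional form is a standard assumption rather than the paper's exact equation:

    import numpy as np

    Mw = np.array([5.2, 5.8, 6.1, 6.4, 5.5, 6.0])        # moment magnitudes
    R = np.array([20.0, 45.0, 80.0, 30.0, 60.0, 120.0])  # distances (km)
    I_obs = np.array([6.1, 5.9, 5.6, 7.2, 5.0, 5.3])     # observed intensities

    # design matrix for I = a + b*Mw + c*log10(R)
    A = np.column_stack([np.ones_like(Mw), Mw, np.log10(R)])
    (a, b, c), *_ = np.linalg.lstsq(A, I_obs, rcond=None)
    print(f"IPE: I = {a:.2f} + {b:.2f}*Mw {c:+.2f}*log10(R)")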
NASA Astrophysics Data System (ADS)
Gan, Shuwei; Wang, Shoudong; Chen, Yangkang; Qu, Shan; Zu, Shaohuan
2016-02-01
Direct imaging of simultaneous-source (or blended) data, without the need for deblending, requires a precise subsurface velocity model. In this paper, we focus on the velocity analysis of simultaneous-source data using the normal moveout-based velocity picking approach. We demonstrate that it is possible to obtain a precise velocity model directly from the blended data in the common-midpoint domain. The similarity-weighted semblance can help us obtain a much better velocity spectrum, with higher resolution and higher reliability, compared with the traditional semblance. The similarity-weighted semblance enforces an inherent noise attenuation solely in the semblance calculation stage, and thus it is not sensitive to intense interference. We use both simulated synthetic and field data examples to demonstrate the performance of the similarity-weighted semblance in obtaining a reliable subsurface velocity model for direct migration of simultaneous-source data. The migrated image of blended field data using the prestack Kirchhoff time migration approach, based on the velocity picked from the similarity-weighted semblance, is very close to the migrated image of unblended data.
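For orientation, a Python sketch of conventional NMO-based semblance scanning is shown below; the similarity-weighted variant of the paper additionally weights each sample by its local similarity to a reference trace, which is omitted here, and all geometry is a synthetic placeholder:

    import numpy as np

    def semblance(gather, offsets, dt, velocities, win_len=11):
        """gather: (n_t, n_x) CMP gather; returns an (n_t, n_v) semblance panel."""
        n_t, n_x = gather.shape
        t0 = np.arange(n_t) * dt
        win = np.ones(win_len)
        panel = np.zeros((n_t, len(velocities)))
        for iv, v in enumerate(velocities):
            # hyperbolic moveout: t(x) = sqrt(t0^2 + (x / v)^2)
            corrected = np.zeros_like(gather)
            for ix, x in enumerate(offsets):
                t_x = np.sqrt(t0**2 + (x / v) ** 2)
                corrected[:, ix] = np.interp(t_x, t0, gather[:, ix])
            # conventional semblance with a sliding time window
            num = np.convolve(corrected.sum(axis=1) ** 2, win, mode="same")
            den = n_x * np.convolve((corrected**2).sum(axis=1), win, mode="same")
            panel[:, iv] = num / (den + 1e-12)
        return panel

Picking the velocity that maximizes the panel at each zero-offset time yields the kind of velocity spectrum the abstract refers to.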
Comparison of Three Plasma Sources for Ambient Desorption/Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
McKay, Kirsty; Salter, Tara L.; Bowfield, Andrew; Walsh, James L.; Gilmore, Ian S.; Bradley, James W.
2014-09-01
Plasma-based desorption/ionization sources are an important ionization technique for ambient surface analysis mass spectrometry. In this paper, we compare and contrast three competing plasma based desorption/ionization sources: a radio-frequency (rf) plasma needle, a dielectric barrier plasma jet, and a low-temperature plasma probe. The ambient composition of the three sources and their effectiveness at analyzing a range of pharmaceuticals and polymers were assessed. Results show that the background mass spectrum of each source was dominated by air species, with the rf needle producing a richer ion spectrum consisting mainly of ionized water clusters. It was also seen that each source produced different ion fragments of the analytes under investigation: this is thought to be due to different substrate heating, different ion transport mechanisms, and different electric field orientations. The rf needle was found to fragment the analytes least and as a result it was able to detect larger polymer ions than the other sources.
Perceptually controlled doping for audio source separation
NASA Astrophysics Data System (ADS)
Mahé, Gaël; Nadalin, Everton Z.; Suyama, Ricardo; Romano, João MT
2014-12-01
The separation of an underdetermined audio mixture can be performed through sparse component analysis (SCA), which relies, however, on the strong hypothesis that the source signals are sparse in some domain. To overcome this difficulty in the case where the original sources are available before the mixing process, informed source separation (ISS) embeds a watermark in the mixture, whose information can aid a subsequent separation. Though powerful, this technique is generally specific to a particular mixing setup and may be compromised by an additional bitrate compression stage. Thus, instead of watermarking, we propose a 'doping' method that makes the time-frequency representation of each source more sparse, while preserving its audio quality. This method is based on an iterative decrease of the distance between the distribution of the signal and a target sparse distribution, under a perceptual constraint. We aim to show that the proposed approach is robust to audio coding and that the use of the sparsified signals improves the source separation in comparison with the original sources. In this work, the analysis is restricted to instantaneous mixtures and focused on voice sources.
NASA Technical Reports Server (NTRS)
Blados, Walter R.; Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.
1990-01-01
This paper formulates and studies two propositions. Proposition 1 states that information that is external to the aerospace organization tends to be used less than internal sources of information; the more geographically removed the information is from the organization, the less likely it is to be used. Proposition 2 states that of the various sociometric variables assumed to influence the use of an information channel or source, perceived accessibility exerts the greatest influence. Preliminary analysis based on surveys supports Proposition 1. This analysis does not support Proposition 2, however. Evidence here indicates that reliability and relevance influence the use of an information source more than the idea of perceived accessibility.
Zhu, Guangxu; Guo, Qingjun; Xiao, Huayun; Chen, Tongbin; Yang, Jun
2017-06-01
Heavy metals are considered toxic to humans and ecosystems. In the present study, heavy metal concentrations in soil were investigated using the single pollution index (PIi), the integrated Nemerow pollution index (PIN), and the geoaccumulation index (Igeo) to determine metal accumulation and its pollution status at the abandoned site of the Capital Iron and Steel Factory in Beijing and its surrounding area. Multivariate statistics (principal component analysis and correlation analysis) and geostatistical analysis (ArcGIS tools), combined with stable Pb isotopic ratios, were applied to explore the characteristics of heavy metal pollution and the possible sources of pollutants. The results indicated that the heavy metal elements show different degrees of accumulation in the study area; the observed trend of both the enrichment factors and the geoaccumulation index was Hg > Cd > Zn > Cr > Pb > Cu ≈ As > Ni. Hg, Cd, Zn, and Cr were the dominant elements influencing soil quality in the study area. The Nemerow index method indicated that all of the heavy metals except Ni caused serious pollution. Multivariate statistical analysis indicated that Cd, Zn, Cu, and Pb are clearly correlated and load heavily on the same principal component, suggesting that they share sources related to industrial activities and vehicle emissions. Spatial distribution maps based on ordinary kriging showed that high concentrations of heavy metals were located in the local factory area and in the southeast-to-northwest part of the study region, corresponding with the predominant wind directions. Analyses of lead isotopes confirmed that Pb in the study soils is predominantly derived from three sources: dust generated during steel production, coal combustion, and the natural background. Moreover, a ternary mixing model based on the lead isotope analysis indicates that lead in the study soils originates mainly from anthropogenic sources, which contribute much more than the natural sources. Our study not only reveals the overall situation of heavy metal contamination but also identifies the specific pollution sources.
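The three indices follow standard textbook definitions: PI = C/B, Igeo = log2(C/(1.5*B)), and the Nemerow index combines the mean and maximum PI. A Python sketch with invented concentrations and background values (mg/kg), not the study's data:

    import numpy as np

    conc = {"Hg": 0.8, "Cd": 1.2, "Zn": 350.0, "Pb": 90.0}
    background = {"Hg": 0.07, "Cd": 0.15, "Zn": 75.0, "Pb": 25.0}

    # single pollution index and geoaccumulation index per metal
    pi = {m: conc[m] / background[m] for m in conc}
    igeo = {m: np.log2(conc[m] / (1.5 * background[m])) for m in conc}

    # integrated Nemerow index over all metals
    pi_vals = np.array(list(pi.values()))
    pi_nemerow = np.sqrt((pi_vals.mean() ** 2 + pi_vals.max() ** 2) / 2)

    print({m: round(v, 2) for m, v in igeo.items()}, round(pi_nemerow, 2))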
Early-type galaxies: Automated reduction and analysis of ROSAT PSPC data
NASA Technical Reports Server (NTRS)
Mackie, G.; Fabbiano, G.; Harnden, F. R., Jr.; Kim, D.-W.; Maggio, A.; Micela, G.; Sciortino, S.; Ciliegi, P.
1996-01-01
Preliminary results for early-type galaxies that will form part of a galaxy catalog to be derived from the complete ROSAT data base are presented. The stored data were reduced and analyzed by an automatic pipeline based on a command-language script. The important features of the pipeline include new data time-screening to maximize the signal-to-noise ratio of faint point-like sources, source detection via a wavelet algorithm, and the identification of sources with objects from existing catalogs. The pipeline outputs include reduced images, contour maps, surface brightness profiles, spectra, colors, and hardness ratios.
An open-source computational and data resource to analyze digital maps of immunopeptidomes
Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; ...
2015-07-08
We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.
NASA Astrophysics Data System (ADS)
Watanabe, Yukinobu; Kin, Tadahiro; Araki, Shouhei; Nakayama, Shinsuke; Iwamoto, Osamu
2017-09-01
A comprehensive research program on deuteron nuclear data, motivated by the development of accelerator-based neutron sources, is being executed. It comprises measurements of neutron and gamma-ray yields and production cross sections, modelling of deuteron-induced reactions and code development, nuclear data evaluation and benchmark tests, and application to medical radioisotope production. The goal of this program is to develop a state-of-the-art deuteron nuclear data library up to 200 MeV that will be useful for the design of future (d,xn) neutron sources. The current status and future plans are reviewed.
Non-contact cardiac pulse rate estimation based on web-camera
NASA Astrophysics Data System (ADS)
Wang, Yingzhi; Han, Tailin
2015-12-01
In this paper, we introduce a new methodology for non-contact cardiac pulse rate estimation based on imaging photoplethysmography (iPPG) and blind source separation. This novel approach can be applied to color video recordings of the human face and is based on automatic face tracking together with blind source separation of the RGB color channels into three component signals. The color-channel data extracted from the video are first pre-processed, for example by normalization and sphering. Spectrum analysis of the components recovered by Independent Component Analysis (ICA) with the JADE algorithm is then used to estimate the cardiac pulse rate. Using Bland-Altman and correlation analysis, we compared the cardiac pulse rate extracted from videos recorded by a basic webcam with that from a commercial pulse oximetry sensor and achieved high accuracy and correlation. The root mean square error of the estimates is 2.06 bpm, which indicates that the algorithm enables non-contact measurement of the cardiac pulse rate.
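A minimal Python sketch of the blind-source-separation step follows. FastICA from scikit-learn (version >= 1.1 assumed) is substituted for the JADE algorithm used in the paper, and the RGB traces are simulated rather than extracted from video:

    import numpy as np
    from sklearn.decomposition import FastICA

    fs = 30.0                               # assumed camera frame rate (Hz)
    t = np.arange(0, 30, 1 / fs)
    pulse = np.sin(2 * np.pi * 1.2 * t)     # hidden 72-bpm source (simulated)
    rng = np.random.default_rng(0)
    # per-frame spatial means of the face region's R, G, B channels (simulated)
    rgb = np.outer(pulse, [0.2, 0.5, 0.1]) + 0.3 * rng.standard_normal((t.size, 3))

    rgb = (rgb - rgb.mean(0)) / rgb.std(0)  # normalization; ICA handles whitening
    ica = FastICA(n_components=3, random_state=0, whiten="unit-variance")
    sources = ica.fit_transform(rgb)

    # pick the component with the strongest spectral peak in the 0.75-4 Hz band
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    power = np.abs(np.fft.rfft(sources, axis=0)) ** 2
    best = power[band].max(axis=0).argmax()
    bpm = 60 * freqs[band][power[band, best].argmax()]
    print(f"estimated pulse rate: {bpm:.1f} bpm")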
NASA Astrophysics Data System (ADS)
Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang
2018-02-01
Coherent modulation imaging, which provides fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters had not previously been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of HPLF, a quantitative, statistics-based analysis was first carried out for five different error sources. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis suggest directions for further improving the final accuracy of parameter diagnostics, which is critically important for formal application in the daily routines of HPLF.
Adaptive distributed source coding.
Varodayan, David; Lin, Yao-Chung; Girod, Bernd
2012-05-01
We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
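The syndrome-plus-doping idea can be illustrated at toy scale. In the Python sketch below, exhaustive search stands in for the paper's sum-product decoder, and the block length, rates, and noise level are invented:

    import numpy as np

    # Toy Slepian-Wolf-style syndrome coding: the encoder sends the syndrome
    # s = H x (mod 2) plus one uncoded "doping" bit; the decoder returns the
    # syndrome-consistent sequence closest to its side information.
    rng = np.random.default_rng(0)
    n, k = 10, 4                                 # block length, syndrome length
    H = rng.integers(0, 2, size=(k, n))
    x = rng.integers(0, 2, size=n)               # source block
    y = x ^ (rng.random(n) < 0.1)                # decoder's noisy side information
    syndrome = H @ x % 2
    doping = {0: x[0]}                           # position 0 sent uncoded

    best = None
    for cand in range(2 ** n):                   # brute force, toy sizes only
        c = np.array([(cand >> i) & 1 for i in range(n)])
        if np.any(H @ c % 2 != syndrome):
            continue                             # must match the syndrome
        if any(c[i] != b for i, b in doping.items()):
            continue                             # must match the doping bits
        dist = int(np.sum(c != y))
        if best is None or dist < best[0]:
            best = (dist, c)

    print("recovered correctly:", bool(np.all(best[1] == x)))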
Article title misstates the role of pavement sealers.
O'Reilly, Kirk
2014-08-01
The claim made in the title of Witter et al. (2014), "Coal-tar-based sealcoated pavement: A major PAH source to urban stream sediments", is not supported by the data presented. The authors' use of Pearson correlation coefficients is insufficient to indicate causation, and their spatial analysis and principal component analysis did not include sealer-specific inputs, so they provide no basis for the claim. To test the hypothesis that sealers are a source of PAHs in the stream studied, EPA's Chemical Mass Balance (CMB) source evaluation model was applied to Witter's sediment data. CMB found an excellent fit (R^2 > 0.999) between measured and modeled PAH concentrations when sealers were not included as a potential source. This finding does not support the claim of Witter et al. (2014) that sealers are a major source of PAHs. Copyright © 2013 Elsevier Ltd. All rights reserved.
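The chemical mass balance idea reduces to a non-negative least-squares fit of the receptor profile against candidate source profiles. A Python sketch with invented profiles, standing in for neither the data of Witter et al. nor the EPA CMB implementation:

    import numpy as np
    from scipy.optimize import nnls

    # columns = candidate sources, rows = PAH species (invented fractions)
    sources = np.array([
        [0.30, 0.25, 0.20, 0.15, 0.10],   # e.g. vehicle emissions
        [0.10, 0.15, 0.30, 0.25, 0.20],   # e.g. coal combustion
        [0.05, 0.10, 0.15, 0.30, 0.40],   # e.g. wood burning
    ]).T
    sediment = np.array([0.18, 0.19, 0.24, 0.21, 0.18])  # receptor profile

    contrib, resid = nnls(sources, sediment)  # non-negative source contributions
    r2 = 1 - resid**2 / np.sum((sediment - sediment.mean()) ** 2)
    print("source contributions:", contrib.round(3), "R^2 ~", round(r2, 4))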
NASA Astrophysics Data System (ADS)
Su, Shiliang; Zhi, Junjun; Lou, Liping; Huang, Fang; Chen, Xia; Wu, Jiaping
Characterizing the spatio-temporal patterns and apportioning the pollution sources of water bodies are important for the management and protection of water resources. The main objective of this study is to describe the dynamics of water quality and provide references for improving river pollution control practices. Comprehensive application of neural-based modeling and different multivariate methods was used to evaluate the spatio-temporal patterns and source apportionment of pollution in Qiantang River, China. Measurement data were obtained and pretreated for 13 variables from 41 monitoring sites for the period of 2001-2004. A self-organizing map classified the 41 monitoring sites into three groups (Group A, B and C), representing different pollution characteristics. Four significant parameters (dissolved oxygen, biochemical oxygen demand, total phosphorus and total lead) were identified by discriminant analysis for distinguishing variations of different years, with about 80% correct assignment for temporal variation. Rotated principal component analysis (PCA) identified four potential pollution sources for Group A (domestic sewage and agricultural pollution, industrial wastewater pollution, mineral weathering, vehicle exhaust and sand mining), five for Group B (heavy metal pollution, agricultural runoff, vehicle exhaust and sand mining, mineral weathering, chemical plants discharge) and another five for Group C (vehicle exhaust and sand mining, chemical plants discharge, soil weathering, biochemical pollution, mineral weathering). The identified potential pollution sources explained 75.6% of the total variances for Group A, 75.0% for Group B and 80.0% for Group C, respectively. Receptor-based source apportionment was applied to further estimate source contributions for each pollution variable in the three groups, which facilitated and supported the PCA results. These results could assist managers to develop optimal strategies and determine priorities for river pollution control and effective water resources management.
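As an illustration of the principal component step (the varimax rotation used in the study is omitted here), a short Python sketch on random placeholder data of the same shape as the study's 41 sites by 13 variables matrix:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.standard_normal((41, 13))      # sites x variables (placeholder data)

    Z = StandardScaler().fit_transform(X)  # standardize each variable
    pca = PCA(n_components=5).fit(Z)
    print("explained variance:", pca.explained_variance_ratio_.round(2))

    # loadings: correlation of each variable with each retained component;
    # groups of variables loading on one component suggest a common source
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    print("variable loadings on PC1:", loadings[:, 0].round(2))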
NASA Astrophysics Data System (ADS)
Alewell, Christine; Birkholz, Axel; Meusburger, Katrin; Schindler Wildhaber, Yael; Mabit, Lionel
2016-03-01
As sediment loads impact freshwater systems and infrastructure, their origin in complex landscape systems is of crucial importance for the sustainable management of agricultural catchments. We differentiated the sediment source contributions to a lowland river in central Switzerland using compound-specific isotope analysis (CSIA). We found a clear distinction between sediment sources originating from forest and from agricultural land use. Our results demonstrate that the uncertainty of sediment source attribution can be reduced by (i) using compound content (in our case, long-chain fatty acids; FAs) rather than soil organic matter content to convert the δ13C signal of FAs into soil contributions and (ii) restricting the investigation to the long-chain FAs (>C22:0) so as not to introduce errors from aquatic contributions by algae and microorganisms. Results showed unambiguously that during base flow, agricultural land contributed up to 65% of the suspended sediments, while forest was the dominant sediment source during high flow. This indicates that the connectivity of sediment source areas within the river changes between base and high flow conditions. Uncertainty, which might arise in complex, large-scale studies from undetected source attribution and/or CSSI signature degradation, is low here because of the limited data complexity of our study (i.e., two to three sources and two tracers). Our findings are the first published results highlighting (i) significant differences in the compound-specific stable isotope (CSSI) signatures of sediment sources from land uses dominated by C3 plant cultivation and (ii) the use of these differences to quantify sediment contributions to a small river.
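With two end members and one tracer, the source fraction follows from simple linear mixing. A Python sketch with illustrative δ13C values, not the study's measured CSSI signatures:

    # Two-end-member isotope mixing for attributing suspended sediment to
    # forest vs. agricultural soils; delta values are invented examples.
    def source_fraction(d_mix, d_src_a, d_src_b):
        """Fraction of source A in the mixture from a single tracer."""
        return (d_mix - d_src_b) / (d_src_a - d_src_b)

    d13c_agric, d13c_forest = -30.0, -36.0   # per-mil δ13C of a long-chain FA
    f_agric = source_fraction(-32.1, d13c_agric, d13c_forest)
    print(f"agricultural contribution: {100 * f_agric:.0f} %")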
Joint source based morphometry identifies linked gray and white matter group differences
Xu, Lai; Pearlson, Godfrey; Calhoun, Vince D.
2009-01-01
We present a multivariate approach called joint source based morphometry (jSBM), to identify linked gray and white matter regions which differ between groups. In jSBM, joint independent component analysis (jICA) is used to decompose preprocessed gray and white matter images into joint sources and statistical analysis is used to determine the significant joint sources showing group differences and their relationship to other variables of interest (e.g. age or sex). The identified joint sources are groupings of linked gray and white matter regions with common covariation among subjects. In this study, we first provide a simulation to validate the jSBM approach. To illustrate our method on real data, jSBM is then applied to structural magnetic resonance imaging (sMRI) data obtained from 120 chronic schizophrenia patients and 120 healthy controls to identify group differences. JSBM identified four joint sources as significantly associated with schizophrenia. Linked gray–white matter regions identified in each of the joint sources included: 1) temporal — corpus callosum, 2) occipital/frontal — inferior fronto-occipital fasciculus, 3) frontal/parietal/occipital/temporal —superior longitudinal fasciculus and 4) parietal/frontal — thalamus. Age effects on all four joint sources were significant, but sex effects were significant only for the third joint source. Our findings demonstrate that jSBM can exploit the natural linkage between gray and white matter by incorporating them into a unified framework. This approach is applicable to a wide variety of problems to study linked gray and white matter group differences. PMID:18992825
NASA Astrophysics Data System (ADS)
Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin
2015-04-01
The source of river dissolved organic matter (DOM) during storm events has not been well constrained, which is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater), which exist in a small forested watershed (the Ehwa Brook, South Korea), to the stream DOM during two storm events, using end member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was dominant for the DOM source in event 2 with a higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM for both events. The contribution of leaf litter peaked while that of deeper soils decreased to minima at peak flows. Our results demonstrated that EMMA based on DOM properties could be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
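An EMMA-style unmixing step can be sketched as constrained least squares: non-negative end-member fractions that sum to one. The Python example below uses invented tracer values for the four end members; with only two tracers the system is underdetermined, and a real application would use more tracers than end members:

    import numpy as np
    from scipy.optimize import nnls

    # rows = tracers (e.g. two spectroscopic DOM properties), columns = end
    # members: weeds, leaf litter, soil, groundwater (all values invented)
    E = np.array([[1.8, 0.9, 1.2, 0.4],
                  [0.2, 1.5, 0.7, 0.3]])
    stream = np.array([1.1, 0.9])        # tracer values of the stream DOM sample

    w = 100.0                            # large weight enforcing sum(f) = 1
    A = np.vstack([E, w * np.ones(4)])   # augment with the mass-balance row
    b = np.append(stream, w)
    fractions, _ = nnls(A, b)
    print("end-member fractions:", fractions.round(3),
          "sum:", fractions.sum().round(3))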
An automated multi-scale network-based scheme for detection and location of seismic sources
NASA Astrophysics Data System (ADS)
Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.
2017-12-01
We present a recently developed method - BackTrackBB (Poiata et al. 2016) - for imaging energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale, frequency-selective coherence in the wave field recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions, projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the non-stationary signal by means of higher-order statistics or energy-envelope characteristic functions. This signal processing is designed to detect signal transients - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection-location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated in different tectonic environments: (1) analysis of the year-long precursory phase of the 2014 Iquique earthquake in Chile; and (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.
NASA Astrophysics Data System (ADS)
Lavrieux, Marlène; Meusburger, Katrin; Birkholz, Axel; Alewell, Christine
2017-04-01
Slope destabilization and the associated sediment transfer are among the major causes of aquatic ecosystem impairment and surface water quality degradation. Through land use and agricultural practices, human activities modify the erosive risk of soils and the connectivity of catchments, becoming a key factor in sediment dynamics. Hence, restoration and management plans for water bodies can only be efficient if the sediment sources, and the proportions attributable to different land uses and agricultural practices, are identified. Several sediment fingerprinting methods, based on geochemical (elemental composition), color, magnetic, or isotopic (137Cs) sediment properties, are currently in use, but these tools are not suitable for land-use-based fingerprinting. New organic geochemical approaches are now being developed to discriminate source-soil contributions under different land uses: (i) the compound-specific stable isotope (CSSI) technique, based on the variability of biomarker isotopic signatures (here, fatty acid δ13C) among plant species, and (ii) the analysis of highly specific (i.e., source-family- or even source-species-specific) biomarker assemblages, whose use has until now been mainly restricted to palaeoenvironmental reconstructions but which also offers promising prospects for tracing current sediment origin. The approach was applied to reconstruct the spatio-temporal variability of the main sediment sources of Baldegg Lake (Lucerne Canton, Switzerland), which suffers from substantial eutrophication despite several restoration attempts during the last 40 years. The sediment-supplying areas and the exported volumes were identified using the CSSI technique and highly specific biomarkers, coupled with a sediment connectivity model. The variability of sediment origin was characterized in the short term through the analysis of suspended river sediments sampled at high-flow conditions, and in the long term through the analysis of a lake sediment core covering the last 130 years. The results show the utility of biomarkers and CSSI for tracking organic sources in contrasting land-use settings. Combined with other fingerprinting methods, this approach could in the future become a decision-support tool for catchment management.
Corneal topography with high-speed swept source OCT in clinical examination
Karnowski, Karol; Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Gora, Michalina; Wojtkowski, Maciej
2011-01-01
We present the applicability of high-speed swept source (SS) optical coherence tomography (OCT) for quantitative evaluation of corneal topography. A high-speed OCT device acquiring 108,000 lines/s permits dense 3D imaging of the anterior segment within less than a quarter of a second, minimizing the influence of motion artifacts on the final images and the topographic analysis. The swept laser performance was specially adapted to meet the imaging depth requirements. For the first time to our knowledge, the results of quantitative corneal analysis based on SS OCT are presented for clinical pathologies: keratoconus, a cornea with a superficial postinfectious scar, and a cornea 5 months after penetrating keratoplasty. Additionally, a comparison with widely used commercial systems, a Placido-based topographer and a Scheimpflug imaging-based topographer, is demonstrated. PMID:21991558
What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.
2012-12-01
A common feature of model inter-comparison efforts is that the base-year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year-2005 information for the different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the differences in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources, along with additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, is then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation for long-term model results.
[Study on atmospheric VOCs in Gongga Mountain base station].
Zhang, Jun-Ke; Wang, Yue-Si; Wu, Fang-Kun; Sun, Jie
2012-12-01
Volatile organic compounds (VOCs) play important roles in the atmosphere as precursors of secondary air pollutants. The regional background concentrations and variation characteristics of VOCs in the atmosphere of southwestern China were studied, and a receptor model based on principal component analysis (PCA) was used to identify the major sources of VOCs. Weekly samples were collected in 2007 at the Gongga Mountain base station and analyzed with a three-stage preconcentration method coupled with GC-MS. The annual mean concentrations of TVOCs and NMHCs were 9.40 x 10(-9) +/- 4.55 x 10(-9) and 7.73 x 10(-9) +/- 4.43 x 10(-9), respectively. Aromatic hydrocarbons provided the largest contribution to TVOCs (37.3%), followed by alkanes (30.0%) and halogenated hydrocarbons (19.8%); the smallest contribution was from alkenes (12.9%). Three major sources were resolved by the receptor model: traffic, biogenic emissions, and combustion. The seasonal variation of TVOCs in this area was obvious, following the order autumn > winter > spring > summer; the TVOC concentration in autumn was very significantly higher than that in summer (P < 0.01). The seasonal variations of the four VOC classes showed different characteristics owing to differences in their photochemical properties. Isoprene emissions were from biogenic sources, and regression analysis revealed a good exponential relationship between isoprene concentration and temperature: high temperatures increased isoprene concentrations, whereas the concentration remained constant when the ambient air temperature was below 20 degrees C. The TVOC levels at Gongga Mountain were at a medium level compared with results from other regions, and the site showed the emission characteristics of a clear background station.
Inferring the source of evaporated waters using stable H and O isotopes
NASA Astrophysics Data System (ADS)
Bowen, G. J.; Putman, A.; Brooks, J. R.; Bowling, D. R.; Oerter, E.; Good, S. P.
2017-12-01
Stable isotope ratios of H and O are widely used to identify the source of water, e.g., in aquifers, river runoff, soils, plant xylem, and plant-based beverages. In situations where the sampled water is partially evaporated, its isotope values will have evolved along an evaporation line (EL) in δ2H/δ18O space, and back-correction along the EL to its intersection with a meteoric water line (MWL) has been used to estimate the source water's isotope ratios. Several challenges and potential pitfalls exist with traditional approaches to this problem, including the potential for bias from a commonly used regression-based approach to EL slope estimation and the incomplete estimation of uncertainty in most studies. We suggest the value of a model-based approach to EL estimation and introduce a mathematical framework that eliminates the need to explicitly estimate the EL-MWL intersection, simplifying the analysis and facilitating more rigorous uncertainty estimation. We apply this framework to data from 1,000 lakes sampled in EPA's 2007 National Lakes Assessment. We find that the data for most lakes are consistent with a water source similar to annual runoff, estimated from monthly precipitation and evaporation within the lake basin. Strong evidence for both summer- and winter-biased sources exists, however, with winter bias pervasive in most snow-prone regions. The new analytical framework should improve the rigor of source-water inference from evaporated samples in ecohydrology and related sciences, and our initial results from U.S. lakes suggest that previous interpretations of lakes as unbiased isotope integrators may only be valid in certain climate regimes.
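The traditional back-correction that the framework replaces is easy to state: intersect the sample's evaporation line with the meteoric water line. A Python sketch with an assumed EL slope and invented sample values (the global meteoric water line d2H = 8*d18O + 10 is the standard reference):

    # Back-correct an evaporated sample to its source along the EL by
    # intersecting the EL with the GMWL; all numbers are illustrative.
    d18o_s, d2h_s = -2.0, -30.0    # evaporated lake sample (per mil)
    slope_el = 5.0                 # assumed evaporation-line slope

    # solve  8*d18o + 10 = d2h_s + slope_el * (d18o - d18o_s)  for d18o
    d18o_src = (d2h_s - slope_el * d18o_s - 10.0) / (8.0 - slope_el)
    d2h_src = 8.0 * d18o_src + 10.0
    print(f"inferred source: d18O = {d18o_src:.1f}, d2H = {d2h_src:.1f}")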
NASA Astrophysics Data System (ADS)
Kong, Shaofei; Lu, Bing; Ji, Yaqin; Bai, Zhipeng; Xu, Yonghai; Liu, Yong; Jiang, Hua
2012-08-01
Thirty re-suspended dust samples were collected from building surfaces in an oilfield city, re-suspended and sampled through PM2.5, PM10 and PM100 inlets, and analyzed for 18 PAHs by GC-MS. PAH concentrations, toxicity, and profile characteristics for the different districts and size fractions were studied, and PAH sources were identified by diagnostic ratios and principal component analysis. Results showed that the total amounts of analyzed PAHs in re-suspended dust in Dongying were 45.29, 23.79 and 11.41 μg g-1 for PM2.5, PM10 and PM100, respectively. PAHs tended to concentrate in the finer particles, with PM2.5/PM10 and PM10/PM100 mass ratios of 1.96 ± 0.86 and 2.53 ± 1.57. The old district, with more human activity and a long oil exploitation history, exhibited higher concentrations of PAHs from both combustion and non-combustion sources. The BaP-based toxic equivalent factor and BaP-based equivalent carcinogenic power followed the decreasing sequence PM2.5 > PM10 > PM100, suggesting that the finer the particles, the more toxic the dust. NaP, Phe, Flu, Pyr, BbF and BghiP were the abundant species. Coefficient-of-divergence analysis implied that PAHs in different districts and size fractions had common sources; coal combustion, industrial sources, vehicle emissions and petroleum were probably the main contributors according to the principal component analysis.
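The BaP-based toxic equivalent is the TEF-weighted sum of concentrations, TEQ = sum over i of C_i * TEF_i. A Python sketch using the commonly cited Nisbet and LaGoy (1992) TEF scheme and invented concentrations (μg/g), not the study's measurements:

    # BaP-equivalent toxicity of a PAH mixture; TEF values follow the
    # commonly cited Nisbet & LaGoy (1992) scheme, concentrations invented.
    TEF = {"NaP": 0.001, "Phe": 0.001, "Flu": 0.001, "Pyr": 0.001,
           "BaA": 0.1, "Chr": 0.01, "BbF": 0.1, "BkF": 0.1,
           "BaP": 1.0, "DahA": 5.0, "IcdP": 0.1, "BghiP": 0.01}
    conc_pm25 = {"NaP": 6.1, "Phe": 4.2, "Pyr": 3.0, "BbF": 2.2,
                 "BaP": 1.1, "BghiP": 2.5}

    teq = sum(c * TEF[s] for s, c in conc_pm25.items())
    print(f"BaP-equivalent toxicity: {teq:.2f} ug/g")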
Burgos, Pablo; Kilborn, Kerry; Evans, Jonathan J.
2017-01-01
Objective: Time-based prospective memory (PM), remembering to do something at a particular moment in the future, is considered to depend upon self-initiated strategic monitoring, involving a retrieval mode (sustained maintenance of the intention) plus target checking (intermittent time checks). The present experiment was designed to explore what brain regions and brain activity are associated with these components of strategic monitoring in time-based PM tasks. Method: 24 participants were asked to reset a clock every four minutes while performing a foreground ongoing word categorisation task. EEG activity was recorded and the data were decomposed into source-resolved activity using Independent Component Analysis. Brain regions common across participants, associated with retrieval mode and target checking, were found using Measure Projection Analysis. Results: Participants' performance on the ongoing task decreased when it was performed concurrently with the time-based PM task, reflecting an active retrieval mode that relied on the withdrawal of limited resources from the ongoing task. Brain activity, with its source in or near the anterior cingulate cortex (ACC), showed changes associated with an active retrieval mode, including greater negative ERP deflections, decreased theta synchronization, and increased alpha suppression for events locked to the ongoing task while maintaining a time-based intention. Activity in the ACC was also associated with time checks and was found consistently across participants; however, we did not find an association with time perception processing per se. Conclusion: The involvement of the ACC in both aspects of time-based PM monitoring may be related to different functions that have been attributed to it: strategic control of attention during the retrieval mode (distributing attentional resources between the ongoing task and the time-based task) and anticipatory/decision-making processing associated with clock checks. PMID:28863146
A search for rapidly modulated emission in bright X-ray sources using the HEAO A-1 data base
NASA Technical Reports Server (NTRS)
Fairbank, William M.
1987-01-01
A search was performed in the HEAO A-1 Data Base (located at the Naval Research Laboratory in Washington, D.C.) for evidence of rapidly rotating neutron stars that could be sources of coherent gravitational radiation. A newly developed data analysis algorithm is described. The algorithm was applied to data from observations of Cyg X-2, Cyg X-3, and 1820-30, and upper limits on the pulse fraction were derived and reported.
Analysis, Analysis Practices and Implications for Modeling and Simulation
2007-01-01
[Fragments of the report's reference list:] "...the Somme", New York: Penguin, 1983. Kent, Glenn A., "Looking Back: Four Decades of Analysis," Operations Research, Vol. 50, No. 1, 2002, pp. 122-224. "...to many sources is http://www.saunalahti.fi/fta/EBO.htm (as of December 18, 2006). Effects-based operations are controversial in some respects (Davis..."
NASA Astrophysics Data System (ADS)
Zheng, Sifa; Liu, Haitao; Dan, Jiabi; Lian, Xiaomin
2015-05-01
The linear time-invariant assumption for the determination of acoustic source characteristics - the source strength and the source impedance in the frequency domain - has been proved reasonable in the design of exhaust systems. Different identification methods have been proposed, and the multi-load method, which varies the number and impedance of the loads, is widely used for its convenience. Theoretical error analysis has rarely been addressed, although previous results have shown that an overdetermined set of open pipes can reduce the identification error. This paper contributes a theoretical error analysis for load selection. The relationships between the error in the identification of the source characteristics and the load selection were analysed. A general linear time-invariant model was built based on the four-load method. To analyse the error of the source impedance, an error estimation function was proposed, and the dispersion of the source pressure, obtained by an inverse calculation, was used as an indicator of the accuracy of the results. It was found that for a given load length, the load resistance peaks at the frequencies where the load length equals an odd multiple of one-quarter wavelength, producing the maximum error in source impedance identification. Therefore, load impedances in the frequency ranges around these odd quarter-wavelength frequencies should not be used for source impedance identification. If the selected loads have more similar resistance values (i.e., of the same order of magnitude), the identification error of the source impedance can be effectively reduced.
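The quarter-wavelength problem can be seen from the input impedance of an idealized lossless open pipe, Z_in = j*Z0*tan(kL), which diverges where the load length is an odd multiple of a quarter wavelength. A Python sketch under those idealized assumptions, with invented dimensions:

    import numpy as np

    c, Z0, L = 343.0, 1.0, 0.5       # sound speed (m/s), normalized Z0, pipe length (m)
    f = np.linspace(10, 2000, 2000)  # frequency axis (Hz)
    k = 2 * np.pi * f / c
    Z_in = 1j * Z0 * np.tan(k * L)   # idealized open-pipe input impedance

    # flag the bands where |Z_in| blows up, i.e. near odd multiples of c/(4L)
    flagged = f[np.abs(Z_in) > 50]
    print("first quarter-wave resonance ~", c / (4 * L), "Hz;",
          "flagged band starts at", round(flagged[0], 1), "Hz")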
NASA Technical Reports Server (NTRS)
Lund, Kurt O.
1991-01-01
The simplified geometry for the analysis is an infinite, axisymmetric annulus with a specified solar flux at the outer radius. The inner radius is either adiabatic (modeling Flight Experiment conditions) or convective (modeling Solar Dynamic conditions). Liquid LiF either contacts the outer wall (modeling ground-based testing) or faces a void gap at the outer wall (modeling possible space-based conditions). The analysis is presented in three parts: part 3 considers an adiabatic inner wall and linearized radiation equations; part 2 adds the effects of convection at the inner wall; and part 1 includes the effect of the void gap, as well as the previous effects, and develops the radiation model further. The main results are the differences in melting behavior that can occur between ground-based 1-g experiments and the microgravity flight experiments. Under one gravity, melted PCM will always contact the outer wall bearing the heat flux source, thus providing conductance from this source to the phase-change front. In space-based tests, where a void gap may well form during solidification, the situation is reversed: radiation is then the only mode of heat transfer, and the majority of melting takes place from the inner wall.
Pabon, Peter; Ternström, Sten; Lamarche, Anick
2011-06-01
To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the contour, is assessed and also is compared to density-based VRP averaging methods that use the overlap count. VRP contours can be usefully described and compared using FDs. The method also permits the visualization of the local covariation along the contour average. For example, the FD-based analysis shows that the population variance for ensembles of VRP contours is usually smallest at the upper left part of the VRP. To illustrate the method's advantages and possible further application, graphs are given that compare the averaged contours from different authors and recording devices--for normal, trained, and untrained male and female voices as well as for child voices. The proposed technique allows any VRP shape to be brought to the same uniform base. On this uniform base, VRP contours or contour elements coming from a variety of sources may be placed within the same graph for comparison and for statistical analysis.
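The Fourier-descriptor idea can be sketched in a few lines: treat the closed contour as complex points, take the FFT, and keep the low-order coefficients as a compact, resampling-friendly description. The Python example below uses a synthetic ellipse as a stand-in for a real VRP contour:

    import numpy as np

    n = 256
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    # stand-in VRP contour: an ellipse sampled as complex points z = x + i*y
    z = 3.0 * np.cos(theta) + 1j * 1.5 * np.sin(theta)

    fd = np.fft.fft(z) / n              # Fourier descriptors of the contour
    keep = 8                            # number of low-order descriptors kept
    fd_trunc = np.zeros_like(fd)
    fd_trunc[:keep] = fd[:keep]
    fd_trunc[-keep:] = fd[-keep:]
    z_recon = np.fft.ifft(fd_trunc) * n # smoothed, uniformly resampled contour

    err = np.mean(np.abs(z - z_recon))
    print(f"mean reconstruction error with {2 * keep} descriptors: {err:.2e}")

Because every contour is reduced to the same small set of coefficients, contours from different studies and devices can be averaged and compared coefficient by coefficient, which is the uniform base the abstract describes.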
RADIOISOTOPES USED IN PHARMACY. 5. IONIZING RADIATION IN PHARMACEUTICAL ANALYSIS (in Danish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristensen, K.
1962-09-01
The use of radioisotope methods for analyzing drugs is reviewed. It is pointed out that heretofore most methods have been based on isotope dilution principles whereas in the future radioactivation analysis, especially with neutron sources, offers great possibilities. (BBB)
White Matter Fiber-based Analysis of T1w/T2w Ratio Map.
Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin
2017-02-01
To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.
Li, Li-Guan; Yin, Xiaole; Zhang, Tong
2018-05-24
Antimicrobial resistance (AMR) is a worldwide public health concern. Current widespread AMR pollution poses a big challenge to accurately disentangling source-sink relationships, which are further confounded by point and non-point sources, as well as endogenous and exogenous cross-reactivity, under complicated environmental conditions. Because of their insufficient capability to identify source-sink relationships within a quantitative framework, traditional source-tracking methods based on antibiotic resistance gene (ARG) signatures are hardly a practical solution. By combining broad-spectrum ARG profiling with the machine-learning classification tool SourceTracker, we present here a novel way to address the question in the era of high-throughput sequencing. Its potential for broad application was first validated with 656 global-scale samples covering diverse environmental types (e.g., human/animal gut, wastewater, soil, ocean) and broad geographical regions (e.g., China, USA, Europe, Peru). Its potential and limitations in source prediction, as well as the effects of parameter adjustment, were then rigorously evaluated using artificial configurations with representative source proportions. When applying SourceTracker in region-specific analysis, excellent performance was achieved with ARG profiles in two sample types with obviously different source compositions, i.e., the influent and effluent of a wastewater treatment plant. Two environmental metagenomic datasets along an anthropogenic-interference gradient further supported its potential for practical application. To complement general-profile-based source tracking in distinguishing continuous gradient pollution, a few generalist and specialist indicator ARGs across ecotypes were identified in this study. We demonstrate for the first time that the developed source-tracking platform, when coupled with proper experimental design and efficient metagenomic analysis tools, has significant implications for assessing AMR pollution. Based on the predicted source contributions, risk ranking of different sources of ARG dissemination becomes possible, paving the way for establishing priorities in mitigating ARG spread and designing effective control strategies.
NASA Astrophysics Data System (ADS)
Ajayakumar, J.; Shook, E.; Turner, V. K.
2017-10-01
With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by place-to-place variations in population density. To capture the spatio-temporal variations in public response during extreme events, we have developed the Socio-Environmental Data Explorer (SEDE), which collects and integrates social media, news, and environmental data to support exploration and assessment of public response to extreme events. In this study, using SEDE, we conduct spatio-temporal social media response analyses for four major extreme events in the United States: the "North American storm complex" in December 2015, "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and "Hurricane Matthew" in October 2016. The analyses use geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, when developing software solutions to support the analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases of the data sources and to adapt techniques and enhance capabilities to mitigate those biases. The normalization strategies that we have developed and incorporated into SEDE help reduce the population bias of social media data and will be useful for researchers and decision-makers analyzing spatio-temporal social media responses during extreme events.
Calibration of Passive Microwave Polarimeters that Use Hybrid Coupler-Based Correlators
NASA Technical Reports Server (NTRS)
Piepmeier, J. R.
2003-01-01
Four calibration algorithms are studied for microwave polarimeters that use hybrid coupler-based correlators: 1) the conventional two-look method with hot and cold sources, 2) three looks at combinations of hot and cold sources, 3) the two-look method with a correlated source, and 4) a four-look method combining methods 2 and 3. The systematic errors are found to depend on the polarimeter component parameters and on the accuracy of the calibration noise temperatures. A case-study radiometer in four different remote sensing scenarios was considered in light of these results: applications for ocean surface salinity, ocean surface winds, and soil moisture were found to be sensitive to different systematic errors. Finally, a standard uncertainty analysis was performed on the four-look calibration algorithm, which was found to be most sensitive to the correlated calibration source.
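Method 1, the conventional two-look calibration, solves for a gain and offset from the hot and cold references and then inverts them for scene measurements. A Python sketch with illustrative numbers:

    # Two-look (hot/cold) radiometer calibration; all values are invented.
    T_hot, T_cold = 300.0, 77.0        # reference noise temperatures (K)
    V_hot, V_cold = 2.10, 0.95         # detector outputs for the two looks

    gain = (V_hot - V_cold) / (T_hot - T_cold)   # volts per kelvin
    offset = V_hot - gain * T_hot                # receiver contribution

    V_scene = 1.40
    T_scene = (V_scene - offset) / gain          # retrieved scene temperature
    print(f"retrieved scene temperature: {T_scene:.1f} K")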
Protein Data Bank depositions from synchrotron sources.
Jiang, Jiansheng; Sweet, Robert M
2004-07-01
A survey and analysis of Protein Data Bank (PDB) depositions from international synchrotron radiation facilities, based on the latest released PDB entries, are reported. The results (http://asdp.bnl.gov/asda/Libraries/) show that worldwide, every year since 1999, more than 50% of the deposited X-ray structures have used synchrotron facilities, reaching 75% by 2003. In this web-based database, all PDB entries among individual synchrotron beamlines are archived, synchronized with the weekly PDB release. Statistics regarding the quality of experimental data and the refined model for all structures are presented, and these are analysed to reflect the impact of synchrotron sources. The results confirm the common impression that synchrotron sources extend the size of structures that can be solved with equivalent or better quality than home sources.
Seo, S-T; Tsuchiya, K
2004-01-01
To study the genotypic identification and characterization of 119 Burkholderia cepacia complex (Bcc) strains recovered from clinical and environmental sources in Japan and Thailand. Based on the results of 16S rDNA RFLP analysis after digestion with DdeI, the Bcc strains were differentiated into two patterns: pattern 1 (including Burkholderia vietnamiensis) and pattern 2 (including B. cepacia genomovar I, Burkholderia cenocepacia and Burkholderia stabilis). All strains but one belonged to pattern 2. In the RFLP analysis of the recA gene using HaeIII, strains were separated into eight patterns, designated A, D, E, G, H, I, J and K, of which pattern K was new. The Burkholderia cepacia epidemic strain marker (BCESM), encoded by esmR, and the pyrrolnitrin biosynthetic locus, encoded by prnC, were present in 22 strains (18%) and 88 strains (74%) from all sources, respectively. All esmR-positive strains belonged to B. cenocepacia, whereas most prnC-positive strains belonged to B. cepacia genomovar I. Strains derived from clinical sources were assigned to B. cepacia genomovar I, B. cenocepacia, B. stabilis and B. vietnamiensis. The majority of Bcc strains from environmental sources (77 of 95) belonged to B. cepacia genomovar I, whereas the rest belonged to B. cenocepacia. On the basis of genomovar-specific PCR and prnC RFLP analysis, strains belonging to recA pattern K were identified as B. cepacia genomovar I. This work provides the genotypic identification of a collection of Bcc strains from Japan and Thailand. RFLP analysis of the prnC gene promises to be a useful method for differentiating Burkholderia pyrrocinia from B. cepacia genomovar I strains.
Devulder, Veerle; Degryse, Patrick; Vanhaecke, Frank
2013-12-17
The provenance of the flux raw material used in the manufacturing of Roman glass is an understudied topic in archaeology. Whether one or multiple sources of natron mineral salts were exploited during this period is still open for debate, largely because of the lack of a good provenance indicator. The flux is the major source of B in Roman glass. Therefore, B isotopic analysis of a sufficiently large collection and variety (origin and age) of such glass samples might give an indication of the number of flux sources used. For this purpose, a method based on acid digestion, chromatographic B isolation and B isotopic analysis using multicollector inductively coupled plasma mass spectrometry was developed. B isolation was accomplished using a combination of strong cation exchange and strong anion exchange chromatography. Although the B fraction was not completely matrix-free, the remaining Sb was shown not to affect the δ11B result. The method was validated using obsidian and archaeological glass samples that were stripped of their B content, after which an isotopic reference material with known B isotopic composition was added. Absence of artificial B isotope fractionation was demonstrated, and the total uncertainty was shown to be <2‰. A proof-of-concept application to natron glass samples showed a narrow range of δ11B, whereas first results for natron salt samples do show a larger difference in δ11B. These results suggest the use of only one natron source or of several sources with similar δ11B. This indicates that B isotopic analysis is a promising tool for the provenance determination of this flux raw material.
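The δ11B values here follow standard delta notation against NIST SRM 951 boric acid. A small sketch, assuming the commonly cited certified ratio for SRM 951; the measured ratio below is illustrative:

```python
# Delta-notation step: a measured 11B/10B ratio is expressed relative to the
# NIST SRM 951 boric acid standard (11B/10B ~ 4.04362, the commonly cited
# certified value; an assumption here, not stated in the abstract).

R_SRM951 = 4.04362

def delta_11B(r_sample, r_standard=R_SRM951):
    """Return delta-11B in per mil relative to NIST SRM 951."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example with a made-up measured ratio:
print(f"{delta_11B(4.0985):+.2f} per mil")  # ~ +13.6 per mil
```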
NASA Technical Reports Server (NTRS)
Long, V. S.; Wright, M. C.; McDanels, S. J.; Lubas, D.; Tucker, B.; Marciniak, P. J.
2010-01-01
This slide presentation reviews the debris analysis of the Starboard Solar Alpha Rotary Joint (SARJ), a mechanism designed to keep the solar arrays facing the sun. The goal of the analysis was to identify the failure mechanism based on surface morphology and to determine the source of the debris through elemental and particle analysis.
Chronic disease surveillance systems within the US Associated Pacific Island jurisdictions.
Hosey, Gwen; Ichiho, Henry; Satterfield, Dawn; Dankwa-Mullan, Irene; Kuartei, Stevenson; Rhee, Kyu; Belyeu-Camacho, Tayna; deBrum, Ione; Demei, Yorah; Lippwe, Kipier; Luces, Patrick Solidum; Roby, Faiese
2011-07-01
In recent years, illness and death due to chronic disease in the US Associated Pacific Islands (USAPI) jurisdictions have dramatically increased. Effective chronic disease surveillance can help monitor disease trends, evaluate public policy, prioritize resource allocation, and guide program planning, evaluation, and research. Although chronic disease surveillance is being conducted in the USAPI, no recently published capacity assessments for chronic disease surveillance are available. The objective of this study was to assess the quality of existing USAPI chronic disease data sources and identify jurisdictional capacity for chronic disease surveillance. The assessment included a chronic disease data source inventory, literature review, and review of surveillance documentation available from the web or through individual jurisdictions. We used the World Health Organization's Health Metric Network Framework to assess data source quality and to identify jurisdictional capacity. Results showed that USAPI data sources are generally aligned with widely accepted chronic disease surveillance indicators and use standardized data collection methodology to measure chronic disease behavioral risks, preventive practices, illness, and death. However, all jurisdictions need to strengthen chronic disease surveillance through continued assessment and expanded support for valid and reliable data collection, analysis and reporting, dissemination, and integration among population-based and institution-based data sources. For sustained improvement, we recommend investment and technical assistance in support of a chronic disease surveillance system that integrates population-based and institution-based data sources. An integrated strategy that bridges and links USAPI data sources can support evidence-based policy and population health interventions.
Apparatus and methods for determining at least one characteristic of a proximate environment
Novascone, Stephen R.; West, Phillip B.; Anderson, Michael J.
2008-04-15
Methods and an apparatus for determining at least one characteristic of an environment are disclosed. Vibrational energy may be imparted into an environment, a magnitude of damping of the vibrational energy may be measured, and at least one characteristic of the environment may be determined therefrom. Particularly, a vibratory source may be operated and coupled to an environment. At least one characteristic of the environment may be determined based on a shift in at least one steady-state frequency of oscillation of the vibratory source. An apparatus may include at least one vibratory source and a structure for positioning the at least one vibratory source proximate to an environment. Further, the apparatus may include an analysis device for determining at least one characteristic of the environment based at least partially upon a shift in a steady-state oscillation frequency of the vibratory source for a given impetus.
Wang, Jiawei; Zhou, Yuqi; Sun, Xiaodong; Ma, Qingyu; Zhang, Dong
2016-04-01
As a multiphysics imaging approach, magnetoacoustic tomography with magnetic induction (MAT-MI) works on the physical mechanism of magnetic excitation, acoustic vibration, and transmission. Based on a theoretical analysis of the source vibration, numerical studies are conducted to simulate pathological changes of tissues for a single-layer cylindrical model with gradually varying conductivity, and to estimate the strengths of the sources inside the model. The results suggest that the inner source is generated by the product of the conductivity and the curl of the induced electric intensity inside a conductively homogeneous medium, while the boundary source is produced by the cross product of the gradient of conductivity and the induced electric intensity at conductivity boundaries. For a biological tissue with low conductivity, the strength of the boundary source is much higher than that of the inner source only when the conductivity transition zone is small. In this case, the tissue can be treated as a model with abruptly varying conductivity, ignoring the influence of the inner source. Otherwise, the contributions of the inner and boundary sources should be evaluated together quantitatively. This study provides a basis for further work on precise image reconstruction in MAT-MI for pathological tissues.
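The two source terms can be restated compactly from the abstract's own description (a hedged restatement: σ denotes conductivity and E the induced electric intensity; signs and physical prefactors are omitted since the abstract does not give them):

```latex
% Hedged restatement of the two MAT-MI source terms described above.
\[
  s_{\mathrm{inner}} \propto \sigma \, \nabla \times \mathbf{E}
  \qquad \text{(homogeneous interior)}
\]
\[
  s_{\mathrm{boundary}} \propto \nabla \sigma \times \mathbf{E}
  \qquad \text{(conductivity boundary)}
\]
```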
Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; Van De Ville, Dimitri
2017-08-01
Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of rapid technical developments in MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open-source and one of the most widely used fMRI data analysis packages. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework for actively engaging in the development of novel neurofeedback approaches, so that local methodological developments can easily be made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java
Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut
2015-01-01
Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319
Receptor modeling for source apportionment of polycyclic aromatic hydrocarbons in urban atmosphere.
Singh, Kunwar P; Malik, Amrita; Kumar, Ranjan; Saxena, Puneet; Sinha, Sarita
2008-01-01
This study reports source apportionment of polycyclic aromatic hydrocarbons (PAHs) in particulate depositions on vegetation foliage near a highway in the urban environment of Lucknow city (India), using the principal components analysis/absolute principal components scores (PCA/APCS) receptor modeling approach. The multivariate method enables identification of major PAH sources along with their quantitative contributions to individual PAHs. The PCA identified three major sources of PAHs: combustion, vehicular emissions, and diesel-based activities. The PCA/APCS receptor modeling approach revealed that combustion sources (natural gas, wood, coal/coke, biomass) contributed 19-97% of various PAHs, vehicular emissions 0-70%, diesel-based sources 0-81% and other miscellaneous sources 0-20% of different PAHs. The contributions of major pyrolytic and petrogenic sources to the total PAHs were 56% and 42%, respectively. Further, the combustion-related sources contribute the major fraction of the carcinogenic PAHs in the study area. The high correlation coefficient (R2 > 0.75 for most PAHs) between measured and predicted PAH concentrations supports the applicability of the PCA/APCS receptor modeling approach for estimating source contributions to PAHs in particulates.
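The PCA/APCS workflow lends itself to a compact sketch. The following is a minimal, hedged Thurston-and-Spengler-style implementation on synthetic data; varimax rotation is omitted for brevity, and all names and numbers are illustrative:

```python
# PCA/APCS receptor-modeling sketch on a samples-by-species PAH matrix X:
# standardize, extract components, rescale scores against an artificial
# "true zero" sample, then regress totals on the absolute scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(120, 16))  # 120 samples x 16 PAHs

mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd                                # standardized concentrations

pca = PCA(n_components=3).fit(Z)                 # 3 sources, as in the study
scores = pca.transform(Z)
z0 = (np.zeros_like(mu) - mu) / sd               # artificial zero-concentration sample
apcs = scores - pca.transform(z0[None, :])       # absolute principal component scores

total = X.sum(axis=1)                            # total PAHs per sample
reg = LinearRegression().fit(apcs, total)
contrib = reg.coef_ * apcs.mean(axis=0)          # mean contribution per source
print("mean source contributions:", np.round(contrib, 2))
print("unexplained (intercept):", round(reg.intercept_, 2))
```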
SPIRE1.03. Spatial Paradigm for Information Retrieval and Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, K.J.; Bohn, S.; Crow, V.
The SPIRE system consists of software for visual analysis of primarily text-based information sources. This technology enables content analysis of text documents without reading all of the documents. It employs several algorithms for text and word proximity analysis and identifies the key themes within the text documents. From this analysis, it projects the results onto a visual spatial proximity display (Galaxies or Themescape) where items (documents and/or themes) that are visually close to each other are known to have similar content. Innovative interaction techniques then allow for dynamic visual analysis of large text-based information spaces.
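A minimal, hedged analogue of the Galaxies-style projection: vectorize the documents, reduce to two dimensions, and read visual proximity as content proximity. sklearn stands in here for SPIRE's internal text engine:

```python
# Project documents into 2-D so that nearby points share content, in the
# spirit of SPIRE's Galaxies display. Documents are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "wind turbine blade fatigue analysis",
    "fatigue cracks in turbine blades",
    "protein folding molecular dynamics",
    "molecular simulation of protein structure",
]
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
xy = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

for doc, (x, y) in zip(docs, xy):
    print(f"({x:+.2f}, {y:+.2f})  {doc}")
# Documents on the same theme land near each other; themes emerge as clusters.
```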
Use of Non-Social Work Journals in Social Work Research: Results of a Citation Analysis
ERIC Educational Resources Information Center
Strothmann, Molly
2010-01-01
Social work research and teaching draw on the literature of other disciplines. While the use of interdisciplinary sources has been discussed at length and citation patterns in social work literature have been studied, no research has identified specific sources from other disciplines that are important for social work scholarship. Based on…
Role of poultry in the H7N9 influenza outbreaks in China
USDA-ARS?s Scientific Manuscript database
The outbreaks of H7N9 influenza in China in spring 2013 resulted in many human cases with a high fatality rate. Poultry were suspected as the source of infection based on sequence analysis and virus isolations from live poultry markets (LPM). The original source of the virus from poultry farms is ...
Point source detection in infrared astronomical surveys
NASA Technical Reports Server (NTRS)
Pelzmann, R. F., Jr.
1977-01-01
Data processing techniques useful for infrared astronomy data analysis systems are reported. This investigation is restricted to consideration of data from space-based telescope systems operating as survey instruments. In this report the theoretical background for specific point-source detection schemes is completed, and the development of specific algorithms and software for the broad range of requirements is begun.
A Critical Analysis of the Sources of Reading Recovery: An Empiricist Perspective
ERIC Educational Resources Information Center
Groff, Patrick
2004-01-01
This discussion of the sources of "Reading Recovery" presents the results of an investigation into whether or not this relatively costly, tutoring remedial reading program, designed for primary-grade students, is based on relevant experimental evidence as to how these students best learn to read. The general finding of the study was that Reading…
Mapping Points of Interest: An Analysis of Students' Engagement with Digital Primary Sources
ERIC Educational Resources Information Center
Rysavy, Monica D. T.; Michalak, Russell; Hunt, Kevin
2018-01-01
The Digital Archival Advertisements Survey Process (DAASP) model is a collaborative active learning exercise designed to aid students in evaluating primary source documents of print-based advertisements. By deploying DAASP, the researchers were able to assess the students' ability to evaluate their biases of the advertisements in a first-year…
Chemicals dispersed by accidental, deliberate, or weather-related events must be rapidly identified to assess health risks. Mass spectra from high levels of analytes obtained using rapid, open-air ionization by a Direct Analysis in Real Time (DART®) ion source often contain
Unique effects and moderators of effects of sources on self-efficacy: A model-based meta-analysis.
Byars-Winston, Angela; Diestelmann, Jacob; Savoy, Julia N; Hoyt, William T
2017-11-01
Self-efficacy beliefs are strong predictors of academic pursuits, performance, and persistence, and in theory are developed and maintained by 4 classes of experiences Bandura (1986) referred to as sources: performance accomplishments (PA), vicarious learning (VL), social persuasion (SP), and affective arousal (AA). The effects of sources on self-efficacy vary by performance domain and individual difference factors. In this meta-analysis (k = 61 studies of academic self-efficacy; N = 8,965), we employed B. J. Becker's (2009) model-based approach to examine cumulative effects of the sources as a set and unique effects of each source, controlling for the others. Following Becker's recommendations, we used available data to create a correlation matrix for the 4 sources and self-efficacy, then used these meta-analytically derived correlations to test our path model. We further examined moderation of these associations by subject area (STEM vs. non-STEM), grade, sex, and ethnicity. PA showed by far the strongest unique association with self-efficacy beliefs. Subject area was a significant moderator, with sources collectively predicting self-efficacy more strongly in non-STEM (k = 14) compared with STEM (k = 47) subjects (R2 = .37 and .22, respectively). Within studies of STEM subjects, grade level was a significant moderator of the coefficients in our path model, as were 2 continuous study characteristics (percent non-White and percent female). Practical implications of the findings and future research directions are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
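Becker's model-based approach ends in ordinary regression algebra on the pooled correlation matrix. A hedged sketch with a made-up matrix, not the paper's pooled values:

```python
# Path coefficients from a meta-analytically derived correlation matrix
# among the four sources (PA, VL, SP, AA) and self-efficacy (SE).
# The matrix is illustrative only.
import numpy as np

labels = ["PA", "VL", "SP", "AA"]
R = np.array([
    [ 1.00,  0.45,  0.50, -0.30,  0.55],   # PA
    [ 0.45,  1.00,  0.40, -0.20,  0.35],   # VL
    [ 0.50,  0.40,  1.00, -0.25,  0.40],   # SP
    [-0.30, -0.20, -0.25,  1.00, -0.35],   # AA
    [ 0.55,  0.35,  0.40, -0.35,  1.00],   # SE
])

Rxx, rxy = R[:4, :4], R[:4, 4]
beta = np.linalg.solve(Rxx, rxy)   # unique (partial) effect of each source
r2 = float(rxy @ beta)             # variance in SE explained by the set
for name, b in zip(labels, beta):
    print(f"{name}: beta = {b:+.3f}")
print(f"R^2 = {r2:.3f}")
```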
NASA Astrophysics Data System (ADS)
Squizzato, Stefania; Masiol, Mauro
2015-10-01
The air quality is influenced by the potential effects of meteorology at meso- and synoptic scales. While local weather and mixing-layer dynamics mainly drive the dispersion of sources at small scales, long-range transport affects the movement of air masses over regional, transboundary and even continental scales. Long-range transport may advect polluted air masses from hot-spots, increasing pollution levels at nearby or remote locations, or may further raise air pollution levels where external air masses originate from other hot-spots. Therefore, knowledge of ground-wind circulation and potential long-range transport is fundamental not only to evaluate how local or external sources may affect the air quality at a receptor site but also to quantify it. This review is focussed on establishing the relationships among PM2.5 sources, meteorological conditions and air mass origin in the Po Valley, which is one of the most polluted areas in Europe. We have chosen the results from a recent study carried out in Venice (Eastern Po Valley) and have analysed them using different statistical approaches to understand the influence of external and local contributions to PM2.5 sources. External contributions were evaluated by applying Trajectory Statistical Methods (TSMs) based on back-trajectory analysis, including (i) back-trajectory cluster analysis, (ii) the potential source contribution function (PSCF) and (iii) concentration-weighted trajectory (CWT) analysis. Furthermore, the relationships between the source contributions and ground-wind circulation patterns were investigated using (iv) cluster analysis on wind data and (v) the conditional probability function (CPF). Finally, local source contributions were estimated by applying the Lenschow approach. In summary, the integrated approach of different techniques successfully identified both local and external sources of particulate matter pollution in a European hot-spot affected by the worst air quality.
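Of the TSMs listed, the PSCF is the simplest to sketch: grid the back-trajectory endpoints and compute, per cell, the fraction of endpoints belonging to trajectories that arrived with high receptor concentrations. A minimal illustration on synthetic trajectories; the grid, threshold, and weighting below are common choices, not the study's exact settings:

```python
# PSCF sketch: PSCF_ij = m_ij / n_ij, where n counts all trajectory
# endpoints in grid cell (i, j) and m counts endpoints of "high
# concentration" trajectories (here: above the 75th percentile).
import numpy as np

rng = np.random.default_rng(1)
n_traj, n_pts = 200, 48
lon = rng.uniform(8, 16, size=(n_traj, n_pts))   # endpoint longitudes
lat = rng.uniform(43, 47, size=(n_traj, n_pts))  # endpoint latitudes
pm25 = rng.gamma(4.0, 8.0, size=n_traj)          # receptor PM2.5 per trajectory

high = pm25 > np.percentile(pm25, 75)
bins = [np.arange(8, 16.5, 0.5), np.arange(43, 47.5, 0.5)]

n_all, _, _ = np.histogram2d(lon.ravel(), lat.ravel(), bins=bins)
m_high, _, _ = np.histogram2d(lon[high].ravel(), lat[high].ravel(), bins=bins)

with np.errstate(invalid="ignore"):
    pscf = m_high / n_all                         # NaN where a cell is empty
weight = np.clip(n_all / np.nanmean(n_all[n_all > 0]), 0, 1)  # downweight sparse cells
print("max weighted PSCF:", round(float(np.nanmax(pscf * weight)), 3))
```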
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiaoyan Tang; Min Shao; Yuanhang Zhang
1996-12-31
Ambient aerosol is one of the most important pollutants in China. This paper presents the aerosol sources of the Beijing area revealed by a combination of multivariate analysis models and 14C tracer measurements on an Accelerator Mass Spectrometer (AMS). The results indicated that the mass concentration of particulate matter (<100 µm) did not increase rapidly compared with the economic development of Beijing city. The multivariate analysis showed that the predominant source was soil dust, which contributed more than 50% of atmospheric particles. However, it would be risky to conclude from this alone that aerosol pollution from anthropogenic sources was less important in Beijing. For lack of reliable tracers, it was very hard to distinguish coal burning from the soil source; thus it was suspected that the apparent soil source might be a mixture of soil dust and coal burning. The 14C measurements showed that the carbonaceous species of the aerosol had quite different emission sources. For carbonaceous aerosols in Beijing, the contribution from fossil fuel to ambient particles was nearly 2/3; as man-made activities (coal burning, etc.) increased, the fossil part would contribute more to atmospheric carbonaceous particles. For example, in downtown Beijing during space-heating seasons, fossil fuel contributed more than 95% of carbonaceous particles, which is potentially harmful to the population. By using multivariate analysis together with the 14C data, two important sources of aerosols in Beijing (soil dust and coal combustion) were more reliably distinguished, which is critically important for the assessment of the aerosol problem in China.
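The 14C split rests on fossil carbon being 14C-free. A one-line sketch; the contemporary reference value is an assumption for illustration, since the abstract does not give the study's exact reference:

```python
# Fossil fraction of carbonaceous aerosol from 14C content. pMC = percent
# modern carbon; the contemporary reference (~110 pMC, bomb-carbon elevated
# biomass for that period) is an illustrative assumption.

def fossil_fraction(pmc_sample, pmc_contemporary=110.0):
    """Fraction of carbon from fossil sources, which contain no 14C."""
    return 1.0 - pmc_sample / pmc_contemporary

# A winter downtown sample dominated by coal burning:
print(f"{fossil_fraction(5.0):.1%} fossil")   # ~95%, as reported for heating season
```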
Acoustic centering of sources measured by surrounding spherical microphone arrays.
Hagai, Ilan Ben; Pollow, Martin; Vorländer, Michael; Rafaely, Boaz
2011-10-01
The radiation patterns of acoustic sources have great significance in a wide range of applications, such as measuring the directivity of loudspeakers and investigating the radiation of musical instruments for auralization. Recently, surrounding spherical microphone arrays have been studied for sound field analysis, facilitating measurement of the pressure around a sphere and the computation of the spherical harmonics spectrum of the sound source. However, the sound radiation pattern may be affected by the location of the source inside the microphone array, which is an undesirable property when aiming to characterize source radiation in a unique manner. This paper presents a theoretical analysis of the spherical harmonics spectrum of spatially translated sources and defines four measures for the misalignment of the acoustic center of a radiating source. Optimization is used to promote optimal alignment based on the proposed measures and the errors caused by numerical and array-order limitations are investigated. This methodology is examined using both simulated and experimental data in order to investigate the performance and limitations of the different alignment methods. © 2011 Acoustical Society of America
Client-nurse relationships in home-based palliative care: a critical analysis of power relations.
Oudshoorn, Abram; Ward-Griffin, Catherine; McWilliam, Carol
2007-08-01
To elicit an in-depth understanding of the sources of power and how power is exercised within client-nurse relationships in home-based palliative care. As in all social relations, power is present within client-nurse relationships. Although much research has focused on interpersonal relationships in nursing, the concept of power within the client-nurse relationship in palliative care settings has not been extensively investigated. Applying a critical lens, secondary qualitative data analysis was conducted. Seventeen nurse and 16 client transcripts from a primary study were selected for secondary data analysis. These 33 transcripts afforded theme saturation, which allowed for both commonalities and differences to be identified. Data analysis involved analytic coding. Study findings help make explicit the underlying power present in the context of home-based palliative care and how this power is used and potentially abused. In analysing the sources and exercise of power, the linkage between macro and micro levels of power is made explicit, as nurses functioned within a hierarchy of power. The findings suggest that educational/occupational status continues to be a source of power for nurses within the relationship. However, nurses also experience powerlessness within the home care context. For clients, being able to control one's own life is a source of power, but this power is over-shadowed by the powerlessness experienced in relationships with nurses. The exercise of power by clients and nurses creates experiences of both liberation and domination. Nurses who are willing to reflect on and change those disempowering aspects of the client-nurse relationship, including a harmful hierarchy, will ultimately be successful in the health promotion of clients in home-based palliative care. Additionally, it should be recognized that nurses work within a specific health system context and, therefore, their practice is influenced by policies and funding models implemented at various levels of the health care system. The insights gained through this investigation may assist nurses and other health professionals in reflecting on and improving practices and policies within home-based palliative care and within home care in general.
Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yi, Qing; Whaley, Richard Clint; Qasem, Apan
This report summarizes our effort and results in building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully automated tuning to semi-automated development and to manual programmable control.
Versatile new ion source for the analysis of materials in open air under ambient conditions.
Cody, Robert B; Laramée, James A; Durst, H Dupont
2005-04-15
A new ion source has been developed for rapid, noncontact analysis of materials at ambient pressure and at ground potential. The new source, termed DART (for "Direct Analysis in Real Time"), is based on the reactions of electronic or vibronic excited-state species with reagent molecules and polar or nonpolar analytes. DART has been installed on a high-resolution time-of-flight mass spectrometer (TOFMS) that provides improved selectivity and accurate elemental composition assignment through exact mass measurements. Although DART has been applied to the analysis of gases, liquids, and solids, a unique application is the direct detection of chemicals on surfaces without requiring sample preparation, such as wiping or solvent extraction. DART has demonstrated success in sampling hundreds of chemicals, including chemical agents and their signatures, pharmaceuticals, metabolites, peptides and oligosaccharides, synthetic organics, organometallics, drugs of abuse, explosives, and toxic industrial chemicals. These species were detected on various surfaces, such as concrete, asphalt, human skin, currency, airline boarding passes, business cards, fruits, vegetables, spices, beverages, body fluids, horticultural leaves, cocktail glasses, and clothing. DART employs no radioactive components and is more versatile than devices using radioisotope-based ionization. Because its response is instantaneous, DART provides real-time information, a critical requirement for screening or high-throughput applications.
Explosion Source Similarity Analysis via SVD
NASA Astrophysics Data System (ADS)
Yedlin, Matthew; Ben Horin, Yochai; Margrave, Gary
2016-04-01
An important seismological ingredient for establishing a regional seismic nuclear discriminant is the similarity analysis of a sequence of explosion sources. To investigate source similarity, we are fortunate to have access to a sequence of 1805 three-component recordings of quarry blasts, shot from March 2002 to January 2015. The centroid of these blasts has an estimated location of 36.3E and 29.9N. All blasts were detonated by JPMC (Jordan Phosphate Mines Co.). All data were recorded at the Israeli NDC station HFRI, located at 30.03N and 35.03E. Data were first winnowed based on the distribution of maximum amplitudes in the neighborhood of the P-wave arrival. The winnowed data were then detrended using the algorithm of Cleveland et al. (1990). The detrended data were bandpass filtered between 0.1 and 12 Hz using an eighth-order Butterworth filter. Finally, data were sorted based on maximum trace amplitude. Two similarity analysis approaches were used. First, for each component, the entire suite of traces was decomposed into its eigenvector representation by employing singular-value decomposition (SVD). The data were then reconstructed using 10 percent of the singular values, with a resulting enhancement of the S-wave and surface-wave arrivals. The results of this first method are then compared to those of the second analysis method, based on the eigenface decomposition analysis of Turk and Pentland (1991). While both methods yield similar results in enhancing data arrivals and reducing data redundancy, more analysis is required to calibrate the recorded data to charge size, a quantity that was not available for the current study. References: Cleveland, R. B., Cleveland, W. S., McRae, J. E., and Terpenning, I., STL: A seasonal-trend decomposition procedure based on loess, Journal of Official Statistics, 6, No. 1, 3-73, 1990. Turk, M. and Pentland, A., Eigenfaces for recognition, Journal of Cognitive Neuroscience, 3(1), 71-86, 1991.
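The SVD step is compact enough to sketch directly: stack the preprocessed traces as rows of a matrix, keep roughly 10% of the singular values, and reconstruct. A synthetic illustration under those assumptions, with a reduced trace count for speed:

```python
# Low-rank SVD reconstruction of a blast sequence: coherent arrivals shared
# across the sequence survive truncation; incoherent energy is suppressed.
# 200 traces stand in for the study's 1805 recordings.
import numpy as np

rng = np.random.default_rng(7)
n_traces, n_samples = 200, 2048
common = np.sin(2 * np.pi * 2.0 * np.arange(n_samples) / 100.0)  # repeated signature
D = np.outer(rng.uniform(0.5, 1.5, n_traces), common)            # scaled copies
D += 0.5 * rng.standard_normal(D.shape)                          # incoherent noise

U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = max(1, int(0.10 * len(s)))                   # keep 10% of singular values
D_rec = (U[:, :k] * s[:k]) @ Vt[:k, :]

energy_kept = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"rank {k} of {len(s)}; {energy_kept:.1%} of signal energy retained")
```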
Conjoint Analysis for New Service Development on Electricity Distribution in Indonesia
NASA Astrophysics Data System (ADS)
Widaningrum, D. L.; Chynthia; Astuti, L. D.; Seran, M. A. B.
2017-07-01
Illegal use of electricity remains rampant in Indonesia, especially in activities where no power source is available, such as at the locations of street vendors. It is detrimental not only to the state, but also to the perpetrators of electricity theft and to the surrounding communities. The purpose of this study is to create a New Service Development (NSD) proposal for a new electricity service for street vendors' activities, based on their preferences. The methods applied in the NSD are conjoint analysis, cluster analysis, Quality Function Deployment (QFD), service blueprinting, process flow diagrams, and quality control planning. The results of this study are the attributes of the new electricity service and their importance based on the preferences of street vendors as customers, a customer segmentation, the service design for the new service, the technical response design, the operational procedures, and the quality control plan for those operational procedures.
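The conjoint-analysis core reduces to least squares on dummy-coded attribute levels. A hedged sketch; the attributes and levels (price tier, payment scheme, power capacity) are illustrative stand-ins, not the study's actual design:

```python
# Part-worth utility estimation: respondents rate service profiles built
# from attribute levels; utilities come from OLS on dummy-coded levels.
import numpy as np
import pandas as pd

profiles = pd.DataFrame({
    "price":    ["low", "low", "mid", "mid", "high", "high", "low", "high"],
    "payment":  ["prepaid", "monthly", "prepaid", "monthly"] * 2,
    "capacity": ["450VA", "900VA"] * 4,
})
ratings = np.array([8.5, 7.9, 7.1, 6.6, 5.2, 4.4, 9.0, 4.9])  # mean scores (made up)

X = pd.get_dummies(profiles, drop_first=True).astype(float)
X.insert(0, "intercept", 1.0)
coef, *_ = np.linalg.lstsq(X.to_numpy(), ratings, rcond=None)

print(dict(zip(X.columns, np.round(coef, 2))))
# Relative attribute importance = range of part-worths per attribute,
# normalized across attributes -- the input to the QFD step that follows.
```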
Analysis of pollen load based on color, physicochemical composition and botanical source.
Modro, Anna F H; Silva, Izabel C; Luz, Cynthia F P; Message, Dejair
2009-06-01
Pollen load samples from 10 hives of Apis mellifera (L.) were analyzed based on their physicochemical composition and botanical source, considering color as a parameter for quality control. In seven samples it was possible to establish the occurrence of more than 80% of a single pollen type, characterizing them as unifloral but with protein content variation. One of the samples was exclusively composed of saprophytic fungi (Cladosporium sp.). Comparing the mean results of the fungi loads with those of the nutritional components of pollen load, the former presented higher protein, mineral matter and dry matter and lower organic matter, ethereal extract and total carbohydrate values. The monochromatic samples met the physicochemical specifications regulating pollen load quality. The results showed that homogeneous coloration of the pollen load was not found to be a good indication of unifloral pollen, confirming the importance of physicochemical analysis and melissopalynological analysis for characterization of the quality of commercial pollen load.
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M
2017-02-01
Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics were evaluated and demonstrated their correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
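A simplified stand-in for the two metrics: the paper's construction projects a simplex built from all pairwise Jensen-Shannon distances, whereas this sketch keeps only the spirit of it, comparing each source PDF to the pooled distribution:

```python
# Per-source outlyingness and a global deviation from Jensen-Shannon
# distances between histogram-estimated PDFs. Data and summary are
# illustrative, not the paper's exact estimators.
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(3)
sources = [rng.normal(0.0, 1.0, 500),
           rng.normal(0.1, 1.0, 500),
           rng.normal(1.5, 1.2, 500)]   # third source is contextually biased

edges = np.linspace(-4, 6, 41)
pdfs = [np.histogram(s, bins=edges, density=True)[0] + 1e-12 for s in sources]
central = np.mean(pdfs, axis=0)          # stand-in for the latent central PDF

outlyingness = [jensenshannon(p, central) for p in pdfs]   # per-source degree
global_dev = float(np.mean(outlyingness))                  # global variability
print("source outlyingness:", np.round(outlyingness, 3))
print("global deviation:", round(global_dev, 3))
```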
Guo, Xueru; Zuo, Rui; Meng, Li; Wang, Jinsheng; Teng, Yanguo; Liu, Xin; Chen, Minhua
2018-01-01
Globally, groundwater resources are being degraded by rapid social development. Thus, there is an urgent need to assess the combined impacts of natural and enhanced anthropogenic sources on groundwater chemistry. The aim of this study was to identify seasonal characteristics and spatial variations in anthropogenic and natural effects, to improve the understanding of major hydrogeochemical processes based on source apportionment. Thirty-four groundwater points located in a riverside groundwater resource area in northeast China were sampled during the wet and dry seasons in 2015. Using principal component analysis and factor analysis, 4 principal components (PCs) were extracted from 16 groundwater parameters. Three of the PCs were water-rock interaction (PC1), geogenic Fe and Mn (PC2), and agricultural pollution (PC3). A remarkable difference (PC4) was organic pollution originating from negative anthropogenic effects during the wet season, and geogenic F enrichment during the dry season. Groundwater exploitation resulted in a dramatic depression cone with a higher hydraulic gradient around the water source area. It not only intensified dissolution of calcite, dolomite, gypsum, Fe, Mn and fluorine minerals, but also induced more surface water recharge to the water source area. The spatial distribution of the PCs also suggested that the center of the study area was extremely vulnerable to contamination by Fe, Mn, COD, and F−. PMID:29415516
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
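The CLS-to-ACLS idea can be sketched in a few lines: calibrate pure-component spectra by classical least squares, then augment them with the dominant residual direction so prediction tolerates an unmodeled source. A synthetic illustration under those assumptions, not the patented implementation:

```python
# CLS calibration plus residual-based augmentation on a synthetic
# three-component system (two modeled components, one unknown interferent).
import numpy as np

rng = np.random.default_rng(5)
n_samp, n_wav = 40, 200
K_true = np.abs(rng.standard_normal((2, n_wav)))      # two modeled pure spectra
unmodeled = np.abs(rng.standard_normal(n_wav))        # unknown interferent spectrum
C = rng.uniform(0, 1, size=(n_samp, 2))               # known concentrations
u = rng.uniform(0, 1, size=(n_samp, 1))               # unknown concentrations
A = C @ K_true + u @ unmodeled[None, :] + 0.01 * rng.standard_normal((n_samp, n_wav))

K_cls = np.linalg.lstsq(C, A, rcond=None)[0]          # CLS estimate of pure spectra
E = A - C @ K_cls                                     # residuals hold the interferent
_, _, Vt = np.linalg.svd(E, full_matrices=False)
K_aug = np.vstack([K_cls, Vt[:1]])                    # augmented CLS model

A_new = np.array([[0.3, 0.7]]) @ K_true + 0.5 * unmodeled[None, :]
c_hat = np.linalg.lstsq(K_aug.T, A_new.ravel(), rcond=None)[0]
print("predicted concentrations:", np.round(c_hat[:2], 3))  # ~ [0.3, 0.7]
```

Without the augmentation row, the same prediction is biased by the interferent; with it, the unmodeled variation is absorbed by the extra "spectral component", which is the behavior claimed above.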
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-07-26
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-01-11
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
BATSE imaging survey of the Galactic plane
NASA Technical Reports Server (NTRS)
Grindlay, J. E.; Barret, D.; Bloser, P. F.; Zhang, S. N.; Robinson, C.; Harmon, B. A.
1997-01-01
The Burst and Transient Source Experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO) provides all-sky monitoring capability, occultation analysis and occultation imaging, which enables new and fainter sources to be searched for in relatively crowded fields. The occultation imaging technique is used in combination with an automated BATSE image scanner, allowing analysis of large data sets of occultation images for detections of candidate sources and for the construction of source catalogs and databases. This automated image scanner system is being tested on archival data in order to optimize the search and detection thresholds. The image search system, its calibration results and preliminary survey results on archival data are reported. The aim of the survey is to identify a complete sample of black hole candidates in the Galaxy and to constrain the numbers of black hole systems and neutron star systems.
Localization of short-range acoustic and seismic wideband sources: Algorithms and experiments
NASA Astrophysics Data System (ADS)
Stafsudd, J. Z.; Asgari, S.; Hudson, R.; Yao, K.; Taciroglu, E.
2008-04-01
We consider the determination of the location (source localization) of a disturbance source that emits acoustic and/or seismic signals. We devise an enhanced approximate maximum-likelihood (AML) algorithm to process data collected at acoustic sensors (microphones) belonging to an array of non-collocated but otherwise identical sensors. The approximate maximum-likelihood algorithm exploits the time-delay-of-arrival of acoustic signals at different sensors and yields the source location. For processing the seismic signals, we investigate two distinct algorithms, both of which process data collected at a single measurement station comprising a triaxial accelerometer, to determine direction-of-arrival. The directions-of-arrival determined at each sensor station are then combined using a weighted least-squares approach for source localization. The first of the direction-of-arrival estimation algorithms is based on the spectral decomposition of the covariance matrix, while the second is based on surface wave analysis. Both of the seismic source localization algorithms have their roots in seismology, and covariance matrix analysis has been successfully employed in applications where the source and the sensors (array) are typically separated by planetary distances (i.e., hundreds to thousands of kilometers). Here, we focus on very short distances instead (e.g., less than one hundred meters), with an outlook to applications in multi-modal surveillance, including target detection, tracking, and zone intrusion. We demonstrate the utility of the aforementioned algorithms through a series of open-field tests wherein we successfully localize wideband acoustic and/or seismic sources. We also investigate a basic strategy for fusion of results yielded by acoustic and seismic arrays.
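The time-delay-of-arrival ingredient of the acoustic processing is easy to sketch: estimate the inter-sensor delay from the peak of the cross-correlation. A synthetic illustration; the signal and numbers are made up:

```python
# TDOA estimation between two microphones via cross-correlation.
import numpy as np

fs = 8000.0                          # sample rate (Hz), illustrative
rng = np.random.default_rng(11)
s = rng.standard_normal(4000)        # wideband source signature
true_delay = 23                      # samples (~2.9 ms, ~1 m extra path at 343 m/s)

x1 = s + 0.1 * rng.standard_normal(s.size)
x2 = np.roll(s, true_delay) + 0.1 * rng.standard_normal(s.size)

xc = np.correlate(x2, x1, mode="full")
lag = np.argmax(xc) - (s.size - 1)   # lag in samples
print(f"estimated delay: {lag} samples = {lag / fs * 1e3:.2f} ms")
# Each pairwise delay constrains the source to a hyperbola; the AML
# algorithm fuses many sensor pairs to pick the most likely source point.
```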
NASA Astrophysics Data System (ADS)
Zhao, Yang; Dai, Rui-Na; Xiao, Xiang; Zhang, Zong; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe
2017-02-01
Two-person neuroscience, a perspective in understanding human social cognition and interaction, involves designing immersive social interaction experiments as well as simultaneously recording brain activity of two or more subjects, a process termed "hyperscanning." Using newly developed imaging techniques, the interbrain connectivity or hyperlink of various types of social interaction has been revealed. Functional near-infrared spectroscopy (fNIRS)-hyperscanning provides a more naturalistic environment for experimental paradigms of social interaction and has recently drawn much attention. However, most fNIRS-hyperscanning studies have computed hyperlinks using sensor data directly while ignoring the fact that the sensor-level signals contain confounding noises, which may lead to a loss of sensitivity and specificity in hyperlink analysis. In this study, on the basis of independent component analysis (ICA), a source-level analysis framework is proposed to investigate the hyperlinks in a fNIRS two-person neuroscience study. The performance of five widely used ICA algorithms in extracting sources of interaction was compared in simulative datasets, and increased sensitivity and specificity of hyperlink analysis by our proposed method were demonstrated in both simulative and real two-person experiments.
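The proposed source-level idea admits a minimal, hedged sketch: unmix each subject's multichannel signals with ICA, then compute hyperlinks between the recovered sources rather than raw sensor channels. sklearn's FastICA stands in here for the five ICA algorithms compared in the paper:

```python
# Source-level hyperlink sketch for simulated two-subject fNIRS-like data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
t = np.arange(3000) / 10.0                       # 10 Hz sampling, illustrative
shared = np.sin(2 * np.pi * 0.1 * t)             # interaction-related activity

def subject_channels(gain):
    """Mix one task-related source and one noise source into 8 channels."""
    srcs = np.c_[gain * shared + 0.3 * rng.standard_normal(t.size),
                 rng.standard_normal(t.size)]
    mixing = rng.uniform(0.5, 1.5, size=(2, 8))
    return srcs @ mixing + 0.1 * rng.standard_normal((t.size, 8))

S_a = FastICA(n_components=2, random_state=0).fit_transform(subject_channels(1.0))
S_b = FastICA(n_components=2, random_state=0).fit_transform(subject_channels(0.8))

hyperlink = np.corrcoef(np.c_[S_a, S_b].T)[:2, 2:]   # A-sources x B-sources
print("max |source-level hyperlink|:", round(float(np.abs(hyperlink).max()), 2))
```

The same correlation computed between raw channel pairs mixes the task source with sensor noise, which is the sensitivity loss the study attributes to sensor-level analysis.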
A method for generating new datasets based on copy number for cancer analysis.
Kim, Shinuk; Kon, Mark; Kang, Hyunsik
2015-01-01
New data sources for the analysis of cancer data are rapidly supplementing the large number of gene-expression markers used for current methods of analysis. Significant among these new sources are copy number variation (CNV) datasets, which typically enumerate several hundred thousand CNVs distributed throughout the genome. Several useful algorithms allow systems-level analyses of such datasets. However, these rich data sources have not yet been analyzed as deeply as gene-expression data. To address this issue, the extensive toolsets used for analyzing expression data in cancerous and noncancerous tissue (e.g., gene set enrichment analysis and phenotype prediction) could be redirected to extract a great deal of predictive information from CNV data, in particular those derived from cancers. Here we present a software package capable of preprocessing standard Agilent copy number datasets into a form to which essentially all expression analysis tools can be applied. We illustrate the use of this toolset in predicting the survival time of patients with ovarian cancer or glioblastoma multiforme and also provide an analysis of gene- and pathway-level deletions in these two types of cancer.
Web-based multimedia information retrieval for clinical application research
NASA Astrophysics Data System (ADS)
Cao, Xinhua; Hoo, Kent S., Jr.; Zhang, Hong; Ching, Wan; Zhang, Ming; Wong, Stephen T. C.
2001-08-01
We described a web-based data warehousing method for retrieving and analyzing neurological multimedia information. The web-based method supports convenient access, effective search and retrieval of clinical textual and image data, and on-line analysis. To improve the flexibility and efficiency of multimedia information query and analysis, a three-tier, multimedia data warehouse for epilepsy research has been built. The data warehouse integrates clinical multimedia data related to epilepsy from disparate sources and archives them into a well-defined data model.
Gamage, I H; Jonker, A; Zhang, X; Yu, P
2014-01-24
The objective of this study was to determine the possibility of using molecular spectroscopy with multivariate techniques as a fast method to detect source effects among original feedstock sources of wheat and their corresponding co-products, wheat DDGS, from bioethanol production. Different sources of the bioethanol feedstock and their corresponding bioethanol co-products, three samples per source, were collected from the same newly built bioethanol plant with current bioethanol processing technology. Multivariate molecular spectral analyses were carried out using agglomerative hierarchical cluster analysis (AHCA) and principal component analysis (PCA). The molecular spectral data of the different feedstock sources and their corresponding co-products were compared in four different regions: ca. 1800-1725 cm-1 (carbonyl C=O ester, mainly related to lipid structure conformation), ca. 1725-1482 cm-1 (amide I and amide II region, mainly related to protein structure conformation), ca. 1482-1180 cm-1 (mainly associated with structural carbohydrate) and ca. 1180-800 cm-1 (mainly related to carbohydrates) in a complex plant-based system. The results showed that molecular spectroscopy with multivariate techniques could reveal the structural differences among the bioethanol feedstock sources and among their corresponding co-products. The AHCA and PCA analyses were able to distinguish the molecular structure differences associated with chemical functional groups among the different sources of the feedstock and their corresponding co-products. The molecular spectral differences indicated differences in functional, biomolecular and biopolymer groups, which were confirmed by wet chemical analysis. These biomolecular and biopolymer structural differences were associated with chemical and nutrient profiles and with nutrient utilization and availability. Molecular spectral analyses had the potential to identify molecular structure differences among bioethanol feedstock sources and their corresponding co-products. Copyright © 2013 Elsevier B.V. All rights reserved.
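The AHCA step can be sketched compactly: take one spectral window per sample, normalize, and cluster hierarchically to see whether feedstock sources separate. Synthetic Gaussian bands stand in for the real FTIR spectra:

```python
# Agglomerative hierarchical clustering (AHCA) of spectra over the
# amide I/II window. Data are synthetic stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
wavenumbers = np.arange(1725, 1482, -4.0)         # amide I and II region, cm-1

def spectrum(peak):
    """Toy absorbance band centered at `peak` plus noise."""
    return np.exp(-((wavenumbers - peak) ** 2) / 800.0) \
        + 0.02 * rng.standard_normal(wavenumbers.size)

X = np.array([spectrum(1655) for _ in range(3)] +   # feedstock source 1 (3 reps)
             [spectrum(1630) for _ in range(3)])    # feedstock source 2 (3 reps)
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

Z = linkage(X, method="ward")                     # AHCA on the normalized spectra
print("cluster assignment:", fcluster(Z, t=2, criterion="maxclust"))
# Expected: replicates of each source fall into distinct clusters, mirroring
# the source separation reported above; PCA scores plots serve the same role.
```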
NASA Astrophysics Data System (ADS)
Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura
2016-05-01
Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.
NASA Astrophysics Data System (ADS)
Wang, Xing; Sun, Wenliang; Guo, Min; Li, Minjiao; Li, Wan
2018-01-01
This paper studies fine particulate matter in a typical region. A component spectrum bank is constructed based on online source apportionment technology; the apportionment results are then used to verify the effectiveness of the fine-particle component spectrum bank, which also serves as the matching basis for online source-apportionment receptor samples. Next, an empirical study of matching diagnosis of the particle sources of air pollution is carried out using the online source apportionment technology, to provide technical support for the cause analysis and treatment of heavy-pollution weather.
[Construction and application of special analysis database of geoherbs based on 3S technology].
Guo, Lan-ping; Huang, Lu-qi; Lv, Dong-mei; Shao, Ai-juan; Wang, Jian
2007-09-01
In this paper, the structure, data sources, and data codes of "the spatial analysis database of geoherbs" based on 3S technology are introduced, and the essential functions of the database, such as data management, remote sensing, spatial interpolation, spatial statistics, spatial analysis and further development, are described. Finally, two examples of database usage are given: the first is classification and calculation of the NDVI index from remote sensing imagery of the geoherbal area of Atractylodes lancea; the second is an adaptation analysis of A. lancea. These indicate that "the spatial analysis database of geoherbs" has bright prospects in the spatial analysis of geoherbs.
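The NDVI computation in the first usage example reduces to the standard band ratio NDVI = (NIR - Red) / (NIR + Red). Below is a minimal sketch with hypothetical band rasters in place of the actual remote sensing imagery:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against divide-by-zero

# Hypothetical band rasters for a geoherbal area (e.g. Landsat NIR and red bands).
rng = np.random.default_rng(2)
nir_band = rng.integers(0, 255, (100, 100))
red_band = rng.integers(0, 255, (100, 100))
print(ndvi(nir_band, red_band).mean())
```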
Research on precise modeling of buildings based on multi-source data fusion of air to ground
NASA Astrophysics Data System (ADS)
Li, Yongqiang; Niu, Lubiao; Yang, Shasha; Li, Lixue; Zhang, Xitong
2016-03-01
To address the accuracy problem in precise modeling of buildings, a test study was conducted based on multi-source data for buildings in the same test area, including roof data from airborne LiDAR, aerial orthophotos, and facade data from vehicle-borne LiDAR. After the top and bottom outlines of the building clusters were accurately extracted, a series of qualitative and quantitative analyses was carried out on the 2D interval between the outlines. The research results provide reliable accuracy support for precise building modeling from air-to-ground multi-source data fusion; at the same time, solutions to key technical problems are discussed.
Peña, Victor H.; Fernández, Geysson J.; Gómez-Palacio, Andrés M.; Mejía-Jaramillo, Ana M.; Cantillo, Omar; Triana-Chávez, Omar
2012-01-01
Methods to determine the blood-meal sources of hematophagous Triatominae bugs (Chagas disease vectors) are serological or based on PCR employing species-specific primers or heteroduplex analysis, but these are expensive, inaccurate, or problematic when the insect has fed on more than one species. To solve these problems, we developed a technique based on HRM analysis of the mitochondrial gene cytochrome B (Cyt b). This technique recognized 14 species involved in several eco-epidemiological cycles of the transmission of Trypanosoma cruzi, and it worked with DNA extracted from intestinal contents and feces 30 days after feeding, revealing a resolution power that can detect mixed feedings. Field samples were analyzed, showing blood-meal sources corresponding to domestic, peridomiciliary and sylvatic cycles. The technique requires only a single pair of primers that amplify the Cyt b gene in vertebrates and no other standardization, making it quick, easy, relatively inexpensive, and highly accurate. PMID:22389739
Interferometric Laser Scanner for Direction Determination
Kaloshin, Gennady; Lukin, Igor
2016-01-01
In this paper, we explore the potential capabilities of a new laser scanning-based method for direction determination. The method for fully coherent beams is extended to the case in which the interference pattern is produced in the turbulent atmosphere by two partially coherent sources. The theoretical analysis identified the conditions under which a stable pattern may form on extended paths of 0.5–10 km in length. We describe a method for selecting laser scanner parameters that ensures the necessary operating range in the atmosphere for any possible turbulence characteristics. The method is based on analysis of the mean intensity of the interference pattern formed by two partially coherent sources of optical radiation. The visibility of the interference pattern is estimated as a function of propagation path length, the structure parameter of atmospheric turbulence, and the spacing of the radiation sources producing the pattern. It is shown that, when atmospheric turbulence is moderately strong, the contrast of the laser scanner's interference pattern can ensure its applicability at ranges up to 10 km. PMID:26805841
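The visibility (contrast) statistic at the heart of the method is V = (Imax - Imin) / (Imax + Imin). Below is a minimal sketch for an idealized two-source fringe pattern; the wavelength, source spacing, path length and the single coherence factor gamma (standing in for the combined effect of partial source coherence and turbulence) are all hypothetical.

```python
import numpy as np

def fringe_visibility(intensity):
    """Classical fringe contrast V = (Imax - Imin) / (Imax + Imin)."""
    imax, imin = intensity.max(), intensity.min()
    return (imax - imin) / (imax + imin)

# Mean intensity of a two-source interference pattern along a detector line,
# with a mutual coherence factor gamma in [0, 1] degrading the contrast.
x = np.linspace(-0.05, 0.05, 2001)                   # m, transverse coordinate
wavelength, spacing, distance = 532e-9, 0.3, 5000.0  # hypothetical values
gamma = 0.6
phase = 2 * np.pi * spacing * x / (wavelength * distance)
I = 1.0 + gamma * np.cos(phase)
print(fringe_visibility(I))   # equals gamma for this idealized pattern
```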
Cho, Seongbeom; Boxrud, David J; Bartkus, Joanne M; Whittam, Thomas S; Saeed, Mahdi
2007-01-01
A simplified multiple-locus variable-number tandem repeat analysis (MLVA) was developed using one-shot multiplex PCR for seven variable-number tandem repeat (VNTR) markers with high diversity capacity. MLVA, phage typing, and PFGE methods were applied to 34 diverse Salmonella Enteritidis isolates from human and non-human sources. MLVA detected allelic variations that helped to classify the S. Enteritidis isolates into more evenly distributed subtypes than the other methods. MLVA-based S. Enteritidis clonal groups were largely associated with the sources of the isolates. Nei's diversity indices for polymorphism ranged from 0.25 to 0.70 for the seven VNTR loci. Based on Simpson's and Shannon's diversity indices, MLVA had a higher discriminatory power than pulsed-field gel electrophoresis (PFGE), phage typing, or multilocus enzyme electrophoresis. Therefore, MLVA may be used along with PFGE to enhance the effectiveness of molecular epidemiologic investigation of S. Enteritidis infections. PMID:17692097
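Simpson's and Shannon's diversity indices used to compare discriminatory power are simple functions of subtype frequencies. Below is a minimal sketch with hypothetical subtype assignments (not the study's actual typing results); a method that splits the same isolates into more, evenly sized subtypes scores higher on both indices.

```python
import math
from collections import Counter

def simpson_diversity(types):
    """Simpson's index of diversity, 1 - sum(p_i^2)."""
    n = len(types)
    return 1.0 - sum((c / n) ** 2 for c in Counter(types).values())

def shannon_diversity(types):
    """Shannon's diversity index, -sum(p_i * ln p_i)."""
    n = len(types)
    return -sum((c / n) * math.log(c / n) for c in Counter(types).values())

# Hypothetical subtype assignments of 34 isolates under two typing methods.
mlva = ['A1', 'A2', 'A3', 'A1', 'B1'] * 6 + ['C1'] * 4
pfge = ['P1'] * 20 + ['P2'] * 10 + ['P3'] * 4
for name, typing in [('MLVA', mlva), ('PFGE', pfge)]:
    print(name, round(simpson_diversity(typing), 3),
          round(shannon_diversity(typing), 3))
```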
NASA Astrophysics Data System (ADS)
Patsariya, Ajay; Rai, Shiwani; Kumar, Yogendra, Dr.; Kirar, Mukesh, Dr.
2017-08-01
The energy crisis, particularly in economies with growing GDPs, has opened a new panorama for sustainable power sources such as solar energy, which has experienced huge growth. Increasingly high penetration levels of photovoltaic (PV) generation are emerging in the smart grid. Solar power is intermittent and variable, as the solar resource at ground level is highly dependent on cloud cover variability, atmospheric aerosol levels, and other weather parameters. The inherent variability of large-scale solar generation introduces significant challenges for smart grid energy management. Accurate forecasting of solar power/irradiance is essential to secure the economic operation of the smart grid. In this paper a novel TLBO-based MPPT technique is proposed for harvesting solar energy. A comparative analysis is presented between the conventional perturb-and-observe (P&O) and incremental conductance (IC) methods and the proposed MPPT technique. The research was carried out in Matlab Simulink (version 2013).
Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei
2011-12-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP-equivalent (BaPeq) concentrations. Based on the transformed data, a comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in source water and drinking water of China were mostly acceptable. However, specific regions, such as the Lanzhou reach of the Yellow River and the Qiantang River, deserve more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
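A Monte Carlo risk calculation of this kind typically propagates parameter distributions through the standard ingestion-risk formula, risk = C × IR × EF × ED × SF / (BW × AT). The sketch below uses NumPy with entirely hypothetical distribution parameters, not the study's BaPeq data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Entirely hypothetical inputs for an adult oral-exposure scenario; the study
# itself used published BaPeq monitoring data and age-specific parameters.
conc = rng.lognormal(mean=np.log(2e-5), sigma=1.0, size=n)  # BaPeq, mg/L
ir = rng.normal(2.0, 0.3, size=n).clip(0.5)                 # intake, L/day
ef, ed = 365.0, 30.0                                        # days/yr, years
bw = rng.normal(60.0, 10.0, size=n).clip(30.0)              # body weight, kg
at = 70.0 * 365.0                                           # averaging time, days
sf = 7.3                                                    # BaP oral slope factor, (mg/kg/day)^-1

risk = conc * ir * ef * ed * sf / (bw * at)
print('P(risk > 1e-5) =', (risk > 1e-5).mean())
print('95th percentile =', np.quantile(risk, 0.95))
```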
Tan, Z.; Liu, S.; Johnston, C.A.; Liu, J.; Tieszen, L.L.
2006-01-01
Our ability to forecast the role of ecosystem processes in mitigating global greenhouse effects relies on understanding the driving forces on terrestrial C dynamics. This study evaluated the controls on soil organic C (SOC) changes from 1973 to 2000 in the northwest Great Plains. SOC source-sink relationships were quantified using the General Ensemble Biogeochemical Modeling System (GEMS) based on 40 randomly located 10 × 10 km2 sample blocks. These sample blocks were aggregated into cropland, grassland, and forestland groups based on land cover composition within each sample block. Canonical correlation analysis indicated that SOC source-sink relationship from 1973 to 2000 was significantly related to the land cover type while the change rates mainly depended on the baseline SOC level and annual precipitation. Of all selected driving factors, the baseline SOC and nitrogen levels controlled the SOC change rates for the forestland and cropland groups, while annual precipitation determined the C source-sink relationship for the grassland group in which noticeable SOC sink strength was attributed to the conversion from cropped area to grass cover. Canonical correlation analysis also showed that grassland ecosystems are more complicated than others in the ecoregion, which may be difficult to identify on a field scale. Current model simulations need further adjustments to the model input variables for the grass cover-dominated ecosystems in the ecoregion.
A phase coherence approach to identifying co-located earthquakes and tremor
NASA Astrophysics Data System (ADS)
Hawthorne, J. C.; Ampuero, J.-P.
2018-05-01
We present and use a phase coherence approach to identify seismic signals that have similar path effects but different source time functions: co-located earthquakes and tremor. The method used is a phase coherence-based implementation of empirical matched field processing, modified to suit tremor analysis. It works by comparing the frequency-domain phases of waveforms generated by two sources recorded at multiple stations. We first cross-correlate the records of the two sources at a single station. If the sources are co-located, this cross-correlation eliminates the phases of the Green's function. It leaves the relative phases of the source time functions, which should be the same across all stations so long as the spatial extent of the sources is small compared with the seismic wavelength. We therefore search for cross-correlation phases that are consistent across stations as an indication of co-located sources. We also introduce a method to obtain relative locations between the two sources, based on back-projection of interstation phase coherence. We apply this technique to analyse two tremor-like signals that are thought to be composed of a number of earthquakes. First, we analyse a 20 s long seismic precursor to a M 3.9 earthquake in central Alaska. The analysis locates the precursor to within 2 km of the mainshock, and it identifies several bursts of energy, potentially foreshocks or groups of foreshocks, within the precursor. Second, we examine several minutes of volcanic tremor prior to an eruption at Redoubt Volcano. We confirm that the tremor source is located close to repeating earthquakes identified earlier in the tremor sequence. The amplitude of the tremor diminishes about 30 s before the eruption, but the phase coherence results suggest that the tremor may persist at some level through this final interval.
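The station-consistency test can be sketched as follows: at each frequency, form the cross-spectral phase between the two records at every station, then average the unit phasors across stations; magnitudes near 1 indicate phase-consistent, hence plausibly co-located, sources. This toy Python version omits real Green's functions and uses a synthetic common signal, so all numbers are illustrative only.

```python
import numpy as np

def interstation_phase_coherence(rec_a, rec_b):
    """Coherence of cross-spectral phases across stations.

    rec_a, rec_b: (stations, samples) records of the two candidate sources.
    For co-located sources the path (Green's function) phases cancel in each
    per-station cross-spectrum, leaving the source-time-function phase
    difference, which is then identical at every station.
    """
    Xa = np.fft.rfft(rec_a, axis=1)
    Xb = np.fft.rfft(rec_b, axis=1)
    cross = Xa * np.conj(Xb)
    phasors = cross / np.abs(cross)        # keep phase only
    return np.abs(phasors.mean(axis=0))    # 1 = perfectly consistent phases

rng = np.random.default_rng(4)
sig = rng.standard_normal(512)
stations = 5
rec_a = np.tile(sig, (stations, 1)) + 0.3 * rng.standard_normal((stations, 512))
rec_b = np.tile(np.roll(sig, 7), (stations, 1)) + 0.3 * rng.standard_normal((stations, 512))
print(interstation_phase_coherence(rec_a, rec_b)[:8])  # values near 1
```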
NASA Astrophysics Data System (ADS)
Eatough, Delbert J.; Grover, Brett D.; Woolwine, Woods R.; Eatough, Norman L.; Long, Russell; Farber, Robert
Positive matrix factorization (PMF2) was used to elucidate sources of fine particulate material (PM2.5) for a study conducted during July and August 2005, in Riverside, CA. One-hour averaged semi-continuous measurements were made with a suite of instruments to provide PM2.5 mass and chemical composition data. Total PM2.5 mass concentrations (non-volatile plus semi-volatile) were measured with an R&P filter dynamic measurement system (FDMS TEOM), and a conventional TEOM monitor was used to measure non-volatile mass concentrations. PM2.5 chemical species monitors included a dual-oven Sunset monitor to measure both non-volatile and semi-volatile carbonaceous material, an ion chromatographic-based monitor to measure sulfate and nitrate, and an Anderson Aethalometer to measure black carbon (BC). Gas phase data including CO, NO2, NOx and O3 were also collected during the sampling period. In addition, single-particle measurements were made using aerosol time-of-flight mass spectrometry (ATOFMS). Twenty different single-particle types consistent with those observed in previous ATOFMS studies in Riverside were identified for the PMF2 analysis. Finally, time-of-flight aerosol mass spectrometry (ToF-AMS) provided data on markers of primary and secondary organic aerosol. Two distinct PMF2 analyses were performed. In analysis 1, all the data except for the ATOFMS and ToF-AMS data were used in an initial evaluation of sources at Riverside during the study. PMF2 was able to identify six factors from the data set corresponding to both primary and secondary sources, primarily from automobile emissions, diesel emissions, secondary nitrate formation, a secondary photochemically associated source, organic emissions and Basin-transported pollutants. In analysis 2, the ATOFMS and ToF-AMS data were included in the analysis. In this second analysis, PMF2 was able to identify 16 factors with a variety of both primary and secondary factors being identified, corresponding to both primary and secondary material from both anthropogenic and natural sources. Based on relationships with Basin meteorology, the PMF-identified source profiles, and the diurnal patterns in the source concentrations, sources were identified as being of local origin or as resulting from transport of pollutants across the Basin due to onshore flow. Good agreement was observed between the PMF2 predicted mass and the FDMS measured mass for both analyses.
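PMF factors a species-by-time data matrix into non-negative source contributions and profiles, weighting residuals by measurement uncertainty. As a rough stand-in for PMF2, the sketch below uses unweighted non-negative matrix factorization from scikit-learn on hypothetical hourly data; the species list and factor count mirror the study only loosely.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
# Hypothetical 1-h averaged data: rows = hours, columns = measured species
# (mass, OC, EC, sulfate, nitrate, BC, CO, NOx, ...).
X = rng.random((500, 10))

# X ~ G @ F: G holds hourly source contributions, F the source profiles.
# PMF2 additionally weights each residual by its measurement uncertainty,
# which plain NMF does not.
model = NMF(n_components=6, init='nndsvda', max_iter=500, random_state=0)
G = model.fit_transform(X)   # (hours, factors) contributions
F = model.components_        # (factors, species) profiles
print(F.round(2))
```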
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
NASA Astrophysics Data System (ADS)
Barthod, Louise; Lobb, David; Owens, Philip; Martinez-Carreras, Nuria; Koiter, Alexander; Petticrew, Ellen; McCullough, Gregory
2014-05-01
The development of beneficial management practices to minimize adverse impacts of agriculture on soil and water quality requires information on the sources of sediment at the watershed scale. Sediment fingerprinting allows for the determination of sediment sources and apportionment of their contribution within a watershed, using unique physical, radiochemical or biogeochemical properties, or fingerprints, of the potential sediment sources. The use of sediment colour as a fingerprint is an emerging technique that can provide a rapid and inexpensive means of investigating sediment sources. This technique is currently being utilized to determine sediment sources within the South Tobacco Creek Watershed, an agricultural watershed located in the Canadian prairies (south-central Manitoba). Suspended sediment and potential source (topsoil, channel bank and shale bedrock material) samples were collected between 2009 and 2011 at six locations along the main stem of the creek. Sample colour was quantified from diffuse reflectance spectrometry measurements over the visible wavelength range using a spectroradiometer (ASD Field Spec Pro, 400-2500 nm). Sixteen colour coefficients were derived from several colour space models (CIE XYZ, CIE xyY, CIE Lab, CIE Luv, CIE Lch, Landsat RGB, Redness Index). The individual discrimination power of the colour coefficients, after passing several prerequisite tests (e.g., linearly additive behaviour), was assessed using discriminant function analysis. A stepwise discriminant analysis, based on the Wilks' lambda criterion, was then performed in order to determine the best-suited colour coefficient fingerprints which maximized the discrimination between the potential sources. The selected fingerprints classified the source samples in the correct category 86% of the time. The misclassification is due to intra-source variability and source overlap, which can lead to higher uncertainty in sediment source apportionment. The selected fingerprints were then included in a Bayesian mixing model using Monte-Carlo simulation (Stable Isotope Analysis in R, SIAR) in order to apportion the contribution of the different sources to the sediment collected at each location. A switch in the dominant sediment source between the headwaters and the outlet of the watershed is observed. Sediment is enriched with shale bedrock and depleted of topsoil sources as the stream crosses and down-cuts through the Manitoba Escarpment. The switch in sources highlights the importance of the sampling location in relation to the scale and geomorphic connectivity of the watershed. Although the results include considerable uncertainty, they are consistent with the findings from a classical fingerprinting approach undertaken in the same watershed (i.e., geochemical and radionuclide fingerprints), and confirm the potential of sediment colour parameters as suitable fingerprints.
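The discriminant-analysis step can be sketched with scikit-learn: fit a linear discriminant classifier on colour coefficients of source samples and report the cross-validated correct-classification rate, analogous to the 86% figure above. The group means, coefficient count and sample sizes below are hypothetical, and scikit-learn stands in for the statistical package actually used.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
# Hypothetical colour coefficients (e.g. CIE L*a*b*, xyY, RGB, redness index)
# for source samples from three groups: topsoil, channel bank, shale bedrock.
n_per, n_coef = 30, 16
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(n_per, n_coef))
               for m in (0.0, 1.5, 3.0)])
y = np.repeat(['topsoil', 'bank', 'shale'], n_per)

# Discriminant function analysis: cross-validated correct-classification rate.
lda = LinearDiscriminantAnalysis()
print(cross_val_score(lda, X, y, cv=5).mean())
```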
Improved Overpressure Recording and Modeling for Near-Surface Explosion Forensics
NASA Astrophysics Data System (ADS)
Kim, K.; Schnurr, J.; Garces, M. A.; Rodgers, A. J.
2017-12-01
The accurate recording and analysis of air-blast acoustic waveforms is a key component of the forensic analysis of explosive events. Smartphone apps can enhance traditional technologies by providing scalable, cost-effective ubiquitous sensor solutions for monitoring blasts, undeclared activities, and inaccessible facilities. During a series of near-surface chemical high explosive tests, iPhone 6's running the RedVox infrasound recorder app were co-located with high-fidelity Hyperion overpressure sensors, allowing for direct comparison of the resolution and frequency content of the devices. Data from the traditional sensors is used to characterize blast signatures and to determine relative iPhone microphone amplitude and phase responses. A Wiener filter based source deconvolution method is applied, using a parameterized source function estimated from traditional overpressure sensor data, to estimate system responses. In addition, progress on a new parameterized air-blast model is presented. The model is based on the analysis of a large set of overpressure waveforms from several surface explosion test series. An appropriate functional form with parameters determined empirically from modern air-blast and acoustic data will allow for better parameterization of signals and the improved characterization of explosive sources.
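Wiener-filter source deconvolution of this kind divides spectra with a noise-power regularizer rather than directly. Below is a minimal sketch with a hypothetical Friedlander-like source pulse and a synthetic "recording", not the actual iPhone/Hyperion data:

```python
import numpy as np

def wiener_deconvolve(recorded, source_est, noise_power=1e-2):
    """Estimate a system (instrument) response by Wiener deconvolution.

    recorded ~ source * response + noise; plain spectral division is
    unstable, so the Wiener filter regularizes with an assumed noise power.
    """
    R = np.fft.rfft(recorded)
    S = np.fft.rfft(source_est, n=len(recorded))
    H = R * np.conj(S) / (np.abs(S) ** 2 + noise_power)
    return np.fft.irfft(H, n=len(recorded))

# Hypothetical parameterized blast pulse and a noisy recording of it passed
# through an unknown instrument response.
t = np.linspace(0.0, 1.0, 2000)
source = np.exp(-t / 0.05) * (1.0 - t / 0.05)   # Friedlander-like waveform
rng = np.random.default_rng(7)
recorded = np.convolve(source, np.exp(-t / 0.02), mode='same')
recorded = recorded + 0.01 * rng.standard_normal(recorded.size)
response_est = wiener_deconvolve(recorded, source)
```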
Mantini, D; Franciotti, R; Romani, G L; Pizzella, V
2008-03-01
The major limitation for the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origin: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological developments in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty in the reliable categorization of the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing for automated artifact rejection. The proposed method has been tested using MEG data sets collected during somatosensory, auditory and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, in order to reconstruct clear signals that can be used for improving brain source localizations.
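Approximate entropy itself is straightforward to compute. The sketch below is a textbook ApEn implementation with hypothetical parameters (m = 2, r = 0.2 × SD), not the authors' code; a regular signal such as a sinusoid (or a cardiac artifact component) scores low, while a noise-like component scores high.

```python
import numpy as np

def approx_entropy(x, m=2, r_frac=0.2):
    """Approximate entropy (ApEn) of a 1-D signal: low values indicate
    regular signals, high values irregular ones."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(m):
        # Embed the signal in m dimensions and count similar templates.
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        dist = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        C = (dist <= r).mean(axis=1)
        return np.log(C).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(8)
t = np.linspace(0, 2, 400)
print(approx_entropy(np.sin(2 * np.pi * 10 * t)))  # regular: low ApEn
print(approx_entropy(rng.standard_normal(400)))    # irregular: high ApEn
```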
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.
1987-01-01
Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, Brian K.; Hand, James R.; Horner, Jacob A.
This document provides an overview of renewable resource potential at Fort Sill, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and on ground source heat pumps for heating and cooling buildings. The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment. The site visit to Fort Sill took place on June 10, 2010.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Scott A.; Orrell, Alice C.; Solana, Amy E.
This document provides an overview of renewable resource potential at Fort Drum, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and also on ground source heat pumps for heating and cooling buildings. The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment. The site visit to Fort Drum took place on May 4 and 5, 2010.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, Brian K.; Gorrissen, Willy J.; Hand, James R.
This document provides an overview of renewable resource potential at Fort Gordon, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and also on ground source heat pumps for heating and cooling buildings. The effort was funded by the American Recovery and Reinvestment Act (ARRA) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment. The site visit to Fort Gordon took place on March 9, 2010.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solana, Amy E.; Boyd, Brian K.; Horner, Jacob A.
This document provides an overview of renewable resource potential at Fort Polk, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and also on ground source heat pumps for heating and cooling buildings. The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment. The site visit to Fort Polk took place on February 16, 2010.
J-Plus: Morphological Classification Of Compact And Extended Sources By Pdf Analysis
NASA Astrophysics Data System (ADS)
López-Sanjuan, C.; Vázquez-Ramió, H.; Varela, J.; Spinoso, D.; Cristóbal-Hornillos, D.; Viironen, K.; Muniesa, D.; J-PLUS Collaboration
2017-10-01
We present a morphological classification of J-PLUS EDR sources into compact (i.e. stars) and extended (i.e. galaxies) objects. The classification is based on Bayesian modelling of the concentration distribution, including observational errors and magnitude + sky position priors. We provide the star/galaxy probability of each source computed from the gri images. The comparison with SDSS number counts supports our classification up to r ~ 21. The 31.7 deg² analysed comprises 150k stars and 101k galaxies.
Renewable Energy Opportunities at White Sands Missile Range, New Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chvala, William D.; Solana, Amy E.; States, Jennifer C.
2008-09-01
The document provides an overview of renewable resource potential at White Sands Missile Range (WSMR) based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 DoD Renewable Energy Assessment. This effort focuses on grid-connected generation of electricity from renewable energy sources and also ground source heat pumps (GSHPs) for heating and cooling buildings, as directed by IMCOM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
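The variance-decomposition indices can be illustrated with a generic first-order Sobol estimator. The sketch below uses a pick-freeze (Saltelli-type) scheme on a hypothetical three-parameter model; it ignores the hierarchical grouping and geostatistical dimension reduction that the study adds on top.

```python
import numpy as np

def first_order_sobol(f, sampler, n=20000, seed=0):
    """First-order Sobol indices via a pick-freeze (Saltelli-type) estimator.

    f: model mapping an (n, d) parameter array to n outputs; sampler draws
    (n, d) independent parameter samples. Returns one index per parameter.
    """
    rng = np.random.default_rng(seed)
    A, B = sampler(rng, n), sampler(rng, n)
    yA, yB = f(A), f(B)
    var = np.concatenate([yA, yB]).var()
    S = []
    for i in range(A.shape[1]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # re-draw only parameter i
        S.append(np.mean(yB * (f(ABi) - yA)) / var)
    return np.array(S)

# Hypothetical 3-parameter model standing in for grouped uncertainty sources
# (e.g. a boundary condition, a permeability field and a recharge rate).
model = lambda X: X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]
draw = lambda rng, n: rng.uniform(-1.0, 1.0, (n, 3))
print(first_order_sobol(model, draw))
```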
Beniczky, Sándor; Lantz, Göran; Rosenzweig, Ivana; Åkeson, Per; Pedersen, Birthe; Pinborg, Lars H; Ziebell, Morten; Jespersen, Bo; Fuglsang-Frederiksen, Anders
2013-10-01
Although precise identification of the seizure-onset zone is an essential element of presurgical evaluation, source localization of ictal electroencephalography (EEG) signals has received little attention. The aim of our study was to estimate the accuracy of source localization of rhythmic ictal EEG activity using a distributed source model. Source localization of rhythmic ictal scalp EEG activity was performed in 42 consecutive cases fulfilling inclusion criteria. The study was designed according to recommendations for studies on diagnostic accuracy (STARD). The initial ictal EEG signals were selected using a standardized method, based on frequency analysis and voltage distribution of the ictal activity. A distributed source model, the local autoregressive average (LAURA), was used for the source localization. Sensitivity, specificity, and measurement of agreement (kappa) were determined based on the reference standard, the consensus conclusion of the multidisciplinary epilepsy surgery team. Predictive values were calculated from the surgical outcome of the operated patients. To estimate the clinical value of the ictal source analysis, we compared the likelihood ratios of concordant and discordant results. Source localization was performed blinded to the clinical data, and before the surgical decision. The reference standard was available for 33 patients. The ictal source localization had a sensitivity of 70% and a specificity of 76%. The mean measurement of agreement (kappa) was 0.61, corresponding to substantial agreement (95% confidence interval (CI) 0.38-0.84). Twenty patients underwent resective surgery. The positive predictive value (PPV) for seizure freedom was 92% and the negative predictive value (NPV) was 43%. The likelihood ratio was nine times higher for the concordant results, as compared with the discordant ones. Source localization of rhythmic ictal activity using a distributed source model (LAURA) for the ictal EEG signals selected with a standardized method is feasible in clinical practice and has a good diagnostic accuracy. Our findings encourage clinical neurophysiologists assessing ictal EEGs to include this method in their armamentarium. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
NASA Astrophysics Data System (ADS)
Gabriel, A. A.; Madden, E. H.; Ulrich, T.; Wollherr, S.
2016-12-01
Capturing the observed complexity of earthquake sources in dynamic rupture simulations may require: non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and strength initial conditions, fault curvature and roughness, on- and off-fault non-elastic failure. All of these factors have been independently shown to alter dynamic rupture behavior and thus possibly influence the degree of realism attainable via simulated ground motions. In this presentation we will show examples of high-resolution earthquake scenarios, e.g. based on the 2004 Sumatra-Andaman Earthquake and a potential rupture of the Husavik-Flatey fault system in Northern Iceland. The simulations combine a multitude of representations of source complexity at the necessary spatio-temporal resolution enabled by excellent scalability on modern HPC systems. Such simulations allow an analysis of the dominant factors impacting earthquake source physics and ground motions given distinct tectonic settings or distinct focuses of seismic hazard assessment. Across all simulations, we find that fault geometry concurrently with the regional background stress state provides a first-order influence on source dynamics and the emanated seismic wave field. The dynamic rupture models are performed with SeisSol, a software package based on an ADER-Discontinuous Galerkin scheme for solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. Use of unstructured tetrahedral meshes allows for a realistic representation of the non-planar fault geometry, subsurface structure and bathymetry. The results presented highlight the fact that modern numerical methods are essential to further our understanding of earthquake source physics and complement both physics-based ground motion research and empirical approaches in seismic hazard analysis.
NASA Astrophysics Data System (ADS)
Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie
2017-04-01
Capturing the observed complexity of earthquake sources in dynamic rupture simulations may require: non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, on- and off-fault non-elastic failure. All of these factors have been independently shown to alter dynamic rupture behavior and thus possibly influence the degree of realism attainable via simulated ground motions. In this presentation we will show examples of high-resolution earthquake scenarios, e.g. based on the 2004 Sumatra-Andaman Earthquake, the 1994 Northridge earthquake and a potential rupture of the Husavik-Flatey fault system in Northern Iceland. The simulations combine a multitude of representations of source complexity at the necessary spatio-temporal resolution enabled by excellent scalability on modern HPC systems. Such simulations allow an analysis of the dominant factors impacting earthquake source physics and ground motions given distinct tectonic settings or distinct focuses of seismic hazard assessment. Across all simulations, we find that fault geometry concurrently with the regional background stress state provides a first-order influence on source dynamics and the emanated seismic wave field. The dynamic rupture models are performed with SeisSol, a software package based on an ADER-Discontinuous Galerkin scheme for solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. Use of unstructured tetrahedral meshes allows for a realistic representation of the non-planar fault geometry, subsurface structure and bathymetry. The results presented highlight the fact that modern numerical methods are essential to further our understanding of earthquake source physics and complement both physics-based ground motion research and empirical approaches in seismic hazard analysis.
NASA Astrophysics Data System (ADS)
Zhou, Ping; Lin, Hui; Zhang, Qi
2018-01-01
The reference source system is a key factor in ensuring successful location of satellite interference sources. Traditional systems use a mechanically rotating antenna, which suffers from slow rotation and a high failure rate; this seriously restricts positioning timeliness and is an obvious weakness. In this paper, a multi-beam antenna scheme based on a horn array is proposed as a reference source for satellite interference location, as an alternative to the traditional reference source antenna. The new scheme uses a small circularly polarized horn antenna as the array element and proposes a multi-beamforming algorithm based on a planar array. Simulation analyses of the horn antenna pattern, the multi-beamforming algorithm and the cross-ambiguity calculation over a simulated satellite link were carried out, and the cross-ambiguity calculation of the traditional reference source system was also tested. The comparison between the computer simulation results and the actual test results shows that the scheme is scientifically sound, feasible, and clearly superior to the traditional reference source system.
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
Cost and Usage Study of the Educational Resources Information Center (ERIC) System. Final Report.
ERIC Educational Resources Information Center
McDonald, Dennis D; And Others
Providing a detailed descriptive analysis of both the direct and indirect costs incurred by the Federal government in operating the ERIC system, and of the user population and user demand for ERIC products and services, this study is based on data gathered from a number of complementary sources. These sources included a survey of ERIC's U.S. intermediate…
ERIC Educational Resources Information Center
Heinmiller, Joseph L.
Based on data gathered from a number of complementary sources, this study provides a detailed descriptive analysis of both the direct and indirect costs incurred by the Federal government in operating the ERIC system, and the user population and user demand for ERIC products and services. Data sources included a survey of ERIC's U.S. intermediate…
NASA Astrophysics Data System (ADS)
Stone, Elizabeth A.; Schauer, James J.; Pradhan, Bidya Banmali; Dangol, Pradeep Man; Habib, Gazala; Venkataraman, Chandra; Ramanathan, V.
2010-03-01
This study focuses on improving source apportionment of carbonaceous aerosol in South Asia and consists of three parts: (1) development of novel molecular marker-based profiles for real-world biofuel combustion, (2) application of these profiles to a year-long data set, and (3) evaluation of profiles by an in-depth sensitivity analysis. Emissions profiles for biomass fuels were developed through source testing of a residential stove commonly used in South Asia. Wood fuels were combusted at high and low rates, which corresponded to source profiles high in organic carbon (OC) or high in elemental carbon (EC), respectively. Crop wastes common to the region, including rice straw, mustard stalk, jute stalk, soybean stalk, and animal residue burnings, were also characterized. Biofuel profiles were used in a source apportionment study of OC and EC in Godavari, Nepal. This site is located in the foothills of the Himalayas and was selected for its well-mixed and regionally impacted air masses. At Godavari, daily samples of fine particulate matter (PM2.5) were collected throughout the year of 2006, and the annual trends in particulate mass, OC, and EC followed the occurrence of a regional haze in South Asia. Maximum concentrations occurred during the dry winter season and minimum concentrations occurred during the summer monsoon season. Specific organic compounds unique to aerosol sources, molecular markers, were measured in monthly composite samples. These markers implicated motor vehicles, coal combustion, biomass burning, cow dung burning, vegetative detritus, and secondary organic aerosol as sources of carbonaceous aerosol. A molecular marker-based chemical mass balance (CMB) model provided a quantitative assessment of primary source contributions to carbonaceous aerosol. The new profiles were compared to widely used biomass burning profiles from the literature in a sensitivity analysis. This analysis indicated a high degree of stability in estimates of source contributions to OC when different biomass profiles were used. The majority of OC was unapportioned to primary sources and was estimated to be of secondary origin, while biomass combustion was the next-largest source of OC. The CMB apportionment of EC to primary sources was unstable due to the diversity of biomass burning conditions in the region. The model results suggested that biomass burning and fossil fuel were important contributors to EC, but could not reconcile their relative contributions.
Assessment of Ecological Risk of Heavy Metal Contamination in Coastal Municipalities of Montenegro.
Mugoša, Boban; Đurović, Dijana; Nedović-Vuković, Mirjana; Barjaktarović-Labović, Snežana; Vrvić, Miroslav
2016-03-31
Assessment of heavy metal concentrations in the soil samples of urban parks and playgrounds is very important for the evaluation of potential risks for residents, especially children. Until recently, there has been very little data about urban park pollution in Montenegro. To evaluate the sources of potential contamination and the concentrations of heavy metals, soil samples from coastal urban parks and kindergartens of Montenegro were collected. Based on the heavy metal concentrations, multivariate analysis combined with geochemical approaches showed that soil samples in coastal areas of Montenegro had mean Pb and Cd concentrations over two times higher than the background values. Based on principal component analysis (PCA), soil pollution with Pb, Cd, Cu, and Zn is contributed by anthropogenic sources, while Cr in the surface soils was primarily derived from natural sources. Calculation of different ecological contamination factors showed that Cd is the primary contributor to the ecological risk index (RI), originating from anthropogenic, industrial, and urbanization sources. These data provide evidence about soil pollution in the coastal municipalities of Montenegro. Special attention should be paid to this problem in order to continue further research and to consider possible ways of remediating the sites where contamination has been observed.
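The geo-accumulation index commonly used in such assessments is Igeo = log2(Cn / (1.5 · Bn)), where Cn is the measured concentration and Bn the geochemical background. Below is a minimal sketch with hypothetical values, not the Montenegro data:

```python
import math

def geoaccumulation_index(concentration, background):
    """Mueller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn)).
    Igeo > 1 suggests moderate pollution, > 2 moderate-to-heavy, and so on."""
    return math.log2(concentration / (1.5 * background))

# Hypothetical measured means vs. background values (mg/kg) for park soils.
for metal, c, b in [('Pb', 52.0, 20.0), ('Cd', 0.9, 0.3), ('Cr', 40.0, 45.0)]:
    print(metal, round(geoaccumulation_index(c, b), 2))
```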
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Aydin, Ü; Rampp, S; Wollbrink, A; Kugel, H; Cho, J -H; Knösche, T R; Grova, C; Wellmer, J; Wolters, C H
2017-07-01
In recent years, the use of source analysis based on electroencephalography (EEG) and magnetoencephalography (MEG) has gained considerable attention in presurgical epilepsy diagnosis. However, in many cases the source analysis alone is not used to tailor surgery unless the findings are confirmed by lesions, such as cortical malformations in MRI. For many patients, the histology of tissue resected from MRI-negative epilepsy shows small lesions, which indicates the need for more sensitive MR sequences. In this paper, we describe a technique to maximize the synergy between combined EEG/MEG (EMEG) source analysis and high resolution MRI. The procedure has three main steps: (1) construction of a detailed and calibrated finite element head model that considers the variation of individual skull conductivities and white matter anisotropy, (2) EMEG source analysis performed on averaged interictal epileptic discharges (IED), (3) high resolution (0.5 mm) zoomed MR imaging, limited to small areas centered at the EMEG source locations. The proposed new diagnosis procedure was then applied in a particularly challenging case of an epilepsy patient: EMEG analysis at the peak of the IED coincided with a right frontal focal cortical dysplasia (FCD), which had been detected at standard 1 mm resolution MRI. Of greater interest, zoomed MR imaging (applying parallel transmission, 'ZOOMit') guided by EMEG at the spike onset revealed a second, fairly subtle, FCD in the left fronto-central region. The evaluation revealed that this second FCD, which had not been detectable with standard 1 mm resolution, was the trigger of the seizures.
Variable cycle control model for intersection based on multi-source information
NASA Astrophysics Data System (ADS)
Sun, Zhi-Yuan; Li, Yue; Qu, Wen-Cong; Chen, Yan-Yan
2018-05-01
In order to improve the efficiency of traffic control systems in the era of big data, a new variable cycle control model based on multi-source information is presented for intersections in this paper. Firstly, with consideration of multi-source information, a unified framework based on a cyber-physical system is proposed. Secondly, taking into account the variable length of cells, the hysteresis phenomenon of traffic flow and the characteristics of lane groups, a lane-group-based Cell Transmission Model is established to describe the physical properties of traffic flow under different traffic signal control schemes. Thirdly, the variable cycle control problem is abstracted into a bi-level programming model. The upper-level model is put forward for cycle length optimization considering traffic capacity and delay. The lower-level model is a dynamic signal control decision model based on fairness analysis. Then, a hybrid intelligent optimization algorithm is proposed to solve the model. Finally, a case study shows the efficiency and applicability of the proposed model and algorithm.
Siemann, Julia; Herrmann, Manfred; Galashan, Daniela
2018-01-25
The present study examined whether feature-based cueing affects early or late stages of flanker conflict processing using EEG and fMRI. Feature cues either directed participants' attention to the upcoming colour of the target or were neutral. Validity-specific modulations during interference processing were investigated using the N200 event-related potential (ERP) component and BOLD signal differences. Additionally, both data sets were integrated using an fMRI-constrained source analysis. Finally, the results were compared with a previous study in which spatial instead of feature-based cueing was applied to an otherwise identical flanker task. Feature-based and spatial attention recruited a common fronto-parietal network during conflict processing. Irrespective of attention type (feature-based; spatial), this network responded to focussed attention (valid cueing) as well as context updating (invalid cueing), hinting at domain-general mechanisms. However, spatially and non-spatially directed attention also demonstrated domain-specific activation patterns for conflict processing that were observable in distinct EEG and fMRI data patterns as well as in the respective source analyses. Conflict-specific activity in visual brain regions was comparable between both attention types. We assume that the distinction between spatially and non-spatially directed attention types primarily applies to temporal differences (domain-specific dynamics) between signals originating in the same brain regions (domain-general localization).
Measurement of Vibrated Bulk Density of Coke Particle Blends Using Image Texture Analysis
NASA Astrophysics Data System (ADS)
Azari, Kamran; Bogoya-Forero, Wilinthon; Duchesne, Carl; Tessier, Jayson
2017-09-01
A rapid and nondestructive machine vision sensor was developed for predicting the vibrated bulk density (VBD) of petroleum coke particles based on image texture analysis. It could be used for making corrective adjustments to a paste plant operation to reduce green anode variability (e.g., changes in binder demand). Wavelet texture analysis (WTA) and gray level co-occurrence matrix (GLCM) algorithms were used jointly for extracting the surface textural features of coke aggregates from images. These were correlated with the VBD using partial least-squares (PLS) regression. Coke samples of several sizes and from different sources were used to test the sensor. Variations in the coke surface texture introduced by coke size and source allowed for making good predictions of the VBD of individual coke samples and mixtures of them (blends involving two sources and different sizes). Promising results were also obtained for coke blends collected from an industrial-baked carbon anode manufacturer.
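The GLCM half of the feature-extraction pipeline, followed by PLS regression, can be sketched with scikit-image and scikit-learn; the wavelet (WTA) features are omitted here, and the images, gray-level count, offsets and VBD values are all hypothetical.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.cross_decomposition import PLSRegression

def glcm_features(image, levels=32):
    """Gray level co-occurrence features (contrast, homogeneity, energy,
    correlation) averaged over four offset directions, for one coke image."""
    img = (image / image.max() * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, normed=True)
    props = ['contrast', 'homogeneity', 'energy', 'correlation']
    return np.hstack([graycoprops(glcm, p).mean() for p in props])

# Hypothetical training set: texture features of coke sample images vs. their
# laboratory-measured vibrated bulk density (VBD, g/cm3).
rng = np.random.default_rng(9)
images = rng.random((20, 64, 64))
X = np.vstack([glcm_features(im) for im in images])
vbd = 0.8 + 0.1 * rng.random(20)
pls = PLSRegression(n_components=2).fit(X, vbd)
print(pls.predict(X[:3]))
```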
Fish-Eye Observing with Phased Array Radio Telescopes
NASA Astrophysics Data System (ADS)
Wijnholds, S. J.
The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field-of-view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.
Pan, Huiyun; Lu, Xinwei; Lei, Kai
2017-12-31
A detailed investigation was conducted to study heavy metal contamination in road dust from four regions of Xi'an, Northwest China. The concentrations of eight heavy metals (Co, Cr, Cu, Mn, Ni, Pb, Zn and V) were determined by X-ray fluorescence. The mean concentrations of these elements were: 30.9 mg kg-1 Co, 145.0 mg kg-1 Cr, 54.7 mg kg-1 Cu, 510.5 mg kg-1 Mn, 30.8 mg kg-1 Ni, 124.5 mg kg-1 Pb, 69.6 mg kg-1 V and 268.6 mg kg-1 Zn. There was significant enrichment of Pb, Zn, Co, Cu and Cr based on geo-accumulation index values. Multivariate statistical analysis showed that levels of Cu, Pb, Zn, Co and Cr were controlled by anthropogenic activities, while levels of Mn, Ni and V were associated with natural sources. Principal component analysis and multiple linear regression were applied to determine the source apportionment. The results showed that traffic was the main source with a percent contribution of 53.4%. Natural sources contributed 26.5%, and other anthropogenic pollution sources contributed 20.1%. Clear heavy metal pollution hotspots were identified by GIS mapping. The location of point pollution sources and the prevailing wind direction were found to be important factors in the spatial distribution of heavy metals. Copyright © 2017 Elsevier B.V. All rights reserved.
Self-consistent multidimensional electron kinetic model for inductively coupled plasma sources
NASA Astrophysics Data System (ADS)
Dai, Fa Foster
Inductively coupled plasma (ICP) sources have received increasing interest in microelectronics fabrication and the lighting industry. In 2-D configuration space (r, z) and 2-D velocity domain (νθ,νz), a self-consistent electron kinetic analytic model is developed for various ICP sources. The electromagnetic (EM) model is established based on modal analysis, while the kinetic analysis gives the perturbed Maxwellian distribution of electrons by solving the Boltzmann-Vlasov equation. The self-consistent algorithm combines the EM model and the kinetic analysis by updating their results consistently until the solution converges. The closed-form solutions in the analytical model provide rigorous and fast computing for the EM fields and the electron kinetic behavior. The kinetic analysis shows that the RF energy in an ICP source is extracted by a collisionless dissipation mechanism if the electron thermovelocity is close to the RF phase velocities. A criterion for collisionless damping is thus given based on the analytic solutions. To achieve uniformly distributed plasma for plasma processing, we propose a novel discharge structure with both planar and vertical coil excitations. The theoretical results demonstrate improved uniformity for the excited azimuthal E-field in the chamber. Non-monotonic spatial decay in electric field and space current distributions was recently observed in weakly-collisional plasmas. The anomalous skin effect is found to be responsible for this phenomenon. The proposed model successfully reproduces the non-monotonic spatial decay effect and achieves good agreement with the measurements for different applied RF powers. The proposed analytical model is compared with other theoretical models and different experimental measurements. The developed model is also applied to two kinds of ICP discharges used for electrodeless light sources. One structure uses a vertical internal coil antenna to excite plasmas and another has a metal shield to prevent electromagnetic radiation. The theoretical results delivered by the proposed model agree quite well with the experimental measurements in many aspects. Therefore, the proposed self-consistent model provides an efficient and reliable means for designing ICP sources in various applications such as VLSI fabrication and electrodeless light sources.
Zhang, Xiaowen; Wei, Shuai; Sun, Qianqian; Wadood, Syed Abdul; Guo, Boli
2018-09-15
Characterizing the distribution and defining potential sources of arsenic and heavy metals are basic preconditions for reducing the contamination of heavy metals and metalloids. 71 topsoil samples and 61 subsoil samples were collected by the grid method to measure the concentrations of cadmium (Cd), arsenic (As), lead (Pb), copper (Cu), zinc (Zn), nickel (Ni) and chromium (Cr). Principal component analysis (PCA), GIS-based geo-statistical methods and Positive Matrix Factorization (PMF) were applied. The results showed that the mean concentrations were 9.59 mg kg⁻¹, 51.28 mg kg⁻¹, 202.07 mg kg⁻¹, 81.32 mg kg⁻¹ and 771.22 mg kg⁻¹ for Cd, As, Pb, Cu and Zn, respectively, higher than the guideline values of the Chinese Environmental Quality Standard for Soils; the concentrations of Ni and Cr were very close to the recommended values (50 mg kg⁻¹ and 200 mg kg⁻¹), although some sites were higher than the guideline values. The soil was polluted by As and heavy metals to different degrees, with harmful impacts on human health. Principal component analysis extracted three components, namely industrial sources (Cd, Zn and Pb), agricultural sources (As and Cu) and natural sources (Cr and Ni). GIS-based geo-statistics combined with local conditions further apportioned the sources of these trace elements. To better identify the pollution sources of As and heavy metals in soil, PMF was applied. The results of PMF demonstrated that the enrichment of Zn, Cd and Pb was attributed to industrial activities, with a contribution of 24.9%; As was closely related to agricultural activities, with a contribution of 19.1%; Cr, part of the Cu, and Ni were related to the subsoil, with a contribution of 30.1%; and Cu and Pb came from industry and traffic emissions, with a contribution of 25.9%. Copyright © 2018 Elsevier Inc. All rights reserved.
Diagnosing turbulence for research aircraft safety using open source toolkits
NASA Astrophysics Data System (ADS)
Lang, T. J.; Guy, N.
Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well before cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists with turbulence estimates in near real time during an actual field campaign, and these toolkits are therefore recommended for use in future cloud-penetrating aircraft field campaigns.
Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin
2015-02-06
Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed for the development of a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. In a supervised step, geological characteristics of the crude oils, including thermal maturity and sedimentary environment, are assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds in crude oils for oil screening, and the relative abundances of individual TAS compounds correlate well with oil sources. To correct for external factors other than oil source, variables were defined as content ratios of target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the practical geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.
Integration and Optimization of Alternative Sources of Energy in a Remote Region
NASA Astrophysics Data System (ADS)
Berberi, Pellumb; Inodnorjani, Spiro; Aleti, Riza
2010-01-01
In a remote coastal region, the supply of energy from the national grid is insufficient for sustainable development. Integration and optimization of local alternative renewable energy sources is one solution to the problem. In this paper we study the energetic potential of local sources of renewable energy (water, solar, wind and biomass). A bottom-up energy system optimization model is proposed in order to support planning policies for promoting the use of renewable energy sources. Software based on multi-factor and constraint analysis for optimizing energy flow is proposed, which provides detailed information on the exploitation of each energy source, power and heat generation, GHG emissions and end-use sectors. Economic analysis shows that with existing technologies both stand-alone and regional facilities may be feasible. Improving specific legislation will foster investments from central or local governments as well as from individuals, private companies or small families. The study was carried out in the framework of the FP6 project "Integrated Renewable Energy System."
NASA Astrophysics Data System (ADS)
Ragusa, Jérémy; Kindler, Pascal; Segvic, Branimir; Ospina-Ostios, Lina Maria
2017-04-01
The Chablais Prealps (Haute-Savoie, France) represent a well-preserved accretionary wedge of the Western Alpine Tethys. They comprise a stack of sedimentary nappes related to palaeogeographic realms ranging from the Ultrahelvetic to the Southern Penninic. The provenance analysis is based on the Gazzi-Dickinson method and on QEMSCAN® for heavy minerals. The Quartzose petrofacies is the more important of the two sources, and supplied three of the four formations of the Voirons Flysch. It is similar to the sources that fed the other flysches of the Gurnigel nappe. It is characterised by a mature, quartz-rich assemblage and a heavy-mineral population dominated by apatite and the zircon-tourmaline-rutile mineral group. These observations suggest a clastic-wedge provenance. The Feldspathic petrofacies is derived from a feldspar-rich source associated with metamorphic clasts and a heavy-mineral population dominated by garnet. This provenance characterises only one formation of the Voirons Flysch, and is related to the axial-belt provenance. This provenance analysis shows that the Middle Eocene to Early Oligocene Voirons Flysch was fed by two sources, in contrast to the other flysches of the Gurnigel nappe, and further suggests that this flysch was deposited not in the Piemont Ocean but in the Valais domain. Based on the results and a comparative provenance analysis with the other flysches of the Gurnigel nappe, we propose a generic feeding model which involves the Sesia-Dent Blanche nappe, the sedimentary nappes incorporated in the accretionary prism, and probably the Briançonnais basement.
Gori, Claudia
2011-01-01
Summary: This article is based on analysis of four couples' personal and public documents, in order to integrate personal choices, values and ideas with cultural representations and social attitudes. Moreover, being based on Italian sources from the nineteenth century, the study offers historical insight into the Italian nation-building process and its political and social foundations. This study is based on archival and printed primary sources from: Gianna Maffei and Ercole Trotti Mosti (Museo Centrale del Risorgimento – Roma – MCRR); Augusto Pierantoni and Grazia Mancini (Museo Centrale del Risorgimento – Roma); Luigi Majno and Ersilia Bronzini (Archivio Unione Femminile Nazionale – Milano); Angiolo Orvieto and Laura Cantoni (Archivio Contemporaneo Bonsanti del Gabinetto Vieuesseux – Firenze – ACGV). This study reflects on love as a political and moral issue, by linking the personal sphere of subjectivity to the public dimension of the political community. An extensive understanding of the role played by the perception and the expression of sentiments can be considered the central issue of this analysis. PMID:22037756
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
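The hypothesise-and-test cycle on a shared blackboard can be illustrated with a toy loop; all class names, the feature labels, and the control policy below are illustrative assumptions, not the original system's code:

```python
# Toy blackboard loop in the spirit of the architecture described above;
# names ("sella", confidences, policies) are invented for illustration.
class Blackboard:
    def __init__(self):
        self.hypotheses = []  # shared state visible to all knowledge sources

class KnowledgeSource:
    def applicable(self, bb): ...
    def act(self, bb): ...

class LocationHypothesiser(KnowledgeSource):
    def applicable(self, bb):
        return not bb.hypotheses
    def act(self, bb):
        bb.hypotheses.append({"feature": "sella", "confidence": 0.4})

class HypothesisTester(KnowledgeSource):
    def applicable(self, bb):
        return any(h["confidence"] < 0.9 for h in bb.hypotheses)
    def act(self, bb):
        for h in bb.hypotheses:
            h["confidence"] = min(0.9, h["confidence"] + 0.25)  # mock verification

def control(bb, sources, max_cycles=10):
    """Simple control module: fire the first applicable knowledge source each cycle."""
    for _ in range(max_cycles):
        ks = next((s for s in sources if s.applicable(bb)), None)
        if ks is None:
            break
        ks.act(bb)

bb = Blackboard()
control(bb, [LocationHypothesiser(), HypothesisTester()])
print(bb.hypotheses)
```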
Zhang, Shunqi; Yin, Tao; Ma, Ren; Liu, Zhipeng
2015-08-01
Functional imaging of biological electrical characteristics based on the magneto-acoustic effect gives valuable information about tissue in early tumor diagnosis, and time and frequency analysis of the magneto-acoustic signal is important in image reconstruction. This paper proposes a wave-summing method based on the Green's function solution for the acoustic source of the magneto-acoustic effect. Simulations and analyses of the time and frequency characteristics of the magneto-acoustic signal were carried out under a quasi-1D transmission condition for models of different thicknesses. Simulation results for the magneto-acoustic signal were verified through experiments. The simulations with different thicknesses showed that the time-frequency characteristics of the magneto-acoustic signal reflect the thickness of the sample. Thin samples (less than one wavelength of the pulse) and thick samples (larger than one wavelength) showed different summed waveforms and frequency characteristics, owing to the difference in summing thickness. Experimental results verified the theoretical analysis and simulation results. This research lays a foundation for acoustic source and conductivity reconstruction in media of different thicknesses in magneto-acoustic imaging.
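The wave-summing idea, superposing retarded contributions from source layers through the free-space Green's function, can be sketched as follows; the pulse shape, sound speed and geometry are assumptions, not the paper's parameters:

```python
import numpy as np

c = 1500.0                       # assumed sound speed (m/s), water-like medium
t = np.linspace(0.0, 1e-4, 2000)

def pulse(t):
    """A short Gaussian pulse standing in for the magneto-acoustic excitation."""
    return np.exp(-((t - 5e-6) / 1e-6) ** 2)

def summed_wave(thickness, n_layers=50, r0=0.05):
    """Superpose retarded contributions from source layers across the sample
    thickness, i.e. a discretized free-space Green's function solution."""
    p = np.zeros_like(t)
    for zi in np.linspace(0.0, thickness, n_layers):
        r = r0 + zi                       # layer-to-observation-point distance
        p += pulse(t - r / c) / (4 * np.pi * r)
    return p / n_layers

thin, thick = summed_wave(5e-4), summed_wave(5e-3)  # below vs. above one wavelength
```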
Introduction to an open source internet-based testing program for medical student examinations.
Lee, Yoon-Hwan
2009-12-20
The author developed a freely available open source internet-based testing program for medical examination. PHP and JavaScript were used as the programming languages and PostgreSQL as the database management system, on an Apache web server and Linux operating system. The system approach was that a super user inputs the items, each school administrator inputs the examinees' information, and examinees access the system. The examinee's score is displayed immediately after the examination, with item analysis. The set-up of the system, beginning with installation, is described. This may help medical professors to easily adopt an internet-based testing system for medical education.
Performance Analysis of GAME: A Generic Automated Marking Environment
ERIC Educational Resources Information Center
Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram
2008-01-01
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…
Cellular Consequences of Telomere Shortening in Histologically Normal Breast Tissues
2013-09-01
…using the open source, Java-based image analysis software package ImageJ (http://rsb.info.nih.gov/ij/) and a custom designed plugin ("Telometer…"). Tabulated data were stored in a MySQL (http://www.mysql.com) database and viewed through Microsoft Access (Microsoft Corp.).
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
Multiple Component Event-Related Potential (mcERP) Estimation
NASA Technical Reports Server (NTRS)
Knuth, K. H.; Clanton, S. T.; Shah, A. S.; Truccolo, W. A.; Ding, M.; Bressler, S. L.; Trejo, L. J.; Schroeder, C. E.; Clancy, Daniel (Technical Monitor)
2002-01-01
We show how model-based estimation of the neural sources responsible for transient neuroelectric signals can be improved by the analysis of single-trial data. Previously, we showed that a multiple component event-related potential (mcERP) algorithm can extract the responses of individual sources from recordings of a mixture of multiple, possibly interacting, neural ensembles. The mcERP algorithm also estimated single-trial amplitudes and onset latencies, thus allowing more accurate estimation of ongoing neural activity during an experimental trial. The mcERP algorithm is related to infomax independent component analysis (ICA); however, the underlying signal model is more physiologically realistic in that a component is modeled as a stereotypic waveshape varying both in amplitude and onset latency from trial to trial. The result is a model that reflects quantities of interest to the neuroscientist. Here we demonstrate that the mcERP algorithm provides more accurate results than more traditional methods such as factor analysis and the more recent ICA. Whereas factor analysis assumes the sources are orthogonal and ICA assumes the sources are statistically independent, the mcERP algorithm makes no such assumptions, thus allowing investigators to examine interactions among components by estimating the properties of single-trial responses.
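The mcERP signal model itself, a stereotypic waveshape with trial-varying amplitude and onset latency plus noise, is easy to sketch; the waveform, jitter ranges and noise level below are illustrative assumptions, and this shows only the generative model, not the estimation algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 0.5, 1e-3)                   # 500 ms at 1 kHz (assumed sampling)
waveform = np.exp(-((t - 0.1) / 0.02) ** 2)   # stereotypic component waveshape

def single_trial(amplitude, latency_shift):
    """mcERP-style generative model: one component with trial-specific
    amplitude and onset latency, plus additive noise."""
    shifted = np.interp(t - latency_shift, t, waveform, left=0.0, right=0.0)
    return amplitude * shifted + 0.05 * rng.standard_normal(t.size)

trials = np.array([single_trial(rng.uniform(0.8, 1.2), rng.uniform(-0.01, 0.01))
                   for _ in range(100)])
average_erp = trials.mean(axis=0)   # latency jitter smears the trial average
```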
Probability model for atmospheric sulfur dioxide concentrations in the area of Venice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttazzoni, C.; Lavagnini, I.; Marani, A.
1986-09-01
This paper deals with a comparative screening of existing air quality models based on their ability to simulate the distribution of sulfur dioxide data in the Venetian area. Investigations have been carried out on sulfur dioxide dispersion in the atmosphere of the Venetian area. The studies have mainly focused on transport models (Gaussian plume and K-models) aiming at meaningful correlations of sources and receptors. Among the results, a noteworthy disagreement between simulated and experimental data, due to the lack of thorough knowledge of source field conditions and of the local meteorology of the sea-land transition area, has been shown. Investigations with receptor-oriented models (based, e.g., on time series analysis, Fourier analysis, or statistical distributions) have also been performed.
NASA Astrophysics Data System (ADS)
Lin, Pei-Chun; Yu, Chun-Chang; Chen, Charlie Chung-Ping
2015-01-01
As one of the critical stages of a very large scale integration fabrication process, postexposure bake (PEB) plays a crucial role in determining the final three-dimensional (3-D) profiles and lessening the standing wave effects. However, the full 3-D chemically amplified resist simulation is not widely adopted during the postlayout optimization due to the long run-time and huge memory usage. An efficient simulation method is proposed to simulate the PEB while considering standing wave effects and resolution enhancement techniques, such as source mask optimization and subresolution assist features based on the Sylvester equation and Abbe-principal component analysis method. Simulation results show that our algorithm is 20× faster than the conventional Gaussian convolution method.
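A Sylvester equation AX + XB = Q can be solved directly with SciPy; the matrices below are random placeholders, since the abstract does not specify how A, B and Q encode the PEB diffusion operators:

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Illustrative random matrices; in the paper's setting A, B, Q would encode
# the separable PEB operators, which are not given in the abstract.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
Q = rng.standard_normal((4, 3))

X = solve_sylvester(A, B, Q)          # solves A @ X + X @ B = Q
assert np.allclose(A @ X + X @ B, Q)
```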
Source apportion of atmospheric particulate matter: a joint Eulerian/Lagrangian approach.
Riccio, A; Chianese, E; Agrillo, G; Esposito, C; Ferrara, L; Tirimberio, G
2014-12-01
PM2.5 samples were collected during an annual monitoring campaign (January 2012-January 2013) in the urban area of Naples, one of the major cities in Southern Italy. Samples were collected by means of a standard gravimetric sampler (Tecora Echo model) and characterized chemically by ion chromatography. As a result, 143 samples, together with their ionic composition, were collected. We extend traditional source apportionment techniques, usually based on multivariate factor analysis, by interpreting the chemical analysis results within a Lagrangian framework. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model was used, providing linkages to the source regions in the upwind areas. Results were analyzed in order to quantify the relative weight of different source types/areas. Model results suggested that PM concentrations are strongly affected not only by local emissions but also by transboundary emissions, especially from Eastern and Northern European countries and African Saharan dust episodes.
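One common way to turn back-trajectories and sample concentrations into a source field is a concentration-weighted trajectory (CWT) average; the sketch below shows that generic idea with invented inputs and is not necessarily the weighting used in this study:

```python
import numpy as np

def cwt(lons, lats, conc, lon_edges, lat_edges):
    """Concentration-weighted trajectory field: for each grid cell, the average
    of sample concentrations over all trajectory points falling in that cell.
    lons/lats are lists of per-sample trajectory coordinate arrays."""
    num = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    cnt = np.zeros_like(num)
    for lo, la, c in zip(lons, lats, conc):
        h, _, _ = np.histogram2d(la, lo, bins=[lat_edges, lon_edges])
        num += c * h
        cnt += h
    with np.errstate(invalid="ignore"):
        return num / cnt   # NaN where no trajectory point fell

# Toy usage: one sample (38 ug/m^3) with a two-point trajectory.
lon_edges, lat_edges = np.linspace(0, 30, 31), np.linspace(30, 60, 31)
field = cwt([np.array([12.4, 13.0])], [np.array([45.2, 45.9])], [38.0],
            lon_edges, lat_edges)
```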
Kang, Xuming; Song, Jinming; Yuan, Huamao; Duan, Liqin; Li, Xuegang; Li, Ning; Liang, Xianmeng; Qu, Baoxiao
2017-09-01
Heavy metal contamination is an essential indicator of environmental health. In this work, one sediment core was used to analyze the speciation of heavy metals (Cr, Mn, Ni, Cu, Zn, As, Cd, and Pb) in Jiaozhou Bay sediments of different grain sizes. The bioavailability, sources and ecological risk of the heavy metals were also assessed on a centennial timescale. Heavy metals were enriched in grain sizes of <63 µm and were predominantly present in residual phases. Moreover, the mobility sequence based on the sum of the first three phases (for grain sizes of <63 µm) was Mn > Pb > Cd > Zn > Cu > Ni > Cr > As. Enrichment factors (EF) indicated that heavy metals in Jiaozhou Bay showed no enrichment to minor enrichment. The potential ecological risk index (RI) indicated that Jiaozhou Bay had been subject to low ecological risk, with an increasing trend since the 1940s owing to the increase in anthropogenic activities. The source analysis indicated that natural sources were the primary sources of heavy metals in Jiaozhou Bay and that anthropogenic sources of heavy metals have shown an increasing trend since the 1940s. Principal component analysis (PCA) indicated that Cr, Mn, Ni, Cu and Pb were primarily derived from natural sources and that Zn and Cd were influenced by the shipbuilding industry. Mn, Cu, Zn and Pb may originate from both natural and anthropogenic sources, and As may be influenced by agricultural activities. Moreover, heavy metals in sediments of Jiaozhou Bay were clearly influenced by atmospheric deposition and river input. Copyright © 2017. Published by Elsevier Inc.
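Enrichment factors of the kind reported above are conventionally computed by double normalization to a conservative reference element (often Al or Fe); a minimal sketch with hypothetical numbers, since the abstract gives neither the raw concentrations nor the reference element used, is:

```python
def enrichment_factor(metal, ref, metal_bg, ref_bg):
    """EF = (M/R)_sample / (M/R)_background; EF near 1 suggests crustal origin,
    larger values suggest anthropogenic enrichment."""
    return (metal / ref) / (metal_bg / ref_bg)

# Hypothetical values (mg/kg), with Al as the assumed reference element.
print(enrichment_factor(metal=45.0, ref=5.8e4, metal_bg=20.0, ref_bg=6.2e4))
```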
NASA Technical Reports Server (NTRS)
Liu,Kuo-Chia; Maghami, Peiman; Blaurock, Carl
2008-01-01
The Solar Dynamics Observatory (SDO) aims to study the Sun's influence on the Earth by understanding the source, storage, and release of solar energy, and the interior structure of the Sun. During science observations, the jitter stability at the instrument focal plane must be maintained to less than a fraction of an arcsecond for two of the SDO instruments. To meet these stringent requirements, a significant amount of analysis and test effort has been devoted to predicting the jitter induced by various disturbance sources. One of the largest disturbance sources onboard is the reaction wheel. This paper presents the SDO approach to reaction wheel disturbance modeling and jitter analysis. It describes the verification and calibration of the disturbance model, and the ground tests performed to validate the reaction wheel jitter analysis. To mitigate the reaction wheel disturbance effects, the wheels will be limited to operation at low wheel speeds based on the current analysis. An on-orbit jitter test algorithm is also presented, which will identify the true wheel speed limits in order to ensure that the wheel jitter requirements are met.
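Reaction-wheel disturbances are often modeled empirically as a series of harmonics of the wheel speed with amplitudes scaling as the speed squared; the generic textbook form is sketched below with illustrative harmonic numbers and coefficients, not SDO's calibrated values:

```python
import numpy as np

def wheel_disturbance(t, omega_hz, harmonics, coeffs, phases):
    """Empirical reaction-wheel disturbance: a sum of harmonics of the wheel
    speed, each with amplitude C_i * omega^2 (generic textbook form)."""
    f = np.zeros_like(t)
    for h, c, p in zip(harmonics, coeffs, phases):
        f += c * omega_hz**2 * np.sin(2 * np.pi * h * omega_hz * t + p)
    return f

t = np.linspace(0, 1, 10_000)
force = wheel_disturbance(t, omega_hz=30.0,
                          harmonics=[1.0, 2.0, 4.4],   # illustrative numbers
                          coeffs=[1e-4, 3e-5, 2e-5],
                          phases=[0.0, 0.4, 1.1])
```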
NASA Astrophysics Data System (ADS)
Giblin, A. C.
2015-12-01
The Central American Land Bridge is the crucial connection between North and South America, and the Miocene closure of the Panama seaway led to a change in global oceanic circulation patterns. Modern Costa Rica is part of the island arc that formed over the western Caribbean subduction zone, and the Santa Elena peninsula is on the northwest coast of Costa Rica next to the Sandino forearc basin. This study focuses on the origin and provenance of the Paleocene deep-water Rivas and Descartes turbidites that crop out on the northern part of the Santa Elena peninsula in northwestern Costa Rica. Understanding the sedimentary fill of the Sandino Basin that contributed to the closing of the seaway may lead to a better understanding of the Late Cretaceous-Paleogene arcs. Provenance studies of the Santa Elena Peninsula turbidite sandstone bodies constrain the history of the paleogeography and tectonics of the region. Petrographic analyses of rock thin sections constrain source areas; geochemical analysis of individual detrital heavy minerals from rock samples gives indications of sediment sources and the tectonic setting during deposition. This study is a provenance analysis based on (i) semi-quantitative energy-dispersive spectrometry analysis of heavy minerals, (ii) quantitative wavelength-dispersive spectrometry for major elements of detrital clinopyroxene and spinel grains, (iii) trace element analysis through laser ablation of single detrital clinopyroxene grains, and (iv) comparative analysis of the different potential source rocks to clearly identify the most likely sediment sources. The detrital spinel and clinopyroxene are possibly sourced from mantle ophiolites, mid-ocean ridge gabbros, or volcanic arc tholeiitic basalts or calc-alkaline andesites. Spinel and clinopyroxene geochemistry suggests a possible peridotitic source, linked to mantle rocks that are now covered by Tertiary volcanics or have completely eroded. The character of the crustal minerals indicates sources from mid-ocean ridge gabbros, and island arc tholeiites and andesites. This suggests that during the early history of the gateway uplift and seaway closure, sediment sources were dominated first by older ophiolites and gabbroic sources, then by volcanic inputs from the arc.
Multiwavelength counterparts of the point sources in the Chandra Source Catalog
NASA Astrophysics Data System (ADS)
Reynolds, Michael; Civano, Francesca Maria; Fabbiano, Giuseppina; D'Abrusco, Raffaele
2018-01-01
The most recent release of the Chandra Source Catalog (CSC), version 2.0, comprises more than 350,000 point sources, down to fluxes of ~10⁻¹⁶ erg cm⁻² s⁻¹, covering ~500 deg² of the sky, making it one of the best available X-ray catalogs to date. There are many reasons to have multiwavelength counterparts for sources; one is that X-ray information alone is not enough to identify the sources and divide them between galactic and extragalactic origin, so multiwavelength data associated with each X-ray source are crucial for classification and scientific analysis of the sample. To perform this multiwavelength association, we employ the recently released versatile tool NWAY (Salvato et al. 2017), based on a Bayesian algorithm for cross-matching multiple catalogs. NWAY allows the combination of multiple catalogs at the same time, provides a probability for the matches, even in the case of non-detection due to the different depths of the matching catalogs, and can be used with priors on the nature of the sources (e.g. colors, magnitudes, etc.). In this poster, we present a preliminary analysis using the CSC sources above the galactic plane matched to the WISE All-Sky catalog, SDSS, Pan-STARRS and GALEX.
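NWAY is the probabilistic tool actually used here; for orientation only, a plain nearest-neighbour positional match (no priors, no Bayesian weighting) can be written with astropy, using toy coordinates:

```python
import numpy as np
from astropy import units as u
from astropy.coordinates import SkyCoord

# Toy coordinate lists standing in for CSC and WISE positions.
xray = SkyCoord(ra=[10.1, 10.5] * u.deg, dec=[-2.0, -2.3] * u.deg)
wise = SkyCoord(ra=np.array([10.1002, 10.8]) * u.deg,
                dec=np.array([-2.0001, -2.5]) * u.deg)

idx, sep2d, _ = xray.match_to_catalog_sky(wise)  # nearest neighbour per source
matched = sep2d < 3 * u.arcsec                   # simple radius cut, no priors
```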
EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.
Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice
2015-01-01
The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution is required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data has been missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes: i) basic steps in preprocessing M/EEG signals; ii) the solution of the inverse problem to localize/reconstruct the cortical sources; iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources; and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
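Step iv), graph measures from a functional connectivity matrix, can be sketched in Python with networkx (EEGNET itself is MATLAB code); the connectivity matrix and the 80th-percentile threshold below are synthetic assumptions:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
conn = np.abs(rng.standard_normal((19, 19)))   # toy 19-electrode connectivity
conn = (conn + conn.T) / 2                     # symmetrize
np.fill_diagonal(conn, 0.0)

adj = (conn > np.percentile(conn, 80)).astype(int)  # keep strongest 20% of edges
G = nx.from_numpy_array(adj)

measures = {
    "density": nx.density(G),
    "clustering": nx.average_clustering(G),
    "degree": dict(G.degree()),
}
```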
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
Automated detection of extended sources in radio maps: progress from the SCORPIO survey
NASA Astrophysics Data System (ADS)
Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.
2016-08-01
Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
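The pre-filtering plus superpixel stage can be mimicked in a toy way with scikit-image; this is an illustrative analog, not the CAESAR library, and the synthetic image, thresholds and parameters are assumptions (the channel_axis argument assumes a recent scikit-image version):

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import slic

rng = np.random.default_rng(3)
image = rng.normal(0.0, 0.1, (128, 128))
image[40:60, 40:90] += 1.0                 # a fake patch of diffuse emission

smoothed = gaussian(image, sigma=2)        # denoising / diffuse-enhancement stage
labels = slic(smoothed, n_segments=200, compactness=0.1, channel_axis=None)

# Superpixels whose mean brightness is high are candidate source regions.
bright = [s for s in np.unique(labels) if smoothed[labels == s].mean() > 0.5]
```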
The Distressed Brain: A Group Blind Source Separation Analysis on Tinnitus
De Ridder, Dirk; Vanneste, Sven; Congedo, Marco
2011-01-01
Background: Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. Methodology: In a group of tinnitus patients with variable amounts of tinnitus-related distress, as measured by the Tinnitus Questionnaire (TQ), electroencephalography (EEG) is performed, evaluating the patients' resting-state electrical brain activity. This resting-state electrical activity is compared with a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration and tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high- and low-distress tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity, as reflected by lagged phase synchronization, is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the tinnitus group as a whole is performed. Conclusions: Tinnitus can be characterized by at least four BSS components, two of which are posterior cingulate based, one based on the subgenual anterior cingulate and one based on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic syndromes and posttraumatic stress disorder, and might therefore represent a specific distress network. PMID:21998628
A Deep Chandra ACIS Survey of M83
NASA Astrophysics Data System (ADS)
Long, Knox S.; Kuntz, Kip D.; Blair, William P.; Godfrey, Leith; Plucinsky, Paul P.; Soria, Roberto; Stockdale, Christopher; Winkler, P. Frank
2014-06-01
We have obtained a series of deep X-ray images of the nearby galaxy M83 using Chandra, with a total exposure of 729 ks. Combining the new data with earlier archival observations totaling 61 ks, we find 378 point sources within the D25 contour of the galaxy. We find 80 more sources, mostly background active galactic nuclei (AGNs), outside of the D25 contour. Of the X-ray sources, 47 have been detected in a new radio survey of M83 obtained using the Australia Telescope Compact Array. Of the X-ray sources, at least 87 seem likely to be supernova remnants (SNRs), based on a combination of their properties in X-rays and at other wavelengths. We attempt to classify the point source population of M83 through a combination of spectral and temporal analysis. As part of this effort, we carry out an initial spectral analysis of the 29 brightest X-ray sources. The soft X-ray sources in the disk, many of which are SNRs, are associated with the spiral arms, while the harder X-ray sources, mostly X-ray binaries (XRBs), do not appear to be. After eliminating AGNs, foreground stars, and identified SNRs from the sample, we construct the cumulative luminosity function (CLF) of XRBs brighter than 8 × 10³⁵ erg s⁻¹. Despite M83's relatively high star formation rate, the CLF indicates that most of the XRBs in the disk are low-mass XRBs. Based on observations made with NASA's Chandra X-Ray Observatory. NASA's Chandra Observatory is operated by Smithsonian Astrophysical Observatory under contract NAS83060 and the data were obtained through program GO1-12115.
Iliev, Filip L; Stanev, Valentin G; Vesselinov, Velimir V; Alexandrov, Boian S
2018-01-01
Factor analysis is broadly used as a powerful unsupervised machine learning tool for reconstruction of hidden features in recorded mixtures of signals. In the case of a linear approximation, the mixtures can be decomposed by a variety of model-free Blind Source Separation (BSS) algorithms. Most of the available BSS algorithms consider an instantaneous mixing of signals, while the case when the mixtures are linear combinations of signals with delays is less explored. Especially difficult is the case when the number of sources of the signals with delays is unknown and has to be determined from the data as well. To address this problem, in this paper, we present a new method based on Nonnegative Matrix Factorization (NMF) that is capable of identifying: (a) the unknown number of the sources, (b) the delays and speed of propagation of the signals, and (c) the locations of the sources. Our method can be used to decompose records of mixtures of signals with delays emitted by an unknown number of sources in a nondispersive medium, based only on recorded data. This is the case, for example, when electromagnetic signals from multiple antennas are received asynchronously; or mixtures of acoustic or seismic signals recorded by sensors located at different positions; or when a shift in frequency is induced by the Doppler effect. By applying our method to synthetic datasets, we demonstrate its ability to identify the unknown number of sources as well as the waveforms, the delays, and the strengths of the signals. Using Bayesian analysis, we also evaluate estimation uncertainties and identify the region of likelihood where the positions of the sources can be found.
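For contrast with the delay-aware method described above, a plain instantaneous NMF decomposition, which estimates neither delays nor the number of sources, can be run with scikit-learn on synthetic nonnegative mixtures:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 500)
sources = np.vstack([np.abs(np.sin(2 * np.pi * 5 * t)),
                     np.abs(np.sin(2 * np.pi * 9 * t))])  # nonnegative sources
mixing = rng.uniform(0.2, 1.0, (6, 2))                    # 6 sensors, 2 sources
X = mixing @ sources                                      # instantaneous mixtures

model = NMF(n_components=2, init="nndsvd", max_iter=500)
W = model.fit_transform(X)   # estimated mixing matrix
H = model.components_        # estimated sources, up to scale and permutation
```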
Source Plane Reconstruction of the Bright Lensed Galaxy RCSGA 032727-132609
NASA Technical Reports Server (NTRS)
Sharon, Keren; Gladders, Michael D.; Rigby, Jane R.; Wuyts, Eva; Koester, Benjamin P.; Bayliss, Matthew B.; Barrientos, L. Felipe
2011-01-01
We present new HST/WFC3 imaging data of RCS2 032727-132609, a bright lensed galaxy at z=1.7 that is magnified and stretched by the lensing cluster RCS2 032727-132623. Using this new high-resolution imaging, we modify our previous lens model (which was based on ground-based data) to fully understand the lensing geometry, and use it to reconstruct the lensed galaxy in the source plane. This giant arc represents a unique opportunity to peer into 100-pc scale structures in a high redshift galaxy. This new source reconstruction will be crucial for a future analysis of the spatially-resolved rest-UV and rest-optical spectra of the brightest parts of the arc.
An open-source computational and data resource to analyze digital maps of immunopeptidomes
Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi
2015-01-01
We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies. DOI: http://dx.doi.org/10.7554/eLife.07661.001 PMID:26154972
NASA Astrophysics Data System (ADS)
Bartnik, R.; Hnydiuk-Stefan, A.; Buryn, Z.
2017-11-01
This paper reports the results of an investment strategy analysis for different electricity sources. A new methodology and theory for calculating the market value of a power plant and the value of the electricity it supplies are presented. Financial gain is the most important criterion in an investor's assessment of an investment. An investment strategy has to involve a careful analysis of each project under consideration, so that the right decision and selection can be made while the various components of the projects are weighed; the latter primarily include the aspects of risk and uncertainty. The profitability of an investment in electricity sources (as with other investments) is assessed with measures of the economic effectiveness of an investment, based on calculations of, e.g., the power plant market value and the value of the electricity supplied by the plant. The values of these measures determine the investment strategy in energy sources.
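Measures of this kind ultimately rest on discounted cash flows; a minimal net-present-value sketch with illustrative numbers (not the paper's methodology or data) is:

```python
def net_present_value(cash_flows, rate):
    """Discounted cash flow: NPV = sum_t CF_t / (1 + r)^t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative only: a year-0 outlay followed by constant net revenues
# over an assumed 20-year plant life, discounted at 7%.
flows = [-1_000.0] + [120.0] * 20
print(net_present_value(flows, rate=0.07))
```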
NASA Astrophysics Data System (ADS)
Meng, L.; Ampuero, J. P.; Rendon, H.
2010-12-01
Back projection of teleseismic waves based on array processing has become a popular technique for earthquake source imaging, in particular to track the areas of the source that generate the strongest high-frequency radiation. The technique has previously been applied to study the rupture process of the Sumatra earthquake and the supershear rupture of the Kunlun earthquake. Here we attempt to image the Haiti earthquake using data recorded by the Venezuela National Seismic Network (VNSN). The network is composed of 22 broad-band stations with an east-west oriented geometry, and is located approximately 10 degrees away from Haiti, perpendicular to the Enriquillo fault strike. This is the first opportunity to exploit the privileged position of the VNSN to study large earthquake ruptures in the Caribbean region. It is also a great opportunity to explore back projection of the crustal Pn phase at regional distances, which provides unique complementary insights to teleseismic source inversions. The challenge in the analysis of the 2010 M7.0 Haiti earthquake is its very compact source region, possibly shorter than 30 km, which is below the resolution limit of standard back projection techniques based on beamforming. Results of back projection analysis using teleseismic USArray data reveal little detail of the rupture process. To overcome the classical resolution limit we explored the Multiple Signal Classification (MUSIC) method, a high-resolution array processing technique based on the orthogonality of the signal and noise subspaces of the data covariance, which achieves both enhanced resolution and a better ability to resolve closely spaced sources. We experimented with various synthetic earthquake scenarios to test the resolution and found that MUSIC provides at least three times higher resolution than beamforming. We also studied the inherent bias due to interference of coherent Green's functions, which leads to a potential quantification of the bias uncertainty of the back projection. Preliminary results from the Venezuela data set show an east-to-west rupture propagation along the fault at sub-Rayleigh rupture speed, consistent with a compact source with two significant asperities, which are confirmed by the source time function obtained from Green's function deconvolution and by other source inversion results. These efforts could lead the Venezuela National Seismic Network to play a prominent role in the timely characterization of the rupture process of large earthquakes in the Caribbean, including future ruptures along the yet-unbroken segments of the Enriquillo fault system.
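The core of MUSIC, projecting steering vectors onto the noise subspace of the data covariance, can be sketched for a generic narrowband uniform linear array; station geometry, Green's function effects and the seismic specifics of this study are not modeled here:

```python
import numpy as np

def music_spectrum(X, n_sources, angles, d=0.5):
    """Narrowband MUSIC for a uniform linear array (element spacing d in
    wavelengths): 1 / ||E_n^H a(theta)||^2 peaks at source directions."""
    R = X @ X.conj().T / X.shape[1]           # sample covariance matrix
    _, V = np.linalg.eigh(R)                  # eigenvectors, ascending eigenvalues
    En = V[:, :-n_sources]                    # noise subspace
    k = np.arange(X.shape[0])
    P = np.empty(len(angles))
    for i, th in enumerate(angles):
        a = np.exp(2j * np.pi * d * k * np.sin(th))   # steering vector
        P[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return P

# Two closely spaced plane waves on an 8-element array, plus noise.
rng = np.random.default_rng(9)
n, snapshots = 8, 200
k = np.arange(n)[:, None]
s = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
A = np.exp(2j * np.pi * 0.5 * k * np.sin([0.1, 0.18]))   # steering matrix
X = A @ s + 0.1 * (rng.standard_normal((n, snapshots))
                   + 1j * rng.standard_normal((n, snapshots)))
angles = np.linspace(-0.5, 0.5, 1001)
peaks = music_spectrum(X, 2, angles)          # two sharp peaks near 0.1 and 0.18
```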
Application of Ontology Technology in Health Statistic Data Analysis.
Guo, Minjiang; Hu, Hongpu; Lei, Xingyun
2017-01-01
Research purpose: to establish a health management ontology for the analysis of health statistics data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and the enhancement of management efficiency.
Constrained Null Space Component Analysis for Semiblind Source Separation Problem.
Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn
2018-02-01
The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduce an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also present the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we show that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed in the optimization community for solving the BSS problem. As examples, we demonstrate that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
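As a point of reference for the unconstrained baseline that c-NCA generalizes, a standard ICA separation can be run with scikit-learn's FastICA on synthetic mixtures:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two independent sources
A = np.array([[1.0, 0.5], [0.4, 1.2]])             # unknown mixing matrix
X = S @ A.T                                        # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)    # recovered sources, up to order and scale
```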
NASA Astrophysics Data System (ADS)
Allgaier, Joachim
2011-06-01
Media accounts of reality have the potential to influence public opinion and decision-making processes. Therefore, who does and who does not have access to the media and can make their voice heard is a crucial question with serious political consequences. This article investigates whether the speciality of journalists influences their source selection procedures. The coverage of science in schools is an interesting example, since it can be handled by specialized science or education correspondents, but also by general news reporters. A public controversy in the UK about the inclusion of creationism in a school is used to identify which types of sources were selected by various journalists. The focus is on the selection of sources and whether journalists with different specialties consider various sources relevant and credible. A content analysis of articles featuring this controversy is combined with an analysis of correspondents' strategies for selecting sources, based on interviews with them. The findings suggest that, compared with journalists who specialize in education issues, science correspondents employ a narrower scope when seeking sources. This might have important consequences for the representation of views on science education in the media.
Derrien, Morgane; Kim, Min-Seob; Ock, Giyoung; Hong, Seongjin; Cho, Jinwoo; Shin, Kyung-Hoon; Hur, Jin
2018-03-15
The two popular source tracing tools of stable isotope ratios (δ¹³C and δ¹⁵N) and fluorescence spectroscopy were used to estimate the relative source contributions to sediment organic matter (SeOM) at five different river sites in an agricultural-forested watershed (Soyang Lake watershed), and their capabilities for source assignment were compared. Bulk sediments were used for the stable isotopes, while alkaline extractable organic matter (AEOM) from sediments was used to obtain fluorescence indices for SeOM. Several source discrimination indices were compiled for a range of SeOM sources distributed in the catchments of the watershed, including soils, forest leaves, crop (C3 and C4) and riparian plants, periphyton, and organic fertilizers. The relative source contributions to the river sediment samples were estimated via end-member mixing analysis (EMMA) based on several selected discrimination indices. The EMMA based on the isotopes demonstrated that all sediments were characterized by a medium to high contribution of periphyton, ranging from ~30% to 70%, except for one site heavily affected by forest and agricultural fields, with relatively high contributions of terrestrial materials. The EMMA based on fluorescence parameters, however, did not show similar results, giving low contributions from forest leaves and periphyton. The characteristics of the studied watershed were more consistent with the source contributions determined by the isotope ratios. The discrepancy in EMMA capability for source assignment between the two analytical tools can be explained by the limited analytical window of fluorescence spectroscopy, which misses the non-fluorescent fraction of dissolved organic matter, and by the inability of AEOM to represent the original bulk particulate organic matter (POM). Copyright © 2017 Elsevier B.V. All rights reserved.
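EMMA reduces to solving linear tracer-balance equations with mass closure; a least-squares sketch with invented end-member signatures (the study's measured values are not reproduced here) is:

```python
import numpy as np

# Rows: tracers (delta13C, delta15N); columns: end members (soil, periphyton,
# forest leaves). All values are illustrative, not the study's measurements.
E = np.array([[-26.0, -20.0, -29.0],
              [  3.0,   7.0,  -1.0]])
mix = np.array([-23.0, 4.5])     # tracer signature of a sediment sample

# Append the mass-balance row sum(f) = 1 and solve for the fractions f.
# (Unconstrained least squares for brevity; scipy.optimize.nnls would
# additionally enforce f >= 0.)
A = np.vstack([E, np.ones(3)])
b = np.append(mix, 1.0)
f, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f)                          # estimated source fractions
```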
Knowledge Acquisition Using Linguistic-Based Knowledge Analysis
Daniel L. Schmoldt
1998-01-01
Most knowledge-based system development efforts include acquiring knowledge from one or more sources. Difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them…
School-Based Medicaid for Children with Disabilities. inForum
ERIC Educational Resources Information Center
Sopko, Kim Moherek
2006-01-01
Obtaining sufficient funding to cover ever-increasing costs of services for students with disabilities is a critical responsibility for state and district special education directors. Medicaid is a possible source of support for certain school-based services in conjunction with other federal funds. This policy analysis provides a brief background…
School-Based Decision Making: A Principal-Agent Perspective.
ERIC Educational Resources Information Center
Ferris, James M.
1992-01-01
A principal-agent framework is used to examine potential gains in educational performance and potential threats to public accountability that school-based decision-making proposals pose. Analysis underscores the need to tailor the design of decentralized decision making to the sources of poor educational performance and threats to school…
Chen, Lian; Zhou, Shenglu; Wu, Shaohua; Wang, Chunhui; Li, Baojie; Li, Yan; Wang, Junxiao
2018-08-01
Two quantitative methods (emission inventory and isotope ratio analysis) were combined to apportion source contributions of heavy metals entering agricultural soils in the Lihe River watershed (Taihu region, east China). Source apportionment based on the emission inventory method indicated that for Cd, Cr, Cu, Pb, and Zn, the mean percentage input from atmospheric deposition was highest (62-85%), followed by irrigation (12-27%) and fertilization (1-14%); thus, these heavy metals were derived mainly from industrial activities and traffic emissions. For Ni, the combined percentage input from irrigation and fertilization was approximately 20% higher than that from atmospheric deposition, indicating that Ni was mainly derived from agricultural activities. Based on isotope ratio analysis, atmospheric deposition accounted for 57-93% of the Pb entering soil, with a mean value of 69.3%, indicating that this was the major source of Pb entering soil in the study area. The mean contributions of irrigation and fertilization to Pb pollution of soil ranged from 0% to 10%, indicating that they played only a marginal role. Overall, the results obtained using the two methods were similar. This study provides a reliable approach for source apportionment of heavy metals entering agricultural soils in the study area, and clearly has potential application for future studies in other regions. Copyright © 2018 Elsevier Ltd. All rights reserved.
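The isotope-ratio apportionment rests on the generic two-end-member mixing relation, written here in assumed standard notation rather than quoted from the paper:

```latex
% Generic binary isotope mixing (notation assumed, not from the paper):
% f1 is the fraction of Pb from source 1 inferred from one isotope ratio R.
\[
  f_{1} = \frac{R_{\text{sample}} - R_{\text{source 2}}}
               {R_{\text{source 1}} - R_{\text{source 2}}},
  \qquad f_{2} = 1 - f_{1}.
\]
```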
NASA Astrophysics Data System (ADS)
Beyene, F.; Knospe, S.; Busch, W.
2015-04-01
Landslide detection and monitoring remain difficult with conventional differential radar interferometry (DInSAR) because most pixels of radar interferograms around landslides are affected by different error sources. These are mainly related to the high radar viewing angles and the associated spatial distortions (such as layover and shadow), temporal decorrelation owing to vegetation cover, and the speed and direction of the sliding masses. On the other hand, GIS can be used to integrate spatial datasets obtained from many sources (including radar and non-radar sources). In this paper, a GRID data model is proposed to integrate deformation data derived from DInSAR processing with other radar-origin data (coherence, layover and shadow, slope and aspect, local incidence angle) and external datasets collected from field studies of landslide sites and other sources (geology, geomorphology, hydrology). After coordinate transformation and merging of the data, candidate landslide-representing pixels with high-quality radar signals were filtered out by applying a GIS-based multicriteria filtering analysis (GIS-MCFA), which excludes grid points in areas of shadow and layover, low coherence, non-detectable and non-landslide deformations, and other possible sources of error from the DInSAR data processing. Finally, the results obtained from GIS-MCFA were verified using the external datasets (existing landslide sites collected from fieldwork, geological and geomorphological maps, rainfall data, etc.).
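The GIS-MCFA step is essentially a conjunction of boolean masks over co-registered grids; a numpy sketch with invented layers and thresholds (the paper's actual criteria values are not given in the abstract) is:

```python
import numpy as np

rng = np.random.default_rng(6)
shape = (200, 200)
coherence = rng.uniform(0, 1, shape)               # from InSAR processing
deformation = rng.normal(0, 5, shape)              # mm, line-of-sight
shadow_layover = rng.uniform(0, 1, shape) < 0.05   # geometric distortion mask

# GIS-MCFA-style filtering; criteria and thresholds are illustrative.
candidates = (
    (coherence > 0.6)              # keep reliable interferogram pixels
    & ~shadow_layover              # drop shadow/layover areas
    & (np.abs(deformation) > 3.0)  # keep pixels with detectable motion
)
rows, cols = np.nonzero(candidates)  # grid points passing all criteria
```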
Visualization of NO2 emission sources using temporal and spatial pattern analysis in Asia
NASA Astrophysics Data System (ADS)
Schütt, A. M. N.; Kuhlmann, G.; Zhu, Y.; Lipkowitsch, I.; Wenig, M.
2016-12-01
Nitrogen dioxide (NO2) is an indicator of population density and level of development, but the contributions of the different emission sources to the overall concentrations remain mostly unknown. In order to allocate fractions of OMI NO2 to emission types, we investigate several temporal cycles and regional patterns. Our analysis is based on daily maps of tropospheric NO2 vertical column densities (VCDs) from the Ozone Monitoring Instrument (OMI). The data set is mapped to a high-resolution grid by a histopolation algorithm. This algorithm is based on a continuous parabolic spline, producing more realistic smooth distributions while reproducing the measured OMI values when integrating over ground pixel areas. In the resulting sequence of zoomed-in maps, we analyze weekly and annual cycles for cities, countryside and highways in China, Japan and the Republic of Korea, look for patterns and trends, and compare the derived results to emission sources in Central Europe and North America. Exploiting the increased heating in winter compared to summer and the heavier traffic during the week than on Sundays, we separate the contributions of traffic, heating and power plants and visualize maps of the different sources. We also look into the influence of emission control measures during big events such as the Olympic Games 2008 and the World Expo 2010 as a possibility to confirm our classification of NO2 emission sources.
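The weekday/Sunday separation rests on a simple statistic: the ratio of Sunday means to working-day means in each grid cell. A hedged pandas sketch with synthetic data (the variable names and the size of the Sunday dip are invented):

```python
import numpy as np
import pandas as pd

# Synthetic daily NO2 VCD series for one grid cell (arbitrary units),
# with an artificial dip on Sundays mimicking reduced traffic.
days = pd.date_range("2010-01-01", "2010-12-31", freq="D")
rng = np.random.default_rng(1)
vcd = 5 + rng.normal(0, 0.5, len(days)) - 1.5 * (days.dayofweek == 6)

series = pd.Series(vcd, index=days)
weekday_mean = series[series.index.dayofweek < 6].mean()
sunday_mean = series[series.index.dayofweek == 6].mean()

# A pronounced Sunday dip points to traffic as a dominant contributor.
print(f"Sunday/weekday ratio: {sunday_mean / weekday_mean:.2f}")
```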
de Loë, R C; Murray, D; Michaels, S; Plummer, R
2016-07-01
Organizations at the local and regional scales often face the challenge of developing policy mechanisms rapidly and concurrently, whether in response to expanding mandates, newly identified threats, or changes in the political environment. In the Canadian Province of Ontario, rapid, concurrent policy development was considered desirable by 19 regional organizations tasked with developing policies for protection of drinking water sources under very tight and highly prescribed mandates. An explicit policy transfer approach was used by these organizations. Policy transfer refers to using knowledge of policies, programs, and institutions in one context in the development of policies, programs, and institutions in another. This paper assesses three online mechanisms developed to facilitate policy transfer for source water protection in Ontario. Insights are based on a survey of policy planners from the 19 regional organizations who used the three policy transfer tools, supplemented by an analysis of three policies created and transferred among the 19 regional source water protection organizations. Policy planners in the study indicated they had used policy transfer to develop source protection policies for their regions, a finding confirmed by analysis of the text of policies. While the online policy transfer tools clearly facilitated systematic policy transfer, participants still preferred informal, direct exchanges with their peers in other regions over the use of the internet-based policy transfer mechanisms created on their behalf.
As above, so below? Towards understanding inverse models in BCI
NASA Astrophysics Data System (ADS)
Lindgren, Jussi T.
2018-02-01
Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
NASA Astrophysics Data System (ADS)
Massin, F.; Malcolm, A. E.
2017-12-01
Knowing earthquake source mechanisms gives valuable information for earthquake response planning and hazard mitigation. Earthquake source mechanisms can be analyzed using long period waveform inversion (for moderate size sources with sufficient signal to noise ratio) and body-wave first motion polarity or amplitude ratio inversion (for micro-earthquakes with sufficient data coverage). A robust approach that gives both source mechanisms and their associated probabilities across all source scales would greatly simplify the determination of source mechanisms and allow for more consistent interpretations of the results. Following previous work on shift and stack approaches, we develop such a probabilistic source mechanism analysis, using waveforms, which does not require polarity picking. For a given source mechanism, the first period of the observed body-waves is selected for all stations, multiplied by their corresponding theoretical polarity and stacked together. (The first period is found from a manually picked travel time by measuring the central period where the signal power is concentrated, using the second moment of the power spectral density function.) As in other shift and stack approaches, our method is not based on the optimization of an objective function through an inversion. Instead, the power of the polarity-corrected stack is a proxy for the likelihood of the trial source mechanism, with the most powerful stack corresponding to the most likely source mechanism. Using synthetic data, we test our method for robustness to the data coverage, coverage gap, signal to noise ratio, travel-time picking errors and non-double couple component. We then present results for field data in a volcano-tectonic context. Our results are reliable when constrained by 15 body-wave wavelets, with an azimuthal gap below 150 degrees, a signal to noise ratio over 1 and an arrival time error below a fifth of the body-wave period (0.2T). We demonstrate that the source scanning approach for source mechanism analysis has similar advantages to waveform inversion (full waveform data, no manual intervention, probabilistic approach) and similar applicability to polarity inversion (any source size, any instrument type).
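The heart of the method is small enough to sketch: multiply each station's first-period wavelet by the trial mechanism's theoretical polarity and use the stack power as the likelihood proxy. A toy NumPy version (synthetic wavelets; the polarity vectors stand in for a real radiation-pattern calculation):

```python
import numpy as np

def stack_power(wavelets, theoretical_polarities):
    """Power of the polarity-corrected stack: a proxy for the likelihood of
    the trial mechanism that predicts `theoretical_polarities`."""
    corrected = wavelets * theoretical_polarities[:, None]   # flip traces
    return float(np.sum(corrected.sum(axis=0) ** 2))

# Synthetic test: 15 stations, first-period body wavelets of 50 samples.
rng = np.random.default_rng(2)
true_pol = rng.choice([-1.0, 1.0], size=15)
wavelet = np.sin(np.linspace(0, 2 * np.pi, 50))
data = true_pol[:, None] * wavelet + rng.normal(0, 0.3, (15, 50))

trial_pol = rng.choice([-1.0, 1.0], size=15)   # some other trial mechanism
print(stack_power(data, true_pol) > stack_power(data, trial_pol))  # usually True
```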
International patent analysis of water source heat pump based on orbit database
NASA Astrophysics Data System (ADS)
Li, Na
2018-02-01
Using the Orbit database, this paper analysed the international patents of the water source heat pump (WSHP) industry with patent analysis methods such as analysis of publication tendency, geographical distribution, technology leaders and top assignees. It is found that the beginning of the 21st century was a period of rapid growth in WSHP patent applications. Germany and the United States carried out research and development of WSHP early on, but Japan and China have since become the leading countries for patent applications. China has been developing rapidly in recent years, but its patents are concentrated in universities and urgently need to be transferred to industry. Through an objective analysis, this paper aims to provide appropriate decision references for the development of the domestic WSHP industry.
Matsuoka, Yu; Shimizu, Kazuyuki
2013-10-20
It is quite important to understand the basic principles embedded in the main metabolism for the interpretation of fermentation data. For this, it is useful to understand the regulation mechanisms based on a systems biology approach. In the present study, we performed perturbation analysis together with computer simulation based on models that include the effects of global regulators on pathway activation in the main metabolism of Escherichia coli. The main focus is acetate overflow metabolism and the co-fermentation of multiple carbon sources. The perturbation analysis was first performed to understand the nature of the feed-forward loop formed by the activation of Pyk by FDP (F1,6BP) and the feedback loop formed by the inhibition of Pfk by PEP in glycolysis. These loops, together with the effect of the transcription factor Cra modulated by the FDP level, affect glycolysis activity. The PTS (phosphotransferase system) acts as a feedback system by repressing the glucose uptake rate when it increases. It was also shown that an increased PTS flux (or glucose consumption rate) causes the PEP/PYR ratio to decrease, lowering EIIA-P, Cya and cAMP-Crp, where the reduced cAMP-Crp in turn represses the TCA cycle so that more acetate is formed. This was further verified by detailed computer simulation. In the case of multiple carbon sources such as glucose and xylose, computer simulation showed sequential utilization of carbon sources for the wild type, whereas co-consumption of multiple carbon sources at slow consumption rates was observed for the ptsG mutant; this was verified by experiments. Moreover, the effect of a specific gene knockout such as Δpyk on the metabolic characteristics was also investigated based on the computer simulation. Copyright © 2013 Elsevier B.V. All rights reserved.
Hebels, Dennie G A J; Rasche, Axel; Herwig, Ralf; van Westen, Gerard J P; Jennen, Danyel G J; Kleinjans, Jos C S
2016-01-01
When evaluating compound similarity, addressing multiple sources of information to reach conclusions about common pharmaceutical and/or toxicological mechanisms of action is a crucial strategy. In this chapter, we describe a systems biology approach that incorporates analyses of hepatotoxicant data for 33 compounds from three different sources: a chemical structure similarity analysis based on the 3D Tanimoto coefficient, a chemical structure-based protein target prediction analysis, and a cross-study/cross-platform meta-analysis of in vitro and in vivo human and rat transcriptomics data derived from public resources (i.e., the diXa data warehouse). Hierarchical clustering of the outcome scores of the separate analyses did not result in a satisfactory grouping of compounds considering their known toxic mechanism as described in literature. However, a combined analysis of multiple data types may hypothetically compensate for missing or unreliable information in any of the single data types. We therefore performed an integrated clustering analysis of all three data sets using the R-based tool iClusterPlus. This indeed improved the grouping results. The compound clusters that were formed by means of iClusterPlus represent groups that show similar gene expression while simultaneously integrating a similarity in structure and protein targets, which corresponds much better with the known mechanism of action of these toxicants. Using an integrative systems biology approach may thus overcome the limitations of the separate analyses when grouping liver toxicants sharing a similar mechanism of toxicity.
Towards shot-noise limited diffraction experiments with table-top femtosecond hard x-ray sources.
Holtz, Marcel; Hauf, Christoph; Weisshaupt, Jannick; Salvador, Antonio-Andres Hernandez; Woerner, Michael; Elsaesser, Thomas
2017-09-01
Table-top laser-driven hard x-ray sources with kilohertz repetition rates are an attractive alternative to large-scale accelerator-based systems and have found widespread applications in x-ray studies of ultrafast structural dynamics. Hard x-ray pulses of 100 fs duration have been generated at the Cu Kα wavelength with a photon flux of up to 10^9 photons per pulse into the full solid angle, perfectly synchronized to the sub-100-fs optical pulses from the driving laser system. Based on spontaneous x-ray emission, such sources display a particular noise behavior which impacts the sensitivity of x-ray diffraction experiments. We present a detailed analysis of the photon statistics and temporal fluctuations of the x-ray flux, together with experimental strategies to optimize the sensitivity of optical pump/x-ray probe experiments. We demonstrate measurements close to the shot-noise limit of the x-ray source.
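Because spontaneous emission yields Poissonian counting statistics, the attainable relative noise of an N-photon measurement scales as 1/sqrt(N). A short numerical illustration of this shot-noise limit (photon numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
photons_per_pulse = 1e5                       # illustrative detected photons
pulses = rng.poisson(photons_per_pulse, size=10_000)

measured_rel_noise = pulses.std() / pulses.mean()
shot_noise_limit = 1 / np.sqrt(photons_per_pulse)
print(f"measured {measured_rel_noise:.2e} vs limit {shot_noise_limit:.2e}")
```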
Research on pathogens at Great Lakes beaches: sampling, influential factors, and potential sources
2013-01-01
The overall mission of this work is to provide science-based information and methods that will allow beach managers to more accurately make beach closure and advisory decisions, understand the sources and physical processes affecting beach contaminants, and understand how science-based information can be used to mitigate and restore beaches and protect the public. The U.S. Geological Survey (USGS), in collaboration with many Federal, State, and local agencies and universities, has conducted research on beach health issues in the Great Lakes Region for more than a decade. The work consists of four science elements that align with the USGS Beach Health Initiative Mission: real-time assessments of water quality; coastal processes; pathogens and source tracking; and data analysis, interpretation, and communication. The ongoing or completed research for the pathogens and source tracking topic is described in this fact sheet.
Model of a thin film optical fiber fluorosensor
NASA Technical Reports Server (NTRS)
Egalon, Claudio O.; Rogowski, Robert S.
1991-01-01
The efficiency of core-light injection from sources in the cladding of an optical fiber is modeled analytically by means of the exact field solution of a step-profile fiber. The analysis is based on the techniques by Marcuse (1988) in which the sources are treated as infinitesimal electric currents with random phase and orientation that excite radiation fields and bound modes. Expressions are developed based on an infinite cladding approximation which yield the power efficiency for a fiber coated with fluorescent sources in the core/cladding interface. Marcuse's results are confirmed for the case of a weakly guiding cylindrical fiber with fluorescent sources uniformly distributed in the cladding, and the power efficiency is shown to be practically constant for variable wavelengths and core radii. The most efficient fibers have the thin film located at the core/cladding boundary, and fibers with larger differences in the indices of refraction are shown to be the most efficient.
Flow diagram analysis of electrical fatalities in construction industry.
Chi, Chia-Fen; Lin, Yuan-Yuan; Ikhwan, Mohamad
2012-01-01
The current study reanalyzed 250 electrical fatalities in the construction industry from 1996 to 2002 into seven patterns based on the source of electricity (power line, energized equipment, improperly installed or damaged equipment) and direct contact or indirect contact through some source of injury (boom vehicle, metal bar or pipe, and other conductive material). Each fatality was coded in terms of age, company size, experience, performing tasks, source of injury, accident cause and hazard pattern. The Chi-square Automatic Interaction Detector (CHAID) was applied to the coded data of the fatal electrocutions to find a subset of predictors that might derive meaningful classifications or accident scenarios. A series of flow diagrams was constructed based on the CHAID results to illustrate the flow of electricity travelling from the electrical source to the human body. Each of the flow diagrams can be directly linked with feasible prevention strategies by cutting the flow of electricity.
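CHAID is not part of the common Python scientific stack, but the underlying idea, recursively partitioning coded fatality records to expose accident scenarios, can be sketched with an ordinary decision tree as a stand-in (synthetic records and hypothetical features; this is not the authors' implementation):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic coded records: columns are hypothetical ordinal codes.
rng = np.random.default_rng(4)
X = rng.integers(0, 4, size=(250, 3))          # [age, company_size, experience]
y = (X[:, 0] + X[:, 2] > 3).astype(int)        # invented hazard-pattern label

# Shallow tree as a CHAID-like partitioning into accident scenarios.
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["age", "company_size", "experience"]))
```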
NASA Astrophysics Data System (ADS)
Huan, Huan; Wang, Jinsheng; Lai, Desheng; Teng, Yanguo; Zhai, Yuanzheng
2015-05-01
Well vulnerability assessment is essential for groundwater source protection. A quantitative approach to assess well vulnerability in a well capture zone is presented, based on forward solute transport modeling. This method was applied to three groundwater source areas (Jiuzhan, Hadawan and Songyuanhada) in Jilin City, northeast China. The ratio of the maximum contaminant concentration at the well to the released concentration at the contamination source (c_max/c_0) was determined as the well vulnerability indicator. The results indicated that well vulnerability was higher close to the pumping well. The well vulnerability in each groundwater source area was low. Compared with the other two source areas, the cone of depression at Jiuzhan resulted in higher spatial variability of c_max/c_0 and lower minimum c_max/c_0 by three orders of magnitude. Furthermore, a sensitivity analysis indicated that the denitrification rate in the aquifer was the most sensitive with respect to well vulnerability. A process to derive a NO3-N concentration at the pumping well is presented, based on determining the maximum nitrate loading limit to satisfy China's drinking-water quality standards. Finally, the advantages, disadvantages and prospects for improving the precision of this well vulnerability assessment approach are discussed.
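The indicator itself is just a concentration ratio. A toy sketch of how c_max/c_0 responds to the denitrification rate, assuming pure advection with first-order decay along the flow path (parameter values are illustrative, not from the study):

```python
import numpy as np

def cmax_over_c0(travel_time_days, decay_rate_per_day):
    """Toy well-vulnerability indicator: peak concentration at the well
    relative to the source, for advection with first-order decay
    (e.g., denitrification) and negligible dispersion."""
    return np.exp(-decay_rate_per_day * travel_time_days)

# Vulnerability drops sharply as the decay rate grows, consistent with the
# reported sensitivity to denitrification.
for k in (0.001, 0.005, 0.01):
    print(k, f"{cmax_over_c0(travel_time_days=365.0, decay_rate_per_day=k):.3f}")
```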
Acoustic Emission Source Location Using a Distributed Feedback Fiber Laser Rosette
Huang, Wenzhu; Zhang, Wentao; Li, Fang
2013-01-01
This paper proposes an approach for acoustic emission (AE) source localization in a large marble stone using distributed feedback (DFB) fiber lasers. The aim of this study is to detect damage in structures such as those found in civil applications. The directional sensitivity of the DFB fiber laser is investigated by calculating a location coefficient using digital signal analysis. Autocorrelation is used to extract the location coefficient from periodic AE signals, and wavelet packet energy is calculated to obtain the location coefficient of a burst AE source. Normalization is applied to eliminate the influence of the distance and intensity of the AE source. A new location algorithm based on the location coefficient is then presented and tested to determine the location of an AE source using a Delta (Δ) DFB fiber laser rosette configuration. The advantages of the proposed algorithm over traditional methods based on fiber Bragg gratings (FBG) include higher strain resolution for AE detection and the ability to account for two different types of AE source in localization. PMID:24141266
NASA Astrophysics Data System (ADS)
Petroselli, Chiara; Crocchianti, Stefano; Moroni, Beatrice; Castellini, Silvia; Selvaggi, Roberta; Nava, Silvia; Calzolai, Giulia; Lucarelli, Franco; Cappelletti, David
2018-05-01
In this paper, we combined a Potential Source Contribution Function (PSCF) analysis of daily chemical aerosol composition data with hourly aerosol size distributions, with the aim of disentangling the major source areas during a complex and rapidly modulating advection event impacting Central Italy in 2013. Chemical data include a broad set of metals obtained by Proton Induced X-ray Emission (PIXE), main soluble ions from ion chromatography, and elemental and organic carbon (EC, OC) obtained by thermo-optical measurements. Size distributions were recorded with an optical particle counter for eight calibrated size classes in the 0.27-10 μm range. We demonstrated the usefulness of the approach by the positive identification of two very different source areas contributing during the transport event. In particular, biomass burning from Eastern Europe and desert dust from Saharan sources were discriminated based on both chemistry and size distribution time evolution. Hourly back-trajectories (BT) provided the best results in comparison to 6-h or 24-h based calculations.
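PSCF reduces to a conditional probability on a trajectory-endpoint grid: the count of endpoints tied to polluted samples divided by all endpoints in each cell. A minimal NumPy sketch (grid size and pollution flags are synthetic):

```python
import numpy as np

def pscf(endpoints, polluted, grid_shape=(36, 72)):
    """PSCF_ij = m_ij / n_ij: n_ij counts all trajectory endpoints in cell
    (i, j); m_ij counts endpoints of trajectories arriving when the measured
    concentration exceeded a chosen threshold."""
    n = np.zeros(grid_shape)
    m = np.zeros(grid_shape)
    for (i, j), flag in zip(endpoints, polluted):
        n[i, j] += 1
        if flag:
            m[i, j] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(n > 0, m / n, np.nan)

# Synthetic endpoints: (lat_bin, lon_bin) pairs plus a pollution flag each.
rng = np.random.default_rng(5)
pts = list(zip(rng.integers(0, 36, 500), rng.integers(0, 72, 500)))
field = pscf(pts, rng.random(500) < 0.3)
```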
First operation and effect of a new tandem-type ion source based on electron cyclotron resonance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kato, Yushi, E-mail: kato@eei.eng.osaka-u.ac.jp; Kimura, Daiju; Yano, Keisuke
A new tandem-type source based on electron cyclotron resonance plasma has been constructed at Osaka University for producing synthesized ion beams. The magnetic field of the first stage is formed entirely by permanent magnets, i.e., a cylindrically comb-shaped structure, while that of the second stage consists of a pair of mirror coils, a supplemental coil and octupole magnets. Both stage plasmas can be operated individually, and ions produced in the first stage, whose energy is controlled by a large-bore extractor, can be transported to the second stage. We investigate the basic operation and effects of the tandem-type electron cyclotron resonance ion source (ECRIS). Ion beam analysis and plasma parameter measurements are conducted on the produced plasmas in dual-plasma operation as well as in each single operation. We describe the construction and initial experimental results of the new tandem-type ECRIS, which has a wide operating window and aims at producing synthesized ion beams, as this new source can become a universal source in the future.
Sustainable Management Approaches and Revitalization Tools - electronic (SMARTe), is an open-source, web-based, decision support system for developing and evaluating future reuse scenarios for potentially contaminated land. SMARTe contains resources and analysis tools for all asp...
Conversion of the Aerodynamic Preliminary Analysis System (APAS) to an IBM PC Compatible Format
NASA Technical Reports Server (NTRS)
Kruep, John M.
1995-01-01
The conversion of the Aerodynamic Preliminary Analysis System (APAS) software from a Silicon Graphics UNIX-based platform to a DOS-based IBM PC compatible is discussed. Relevant background information is given, followed by a discussion of the steps taken to accomplish the conversion and a discussion of the type of problems encountered during the conversion. A brief comparison of aerodynamic data obtained using APAS with data from another source is also made.
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
Source positions from VLBI combined solution
NASA Astrophysics Data System (ADS)
Bachmann, S.; Thaller, D.; Engelhardt, G.
2014-12-01
The IVS Combination Center at BKG is primarily responsible for combined Earth Orientation Parameter (EOP) products and the generation of a terrestrial reference frame based on VLBI observations (VTRF). The procedure is based on the combination of normal equations provided by six IVS Analysis Centers (ACs). Since more and more ACs also provide source positions in the normal equations, besides EOPs and station coordinates, an estimation of these parameters is possible and should be investigated. In the past, the International Celestial Reference Frame (ICRF) was not generated as a combined solution from several individual solutions but was based on a single solution provided by one AC. The presentation will give an overview of the combination strategy and the possibilities for combined source position determination. This includes comparisons with existing catalogs, quality estimation and possibilities for the rigorous combination of EOP, TRF and CRF in one combination process.
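Combination at the normal-equation level amounts to summing the systems of the individual ACs and solving once. A bare linear-algebra sketch with synthetic matrices (a real VLBI combination adds relative weighting and datum constraints):

```python
import numpy as np

# Synthetic normal equations N_k x = b_k from several analysis centers.
rng = np.random.default_rng(6)
x_true = rng.normal(size=5)                  # e.g., source position corrections
systems = []
for _ in range(6):                           # six ACs, as in the IVS combination
    A = rng.normal(size=(20, 5))             # design matrix of one AC
    y = A @ x_true + rng.normal(0, 0.01, 20)
    systems.append((A.T @ A, A.T @ y))       # (N_k, b_k)

# Combined solution: solve (sum N_k) x = sum b_k.
N = sum(Nk for Nk, _ in systems)
b = sum(bk for _, bk in systems)
print(np.round(np.linalg.solve(N, b) - x_true, 3))   # residuals near zero
```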
NASA Astrophysics Data System (ADS)
Rantz, Robert; Roundy, Shad
2016-04-01
A tremendous amount of research has been performed on the design and analysis of vibration energy harvester architectures with the goal of optimizing power output; most studies assume idealized input vibrations without paying much attention to whether such idealizations are broadly representative of real sources. These "idealized input signals" are typically derived from the expected nature of the vibrations produced by a given source. Little work has been done on corroborating these expectations by compiling a comprehensive list of vibration signals organized by detailed classifications. Vibration data representing 333 signals were collected from the NiPS Laboratory "Real Vibration" database, processed, and categorized according to the source of the signal (e.g. animal, machine, etc.), the number of dominant frequencies, the nature of the dominant frequencies (e.g. stationary, band-limited noise, etc.), and other metrics. By categorizing signals in this way, the set of idealized vibration inputs commonly assumed for harvester input can be corroborated and refined, and investigation of heretofore overlooked vibration input types is motivated. An initial qualitative analysis of vibration signals has been undertaken with the goal of determining how often a standard linear oscillator based harvester is likely the optimal architecture, and how often a nonlinear harvester with a cubic stiffness function might provide improvement. Although preliminary, the analysis indicates that in at least 23% of cases a linear harvester is likely optimal, and in no more than 53% of cases would a nonlinear cubic stiffness based harvester provide improvement.
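Counting dominant frequencies, one of the classification metrics described, can be sketched with a simple FFT peak criterion (the relative threshold is an assumption, not the authors' definition):

```python
import numpy as np

def dominant_frequencies(signal, fs, rel_threshold=0.5):
    """Frequencies whose spectral magnitude exceeds a fraction of the
    largest peak -- a crude count of 'dominant' frequencies."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return freqs[spectrum >= rel_threshold * spectrum.max()]

# Two-tone synthetic vibration: strong 25 Hz line plus a weaker 60 Hz line.
fs = 1000
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 25 * t) + 0.6 * np.sin(2 * np.pi * 60 * t)
print(dominant_frequencies(x, fs))   # -> [25. 60.]
```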
Castro, Eduardo; Martínez-Ramón, Manel; Pearlson, Godfrey; Sui, Jing; Calhoun, Vince D.
2011-01-01
Pattern classification of brain imaging data can enable the automatic detection of differences in cognitive processes of specific groups of interest. Furthermore, it can also give neuroanatomical information related to the regions of the brain that are most relevant to detect these differences by means of feature selection procedures, which are also well-suited to deal with the high dimensionality of brain imaging data. This work proposes the application of recursive feature elimination using a machine learning algorithm based on composite kernels to the classification of healthy controls and patients with schizophrenia. This framework, which evaluates nonlinear relationships between voxels, analyzes whole-brain fMRI data from an auditory task experiment that is segmented into anatomical regions and recursively eliminates the uninformative ones based on their relevance estimates, thus yielding the set of most discriminative brain areas for group classification. The collected data was processed using two analysis methods: the general linear model (GLM) and independent component analysis (ICA). GLM spatial maps as well as ICA temporal lobe and default mode component maps were then input to the classifier. A mean classification accuracy of up to 95%, estimated with a leave-two-out cross-validation procedure, was achieved by multi-source data classification. In addition, it is shown that the classification accuracy obtained by using multi-source data surpasses that reached by using single-source data, showing that this algorithm takes advantage of the complementary nature of GLM and ICA. PMID:21723948
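The elimination loop maps naturally onto scikit-learn's RFE with a linear SVM; the sketch below is a simplification, since the paper's composite kernels over anatomical regions are not reproduced by a plain linear kernel (data are synthetic):

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Synthetic stand-in for per-region fMRI features: 40 subjects x 20 regions.
rng = np.random.default_rng(7)
X = rng.normal(size=(40, 20))
y = (X[:, 3] + X[:, 7] > 0).astype(int)   # only regions 3 and 7 informative

# Recursively drop the least informative regions, keeping the top two.
selector = RFE(SVC(kernel="linear"), n_features_to_select=2).fit(X, y)
print(np.flatnonzero(selector.support_))  # expected: [3 7]
```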
Use of cyclodextrin-based polymer for patulin analysis in apple juice
USDA-ARS?s Scientific Manuscript database
Penicillium expansum, one of the patulin producing fungi that causes decay on apple, is recognized as the main source of patulin contamination on apple and apple products. The widely used method for patulin analysis in apple juice is liquid-liquid extraction with ethyl acetate followed by HPLC-UV or...
Evaluating Cognitive Theory: A Joint Modeling Approach Using Responses and Response Times
ERIC Educational Resources Information Center
Klein Entink, Rinke H.; Kuhn, Jorg-Tobias; Hornke, Lutz F.; Fox, Jean-Paul
2009-01-01
In current psychological research, the analysis of data from computer-based assessments or experiments is often confined to accuracy scores. Response times, although being an important source of additional information, are either neglected or analyzed separately. In this article, a new model is developed that allows the simultaneous analysis of…
NASA Technical Reports Server (NTRS)
Kniffen, Donald A.; Elliott, William W.
1999-01-01
The final report consists of summaries of work proposed, work accomplished, papers and presentations published, and continuing work regarding the cooperative agreement. The work under the agreement is based on analysis of high-energy gamma-ray source data collected by the Energetic Gamma-Ray Experiment Telescope (EGRET).
ERIC Educational Resources Information Center
Liao, Yuen-kuang Cliff; Chang, Huei-wen; Chen, Yu-wen
2008-01-01
A meta-analysis was performed to synthesize existing research comparing the effects of computer applications (i.e., computer-assisted instruction, computer simulations, and Web-based learning) versus traditional instruction on elementary school students' achievement in Taiwan. Forty-eight studies were located from four sources, and their…
Job Descriptions and Organizational Analysis for Hospitals and Related Health Services.
ERIC Educational Resources Information Center
Manpower Administration (DOL), Washington, DC. U.S. Training and Employment Service.
Descriptions of 238 hospital payroll jobs are based on a job analysis study of all jobs within specified departments of 27 hospitals, and are intended for use of public employment offices and as a source of occupational information for hospital personnel administrators. Under major sections corresponding to the hospital divisions of…
USDA-ARS?s Scientific Manuscript database
Compound-specific isotopic analysis of amino acids (CSIA-AA) has emerged in the last decade as a powerful approach for tracing the origins and fate of nitrogen in ecological and biogeochemical studies. This approach is based on the empirical knowledge that source AAs (i.e., phenylalanine), fractiona...
New paradigms for Salmonella source attribution based on microbial subtyping.
Mughini-Gras, Lapo; Franz, Eelco; van Pelt, Wilfrid
2018-05-01
Microbial subtyping is the most common approach for Salmonella source attribution. Typically, attributions are computed using frequency-matching models like the Dutch and Danish models based on phenotyping data (serotyping, phage-typing, and antimicrobial resistance profiling). Herewith, we critically review three major paradigms facing Salmonella source attribution today: (i) the use of genotyping data, particularly Multi-Locus Variable Number of Tandem Repeats Analysis (MLVA), which is replacing traditional Salmonella phenotyping beyond serotyping; (ii) the integration of case-control data into source attribution to improve risk factor identification/characterization; (iii) the investigation of non-food sources, as attributions tend to focus on foods of animal origin only. Population genetics models or simplified MLVA schemes may provide feasible options for source attribution, although there is a strong need to explore novel modelling options as we move towards whole-genome sequencing as the standard. Classical case-control studies are enhanced by incorporating source attribution results, as individuals acquiring salmonellosis from different sources have different associated risk factors. Thus, the more such analyses are performed the better Salmonella epidemiology will be understood. Reparametrizing current models allows for inclusion of sources like reptiles, the study of which improves our understanding of Salmonella epidemiology beyond food to tackle the pathogen in a more holistic way. Copyright © 2017 Elsevier Ltd. All rights reserved.
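A frequency-matching attribution in the spirit of the Dutch model can be sketched as splitting each subtype's human cases across sources in proportion to the subtype's relative frequency in each source. The version below is a deliberately simplified illustration with made-up numbers, not the full model:

```python
import numpy as np

# p[i, j]: relative frequency of Salmonella subtype i in source j (invented).
p = np.array([[0.60, 0.10, 0.05],
              [0.20, 0.50, 0.05],
              [0.05, 0.10, 0.70]])
human_cases = np.array([120, 80, 30])     # observed cases per subtype

# Split each subtype's cases across sources by normalized frequency.
weights = p / p.sum(axis=1, keepdims=True)
cases_by_source = (human_cases[:, None] * weights).sum(axis=0)
print(dict(zip(["poultry", "pigs", "reptiles"], np.round(cases_by_source, 1))))
```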
Methods for determining remanent and total magnetisations of magnetic sources - a review
NASA Astrophysics Data System (ADS)
Clark, David A.
2014-07-01
Assuming without evidence that magnetic sources are magnetised parallel to the geomagnetic field can seriously mislead interpretation and can result in drill holes missing their targets. This article reviews methods that are available for estimating, directly or indirectly, the natural remanent magnetisation (NRM) and total magnetisation of magnetic sources, noting the strengths and weaknesses of each approach. These methods are: (i) magnetic property measurements of samples; (ii) borehole magnetic measurements; (iii) inference of properties from petrographic/petrological information, supplemented by palaeomagnetic databases; (iv) constrained modelling/inversion of magnetic sources; (v) direct inversions of measured or calculated vector and gradient tensor data for simple sources; (vi) retrospective inference of magnetisation of a mined deposit by comparing magnetic data acquired pre- and post-mining; (vii) combined analysis of magnetic and gravity anomalies using Poisson's theorem; (viii) using a controlled magnetic source to probe the susceptibility distribution of the subsurface; (ix) Helbig-type analysis of gridded vector components, gradient tensor elements, and tensor invariants; (x) methods based on reduction to the pole and related transforms; and (xi) remote in situ determination of NRM direction, total magnetisation direction and Koenigsberger ratio by deploying dual vector magnetometers or a single combined gradiometer/magnetometer to monitor local perturbation of natural geomagnetic variations, operating in base station mode within a magnetic anomaly of interest. Characterising the total and remanent magnetisations of sources is important for several reasons. Knowledge of total magnetisation is often critical for accurate determination of source geometry and position. Knowledge of magnetic properties such as magnetisation intensity and Koenigsberger ratio constrains the likely magnetic mineralogy (composition and grain size) of a source, which gives an indication of its geological nature. Determining the direction of a stable ancient remanence gives an indication of the age of magnetisation, which provides useful information about the geological history of the source and its environs.
NASA Astrophysics Data System (ADS)
Owens, P. R.; Libohova, Z.; Seybold, C. A.; Wills, S. A.; Peaslee, S.; Beaudette, D.; Lindbo, D. L.
2017-12-01
The measurement errors and spatial prediction uncertainties of soil properties in the modeling community are usually assessed against measured values when available. However, of equal importance is the assessment of the impacts of errors and uncertainty on cost-benefit analysis and risk assessments. Soil pH was selected as one of the most commonly measured soil properties used for liming recommendations. The objective of this study was to assess the error size from different sources and their implications with respect to management decisions. Error sources include measurement methods, laboratory sources, pedotransfer functions, database transactions, spatial aggregations, etc. Several databases of measured and predicted soil pH were used for this study, including the United States National Cooperative Soil Survey Characterization Database (NCSS-SCDB) and the US Soil Survey Geographic (SSURGO) Database. The distribution of errors among different sources, from measurement methods to spatial aggregation, showed a wide range of values. The greatest RMSE of 0.79 pH units was from spatial aggregation (SSURGO vs kriging), while the measurement methods had the lowest RMSE of 0.06 pH units. Assuming the order of data acquisition based on the transaction distance, i.e., from measurement method to spatial aggregation, the RMSE increased from 0.06 to 0.8 pH units, suggesting an "error propagation". This has major implications for practitioners and the modeling community. Most soil liming rate recommendations are based on 0.1 pH unit increments, while the desired soil pH level increments are based on 0.4 to 0.5 pH units. Thus, even when the measured and desired target soil pH are the same, most guidelines recommend 1 ton ha-1 lime, which translates into 111 ha-1 that the farmer has to factor into the cost-benefit analysis. However, this analysis needs to be based on uncertainty predictions (0.5-1.0 pH units) rather than measurement errors (0.1 pH units), which would translate into a 555-1,111 investment that needs to be assessed against the risk. The modeling community can benefit from such analysis; however, the error size and spatial distribution for global and regional predictions need to be assessed against the variability of other drivers and the impact on management decisions.
Renewable Energy Opportunities at Fort Campbell, Tennessee/Kentucky
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hand, James R.; Horner, Jacob A.; Kora, Angela R.
This document provides an overview of renewable resource potential at Fort Campbell, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and also on ground source heat pumps for heating and cooling buildings. The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment. The site visit to Fort Campbell took place on June 10, 2010.
Electrical source of pseudothermal light
NASA Astrophysics Data System (ADS)
Kuusela, Tom A.
2018-06-01
We describe a simple and compact electrical version of a pseudothermal light source. The source is based on electrical white noise whose spectral properties are tailored by analog filters. This signal is used to drive a light-emitting diode. The type of second-order coherence of the output light can be either Gaussian or Lorentzian, and the intensity distribution can be either Gaussian or non-Gaussian. The output light field is similar in all viewing angles, and thus, there is no need for a small aperture or optical fiber in temporal coherence analysis.
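The character of the tailored coherence can be checked directly from an intensity time series via the normalized second-order correlation. A short sketch, with a synthetic pseudothermal intensity standing in for a measurement:

```python
import numpy as np

def g2(intensity, max_lag):
    """Normalized second-order coherence g2(tau) = <I(t) I(t+tau)> / <I>^2."""
    mean_sq = intensity.mean() ** 2
    return np.array([
        np.mean(intensity[: len(intensity) - lag or None] * intensity[lag:]) / mean_sq
        for lag in range(max_lag)
    ])

# Synthetic pseudothermal intensity: slowly varying noise, exponentiated so
# the signal is positive and bunched.
rng = np.random.default_rng(8)
noise = 5 * np.convolve(rng.normal(size=20000), np.ones(50) / 50, mode="same")
intensity = np.exp(noise)
print(g2(intensity, 3))   # g2(0) > 1 indicates photon bunching
```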
An efficient algorithm for the retarded time equation for noise from rotating sources
NASA Astrophysics Data System (ADS)
Loiodice, S.; Drikakis, D.; Kokkalis, A.
2018-01-01
This study concerns modelling of noise emanating from rotating sources such as helicopter rotors. We present an accurate and efficient algorithm for the solution of the retarded time equation, which can be used both in subsonic and supersonic flow regimes. A novel approach for the search of the roots of the retarded time function was developed based on considerations of the kinematics of rotating sources and of the bifurcation analysis of the retarded time function. It is shown that the proposed algorithm is faster than the classical Newton and Brent methods, especially in the presence of sources rotating supersonically.
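For a subsonically moving source the retarded-time equation g(tau) = tau - t + |x_obs - x_s(tau)|/c = 0 has a single root, and a Newton iteration is the natural baseline against which faster algorithms are judged. A minimal sketch for a circularly moving point source (rotor parameters are illustrative):

```python
import numpy as np

c = 340.0                           # speed of sound, m/s
R, omega = 5.0, 20.0                # rotor radius (m), angular speed (rad/s)
x_obs = np.array([50.0, 0.0, 0.0])  # observer position

def source_pos(tau):
    return np.array([R * np.cos(omega * tau), R * np.sin(omega * tau), 0.0])

def retarded_time(t, tol=1e-12):
    """Solve g(tau) = tau - t + |x_obs - x_s(tau)|/c = 0 by Newton iteration
    (single root here because the source moves subsonically)."""
    tau = t - np.linalg.norm(x_obs) / c            # initial guess
    for _ in range(50):
        r_vec = x_obs - source_pos(tau)
        r = np.linalg.norm(r_vec)
        g = tau - t + r / c
        if abs(g) < tol:
            break
        # dg/dtau = 1 - (r_vec . v_s) / (r c), with v_s the source velocity
        v = R * omega * np.array([-np.sin(omega * tau), np.cos(omega * tau), 0.0])
        tau -= g / (1.0 - np.dot(r_vec, v) / (r * c))
    return tau

print(retarded_time(t=1.0))   # emission time for reception at t = 1.0 s
```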
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
An Android malware detection system based on machine learning
NASA Astrophysics Data System (ADS)
Wen, Long; Yu, Haiyang
2017-08-01
The Android smartphone, with its open source character and excellent performance, has attracted many users. However, the convenience of the Android platform has also motivated the development of malware. Traditional methods that detect malware based on signatures are unable to detect unknown applications. This article proposes a machine learning-based lightweight system that is capable of identifying malware on Android devices. In this system we extract features based on static analysis and dynamic analysis; a new feature selection approach based on principal component analysis (PCA) and Relief is then presented to decrease the dimensionality of the features. After that, a model is constructed with a support vector machine (SVM) for classification. Experimental results show that our system provides an effective method for Android malware detection.
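The described pipeline, feature extraction followed by PCA for dimensionality reduction and an SVM for classification, maps directly onto a scikit-learn pipeline. A minimal sketch with synthetic feature vectors (real feature extraction and the Relief step are omitted):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for static/dynamic features extracted from 200 APKs.
rng = np.random.default_rng(9)
X = rng.normal(size=(200, 120))
y = (X[:, :5].sum(axis=1) > 0).astype(int)   # 0 = benign, 1 = malware

model = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())   # detection accuracy estimate
```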
Recent advances in proteomics of cereals.
Bansal, Monika; Sharma, Madhu; Kanwar, Priyanka; Goyal, Aakash
Cereals contribute a major part of human nutrition and are considered an integral source of energy for human diets. With genomic databases already available for cereals such as rice, wheat, barley, and maize, the focus has now moved to proteome analysis. Proteomics studies involve developing suitable separation and purification protocols and appropriate databases, identifying protein functions, and confirming functional networks based on data already available from other sources. Tremendous progress has been made in the past decade in generating large data sets covering interactions among proteins, the protein composition of various organs and organelles, quantitative and qualitative analysis of proteins, and the characterization of their modulation during plant development and biotic and abiotic stresses. Proteomics platforms have been used to identify and improve our understanding of various metabolic pathways. This article gives a brief review of the efforts made by different research groups on comparative, descriptive and functional analysis of proteomics applications achieved in cereal science so far.
A supertree of early tetrapods.
Ruta, Marcello; Jeffery, Jonathan E; Coates, Michael I
2003-01-01
A genus-level supertree for early tetrapods is built using a matrix representation of 50 source trees. The analysis of all combined trees delivers a long-stemmed topology in which most taxonomic groups are assigned to the tetrapod stem. A second analysis, which excludes source trees superseded by more comprehensive studies, supports a deep phylogenetic split between lissamphibian and amniote total groups. Instances of spurious groups are rare in both analyses. The results of the pruned second analysis are mostly comparable with those of a recent, character-based and large-scale phylogeny of Palaeozoic tetrapods. Outstanding areas of disagreement include the branching sequence of lepospondyls and the content of the amniote crown group, in particular the placement of diadectomorphs as stem diapsids. Supertrees are unsurpassed in their ability to summarize relationship patterns from multiple independent topologies. Therefore, they might be used as a simple test of the degree of corroboration of nodes in the contributory analyses. However, we urge caution in using them as a replacement for character-based cladograms and for inferring macroevolutionary patterns. PMID:14667343
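Matrix representation encodes each clade of each source tree as a binary character, with '?' for taxa absent from that tree; the resulting matrix is then analyzed with parsimony. A toy sketch of the coding step (taxa and clades are invented):

```python
# Toy matrix-representation coding: each source-tree clade becomes a binary
# column (1 = inside the clade, 0 = outside, ? = taxon absent from the tree).
taxa = ["A", "B", "C", "D", "E"]
source_trees = [
    {"taxa": {"A", "B", "C", "D"}, "clades": [{"A", "B"}, {"A", "B", "C"}]},
    {"taxa": {"B", "C", "D", "E"}, "clades": [{"D", "E"}]},
]

matrix = {t: [] for t in taxa}
for tree in source_trees:
    for clade in tree["clades"]:
        for t in taxa:
            if t not in tree["taxa"]:
                matrix[t].append("?")       # taxon missing from this source tree
            else:
                matrix[t].append("1" if t in clade else "0")

for t in taxa:
    print(t, "".join(matrix[t]))            # rows of the MRP matrix
```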
NASA Technical Reports Server (NTRS)
Stoll, John C.
1995-01-01
The performance of an unaided attitude determination system based on GPS interferometry is examined using linear covariance analysis. The modelled system includes four GPS antennae onboard a gravity gradient stabilized spacecraft, specifically the Air Force's RADCAL satellite. The principal error sources are identified and modelled. The optimal system's sensitivities to these error sources are examined through an error budget and by varying system parameters. The effects of two satellite selection algorithms, Geometric and Attitude Dilution of Precision (GDOP and ADOP, respectively) are examined. The attitude performance of two optimal-suboptimal filters is also presented. Based on this analysis, the limiting factors in attitude accuracy are the knowledge of the relative antenna locations, the electrical path lengths from the antennae to the receiver, and the multipath environment. The performance of the system is found to be fairly insensitive to torque errors, orbital inclination, and the two satellite geometry figures-of-merit tested.
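The GDOP figure-of-merit used in the satellite-selection comparison follows directly from the observation geometry matrix. A compact sketch (the line-of-sight vectors are invented):

```python
import numpy as np

def gdop(unit_vectors):
    """Geometric dilution of precision from receiver-to-satellite unit
    line-of-sight vectors: sqrt(trace((A^T A)^-1)), with A = [u_i | -1]."""
    A = np.hstack([unit_vectors, -np.ones((len(unit_vectors), 1))])
    return float(np.sqrt(np.trace(np.linalg.inv(A.T @ A))))

# Four illustrative satellites, roughly spread across the sky.
u = np.array([[0.0, 0.0, 1.0],
              [0.8, 0.0, 0.6],
              [-0.4, 0.7, 0.6],
              [-0.4, -0.7, 0.6]])
u /= np.linalg.norm(u, axis=1, keepdims=True)
print(f"GDOP = {gdop(u):.2f}")   # lower means better geometry
```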
Creating system engineering products with executable models in a model-based engineering environment
NASA Astrophysics Data System (ADS)
Karban, Robert; Dekens, Frank G.; Herzig, Sebastian; Elaasar, Maged; Jankevičius, Nerijus
2016-08-01
Applying systems engineering across the life-cycle results in a number of products built from interdependent sources of information using different kinds of system level analysis. This paper focuses on leveraging the Executable System Engineering Method (ESEM) [1] [2], which automates requirements verification (e.g. power and mass budget margins and duration analysis of operational modes) using executable SysML [3] models. The particular value proposition is to integrate requirements, and executable behavior and performance models for certain types of system level analysis. The models are created with modeling patterns that involve structural, behavioral and parametric diagrams, and are managed by an open source Model Based Engineering Environment (named OpenMBEE [4]). This paper demonstrates how the ESEM is applied in conjunction with OpenMBEE to create key engineering products (e.g. operational concept document) for the Alignment and Phasing System (APS) within the Thirty Meter Telescope (TMT) project [5], which is under development by the TMT International Observatory (TIO) [5].
Sensitivity tests to define the source apportionment performance criteria in the DeltaSA tool
NASA Astrophysics Data System (ADS)
Pernigotti, Denise; Belis, Claudio A.
2017-04-01
Identification and quantification of the contributions of emission sources to a given area is a key task in the design of abatement strategies. Moreover, European member states are obliged to report this kind of information for zones where pollution levels exceed the limit values. At present, little is known about the performance and uncertainty of the variety of methodologies used for source apportionment and the comparability between the results of studies using different approaches. DeltaSA (SA Delta) is a tool developed by the EC-JRC to support particulate matter source apportionment modellers in the identification of sources (for factor analysis studies) and/or in the measurement of their performance. Source identification is performed by the tool by measuring the proximity of any user chemical profile to preloaded repository data (SPECIATE and SPECIEUROPE). The model performance criteria are based on standard statistical indexes calculated by comparing participants' source contribution estimates and their time series with preloaded reference data. The preloaded data refer to previous European SA intercomparison exercises: the first with real-world data (22 participants), the second with synthetic data (25 participants) and the last with real-world data, which was also extended to chemical transport models (38 receptor models and 4 CTMs). The references used for the model performance evaluation are 'true' values (predefined by JRC) for the synthetic exercise, while for the real-world intercomparisons they are calculated as the ensemble average of the participants' results. The candidates used for each source ensemble reference calculation were selected from participants' results based on a number of consistency checks plus the similarity of their chemical profiles to the measured repository data. The estimation of the ensemble reference uncertainty is crucial in order to evaluate the users' performances against it. For this reason, a sensitivity analysis of different methods to estimate the ensemble references' uncertainties was performed by re-analyzing the synthetic intercomparison dataset, the only one where 'true' reference and ensemble reference contributions were both present. DeltaSA is now available online and will be presented with a critical discussion of the sensitivity analysis on the ensemble reference uncertainty. In particular, the degree of mutual agreement among participants on the presence of a certain source should be taken into account. Moreover, the importance of synthetic intercomparisons for catching common receptor-model biases will also be stressed.
Yang, Liping; Mei, Kun; Liu, Xingmei; Wu, Laosheng; Zhang, Minghua; Xu, Jianming; Wang, Fan
2013-08-01
Water quality degradation in river systems has caused great concern all over the world. Identifying the spatial distribution and sources of water pollutants is the very first step for efficient water quality management. A set of water samples collected bimonthly at 12 monitoring sites in 2009 and 2010 were analyzed to determine the spatial distribution of critical parameters and to apportion the sources of pollutants in the Wen-Rui-Tang (WRT) river watershed, near the East China Sea. The 12 monitoring sites were divided into three administrative zones (urban, suburban, and rural) considering differences in land use and population density. Multivariate statistical methods [one-way analysis of variance, principal component analysis (PCA), and absolute principal component score-multiple linear regression (APCS-MLR) methods] were used to investigate the spatial distribution of water quality and to apportion the pollution sources. Results showed that most water quality parameters had no significant difference between the urban and suburban zones, whereas these two zones showed worse water quality than the rural zone. Based on PCA and APCS-MLR analysis, urban domestic sewage and commercial/service pollution, suburban domestic sewage along with fluorine point-source pollution, and agricultural nonpoint-source pollution with rural domestic sewage pollution were identified as the main pollution sources in the urban, suburban, and rural zones, respectively. Understanding the water pollution characteristics of different administrative zones could provide insights for effective water management policy-making, especially in areas spanning various administrative zones.
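The APCS-MLR chain, PCA on standardized concentrations, conversion to absolute scores via a hypothetical zero-concentration sample, then regression of each pollutant on the scores, can be sketched briefly (synthetic data; a real application adds factor rotation and careful component retention):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(10)
X = rng.normal(loc=10, scale=2, size=(144, 8))   # samples x quality parameters
Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardized concentrations

pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)

# Absolute principal component scores: subtract the scores of an artificial
# zero-concentration sample so contributions get a physical baseline.
z0 = (0 - X.mean(axis=0)) / X.std(axis=0)
apcs = scores - pca.transform(z0.reshape(1, -1))

# Regress one pollutant on the APCS to apportion its mean concentration.
reg = LinearRegression().fit(apcs, X[:, 0])
print(np.round(reg.coef_ * apcs.mean(axis=0), 3), round(reg.intercept_, 3))
```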
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty associated with the sources of information that are used as inputs to the different studies. The paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information, and this paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem, together with more reliable data sources over the coming years, will give more impetus to the debate on the global impact of maritime traffic on the environment, which currently rests only on the "consensus" estimates provided by IMO (2009).
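The bottom-up (activity-based) branch of the classification multiplies installed power, load factor, operating hours and an emission factor per operating mode; a tiny sketch with invented values:

```python
# Bottom-up ship emission estimate: E = P * LF * T * EF, summed over modes.
# All numbers below are invented for illustration.
modes = {
    #            power_kW, load_factor, hours, NOx_EF_g_per_kWh
    "cruise":    (10000, 0.80, 6000, 14.0),
    "maneuver":  (10000, 0.40,  300, 12.0),
    "hotelling": ( 2000, 0.60, 1500, 13.0),
}

total_tonnes = sum(
    p_kw * lf * hours * ef / 1e6      # grams -> tonnes
    for p_kw, lf, hours, ef in modes.values()
)
print(f"Annual NOx estimate for one vessel: {total_tonnes:.0f} t")
```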
Cross-industry Performance Modeling: Toward Cooperative Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reece, Wendy Jane; Blackman, Harold Stabler
One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.
NASA Astrophysics Data System (ADS)
Jin, Yan; Ye, Chen; Luo, Xiao; Yuan, Hui; Cheng, Changgui
2017-05-01
To improve the inclusion-removal performance of the tundish, a mathematical model was developed to simulate the flow field produced by an inner-swirl-type turbulence controller (ISTTC), in which six blades are arranged counterclockwise at an eccentric angle (θ). Based on the mathematical and water models, the inclusion-removal effect of the swirling flow field formed by the ISTTC was analyzed. The ISTTC was found to inhibit turbulence in the tundish more effectively than a traditional turbulence inhibitor (TI). As the blade eccentric angle (θ) of the ISTTC increased, the intensity of the swirling flow above it increased; the maximum rotation speed of the fluid in the swirling-flow band driven by the ISTTC (θ = 45°) was 25 rpm. A force analysis of inclusions in the swirling flow produced by the ISTTC showed that the removal of medium-size inclusions is attributable to the centripetal force (Fct) of the swirling flow, whereas the removal of small inclusions depends more on the ISTTC's superior turbulence-damping behavior.
Monitoring Seismo-volcanic and Infrasonic Signals at Volcanoes: Mt. Etna Case Study
NASA Astrophysics Data System (ADS)
Cannata, Andrea; Di Grazia, Giuseppe; Aliotta, Marco; Cassisi, Carmelo; Montalto, Placido; Patanè, Domenico
2013-11-01
Volcanoes generate a broad range of seismo-volcanic and infrasonic signals, whose features and variations are often closely related to volcanic activity. The study of these signals is hence very useful in the monitoring and investigation of volcano dynamics. The analysis of seismo-volcanic and infrasonic signals requires specifically developed techniques due to their unique characteristics, which are generally quite distinct compared with tectonic and volcano-tectonic earthquakes. In this work, we describe analysis methods used to detect and locate seismo-volcanic and infrasonic signals at Mt. Etna. Volcanic tremor sources are located using a method based on spatial seismic amplitude distribution, assuming propagation in a homogeneous medium. The tremor source is found by calculating the goodness of the linear regression fit (R²) of the log-linearized equation of the seismic amplitude decay with distance. The location method for long-period events is based on the joint computation of semblance and R² values, and the location method of very long-period events is based on the application of radial semblance. Infrasonic events and tremor are located by semblance-brightness- and semblance-based methods, respectively. The techniques described here can also be applied to other volcanoes and do not require particular network geometries (such as arrays) but rather simple sparse networks. Using the source locations of all the considered signals, we were able to reconstruct the shallow plumbing system (above sea level) during 2011.
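The amplitude-decay location scheme described above lends itself to a short sketch: for each candidate source node on a grid, regress log amplitude against log hypocentral distance and keep the node that maximizes R². A minimal sketch assuming a power-law decay A = A0·r^(-b) in a homogeneous medium; the station coordinates and amplitudes are invented placeholders, not data from the study.

```python
import numpy as np

def locate_tremor(stations, amps, grid):
    """Grid search: pick the node maximizing R^2 of the log-linearized
    amplitude-decay regression log A = log A0 - b * log r."""
    best_r2, best_node = -np.inf, None
    log_a = np.log10(amps)
    for node in grid:
        r = np.linalg.norm(stations - node, axis=1)  # hypocentral distances
        x = np.log10(r)
        slope, intercept = np.polyfit(x, log_a, 1)   # least-squares line
        pred = slope * x + intercept
        ss_res = np.sum((log_a - pred) ** 2)
        ss_tot = np.sum((log_a - log_a.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        if r2 > best_r2:
            best_r2, best_node = r2, node
    return best_node, best_r2

# hypothetical example: 5 stations (x, y, z in km) and RMS tremor amplitudes
stations = np.array([[0, 0, 0], [4, 1, 0], [1, 5, 0], [-3, 2, 0], [2, -4, 0]])
amps = np.array([3.2, 1.1, 0.9, 1.4, 1.2])
grid = np.array([[x, y, -z] for x in range(-5, 6)
                 for y in range(-5, 6) for z in range(1, 6)])
node, r2 = locate_tremor(stations, amps, grid)
print(node, round(r2, 3))
```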
Common source cascode amplifiers for integrating IR-FPA applications
NASA Technical Reports Server (NTRS)
Woolaway, James T.; Young, Erick T.
1989-01-01
Space-based astronomical infrared measurements place stringent performance requirements on the infrared detector arrays and their associated readout circuitry. To evaluate the usefulness of commercial CMOS technology for astronomical readout applications, a theoretical and experimental evaluation was performed on source follower and common-source cascode integrating amplifiers. Theoretical analysis indicates that for conditions where the input amplifier integration capacitance is limited by the detector's capacitance, the input-referred rms noise electrons of each amplifier should be equivalent. For conditions of input-gate-limited capacitance, the source follower should provide lower noise. Measurements of test circuits containing both source follower and common-source cascode circuits showed substantially lower input-referred noise for the common-source cascode input circuits. Noise measurements yielded 4.8 input-referred rms noise electrons for an 8.5 minute integration. The signal and noise gain of the common-source cascode amplifier appears to offer substantial advantages in achieving predicted noise levels.
The Commercial Open Source Business Model
NASA Astrophysics Data System (ADS)
Riehle, Dirk
Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.
Zhang, Wei; Zhang, Shucai; Wan, Chao; Yue, Dapan; Ye, Youbin; Wang, Xuejun
2008-06-01
Diagnostic ratios and multivariate analysis were utilized to apportion polycyclic aromatic hydrocarbon (PAH) sources for road runoff, road dust, rain and canopy throughfall based on samples collected in an urban area of Beijing, China. Three sampling sites representing a vehicle lane, a bicycle lane and a branch road were selected. For road runoff and road dust, vehicular emission and coal combustion were identified as the major sources, and the source contributions varied among the sampling sites. For rain, three principal components were apportioned, representing coal/oil combustion (54%), vehicular emission (34%) and coking (12%). For canopy throughfall, vehicular emission (56%), coal combustion (30%) and oil combustion (14%) were identified as the major sources. Overall, the PAH sources for road runoff mainly reflected those for road dust. Despite site-specific sources, the findings at the study area provide a general picture of PAH sources for the road runoff system in the urban area of Beijing.
Measurements of the thermal neutron flux for an accelerator-based photoneutron source.
Taheri, Ali; Pazirandeh, Ali
2016-12-01
Access to an appropriate neutron source is one of the most demanding requirements for neutron studies. This is especially important in laboratory and clinical applications, which need more compact and accessible sources. The best-known neutron sources are fission reactors and natural isotopes, but there is increasing interest in accelerator-based neutron sources because of their advantages. In this paper, we present a photoneutron source prototype designed and fabricated for use in a range of neutron research, including in-laboratory neutron activation analysis and neutron imaging, as well as preliminary studies in boron neutron capture therapy (BNCT). A series of experimental tests was conducted to examine the intensity and quality of the neutron field produced by this source. Monte Carlo simulations were also used to provide a more detailed evaluation of the neutron spectrum and to determine the accuracy of the experiments. The experiments demonstrated a thermal neutron flux on the order of 10⁷ n/(cm²·s), while simulations confirmed this flux and showed a neutron spectrum with a sharp peak in the thermal energy region. According to the results, about 60% of the produced neutrons are in the thermal-to-epithermal range.
[GIS and scenario analysis aid to water pollution control planning of river basin].
Wang, Shao-ping; Cheng, Sheng-tong; Jia, Hai-feng; Ou, Zhi-dan; Tan, Bin
2004-07-01
The forward and backward algorithms for watershed water pollution control planning are summarized in this paper, together with their advantages and shortcomings. Spatial databases of water environmental function regions, pollution sources, monitoring sections and sewer outlets were built on the ARCGIS 8.1 platform in a case study of the Ganjiang valley, Jiangxi province. Based on the principles of the forward algorithm, four scenarios were designed for watershed pollution control. Under these scenarios, ten sets of planning schemes were generated to implement cascaded pollution source control. The investment costs of sewage treatment for these schemes were estimated by means of a series of cost-effectiveness functions; with pollution source prediction, the water quality under each planning scheme was modeled with a CSTR model. The modeled results of the different planning schemes were visualized through GIS to aid decision making. Taking investment cost and water quality attainment as decision criteria, and based on an analysis of the economically bearable capacity for water pollution control in the Ganjiang river basin, two optimized schemes were proposed. The research shows that GIS technology and scenario analysis can provide good guidance on the synthesis, integrity and sustainability aspects of river basin water quality planning.
NASA Astrophysics Data System (ADS)
Ni, X. Y.; Huang, H.; Du, W. P.
2017-02-01
The PM2.5 problem is proving to be a major public crisis of great public concern that requires an urgent response. Understanding and predicting PM2.5 from the perspective of atmospheric dynamic theory remain limited owing to the complexity of PM2.5 formation and development. In this paper, we attempt relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed, and other pollutant concentrations, including CO, NO2, SO2 and PM10) and social media data (microblog data) is proposed, based on multivariate statistical analysis. The study found that, among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show strong correlation with PM2.5 concentrations. The correlation analysis was further investigated with a machine learning model, the back-propagation neural network (BPNN), which was found to perform better in correlation mining. Finally, an autoregressive integrated moving average (ARIMA) time series model was applied to explore short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study helps to realize real-time monitoring, analysis and pre-warning of PM2.5, and it also broadens the application of big data and multi-source data mining methods.
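As a rough illustration of the short-term forecasting step, the sketch below fits an ARIMA model to a daily PM2.5 series with statsmodels. The order (1, 1, 1) and the synthetic series are placeholders, not the paper's fitted model.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# placeholder daily PM2.5 series (ug/m^3); the study used observed Beijing data
pm25 = 80 + np.cumsum(rng.normal(0, 5, size=200))

model = ARIMA(pm25, order=(1, 1, 1))  # illustrative (p, d, q), not the paper's choice
fit = model.fit()
forecast = fit.forecast(steps=7)      # one-week-ahead prediction
print(forecast)
```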
Resource assessment in Western Australia using a geographic information system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, A.
1991-03-01
Three study areas in Western Australia covering from 77,000 to 425,000 mi² were examined for oil and gas potential using a geographic information system (GIS). A database of source rock thickness, source richness, maturity, and expulsion efficiency was created for each interval. The GIS (Arc/Info) was used to create, manage, and analyze data for each interval in each study area. Source rock thickness and source richness data were added to the database from digitized data. Maturity information was generated with Arc/Info by combining geochemical and depth-to-structure data. Expulsion efficiency data were created by a system-level Arc/Info program. After the database for each interval was built, the GIS was used to analyze the geologic data. The analysis consisted of converting each data layer into a lattice (grid) and using the lattice operations in Arc/Info (addition, multiplication, division, and subtraction) to combine the data layers. Additional techniques for combining and selecting data were developed using Arc/Info system-level programs. The procedure for performing the analyses was written as macros in Arc/Info's macro programming language (AML). The results of the analysis were estimates of oil and gas volumes for each interval. The resultant volumes were produced in tabular form for reports and in cartographic form for presentation. The geographic information system provided several clear advantages over traditional methods of resource assessment, including simplified management, updating, and editing of geologic data.
Airport Surface Delays and Causes: A Preliminary Analysis
NASA Technical Reports Server (NTRS)
Chin, David K.; Goldberg, Jay; Tang, Tammy
1997-01-01
This report summarizes FAA Program Analysis and Operations Research Service (ASD-400)/Lockheed Martin activities and findings related to airport surface delays and causes, in support of NASA Langley Research Center's Terminal Area Productivity (TAP) Program. The activities described in this report were initiated in June 1995. A preliminary report was published on September 30, 1995. The final report incorporates data collection forms filled out by traffic managers, other FAA staff, and an airline for the New York City area, some updates, data previously requested from various sources to support this analysis, and further quantification and documentation than in the preliminary report. This final report is based on data available as of April 12, 1996. This report incorporates data obtained from review and analysis of databases and literature, discussions/interviews with engineers, air-traffic staff, other FAA technical personnel, and airline staff, site visits, and a survey on surface delays and causes. It includes analysis of delay statistics; preliminary findings and conclusions on surface movement, surface delay sources and causes, runway occupancy time (ROT), and airport characteristics impacting surface operations and delays; and site-specific data on the New York City area airports, which are the focus airports for this report.
Chapter 11. Community analysis-based methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Y.; Wu, C.H.; Andersen, G.L.
2010-05-01
Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
Chemometric analysis of minerals in gluten-free products.
Gliszczyńska-Świgło, Anna; Klimczak, Inga; Rybicka, Iga
2018-06-01
Numerous studies indicate mineral deficiencies in people on a gluten-free (GF) diet. These deficiencies may indicate that GF products are a less valuable source of minerals than gluten-containing products. In this study, the nutritional quality of 50 GF products is discussed, taking into account the nutritional requirements for minerals expressed as a percentage of the recommended daily allowance (%RDA) or percentage of adequate intake (%AI) for a model celiac patient. The elements analyzed were calcium, potassium, magnesium, sodium, copper, iron, manganese, and zinc. Analysis of %RDA or %AI was performed using principal component analysis (PCA) and hierarchical cluster analysis (HCA). Using PCA, differentiation was possible between products based on rice, corn, potato and GF wheat starch and those based on buckwheat, chickpea, millet, oats, amaranth, teff, quinoa, chestnut, and acorn. In the HCA, four clusters were created. The main criterion determining the adherence of a sample to a cluster was the content of all minerals included in the HCA (K, Mg, Cu, Fe, Mn); however, only the Mn content differentiated the four groups formed. GF products made of buckwheat, chickpea, millet, oats, amaranth, teff, quinoa, chestnut, and acorn are a better source of minerals than those based on other GF raw materials, as confirmed by PCA and HCA. © 2017 Society of Chemical Industry.
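A minimal sketch of the chemometric workflow described above (PCA plus hierarchical clustering) applied to a %RDA/%AI matrix. The toy data, Ward linkage, and the four-cluster cut are assumptions that only loosely mirror the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# rows: GF products, columns: %RDA/%AI for K, Mg, Cu, Fe, Mn (placeholder values)
X = rng.uniform(0, 60, size=(50, 5))

Xs = StandardScaler().fit_transform(X)          # autoscale before chemometrics
scores = PCA(n_components=2).fit_transform(Xs)  # product map in PC space

Z = linkage(Xs, method="ward")                  # HCA on the same matrix
clusters = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 clusters
print(scores[:3], clusters[:10])
```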
Noncoherent Detection of Coherent Optical Heterodyne Signals Corrupted by Laser Phase Noise
1991-03-01
...replicated speech at the receiving end through the photoelectric effect. Bell's photophone was the first practical use of light as a transmission... A decision threshold analysis illustrates which noise source dominates system performance. An analytical expression representing the effect of laser phase noise on system performance is derived based on this analysis.
WebScope: A New Tool for Fusion Data Analysis and Visualization
NASA Astrophysics Data System (ADS)
Yang, Fei; Dang, Ningning; Xiao, Bingjia
2010-04-01
A visualization tool was developed for the web browser, based on Java applets embedded in HTML pages, in order to provide worldwide access to the EAST experimental data. It can display data from various trees on different servers in a single panel. With WebScope, it is easier to make comparisons between different data sources and to perform simple calculations over them.
Assessment of Ecological Risk of Heavy Metal Contamination in Coastal Municipalities of Montenegro
Mugoša, Boban; Đurović, Dijana; Nedović-Vuković, Mirjana; Barjaktarović-Labović, Snežana; Vrvić, Miroslav
2016-01-01
Assessment of heavy metal concentrations in soil samples from urban parks and playgrounds is very important for the evaluation of potential risks for residents, especially children. Until recently, there has been very little data about urban park pollution in Montenegro. To evaluate the sources of potential contamination and the concentrations of heavy metals, soil samples from coastal urban parks and kindergartens of Montenegro were collected. Based on the heavy metal concentrations, multivariate analysis combined with geochemical approaches showed that soil samples in the coastal areas of Montenegro had mean Pb and Cd concentrations more than two times higher than the respective background values. Based on principal component analysis (PCA), soil pollution with Pb, Cd, Cu, and Zn is contributed by anthropogenic sources, whereas Cr in the surface soils was primarily derived from natural sources. Calculation of different ecological contamination factors showed that Cd is the primary contributor to the ecological risk index (RI) and originates from anthropogenic, industrial, and urbanization sources. These data provide evidence of soil pollution in the coastal municipalities of Montenegro. Special attention should be paid to this problem in order to continue further research and to consider possible ways of remediating the sites where contamination has been observed. PMID:27043601
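For reference, ecological risk indices of this kind are commonly computed with Hakanson's scheme: contamination factor Cf = C_sample / C_background, single-metal risk Er = Tr · Cf with a toxic-response factor Tr, and RI = ΣEr. A minimal sketch with illustrative numbers; the paper's actual background values and measured concentrations are not reproduced here.

```python
# Hakanson-style ecological risk index (RI); all numbers are illustrative.
toxic_response = {"Pb": 5, "Cd": 30, "Cu": 5, "Zn": 1, "Cr": 2}  # standard Tr factors
background = {"Pb": 20.0, "Cd": 0.3, "Cu": 30.0, "Zn": 70.0, "Cr": 60.0}  # mg/kg, assumed
sample = {"Pb": 45.0, "Cd": 0.7, "Cu": 35.0, "Zn": 90.0, "Cr": 40.0}      # mg/kg, assumed

er = {m: toxic_response[m] * sample[m] / background[m] for m in sample}  # Er = Tr * Cf
ri = sum(er.values())                                                    # RI = sum of Er
print({m: round(v, 1) for m, v in er.items()}, "RI =", round(ri, 1))
```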
NASA Astrophysics Data System (ADS)
Čufar, Aljaž; Batistoni, Paola; Conroy, Sean; Ghani, Zamir; Lengar, Igor; Milocco, Alberto; Packer, Lee; Pillon, Mario; Popovichev, Sergey; Snoj, Luka; JET Contributors
2017-03-01
At the Joint European Torus (JET), the ex-vessel fission chambers and in-vessel activation detectors are used as the neutron production rate and neutron yield monitors, respectively. In order to ensure that these detectors produce accurate measurements, they need to be experimentally calibrated. A new calibration of the neutron detectors to 14 MeV neutrons, resulting from deuterium-tritium (DT) plasmas, is planned at JET using a compact accelerator-based neutron generator (NG) in which a D/T beam impinges on a solid target containing T/D, producing neutrons by DT fusion reactions. This paper presents the analysis that was performed to model the neutron source characteristics in terms of energy spectrum, angle-energy distribution and the effect of the neutron generator geometry. Different codes capable of simulating accelerator-based DT neutron sources are compared, and sensitivities to uncertainties in the generator's internal structure are analysed. The analysis was performed to support preparation for the experimental measurements carried out to characterize the NG as a calibration source. Further extensive neutronics analyses, performed with this model of the NG, will be needed to support the neutron calibration experiments and to take into account various differences between the calibration experiment and experiments using the plasma as a source of neutrons.
Research on effects of baffle position in an integrating sphere on the luminous flux measurement
NASA Astrophysics Data System (ADS)
Lin, Fangsheng; Li, Tiecheng; Yin, Dejin; Lai, Lei; Xia, Ming
2016-09-01
In the field of optical metrology, luminous flux is an important index characterizing the quality of an electric light source. Currently, most luminous flux measurements are based on the integrating sphere method, so the measurement accuracy of the integrating sphere is the key factor. Many factors affect the measurement accuracy, such as the coating, the power and the position of the light source; the baffle, a key part of the integrating sphere, also has an important effect on the measurement results. This paper analyzes in detail the principle of an ideal integrating sphere. We use a moving rail to change the relative position of the baffle and the light source inside the sphere. Luminous flux values measured at different distances between the light source and the baffle were obtained experimentally and used to analyze the effect of baffle position on the measurement. By theoretical calculation, computer simulation and experiment, we obtain the optimum baffle position for luminous flux measurements. Based on an error analysis of the whole luminous flux measurement, we develop methods and apparatus to improve the accuracy and reliability of luminous flux measurement. This makes our work of unifying and transferring the luminous flux unit in East China more accurate and provides effective support for our traceability system.
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An... authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds... best-practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
Analysis of spectrally resolved autofluorescence images by support vector machines
NASA Astrophysics Data System (ADS)
Mateasik, A.; Chorvat, D.; Chorvatova, A.
2013-02-01
Spectral analysis of autofluorescence images of isolated cardiac cells was performed to evaluate and classify the metabolic state of the cells with respect to their responses to metabolic modulators. The classification used a machine learning approach based on a support vector machine, with a set of features calculated automatically from the recorded spectral profiles of the autofluorescence images. This classification method was compared with the classical approach, in which the individual spectral components contributing to cell autofluorescence are estimated by spectral analysis, namely by blind source separation using non-negative matrix factorization. Comparison of the two methods showed that machine learning can effectively classify spectrally resolved autofluorescence images without detailed knowledge of the sources of autofluorescence and their spectral properties.
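A minimal sketch of the classification step: train a support vector machine on feature vectors derived from spectral profiles and score it by cross-validation. The feature extraction, dimensions, and labels are placeholders, not the study's pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# placeholder features computed from spectral autofluorescence profiles
# (e.g. band intensities, ratios, peak positions); labels = metabolic state
X = rng.normal(size=(120, 16))
y = rng.integers(0, 2, size=120)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```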
ATLAS tile calorimeter cesium calibration control and analysis software
NASA Astrophysics Data System (ADS)
Solovyanov, O.; Solodkov, A.; Starchenko, E.; Karyukhin, A.; Isaev, A.; Shalanda, N.
2008-07-01
An online control system to calibrate and monitor the ATLAS barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components such as DDC (DCS to DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, is used to handle all the calibration and monitoring processes, from the hardware level to final data storage, including various abnormal situations. A Qt-based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. The performance of the system and first experience from the ATLAS pit are presented.
Advanced microgrid design and analysis for forward operating bases
NASA Astrophysics Data System (ADS)
Reasoner, Jonathan
This thesis takes a holistic approach to creating an improved electric power generation system for a future forward operating base (FOB) through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand-side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER™ discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges associated with incorporating a design that relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable-energy-powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid, small-scale power grid that utilizes high penetration levels of renewable energy.
NASA Astrophysics Data System (ADS)
Ozer, Demet; Köse, Dursun A.; Şahin, Onur; Oztas, Nursen Altuntas
2017-08-01
New metal-organic framework materials based on boric acid are reported herein. Sodium- and boron-containing metal-organic frameworks were synthesized by a one-pot self-assembly reaction in the presence of trimesic acid and terephthalic acid in water/ethanol solution. Boric acid is a relatively cheap boron source, and the boric acid-mediated metal-organic frameworks were prepared under mild conditions compared with other boron-source-based metal-organic frameworks. The synthesized compounds were characterized by FT-IR, p-XRD, TGA/DTA, elemental analysis, 13C-MAS NMR, 11B-NMR and single-crystal measurements. The molecular formulas of the compounds were estimated as C18H33B2Na5O28 and C8H24B2Na2O17 according to the structural analysis. The obtained complexes were thermally stable. The surface properties of the inorganic polymer complexes were investigated by BET analyses, and the hydrogen storage properties of the compounds were also calculated.
Using Self-Efficacy as a Construct for Evaluating Science and Mathematics Methods Courses
NASA Astrophysics Data System (ADS)
Brand, Brenda R.; Wilkins, Jesse L. M.
2007-04-01
The focus of this study was elementary preservice teachers’ development as effective teachers of science and mathematics as influenced by their participation in elementary science and mathematics methods courses. Preservice teachers’ reports of factors that influenced their perception of their teaching abilities were analyzed according to Bandura’s (1994) 4 sources of efficacy: mastery experiences, vicarious experiences, social persuasion, and stress reduction. This investigation allowed the researchers to evaluate the courses based on these sources. The analysis indicated all 4 sources influenced preservice teachers’ teaching self-efficacy beliefs, with mastery experiences considered the most influential. Embedded within discussions of mastery experiences were references to the other sources of efficacy, which suggest an interrelationship between mastery experiences and the other sources.
Forbes, Thomas P.; Degertekin, F. Levent; Fedorov, Andrei G.
2010-01-01
Electrochemistry and ion transport in a planar array of mechanically-driven, droplet-based ion sources are investigated using an approximate time scale analysis and in-depth computational simulations. The ion source is modeled as a controlled-current electrolytic cell, in which the piezoelectric transducer electrode, which mechanically drives the charged droplet generation using ultrasonic atomization, also acts as the oxidizing/corroding anode (positive mode). The interplay between advective and diffusive ion transport of electrochemically generated ions is analyzed as a function of the transducer duty cycle and electrode location. A time scale analysis of the relative importance of advective vs. diffusive ion transport provides valuable insight into optimality, from the ionization prospective, of alternative design and operation modes of the ion source operation. A computational model based on the solution of time-averaged, quasi-steady advection-diffusion equations for electroactive species transport is used to substantiate the conclusions of the time scale analysis. The results show that electrochemical ion generation at the piezoelectric transducer electrodes located at the back-side of the ion source reservoir results in poor ionization efficiency due to insufficient time for the charged analyte to diffuse away from the electrode surface to the ejection location, especially at near 100% duty cycle operation. Reducing the duty cycle of droplet/analyte ejection increases the analyte residence time and, in turn, improves ionization efficiency, but at an expense of the reduced device throughput. For applications where this is undesirable, i.e., multiplexed and disposable device configurations, an alternative electrode location is incorporated. By moving the charging electrode to the nozzle surface, the diffusion length scale is greatly reduced, drastically improving ionization efficiency. The ionization efficiency of all operating conditions considered is expressed as a function of the dimensionless Peclet number, which defines the relative effect of advection as compared to diffusion. This analysis is general enough to elucidate an important role of electrochemistry in ionization efficiency of any arrayed ion sources, be they mechanically-driven or electrosprays, and is vital for determining optimal design and operation conditions. PMID:20607111
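The competition the paper quantifies can be summarized by the dimensionless Peclet number, Pe = uL/D, the ratio of advective to diffusive transport. A back-of-the-envelope sketch with invented order-of-magnitude values for the velocity scale, electrode-to-nozzle length scale, and analyte diffusivity:

```python
# Peclet number Pe = u * L / D: advection vs. diffusion of electrogenerated ions.
# All values below are invented order-of-magnitude placeholders.
u = 1e-3     # effective advective velocity scale, m/s (duty-cycle dependent)
L = 1e-4     # distance from charging electrode to ejection site, m
D = 1e-9     # typical small-ion diffusivity in water, m^2/s

Pe = u * L / D
print(f"Pe = {Pe:.0f}")  # Pe >> 1: advection dominates, so analyte may be
                         # ejected before diffusing away from the electrode
```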
Analysis of Ground Motion from An Underground Chemical Explosion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pitarka, Arben; Mellors, Robert J.; Walter, William R.
In this paper we investigate the excitation and propagation of far-field seismic waves from the 905 kg trinitrotoluene-equivalent underground chemical explosion SPE-3, recorded during the Source Physics Experiment (SPE) at the Nevada National Security Site. The recorded far-field ground motion at short and long distances is characterized by substantial shear-wave energy and large azimuthal variations in P- and S-wave amplitudes. The shear waves observed on the transverse component of sensors at epicentral distances <50 m suggest they were generated at or very near the source. The relative amplitude of the shear waves grows as the waves propagate away from the source. We analyze and model the shear-wave excitation during the explosion in the 0.01–10 Hz frequency range, at epicentral distances of up to 1 km. We used two simulation techniques. One is based on the empirical isotropic Mueller–Murphy (MM) (Mueller and Murphy, 1971) nuclear explosion source model and 3D anelastic wave propagation modeling. The second uses a physics-based approach that couples hydrodynamic modeling of the chemical explosion source with anelastic wave propagation modeling. Comparisons with recorded data show the MM source model overestimates the SPE-3 far-field ground motion by an average factor of 4. The observations show that shear waves with substantial high-frequency energy were generated at the source. However, to match the observations, additional shear waves from scattering, including surface topography and heterogeneous shallow structure, contributed to the amplification of far-field shear motion. Comparisons between empirically based isotropic and physics-based anisotropic source models suggest that both wave-scattering effects and near-field nonlinear effects are needed to explain the amplitude and irregular radiation pattern of shear motion observed during the SPE-3 explosion.
Zvereva, Alexandra; Kamp, Florian; Schlattl, Helmut; Zankl, Maria; Parodi, Katia
2018-05-17
Variance-based sensitivity analysis (SA) is described and applied to the radiation dosimetry model proposed by the Committee on Medical Internal Radiation Dose (MIRD) for organ-level absorbed dose calculations in nuclear medicine. The uncertainties in the dose coefficients thus calculated are also evaluated. A Monte Carlo approach was used to compute first-order and total-effect SA indices, which rank the input factors according to their influence on the uncertainty in the output organ doses. These methods were applied to the radiopharmaceutical (S)-4-(3-¹⁸F-fluoropropyl)-L-glutamic acid (¹⁸F-FSPG) as an example. Since ¹⁸F-FSPG has 11 notable source regions, a 22-dimensional model was considered here, where 11 input factors are the time-integrated activity coefficients (TIACs) in the source regions and 11 input factors correspond to the sets of the specific absorbed fractions (SAFs) employed in the dose calculation. The SA was restricted to the foregoing 22 input factors. The distributions of the input factors were built based on TIACs of five individuals to whom the radiopharmaceutical ¹⁸F-FSPG was administered and six anatomical models, representing two reference, two overweight, and two slim individuals. The self-absorption SAFs were mass-scaled to correspond to the reference organ masses. The estimated relative uncertainties were in the range 10%-30%, with a minimum and a maximum for the absorbed dose coefficients of the urinary bladder wall and the heart wall, respectively. The applied global variance-based SA enabled us to identify the input factors that have the highest influence on the uncertainty in the organ doses. With the applied mass-scaling of the self-absorption SAFs, these factors included the TIACs for absorbed dose coefficients in the source regions and the SAFs from blood as source region for absorbed dose coefficients in highly vascularized target regions. For some combinations of proximal target and source regions, the corresponding cross-fire SAFs were found to have an impact. Global variance-based SA has been applied for the first time to the MIRD schema for internal dose calculation. Our findings suggest that uncertainties in computed organ doses can be substantially reduced by an accurate determination of TIACs in the source regions, accompanied by the estimation of individual source region masses, along with the usage of an appropriate blood distribution in a patient's body and, in a few cases, the cross-fire SAFs from proximal source regions. © 2018 American Association of Physicists in Medicine.
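A minimal sketch of Monte Carlo estimation of first-order and total-effect Sobol indices with the SALib package, on a toy 3-factor stand-in for the 22-dimensional dose model. The problem definition, bounds, and model function are placeholders, not the paper's dosimetry model.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# toy stand-in: 3 inputs instead of the 22 TIAC/SAF factors in the paper
problem = {
    "num_vars": 3,
    "names": ["tiac_1", "saf_1", "saf_2"],
    "bounds": [[0.5, 1.5], [0.8, 1.2], [0.9, 1.1]],
}

X = saltelli.sample(problem, 1024)            # Saltelli sampling scheme
Y = X[:, 0] * X[:, 1] + 0.1 * X[:, 2] ** 2    # placeholder "dose" model

Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])                     # first-order and total-effect indices
```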
Quality Analysis of Open Street Map Data
NASA Astrophysics Data System (ADS)
Wang, M.; Li, Q.; Hu, Q.; Zhou, M.
2013-05-01
Crowd-sourced geographic data are open geographic data contributed by large numbers of non-professionals and provided to the public. Typical crowd-sourced geographic data include GPS track data such as OpenStreetMap, collaborative map data such as Wikimapia, social websites such as Twitter and Facebook, POIs tagged by Jiepang users, and so on. After processing, these data provide canonical geographic information for the public. Compared with conventional geographic data collection and update methods, crowd-sourced geographic data from non-professionals have the characteristics, or advantages, of large data volume, high currency, abundant information and low cost, and they have become a research hotspot in international geographic information science in recent years. Large-volume, highly current crowd-sourced geographic data provide a new solution for geospatial database updating, although the quality problems of data obtained from non-professionals must first be solved. In this paper, a quality analysis model for OpenStreetMap crowd-sourced geographic data is proposed. Firstly, a quality analysis framework is designed based on an analysis of the data characteristics of OSM data. Secondly, a quality assessment model for OSM data is presented, based on three quality elements: completeness, thematic accuracy and positional accuracy. Finally, taking the OSM data of Wuhan as an example, the paper analyzes and assesses the quality of OSM data against the 2011 version of a navigation map as reference. The results show that the high-level roads and urban traffic network of the OSM data have high positional accuracy and completeness, so these OSM data can be used to update urban road network databases.
Marketing Need-Based Financial Aid Programs: An Institutional Case Study
ERIC Educational Resources Information Center
Knight, Mary Beth
2010-01-01
Colleges and universities represent one of the most utilized sources of need-based financial aid information for students and families, and yet most research in access marketing is focused at the national and state levels. There is sparse published information about the effects of financial aid marketing observed through quantitative analysis, in…
Discrimination of particulate matter emission sources using stochastic methods
NASA Astrophysics Data System (ADS)
Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek
2016-12-01
Particulate matter (PM) is one of the criteria pollutants that has been determined to be harmful to public health and the environment. For this reason, the ability to recognize its emission sources is very important. A number of measurement methods allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All this information is useful for establishing a link between the dust found in the air, its emission sources, and its influence on humans and the environment. However, these methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method based on continuous measurements of PM concentration with a relatively cheap instrument and stochastic analysis of the obtained data. The stochastic analysis focuses on the temporal variation of PM concentration and involves two steps: (1) recognition of the category of distribution for the data, i.e. stable or in the domain of attraction of a stable distribution, and (2) finding the best-matching distribution among the Gaussian, stable and normal-inverse Gaussian (NIG) distributions. We examined six PM emission sources, associated with material processing in an industrial environment, namely machining and welding of aluminum, forged carbon steel and plastic with various tools. As shown by the obtained results, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factor responsible for the differences detectable with our method was the type of material processing and the tool applied. When different materials were processed with the same tool, distinguishing the emission sources was difficult. For successful discrimination, it was crucial to consider size-segregated mass fraction concentrations. In our opinion, the presented approach is very promising and deserves further study and development.
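A minimal sketch of step (2): select the best-matching distribution for PM concentration increments by maximum log-likelihood among Gaussian, alpha-stable and NIG candidates using scipy.stats. The data are synthetic placeholders, and fitting levy_stable can be slow for large samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# placeholder: increments of a PM concentration time series
x = np.diff(rng.normal(50, 4, size=300))

candidates = {
    "gaussian": stats.norm,
    "stable": stats.levy_stable,      # ML fitting of stable laws is slow
    "nig": stats.norminvgauss,
}
loglik = {}
for name, dist in candidates.items():
    params = dist.fit(x)                          # maximum-likelihood fit
    loglik[name] = np.sum(dist.logpdf(x, *params))
print(max(loglik, key=loglik.get), loglik)        # best-matching candidate
```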
Sequence-based analysis of the microbial composition of water kefir from multiple sources.
Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D
2013-11-01
Water kefir is a water-sucrose-based beverage, fermented by a symbiosis of bacteria and yeast to produce a final product that is lightly carbonated, acidic and that has a low alcohol percentage. The microorganisms present in water kefir are introduced via water kefir grains, which consist of a polysaccharide matrix in which the microorganisms are embedded. We aimed to provide a comprehensive sequencing-based analysis of the bacterial population of water kefir beverages and grains, while providing an initial insight into the corresponding fungal population. To facilitate this objective, four water kefirs were sourced from the UK, Canada and the United States. Culture-independent, high-throughput, sequencing-based analyses revealed that the bacterial fraction of each water kefir and grain was dominated by Zymomonas, an ethanol-producing bacterium, which has not previously been detected at such a scale. The other genera detected were representatives of the lactic acid bacteria and acetic acid bacteria. Our analysis of the fungal component established that it was comprised of the genera Dekkera, Hanseniaspora, Saccharomyces, Zygosaccharomyces, Torulaspora and Lachancea. This information will assist in the ultimate identification of the microorganisms responsible for the potentially health-promoting attributes of these beverages. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
Yu, Hye-Weon; Jang, Am; Kim, Lan Hee; Kim, Sung-Jo; Kim, In S
2011-09-15
Due to the increased occurrence of cyanobacterial blooms and their toxins in drinking water sources, effective management based on a sensitive and rapid analytical method is in high demand to secure safe water sources and protect environmental and human health. Here, a competitive fluorescence immunoassay of microcystin-LR (MCYST-LR) is developed in an attempt to improve the sensitivity, analysis time, and ease of manipulation of the analysis. To this end, a bead-based suspension assay was introduced, based on two major sensing elements: an antibody-conjugated quantum dot (QD) detection probe and an antigen-immobilized magnetic bead (MB) competitor. The assay comprises three steps: the competitive immunological reaction of QD detection probes against analytes and MB competitors, magnetic separation and washing, and the optical signal generation of the QDs. The fluorescence intensity was found to be inversely proportional to the MCYST-LR concentration. Under optimized conditions, the proposed assay performed well for the identification and quantitative analysis of MCYST-LR (within 30 min, in the range of 0.42-25 μg/L, with a limit of detection of 0.03 μg/L). It is thus expected that this enhanced assay can contribute both to the sensitive and rapid diagnosis of cyanotoxin risk in drinking water and to effective management procedures.
NASA Astrophysics Data System (ADS)
Kong, Xiangzhen; He, Wei; Qin, Ning; He, Qishuang; Yang, Bin; Ouyang, Huiling; Wang, Qingmei; Xu, Fuliu
2013-03-01
Trajectory cluster analysis, including a two-stage cluster method based on Euclidean metrics and one-stage clustering methods based on Mahalanobis metrics and self-organizing maps (SOM), was applied and compared to identify the transport pathways of PM10 for the cities of Chaohu and Hefei, both located near Lake Chaohu in China. The two-stage cluster method was modified to further investigate the long trajectories in the second stage, in order to eliminate the disaggregation observed among them. Twelve trajectory clusters were identified for both cities. The one-stage clustering method based on Mahalanobis metrics gave the best performance with respect to the within-cluster variances. The results showed that local PM10 emission was one of the most important sources in both cities and that the local emission in Hefei was higher than in Chaohu. In addition, Chaohu was affected more strongly by the eastern region (the Yangtze River Delta, YRD) than Hefei, while long-range transport along the northwestern pathway had a greater influence on the PM10 level in Hefei. Receptor models, including the potential source contribution function (PSCF) and residence time weighted concentrations (RTWC), were utilized to identify the potential source locations of PM10 for both cities. The combined PSCF and RTWC results for the two cities provided PM10 source locations that were more consistent with the transport pathways and the total anthropogenic PM10 emission inventory, indicating that the combined method identifies source regions better than the individual PSCF or RTWC methods. Henan and Shanxi Provinces and the YRD were important PM10 source regions for the two cities, but the Henan and Shanxi area was more important for Hefei than for Chaohu, while the YRD region was less important. In addition, the PSCF, RTWC and combined results all had higher correlation coefficients with PM10 emission from traffic than from industry, electricity generation or residential sources, suggesting a relatively higher contribution of traffic emissions to the PM10 pollution around Lake Chaohu.
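A minimal sketch of the PSCF receptor model used above: grid the domain, count trajectory endpoints per cell (n_ij) and endpoints belonging to high-concentration trajectories (m_ij), and take PSCF = m_ij / n_ij. The endpoint arrays, grid, and concentration threshold are invented placeholders.

```python
import numpy as np

def pscf(lon, lat, high, lon_edges, lat_edges):
    """PSCF_ij = m_ij / n_ij: fraction of endpoints in cell (i, j) belonging
    to trajectories that arrived with receptor PM10 above a chosen percentile."""
    n, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges])
    m, _, _ = np.histogram2d(lon[high], lat[high], bins=[lon_edges, lat_edges])
    with np.errstate(invalid="ignore"):
        return np.where(n > 0, m / n, np.nan)

rng = np.random.default_rng(4)
lon = rng.uniform(110, 122, 5000)          # placeholder back-trajectory endpoints
lat = rng.uniform(28, 40, 5000)
high = rng.random(5000) < 0.25             # endpoints of top-quartile PM10 arrivals
grid = pscf(lon, lat, high, np.arange(110, 123), np.arange(28, 41))
print(np.nanmax(grid))
```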
Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter
2006-06-01
We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
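A minimal sketch of the bootstrap step described above: resample the underlying observations with replacement and recompute the risk statistic to obtain a sampling-error interval. The "observations" and the risk function are placeholders, not the paper's exposure model.

```python
import numpy as np

rng = np.random.default_rng(5)
# placeholder per-serving illness outcomes feeding the risk estimate
obs = rng.random(1000) < 0.004            # hypothetical binary outcomes

def risk(sample):
    return sample.mean()                  # stand-in for the model's risk output

boot = np.array([risk(rng.choice(obs, size=obs.size, replace=True))
                 for _ in range(2000)])   # resample and recompute
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"risk = {risk(obs):.4f}, 95% CI = [{lo:.4f}, {hi:.4f}]")
```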
THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.
Theobald, Douglas L; Wuttke, Deborah S
2006-09-01
THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
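For contrast with THESEUS's maximum-likelihood criterion, the conventional least-squares superposition it improves on can be written in a few lines via the Kabsch algorithm: center both coordinate sets, take the SVD of their covariance matrix, and build the optimal rotation. A generic sketch with toy coordinates, not THESEUS code.

```python
import numpy as np

def kabsch_superpose(P, Q):
    """Ordinary least-squares superposition of P onto Q (N x 3 arrays):
    the criterion THESEUS replaces with maximum likelihood."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)          # covariance matrix SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    P_fit = Pc @ R.T + Q.mean(axis=0)
    rmsd = np.sqrt(np.mean(np.sum((P_fit - Q) ** 2, axis=1)))
    return P_fit, rmsd

# toy example: rotate and translate a structure, then recover the fit
rng = np.random.default_rng(6)
Q = rng.normal(size=(10, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
P = Q @ Rz.T + 1.0
_, rmsd = kabsch_superpose(P, Q)
print(round(rmsd, 6))  # ~0 for a pure rigid-body transform
```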
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Valuation of medical resource units collected in health economic studies.
Copley-Merriman, C; Lair, T J
1994-01-01
This paper reviews the issues that are critical for the valuation of medical resources in the context of health economic studies. There are several points to consider when undertaking the valuation of medical resources. The perspective of the analysis should be established before determining the valuation process. Future costs should be discounted to present values, and time and effort spent in assigning a monetary value to a medical resource should be proportional to its importance in the analysis. Prices vary considerably based on location of the service and the severity of the illness episode. Because of the wide variability in pricing data, sensitivity analysis is an important component of validation of study results. A variety of data sources have been applied to the valuation of medical resources. Several types of data are reviewed in this paper, including claims data, national survey data, administrative data, and marketing research data. Valuation of medical resources collected in clinical trials is complex because of the lack of standardization of the data sources. A national pricing data source for health economic valuation would greatly facilitate study analysis and make comparisons between results more meaningful.
Indoor source apportionment in urban communities near industrial sites
NASA Astrophysics Data System (ADS)
Tunno, Brett J.; Dalton, Rebecca; Cambal, Leah; Holguin, Fernando; Lioy, Paul; Clougherty, Jane E.
2016-08-01
Because fine particulate matter (PM2.5) differs in chemical composition, source apportionment is frequently used for identification of relative contributions of multiple sources to outdoor concentrations. Indoor air pollution and source apportionment are often overlooked, though people in northern climates may spend up to 90% of their time inside. We selected 21 homes for a 1-week indoor sampling session during summer (July to September 2011), repeated in winter (January to March 2012). Elemental analysis was performed using inductively-coupled plasma mass spectrometry (ICP-MS), and factor analysis was used to determine constituent grouping. Multivariate modeling was run on factor scores to corroborate interpretations of source factors based on a literature review. For each season, a 5-factor solution explained 86-88% of variability in constituent concentrations. Indoor sources (e.g., cooking, smoking) explained greater variability than did outdoor sources in these industrial communities. A smoking factor was identified in each season, predicted by number of cigarettes smoked. Cooking factors were also identified in each season, explained by frequency of stove cooking and stovetop frying. Significant contributions from outdoor sources including coal and motor vehicles were also identified. Higher coal and secondary-related elemental concentrations were detected during summer than winter. Our findings suggest that source contributions to indoor concentrations can be identified and should be examined in relation to health effects.
Oscillation Baselining and Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillations (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
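Mode identification of the kind OBAT performs (frequency and damping of oscillatory modes) is classically done with linear-prediction methods. The sketch below is a bare-bones Prony-style estimate on a synthetic ringdown; it is illustrative only and is not OBAT's algorithm, and all signal parameters are assumed.

```python
import numpy as np

def prony_modes(y, dt, order=4):
    """Estimate damped-sinusoid modes from a ringdown signal via linear
    prediction (classic Prony; illustrative only, not OBAT)."""
    N = len(y)
    # Solve y[n] = a1*y[n-1] + ... + ap*y[n-p] for the LP coefficients.
    A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    roots = np.roots(np.concatenate(([1.0], -a)))
    s = np.log(roots) / dt                 # continuous-time poles
    keep = s.imag > 0                      # one pole per conjugate pair
    freq = s.imag[keep] / (2 * np.pi)      # Hz
    damping = -s.real[keep] / np.abs(s[keep])  # damping ratio
    return freq, damping

# Toy usage: a 0.7 Hz inter-area-like mode with 5% damping
dt = 0.02
t = np.arange(0, 20, dt)
zeta, f = 0.05, 0.7
y = np.exp(-zeta * 2*np.pi*f / np.sqrt(1 - zeta**2) * t) * np.cos(2*np.pi*f*t)
print(prony_modes(y, dt, order=2))   # ~ (0.7 Hz, 0.05)
```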
Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I
2017-08-15
Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provides an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
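The optimization step the abstract describes, an added L1 (sparsity) penalty solved with the alternating direction method of multipliers, can be illustrated with a minimal ADMM iteration for L1-regularized least squares. This is a generic sketch, not the SISSY implementation; the matrix A merely stands in for a lead-field-like operator, and all names and parameter values are illustrative.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 via ADMM (illustrative)."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Cache the factorization reused by every x-update.
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    L = np.linalg.cholesky(AtA)
    for _ in range(n_iter):
        # x-update: ridge-like least squares
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding (the sparsity-promoting step)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u = u + x - z
    return z

# Toy usage: a sparse source vector recovered from a random operator
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 200))
x_true = np.zeros(200); x_true[[10, 50, 120]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(64)
x_hat = admm_lasso(A, b, lam=0.5)
```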
Javidi, Soroush; Mandic, Danilo P.; Took, Clive Cheong; Cichocki, Andrzej
2011-01-01
A new class of complex domain blind source extraction algorithms suitable for the extraction of both circular and non-circular complex signals is proposed. This is achieved through sequential extraction based on the degree of kurtosis and in the presence of non-circular measurement noise. The existence and uniqueness analysis of the solution is followed by a study of fast converging variants of the algorithm. The performance is first assessed through simulations on well understood benchmark signals, followed by a case study on real-time artifact removal from EEG signals, verified using both qualitative and quantitative metrics. The results illustrate the power of the proposed approach in real-time blind extraction of general complex-valued sources. PMID:22319461
Light curves of flat-spectrum radio sources (Jenness+, 2010)
NASA Astrophysics Data System (ADS)
Jenness, T.; Robson, E. I.; Stevens, J. A.
2010-05-01
Calibrated data for 143 flat-spectrum extragalactic radio sources are presented at a wavelength of 850um covering a 5-yr period from 2000 April. The data, obtained at the James Clerk Maxwell Telescope using the Submillimetre Common-User Bolometer Array (SCUBA) camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control - Data Reduction (ORAC-DR) system. This paper describes the techniques used to analyse and calibrate the data, and presents the database of results along with a representative sample of the better-sampled light curves. A re-analysis of previously published data from 1997 to 2000 is also presented. The combined catalogue, comprising 10493 flux density measurements, provides a unique and valuable resource for studies of extragalactic radio sources. (2 data files).
TRANSIENT X-RAY SOURCE POPULATION IN THE MAGELLANIC-TYPE GALAXY NGC 55
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jithesh, V.; Wang, Zhongxiang, E-mail: jithesh@shao.ac.cn
2016-04-10
We present the spectral and temporal properties of 15 candidate transient X-ray sources detected in archival XMM-Newton and Chandra observations of the nearby Magellanic-type, SB(s)m galaxy NGC 55. Based on an X-ray color classification scheme, the majority of the sources may be identified as X-ray binaries (XRBs), and six sources are soft, including a likely supernova remnant. We perform a detailed spectral and variability analysis of the data for two bright candidate XRBs. Both sources displayed strong short-term X-ray variability, and their X-ray spectra and hardness ratios are consistent with those of XRBs. These results, combined with their high X-ray luminosities (∼10^38 erg s^-1), strongly suggest that they are black hole (BH) binaries. Seven less luminous sources have spectral properties consistent with those of neutron star or BH XRBs in both normal and high-rate accretion modes, but one of them is the likely counterpart to a background galaxy (because of positional coincidence). From our spectral analysis, we find that the six soft sources are candidate super soft sources (SSSs) with dominant emission in the soft (0.3–2 keV) X-ray band. Archival Hubble Space Telescope optical images for seven sources are available, and the data suggest that most of them are likely to be high-mass XRBs. Our analysis has revealed the heterogeneous nature of the transient population in NGC 55 (six high-mass XRBs, one low-mass XRB, six SSSs, one active galactic nucleus), helping establish the similarity of the X-ray properties of this galaxy to those of other Magellanic-type galaxies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shott, Gregory J.
This special analysis (SA) evaluates whether the Lawrence Livermore National Laboratory (LLNL) Low Activity Beta/Gamma Sources waste stream (BCLALADOEOSRP, Revision 0) is suitable for disposal by shallow land burial (SLB) at the Area 5 Radioactive Waste Management Site (RWMS) at the Nevada National Security Site (NNSS). The LLNL Low Activity Beta/Gamma Sources waste stream consists of sealed sources that are no longer needed. The LLNL Low Activity Beta/Gamma Sources waste stream required a special analysis because cobalt-60 (60Co), strontium-90 (90Sr), cesium-137 (137Cs), and radium-226 (226Ra) exceeded the NNSS Waste Acceptance Criteria (WAC) Action Levels (U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office [NNSA/NFO] 2015). The results indicate that all performance objectives can be met with disposal of the LLNL Low Activity Beta/Gamma Sources in a SLB trench. The LLNL Low Activity Beta/Gamma Sources waste stream is suitable for disposal by SLB at the Area 5 RWMS. However, the activity concentration of 226Ra listed on the waste profile sheet significantly exceeds the action level. Approval of the waste profile sheet could potentially allow the disposal of high activity 226Ra sources. To ensure that the generator does not include large 226Ra sources in this waste stream without additional evaluation, a control is needed on the maximum 226Ra inventory. A limit based on the generator’s estimate of the total 226Ra inventory is recommended. The waste stream is recommended for approval with the control that the total 226Ra inventory disposed shall not exceed 5.5E10 Bq (1.5 Ci).
NASA Astrophysics Data System (ADS)
Shepson, P. B.; Lavoie, T. N.; Kerlo, A. E.; Stirm, B. H.
2016-12-01
Understanding the contribution of anthropogenic activities to atmospheric greenhouse gas concentrations requires an accurate characterization of emission sources. Previously, we have reported the use of a novel aircraft-based mass balance measurement technique to quantify greenhouse gas emission rates from point and area sources; however, the accuracy of this approach has not been evaluated to date. Here, an assessment of method accuracy and precision was performed by conducting a series of six aircraft-based mass balance experiments at a power plant in southern Indiana and comparing the calculated CO2 emission rates to the reported hourly emission measurements made by continuous emissions monitoring systems (CEMS) installed directly in the exhaust stacks at the facility. For all flights, CO2 emissions were quantified before CEMS data were released online to ensure unbiased analysis. Additionally, we assess the uncertainties introduced into the final emission rate by our analysis method, which employs a statistical kriging model to interpolate and extrapolate the CO2 fluxes across the flight transects from the ground to the top of the boundary layer. Subsequently, using the results from these flights combined with the known emissions reported by the CEMS, we perform an inter-model comparison of alternative kriging methods to evaluate the performance of the kriging approach.
Zakrzewska, Karolina Ewa; Samluk, Anna; Wencel, Agnieszka; Dudek, Krzysztof; Pijanowska, Dorota Genowefa; Pluta, Krzysztof Dariusz
2017-01-01
Cell-based therapies that could provide an alternative treatment for end-stage liver disease require an adequate source of functional hepatocytes. There is little scientific evidence for the influence of a patient's age, sex, and chemotherapy on the cell isolation efficiency and metabolic activity of the harvested hepatocytes. The purpose of this study was to investigate whether hepatocytes derived from different sources display differential viability and biosynthetic capacity. Liver cells were isolated from 41 different human tissue specimens. Hepatocytes were labeled using specific antibodies and analyzed using flow cytometry. Multiparametric analysis of the acquired data revealed statistically significant differences between some studied groups of patients. Generally, populations of cells isolated from the male specimens had a greater percentage of biosynthetically active hepatocytes than those from the female ones regardless of age and previous chemotherapy of the patient. Based on the albumin staining (and partially on the α-1-antitrypsin labeling) after donor liver exclusion (6 out of 41 samples), our results indicated that: 1. samples obtained from males gave a greater percentage of active hepatocytes than those from females (p = 0.034), and 2. specimens from males after chemotherapy gave a greater percentage than those from treated females (p = 0.032).
The effect of beamwidth on the analysis of electron-beam-induced current line scans
NASA Astrophysics Data System (ADS)
Luke, Keung L.
1995-04-01
A real electron beam has finite width, which has been almost universally ignored in electron-beam-induced current (EBIC) theories. Obvious examples are point-source-based EBIC analyses, which neglect both the finite volume of electron-hole carriers generated by an energetic electron beam of negligible width and the beamwidth when it is no longer negligible. Gaussian source-based analyses are more realistic but the beamwidth has not been included, partly because the generation volume is much larger than the beamwidth, but this is not always the case. In this article Donolato's Gaussian source-based EBIC equation is generalized to include the beamwidth of a Gaussian beam. This generalized equation is then used to study three problems: (1) the effect of beamwidth on EBIC line scans and on effective diffusion lengths, with the results applied to the analysis of the EBIC data of Dixon, Williams, Das, and Webb; (2) unresolved questions raised by others concerning the applicability of the Watanabe-Actor-Gatos method to real EBIC data to evaluate surface recombination velocity; (3) the effect of beamwidth on the methods proposed recently by the author to determine the surface recombination velocity and to determine which of the Everhart-Hoff and Kanaya-Okayama ranges is the correct one to use for analyzing EBIC line scans.
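As a rough illustration of why beamwidth matters, a finite Gaussian beam enters the measured EBIC line scan as a convolution of the zero-width-beam response with the beam profile. The numpy sketch below is a toy model, not Donolato's generalized equation; the diffusion length, beam width, and exponential profile are all assumed for illustration.

```python
import numpy as np

# Illustrative only: a finite Gaussian beam acts on the EBIC line scan as a
# convolution of the zero-width-beam response with the beam profile.
x = np.linspace(-10e-6, 10e-6, 2001)      # scan position (m)
dx = x[1] - x[0]
L = 2e-6                                  # assumed diffusion length (m)
i0 = np.exp(-np.abs(x) / L)               # toy zero-width-beam EBIC profile

sigma_b = 0.5e-6                          # assumed beam standard deviation (m)
beam = np.exp(-x**2 / (2 * sigma_b**2))
beam /= beam.sum() * dx                   # normalize to unit area

i_meas = np.convolve(i0, beam, mode="same") * dx
# Fitting i_meas with a zero-width-beam model overestimates the apparent
# diffusion length once sigma_b is no longer much smaller than L.
```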
Cloud Based Metalearning System for Predictive Modeling of Biomedical Data
Vukićević, Milan
2014-01-01
Rapid growth and storage of biomedical data enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand and open source big data technologies for the analysis of biomedical data. PMID:24892101
2007-06-15
the lack of any or all of these items in order to de-legitimize the occupation force and/or its follow-on civilian government in order to gain a... operations will be used to compare and contrast, rather than trying to develop new patterns and themes based on primary sources. Delimitations of this study... chapter 4, “Analysis and Interpretation.” Based upon this analysis, in chapter 5, “Conclusions and Recommendations,” the research will develop and
Enhancing source location protection in wireless sensor networks
NASA Astrophysics Data System (ADS)
Chen, Juan; Lin, Zhengkui; Wu, Di; Wang, Bailing
2015-12-01
Wireless sensor networks are widely deployed in the internet of things to monitor valuable objects. Once an object is monitored, the sensor nearest to it, known as the source, periodically reports the object's information to the base station. It is obvious that attackers can capture the object successfully by localizing the source. Thus, many protocols have been proposed to secure the source location. However, in this paper, we show that typical source location protection protocols generate phantom locations that are not only near the source but also highly localized. As a result, attackers can trace the source easily from these phantom locations. To address these limitations, we propose a protocol to enhance the source location protection (SLE). With phantom locations far away from the source and widely distributed, SLE improves source location anonymity significantly. Theoretical analysis and simulation results show that our SLE provides strong source location privacy preservation and that the average safety period increases by nearly one order of magnitude compared with existing work, at low communication cost.
Fermi Large Area Telescope third source catalog
Acero, F.; Ackermann, M.; Ajello, M.; ...
2015-06-12
Here, we present the third Fermi Large Area Telescope (LAT) source catalog (3FGL) of sources in the 100 MeV–300 GeV range. Based on the first 4 yr of science data from the Fermi Gamma-ray Space Telescope mission, it is the deepest yet in this energy range. Relative to the Second Fermi LAT catalog, the 3FGL catalog incorporates twice as much data, as well as a number of analysis improvements, including improved calibrations at the event reconstruction level, an updated model for Galactic diffuse γ-ray emission, a refined procedure for source detection, and improved methods for associating LAT sources with potential counterparts at other wavelengths. The 3FGL catalog includes 3033 sources above 4σ significance, with source location regions, spectral properties, and monthly light curves for each. Of these, 78 are flagged as potentially being due to imperfections in the model for Galactic diffuse emission. Twenty-five sources are modeled explicitly as spatially extended, and overall 238 sources are considered as identified based on angular extent or correlated variability (periodic or otherwise) observed at other wavelengths. For 1010 sources we have not found plausible counterparts at other wavelengths. More than 1100 of the identified or associated sources are active galaxies of the blazar class; several other classes of non-blazar active galaxies are also represented in the 3FGL. Pulsars represent the largest Galactic source class. As a result, from source counts of Galactic sources we estimate that the contribution of unresolved sources to the Galactic diffuse emission is ~3% at 1 GeV.
Sources of delayed provision of neurosurgical care in a rural Kenyan setting
Mansouri, Alireza; Chan, Vivien; Njaramba, Veronica; Cadotte, David W.; Albright, A. Leland; Bernstein, Mark
2015-01-01
Background: Delay to neurosurgical care can result in significant morbidity and mortality. In this study, we aim to identify and quantify the sources of delay to neurosurgical consultation and care at a rural setting in Kenya. Methods: A mixed-methods, cross-sectional analysis of all patients admitted to the neurosurgical department at Kijabe Hospital (KH) was conducted: A retrospective analysis of admissions from October 1 to December 31, 2013 and a prospective analysis from June 2 to June 20, 2014. Sources of delay were categorized and quantified. The Kruskal–Wallis test was used to identify an overall significant difference among diagnoses. The Mann–Whitney U test was used for pairwise comparisons within groups; the Bonferroni correction was applied to the alpha level of significance (0.05) according to the number of comparisons conducted. IBM SPSS version 22.0 (SPSS, Chicago, IL) was used for statistical analyses. Results: A total of 332 admissions were reviewed (237 retrospective, 95 prospective). The majority were pediatric admissions (median age: 3 months). Hydrocephalus (35%) and neural tube defects (NTDs; 27%) were most common. At least one source of delay was identified in 192 cases (58%); 39 (12%) were affected by multiple sources. Delay in primary care (PCPs), in isolation or combined with other sources, comprised 137 of the total (71%); misdiagnosis or incorrect management comprised 46 (34%) of these. Finances contributed to delays in 25 of 95 prospective cases. At a median delay of 49 and 200.5 days, the diagnoses of hydrocephalus and tumors were associated with a significantly longer delay compared with NTDs (P < 0.001). Conclusion: A substantial proportion of patients experienced delays in procuring pediatric neurosurgical care. Improvement in PCP knowledge base, implementation of a triage and referral process, and development of community-based funding strategies can potentially reduce these delays. PMID:25745587
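The statistical workflow described, an overall Kruskal–Wallis test followed by pairwise Mann–Whitney U tests at a Bonferroni-corrected alpha, can be reproduced with scipy. The delay values below are made up for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical delay data (days) by diagnosis; values are illustrative only.
delays = {
    "NTD": np.array([5, 9, 14, 20, 30, 41]),
    "hydrocephalus": np.array([20, 35, 49, 60, 90, 120]),
    "tumor": np.array([90, 150, 200, 250, 300]),
}

# Overall test across diagnoses
h, p = stats.kruskal(*delays.values())
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

# Pairwise Mann-Whitney U tests with a Bonferroni-corrected alpha
groups = list(delays)
n_comparisons = len(groups) * (len(groups) - 1) // 2
alpha = 0.05 / n_comparisons
for i in range(len(groups)):
    for j in range(i + 1, len(groups)):
        u, p = stats.mannwhitneyu(delays[groups[i]], delays[groups[j]])
        print(groups[i], "vs", groups[j], f"p={p:.4f}",
              "significant" if p < alpha else "n.s.")
```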
Steel, Amie; Adams, Jon
2011-06-01
The approach of evidence-based medicine (EBM), providing a paradigm to validate information sources and a process for critiquing their value, is an important platform for guiding practice. Researchers have explored the application and value of information sources in clinical practice with regard to a range of health professions; however, naturopathic practice has been overlooked. An exploratory study of naturopaths' perspectives of the application and value of information sources has been undertaken. Semi-structured interviews with 12 naturopaths in current clinical practice, concerning the information sources used in clinical practice and their perceptions of these sources. Thematic analysis identified differences in the application of the variety of information sources used, depending upon the perceived validity. Internet databases were viewed as highly valid. Textbooks, formal education and interpersonal interactions were judged based upon a variety of factors, whilst validation of general internet sites and manufacturers information was required prior to use. The findings of this study will provide preliminary aid to those responsible for supporting naturopaths' information use and access. In particular, it may assist publishers, medical librarians and professional associations in developing strategies to expand the clinically useful information sources available to naturopaths. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.
Discriminating Simulated Vocal Tremor Source Using Amplitude Modulation Spectra
Carbonell, Kathy M.; Lester, Rosemary A.; Story, Brad H.; Lotto, Andrew J.
2014-01-01
Objectives/Hypothesis: Sources of vocal tremor are difficult to categorize perceptually and acoustically. This paper describes a preliminary attempt to discriminate vocal tremor sources through the use of spectral measures of the amplitude envelope. The hypothesis is that different vocal tremor sources are associated with distinct patterns of acoustic amplitude modulations. Study Design: Statistical categorization methods (discriminant function analysis) were used to discriminate signals from simulated vocal tremor with different sources using only acoustic measures derived from the amplitude envelopes. Methods: Simulations of vocal tremor were created by modulating parameters of a vocal fold model corresponding to oscillations of respiratory driving pressure (respiratory tremor), degree of vocal fold adduction (adductory tremor) and fundamental frequency of vocal fold vibration (F0 tremor). The acoustic measures were based on spectral analyses of the amplitude envelope computed across the entire signal and within select frequency bands. Results: The signals could be categorized (with accuracy well above chance) in terms of the simulated tremor source using only measures of the amplitude envelope spectrum even when multiple sources of tremor were included. Conclusions: These results supply initial support for an amplitude-envelope based approach to identify the source of vocal tremor and provide further evidence for the rich information about talker characteristics present in the temporal structure of the amplitude envelope. PMID:25532813
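A minimal sketch of the pipeline the abstract implies: extract the amplitude envelope (here via the Hilbert transform, an assumption on my part), summarize its modulation spectrum in low-frequency bands, and feed those features to a discriminant classifier. The band edges, X_train/y_train, and all parameter choices are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def am_spectrum_features(signal, fs, n_bins=10, fmax=20.0):
    """Band-averaged spectral features of the amplitude envelope (illustrative)."""
    envelope = np.abs(hilbert(signal))          # amplitude envelope
    envelope -= envelope.mean()
    spec = np.abs(np.fft.rfft(envelope)) ** 2   # envelope (modulation) spectrum
    freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)
    # Average modulation energy below fmax (tremor rates are a few Hz)
    edges = np.linspace(0, fmax, n_bins + 1)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

# X: one feature row per simulated signal; y: tremor-source label
# (e.g., 0=respiratory, 1=adductory, 2=F0); both assumed available.
# lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
# accuracy = lda.score(X_test, y_test)
```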
Your Personal Analysis Toolkit - An Open Source Solution
NASA Astrophysics Data System (ADS)
Mitchell, T.
2009-12-01
Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!
VLA OH Zeeman Observations of the NGC 6334 Complex Source A
NASA Astrophysics Data System (ADS)
Mayo, E. A.; Sarma, A. P.; Troland, T. H.; Abel, N. P.
2004-12-01
We present a detailed analysis of the NGC 6334 complex source A, a compact continuum source in the SW region of the complex. Our intent is to determine the significance of the magnetic field in the support of the surrounding molecular cloud against gravitational collapse. We have performed OH 1665 and 1667 MHz observations taken with the Very Large Array in the BnA configuration and combined these data with the lower resolution CnB data of Sarma et al. (2000). These observations reveal magnetic fields with values of the order of 350 μG toward source A, with maximum fields reaching 500 μG. We have also theoretically modeled the molecular cloud surrounding source A using Cloudy, with the constraints to the model based on observation. This model provides significant information on the density of H2 through the cloud and also the relative density of H2 to OH, which is important to our analysis of the region. We will combine the knowledge gained through the Cloudy modeling with Virial estimates to determine the significance of the magnetic field to the dynamics and evolution of source A.
Mohana, Mudiam; Reddy, Krishna; Jayshanker, Gurumurthy; Suresh, Velayudhan; Sarin, Rajendra Kumar; Sashidhar, R B
2005-08-01
A total of 124 opium samples originating from different licit opium growing divisions of India were analyzed for their principal alkaloid (thebaine, codeine, morphine, papaverine, and narcotine) content by capillary zone electrophoresis (CZE) without derivatization or purification. Absence of papaverine in Bareilly, Tilhar, and most of the samples originating from Kota is a significant observation in relation to the source of Indian opium. Multiple discriminant analysis was applied to the quantitative principal alkaloid data to determine an optimal classifier in order to evaluate the source of Indian opium. The predictive value based on the discriminant analysis was found to be 85% in relation to the source of opium and the study also revealed that all the principal alkaloids have to be analyzed for source identification of Indian opium. Chemometrics performed with principal alkaloids analytical data was used successfully in discriminating the licit opium growing divisions of India into three major groups, viz., group I, II, and III. The methodology developed may find wide forensic application in identifying the source of licit or illicit opium originating from India, and to differentiate it from opium originating from other opium producing countries.
Odor-conditioned rheotaxis of the sea lamprey: modeling, analysis and validation
Choi, Jongeun; Jean, Soo; Johnson, Nicholas S.; Brant, Cory O.; Li, Weiming
2013-01-01
Mechanisms for orienting toward and locating an odor source are sought in both biology and engineering. Chemical ecology studies have demonstrated that adult female sea lamprey show rheotaxis in response to a male pheromone with dichotomous outcomes: sexually mature females locate the source of the pheromone whereas immature females swim by the source and continue moving upstream. Here we introduce a simple switching mechanism modeled after odor-conditioned rheotaxis for the sea lamprey as they search for the source of a pheromone in a one-dimensional riverine environment. In this strategy, the females move upstream only if they detect that the pheromone concentration is higher than a threshold value and drift down (by turning off control action to save energy) otherwise. In addition, we propose various uncertainty models such as measurement noise, actuator disturbance, and a probabilistic model of a concentration field in turbulent flow. Based on the proposed model with uncertainties, a convergence analysis showed that with this simplistic switching mechanism, the lamprey converges to the source location on average in spite of all such uncertainties. Furthermore, a slightly modified model and its extensive simulation results explain the behaviors of immature female lamprey near the source location.
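The switching rule is simple enough to simulate directly. The sketch below is a toy 1-D model under assumed plume, speed, and threshold values, not the paper's model; it illustrates how the threshold switch makes the searcher settle near the source (around the isoline where the concentration crosses the threshold).

```python
import numpy as np

rng = np.random.default_rng(1)

def search(x0, threshold=0.8, v_up=1.0, v_drift=0.5, dt=0.1,
           n_steps=3000, noise=0.1):
    """1-D odor-conditioned rheotaxis with a threshold switch (toy model).

    x is the distance downstream of the source; flow carries odor toward +x.
    """
    x = x0
    for _ in range(n_steps):
        # Toy plume: concentration decays downstream, is absent upstream,
        # with multiplicative turbulence-like fluctuations.
        c = (np.exp(-x) if x > 0 else 0.0) * (1 + noise * rng.standard_normal())
        # Switching rule from the abstract: swim upstream (toward -x) while
        # the detected concentration exceeds the threshold; otherwise shut
        # off control and drift back downstream.
        x += (-v_up if c > threshold else v_drift) * dt
    return x

finals = [search(x0=5.0) for _ in range(200)]
print(f"mean settling point: {np.mean(finals):.2f} (0 = source)")
```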
Real-time Adaptive EEG Source Separation using Online Recursive Independent Component Analysis
Hsu, Sheng-Hsiou; Mullen, Tim; Jung, Tzyy-Ping; Cauwenberghs, Gert
2016-01-01
Independent Component Analysis (ICA) has been widely applied to electroencephalographic (EEG) biosignal processing and brain-computer interfaces. The practical use of ICA, however, is limited by its computational complexity, data requirements for convergence, and assumption of data stationarity, especially for high-density data. Here we study and validate an optimized online recursive ICA algorithm (ORICA) with online recursive least squares (RLS) whitening for blind source separation of high-density EEG data, which offers instantaneous incremental convergence upon presentation of new data. Empirical results of this study demonstrate the algorithm's: (a) suitability for accurate and efficient source identification in high-density (64-channel) realistically-simulated EEG data; (b) capability to detect and adapt to non-stationarity in 64-ch simulated EEG data; and (c) utility for rapidly extracting principal brain and artifact sources in real 61-channel EEG data recorded by a dry and wearable EEG system in a cognitive experiment. ORICA was implemented as functions in BCILAB and EEGLAB and was integrated in an open-source Real-time EEG Source-mapping Toolbox (REST), supporting applications in ICA-based online artifact rejection, feature extraction for real-time biosignal monitoring in clinical environments, and adaptable classifications in brain-computer interfaces. PMID:26685257
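ORICA itself combines recursive whitening with a recursive ICA update; as a simplified stand-in, the sketch below applies a standard natural-gradient online ICA update, sample by sample, to pre-whitened data (whitening is done in batch here for brevity, whereas ORICA performs it recursively online). The step size and tanh score function are illustrative assumptions.

```python
import numpy as np

def online_ica_step(W, x, mu=1e-3):
    """One natural-gradient online ICA update (simplified; not ORICA itself).

    W: current unmixing matrix; x: one pre-whitened sample vector.
    """
    y = W @ x
    g = np.tanh(y)                 # score function for super-Gaussian sources
    n = len(y)
    W += mu * (np.eye(n) - np.outer(g, y)) @ W
    return W

# Toy usage: unmix two super-Gaussian sources streamed sample by sample.
rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 20000))   # sources
A = rng.standard_normal((2, 2))    # mixing matrix
X = A @ s
X = X - X.mean(axis=1, keepdims=True)
# Batch whitening for brevity (ORICA does this recursively online).
d, E = np.linalg.eigh(np.cov(X))
X_w = (E / np.sqrt(d)) @ E.T @ X
W = np.eye(2)
for t in range(X_w.shape[1]):
    W = online_ica_step(W, X_w[:, t])
```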
An evolution of image source camera attribution approaches.
Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul
2016-05-01
Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of the digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured under uncontrolled conditions and subjected to a variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of the digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by the experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamentals to practice, in particular, with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of the source camera attribution more comprehensively in the domain of the image forensics in conjunction with the presentation of classifying ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on the specific parameters, such as colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics researchers, are also critically analysed and further categorised into four different classes, namely, optical aberrations based, sensor camera fingerprints based, processing statistics based and processing regularities based, to present a classification. Furthermore, this paper aims to investigate the challenging problems, and the proposed strategies of such schemes based on the suggested taxonomy to plot an evolution of the source camera attribution approaches with respect to the subjective optimisation criteria over the last decade. The optimisation criteria were determined based on the strategies proposed to increase the detection accuracy, robustness and computational efficiency of source camera brand, model or device attribution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Schäfer, Mike S; Füchslin, Tobias; Metag, Julia; Kristiansen, Silje; Rauchfleisch, Adrian
2018-01-01
Few studies have assessed whether populations can be divided into segments with different perceptions of science. We provide such an analysis and assess whether these segments exhibit specific patterns of media and information use. Based on representative survey data from Switzerland, we use latent class analysis to reconstruct four segments: the "Sciencephiles," with strong interest in science, extensive knowledge, and a pronounced belief in its potential, who use a variety of sources intensively; the "Critically Interested," also with strong interest and support for science but with less trust in it, who use similar sources but are more cautious toward them; the "Passive Supporters" with moderate levels of interest, trust, and knowledge and tempered perceptions of science, who use fewer sources; and the "Disengaged," who are not interested in science, do not know much about it, harbor critical views toward it, and encounter it, if at all, mostly through television.
Cramer-Rao bound analysis of wideband source localization and DOA estimation
NASA Astrophysics Data System (ADS)
Yip, Lean; Chen, Joe C.; Hudson, Ralph E.; Yao, Kung
2002-12-01
In this paper, we derive the Cramér-Rao Bound (CRB) for wideband source localization and DOA estimation. The resulting CRB formula can be decomposed into two terms: one that depends on the signal characteristic and one that depends on the array geometry. For a uniformly spaced circular array (UCA), a concise analytical form of the CRB can be given by using some algebraic approximation. We further define a DOA beamwidth based on the resulting CRB formula. The DOA beamwidth can be used to design the sampling angular spacing for the Maximum-likelihood (ML) algorithm. For a randomly distributed array, we use an elliptical model to determine the largest and smallest effective beamwidth. The effective beamwidth and the CRB analysis of source localization allow us to design an efficient algorithm for the ML estimator. Finally, our simulation results of the Approximated Maximum Likelihood (AML) algorithm are demonstrated to match well to the CRB analysis at high SNR.
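The decomposition the abstract mentions, a signal-dependent term times a geometry-dependent term, is visible in the single-source deterministic CRB, which can be evaluated numerically for a UCA. The sketch below assumes a narrowband far-field model with unknown amplitude; the array size, SNR, and all other values are illustrative, and this is not the paper's exact derivation.

```python
import numpy as np

# Numeric single-source deterministic CRB for DOA with a uniform circular
# array (simplified: narrowband, far field, unknown complex amplitude).
M, wavelength, r = 8, 1.0, 0.5            # sensors, wavelength, UCA radius
phi = 2 * np.pi * np.arange(M) / M         # sensor angular positions

def steering(theta):
    return np.exp(1j * 2 * np.pi * r / wavelength * np.cos(theta - phi))

def crb_deg(theta, snr_db=10.0, n_snap=100, dtheta=1e-6):
    a = steering(theta)
    da = (steering(theta + dtheta) - a) / dtheta        # numerical derivative
    P_perp = np.eye(M) - np.outer(a, a.conj()) / (a.conj() @ a)
    snr = 10 ** (snr_db / 10)
    # Signal-dependent factor (2 * snapshots * SNR) times geometry factor
    fim = 2 * n_snap * snr * np.real(da.conj() @ P_perp @ da)
    return np.degrees(np.sqrt(1.0 / fim))               # sqrt-CRB in degrees

# The sqrt-CRB sets a DOA "beamwidth" scale that can be used to choose the
# angular sampling grid of an ML search, as the abstract describes.
print(f"CRB std at 10 dB SNR: {crb_deg(np.radians(30)):.3f} deg")
```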
ERIC Educational Resources Information Center
Oliver, Rhonda; Grote, Ellen; Rochecouste, Judith; Exell, Michael
2013-01-01
While needs analyses underpin the design of second language analytic syllabi, the methodologies undertaken are rarely examined. This paper explores the value of multiple data sources and collection methods for developing a needs analysis model to enable vocational education and training teachers to address the needs of Australian Aboriginal…
Mark Spencer; Kevin O' Hara
2006-01-01
Phytophthora ramorum is a major source of tanoak (Lithocarpus densiflorus) mortality in the tanoak/redwood (Sequoia sempervirens) forests of central California. This study presents a spatial analysis of the spread of the disease using second-order point pattern and GIS analyses. Our data set includes four plots...
Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection
NASA Astrophysics Data System (ADS)
Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki
Web security has become a pressing concern in Internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to cover. Conventional methods limit their focus to the physical source code rather than the abstraction of its semantics; as a result, they miss new types of vulnerability and expose businesses to tremendous loss.
UNCLASSIFIED TPBAR RELEASES, INCLUDING TRITIUM TTQP-1-091 Rev 14
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruel, Robert L.; Love, Edward F.; Thornhill, Cheryl K.
This document provides a listing of unclassified tritium release values that should be assumed for unclassified analysis. Much of the information is brought forth from the related documents listed in Section 5.0 to provide a single-source listing of unclassified release values. This information has been updated based on current design analysis and available experimental data.
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, a need has arisen to incorporate noise considerations into the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include: single point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
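The response-surface idea can be sketched in a few lines: fit a quadratic polynomial surrogate to a handful of (expensive) noise-model evaluations, then sweep the surrogate over a dense grid for trade-space exploration. The "expensive" model below is a made-up stand-in, not the BWB noise model, and the design variables are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical design variables, e.g., two normalized source noise levels;
# the function below stands in for a full certification-noise code.
X = rng.uniform(-1, 1, size=(60, 2))
def expensive_noise_model(x):
    return 100 + 3*x[0] + 5*x[1] + 2*x[0]*x[1] + 4*x[1]**2  # EPNdB-like

y = np.apply_along_axis(expensive_noise_model, 1, X)

# Fit the quadratic response surface (polynomial surrogate)
poly = PolynomialFeatures(degree=2)
rs = LinearRegression().fit(poly.fit_transform(X), y)

# Rapid trade-space exploration on a dense grid (cheap surrogate evaluations)
g = np.linspace(-1, 1, 101)
grid = np.array(np.meshgrid(g, g)).reshape(2, -1).T
pred = rs.predict(poly.transform(grid))
print("surrogate minimum:", grid[pred.argmin()], pred.min())
```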
a Cognitive Approach to Teaching a Graduate-Level Geobia Course
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel A.
2016-06-01
Remote sensing image analysis training occurs both in the classroom and in the research lab. Education in the classroom for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for the development of an open-source repository for course materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel
Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility tests of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
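A counter-based dynamic load-balancing scheme in its simplest form: workers atomically increment a shared counter to claim the next contingency case, so faster workers naturally take more cases instead of idling behind a static partition. This toy multiprocessing sketch, with a stand-in for the power-flow solve, illustrates the pattern only; it is not the paper's HPC implementation.

```python
import multiprocessing as mp

def solve_contingency(idx):
    # Stand-in for a power-flow solve of contingency case `idx`.
    return sum(i * i for i in range(10000 + 1000 * (idx % 7)))

def worker(counter, lock, n_cases, results):
    while True:
        with lock:                      # atomically claim the next case
            idx = counter.value
            if idx >= n_cases:
                return
            counter.value += 1
        results.put((idx, solve_contingency(idx)))

if __name__ == "__main__":
    n_cases, n_workers = 100, 4
    counter = mp.Value("i", 0)          # shared case counter
    lock = mp.Lock()
    results = mp.Queue()
    procs = [mp.Process(target=worker, args=(counter, lock, n_cases, results))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    out = [results.get() for _ in range(n_cases)]   # drain before joining
    for p in procs:
        p.join()
```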
Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)
NASA Astrophysics Data System (ADS)
Li, L.; Wu, Y.
2017-12-01
Gravity and magnetic fields are potential fields, which leads to inherent non-uniqueness in their interpretation. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to determine homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis performs a linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope, and intercept. In this calculation, because of the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in that case, even homologous gravity and magnetic anomalies can appear uncorrelated in the linear regression. The normalized source strength (NSS) can be computed from the magnetic gradient tensor and is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity gradient tensor can be transformed, under the homologous-source condition, into a pseudomagnetic gradient tensor for magnetization along the geomagnetic field direction. The NSS of the pseudomagnetic tensor and of the original magnetic tensor are calculated and a linear regression analysis is carried out. The resulting correlation coefficient, slope, and intercept indicate the degree of homology, the Poisson's ratio, and the contribution of remanence, respectively. We test the approach on a synthetic model under complex magnetization; the results show that it can still identify a common source under strong remanence and establish the Poisson's ratio. Finally, the approach is applied to field data from China; the results demonstrate that it is feasible.
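A sketch of the NSS computation at the heart of the method. The eigenvalue form below follows the NSS definition commonly attributed to Beiki et al. and is treated here as an assumption, as is the schematic regression step at the end; none of this is the authors' own code.

```python
import numpy as np

def nss(tensor):
    """Normalized source strength from a 3x3 magnetic gradient tensor.

    Uses mu = sqrt(-lam2^2 - lam1*lam3) with eigenvalues sorted
    lam1 >= lam2 >= lam3 (form following Beiki et al.; an assumption here).
    The quantity is insensitive to the remanent magnetization direction,
    which is the property the abstract exploits.
    """
    lam = np.sort(np.linalg.eigvalsh(tensor))[::-1]
    return np.sqrt(max(-lam[1] ** 2 - lam[0] * lam[2], 0.0))

# Combined-analysis step (schematic): compute NSS grids from the measured
# magnetic tensor and from the pseudomagnetic tensor derived from gravity
# via Poisson's relation, then regress one on the other; the slope relates
# to the Poisson's ratio and a high correlation suggests a common source.
# slope, intercept = np.polyfit(nss_pseudo.ravel(), nss_mag.ravel(), 1)
# corr = np.corrcoef(nss_pseudo.ravel(), nss_mag.ravel())[0, 1]
```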
Wang, Anliang; Yan, Xiaolong; Wei, Zhijun
2018-04-27
This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution concentrates on facilitating the extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. Contact: wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
Information Communication using Knowledge Engine on Flood Issues
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2012-04-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and little general science background. To improve effective communication with such an audience, we have introduced a new way in IFIS to get information on flood-related issues: instead of navigating among hundreds of features and interfaces of the information system and web-based sources, users receive dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauge data, as well as in-house data sources and analysis and visualization tools, to answer questions grouped into several categories. Users can provide input for queries within the categories of rainfall, flood conditions, forecasts, inundation maps, flood risk, and data sensors. Our goal is to systematize knowledge on flood-related issues and to provide a single source of definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone and to provide an educational geoinformatics tool. Future implementations of the system will accept free-form input and offer voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses the future plans for providing knowledge on flood-related issues and resources.
Leveraging biodiversity knowledge for potential phyto-therapeutic applications
Sharma, Vivekanand; Sarkar, Indra Neil
2013-01-01
Objective: To identify and highlight the feasibility, challenges, and advantages of providing a cross-domain pipeline that can link relevant biodiversity information for phyto-therapeutic assessment. Materials and methods: A public repository of clinical trials information (ClinicalTrials.gov) was explored to determine the state of plant-based interventions under investigation. Results: The results showed that ∼15% of drug interventions in ClinicalTrials.gov were potentially plant related, with about 60% of them clustered within 10 taxonomic families. Further analysis of these plant-based interventions identified ∼3.7% of associated plant species as endangered as determined from the International Union for the Conservation of Nature Red List. Discussion: The diversity of the plant kingdom has provided human civilization with life-sustaining food and medicine for centuries. There has been renewed interest in the investigation of botanicals as sources of new drugs, building on traditional knowledge about plant-based medicines. However, data about the plant-based biodiversity potential for therapeutics (e.g., based on genetic or chemical information) are generally scattered across a range of sources and isolated from contemporary pharmacological resources. This study explored the potential to bridge biodiversity and biomedical knowledge sources. Conclusions: The findings from this feasibility study suggest that there is an opportunity for developing plant-based drugs and further highlight taxonomic relationships between plants that may be rich sources for bioprospecting. PMID:23518859
Mirabelli, Mario F; Zenobi, Renato
2018-04-17
A novel capillary ionization source based on atmospheric pressure photoionization (cAPPI) was developed and used for the direct interfacing between solid-phase microextraction (SPME) and mass spectrometry (MS). The efficiency of the source was evaluated for direct and dopant-assisted photoionization, analyzing both polar (e.g., triazines and organophosphorus pesticides) and nonpolar (polycyclic aromatic hydrocarbons, PAHs) compounds. The results show that the range of compound polarity, which can be addressed by direct SPME-MS can be substantially extended by using cAPPI, compared to other sensitive techniques like direct analysis in real time (DART) and dielectric barrier discharge ionization (DBDI). The new source delivers a very high sensitivity, down to sub parts-per-trillion (ppt), making it a viable alternative when compared to previously reported and less comprehensive direct approaches.
Klein, Max; Sharma, Rati; Bohrer, Chris H; Avelis, Cameron M; Roberts, Elijah
2017-01-15
Data-parallel programming techniques can dramatically decrease the time needed to analyze large datasets. While these methods have provided significant improvements for sequencing-based analyses, other areas of biological informatics have not yet adopted them. Here, we introduce Biospark, a new framework for performing data-parallel analysis on large numerical datasets. Biospark builds upon the open source Hadoop and Spark projects, bringing domain-specific features for biology. Source code is licensed under the Apache 2.0 open source license and is available at the project website: https://www.assembla.com/spaces/roberts-lab-public/wiki/Biospark Contact: eroberts@jhu.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
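Biospark's underlying pattern, distributing numerical records, mapping a per-record analysis, and reducing to aggregate statistics, can be shown with plain PySpark. This is not the Biospark API itself; the application name and the stand-in computation are illustrative.

```python
# Generic PySpark sketch of the data-parallel pattern Biospark builds on.
from pyspark.sql import SparkSession
import numpy as np

spark = SparkSession.builder.appName("numeric-demo").getOrCreate()
sc = spark.sparkContext

# Hypothetical dataset: many numerical records (e.g., per-trajectory values)
records = sc.parallelize(range(100000), numSlices=64)

# map: per-record numerical analysis; reduce: aggregate sufficient statistics
stats = (records
         .map(lambda i: np.sin(0.001 * i) ** 2)        # stand-in computation
         .map(lambda v: (v, v * v, 1))                 # (sum, sum of squares, count)
         .reduce(lambda a, b: tuple(x + y for x, y in zip(a, b))))

s, s2, n = stats
mean = s / n
std = (s2 / n - mean ** 2) ** 0.5
print(f"n={n}, mean={mean:.4f}, std={std:.4f}")
spark.stop()
```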
Noise analysis for CCD-based ultraviolet and visible spectrophotometry.
Davenport, John J; Hodgkinson, Jane; Saffell, John R; Tatam, Ralph P
2015-09-20
We present the results of a detailed analysis of the noise behavior of two CCD spectrometers in common use, an AvaSpec-3648 CCD UV spectrometer and an Ocean Optics S2000 Vis spectrometer. Light sources used include a deuterium UV/Vis lamp and UV and visible LEDs. Common noise phenomena include source fluctuation noise, photoresponse nonuniformity, dark current noise, fixed pattern noise, and read noise. These were identified and characterized by varying light source, spectrometer settings, or temperature. A number of noise-limiting techniques are proposed, demonstrating a best-case spectroscopic noise equivalent absorbance of 3.5×10^-4 AU for the AvaSpec-3648 and 5.6×10^-4 AU for the Ocean Optics S2000 over a 30 s integration period. These techniques can be used on other CCD spectrometers to optimize performance.
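Noise equivalent absorbance, the figure of merit quoted above, can be estimated from repeated blank readings at a single wavelength as the standard deviation of the implied absorbances. A minimal sketch with synthetic, source-fluctuation-like readings follows; all numbers are assumed for illustration.

```python
import numpy as np

def noise_equivalent_absorbance(intensities):
    """NEA from repeated blank readings at one wavelength (illustrative).

    intensities: 1-D array of repeated intensity readings with no analyte.
    Each reading is converted to absorbance against the mean intensity;
    the standard deviation of those absorbances is the NEA in AU.
    """
    i0 = intensities.mean()
    absorbance = -np.log10(intensities / i0)
    return absorbance.std(ddof=1)

# Co-adding N spectra should reduce source-fluctuation-limited noise roughly
# as 1/sqrt(N), one of the noise-limiting strategies such measurements motivate.
rng = np.random.default_rng(0)
readings = 50000 * (1 + 0.001 * rng.standard_normal(200))  # hypothetical counts
print(f"NEA ~ {noise_equivalent_absorbance(readings):.1e} AU")
```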
Assessment of statistical methods used in library-based approaches to microbial source tracking.
Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D
2003-12-01
Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
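The threshold criterion the study examines, excluding isolates whose best match is too weak, can be sketched with any probabilistic classifier; here a discriminant classifier from scikit-learn with a hypothetical posterior-probability cutoff. The feature matrices, labels, and threshold value are placeholders, and this is one plausible reading of the procedure rather than the study's exact implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def classify_with_threshold(clf, X, threshold=0.7):
    """Predict source labels, returning -1 for isolates whose maximum
    posterior probability falls below the threshold (left unclassified)."""
    proba = clf.predict_proba(X)
    labels = np.asarray(clf.classes_[proba.argmax(axis=1)], dtype=object)
    labels[proba.max(axis=1) < threshold] = -1
    return labels

# X_train: fingerprint features (e.g., rep-PCR band intensities or ARA
# resistance profiles); y_train: known fecal source (human, gull, cow, dog).
# clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
# predictions = classify_with_threshold(clf, X_unknown, threshold=0.7)
```

Raising the threshold trades coverage for precision: more isolates go unclassified, but, as the study notes, the gain in correct-classification rate may come with a substantial reduction in effective sample size.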