Floating-point scaling technique for sources separation automatic gain control
NASA Astrophysics Data System (ADS)
Fermas, A.; Belouchrani, A.; Ait-Mohamed, O.
2012-07-01
Based on the floating-point representation and taking advantage of the scaling-factor indetermination in blind source separation (BSS) processing, we propose a scaling technique applied to the separation matrix to avoid saturation or weakness in the recovered source signals. This technique performs automatic gain control in an on-line BSS environment. We demonstrate its effectiveness using the implementation of a division-free BSS algorithm with two inputs and two outputs. The proposed technique is computationally cheaper and more efficient for hardware implementation than Euclidean normalisation.
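The division-free idea can be sketched in a few lines (a hypothetical illustration, not the authors' implementation): since BSS is indeterminate up to a per-row scale, each row of the separation matrix can be rescaled by a power of two, which in floating point is a pure exponent shift and needs no division, so that its peak coefficient stays in a safe range.

```python
import math

def scale_rows_pow2(W):
    """Rescale each row of a separation matrix by a power of two so that its
    largest coefficient lies in [0.5, 1.0), avoiding saturation or underflow
    in the recovered sources. Division-free: only the exponent changes."""
    scaled = []
    for row in W:
        peak = max(abs(w) for w in row)
        # frexp returns (m, e) with peak == m * 2**e and 0.5 <= m < 1
        _, e = math.frexp(peak)
        factor = 2.0 ** (-e)     # power-of-two gain: an exponent shift in floating point
        scaled.append([w * factor for w in row])
    return scaled

W = [[0.03, 0.2], [12.0, -3.0]]   # hypothetical 2x2 separation matrix
Ws = scale_rows_pow2(W)
```

Because the scale of each separated source is arbitrary in BSS, this per-row gain changes nothing about the separation itself, only the numeric range of the outputs.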
Disturbance Source Separation in Shear Flows Using Blind Source Separation Methods
NASA Astrophysics Data System (ADS)
Gluzman, Igal; Cohen, Jacob; Oshman, Yaakov
2017-11-01
A novel approach is presented for identifying disturbance sources in wall-bounded shear flows. The method can prove useful for active control of boundary layer transition from laminar to turbulent flow. The underlying idea is to consider the flow state, as measured in sensors, to be a mixture of sources, and to use Blind Source Separation (BSS) techniques to recover the separate sources and their unknown mixing process. We present a BSS method based on the Degenerate Unmixing Estimation Technique. This method can be used to identify any (a priori unknown) number of sources by using the data acquired by only two sensors. The power of the new method is demonstrated via numerical and experimental proofs of concept. Wind tunnel experiments involving boundary layer flow over a flat plate were carried out, in which two hot-wire anemometers were used to separate disturbances generated by disturbance generators such as a single dielectric barrier discharge plasma actuator and a loudspeaker.
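A minimal DUET-style sketch follows (illustrative only: the function name, STFT parameters, and the crude two-cluster step are assumptions, and real recordings would need the full attenuation-delay histogram of the method). Two microphone channels are mapped to the time-frequency plane, each high-energy bin is labeled by its relative-delay estimate, and binary masks recover the individual sources.

```python
import numpy as np
from scipy.signal import stft, istft

def duet_two_mic(x1, x2, fs, nperseg=256):
    """Sketch of DUET for two anechoic mixtures of two sources: cluster
    time-frequency bins by the relative delay between the microphones, then
    apply binary masks to the first channel's STFT."""
    f, _, X1 = stft(x1, fs=fs, nperseg=nperseg)
    _, _, X2 = stft(x2, fs=fs, nperseg=nperseg)
    ratio = (X2 + 1e-12) / (X1 + 1e-12)
    w = 2 * np.pi * f                         # angular frequency; f[0] == 0
    delay = np.zeros(X1.shape)
    delay[1:, :] = -np.angle(ratio[1:, :]) / w[1:, None]  # per-bin delay (s)
    power = np.abs(X1) ** 2
    keep = power > 0.01 * power.max()         # only confident, high-energy bins
    d = delay[keep]
    centers = np.array([d.min(), d.max()])    # crude two-cluster initialization
    for _ in range(10):                       # a few k-means sweeps
        lab = np.abs(d[:, None] - centers[None, :]).argmin(axis=1)
        centers = np.array([d[lab == k].mean() for k in (0, 1)])
    full_lab = np.abs(delay[..., None] - centers).argmin(axis=-1)
    out = []
    for k in (0, 1):
        mask = keep & (full_lab == k)         # binary time-frequency mask
        _, sk = istft(X1 * mask, fs=fs, nperseg=nperseg)
        out.append(sk)
    return out
```

The key DUET assumption is W-disjoint orthogonality: each time-frequency bin is dominated by at most one source, so a two-sensor recording suffices for any number of sources.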
Active room compensation for sound reinforcement using sound field separation techniques.
Heuchel, Franz M; Fernandez-Grande, Efren; Agerkvist, Finn T; Shabalina, Elena
2018-03-01
This work investigates how the sound field created by a sound reinforcement system can be controlled at low frequencies. An indoor control method is proposed which actively absorbs the sound incident on a reflecting boundary using an array of secondary sources. The sound field is separated into incident and reflected components by a microphone array close to the secondary sources, enabling the minimization of the reflected components by means of optimal signals for the secondary sources. The method is purely feed-forward and assumes constant room conditions. Three different sound field separation techniques for modeling the reflections are investigated, based on plane wave decomposition, equivalent sources, and the spatial Fourier transform. Simulations and an experimental validation are presented, showing that the control method performs similarly well at enhancing low-frequency responses with all three sound separation techniques. Resonances in the entire room are reduced, even though the microphone array and secondary sources are confined to a small region close to the reflecting wall. Unlike previous control methods based on the creation of a plane wave sound field, the investigated method works for arbitrary room geometries and primary source positions.
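The plane-wave flavor of the separation step can be sketched for an idealized 1-D field (hypothetical, noiseless fit; the paper's array processing is more general): the measured pressures are fit by an incident and a reflected plane wave, and the reflected amplitude is what the secondary sources would then cancel.

```python
import numpy as np

def split_incident_reflected(p, x, k):
    """Separate a 1-D harmonic sound field sampled at microphone positions x
    into incident and reflected plane-wave components via a least-squares fit
    of p(x) = a*exp(-1j*k*x) + b*exp(+1j*k*x)."""
    H = np.column_stack([np.exp(-1j * k * x), np.exp(1j * k * x)])
    (a, b), *_ = np.linalg.lstsq(H, p, rcond=None)
    return a, b                      # incident and reflected complex amplitudes

k = 2 * np.pi * 50 / 343.0           # wavenumber at 50 Hz, c = 343 m/s
x = np.linspace(0.0, 0.5, 8)         # mic line near the reflecting wall (m)
a_true, b_true = 1.0, 0.6 * np.exp(1j * 0.3)   # invented amplitudes
p = a_true * np.exp(-1j * k * x) + b_true * np.exp(1j * k * x)
a, b = split_incident_reflected(p, x, k)
```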
Warmerdam, G; Vullings, R; Van Pul, C; Andriessen, P; Oei, S G; Wijn, P
2013-01-01
Non-invasive fetal electrocardiography (ECG) can be used for prolonged monitoring of the fetal heart rate (FHR). However, the signal-to-noise ratio (SNR) of non-invasive ECG recordings is often insufficient for reliable detection of the FHR. To overcome this problem, source separation techniques can be used to enhance the fetal ECG. This study uses a physiology-based source separation (PBSS) technique that has already been demonstrated to outperform widely used blind source separation techniques. Despite the relatively good performance of PBSS in enhancing the fetal ECG, it remains susceptible to artifacts. In this study, an augmented PBSS technique is developed to reduce the influence of artifacts. The performance of the developed method is compared to PBSS on multi-channel non-invasive fetal ECG recordings. Based on this comparison, the developed method is shown to outperform PBSS for enhancement of the fetal ECG.
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent Component Analysis (ICA), one of the blind source separation methods, can extract unknown source signals from received signals alone by finding the statistical independence of signal mixtures; it has been applied successfully in many fields, such as medical science and image processing. Nevertheless, inherent problems have been reported with this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification in complex structures. In this study, a simple iterative algorithm based on conventional ICA is proposed to mitigate these problems. To extract more stable source signals in a valid order, the proposed method iteratively reorders the extracted mixing matrix and reconstructs the finally converged source signals, guided by the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. To review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, to investigate the applicability of the proposed method to real problems in complex structures, an experiment was carried out on a scaled submarine mock-up. The results show that the proposed method resolves the inherent problems of a conventional ICA technique.
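The reordering step can be sketched as follows (a hypothetical greedy matcher, not the authors' exact procedure): each separated component is assigned to the reference measurement with which its correlation coefficient has the largest magnitude, fixing ICA's arbitrary permutation.

```python
import numpy as np

def reorder_by_reference(S_est, S_ref):
    """Reorder separated components (rows of S_est) so that row k best matches
    reference signal k, using greedy assignment on |correlation coefficient|.
    Sign and scale ambiguities of ICA are left untouched."""
    n = S_est.shape[0]
    # |corr| between every estimate (rows) and every reference (columns)
    C = np.abs(np.corrcoef(np.vstack([S_est, S_ref]))[:n, n:])
    order = np.full(n, -1)
    for _ in range(n):
        i, j = np.unravel_index(np.argmax(C), C.shape)  # strongest remaining pair
        order[j] = i
        C[i, :] = -1      # remove this estimate from further matching
        C[:, j] = -1      # remove this reference from further matching
    return S_est[order]
```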
Rifai Chai; Naik, Ganesh R; Tran, Yvonne; Sai Ho Ling; Craig, Ashley; Nguyen, Hung T
2015-08-01
An electroencephalography (EEG)-based countermeasure device could be used for fatigue detection during driving. This paper explores the classification of fatigue and alert states using power spectral density (PSD) as a feature extractor and a fuzzy swarm-based artificial neural network (ANN) as a classifier. Independent component analysis by entropy rate bound minimization (ICA-ERBM) is investigated as a novel source separation technique for fatigue classification using EEG analysis. A comparison of the classification accuracy with and without the source separator is presented. Classification based on 43 participants without the source separator resulted in an overall sensitivity of 71.67%, a specificity of 75.63% and an accuracy of 73.65%. These results improved after the inclusion of a source separator module, giving an overall sensitivity of 78.16%, a specificity of 79.60% and an accuracy of 78.88% (p < 0.05).
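The PSD feature-extraction stage might look like the following sketch (band edges, window length, and function names are assumptions; the paper's exact features are not specified here):

```python
import numpy as np
from scipy.signal import welch

def eeg_band_powers(x, fs):
    """Hypothetical PSD feature extractor: mean Welch power in the classical
    EEG bands (delta, theta, alpha, beta) of a single-channel recording."""
    f, Pxx = welch(x, fs, nperseg=min(len(x), int(2 * fs)))
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: Pxx[(f >= lo) & (f < hi)].mean() for name, (lo, hi) in bands.items()}
```

The resulting band-power vector (one per channel and epoch) would then feed the classifier.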
Review of chemical separation techniques applicable to alpha spectrometric measurements
NASA Astrophysics Data System (ADS)
de Regge, P.; Boden, R.
1984-06-01
Prior to alpha-spectrometric measurements, several chemical manipulations are usually required to obtain alpha-emitting sources with the desired radiochemical and chemical purity. These include sampling, dissolution or leaching of the elements of interest, conditioning of the solution, chemical separation, and preparation of the alpha-emitting source. The choice of a particular method depends on various criteria but always involves the selectivity or the quantitative nature of the separations. The availability of suitable tracers and spikes and of modern high-resolution instruments has led to the widespread application of isotope dilution techniques to the problems associated with quantitative chemical separations. This has spurred the development of highly selective methods and reagents, leading to important simplifications in separation schemes. The chemical separation methods commonly used in connection with alpha-spectrometric measurements involve precipitation with selected scavenger elements, solvent extraction, ion exchange, and electrodeposition, or any combination of them. Depending on the purpose of the final measurement and the type of sample available, the chemical separation methods have to be adapted to the particular needs of environmental monitoring, nuclear chemistry and metrology, safeguards and safety, waste management, and the nuclear fuel cycle. Against the background of separation methods available in the literature, the present paper highlights current developments and trends in the chemical techniques applicable to alpha spectrometry.
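The isotope-dilution principle mentioned above reduces to a simple ratio when tracer and analyte share the same chemical recovery and detection efficiency, which is why it tolerates non-quantitative separations (the numbers below are invented for illustration):

```python
def isotope_dilution_activity(a_tracer_bq, counts_analyte, counts_tracer):
    """Isotope-dilution estimate: the analyte activity follows from the ratio of
    net alpha-peak counts to those of a tracer of known activity, because losses
    in the chemical separation affect both isotopes identically."""
    return a_tracer_bq * counts_analyte / counts_tracer

# e.g. 0.1 Bq of tracer added; 1530 analyte counts vs 2040 tracer counts
a = isotope_dilution_activity(0.1, 1530, 2040)   # -> 0.075 Bq
```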
Source Separation and Composting of Organic Municipal Solid Waste.
ERIC Educational Resources Information Center
Gould, Mark; And Others
1992-01-01
Describes a variety of composting techniques that may be utilized in a municipal level solid waste management program. Suggests how composting system designers should determine the amount and type of organics in the waste stream, evaluate separation approaches and assess collection techniques. Outlines the advantages of mixed waste composting and…
Full-Scale Turbofan Engine Noise-Source Separation Using a Four-Signal Method
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.; Arechiga, Rene O.
2016-01-01
Contributions from the combustor to the overall propulsion noise of civilian transport aircraft are becoming important due to turbofan design trends and expected advances in the mitigation of other noise sources. During on-ground, static-engine acoustic tests, combustor noise is generally sub-dominant to other engine noise sources because of the absence of in-flight effects. Consequently, noise-source separation techniques are needed to extract combustor-noise information from the total noise signature in order to make further progress. A novel four-signal source-separation method is applied to data from a static, full-scale engine test and compared to previous methods. The new method is, in a sense, a combination of two- and three-signal techniques and represents an attempt to alleviate some of the weaknesses of each of those approaches. This work is supported by the NASA Advanced Air Vehicles Program, Advanced Air Transport Technology Project, Aircraft Noise Reduction Subproject and the NASA Glenn Faculty Fellowship Program.
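One constituent of the approach, the classical three-signal coherence technique, can be sketched as follows (an illustration of the general principle only, not the paper's four-signal method): with three sensors observing a common source plus mutually uncorrelated noise, the source auto-spectrum follows from the three cross-spectra.

```python
import numpy as np
from scipy.signal import csd

def three_signal_coherent_power(x1, x2, x3, fs, nperseg=1024):
    """Three-signal coherence estimate of the auto-spectrum of the signal common
    to three sensors. If x_i = s + n_i with mutually uncorrelated noises n_i,
    every cross-spectrum equals G_ss, so G_ss ~ |G12|*|G13|/|G23| at sensor 1."""
    f, G12 = csd(x1, x2, fs=fs, nperseg=nperseg)
    _, G13 = csd(x1, x3, fs=fs, nperseg=nperseg)
    _, G23 = csd(x2, x3, fs=fs, nperseg=nperseg)
    return f, np.abs(G12) * np.abs(G13) / (np.abs(G23) + 1e-20)
```

The estimate rejects the per-channel noise, which is exactly what makes sub-dominant combustor noise recoverable from a total-noise signature.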
NASA Technical Reports Server (NTRS)
Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.
1992-01-01
A technique for synchronizing a pulsed light source to charge-coupled-device (CCD) cameras is presented. The technique permits the use of pulsed light sources for continuous as well as stop-action flow visualization. It has eliminated the need to provide separate lighting systems at facilities requiring both continuous and stop-action viewing or photography.
Ardila-Rey, Jorge Alfredo; Rojas-Moreno, Mónica Victoria; Martínez-Tarifa, Juan Manuel; Robles, Guillermo
2014-02-19
Partial discharge (PD) detection is a standardized technique to qualify electrical insulation in machines and power cables. Several techniques that analyze the waveform of the pulses have been proposed to discriminate noise from PD activity. Among them, spectral power ratio representation shows great flexibility in the separation of the sources of PD. Mapping spectral power ratios in two-dimensional plots leads to clusters of points which group pulses with similar characteristics. The position in the map depends on the nature of the partial discharge, the setup and the frequency response of the sensors. If these clusters are clearly separated, the subsequent task of identifying the source of the discharge is straightforward so the distance between clusters can be a figure of merit to suggest the best option for PD recognition. In this paper, two inductive sensors with different frequency responses to pulsed signals, a high frequency current transformer and an inductive loop sensor, are analyzed to test their performance in detecting and separating the sources of partial discharges.
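The spectral power ratio map can be sketched as follows (the split frequency, pulse models, and two-band reduction are assumptions; the paper's ratios may use other band definitions):

```python
import numpy as np

def spectral_power_ratios(pulses, fs, f_split):
    """Map each acquired pulse to a 2-D point: the fractions of its spectral
    power below and above f_split. Pulses from the same PD source share a
    similar spectrum, so their points cluster together in this plane."""
    pts = []
    for p in pulses:
        spec = np.abs(np.fft.rfft(p)) ** 2
        f = np.fft.rfftfreq(len(p), 1.0 / fs)
        total = spec.sum()
        pts.append((spec[f < f_split].sum() / total,
                    spec[f >= f_split].sum() / total))
    return np.array(pts)
```

Cluster separation in this plane (e.g., inter-centroid distance) is then the figure of merit for choosing between sensors.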
A blind source separation approach for humpback whale song separation.
Zhang, Zhenbin; White, Paul R
2017-04-01
Many marine mammal species are highly social and are frequently encountered in groups or aggregations. When conducting passive acoustic monitoring in such circumstances, recordings commonly contain vocalizations of multiple individuals which overlap in time and frequency. This paper considers the use of blind source separation as a method for processing these recordings to separate the calls of individuals. The example problem considered here is the songs of humpback whales. The high levels of noise and long impulse responses can make source separation in underwater contexts a challenging proposition. The approach presented here is based on time-frequency masking, allied to a noise reduction process. The technique is assessed using simulated and measured data sets, and the results demonstrate the effectiveness of the method for separating humpback whale songs.
Perceptually controlled doping for audio source separation
NASA Astrophysics Data System (ADS)
Mahé, Gaël; Nadalin, Everton Z.; Suyama, Ricardo; Romano, João MT
2014-12-01
The separation of an underdetermined audio mixture can be performed through sparse component analysis (SCA), which relies, however, on the strong hypothesis that the source signals are sparse in some domain. To overcome this difficulty in the case where the original sources are available before the mixing process, informed source separation (ISS) embeds a watermark in the mixture whose information can assist a subsequent separation. Though powerful, this technique is generally specific to a particular mixing setup and may be compromised by an additional bitrate compression stage. Thus, instead of watermarking, we propose a 'doping' method that makes the time-frequency representation of each source sparser while preserving its audio quality. This method is based on an iterative decrease of the distance between the distribution of the signal and a target sparse distribution, under a perceptual constraint. We aim to show that the proposed approach is robust to audio coding and that the use of the sparsified signals improves the source separation in comparison with the original sources. In this work, the analysis is restricted to instantaneous mixtures and focused on voice sources.
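The doping idea can be caricatured with a capped shrinkage loop (a crude stand-in: the paper matches a target sparse distribution under a true psychoacoustic constraint, neither of which is modeled here, and the cap below is just a fixed per-coefficient deviation bound):

```python
import numpy as np

def sparsify_with_cap(coeffs, threshold, max_dev, n_iter=10):
    """Illustrative 'doping' loop: iteratively push small transform coefficients
    toward zero (soft-thresholding) to sharpen sparsity, while capping each
    coefficient's total deviation from its original value so the perturbation
    stays small (a crude stand-in for a perceptual constraint)."""
    c = coeffs.astype(float).copy()
    for _ in range(n_iter):
        shrunk = np.sign(c) * np.maximum(np.abs(c) - threshold / n_iter, 0.0)
        c = np.clip(shrunk, coeffs - max_dev, coeffs + max_dev)  # deviation cap
    return c
```

A sparser time-frequency representation then makes the SCA separation stage better conditioned.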
Liao, Wei; Hua, Xue-Ming; Zhang, Wang; Li, Fang
2014-05-01
In the present paper, the authors calculated the plasma's peak electron temperatures under different heat-source separation distances in laser-pulse GMAW hybrid welding, based on Boltzmann spectrometry. The plasma's peak electron densities under the corresponding conditions were also calculated using the Stark width of the plasma spectrum. Combined with high-speed photography, the effect of heat-source separation distance on electron temperature and electron density was studied. The results show that with increasing heat-source separation distance, the electron temperatures and electron densities of the laser plasma did not change significantly. However, the electron temperatures of the arc plasma decreased, and the electron densities of the arc plasma first increased and then decreased.
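The Boltzmann-plot step can be reproduced in a few lines (standard textbook relation; the line data in the test are invented for illustration): for optically thin lines in local thermodynamic equilibrium, ln(I*lambda/(g*A)) is linear in the upper-level energy with slope -1/(k_B*T).

```python
import numpy as np

K_B_EV = 8.617333262e-5   # Boltzmann constant in eV/K

def boltzmann_plot_temperature(intensity, wavelength_nm, g, A, E_upper_eV):
    """Boltzmann-plot electron temperature: fit ln(I*lambda/(g*A)) against the
    upper-level energies of several lines of the same species; the slope of the
    straight line equals -1/(k_B * T)."""
    y = np.log(intensity * wavelength_nm / (g * A))
    slope, _ = np.polyfit(E_upper_eV, y, 1)
    return -1.0 / (K_B_EV * slope)
```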
Blind source separation by sparse decomposition
NASA Astrophysics Data System (ADS)
Zibulevsky, Michael; Pearlmutter, Barak A.
2000-04-01
The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, e.g., in acoustics, radio, and medical signal processing. We exploit the property of the sources of having a sparse representation in a corresponding signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations when there are equal numbers of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
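For the special case of two mixtures and sources with (nearly) disjoint support, the sparsity assumption has a simple geometric reading (a sketch of the intuition only, not the authors' MAP-derived objectives): mixture samples line up along the columns of the mixing matrix, so clustering sample orientations estimates it.

```python
import numpy as np

def estimate_mixing_from_sparsity(X):
    """Geometric sketch of sparsity-based BSS for two mixtures of two sources:
    if at most one source is active per sample, each mixture sample is a scaled
    column of the mixing matrix, so clustering the sample orientations recovers
    the matrix up to scale and permutation."""
    norms = np.linalg.norm(X, axis=0)
    keep = norms > 0.1 * norms.max()                    # ignore near-silent samples
    ang = np.arctan2(X[1, keep], X[0, keep]) % np.pi    # orientation in [0, pi)
    centers = np.array([ang.min(), ang.max()])          # crude two-cluster init
    for _ in range(10):                                 # a few k-means sweeps
        lab = np.abs(ang[:, None] - centers[None, :]).argmin(axis=1)
        centers = np.array([ang[lab == k].mean() for k in (0, 1)])
    return np.column_stack([[np.cos(c), np.sin(c)] for c in centers])
```

Given the estimated matrix `A_est`, the sources follow from `np.linalg.pinv(A_est) @ X` (up to scale and order).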
Surface acoustical intensity measurements on a diesel engine
NASA Technical Reports Server (NTRS)
Mcgary, M. C.; Crocker, M. J.
1980-01-01
The use of surface intensity measurements as an alternative to the conventional selective-wrapping technique of noise source identification and ranking on diesel engines was investigated. A six-cylinder, in-line, turbocharged, 350-horsepower diesel engine was used. Sound power was measured under anechoic conditions for eight separate parts of the engine at steady-state operating conditions using the conventional technique. Sound power measurements were repeated on five separate parts of the engine using the surface intensity technique at the same steady-state operating conditions. The results were compared by plotting sound power level against frequency and by comparing the noise source rankings for the two methods.
Time-dependent wave splitting and source separation
NASA Astrophysics Data System (ADS)
Grote, Marcus J.; Kray, Marie; Nataf, Frédéric; Assous, Franck
2017-02-01
Starting from classical absorbing boundary conditions, we propose a method for the separation of time-dependent scattered wave fields due to multiple sources or obstacles. In contrast to previous techniques, our method is local in space and time, deterministic, and avoids a priori assumptions on the frequency spectrum of the signal. Numerical examples in two space dimensions illustrate the usefulness of wave splitting for time-dependent scattering problems.
Concentration and separation of biological organisms by ultrafiltration and dielectrophoresis
Simmons, Blake A.; Hill, Vincent R.; Fintschenko, Yolanda; Cummings, Eric B.
2010-10-12
Disclosed is a method for monitoring sources of public water supply for a variety of pathogens by using a combination of ultrafiltration techniques together with dielectrophoretic separation techniques. Because water-borne pathogens, whether present due to "natural" contamination or intentional introduction, would likely be present in drinking water at low concentrations when samples are collected for monitoring or outbreak investigations, an approach is needed to quickly and efficiently concentrate and separate particles such as viruses, bacteria, and parasites in large volumes of water (e.g., 100 L or more) while simultaneously reducing the sample volume to levels sufficient for detecting low concentrations of microbes (e.g., <10 mL). The technique is also designed to screen the separated microbes based on specific conductivity and size.
NASA Technical Reports Server (NTRS)
Boggs, S. E.; Lin, R. P.; Coburn, W.; Feffer, P.; Pelling, R. M.; Schroeder, P.; Slassi-Sennou, S.
1997-01-01
The balloon-borne high-resolution gamma-ray and X-ray germanium spectrometer (HIREGS) was used to observe the Galactic center and two positions along the Galactic plane from Antarctica in January 1995. For this flight, the collimators were configured to measure the Galactic diffuse hard X-ray continuum between 20 and 200 keV by directly measuring the point-source contributions to the wide field-of-view flux for subtraction. The hard X-ray spectra of GX 1+4 and GRO J1655-40 were measured with the diffuse continuum subtracted off. The analysis technique for source separation is discussed, and the preliminary separated spectra for these point sources and the Galactic diffuse emission are presented.
Blind source separation and localization using microphone arrays
NASA Astrophysics Data System (ADS)
Sun, Longji
The blind source separation and localization problem for audio signals is studied using microphone arrays. Pure delay mixtures of source signals typically encountered in outdoor environments are considered. Our proposed approach utilizes the subspace methods, including multiple signal classification (MUSIC) and estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithms, to estimate the directions of arrival (DOAs) of the sources from the collected mixtures. Since audio signals are generally considered broadband, the DOA estimates at frequencies with the large sum of squared amplitude values are combined to obtain the final DOA estimates. Using the estimated DOAs, the corresponding mixing and demixing matrices are computed, and the source signals are recovered using the inverse short time Fourier transform. Subspace methods take advantage of the spatial covariance matrix of the collected mixtures to achieve robustness to noise. While the subspace methods have been studied for localizing radio frequency signals, audio signals have their special properties. For instance, they are nonstationary, naturally broadband and analog. All of these make the separation and localization for the audio signals more challenging. Moreover, our algorithm is essentially equivalent to the beamforming technique, which suppresses the signals in unwanted directions and only recovers the signals in the estimated DOAs. Several crucial issues related to our algorithm and their solutions have been discussed, including source number estimation, spatial aliasing, artifact filtering, different ways of mixture generation, and source coordinate estimation using multiple arrays. Additionally, comprehensive simulations and experiments have been conducted to examine various aspects of the algorithm. 
Unlike the existing blind source separation and localization methods, which are generally time consuming, our algorithm needs signal mixtures of only a short duration and therefore supports real-time implementation.
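The narrowband MUSIC step at a single frequency can be sketched as follows (a standard textbook formulation assuming a half-wavelength uniform linear array, not the authors' full broadband, real-time pipeline): steering vectors orthogonal to the noise subspace of the spatial covariance produce peaks in the pseudo-spectrum at the source DOAs.

```python
import numpy as np

def music_doa(X, n_src, d_over_lambda=0.5):
    """Narrowband MUSIC on a uniform linear array (half-wavelength spacing by
    default). X is the (sensors x snapshots) complex data matrix at one
    frequency bin; returns the n_src estimated DOAs in degrees."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]            # sample spatial covariance
    _, V = np.linalg.eigh(R)                   # eigenvalues in ascending order
    En = V[:, : m - n_src]                     # noise-subspace eigenvectors
    P = En @ En.conj().T
    grid = np.deg2rad(np.arange(-90.0, 90.0, 0.1))
    k = np.arange(m)
    spec = np.empty(grid.size)
    for i, th in enumerate(grid):
        a = np.exp(-2j * np.pi * d_over_lambda * k * np.sin(th))  # steering vector
        spec[i] = 1.0 / np.real(a.conj() @ P @ a)  # MUSIC pseudo-spectrum
    peaks = [i for i in range(1, grid.size - 1)
             if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
    peaks = sorted(peaks, key=lambda i: spec[i], reverse=True)[:n_src]
    return np.rad2deg(grid[np.sort(peaks)])
```

For broadband audio, as the abstract notes, per-frequency estimates would be combined, weighted toward the frequency bins with the largest energy.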
A review on automated sorting of source-separated municipal solid waste for recycling.
Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul
2017-02-01
A crucial prerequisite for recycling, which forms an integral part of municipal solid waste (MSW) management, is the sorting of useful materials from source-separated MSW. Researchers have been exploring automated sorting techniques to improve the overall efficiency of the recycling process. This paper reviews recent advances in the physical processes, sensors, and actuators used, as well as control- and autonomy-related issues, in the area of automated sorting and recycling of source-separated MSW. We believe that this paper will provide a comprehensive overview of the state of the art and will help future system designers in the area. We also present research challenges in the field of automated waste sorting and recycling.
High suspended sediment concentrations (SSCs) from natural and anthropogenic sources are responsible for biological impairments of many streams, rivers, lakes, and estuaries, but techniques to estimate sediment concentrations or loads accurately at the daily temporal resolution a...
Yandayan, T; Geckeler, R D; Aksulu, M; Akgoz, S A; Ozgur, B
2016-05-01
The application of advanced error-separating shearing techniques to the precise calibration of autocollimators with small-angle generators (SAGs) was carried out for the first time. The experimental realization was achieved using the High Precision Small Angle Generator (HPSAG) of TUBITAK UME under classical dimensional metrology laboratory environmental conditions. The standard uncertainty of 5 mas (24.2 nrad) reached by the classical calibration method was improved to 1.38 mas (6.7 nrad). Shearing techniques, which offer a unique opportunity to separate the errors of devices without recourse to any external standard, were first adapted by the Physikalisch-Technische Bundesanstalt (PTB) to the calibration of autocollimators with angle encoders, and demonstrated experimentally in a clean-room environment using the primary angle standard of PTB (WMT 220). The application of the technique to a different type of angle measurement system extends the range of the shearing technique further and reveals other advantages. For example, the angular scales of SAGs are based on linear measurement systems (e.g., capacitive nanosensors for the HPSAG), so SAGs show different systematic errors compared to angle encoders. In addition to the error separation of the HPSAG and the autocollimator, detailed investigations of error sources were carried out. Apart from determining the systematic errors of the capacitive sensor used in the HPSAG, it was also demonstrated that the shearing method offers the unique opportunity to characterize other error sources, such as errors due to temperature drift in long-term measurements. This proves that the shearing technique is a very powerful method for investigating angle measuring systems, for improving them, and for specifying precautions to be taken during measurements.
Research on preventive technologies for bed-separation water hazard in China coal mines
NASA Astrophysics Data System (ADS)
Gui, Herong; Tong, Shijie; Qiu, Weizhong; Lin, Manli
2018-03-01
Bed-separation water is one of the major water hazards in coal mines, and targeted research on preventive technologies is of paramount importance to safe mining. This article studied the restrictive effects of geological and mining factors, such as the lithological properties of the roof strata, coal seam inclination, the water source feeding the bed separations, the roof management method, the dimensions of the mining working face, and the mining progress, on the formation of bed-separation water hazards. The key techniques for preventing bed-separation water-related accidents include interception, diversion, destruction of the buffer layer, and grouting and backfilling. The operation and efficiency of each technique are corroborated with field engineering cases. The results of this study offer a reference for countries with similar mining conditions in research on bed-separation water bursts and hazard control in coal mines.
Discovery of three strongly lensed quasars in the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Williams, P. R.; Agnello, A.; Treu, T.; Abramson, L. E.; Anguita, T.; Apostolovski, Y.; Chen, G. C.-F.; Fassnacht, C. D.; Hsueh, J. W.; Lemaux, B. C.; Motta, V.; Oldham, L.; Rojas, K.; Rusu, C. E.; Shajib, A. J.; Wang, X.
2018-06-01
We present the discovery of three quasar lenses in the Sloan Digital Sky Survey, selected using two novel photometry-based selection techniques. The J0941+0518 system, with two point sources separated by 5.46 arcsec on either side of a galaxy, has source and lens redshifts 1.54 and 0.343. Images of J2257+2349 show two point sources separated by 1.67 arcsec on either side of an E/S0 galaxy. The extracted spectra show two images of the same quasar at zs = 2.10. SDSS J1640+1045 has two quasar spectra at zs = 1.70 and fits to the SDSS and Pan-STARRS images confirm the presence of a galaxy between the two point sources. We observed 56 photometrically selected lens candidates in this follow-up campaign, confirming three new lenses, re-discovering one known lens, and ruling out 36 candidates, with 16 still inconclusive. This initial campaign demonstrates the power of purely photometric selection techniques in finding lensed quasars.
40 CFR 246.202-3 - Recommended procedures: Market study.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... techniques. (b) Directly contacting buyers and determining the buyers' quality specifications, potential...
40 CFR 246.201-4 - Recommended procedures: Market study.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... research techniques. (b) Directly contacting buyers and determining the buyers' quality specifications...
40 CFR 246.202-3 - Recommended procedures: Market study.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... techniques. (b) Directly contacting buyers and determining the buyers' quality specifications, potential...
40 CFR 246.202-3 - Recommended procedures: Market study.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... techniques. (b) Directly contacting buyers and determining the buyers' quality specifications, potential...
40 CFR 246.201-4 - Recommended procedures: Market study.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... research techniques. (b) Directly contacting buyers and determining the buyers' quality specifications...
40 CFR 246.201-4 - Recommended procedures: Market study.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... research techniques. (b) Directly contacting buyers and determining the buyers' quality specifications...
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2016-12-01
Hyperspectral instruments are a growing class of Earth-observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range, and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. These new instruments require novel approaches for processing imagery and separating surface and atmospheric signals. One approach is numerical source separation, which allows the determination of the underlying physical causes of observed signals. Improved source separation will enable hyperspectral imagery to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. We developed an Informed Non-negative Matrix Factorization (INMF) method for separating atmospheric and surface sources. INMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. We also explore methods to produce an initial guess of the spatial separation patterns. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO), with a focus on separating surface and atmospheric signal contributions.
HICO's coastal ocean focus provides a dataset with a wide range of atmospheric conditions, including high and low aerosol optical thickness and cloud cover, with only minor contributions from the ocean surfaces in order to isolate the contributions of the multiple atmospheric sources.
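The INMF method described above is the authors' own; as a rough, hypothetical illustration of the underlying idea only, plain non-negative matrix factorization (Lee-Seung multiplicative updates) can decompose a flattened hyperspectral cube (pixels by bands) into non-negative per-pixel abundances and spectral signatures. The data sizes and the number of sources below are invented for the sketch; the paper's INMF additionally constrains spatial/spectral variability and seeds the factorization with library spectra.

```python
import numpy as np

# Synthetic mixed data: 100 "pixels" x 50 "bands" built from 3 non-negative sources.
rng = np.random.default_rng(0)
H_true = rng.random((3, 50))      # hypothetical source spectra
W_true = rng.random((100, 3))     # hypothetical per-pixel abundances
X = W_true @ H_true               # observed data matrix

# Plain NMF via multiplicative updates: X ~ W @ H with W, H >= 0.
eps = 1e-9
W = rng.random((100, 3))
H = rng.random((3, 50))
for _ in range(1000):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(round(rel_err, 3))  # well below 1: the factorization explains most of X
```

The non-negativity of both factors is what "avoids physically impossible results" in the abstract: neither abundances nor spectra can go negative.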
40 CFR 246.200-3 - Recommended procedures: Market study.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) SOLID WASTES SOURCE SEPARATION FOR MATERIALS RECOVERY GUIDELINES Requirements and Recommended Procedures... techniques; (b) Directly contacting buyers, and determining the buyers' quality specifications, the exact...
Blind source separation problem in GPS time series
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered part of data-driven methods. A widely used technique is principal component analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly because PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source.
We evaluate the ability of the PCA and ICA decomposition techniques to explain the data and to recover the original (known) sources. Using the same number of components, we find that the vbICA method fits the data almost as well as a PCA method, since the χ2 increase is less than 10% of the value calculated using a PCA decomposition. Unlike PCA, the vbICA algorithm is found to correctly separate the sources if the correlation of the dataset is low (<0.67) and the geodetic network is sufficiently dense (ten continuous GPS stations within a box of side equal to two times the locking depth of a fault where an earthquake of Mw >6 occurred). We also provide a cookbook for the use of the vbICA algorithm in analyses of position time series for tectonic and non-tectonic applications.
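The PCA-versus-ICA contrast the abstract describes can be sketched on synthetic mixtures. The example below uses plain FastICA (tanh contrast with deflation) rather than the authors' vbICA, and the two sources, the mixing matrix, and the signal lengths are invented: whitening (essentially what PCA delivers) only decorrelates the channels, while the ICA rotation recovers the independent sources up to order and sign.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t = np.linspace(0, 8, n)
S = np.vstack([np.sign(np.sin(7 * t)),       # source 1: square wave
               (3 * t) % 2 - 1])             # source 2: sawtooth
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                   # unknown mixing matrix
X = A @ S                                    # observed mixtures

# Whitening -- essentially what PCA provides: decorrelated, not independent.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# FastICA fixed-point iteration with deflation.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        u = np.tanh(w @ Z)
        w_new = (Z * u).mean(axis=1) - (1 - u ** 2).mean() * w
        for j in range(i):                   # stay orthogonal to rows already found
            w_new -= (w_new @ W[j]) * W[j]
        w_new /= np.linalg.norm(w_new)
        done = abs(abs(w_new @ w) - 1) < 1e-12
        w = w_new
        if done:
            break
    W[i] = w
S_hat = W @ Z                                # recovered sources (order/sign arbitrary)

C = np.abs(np.corrcoef(np.vstack([S_hat, S]))[:2, 2:])
print(np.round(C, 2))  # each true source matches one estimate with |r| near 1
```

The arbitrary order and sign of `S_hat` is the same indeterminacy noted for ICA throughout this literature; vbICA differs in how it models the source pdfs, not in removing that ambiguity.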
Limitations and applications of ICA for surface electromyogram.
Djuwari, D; Kumar, D K; Naik, G R; Arjunan, S P; Palaniswami, M
2006-09-01
Surface electromyogram (SEMG) has numerous applications, but the presence of artefacts and noise, especially at low levels of muscle activity, makes the recordings unreliable. Spectral and temporal overlap can make the removal of artefacts and noise, or the separation of relevant signals from other bioelectric signals, extremely difficult. Individual muscles may be considered independent at the local level, and this makes an argument for separating the signals using independent component analysis (ICA). In the recent past, owing to the easy availability of ICA tools, a number of researchers have attempted to use ICA for this application. This paper reports research conducted to evaluate the use of ICA for the separation of muscle activity and the removal of artefacts from SEMG. It discusses some of the conditions that could affect the reliability of the separation and evaluates issues related to the properties of the signals and the number of sources. The paper also identifies the lack of a suitable measure of the quality of separation for bioelectric signals, and it recommends and tests a more robust measure of separation. The paper also reports tests using Zibulevsky's technique of temporal plotting to identify the number of independent sources in SEMG recordings. The theoretical analysis and experimental results demonstrate that ICA is suitable for SEMG signals. The results identify the unsuitability of ICA when the number of sources is greater than the number of recording channels. The results also demonstrate the limitations of such applications due to the inability of the system to identify the correct order and magnitude of the signals. The paper determines the suitability of an error measure, computed from the simulated mixing matrix and the estimated unmixing matrix, as a means of identifying the quality of separation of the output.
The work demonstrates that even at extremely low levels of muscle contraction, and with filtering using wavelets and band-pass filters, it is not possible to get the data sparse enough to identify the number of independent sources using Zibulevsky's technique.
NASA Astrophysics Data System (ADS)
Fourel, Loïc; Limare, Angela; Jaupart, Claude; Surducan, Emanoil; Farnetani, Cinzia G.; Kaminski, Edouard C.; Neamtu, Camelia; Surducan, Vasile
2017-08-01
Convective motions in silicate planets are largely driven by internal heat sources and secular cooling. The exact amount and distribution of heat sources in the Earth are poorly constrained and the latter is likely to change with time due to mixing and to the deformation of boundaries that separate different reservoirs. To improve our understanding of planetary-scale convection in these conditions, we have designed a new laboratory setup allowing a large range of heat source distributions. We illustrate the potential of our new technique with a study of an initially stratified fluid involving two layers with different physical properties and internal heat production rates. A modified microwave oven is used to generate a uniform radiation propagating through the fluids. Experimental fluids are solutions of hydroxyethyl cellulose and salt in water, such that salt increases both the density and the volumetric heating rate. We determine temperature and composition fields in 3D with non-invasive techniques. Two fluorescent dyes are used to determine temperature. A Nd:YAG planar laser beam excites fluorescence, and an optical system, involving a beam splitter and a set of colour filters, captures the fluorescence intensity distribution on two separate spectral bands. The ratio between the two intensities provides an instantaneous determination of temperature with an uncertainty of 5% (typically 1K). We quantify mixing processes by precisely tracking the interfaces separating the two fluids. These novel techniques allow new insights on the generation, morphology and evolution of large-scale heterogeneities in the Earth's lower mantle.
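The two-dye temperature measurement described above rests on a simple principle: the ratio of fluorescence intensities on two spectral bands depends on temperature, so a calibration curve inverts the measured ratio to a temperature. The sketch below uses an invented linear calibration purely for illustration; real dye pairs require a measured calibration curve, and the numbers here are not from the experiment.

```python
import numpy as np

# Hypothetical calibration: intensity ratio versus temperature (K).
T_cal = np.array([290.0, 300.0, 310.0, 320.0])
ratio_cal = 1.0 + 0.02 * (T_cal - 290.0)   # assumed linear response

# Fit the inverse map T(ratio) and apply it to a measured ratio.
coef = np.polyfit(ratio_cal, T_cal, 1)
ratio_meas = 1.3
T_est = np.polyval(coef, ratio_meas)
print(round(T_est, 1))  # 305.0
```

Because the two bands are captured simultaneously through the beam splitter, the ratio cancels laser-sheet intensity variations, which is why the technique yields an instantaneous temperature field.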
Turboprop IDEAL: a motion-resistant fat-water separation technique.
Huo, Donglai; Li, Zhiqiang; Aboussouan, Eric; Karis, John P; Pipe, James G
2009-01-01
Suppression of the fat signal in MRI is very important for many clinical applications. Multi-point water-fat separation methods, such as IDEAL (Iterative Decomposition of water and fat with Echo Asymmetry and Least-squares estimation), can robustly separate water and fat signal, but inevitably increase scan time, making separated images more easily affected by patient motions. PROPELLER (Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction) and Turboprop techniques offer an effective approach to correct for motion artifacts. By combining these techniques together, we demonstrate that the new TP-IDEAL method can provide reliable water-fat separation with robust motion correction. The Turboprop sequence was modified to acquire source images, and motion correction algorithms were adjusted to assure the registration between different echo images. Theoretical calculations were performed to predict the optimal shift and spacing of the gradient echoes. Phantom images were acquired, and results were compared with regular FSE-IDEAL. Both T1- and T2-weighted images of the human brain were used to demonstrate the effectiveness of motion correction. TP-IDEAL images were also acquired for pelvis, knee, and foot, showing great potential of this technique for general clinical applications.
Noise-Source Separation Using Internal and Far-Field Sensors for a Full-Scale Turbofan Engine
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.; Miles, Jeffrey H.
2009-01-01
Noise-source separation techniques for the extraction of the sub-dominant combustion noise from the total noise signatures obtained in static-engine tests are described. Three methods are applied to data from a static, full-scale engine test. Both 1/3-octave and narrow-band results are discussed. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP). A new additional phase-angle-based discriminator for the three-signal method is also introduced.
Maksimovic, Svetolik; Tadic, Vanja; Skala, Dejan; Zizovic, Irena
2017-06-01
Helichrysum italicum presents a valuable source of natural bioactive compounds. In this work, a literature review of terpenes, phenolic compounds, and other less common phytochemicals from H. italicum with regard to application of different separation methods is presented. Data including extraction/separation methods and experimental conditions applied, obtained yields, number of identified compounds, content of different compound groups, and analytical techniques applied are shown as corresponding tables. Numerous biological activities of both isolates and individual compounds are emphasized. In addition, the data reported are discussed, and the directions for further investigations are proposed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Detection of Partial Discharge Sources Using UHF Sensors and Blind Signal Separation
Boya, Carlos; Parrado-Hernández, Emilio
2017-01-01
The measurement of the emitted electromagnetic energy in the UHF region of the spectrum allows the detection of partial discharges and, thus, the on-line monitoring of the condition of the insulation of electrical equipment. Unfortunately, determining the affected asset is difficult when there are several simultaneous insulation defects. This paper proposes the use of an independent component analysis (ICA) algorithm to separate the signals coming from different partial discharge (PD) sources. The performance of the algorithm has been tested using UHF signals generated by test objects. The results are validated by two automatic classification techniques: support vector machines and similarity with class mean. Both methods corroborate the suitability of the algorithm to separate the signals emitted by each PD source even when they are generated by the same type of insulation defect. PMID:29140267
Quantitative identification of riverine nitrogen from point, direct runoff and base flow sources.
Huang, Hong; Zhang, Baifa; Lu, Jun
2014-01-01
We present a methodological example for quantifying the contributions of riverine total nitrogen (TN) from point, direct runoff and base flow sources by combining a recursive digital filter technique and statistical methods. First, we separated daily riverine flow into direct runoff and base flow using a recursive digital filter technique; then, a statistical model was established using daily simultaneous data for TN load, direct runoff rate, base flow rate, and temperature; and finally, the TN loading from direct runoff and base flow sources could be inversely estimated. As a case study, this approach was adopted to identify the TN source contributions in Changle River, eastern China. Results showed that, during 2005-2009, the total annual TN input to the river was 1,700.4±250.2 ton, and the contributions of point, direct runoff and base flow sources were 17.8±2.8%, 45.0±3.6%, and 37.2±3.9%, respectively. The innovation of the approach is that the nitrogen from direct runoff and base flow sources could be separately quantified. The approach is simple but detailed enough to take the major factors into account, providing an effective and reliable method for riverine nitrogen loading estimation and source apportionment.
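The first step of the approach above is hydrograph separation with a recursive digital filter. The abstract does not state which filter was used; the sketch below implements the widely used one-parameter Lyne-Hollick form (with a typical filter parameter of 0.925) on invented daily flows, splitting total streamflow into direct runoff and base flow.

```python
import numpy as np

def lyne_hollick(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick form):
    high-pass the flow signal to get quickflow, clip it to [0, q],
    and take base flow as the remainder."""
    quick = np.zeros_like(q, dtype=float)
    for k in range(1, len(q)):
        quick[k] = alpha * quick[k - 1] + 0.5 * (1 + alpha) * (q[k] - q[k - 1])
        quick[k] = min(max(quick[k], 0.0), q[k])   # keep 0 <= quickflow <= q
    return q - quick, quick                        # (base flow, direct runoff)

# Synthetic daily flow: slow seasonal base signal plus sharp storm peaks.
days = np.arange(365)
q = 5 + 2 * np.sin(2 * np.pi * days / 365)
q[[30, 31, 120, 121, 122, 250]] += [20, 8, 15, 25, 10, 30]
base, quick = lyne_hollick(q)
print(round(base.sum() / q.sum(), 2))  # fraction of annual flow assigned to base flow
```

In the paper's scheme, the TN loads attributed to the direct runoff and base flow sources are then regressed against these two separated flow components.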
Ground Truth Events with Source Geometry in Eurasia and the Middle East
2016-06-02
source properties, including seismic moment, corner frequency, radiated energy, and stress drop have been obtained using spectra for S waves following... PARAMETERS Other source parameters, including radiated energy, corner frequency, seismic moment, and static stress drop were calculated using a spectral... technique (Richardson & Jordan, 2002; Andrews, 1986). The process entails separating event and station spectra and median-stacking each event's
System identification through nonstationary data using Time-Frequency Blind Source Separation
NASA Astrophysics Data System (ADS)
Guo, Yanlin; Kareem, Ahsan
2016-06-01
Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, the measured response of buildings and bridges is usually non-stationary due to strong winds (e.g., typhoons and thunderstorms), earthquakes and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency contents and/or overlapping of modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where traditional SI methods often encounter difficulties. This technique can also handle responses with closely spaced modes, which is a well-known challenge for the identification of large-scale structures. Based on the separated modal responses, frequency and damping can be easily identified using SI methods based on a single degree of freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated using several simulation-based studies and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous.
Finally, the performance of the proposed method is evaluated using the full-scale non-stationary response of a tall building during an earthquake and is found to be satisfactory.
Blind source separation of ex-vivo aorta tissue multispectral images
Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson
2015-01-01
Blind source separation (BSS) methods aim at the decomposition of a given signal into its main components or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study. The analysis of skin images for the extraction of melanin and hemoglobin is an example of the use of BSS. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system. The images are processed by means of two algorithms: independent component analysis and non-negative matrix factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also recover the spectral absorbance of the main tissue components. Those spectral signatures were compared against theoretical ones by using correlation coefficients. Those coefficients report values close to 0.9, which is a good indicator of the method's performance. The correlation coefficients also lead to the identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyperspectral systems together with image processing techniques are a potential tool for the analysis of cardiovascular tissue. PMID:26137366
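The matching step described in the abstract, in which each recovered component is assigned to a chromophore by its correlation with a theoretical spectrum, can be sketched as follows. The reference spectra below are invented Gaussians and the "recovered" components are simulated as noisy, reordered copies; only the matching logic is being illustrated.

```python
import numpy as np

rng = np.random.default_rng(2)
wavelengths = np.linspace(450, 700, 60)
# Hypothetical theoretical chromophore spectra (stand-ins for literature curves).
reference = np.vstack([np.exp(-((wavelengths - c) / 40.0) ** 2)
                       for c in (500, 560, 640)])
# Simulated BSS output: the same spectra in unknown order, with noise.
recovered = reference[[2, 0, 1]] + 0.05 * rng.standard_normal((3, 60))

# Match each recovered component to the reference with the highest |r|.
r = np.corrcoef(np.vstack([recovered, reference]))[:3, 3:]   # 3x3 cross-correlations
match = np.argmax(np.abs(r), axis=1)
print(match, np.round(np.abs(r).max(axis=1), 2))
```

The best-match correlations stay near 1 even with noise, which mirrors the reported coefficients close to 0.9 used to label the concentration maps.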
Ball, J.W.; Bassett, R.L.
2000-01-01
A method has been developed for separating the Cr dissolved in natural water from matrix elements and determination of its stable isotope ratios using solid-source thermal-ionization mass spectrometry (TIMS). The separation method takes advantage of the existence of the oxidized form of Cr as an oxyanion to separate it from interfering cations using anion-exchange chromatography, and of the reduced form of Cr as a positively charged ion to separate it from interfering anions such as sulfate. Subsequent processing of the separated sample eliminates residual organic material for application to a solid source filament. Ratios for 53Cr/52Cr for National Institute of Standards and Technology Standard Reference Material 979 can be measured using the silica gel-boric acid technique with a filament-to-filament standard deviation in the mean 53Cr/52Cr ratio for 50 replicates of 0.00005 or less. (C) 2000 Elsevier Science B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2015-12-01
Current challenges in Earth remote sensing require improved instrument spectral resolution, spectral coverage, and radiometric accuracy. Hyperspectral instruments, deployed on both aircraft and spacecraft, are a growing class of Earth observing sensors designed to meet these challenges. They collect large amounts of spectral data, allowing thorough characterization of both atmospheric and surface properties. The higher accuracy and increased spectral and spatial resolutions of new imagers require new numerical approaches for processing imagery and separating surface and atmospheric signals. One potential approach is source separation, which allows us to determine the underlying physical causes of observed changes. Improved signal separation will allow hyperspectral instruments to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. In this work, we investigate a Non-negative Matrix Factorization (NMF) method for the separation of atmospheric and land surface signal sources. NMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which allows the method to be tailored to hyperspectral source separation. We adapt our NMF algorithm to distinguish between contributions from different physically distinct sources by introducing constraints on spectral and spatial variability and by using library spectra to inform separation. We evaluate our NMF algorithm with simulated hyperspectral images as well as hyperspectral imagery from several instruments, including the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), the NASA Hyperspectral Imager for the Coastal Ocean (HICO), and the National Ecological Observatory Network (NEON) Imaging Spectrometer.
A modal separation measurement technique for broadband noise propagating inside circular ducts
NASA Technical Reports Server (NTRS)
Kerschen, E. J.; Johnston, J. P.
1981-01-01
A measurement technique which separates broadband noise propagating inside circular ducts into the acoustic duct modes is developed. The technique is also applicable to discrete frequency noise. The acoustic modes are produced by weighted combinations of the instantaneous outputs of microphones spaced around the duct circumference. The technique is compared with the cross spectral density approach presently available and found to have certain advantages and disadvantages. Considerable simplification of both the new technique and the cross spectral density approach occurs when no correlation exists between different circumferential mode orders. The properties leading to uncorrelated modes and experimental tests which verify this condition are discussed. The modal measurement technique is applied to the case of broadband noise generated by flow through a coaxial obstruction (nozzle or orifice) in a pipe. Different circumferential mode orders are shown to be uncorrelated for this type of noise source.
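Weighting the instantaneous outputs of equally spaced microphones by complex exponentials, as described above, amounts to a spatial discrete Fourier transform around the circumference. The sketch below illustrates this on an invented field containing circumferential modes of order 2 and -1; the microphone count and frequencies are arbitrary.

```python
import numpy as np

M = 8                                    # microphones equally spaced around the duct
theta = 2 * np.pi * np.arange(M) / M
t = np.linspace(0, 1, 512, endpoint=False)

# Synthetic field: mode m = 2 at 50 Hz plus mode m = -1 at 80 Hz.
p = (np.cos(2 * theta)[:, None] * np.sin(2 * np.pi * 50 * t)
     + np.cos(-1 * theta + 0.3)[:, None] * np.sin(2 * np.pi * 80 * t))

# Weighted combination exp(-i m theta_k) / M  ==  spatial DFT over the mic ring.
modes = np.fft.fft(p, axis=0) / M        # row m: complex amplitude of mode m
power = np.mean(np.abs(modes) ** 2, axis=1)
print(np.round(power, 3))                # energy sits at m = +/-1, +/-2 (indices 1, 2, 6, 7)
```

A real cosine mode of order m splits between DFT indices m and M-m; with M microphones, orders up to about M/2 can be resolved before spatial aliasing sets in.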
Efficient image enhancement using sparse source separation in the Retinex theory
NASA Astrophysics Data System (ADS)
Yoon, Jongsu; Choi, Jangwon; Choe, Yoonsik
2017-11-01
Color constancy is the feature of the human vision system (HVS) that ensures the relative constancy of the perceived color of objects under varying illumination conditions. The Retinex theory of machine vision systems is based on the HVS. Among Retinex algorithms, the physics-based algorithms are efficient; however, they generally do not satisfy the local characteristics of the original Retinex theory because they eliminate global illumination from their optimization. We apply the sparse source separation technique to the Retinex theory to present a physics-based algorithm that satisfies the locality characteristic of the original Retinex theory. Previous Retinex algorithms have limited use in image enhancement because the total variation Retinex results in an overly enhanced image and the sparse source separation Retinex cannot completely restore the original image. In contrast, our proposed method preserves the image edge and can very nearly replicate the original image without any special operation.
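The decomposition underlying all Retinex variants treats an observed image as illumination times reflectance, so in the log domain a smooth illumination estimate can be subtracted out. The sketch below is plain single-scale Retinex on a 1-D toy signal, not the authors' sparse source separation formulation; the signal shapes and the smoothing window are invented.

```python
import numpy as np

x = np.arange(1024)
illum = 1.0 + 0.8 * np.sin(np.pi * x / 1024)     # smooth illumination
refl = np.where((x // 32) % 2 == 0, 0.9, 0.3)    # piecewise-constant reflectance
img = illum * refl                               # observed signal

log_img = np.log(img)
kernel = np.ones(65) / 65                        # wider than one reflectance period
log_illum_est = np.convolve(log_img, kernel, mode="same")
log_refl_est = log_img - log_illum_est           # recovered (zero-mean) reflectance

# Compare on the interior to avoid the convolution's zero-padded edges.
inner = slice(65, -65)
corr = np.corrcoef(log_refl_est[inner], np.log(refl)[inner])[0, 1]
print(round(corr, 2))  # near 1: reflectance pattern recovered up to an offset
```

The choice of smoothing scale is exactly where the global-versus-local tension discussed in the abstract appears: a global illumination estimate discards the local contrast that the original Retinex theory is meant to preserve.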
Sensors research and technology
NASA Technical Reports Server (NTRS)
Cutts, James A.
1988-01-01
Information on sensors research and technology is given in viewgraph form. Information is given on sensing techniques for space science, passive remote sensing techniques and applications, submillimeter coherent sensing, submillimeter mixers and local oscillator sources, non-coherent sensors, active remote sensing, solid state laser development, a low vibration cooler, separation of liquid helium and vapor phase in zero gravity, and future plans.
Explosion localization via infrasound.
Szuberla, Curt A L; Olson, John V; Arnoult, Kenneth M
2009-11-01
Two acoustic source localization techniques were applied to infrasonic data and their relative performance was assessed. The standard approach for low-frequency localization uses an ensemble of small arrays to separately estimate far-field source bearings, deriving a solution from the intersection of the various back azimuths. This method was compared to one developed by the authors that treats the smaller subarrays as a single meta-array. In numerical simulation and a field experiment, the latter technique was found to provide improved localization precision everywhere in the vicinity of a 3-km-aperture meta-array, often by an order of magnitude.
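The standard back-azimuth approach mentioned above can be sketched as a least-squares intersection of bearing lines: each array contributes a line through its position along its measured bearing, and the source estimate minimizes the total squared perpendicular distance to those lines. The array positions and the noise-free bearings below are invented; this is a generic triangulation, not the authors' meta-array method.

```python
import numpy as np

src = np.array([2.0, 5.0])                      # true source (to be recovered)
arrays = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
U = src - arrays
U /= np.linalg.norm(U, axis=1, keepdims=True)   # measured bearing unit vectors

# Minimize sum_i ||(I - u_i u_i^T)(x - x_i)||^2  ->  normal equations A x = b.
A = np.zeros((2, 2))
b = np.zeros(2)
for x_i, u in zip(arrays, U):
    P = np.eye(2) - np.outer(u, u)              # projector off the bearing direction
    A += P
    b += P @ x_i
est = np.linalg.solve(A, b)
print(np.round(est, 6))                         # recovers the true source location
```

With noisy bearings the same normal equations give the least-squares point; the meta-array approach instead uses the raw waveforms jointly, which is where its precision advantage comes from.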
The effects of inter-cavity separation on optical coupling in dielectric bispheres.
Ashili, Shashanka P; Astratov, Vasily N; Sykes, E Charles H
2006-10-02
The optical coupling between two size-mismatched spheres was studied by using one sphere as a local source of light with whispering gallery modes (WGMs) and detecting the intensity of the light scattered by a second sphere playing the part of a receiver of electromagnetic energy. We developed techniques to control inter-cavity gap sizes between microspheres with ~30 nm accuracy. We demonstrate high efficiencies (up to 0.2-0.3) of coupling between two separated cavities with strongly detuned eigenstates. At small separations (<1 μm) between the spheres, the mechanism of coupling is interpreted in terms of the Fano resonance between a discrete level (true WGMs excited in a source sphere) and a continuum of "quasi"-WGMs with distorted shape which can be induced in the receiving sphere. At larger separations the spectra detected from the receiving sphere originate from scattering of the radiative modes.
NASA Astrophysics Data System (ADS)
Manicke, Nicholas E.; Belford, Michael
2015-05-01
One limitation in the growing field of ambient or direct analysis methods is reduced selectivity caused by the elimination of chromatographic separations prior to mass spectrometric analysis. We explored the use of high-field asymmetric waveform ion mobility spectrometry (FAIMS), an ambient pressure ion mobility technique, to separate the closely related opiate isomers of morphine, hydromorphone, and norcodeine. These isomers cannot be distinguished by tandem mass spectrometry. Separation prior to MS analysis is, therefore, required to distinguish these compounds, which are important in clinical chemistry and toxicology. FAIMS was coupled to a triple quadrupole mass spectrometer, and ionization was performed using either a pneumatically assisted heated electrospray ionization source (H-ESI) or paper spray, a direct analysis method that has been applied to the direct analysis of dried blood spots and other complex samples. We found that FAIMS was capable of separating the three opiate structural isomers using both H-ESI and paper spray as the ionization source.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravelo Arias, S. I.; Ramírez Muñoz, D.; Cardoso, S.
2015-06-15
The work shows a measurement technique to obtain the correct value of the four elements in a resistive Wheatstone bridge without the need to separate the physical connections existing between them. Two electronic solutions are presented, based on a source-and-measure unit and using discrete electronic components. The proposed technique brings the possibility to know the mismatch or the tolerance between the bridge resistive elements and then to pass or reject the bridge in terms of its related common-mode rejection. Experimental results were taken on various Wheatstone resistive bridges (discrete and magnetoresistive integrated bridges), validating the proposed measurement technique, especially when the bridge is micro-fabricated and there is no physical way to separate one resistive element from the others.
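The basic bridge relations behind the mismatch test are worth stating. The sketch below is illustrative only: it shows how a small mismatch in one arm unbalances the differential output, which is the quantity the pass/reject criterion is built on; it does not reproduce the paper's source-and-measure procedure for extracting the four in-circuit resistances. The component values are invented.

```python
def bridge_output(vexc, r1, r2, r3, r4):
    """Differential output of a Wheatstone bridge excited by vexc:
    arms r1-r2 form one voltage divider, r3-r4 the other."""
    return vexc * (r2 / (r1 + r2) - r4 / (r3 + r4))

# A perfectly matched 350-ohm bridge is balanced; a 1% mismatch in one
# arm produces a static offset at the differential output.
v_bal = bridge_output(5.0, 350.0, 350.0, 350.0, 350.0)
v_mis = bridge_output(5.0, 350.0, 353.5, 350.0, 350.0)
print(v_bal, round(v_mis * 1e3, 2))  # 0.0 12.44  (about 12 mV of offset)
```

The same mismatch that produces this offset also converts common-mode excitation changes into differential signal, which is why the paper relates arm tolerance directly to common-mode rejection.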
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my; Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com; Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my
Highlights: ► The theory of planned behaviour (TPB) has been used to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste separation at home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar study has been reported elsewhere, and this finding will be beneficial to local authorities as an indicator in designing campaigns to promote the use of waste separation programmes and to reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock to downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities.
Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide a useful indicator to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes and communication campaigns which advocate the use of these programmes.
Guo, Hailing; Zhu, Guangshan; Hewitt, Ian J; Qiu, Shilun
2009-02-11
In this communication, copper net supported Cu3(BTC)2 membranes have been successfully synthesized by means of a "twin copper source" technique. Separation studies on gaseous mixtures (H2/CO2, H2/CH4, and H2/N2) using the membrane revealed that the membrane possesses high permeability and selectivity for H2 over CO2, N2, and CH4. Compared with conventional zeolite membranes, the copper net supported Cu3(BTC)2 membrane exhibited high permeation flux in gas separation. Such highly efficient copper net supported Cu3(BTC)2 membranes could be used to separate, recycle, and reuse H2 exhausted from the steam reforming of natural gas.
Karim Ghani, Wan Azlina Wan Ab; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni
2013-05-01
Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock to downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide a useful indicator to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes and communication campaigns which advocate the use of these programmes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Probing interferometric parallax with interplanetary spacecraft
NASA Astrophysics Data System (ADS)
Rodeghiero, G.; Gini, F.; Marchili, N.; Jain, P.; Ralston, J. P.; Dallacasa, D.; Naletto, G.; Possenti, A.; Barbieri, C.; Franceschini, A.; Zampieri, L.
2017-07-01
We describe an experimental scenario for testing a novel method to measure the distance and proper motion of astronomical sources. The method is based on multi-epoch observations of amplitude or intensity correlations between separate receiving systems. This technique, called Interferometric Parallax, efficiently exploits phase information that has traditionally been overlooked. The test case we discuss combines amplitude correlations of signals from deep space interplanetary spacecraft with those from distant galactic and extragalactic radio sources, with the goal of estimating the interplanetary spacecraft distance. Interferometric parallax relies on the detection of wavefront curvature effects in signals collected by pairs of separate receiving systems. The method shows promising potential over current techniques when the target is unresolved from the background reference sources. Developments in this field might lead to the construction of an independent, geometrical cosmic distance ladder using a dedicated project and future generation instruments. We present a conceptual overview supported by numerical estimates of its performance when applied to a spacecraft orbiting within the Solar System. Simulations support the feasibility of measurements with a simple and time-saving observational scheme using current facilities.
NASA Astrophysics Data System (ADS)
Bi, ChuanXing; Jing, WenQian; Zhang, YongBin; Xu, Liang
2015-02-01
Conventional nearfield acoustic holography (NAH) is usually based on the assumption of free-field conditions, and it also requires that the measurement aperture be larger than the actual source. This paper focuses on the case in which neither of these requirements can be met, and examines the feasibility of reconstructing the sound field radiated by part of a source from double-layer pressure measurements made in a non-free field, using patch NAH combined with a sound field separation technique. The sensitivity of the reconstructed result to measurement error is also analyzed in detail. Two experiments, involving two speakers in an exterior space and one speaker inside a car cabin, are presented. The experimental results demonstrate that patch NAH based on single-layer pressure measurements cannot obtain a satisfactory result owing to the influence of disturbing sources and reflections, whereas patch NAH based on double-layer pressure measurements can successfully remove these influences and reconstruct the patch sound field effectively.
Yu, Huanzhou; Shimakawa, Ann; Hines, Catherine D. G.; McKenzie, Charles A.; Hamilton, Gavin; Sirlin, Claude B.; Brittain, Jean H.; Reeder, Scott B.
2011-01-01
Multipoint water–fat separation techniques rely on the different water–fat phase shifts generated at multiple echo times to decompose water and fat. These methods require complex source images and allow unambiguous separation of water and fat signals. However, complex-based water–fat separation methods are sensitive to phase errors in the source images, which may lead to clinically important errors. An alternative approach to quantify fat is through "magnitude-based" methods that acquire multiecho magnitude images. Magnitude-based methods are insensitive to phase errors, but cannot estimate fat-fractions greater than 50%. In this work, we introduce a water–fat separation approach that combines the strengths of both complex and magnitude reconstruction algorithms. A magnitude-based reconstruction is applied after complex-based water–fat separation to remove the effect of phase errors. The results from the two reconstructions are then combined. We demonstrate that using this hybrid method, 0–100% fat-fraction can be estimated with improved accuracy at low fat-fractions. PMID:21695724
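The complementary strengths of the two reconstructions can be illustrated with the branch-selection step: a magnitude-only fit cannot distinguish a fat fraction f from 1 - f, so the complex-based estimate can be used to pick the correct branch. The following is a minimal sketch of that idea only; the function name and the convention that the magnitude estimate is folded into 0-50% are our assumptions, not the published reconstruction:

```python
import numpy as np

def hybrid_fat_fraction(ff_complex, ff_magnitude):
    """Combine a complex-based and a magnitude-based fat-fraction estimate.

    ff_magnitude is assumed to lie in [0, 0.5], because magnitude fitting
    cannot distinguish a fat fraction f from 1 - f.  The complex-based
    estimate, which spans the full 0-100% range but may carry phase-error
    bias, is used only to select the correct branch.
    """
    ff_magnitude = np.asarray(ff_magnitude, dtype=float)
    ff_complex = np.asarray(ff_complex, dtype=float)
    # choose the magnitude-fit branch closest to the complex estimate
    branch_high = 1.0 - ff_magnitude
    use_high = np.abs(branch_high - ff_complex) < np.abs(ff_magnitude - ff_complex)
    return np.where(use_high, branch_high, ff_magnitude)
```

For example, a complex estimate of 0.8 with a folded magnitude estimate of 0.2 resolves to a fat fraction of 0.8, while a complex estimate of 0.1 leaves a magnitude estimate of 0.15 on the low branch.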
NASA Astrophysics Data System (ADS)
Eiserbeck, Christiane; Nelson, Robert K.; Grice, Kliti; Curiale, Joseph; Reddy, Christopher M.
2012-06-01
Higher plant biomarkers occur in various compound classes with an array of isomers that are challenging to separate and identify. Traditional one-dimensional (1D) gas chromatographic (GC) techniques achieved impressive results in the past, but have reached limitations in many cases. Comprehensive two-dimensional gas chromatography (GC × GC), either coupled to a flame ionization detector (GC × GC-FID) or a time-of-flight mass spectrometer (GC × GC-TOFMS), is a powerful tool to overcome the challenges of 1D GC, such as the resolution of the unresolved complex mixture (UCM). We studied a number of Tertiary terrigenous oils and source rocks from the Arctic and Southeast Asia, with special focus on angiosperm biomarkers such as oleanoids and lupanoids. Different chromatographic separation and detection techniques, including traditional 1D GC-MS, metastable reaction monitoring (GC-MRM-MS), GC × GC-FID, and GC × GC-TOFMS, are compared and applied to evaluate the differences and advantages in their performance for biomarker identification. The 22S/(22S + 22R) homohopane ratios were determined for all applied techniques and agree exceptionally well (generally within 2% to 10%). Furthermore, we resolved a variety of angiosperm-derived compounds that co-eluted using 1D GC techniques, demonstrating the superior separation power of GC × GC for these biomarkers, which indicate terrigenous source input and Cretaceous or younger ages. Samples of varying thermal maturity and biodegradation contain higher plant biomarkers from various stages of diagenesis and catagenesis, which can be directly assessed in a GC × GC chromatogram. The analysis of whole crude oils and rock extracts without loss in resolution enables the separation of unstable compounds that are prone to rearrangement (e.g. unsaturated triterpenoids such as taraxer-14-ene) when exposed to fractionation techniques like molecular sieving.
GC × GC-TOFMS is particularly valuable for the successful separation of co-eluting components having identical molecular masses and similar fragmentation patterns. Such components co-elute when analysed by 1D GC and cannot be resolved by single-ion-monitoring, which prevents accurate mass spectral assessment for identification or quantification.
NASA Technical Reports Server (NTRS)
Parrott, T. L.; Schein, D. B.; Gridley, D.
1985-01-01
The acoustic response of a semireverberant enclosure with two interacting, velocity-prescribed source distributions was analyzed using standard modal analysis techniques, with a view toward a better understanding of active noise control. Different source and enclosure dimensions, source separations, and single-wall admittances were studied over representative frequency bandwidths of 10 Hz, with source relative phase as a parameter. Results indicate that the power radiated into the enclosure agrees qualitatively with the spatial average of the mean square pressure, even though the reverberant field is nondiffuse. Decreases in acoustic power can therefore be used to estimate global noise reduction in a nondiffuse semireverberant environment. As might be expected, parametric studies indicate that maximum power reductions of up to 25 dB can be achieved when the secondary and primary sources are compact and closely spaced. Although less success is achieved with increasing frequency and source separation or size, significant suppression of up to 8 dB still occurs over the 1 to 2 Hz bandwidth.
Apparatus And Methods For Launching And Receiving A Broad Wavelength Range Source
Von Drasek, William A.; Sonnenfroh, David; Allen, Mark G.; Stafford-Evans, Joy
2006-02-28
An apparatus and method for simultaneous detection of N gas species through laser radiation attenuation techniques is disclosed. Each of the N species has a spectral absorption band. N laser sources operate at wavelengths λN in a spectral absorption band separated by the cutoff wavelength for single-mode transmission. Each laser source corresponds to a gas species and transmits radiation through an optical fiber constructed and arranged to provide single-mode transmission with minimal power loss.
Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application
NASA Astrophysics Data System (ADS)
Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni
2018-06-01
Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique for dealing with this problem consists of reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al. (2015b) formulated a technique based on source terms and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and to any type of mesh. Here we developed an open-source library for the estimation of the transparency coefficients needed by this approach, from bathymetric data and for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skill comparable to, and sometimes better than, that of the established propagation-based technique.
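The core idea of moving the obstruction parameterization out of the advection scheme and into a source term can be reduced to a local exponential energy sink per grid cell. The sketch below is an illustrative simplification under our own assumptions (a single transparency coefficient per cell, exact exponential integration over the time step); the names are ours and this is not the library's API:

```python
import math

def obstacle_sink(energy, cg, alpha, dx, dt):
    """Apply an unresolved-obstacle dissipation source term to the spectral
    wave energy in one grid cell.

    alpha is the cell transparency coefficient (1 = no obstruction,
    0 = fully blocking), cg the group velocity [m/s], dx the cell size [m].
    The sink dE/dt = -cg * (1 - alpha) / dx * E is integrated exactly
    over the time step dt as an exponential decay.
    """
    rate = cg * (1.0 - alpha) / dx   # fraction of energy removed per second
    return energy * math.exp(-rate * dt)
```

A fully transparent cell (alpha = 1) leaves the energy unchanged, while lower transparencies remove energy at a rate scaled by how many cell lengths the waves traverse per time step.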
Ardila-Rey, Jorge Alfredo; Montaña, Johny; de Castro, Bruno Albuquerque; Schurch, Roger; Covolan Ulson, José Alfredo; Muhammad-Sukki, Firdaus; Bani, Nurul Aini
2018-03-29
Partial discharges (PDs) are one of the most important classes of ageing processes that occur within electrical insulation. PD detection is a standardized technique to qualify the state of the insulation in electric assets such as machines and power cables. Generally, the classical phase-resolved partial discharge (PRPD) patterns are used to perform the identification of the type of PD source when they are related to a specific degradation process and when the electrical noise level is low compared to the magnitudes of the PD signals. However, in practical applications such as measurements carried out in the field or in industrial environments, several PD sources and large noise signals are usually present simultaneously. In this study, three different inductive sensors have been used to evaluate and compare their performance in the detection and separation of multiple PD sources by applying the chromatic technique to each of the measured signals.
Wavelet-based techniques for the gamma-ray sky
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias
2016-07-01
Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
An algorithm for separation of mixed sparse and Gaussian sources
Akkalkotkar, Ameya
2017-01-01
Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated-estimation technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as mixtures of unknown composition. PMID:28414814
An algorithm for separation of mixed sparse and Gaussian sources.
Akkalkotkar, Ameya; Brown, Kevin Scott
2017-01-01
Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated-estimation technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as mixtures of unknown composition.
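The reproducibility ranking at the heart of this approach can be sketched as follows: components of a reference decomposition are scored by their best absolute correlation with components recovered from subsampled data, so that reproducible (typically nongaussian) sources score high. This is a simplified stand-in for MIPReSt, under our own assumptions (equal-length source estimates; the real method also varies the data-matrix size):

```python
import numpy as np

def reproducibility_scores(ref_sources, resampled_sources_list):
    """Score each reference component by how reproducibly it reappears
    across decompositions of subsampled data.

    ref_sources: (k, n) source estimates from a decomposition of the
    full data.  resampled_sources_list: list of (k, n) source matrices,
    each from a decomposition of a subsample (assumed here to be aligned
    to the same n samples so rows can be correlated directly).
    """
    k = ref_sources.shape[0]
    scores = np.zeros(k)
    for sub in resampled_sources_list:
        # |correlation| between every reference and every subsample source
        c = np.corrcoef(np.vstack([ref_sources, sub]))
        scores += np.abs(c[:k, k:]).max(axis=1)  # best match per reference
    return scores / len(resampled_sources_list)
```

A structured source that survives subsampling (e.g. a sinusoid) scores near 1, while an unreproducible Gaussian component scores near 0, suggesting how the nongaussian subspace dimension could be read off from the score gap.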
Removal of EMG and ECG artifacts from EEG based on wavelet transform and ICA.
Zhou, Weidong; Gotman, Jean
2004-01-01
In this study, the methods of wavelet threshold de-noising and independent component analysis (ICA) are introduced. ICA is a novel signal processing technique based on higher-order statistics, and is used to separate independent components from measurements. The extended ICA algorithm does not need to calculate the higher-order statistics explicitly, converges quickly, and can be used to separate sub-Gaussian and super-Gaussian sources. A pre-whitening procedure is performed to de-correlate the mixed signals before extracting sources. The experimental results indicate that electromyogram (EMG) and electrocardiogram (ECG) artifacts in the electroencephalogram (EEG) can be removed by a combination of wavelet threshold de-noising and ICA.
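The wavelet-thresholding half of such a pipeline can be illustrated with a one-level Haar transform and soft thresholding of the detail coefficients. This is a minimal numpy stand-in for the multi-level wavelet de-noising described above (in the actual pipeline this step is combined with ICA, which is not shown):

```python
import numpy as np

def haar_soft_denoise(x, thresh):
    """One-level Haar wavelet soft-threshold de-noising.

    The signal is truncated to even length, transformed into
    approximation and detail coefficients, the details are
    soft-thresholded, and the signal is reconstructed.
    """
    n = len(x) - len(x) % 2
    a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty(n)
    y[0::2] = (a + d) / np.sqrt(2)           # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y
```

For a smooth signal in additive noise, the detail coefficients carry mostly noise, so thresholding them reduces the mean squared error relative to the raw measurement.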
Wei, Rongfei; Guo, Qingjun; Wen, Hanjie; Peters, Marc; Yang, Junxing; Tian, Liyan; Han, Xiaokun
2017-01-01
In this study, key factors affecting the chromatographic separation of Cd from plants, such as the resin column and the digestion and purification procedures, were experimentally investigated. A technique for separating Cd from plant samples based on single ion-exchange chromatography has been developed, which is suitable for the high-precision analysis of Cd isotopes by multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). The robustness of the technique was assessed by replicate analyses of Cd standard solutions and plant samples. The Cd yields of the whole separation process were higher than 95%, and the δ114/110Cd values of three secondary Cd standard solutions (Münster Cd, Spex Cd, Spex-1 Cd) relative to NIST SRM 3108 were measured accurately, enabling comparison with Cd isotope results obtained in other laboratories. Hence, stable Cd isotope analyses represent a powerful tool for fingerprinting specific Cd sources and/or examining biogeochemical reactions in ecological and environmental systems.
NASA Technical Reports Server (NTRS)
Pawson, Steven; Nielsen, J. Eric
2011-01-01
Attribution of observed atmospheric carbon concentrations to emissions at the country, state or city level is often inferred using "inversion" techniques. Such computations are often performed using advanced mathematical techniques, such as synthesis inversion or four-dimensional variational analysis, that trace observed atmospheric concentrations backwards through a transport model to a source region. It is, to date, not well understood how well such techniques can represent fine spatial (and temporal) structure in the inverted flux fields. This question is addressed using forward-model computations with idealized tracers emitted at the surface in a large number of grid boxes over selected regions, examining how distinctly these emitted tracers can be detected downstream. Initial results show that tracers emitted in half-degree grid boxes over a large region of the Eastern USA cannot be distinguished from each other, even at short distances over the Atlantic Ocean, when they are emitted in grid boxes separated by less than five degrees of latitude - especially when only total-column observations are available. A large number of forward model simulations, with varying meteorological conditions, are used to assess how distinctly three types of observations (total column, upper tropospheric column, and surface mixing ratio) can separate emissions from different sources. Inferences regarding inverse modeling and source attribution will be drawn.
Rapid calibrated high-resolution hyperspectral imaging using tunable laser source
NASA Astrophysics Data System (ADS)
Nguyen, Lam K.; Margalith, Eli
2009-05-01
We present a novel hyperspectral imaging technique based on tunable laser technology. By replacing the broadband source and tunable filters of a typical NIR imaging instrument, several advantages are realized, including high spectral resolution, highly variable fields of view, fast scan rates, a high signal-to-noise ratio, and the ability to use optical fiber for efficient and flexible sample illumination. With this technique, high-resolution, calibrated hyperspectral images over the NIR range can be acquired in seconds. The performance of the system will be demonstrated on two example applications: detecting melamine contamination in wheat gluten and separating bovine protein from wheat protein in cattle feed.
NASA Astrophysics Data System (ADS)
Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle
2018-05-01
Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
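A transmissibility function estimated from cross power spectral densities, in the spirit of the PSD-based definition above, might be sketched as below. This is a generic segment-averaged FFT cross-spectrum ratio between two responses and a reference channel; the paper's specific definition relates the PSD of mixed signals to the PSD of single-source signals, which this simplified sketch does not reproduce exactly:

```python
import numpy as np

def transmissibility(xi, xj, ref, nseg=8):
    """Estimate a transmissibility function T_ij(f) = S_{i,ref}(f) / S_{j,ref}(f)
    from cross power spectral densities averaged over nseg segments.

    xi, xj: response signals; ref: reference channel; all 1-D arrays.
    Returns a complex array over the FFT bins of one segment.
    """
    n = len(xi) // nseg
    Sio = np.zeros(n, dtype=complex)
    Sjo = np.zeros(n, dtype=complex)
    for k in range(nseg):
        sl = slice(k * n, (k + 1) * n)
        Xi, Xj, R = (np.fft.fft(v[sl]) for v in (xi, xj, ref))
        Sio += Xi * np.conj(R)   # accumulate cross-spectra
        Sjo += Xj * np.conj(R)
    return Sio / Sjo
```

Because the excitation spectrum cancels in the ratio, a response that is a scaled copy of another yields a flat transmissibility equal to the scale factor, independent of the (unknown) excitation.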
Verification of Agricultural Methane Emission Inventories
NASA Astrophysics Data System (ADS)
Desjardins, R. L.; Pattey, E.; Worth, D. E.; VanderZaag, A.; Mauder, M.; Srinivasan, R.; Worthy, D.; Sweeney, C.; Metzger, S.
2017-12-01
It is estimated that agriculture contributes more than 40% of anthropogenic methane (CH4) emissions in North America. However, these estimates, which are either based on the Intergovernmental Panel on Climate Change (IPCC) methodology or inverse modeling techniques, are poorly validated due to the challenges of separating interspersed CH4 sources within agroecosystems. A flux aircraft, instrumented with a fast-response Picarro CH4 analyzer for the eddy covariance (EC) technique and a sampling system for the relaxed eddy accumulation technique (REA), was flown at an altitude of about 150 m along several 20-km transects over an agricultural region in Eastern Canada. For all flight days, the top-down CH4 flux density measurements were compared to the footprint adjusted bottom-up estimates based on an IPCC Tier II methodology. Information on the animal population, land use type and atmospheric and surface variables were available for each transect. Top-down and bottom-up estimates of CH4 emissions were found to be poorly correlated, and wetlands were the most frequent confounding source of CH4; however, there were other sources such as waste treatment plants and biodigesters. Spatially resolved wavelet covariance estimates of CH4 emissions helped identify the contribution of wetlands to the overall CH4 flux, and the dependence of these emissions on temperature. When wetland contribution in the flux footprint was minimized, top-down and bottom-up estimates agreed to within measurement error. This research demonstrates that although existing aircraft-based technology can be used to verify regional ( 100 km2) agricultural CH4 emissions, it remains challenging due to diverse sources of CH4 present in many regions. The use of wavelet covariance to generate spatially-resolved flux estimates was found to be the best way to separate interspersed sources of CH4.
Application of acoustic imaging techniques on snowmobile pass-by noise.
Padois, Thomas; Berry, Alain
2017-02-01
Snowmobile manufacturers invest considerable effort in reducing the noise emission of their products. The noise sources of snowmobiles are multiple and closely spaced, making source separation difficult in practice. In this study, source imaging results for snowmobile pass-by noise are discussed. The experiments involve a 193-microphone Underbrink array, with synchronization of acoustic and video data provided by a high-speed camera. Both conventional beamforming and Clean-SC deconvolution are implemented to provide noise source maps of the snowmobile. The results clearly reveal noise emission from the engine, exhaust, and track, depending on the frequency range considered.
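Conventional beamforming, the baseline technique mentioned above, can be sketched in the time domain as delay-and-sum over candidate source positions: each channel is advanced by its propagation delay so that sound from the steered position adds coherently. The following is a minimal 1D illustration with an invented linear-array geometry, not the 193-microphone processing used in the study:

```python
import numpy as np

def delay_and_sum(signals, mic_x, candidates, c, fs):
    """Time-domain delay-and-sum beamformer for a linear microphone array.

    signals: (n_mics, n_samples) array of microphone signals.
    mic_x: microphone positions [m]; candidates: candidate source
    positions [m]; c: speed of sound [m/s]; fs: sampling rate [Hz].
    Returns the beamformer output power at each candidate position.
    """
    power = []
    for xs in candidates:
        delays = np.abs(mic_x - xs) / c                     # propagation delays [s]
        shifts = np.rint((delays - delays.min()) * fs).astype(int)
        n = signals.shape[1] - shifts.max()
        # advance each channel by its relative delay, then average
        out = sum(sig[s:s + n] for sig, s in zip(signals, shifts))
        power.append(np.mean((out / len(signals)) ** 2))
    return np.array(power)
```

Steering at the true source position aligns the channels and maximizes the output power; mis-steered positions sum incoherently and the power drops toward the single-channel level.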
Solid Phase Extraction (SPE) for Biodiesel Processing and Analysis
2017-12-13
1 METHODS ...sources. There are several methods that can be applied to the development of separation techniques that may replace the necessary water wash steps in ...biodiesel refinement. Unfortunately, the most common methods are poorly suited or face high costs when applied to diesel purification. Distillation is
Quantum Theory of Superresolution for Incoherent Optical Imaging
NASA Astrophysics Data System (ADS)
Tsang, Mankei
Rayleigh's criterion for resolving two incoherent point sources has been the most influential measure of optical imaging resolution for over a century. In the context of statistical image processing, violation of the criterion is especially detrimental to the estimation of the separation between the sources, and modern far-field superresolution techniques rely on suppressing the emission of close sources to enhance the localization precision. Using quantum optics, quantum metrology, and statistical analysis, here we show that, even if two close incoherent sources emit simultaneously, measurements with linear optics and photon counting can estimate their separation from the far field almost as precisely as conventional methods do for isolated sources, rendering Rayleigh's criterion irrelevant to the problem. Our results demonstrate that superresolution can be achieved not only for fluorophores but also for stars. Recent progress in generalizing our theory for multiple sources and spectroscopy will also be discussed. This work is supported by the Singapore National Research Foundation under NRF Grant No. NRF-NRFF2011-07 and the Singapore Ministry of Education Academic Research Fund Tier 1 Project R-263-000-C06-112.
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. ICA-based channel equalization after both single-mode fiber and few-mode fiber transmission is investigated for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats. Performance comparisons with conventional channel equalization techniques are discussed.
Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise
NASA Astrophysics Data System (ADS)
Rozhkov, Mikhail; Kitov, Ivan
2015-04-01
Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA), conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT, may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the problem in case 1 is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the Sumatra, April 2012 Mw=8.6, and Tohoku, March 2011 Mw=9.0 earthquakes).
The data were recorded by seismic arrays of the International Monitoring System of the CTBTO and by the small-aperture seismic array Mikhnevo (MHVAR) operated by the Institute of Geosphere Dynamics, Russian Academy of Sciences. Our approach demonstrated a good ability to separate seismic sources with very close origin times and locations (within hundreds of meters), and/or with close arrival times (fractions of a second), and to recover their waveforms from the mixture. Perspectives and limitations of the method are discussed.
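A minimal FastICA of the kind applied above can be written in a few lines of numpy: whiten the mixture, then run a symmetric fixed-point iteration with a tanh nonlinearity. This is a textbook sketch on a toy two-channel mixture, not the CTBT processing chain:

```python
import numpy as np

def fastica(X, iters=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) for separating a
    linear mixture X (channels x samples) into independent components."""
    X = X - X.mean(axis=1, keepdims=True)
    # whiten via eigendecomposition of the covariance matrix
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    k, n = Z.shape
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(k, k))
    for _ in range(iters):
        G = np.tanh(W @ Z)
        # fixed-point update: w <- E[z g(w'z)] - E[g'(w'z)] w
        W = (G @ Z.T) / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W)
        W = u @ vt                       # symmetric decorrelation
    return W @ Z
```

On a mixture of a square wave and a sinusoid (both non-Gaussian), the recovered components correlate almost perfectly with the original sources, up to sign and permutation, which is the inherent indeterminacy of BSS.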
Use of color-coded sleeve shutters accelerates oscillograph channel selection
NASA Technical Reports Server (NTRS)
Bouchlas, T.; Bowden, F. W.
1967-01-01
Sleeve-type shutters mechanically adjust individual galvanometer light beams onto or away from selected channels on oscillograph papers. In complex test setups, the sleeve-type shutters are color coded to separately identify each oscillograph channel. This technique could be used on any equipment using tubular galvanometer light sources.
Sound field separation with sound pressure and particle velocity measurements.
Fernandez-Grande, Efren; Jacobsen, Finn; Leclère, Quentin
2012-12-01
In conventional near-field acoustic holography (NAH) it is not possible to distinguish between sound from the two sides of the array, thus, it is a requirement that all the sources are confined to only one side and radiate into a free field. When this requirement cannot be fulfilled, sound field separation techniques make it possible to distinguish between outgoing and incoming waves from the two sides, and thus NAH can be applied. In this paper, a separation method based on the measurement of the particle velocity in two layers and another method based on the measurement of the pressure and the velocity in a single layer are proposed. The two methods use an equivalent source formulation with separate transfer matrices for the outgoing and incoming waves, so that the sound from the two sides of the array can be modeled independently. A weighting scheme is proposed to account for the distance between the equivalent sources and measurement surfaces and for the difference in magnitude between pressure and velocity. Experimental and numerical studies have been conducted to examine the methods. The double layer velocity method seems to be more robust to noise and flanking sound than the combined pressure-velocity method, although it requires an additional measurement surface. On the whole, the separation methods can be useful when the disturbance of the incoming field is significant. Otherwise the direct reconstruction is more accurate and straightforward.
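The equivalent-source formulation with separate transfer matrices for outgoing and incoming waves can be sketched as a least-squares problem: measured pressures on the array are modeled as fields of two monopole layers placed on either side, and solving for their strengths splits the field. This is a simplified sketch with invented geometry and without the paper's weighting scheme:

```python
import numpy as np

def greens(xs, xm, k):
    """Free-field Green's function matrix between source points xs (m, 3)
    and measurement points xm (n, 3) at wavenumber k."""
    r = np.linalg.norm(xm[:, None, :] - xs[None, :, :], axis=2)
    return np.exp(1j * k * r) / (4 * np.pi * r)

def separate(p, xm, xs_out, xs_in, k):
    """Split measured pressures p into outgoing and incoming parts by
    solving p = [H_out | H_in] q in the least-squares sense."""
    H_out, H_in = greens(xs_out, xm, k), greens(xs_in, xm, k)
    q, *_ = np.linalg.lstsq(np.hstack([H_out, H_in]), p, rcond=None)
    n_out = xs_out.shape[0]
    return H_out @ q[:n_out], H_in @ q[n_out:]
```

With equivalent-source layers behind each side and a double-layer measurement, the recovered outgoing component should approximate the true source field better than the raw (contaminated) total pressure does.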
High-performance liquid chromatography of oligoguanylates at high pH
NASA Technical Reports Server (NTRS)
Stribling, R.; Deamer, D. (Principal Investigator)
1991-01-01
Because of the stable self-structures formed by oligomers of guanosine, standard high-performance liquid chromatography techniques for oligonucleotide fractionation are not applicable. Previously, oligoguanylate separations have been carried out at pH 12 using RPC-5 as the packing material. While RPC-5 provides excellent separations, it has several limitations, including the lack of a commercially available source. This report describes a new anion-exchange high-performance liquid chromatography method using HEMA-IEC BIO Q, which successfully separates different forms of the guanosine monomer as well as longer oligoguanylates. The reproducibility and stability at high pH suggest a versatile role for this material.
Techniques and instrumentation for the measurement of transient sound energy flux
NASA Astrophysics Data System (ADS)
Watkinson, P. S.; Fahy, F. J.
1983-12-01
The evaluation of sound intensity distributions, and sound powers, of essentially continuous sources such as automotive engines, electric motors, production line machinery, furnaces, earth moving machinery and various types of process plants was studied. Although such systems are important sources of community disturbance and, to a lesser extent, of industrial health hazard, the most serious sources of hearing hazard in industry are machines operating on an impact principle, such as drop forges, hammers and punches. Controlled experiments to identify major noise source regions and mechanisms are difficult because it is normally impossible to install such machines in quiet, anechoic environments. The potential of sound intensity measurement to provide a means of overcoming these difficulties has given promising results, indicating the possibility of separating directly radiated and reverberant sound fields. However, because of the complexity of transient sound fields, a fundamental investigation is necessary to establish the practicability of intensity field decomposition, which is basic to source characterization techniques.
Ardila-Rey, Jorge Alfredo; Montaña, Johny; Schurch, Roger; Covolan Ulson, José Alfredo; Bani, Nurul Aini
2018-01-01
Partial discharges (PDs) are one of the most important classes of ageing processes that occur within electrical insulation. PD detection is a standardized technique to qualify the state of the insulation in electric assets such as machines and power cables. Generally, the classical phase-resolved partial discharge (PRPD) patterns are used to perform the identification of the type of PD source when they are related to a specific degradation process and when the electrical noise level is low compared to the magnitudes of the PD signals. However, in practical applications such as measurements carried out in the field or in industrial environments, several PD sources and large noise signals are usually present simultaneously. In this study, three different inductive sensors have been used to evaluate and compare their performance in the detection and separation of multiple PD sources by applying the chromatic technique to each of the measured signals. PMID:29596337
Metal catalyst technique for texturing silicon solar cells
Ruby, Douglas S.; Zaidi, Saleem H.
2001-01-01
Textured silicon solar cells and techniques for their manufacture utilizing metal sources to catalyze formation of randomly distributed surface features such as nanoscale pyramidal and columnar structures. These structures include dimensions smaller than the wavelength of incident light, thereby resulting in a highly effective anti-reflective surface. According to the invention, metal sources present in a reactive ion etching chamber permit impurities (e.g. metal particles) to be introduced into a reactive ion etch plasma resulting in deposition of micro-masks on the surface of a substrate to be etched. Separate embodiments are disclosed including one in which the metal source includes one or more metal-coated substrates strategically positioned relative to the surface to be textured, and another in which the walls of the reaction chamber are pre-conditioned with a thin coating of metal catalyst material.
Screening of polar components of petroleum products by electrospray ionization mass spectrometry
Rostad, Colleen E.
2005-01-01
The polar components of fuels may enable differentiation between fuel types or commercial fuel sources. Screening for these components in the hydrocarbon product is difficult due to their very low concentrations in such a complex matrix. Various commercial fuels from several sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at very low concentrations in commercial hydrocarbon products. This analysis was then applied to hydrocarbon samples collected from the subsurface with a different extent of biodegradation or weathering. Although the alkane and isoprenoid portion had begun to biodegrade or weather, the polar components had changed little over time. Because these polar compounds are unique in different fuels, this screening technique can provide source information on hydrocarbons released into the environment.
Phase sensitive optical coherence microscopy for photothermal imaging of gold nanorods
NASA Astrophysics Data System (ADS)
Hu, Yong; Podoleanu, Adrian G.; Dobre, George
2018-03-01
We describe a swept source based phase sensitive optical coherence microscopy (OCM) system for photothermal imaging of gold nanorods (GNRs). The phase sensitive OCM system employed in the study has a displacement sensitivity of 0.17 nm for vibrations at single frequencies below 250 Hz. We demonstrate the generation of phase maps and confocal phase images. By displaying the difference between successive confocal phase images, we perform confocal photothermal imaging of accumulated GNRs behind a glass coverslip and behind a scattering medium separately. Compared with the two-photon luminescence (TPL) detection techniques reported in the literature, the technique in this study has the advantage of a simplified experimental setup and provides a more efficient method for imaging the aggregation of GNRs. However, the repeatability of this technique suffers from jitter noise in the swept laser source.
Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources
NASA Astrophysics Data System (ADS)
Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato
2017-04-01
Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is an open source software package for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided design (CAD) and GIS tools, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy layer temperature gradients and convection on sensor footprints.
NASA Astrophysics Data System (ADS)
Hoenders, Bernhard J.; Ferwerda, Hedzer A.
1998-09-01
We separate the field generated by a spherically symmetric bounded scalar monochromatic source into a radiative and non-radiative part. The non-radiative part is obtained by projecting the total field on the space spanned by the non-radiating inhomogeneous modes, i.e. the modes which satisfy the inhomogeneous wave equation. Using residue techniques, introduced by Cauchy, we obtain an explicit analytical expression for the non-radiating component. We also identify the part of the source distribution which corresponds to this non-radiating part. The analysis is based on the scalar wave equation.
Depth profile of 236U/238U in soil samples in La Palma, Canary Islands
Srncik, M.; Steier, P.; Wallner, G.
2011-01-01
The vertical distribution of the 236U/238U isotopic ratio was investigated in soil samples from three different locations on La Palma (one of the seven Canary Islands, Spain). Additionally, the 240Pu/239Pu atomic ratio, a well-established tool for source identification, was determined. The radiochemical procedure consisted of a U separation step by extraction chromatography using UTEVA® Resin (Eichrom Technologies, Inc.). Afterwards, Pu was separated from Th and Np by anion exchange using Dowex 1x2 (Dow Chemical Co.). Furthermore, a new chemical procedure with tandem columns to separate Pu and U from the matrix was tested. For the determination of the uranium and plutonium isotopes by alpha spectrometry, thin sources were prepared by microprecipitation techniques. Additionally, these fractions separated from the soil samples were measured by Accelerator Mass Spectrometry (AMS) to obtain the isotopic ratios 236U/238U, 240Pu/239Pu and 236U/239Pu, respectively. The 236U concentrations [atoms/g] in each surface layer (∼2 cm) were surprisingly high compared to deeper layers, where values around two orders of magnitude smaller were found. Since the isotopic ratio 240Pu/239Pu indicated a global fallout signature, we assume the same origin as the probable source for 236U. Our measured 236U/239Pu value of around 0.2 is within the expected range for this contamination source. PMID:21481502
A method for monitoring nuclear absorption coefficients of aviation fuels
NASA Technical Reports Server (NTRS)
Sprinkle, Danny R.; Shen, Chih-Ping
1989-01-01
A technique for monitoring variability in the nuclear absorption characteristics of aviation fuels has been developed. It is based on a highly collimated low energy gamma radiation source and a sodium iodide counter. The source and the counter assembly are separated by a geometrically well-defined test fuel cell. A computer program for determining the mass attenuation coefficient of the test fuel sample, based on the data acquired for a preset counting period, has been developed and tested on several types of aviation fuel.
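The mass attenuation coefficient in such a transmission setup follows from the Beer-Lambert law, I = I0·exp(−(μ/ρ)·ρ·x). A minimal sketch of the computation (the counts, density and path length below are hypothetical, not values from the report):

```python
import math

def mass_attenuation(counts_empty, counts_fuel, density_g_cm3, path_cm):
    """Mass attenuation coefficient mu/rho (cm^2/g) from Beer-Lambert:
    I = I0 * exp(-(mu/rho) * rho * x)."""
    return math.log(counts_empty / counts_fuel) / (density_g_cm3 * path_cm)

# Hypothetical numbers: 1e6 counts without fuel, 6e5 through a 10 cm cell
mu_rho = mass_attenuation(1.0e6, 6.0e5, 0.80, 10.0)
```

In practice the counts would come from the preset counting period mentioned in the abstract, with I0 measured through an empty cell.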
NASA Technical Reports Server (NTRS)
Vincent, R. K.; Thomas, G. S.; Nalepka, R. F.
1974-01-01
The importance of specific spectral regions to signature extension is explored. In the recent past, the signature extension task was focused on the development of new techniques. Tested techniques are now used to investigate this spectral aspect of the large area survey. Sets of channels were sought which, for a given technique, were the least affected by several sources of variation over four data sets and yet provided good object class separation on each individual data set. Using sets of channels determined as part of this study, signature extension was accomplished between data sets collected over a six-day period and over a range of about 400 kilometers.
Partial information decomposition as a spatiotemporal filter.
Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D
2011-09-01
Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2014-12-01
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow us to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Uncorrelatedness is usually not a strong enough condition, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series.
First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise), and study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as the study of slow slip events in Cascadia.
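To illustrate why independence succeeds where uncorrelatedness fails, the sketch below separates three synthetic mixed signals with a plain FastICA iteration; this is a stand-in for the vbICA method used in the study, and all signals and the mixing matrix are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * np.pi * t),          # seasonal-like oscillation
          np.sign(np.sin(3 * t)),         # step-like signal
          rng.laplace(size=t.size)]       # noise-like source
A = rng.normal(size=(3, 3))               # unknown mixing matrix
X = S @ A.T                               # observed "station" time series

# Whiten the observations (zero mean, identity covariance)
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(Xc.T @ Xc / len(Xc))
Z = Xc @ E @ np.diag(d ** -0.5) @ E.T

# FastICA with deflation and a tanh nonlinearity
W = np.zeros((3, 3))
for i in range(3):
    w = rng.normal(size=3)
    w /= np.linalg.norm(w)
    for _ in range(200):
        u = np.tanh(Z @ w)
        w_new = Z.T @ u / len(Z) - (1 - u ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # keep orthogonal to found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-9
        w = w_new
        if converged:
            break
    W[i] = w
S_hat = Z @ W.T                           # recovered sources (up to sign/order)
```

PCA applied to the same `X` would return uncorrelated but still mixed components; the ICA rotation of the whitened data is what recovers the individual sources.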
Astrometria diferencial de precision con VLBI el triangulo de Draco (y estudios de SN1993J)
NASA Astrophysics Data System (ADS)
Ros, E.
1997-11-01
The Very Long Baseline Interferometry (VLBI) technique provides unprecedented resolution in astronomy. In this PhD thesis we present progress in high-precision phase-delay differential astrometry through observations of the radio source triangle formed by the BL Lac objects 1803+784 and 2007+777 and the QSO 1928+738, in the northern constellation of Draco (the Dragon), from observations carried out on 20/21 November 1991 with an intercontinental interferometric array operating simultaneously at 2.3 and 8.4 GHz. We have determined the angular separations among the three radio sources with submilliarcsecond accuracy from a weighted least-squares analysis of the differential phase delays of the three celestial bodies. The present work introduces important advances with respect to previous astrometric studies, which were carried out on radio source pairs separated by smaller angular distances. We have consistently modeled the parameters involved in an astrometric VLBI observation in order to reproduce the differential phase observed for radio sources separated by almost 7° on the sky. We have demonstrated the possibility of phase connection over these angular distances at 8.4 GHz, even at an epoch of maximum solar activity. After phase connection we corrected for the effects of the extended structure of the radio sources and of the ionosphere. The latter correction is one of the main technical achievements of this thesis: it is possible to remove the ionospheric contribution with independent measurements of the ionospheric total electron content obtained at Global Positioning System (GPS) sites near the VLBI observing stations. The triangular geometry introduces constraints in parameter space that allow a better estimation of the angular separations among the radio sources.
It is possible to test the consistency of the astrometric results through the Sky-Closure, defined as the circular sum of the angular separations of the three radio sources, determined pairwise and independently. In our case it is consistent with zero, which satisfactorily verifies the data processing followed. The comparison of the measurements of the separation of the pair 1928+738/2007+777 (1991 data) with previous measurements (data from 1985 and 1988), carried out with the same technique, allows us to adequately register the absolute position of 1928+738 relative to 2007+777. We estimate the proper motion of components in 1928+738, identify the position of the radio source core, and confirm the superluminal motion of the components of 1928+738. The comparison of our results with those obtained by Eubanks (USNO) from group delay measurements (without structure correction) shows the incorrectness of the latter. This thesis also briefly describes my collaboration in work on the radio supernova SN 1993J, in the galaxy M81. We discovered a shell-like structure in the radio emission of SN 1993J, which exploded in March 1993. We also produced a movie of its evolution by monitoring the shell structure at different epochs, and determined the deceleration of its expansion.
Auto white balance method using a pigmentation separation technique for human skin color
NASA Astrophysics Data System (ADS)
Tanaka, Satomi; Kakinuma, Akihiro; Kamijo, Naohiro; Takahashi, Hiroshi; Tsumura, Norimichi
2017-02-01
The human visual system maintains the perceived colors of an object across various light sources. Similarly, current digital cameras feature an auto white balance function, which estimates the illuminant color and corrects the colors of a photograph as if it had been taken under a reference light source. The main subject in a photograph is often a person's face, which could be used to estimate the illuminant color. However, such estimation is adversely affected by differences in facial color among individuals. The present paper proposes an auto white balance algorithm based on a pigmentation separation method that decomposes a human skin color image into melanin, hemoglobin and shading components. Pigment densities, which can be calculated from the melanin and hemoglobin components of the face, are largely uniform within the same ethnic group. We thus propose a method that uses the subject's facial color in an image and is unaffected by individual differences in facial color among Japanese people.
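Once an illuminant estimate is available (e.g. from the shading-free skin components), the correction step itself is typically a diagonal von Kries scaling. A minimal sketch with a made-up illuminant estimate; this shows the generic correction, not the paper's pigmentation separation:

```python
import numpy as np

def white_balance(img, illuminant_rgb):
    """Diagonal (von Kries) correction: scale each channel so the
    estimated illuminant maps to a neutral gray."""
    gains = illuminant_rgb.mean() / illuminant_rgb
    return np.clip(img * gains, 0.0, 1.0)

# Hypothetical illuminant estimate (RGB in [0, 1]); applying the correction
# to a patch of that exact color should yield neutral gray
illum = np.array([0.6, 0.5, 0.4])
corrected = white_balance(np.full((2, 2, 3), illum), illum)
```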
Palmgren, M S; Lee, L S
1986-01-01
Two distinct reservoirs of mycotoxins exist in fungal-infected cereal grains--the fungal spores and the spore-free mycelium-substrate matrix. Many fungal spores are of respirable size and the mycelium-substrate matrix can be pulverized to form particles of respirable size during routine handling of grain. In order to determine the contribution of each source to the level of mycotoxin contamination of dust, we developed techniques to harvest and separate mycelium-substrate matrices from spores of fungi. Conventional quantitative chromatographic analyses of separated materials indicated that aflatoxin from Aspergillus parasiticus, norsolorinic acid from a mutant of A. parasiticus, and secalonic acid D from Penicillium oxalicum were concentrated in the mycelium-substrate matrices and not in the spores. In contrast, spores of Aspergillus niger and Aspergillus fumigatus contained significant concentrations of aurasperone C and fumigaclavine C, respectively; only negligible amounts of the toxins were detected in the mycelium-substrate matrices of these two fungi. PMID:3709472
NASA Technical Reports Server (NTRS)
Hartfield, Roy J., Jr.; Dobson, Chris; Eskridge, Richard; Wehrmeyer, Joseph A.
1997-01-01
A novel technique for extracting Q-branch Raman signals scattered by a diatomic species from the emission spectrum resulting from the irradiation of combustion products using a broadband excimer laser has been developed. This technique is based on the polarization characteristics of vibrational Raman scattering and can be used for both single-shot Raman extraction and time-averaged data collection. The Q-branch Raman signal has a unique set of polarization characteristics which depend on the direction of the scattering, while fluorescence signals are unpolarized. For the present work, a calcite crystal is used to separate the horizontal component of a collected signal from the vertical component. The two components are then sent through a UV spectrometer and imaged onto an intensified CCD camera separately. The vertical component contains both the Raman signal and the interfering fluorescence signal. The horizontal component contains the fluorescence signal and a very weak component of the Raman signal; hence, the Raman scatter can be extracted by taking the difference between the two signals. The separation of the Raman scatter from interfering fluorescence signals is critically important to the interpretation of the Raman for cases in which a broadband ultraviolet (UV) laser is used as an excitation source in a hydrogen-oxygen flame and in all hydrocarbon flames. The present work provides a demonstration of the separation of the Raman scatter from the fluorescence background in real time.
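The extraction step amounts to a channel subtraction: the vertical spectrum carries Raman plus fluorescence, while the horizontal carries (nearly) fluorescence alone. A toy numeric sketch with synthetic spectra; the line position and widths are invented, and the weak horizontal Raman leak is neglected:

```python
import numpy as np

wl = np.linspace(240.0, 280.0, 400)                   # wavelength grid, nm
fluor = np.exp(-((wl - 260.0) / 12.0) ** 2)           # broad unpolarized background
raman = 0.3 * np.exp(-((wl - 255.0) / 0.2) ** 2)      # narrow Q-branch line

vertical = raman + fluor       # polarized Raman + fluorescence channel
horizontal = fluor             # fluorescence-only channel (Raman leak neglected)

extracted = vertical - horizontal                     # difference recovers the line
```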
Imaging of neural oscillations with embedded inferential and group prevalence statistics.
Donhauser, Peter W; Florin, Esther; Baillet, Sylvain
2018-02-01
Magnetoencephalography and electroencephalography (MEG, EEG) are essential techniques for studying distributed signal dynamics in the human brain. In particular, the functional role of neural oscillations remains to be clarified. For that reason, imaging methods need to identify distinct brain regions that concurrently generate oscillatory activity, with adequate separation in space and time. Yet, spatial smearing and inhomogeneous signal-to-noise are challenging factors to source reconstruction from external sensor data. The detection of weak sources in the presence of stronger regional activity nearby is a typical complication of MEG/EEG source imaging. We propose a novel, hypothesis-driven source reconstruction approach to address these methodological challenges. The imaging with embedded statistics (iES) method is a subspace scanning technique that constrains the mapping problem to the actual experimental design. A major benefit is that, regardless of signal strength, the contributions from all oscillatory sources whose activity is consistent with the tested hypothesis are equalized in the statistical maps produced. We present extensive evaluations of iES on group MEG data, for mapping 1) induced oscillations using experimental contrasts, 2) ongoing narrow-band oscillations in the resting-state, 3) co-modulation of brain-wide oscillatory power with a seed region, and 4) co-modulation of oscillatory power with peripheral signals (pupil dilation). Along the way, we demonstrate several advantages of iES over standard source imaging approaches. These include the detection of oscillatory coupling without rejection of zero-phase coupling, and detection of ongoing oscillations in deeper brain regions, where signal-to-noise conditions are unfavorable. We also show that iES provides a separate evaluation of oscillatory synchronization and desynchronization in experimental contrasts, which has important statistical advantages.
The flexibility of iES allows it to be adjusted to many experimental questions in systems neuroscience.
Coupled LC-GC techniques for the characterisation of polycyclic aromatic compounds in fuel materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Askey, S.A.; Holden, K.M.L.; Bartle, K.D.
1995-12-31
Exposure to polycyclic aromatic compounds (PAC) has long been identified as a matter of considerable environmental concern. Originating from both natural and anthropogenic sources, many PAC exhibit significant carcinogenic and mutagenic properties. Multi-dimensional chromatographic techniques, which provide separation by chemical class (group-type) or by molecular mass, greatly simplify the analysis of inherently complex fuel materials. In this study, on-line LC-GC techniques, in which high performance liquid chromatography is coupled to high resolution gas chromatography, have been investigated. Comprehensive characterisation of fuel feedstocks and post-pyrolysis and combustion products was achieved by coupling LC-GC to low resolution ion trap mass spectrometry (ITD-MS) and atomic emission detection (AED). The identification of PAC in diesel and coal materials, as well as in urban air and diesel exhaust particulate extracts, has provided valuable insight into the source, formation and distribution of such compounds pre- and post-processing.
Enhanced Imaging of Corrosion in Aircraft Structures with Reverse Geometry X-ray®
NASA Technical Reports Server (NTRS)
Winfree, William P.; Cmar-Mascis, Noreen A.; Parker, F. Raymond
2000-01-01
The application of Reverse Geometry X-ray to the detection and characterization of corrosion in aircraft structures is presented. Reverse Geometry X-ray is a unique system that utilizes an electronically scanned x-ray source and a discrete detector for real time radiographic imaging of a structure. The scanned source system has several advantages when compared to conventional radiography. First, the discrete x-ray detector can be miniaturized and easily positioned inside a complex structure (such as an aircraft wing) enabling images of each surface of the structure to be obtained separately. Second, using a measurement configuration with multiple detectors enables the simultaneous acquisition of data from several different perspectives without moving the structure or the measurement system. This provides a means for locating the position of flaws and enhances separation of features at the surface from features inside the structure. Data is presented on aircraft specimens with corrosion in the lap joint. Advanced laminographic imaging techniques utilizing data from multiple detectors are demonstrated to be capable of separating surface features from corrosion in the lap joint and locating the corrosion in multilayer structures. Results of this technique are compared to computed tomography cross sections obtained from a microfocus x-ray tomography system. A method is presented for calibration of the detectors of the Reverse Geometry X-ray system to enable quantification of the corrosion to within 2%.
Kumar, B Ramesh
2017-12-01
Diets containing high proportions of fruits and vegetables reduce the risk of onset of chronic diseases. The role of herbal medicines in improving human health has been gaining popularity over the years, which also increases the need for the safety and efficacy of these products. Green leafy vegetables (GLVs) are among the richest sources of phenolic compounds with excellent antioxidant properties. Increased consumption of diets containing phenolic compounds may yield better outcomes for human health and significantly improve the immune system. Highly selective, sensitive and versatile analytical techniques are necessary for the extraction, identification and quantification of phenolic compounds from plant extracts, which helps to exploit their important biological properties. Recent advances in pre-treatment procedures, separation techniques and spectrometric methods are used for the qualitative and quantitative analysis of phenolic compounds. The online coupling of liquid chromatography with mass spectrometry (LC-MS) has become a useful tool in the metabolic profiling of plant samples. In this review, the separation and identification of phenolic acids and flavonoids from GLVs by LC-MS are discussed, along with general extraction procedures and the ion sources of the mass spectrometers used. The review is devoted to understanding the structural configuration, nature and accumulation pattern of phenolic acids and flavonoids in plants, and to highlighting recent developments in the chemical investigation of these compounds by chromatographic and spectroscopic techniques. It concludes with the advantages of the combination of these two methods and future prospects.
An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis
NASA Technical Reports Server (NTRS)
Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon
2017-01-01
Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.
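The intuition behind sparsity-based detection is that CW RFI concentrates its power in a few frequency bins, while thermal radiometer noise is spectrally flat. A toy sketch of that idea, using a frequency-domain threshold test rather than the SCA algorithm simulated in the paper; all signal parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
noise = rng.normal(size=n)                               # thermal (Gaussian) signal
cw = 0.5 * np.sin(2 * np.pi * 300 * np.arange(n) / n)    # CW interferer at bin 300
x = noise + cw

# CW RFI is sparse in the Fourier basis: flag bins far above the noise floor
P = np.abs(np.fft.rfft(x)) ** 2
thresh = np.median(P) * 20.0       # robust noise-floor estimate; factor is ad hoc
rfi_bins = np.flatnonzero(P > thresh)
```

A detector like this flags the interferer's bin while leaving the flat noise spectrum untouched; the SCA approach in the abstract generalizes the same sparsity assumption to blind separation across channels.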
NASA Astrophysics Data System (ADS)
Geddes, Earl Russell
The details of the low frequency sound field for a rectangular room can be studied by the use of an established analytic technique--separation of variables. The solution is straightforward and the results are well-known. A non-rectangular room has boundary conditions which are not separable and therefore other solution techniques must be used. This study shows that the finite element method can be adapted for use in the study of sound fields in arbitrarily shaped enclosures. The finite element acoustics problem is formulated and the modification of a standard program, which is necessary for solving acoustic field problems, is examined. The solution of the semi-non-rectangular room problem (one where the floor and ceiling remain parallel) is carried out by a combined finite element/separation of variables approach. The solution results are used to construct the Green's function for the low frequency sound field in five rooms (or data cases): (1) a rectangular (Louden) room; (2) the smallest wall of the Louden room canted 20 degrees from normal; (3) the largest wall of the Louden room canted 20 degrees from normal; (4) both the largest and the smallest walls canted 20 degrees; and (5) a five-sided room variation of Case 4. Case 1, the rectangular room, was calculated using both the finite element method and the separation of variables technique. The results for the two methods are compared in order to assess the accuracy of the finite element models. The modal damping coefficients are calculated and the results examined. The statistics of the source and receiver average normalized RMS P('2) responses in the 80 Hz, 100 Hz, and 125 Hz one-third octave bands are developed. The receiver averaged pressure response is developed to determine the effect of the source locations on the response. Twelve source locations are examined and the results tabulated for comparison. The effect of a finite sized source is considered briefly.
Finally, the standard deviation of the spatial pressure response is studied. The results for this characteristic show that it is not significantly different in any of the rooms. The conclusions of the study are that only the frequency variations of the pressure response are affected by a room's shape. Further, in general, the simplest modification of a rectangular room (i.e., changing the angle of only one of the smallest walls) produces the most pronounced decrease in the pressure response variations in the low frequency region.
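The separation-of-variables result for the rectangular case can be sketched numerically. A minimal example (assuming rigid walls and a nominal sound speed of 343 m/s, values not taken from the study) of the standing-wave mode frequencies:

```python
import math

def rect_room_modes(Lx, Ly, Lz, c=343.0, n_max=2):
    """Eigenfrequencies (Hz) of a rigid-walled rectangular room from
    separation of variables: f = (c/2)*sqrt((nx/Lx)^2 + (ny/Ly)^2 + (nz/Lz)^2)."""
    modes = []
    for nx in range(n_max + 1):
        for ny in range(n_max + 1):
            for nz in range(n_max + 1):
                if nx == ny == nz == 0:
                    continue  # skip the static (0,0,0) solution
                f = (c / 2.0) * math.sqrt(
                    (nx / Lx) ** 2 + (ny / Ly) ** 2 + (nz / Lz) ** 2)
                modes.append(((nx, ny, nz), f))
    return sorted(modes, key=lambda m: m[1])
```

For a 5 m x 4 m x 3 m room the lowest mode is the (1,0,0) axial mode along the longest dimension, at 343/(2x5) = 34.3 Hz; it is this closed-form structure that is lost once the walls are canted, motivating the finite element treatment.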
Optical element for full spectral purity from IR-generated EUV light sources
NASA Astrophysics Data System (ADS)
van den Boogaard, A. J. R.; Louis, E.; van Goor, F. A.; Bijkerk, F.
2009-03-01
Laser-produced plasma (LPP) sources are generally considered attractive for high-power EUV production in next-generation lithography equipment. Such plasmas are most efficiently excited by the relatively long, infrared wavelengths of CO2 lasers, but a significant part of the rotational-vibrational excitation lines of the CO2 radiation will be backscattered by the plasma's critical density surface and consequently will be present as parasitic radiation in the spectrum of such sources. Since most optical elements in the EUV collecting and imaging train have a high reflection coefficient for IR radiation, undesirable heating phenomena at the resist level are likely to occur. In this study a completely new principle is employed to obtain full separation of EUV and IR radiation from the source by a single optical component. While the application of a transmission filter would come at the expense of EUV throughput, this technique potentially enables wavelength separation without losing reflectance compared to a conventional Mo/Si multilayer coated element. As a result this method provides full spectral purity from the source without loss in EUV throughput. Detailed calculations on the principle of operation are presented.
Restoration of recto-verso colour documents using correlated component analysis
NASA Astrophysics Data System (ADS)
Tonazzini, Anna; Bedini, Luigi
2013-12-01
In this article, we consider the problem of removing see-through interferences from pairs of recto-verso documents acquired either in grayscale or RGB modality. The see-through effect is a typical degradation of historical and archival documents or manuscripts, and is caused by transparency or seeping of ink from the reverse side of the page. We formulate the problem as one of separating two individual texts, overlapped in the recto and verso maps of the colour channels through a linear convolutional mixing operator, where the mixing coefficients are unknown, while the blur kernels are assumed known a priori or estimated off-line. We exploit statistical techniques of blind source separation to estimate both the unknown model parameters and the ideal, uncorrupted images of the two document sides. We show that recently proposed correlated component analysis techniques overcome the already satisfactory performance of independent component analysis techniques and colour decorrelation, even when the two texts are appreciably correlated.
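The linear mixing model described above can be illustrated with a toy sketch. This ignores the blur kernels and uses hypothetical mixing coefficients (the paper estimates them blindly; here they are assumed known so the inversion step is explicit):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Two "ideal" sides: sparse, non-negative ink patterns
# (hypothetical stand-ins for the recto and verso text maps).
s = np.abs(rng.laplace(size=(2, n)))

# Symmetric see-through mixing: each observed side is its own text
# plus an attenuated copy of the other side's text (blur omitted).
A = np.array([[1.0, 0.35],
              [0.40, 1.0]])
x = A @ s  # observed recto/verso channel maps

# Once the mixing coefficients are estimated (assumed known here),
# the clean sides are recovered by inverting the mixing operator.
s_hat = np.linalg.inv(A) @ x
```

The blind-source-separation problem is precisely that A is unknown; correlated component analysis relaxes the independence assumption that ICA would impose on the two rows of s.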
Du, Yu; Zhuang, Ziwei; He, Jiexing; Liu, Hongji; Qiu, Ping; Wang, Ke
2018-05-16
With tunable excitation light, multiphoton microscopy (MPM) is widely used for imaging biological structures at subcellular resolution. Axial chromatic dispersion, present in virtually every transmissive optical system including the multiphoton microscope, leads to focal (and the resultant image) plane separation. Here we demonstrate experimentally a technique to measure the axial chromatic dispersion in a multiphoton microscope, using simultaneous 2-color third-harmonic generation (THG) imaging excited by a 2-color soliton source with tunable wavelength separation. Our technique is self-referenced, eliminating the potential measurement error introduced when 1-color tunable excitation light is used, which necessitates reciprocating motion of the mechanical translation stage. Using this technique, we demonstrate measured axial chromatic dispersion with 2 different objective lenses in a multiphoton microscope. Further measurement in a biological sample also indicates that this axial chromatic dispersion, in combination with 2-color imaging, may open up the opportunity for simultaneous imaging of two different axial planes. This article is protected by copyright. All rights reserved.
Piotrowski, Paulina K; Weggler, Benedikt A; Yoxtheimer, David A; Kelly, Christina N; Barth-Naftilan, Erica; Saiers, James E; Dorman, Frank L
2018-04-17
Hydraulic fracturing is an increasingly common technique for the extraction of natural gas entrapped in shale formations. This technique has been highly criticized due to the possibility of environmental contamination, underscoring the need for method development to identify chemical factors that could be utilized in point-source identification of environmental contamination events. Here, we utilize comprehensive two-dimensional gas chromatography (GC × GC) coupled to high-resolution time-of-flight (HRT) mass spectrometry, which offers a unique instrumental combination allowing for petroleomics hydrocarbon fingerprinting. Four flowback fluids from Marcellus shale gas wells in geographic proximity were analyzed for differentiating factors that could be exploited in environmental forensics investigations of shale gas impacts. Kendrick mass defect (KMD) plots of these flowback fluids illustrated well-to-well differences in heteroatomic substituted hydrocarbons, while GC × GC separations showed variance in cyclic hydrocarbons and polyaromatic hydrocarbons among the four wells. Additionally, generating plots that combine GC × GC separation with KMD established a novel data-rich visualization technique that further differentiated the samples.
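The Kendrick mass defect computation underlying the plots mentioned above can be sketched in a few lines (using the conventional CH2 base mass of 14.01565 u; the example masses below are illustrative alkanes, not data from the study):

```python
CH2_EXACT = 14.01565  # exact mass of the CH2 repeat unit, in u

def kendrick_mass(m):
    # Rescale the IUPAC mass so the CH2 unit has an integer mass of 14
    return m * 14.0 / CH2_EXACT

def kendrick_mass_defect(m):
    km = kendrick_mass(m)
    # Nominal Kendrick mass minus Kendrick mass; shared by all
    # members of a CH2 homologous series
    return round(km) - km
```

Homologues differing only by CH2 units then fall on a horizontal line in a KMD plot, which is what makes heteroatom-class differences between wells stand out.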
Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey
NASA Astrophysics Data System (ADS)
Guillemot, Christine; Siohan, Pierre
2005-12-01
Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper surveys recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. The survey concludes with performance illustrations using real image and video decoding systems.
Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images
NASA Astrophysics Data System (ADS)
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2004-11-01
A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a Thin-Plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We demonstrate that the described probabilistic method improves the detection of faint, extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
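The two-component mixture idea can be sketched under simplifying assumptions not made in the paper (known Poisson rates for background and background-plus-source, and a fixed prior; the paper instead infers the background map with a spline model):

```python
import math

def source_probability(k, b, s, prior_src=0.1):
    """Posterior probability that a pixel with k observed counts contains
    a source, under a two-component Poisson mixture: the background
    component has rate b, the source component has rate b + s."""
    def pois(lam, k):
        return math.exp(-lam) * lam ** k / math.factorial(k)
    p_bkg = (1.0 - prior_src) * pois(b, k)
    p_src = prior_src * pois(b + s, k)
    return p_src / (p_bkg + p_src)
```

Evaluating this per pixel yields a source probability map; marginalizing over an unknown background rate is what gives the method its consistent uncertainties.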
Zeremdini, Jihen; Ben Messaoud, Mohamed Anouar; Bouzid, Aicha
2015-09-01
Thanks to their ears, humans can easily separate mixed speech and form perceptual representations of the constituent sources in an acoustic mixture. Researchers have long attempted to build computer models of these high-level functions of the auditory system, yet the segregation of mixed speech remains a very challenging problem. In our case, we are interested in approaches that address monaural speech segregation. For this purpose, we study in this paper computational auditory scene analysis (CASA) to segregate speech from monaural mixtures. CASA is the reproduction of the source organization achieved by listeners. It is based on two main stages: segmentation and grouping. In this work, we present and compare several studies that have used CASA for speech separation and recognition.
Contribution of non-negative matrix factorization to the classification of remote sensing images
NASA Astrophysics Data System (ADS)
Karoui, M. S.; Deville, Y.; Hosseini, S.; Ouamri, A.; Ducrot, D.
2008-10-01
Remote sensing has become an unavoidable tool for better managing our environment, generally by realizing maps of land cover using classification techniques. The classification process requires some pre-processing, especially for data size reduction. The most usual technique is Principal Component Analysis. Another approach consists in regarding each pixel of the multispectral image as a mixture of pure elements contained in the observed area. Using Blind Source Separation (BSS) methods, one can hope to unmix each pixel and to perform the recognition of the classes constituting the observed scene. Our contribution consists in using Non-negative Matrix Factorization (NMF) combined with sparse coding as a solution to BSS, in order to generate new images (which are at least partly separated images) from HRV SPOT images of the Oran area, Algeria. These images are then used as inputs of a supervised classifier integrating textural information. The results of classifications of these "separated" images show a clear improvement (correct pixel classification rate improved by more than 20%) compared to classification of the initial (i.e. non-separated) images. These results show the contribution of NMF as an attractive pre-processing for the classification of multispectral remote sensing imagery.
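A minimal NMF sketch follows, using the classic Lee-Seung multiplicative updates (without the sparse-coding term the authors combine it with, and with synthetic data rather than SPOT imagery):

```python
import numpy as np

def nmf(V, r, n_iter=500, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H under the Frobenius
    norm. Rows of V are pixel spectra; H holds the r endmember spectra
    and W the per-pixel abundances (all entries non-negative)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H
```

In the unmixing interpretation, each column of W (one abundance per endmember) is itself an image; those are the "separated" images fed to the supervised classifier.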
A method for monitoring the variability in nuclear absorption characteristics of aviation fuels
NASA Technical Reports Server (NTRS)
Sprinkle, Danny R.; Shen, Chih-Ping
1988-01-01
A technique for monitoring variability in the nuclear absorption characteristics of aviation fuels has been developed. It is based on a highly collimated low energy gamma radiation source and a sodium iodide counter. The source and the counter assembly are separated by a geometrically well-defined test fuel cell. A computer program for determining the mass attenuation coefficient of the test fuel sample, based on the data acquired for a preset counting period, has been developed and tested on several types of aviation fuel.
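The quantity such a program computes follows from the Beer-Lambert law for the attenuated gamma counts; a sketch (function and variable names are hypothetical, not from the NASA report):

```python
import math

def mass_attenuation(counts_empty, counts_fuel, density_g_cm3, path_cm):
    """Mass attenuation coefficient (cm^2/g) from gamma counts acquired
    over equal counting periods with the cell empty vs. filled, via
    Beer-Lambert: I = I0 * exp(-mu_m * rho * x)."""
    return math.log(counts_empty / counts_fuel) / (density_g_cm3 * path_cm)
```

Monitoring fuel variability then amounts to tracking this coefficient across fuel samples for the fixed, geometrically well-defined cell.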
NASA Astrophysics Data System (ADS)
Buttler, W. T.; Hixson, R. S.; King, N. S. P.; Olson, R. T.; Rigg, P. A.; Zellner, M. B.; Routley, N.; Rimmer, A.
2007-04-01
The authors consider a mathematical method to separate and determine the amount of ejecta produced in a second-shock material-fragmentation process. The technique is theoretical and assumes that a material undergoing a shock release at a vacuum interface ejects particulate material or fragments as the initial shock unloads and reflects at the vacuum-surface interface. In this case it is thought that the reflected shock may reflect again at the source of the shock and return to the vacuum-surface interface and eject another amount of fragments or particulate material.
Dipping-interface mapping using mode-separated Rayleigh waves
Luo, Y.; Xia, J.; Xu, Y.; Zeng, C.; Miller, R.D.; Liu, Q.
2009-01-01
The multichannel analysis of surface waves (MASW) method is a non-invasive geophysical technique that uses the dispersive characteristic of Rayleigh waves to estimate a vertical shear (S)-wave velocity profile. A pseudo-2D S-wave velocity section is constructed by aligning 1D S-wave velocity profiles at the midpoint of each receiver spread, contoured using a spatial interpolation scheme. The horizontal resolution of the section is therefore most influenced by the receiver spread length and the source interval. Based on the assumption that a dipping-layer model can be regarded as stepped flat layers, high-resolution linear Radon transform (LRT) has been proposed to image Rayleigh-wave dispersive energy and separate modes of Rayleigh waves from a multichannel record. With the mode-separation technique, therefore, a dispersion curve that possesses satisfactory accuracy can be calculated using a pair of consecutive traces within a mode-separated shot gather. In this study, using synthetic models containing a dipping layer with a slope of 5, 10, 15, 20, or 30 degrees and a real-world example, we assess the ability of high-resolution LRT to image and separate fundamental-mode Rayleigh waves from raw surface-wave data and the accuracy of dispersion curves generated by a pair of consecutive traces within a mode-separated shot gather. Results of synthetic and real-world examples demonstrate that a dipping interface with a slope smaller than 15 degrees can be successfully mapped by separated fundamental waves using high-resolution LRT. © Birkhäuser Verlag, Basel 2009.
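The two-trace dispersion estimate used above can be sketched via the cross-spectrum phase, a simplified stand-in for the full high-resolution LRT workflow (mode separation is assumed already done, and the example wavefield is a single non-dispersive sinusoid):

```python
import numpy as np

def phase_velocity(trace1, trace2, dx, dt, f):
    """Rayleigh-wave phase velocity at frequency f from two consecutive
    traces separated by dx, using the cross-spectrum phase difference:
    c(f) = 2*pi*f*dx / dphi."""
    n = len(trace1)
    freqs = np.fft.rfftfreq(n, dt)
    S1 = np.fft.rfft(trace1)
    S2 = np.fft.rfft(trace2)
    i = np.argmin(np.abs(freqs - f))          # nearest FFT bin to f
    dphi = np.angle(S1[i] * np.conj(S2[i]))   # phase lag trace1 -> trace2
    return 2 * np.pi * freqs[i] * dx / dphi
```

Repeating this over frequency (with phase unwrapping for larger lags) yields the dispersion curve for the trace pair.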
NASA Astrophysics Data System (ADS)
Vlachou, Athanasia; Daellenbach, Kaspar R.; Bozzetti, Carlo; Chazeau, Benjamin; Salazar, Gary A.; Szidat, Soenke; Jaffrezo, Jean-Luc; Hueglin, Christoph; Baltensperger, Urs; El Haddad, Imad; Prévôt, André S. H.
2018-05-01
Carbonaceous aerosols are related to adverse human health effects. Therefore, identification of their sources and analysis of their chemical composition is important. The offline AMS (aerosol mass spectrometer) technique offers quantitative separation of organic aerosol (OA) factors which can be related to major OA sources, either primary or secondary. While primary OA can be more clearly separated into sources, secondary OA (SOA) source apportionment is more challenging because different sources - anthropogenic or natural, fossil or non-fossil - can yield similar highly oxygenated mass spectra. Radiocarbon measurements provide unequivocal separation between fossil and non-fossil sources of carbon. Here we coupled these two offline methods and analysed the OA and organic carbon (OC) of different size fractions (particulate matter below 10 and 2.5 µm - PM10 and PM2.5, respectively) from the Alpine valley of Magadino (Switzerland) during the years 2013 and 2014 (219 samples). The combination of the techniques gave further insight into the characteristics of secondary OC (SOC), which was classified by the type of SOC precursor rather than by the volatility or the oxidation state of OC, as typically considered. Out of the primary sources separated in this study, biomass burning OC was the dominant one in winter, with average concentrations of 5.36 ± 2.64 µg m⁻³ for PM10 and 3.83 ± 1.81 µg m⁻³ for PM2.5, indicating that wood combustion particles were predominantly generated in the fine mode. The additional information from the size-segregated measurements revealed a primary sulfur-containing factor, mainly fossil, detected in the coarse size fraction and related to non-exhaust traffic emissions with a yearly average PM10 (PM2.5) concentration of 0.20 ± 0.24 µg m⁻³ (0.05 ± 0.04 µg m⁻³). A primary biological OC (PBOC) was also detected in the coarse mode peaking in spring and summer with a yearly average PM10 (PM2.5) concentration of 0.79 ± 0.31 µg m⁻³ (0.24 ± 0.20 µg m⁻³).
The secondary OC was separated into two oxygenated, non-fossil OC factors which were identified based on their seasonal variability (i.e. summer and winter oxygenated organic carbon, OOC) and a third anthropogenic OOC factor which correlated with fossil OC mainly peaking in winter and spring, contributing on average 13 % ± 7 % (10 % ± 9 %) to the total OC in PM10 (PM2.5). The winter OOC was also connected to anthropogenic sources, contributing on average 13 % ± 13 % (6 % ± 6 %) to the total OC in PM10 (PM2.5). The summer OOC (SOOC), stemming from oxidation of biogenic emissions, was more pronounced in the fine mode, contributing on average 43 % ± 12 % (75 % ± 44 %) to the total OC in PM10 (PM2.5). In total the non-fossil OC significantly dominated the fossil OC throughout all seasons, by contributing on average 75 % ± 24 % to the total OC. The results also suggested that during the cold period the prevailing source was residential biomass burning while during the warm period primary biological sources and secondary organic aerosol from the oxidation of biogenic emissions became important. However, SOC was also formed by aged fossil fuel combustion emissions not only in summer but also during the rest of the year.
NASA Astrophysics Data System (ADS)
de León, Jesús Ponce; Beltrán, José Ramón
2012-12-01
In this study, a new method of blind audio source separation (BASS) of monaural musical harmonic notes is presented. The input (mixed notes) signal is processed using a flexible analysis and synthesis algorithm (complex wavelet additive synthesis, CWAS), which is based on the complex continuous wavelet transform. When the harmonics from two or more sources overlap in a certain frequency band (or group of bands), a new technique based on amplitude similarity criteria is used to obtain an approximation to the original partial information. The aim is to show that the CWAS algorithm can be a powerful tool in BASS. Compared with other existing techniques, the main advantages of the proposed algorithm are its accuracy in the instantaneous phase estimation, its synthesis capability, and that the only input information needed is the mixed signal itself. A set of synthetically mixed monaural isolated notes has been analyzed using this method, in eight different experiments: the same instrument playing two notes within the same octave and two harmonically related notes (5th and 12th intervals), two different musical instruments playing 5th and 12th intervals, two different instruments playing non-harmonic notes, major and minor chords played by the same musical instrument, three different instruments playing non-harmonically related notes, and finally the mixture of an inharmonic instrument (piano) and a harmonic instrument. The results obtained show the strength of the technique.
The Wilkinson Microwave Anisotropy Probe (WMAP) Source Catalog
NASA Technical Reports Server (NTRS)
Wright, E.L.; Chen, X.; Odegard, N.; Bennett, C.L.; Hill, R.S.; Hinshaw, G.; Jarosik, N.; Komatsu, E.; Nolta, M.R.; Page, L.;
2008-01-01
We present the list of point sources found in the WMAP 5-year maps. The technique used in the first-year and three-year analyses now finds 390 point sources, and the five-year source catalog is complete for regions of the sky away from the galactic plane to a 2 Jy limit, with SNR greater than 4.7 in all bands in the least covered parts of the sky. The noise at high frequencies is still mainly radiometer noise, but at low frequencies the CMB anisotropy is the largest uncertainty. A separate search of CMB-free V-W maps finds 99 sources, of which all but one can be identified with known radio sources. The sources seen by WMAP are not strongly polarized. Many of the WMAP sources show significant variability from year to year, with more than a 2:1 range between the minimum and maximum fluxes.
Thermal testing by internal IR heating of the FEP module
NASA Technical Reports Server (NTRS)
Nathanson, D. M.; Efromson, R. A.; Lee, E. I.
1986-01-01
A spacecraft module, to be integrated with the FLTSATCOM spacecraft, was tested in a simulated orbit environment separate from the host spacecraft. Thermal vacuum testing of the module was accomplished using internal IR heating rather than conventional external heat sources. For this configuration, the technique produced boundary conditions expected for flight to enable verification of system performance and thermal design details.
Optimal parameters for laser tissue soldering: II. Premixed versus separate dye-solder techniques.
McNally, K M; Sorg, B S; Chan, E K; Welch, A J; Dawes, J M; Owen, E R
2000-01-01
Laser tissue soldering by using an indocyanine green (ICG)-doped protein solder applied topically to the tissue surface and denatured with a diode laser was investigated in Part I of this study. The depth of light absorption was predominantly determined by the concentration of the ICG dye added to the solder. This study builds on that work with an in vitro investigation of the effects of limiting the zone of heat generation to the solder-tissue interface to determine whether more stable solder-tissue fusion can be achieved. An alternative laser tissue soldering technique was investigated, which increased light absorption at the vital solder-tissue interface. A thin layer of ICG dye was smeared over the surface to be treated, the protein solder was then placed directly on top of the dye, and the solder was denatured with an 808-nm diode laser. Because laser light at approximately 800 nm is absorbed primarily by the ICG dye, this thin layer of ICG solution restricted the heat source to the space between the solder and the tissue surfaces. A tensile strength analysis was conducted to compare the separate dye-solder technique with conventional techniques of laser tissue soldering for which a premixed dye-solder is applied directly to the tissue surface. The effect of hydration on bond stability of repairs formed by using both techniques was also investigated using tensile strength and scanning electron microscopy analysis. Equivalent results in terms of tensile strength were obtained for the premixed dye-solder technique using protein solders containing 0.25 mg/ml ICG (liquid solder, 220 ± 35 N/cm²; solid solder, 602 ± 32 N/cm²) and for the separate dye-solder technique (liquid solder, 228 ± 41 N/cm²; solid solder, 578 ± 29 N/cm²). The tensile strength of native bovine thoracic aorta was 596 ± 31 N/cm². Repairs created by using the separate dye-solder technique were more stable during hydration than their premixed dye-solder counterparts.
The conventional premixed dye-solder was simpler and approximately twice as fast to apply. The separate dye-solder technique, however, increased the shelf-life of the solder, because the dye was mixed at the time of the experiment, thus conserving its spectral absorbency properties. Two laser-assisted tissue soldering techniques have been evaluated for repairing aorta incisions in vitro. The advantages and disadvantages of each of these techniques are discussed. Copyright 2000 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions.
This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we introduce the vbICA technique and present its application to synthetic data that simulate a GPS network recording ground deformation in a tectonically active region, with synthetic time series containing interseismic, coseismic, and postseismic deformation, plus seasonal deformation and white and coloured noise. We study the ability of the algorithm to recover the original (known) sources of deformation, and then apply it to a real scenario: the Emilia seismic sequence (2012, northern Italy), an example of a seismic sequence that occurred in a slowly converging tectonic setting, characterized by several local to regional anthropogenic or natural sources of deformation, mainly subsidence due to fluid withdrawal and sediment compaction. We apply both PCA and vbICA to displacement time series recorded by continuous GPS and InSAR (Pezzo et al., EGU2015-8950).
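The PCA-versus-ICA contrast described above can be sketched for two mixed signals. This toy example uses a kurtosis contrast rather than the variational Bayesian machinery of vbICA, and synthetic random sources rather than geodetic data; after PCA whitening, only a rotation remains undetermined, and it is that rotation which an independence criterion resolves:

```python
import numpy as np

def separate_two_sources(X, n_angles=360):
    """Minimal ICA sketch for two mixed signals: whiten with PCA, then
    search the single remaining rotation for maximal non-Gaussianity
    (sum of absolute excess kurtosis). PCA alone stops at whitening,
    which is why it cannot solve the BSS problem."""
    X = X - X.mean(axis=1, keepdims=True)
    w, E = np.linalg.eigh(np.cov(X))
    Z = np.diag(w ** -0.5) @ E.T @ X  # whitened (uncorrelated) data

    def kurt(y):
        return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

    best, s_best = -np.inf, Z
    for a in np.linspace(0.0, np.pi / 2, n_angles):
        R = np.array([[np.cos(a), -np.sin(a)],
                      [np.sin(a),  np.cos(a)]])
        S = R @ Z
        score = abs(kurt(S[0])) + abs(kurt(S[1]))
        if score > best:
            best, s_best = score, S
    return s_best
```

Recovered components come back in arbitrary order and sign, the usual BSS indeterminacies; the scaling indeterminacy is fixed here by the whitening step.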
Rostad, C.E.
2006-01-01
Polar components in fuels may enable differentiation between fuel types or commercial fuel sources. A range of commercial fuels from numerous sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at parts per million levels in commercial hydrocarbon products, including a range of products from a variety of commercial sources and locations. Because these polar compounds are unique in different fuels, their presence may provide source information on hydrocarbons released into the environment. This analysis was then applied to mixtures of various products, as might be found in accidental releases into the environment. Copyright © Taylor & Francis Group, LLC.
Geologic sources of asbestos in Seattle's Tolt Reservoir
Reid, M.E.; Craven, G.
1996-01-01
Water from Seattle's South Fork Tolt Reservoir contains chrysotile and amphibole asbestos fibers, derived from natural sources. Using optical petrographic techniques, X-ray diffraction, and scanning electron microscopy, we identified the geologic source of these asbestiform minerals within the watershed. No asbestos was found in the bedrock underlying the watershed, while both chrysotile and amphibole fibers were found in sediments transported by Puget-lobe glacial processes. These materials, widely distributed throughout the lower watershed, would be difficult to separate from the reservoir sediments. The probable source of this asbestos is in pods of ultramafic rock occurring north of the watershed. Because asbestos is contained in widespread Puget-lobe glacial materials, it may be naturally distributed in other watersheds in the Puget Sound area.
Tecchio, Franca; Porcaro, Camillo; Barbati, Giulia; Zappasodi, Filippo
2007-01-01
A brain–computer interface (BCI) can be defined as any system that can track the person's intent embedded in his/her brain activity and, from it alone, translate the intention into commands for a computer. Among the brain signal monitoring systems best suited for this challenging task, electroencephalography (EEG) and magnetoencephalography (MEG) are the most realistic, since both are non-invasive, EEG is portable, and MEG could provide more specific information that could later be exploited through EEG signals as well. The first two BCI steps require setting up the appropriate experimental protocol while recording the brain signal and then extracting interesting features from the recorded cerebral activity. To provide information useful in these BCI stages, our aim is to give an overview of a new procedure we recently developed, named functional source separation (FSS). As it derives from blind source separation algorithms, it exploits the most valuable information provided by the electrophysiological techniques, i.e. the waveform signal properties, while remaining blind to the biophysical nature of the signal sources. FSS returns the single-trial source activity, estimates the time course of a neuronal pool across different experimental states on the basis of a specific functional requirement in a specific time period, and uses simulated annealing as the optimization procedure, which allows the exploitation of non-differentiable functional constraints. Moreover, a minor section is included, devoted to information acquired by MEG in stroke patients, to guide BCI applications aiming at sustaining motor behaviour in these patients. Relevant BCI features – spatial and time-frequency properties – are in fact altered by a stroke in the regions devoted to hand control. Moreover, a method to investigate the relationship between sensory and motor hand cortical network activities is described, providing information useful to develop BCI feedback control systems.
This review provides a description of the FSS technique, a promising tool for the BCI community for online electrophysiological feature extraction, and offers interesting information to develop BCI applications to sustain hand control in stroke patients. PMID:17331989
Multi-distance diffuse optical spectroscopy with a single optode via hypotrochoidal scanning.
Applegate, Matthew B; Roblyer, Darren
2018-02-15
Frequency-domain diffuse optical spectroscopy (FD-DOS) is an established technique capable of determining optical properties and chromophore concentrations in biological tissue. Most FD-DOS systems use either manually positioned, handheld probes or complex arrays of source and detector fibers to acquire data from many tissue locations, allowing for the generation of 2D or 3D maps of tissue. Here, we present a new method to rapidly acquire a wide range of source-detector (SD) separations by mechanically scanning a single SD pair. The source and detector fibers are mounted on a scan head that traces a hypotrochoidal pattern over the sample that, when coupled with a high-speed FD-DOS system, enables the rapid collection of dozens of SD separations for depth-resolved imaging. We demonstrate that this system has an average error of 4±2.6% in absorption and 2±1.8% in scattering across all SD separations. Additionally, by linearly translating the device, the size and location of an absorbing inhomogeneity can be determined through the generation of B-scan images in a manner conceptually analogous to ultrasound imaging. This work demonstrates the potential of single optode diffuse optical scanning for depth resolved visualization of heterogeneous biological tissues at near real-time rates.
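The scan-head trajectory can be sketched from the standard hypotrochoid parametric equations (the radii below are illustrative, not the authors' hardware values):

```python
import numpy as np

def hypotrochoid(R, r, d, turns=3, n=4000):
    """Path of a point fixed at distance d from the centre of a circle of
    radius r rolling inside a fixed circle of radius R: the assumed
    trajectory of the single source-detector scan head."""
    t = np.linspace(0.0, 2.0 * np.pi * turns, n)
    x = (R - r) * np.cos(t) + d * np.cos((R - r) / r * t)
    y = (R - r) * np.sin(t) - d * np.sin((R - r) / r * t)
    return x, y
```

The point's distance from the pattern centre sweeps between |R - r - d| and R - r + d, which is what lets a single fibre pair sample a continuous range of source-detector separations as the head turns.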
Chen, Yu-Wen; Chen, Chien-Chih; Huang, Po-Jung; Tseng, Sheng-Hao
2016-01-01
Diffuse reflectance spectroscopy (DRS) based on the frequency-domain (FD) technique has been employed to investigate the optical properties of deep tissues such as breast and brain using source-to-detector separations of up to 40 mm. Due to modeling and system limitations, efficient and precise determination of turbid sample optical properties from the FD diffuse reflectance acquired at a source-detector separation (SDS) of around 1 mm has not been demonstrated. In this study, we revealed that, at an SDS of 1 mm, acquiring FD diffuse reflectance at multiple frequencies is necessary to alleviate the influence of inevitable measurement uncertainty on the optical property recovery accuracy. Furthermore, we developed artificial neural networks (ANNs), trained on databases generated by Monte Carlo simulation, that were capable of efficiently determining FD reflectance at multiple frequencies. The ANNs could work in conjunction with a least-squares optimization algorithm to rapidly (within 1 second) and accurately (within 10%) quantify the sample optical properties from FD reflectance measured at an SDS of 1 mm. In addition, we demonstrated that incorporating a steady-state apparatus into the FD DRS system with 1 mm SDS would enable obtaining broadband absorption and reduced scattering spectra of turbid samples in the wavelength range from 650 to 1000 nm. PMID:27446671
Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I
2017-08-15
Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provide an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
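The SISSY estimator described above combines a sparsity-inducing term in the original source domain with an ADMM solver. The sketch below shows the alternating direction method of multipliers for the simplest such problem, min 0.5*||Ax - b||^2 + lam*||x||_1, i.e. without the structured/variation term the paper actually uses; dimensions and data are invented.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1 (the z-update of ADMM for the lasso)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=300):
    """ADMM for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    m, n = A.shape
    z = np.zeros(n)
    u = np.zeros(n)                          # scaled dual variable
    # Cache the system matrix shared by every x-update.
    AtA_rhoI = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))
        z = soft_threshold(x + u, lam / rho)
        u = u + x - z
    return z                                 # the sparse iterate

# Synthetic sparse-recovery check.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[[3, 7, 15]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.01)
```

The soft-thresholding step is what produces exactly-zero coefficients, which is the property exploited to delineate active regions.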
Spectral studies of cosmic X-ray sources
NASA Astrophysics Data System (ADS)
Blissett, R. J.
1980-01-01
The conventional "indirect" method of reduction and data analysis of spectral data from non-dispersive X-ray detectors, by the fitting of assumed spectral models, is examined. The limitations of this procedure are presented, and alternative schemes are considered in which the derived spectra are not biased to an astrophysical source model. A new method is developed in detail to directly restore incident photon spectra from the detected count histograms. This Spectral Restoration Technique allows an increase in resolution, to a degree dependent on the statistical precision of the data. This is illustrated by numerical simulations. Proportional counter data from Ariel 5 are analysed using this technique. The results obtained for the sources Cas A and the Crab Nebula are consistent with previous analyses and show that increases in resolution of up to a factor three are possible in practice. The source Cyg X-3 is closely examined. Complex spectral variability is found, with the continuum and iron-line emission modulated with the 4.8 hour period of the source. The data suggest multi-component emission in the source. Comparing separate Ariel 5 observations and published data from other experiments, a correlation between the spectral shape and source intensity is evident. The source behaviour is discussed with reference to proposed source models. Data acquired by the low-energy detectors on-board HEAO-1 are analysed using the Spectral Restoration Technique. This treatment explicitly demonstrates the existence of oxygen K-absorption edges in the soft X-ray spectra of the Crab Nebula and Sco X-1. These results are considered with reference to current theories of the interstellar medium. The thesis commences with a review of cosmic X-ray sources and the mechanisms responsible for their spectral signatures, and continues with a discussion of the instruments appropriate for spectral studies in X-ray astronomy.
In situ surface/interface x-ray diffractometer for oxide molecular beam epitaxy
NASA Astrophysics Data System (ADS)
Lee, J. H.; Tung, I. C.; Chang, S.-H.; Bhattacharya, A.; Fong, D. D.; Freeland, J. W.; Hong, Hawoong
2016-01-01
In situ studies of oxide molecular beam epitaxy by synchrotron x-ray scattering have been made possible by upgrading an existing UHV/molecular beam epitaxy (MBE) six-circle diffractometer system. For oxide MBE growth, pure ozone delivery to the chamber has been added, and several new deposition sources have been mounted on a new 12 in. CF (ConFlat, a registered trademark of Varian, Inc.) flange. X-ray diffraction is used as the major probe of film growth and structure in the system. In the original design, electron diffraction was intended as a secondary diagnostic, available without the need for the x-ray beam and located at a separate position. Films can be deposited at either of the two diagnostic positions, with the evaporation sources aimed at a fixed point between the two locations. Ozone can be supplied through separate nozzles at each location, and two separate thickness monitors are installed. Additional features of the equipment are also presented, together with data taken during typical oxide film growth to illustrate the depth of information available via in situ x-ray techniques.
Recent developments in optical detection methods for microchip separations.
Götz, Sebastian; Karst, Uwe
2007-01-01
This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.
Characterization and identification of Na-Cl sources in ground water
Panno, S.V.; Hackley, Keith C.; Hwang, H.-H.; Greenberg, S.E.; Krapac, I.G.; Landsberger, S.; O'Kelly, D. J.
2006-01-01
Elevated concentrations of sodium (Na+) and chloride (Cl -) in surface and ground water are common in the United States and other countries, and can serve as indicators of, or may constitute, a water quality problem. We have characterized the most prevalent natural and anthropogenic sources of Na+ and Cl- in ground water, primarily in Illinois, and explored techniques that could be used to identify their source. We considered seven potential sources that included agricultural chemicals, septic effluent, animal waste, municipal landfill leachate, sea water, basin brines, and road deicers. The halides Cl-, bromide (Br-), and iodide (I-) were useful indicators of the sources of Na+-Cl- contamination. Iodide enrichment (relative to Cl-) was greatest in precipitation, followed by uncontaminated soil water and ground water, and landfill leachate. The mass ratios of the halides among themselves, with total nitrogen (N), and with Na+ provided diagnostic methods for graphically distinguishing among sources of Na+ and Cl- in contaminated water. Cl/Br ratios relative to Cl- revealed a clear, although overlapping, separation of sample groups. Samples of landfill leachate and ground water known to be contaminated by leachate were enriched in I- and Br-; this provided an excellent fingerprint for identifying leachate contamination. In addition, total N, when plotted against Cl/Br ratios, successfully separated water contaminated by road salt from water contaminated by other sources. Copyright © 2005 National Ground Water Association.
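The diagnostic plots described above rest on simple mass ratios of the halides. A minimal sketch of the computation follows; the sample concentrations are hypothetical, and the tendency of road-salt (halite) water toward very high Cl/Br while leachate is relatively Br-enriched is the qualitative pattern the paper reads off graphically, with overlapping group boundaries.

```python
# Halide mass concentrations (mg/L) for hypothetical ground-water samples.
samples = {
    "background": {"Cl": 15.0, "Br": 0.10},
    "road_salt_impacted": {"Cl": 450.0, "Br": 0.09},
    "leachate_impacted": {"Cl": 300.0, "Br": 2.5},
}

def cl_br_ratio(sample):
    """Mass ratio Cl-/Br-, the key diagnostic plotted against Cl- concentration."""
    return sample["Cl"] / sample["Br"]

ratios = {name: cl_br_ratio(s) for name, s in samples.items()}
# Halite dissolution (road salt) yields very high Cl/Br because halite is
# Br-poor, while landfill leachate is relatively enriched in Br- and I-.
```

Plotting such ratios against Cl- (and against total N) is what separates the source groups in the paper.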
Isolation and Characterization of Rat Pituitary Endothelial Cells
Chaturvedi, Kirti; Sarkar, Dipak K.
2010-01-01
Most previous studies that determined the effect of estradiol on angiogenesis used endothelial cells from nonpituitary sources. Because pituitary tumor tissue receives its blood supply via portal and arterial circulation, it is important to use pituitary-derived endothelial cells in studying pituitary angiogenesis. We have developed a magnetic separation technique to isolate endothelial cells from pituitary tissues and have characterized these cells in primary cultures. Endothelial cells of the pituitary showed the existence of endothelial cell marker, CD31, and of von Willebrand factor protein. These cells in cultures also showed immunoreactivity of estrogen receptors alpha and beta. The angiogenic factors, vascular endothelial growth factor and basic fibroblast growth factor, significantly increased proliferation and migration of the pituitary-derived endothelial cells in primary cultures. These results suggest that a magnetic separation technique can be used for enrichment of pituitary-derived endothelial cells for determination of cellular mechanisms governing the vascularization in the pituitary. PMID:17028416
Faint source detection in ISOCAM images
NASA Astrophysics Data System (ADS)
Starck, J. L.; Aussel, H.; Elbaz, D.; Fadda, D.; Cesarsky, C.
1999-08-01
We present a tool adapted to the detection of faint mid-infrared sources within ISOCAM mosaics. This tool is based on a wavelet analysis which allows us to discriminate sources from cosmic ray impacts at the very limit of the instrument, four orders of magnitude below IRAS. It is called PRETI, for Pattern REcognition Technique for ISOCAM data, because glitches with transient behaviors are isolated in the wavelet space, i.e. frequency space, where they present peculiar signatures in the form of patterns that are automatically identified and then reconstructed. We have tested PRETI with Monte-Carlo simulations of fake ISOCAM data. These simulations allowed us to define the fraction of remaining false sources due to cosmic rays, the sensitivity and completeness limits, as well as the photometric accuracy as a function of the observation parameters. Although the main scientific applications of this technique have appeared or will appear in separate papers, we present here an application to the ISOCAM-Hubble Deep Field image. This work completes and confirms the results already published (Aussel et al. 1999).
Mitigating fringing in discrete frequency infrared imaging using time-delayed integration
Ran, Shihao; Berisha, Sebastian; Mankar, Rupali; Shih, Wei-Chuan; Mayerich, David
2018-01-01
Infrared (IR) spectroscopic microscopes provide the potential for label-free quantitative molecular imaging of biological samples, which can be used to aid in histology, forensics, and pharmaceutical analysis. Most IR imaging systems use broadband illumination combined with a spectrometer to separate the signal into spectral components. This technique is currently too slow for many biomedical applications such as clinical diagnosis, primarily due to the limited availability of bright mid-infrared sources and sensitive MCT detectors. There has been a recent push to increase throughput using coherent light sources, such as synchrotron radiation and quantum cascade lasers. While these sources provide a significant increase in intensity, the coherence introduces fringing artifacts in the final image. We demonstrate that applying time-delayed integration in one dimension can dramatically reduce fringing artifacts with minimal alterations to the standard infrared imaging pipeline. The proposed technique also offers the potential for less expensive focal plane array detectors, since linear arrays can be more readily incorporated into the proposed framework. PMID:29552416
NASA Astrophysics Data System (ADS)
Yao, Jiachi; Xiang, Yang; Qian, Sichong; Li, Shengyang; Wu, Shaowei
2017-11-01
In order to separate and identify the combustion noise and the piston slap noise of a diesel engine, a noise source separation and identification method that combines a binaural sound localization method and a blind source separation method is proposed. Because a diesel engine has many complex noise sources, a lead covering method was applied during the noise and vibration test to isolate interfering noise from the No. 1-5 cylinders; only the No. 6 cylinder parts were left bare. Two microphones that simulated the human ears were utilized to measure the radiated noise signals 1 m away from the diesel engine. First, a binaural sound localization method was adopted to separate the noise sources located in different places. Then, for noise sources in the same place, a blind source separation method was utilized to further separate and identify them. Finally, a coherence function method, continuous wavelet time-frequency analysis, and prior knowledge of the diesel engine were combined to further verify the separation results. The results show that the proposed method can effectively separate and identify the combustion noise and the piston slap noise of a diesel engine, which are concentrated around 4350 Hz and 1988 Hz, respectively. Compared with the blind source separation method alone, the proposed method has superior separation and identification performance, and the separation results contain fewer interference components from other noise sources.
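One of the verification steps above uses a coherence function between channels. As an illustration only (not the authors' processing chain), SciPy's magnitude-squared coherence flags a frequency band shared by two microphone channels; the 2 kHz tone below is synthetic, standing in for a component such as the piston-slap band near 1988 Hz.

```python
import numpy as np
from scipy.signal import coherence

fs = 48_000                                   # sampling rate, Hz
t = np.arange(4 * fs) / fs
rng = np.random.default_rng(0)

shared = np.sin(2 * np.pi * 2000 * t)         # common source at 2 kHz
x = shared + 0.5 * rng.standard_normal(t.size)  # microphone 1
y = shared + 0.5 * rng.standard_normal(t.size)  # microphone 2

# Magnitude-squared coherence, averaged over 1024-sample segments.
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
peak_bin = np.argmin(np.abs(f - 2000))
# Coherence is near 1 at the shared 2 kHz component and low elsewhere.
```

A band of high coherence between a separated waveform and a reference channel is what ties that waveform to a physical source.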
Fluid flow in solidifying monotectic alloys
NASA Technical Reports Server (NTRS)
Ecker, A.; Frazier, D. O.; Alexander, J. Iwan D.
1989-01-01
Use of a two-wavelength holographic technique results in a simultaneous determination of temperature and composition profiles during directional solidification in a system with a miscibility gap. The relationships among fluid flow, phase separation, and mass transport during the solidification of the monotectic alloy are discussed. The primary sources of fluid motion in this system are buoyancy and thermocapillary forces. These forces act together when phase separation results in the formation of droplets (this occurs at the solid-liquid interface and in the bulk melt). In the absence of phase separation, buoyancy results from density gradients related to temperature and compositional gradients in the single-phase bulk melt. The effects of buoyancy are especially evident in association with water- or ethanol-rich volumes created at the solid-liquid growth interface.
Measurement of splanchnic photoplethysmographic signals using a new reflectance fiber optic sensor
NASA Astrophysics Data System (ADS)
Hickey, Michelle; Samuels, Neal; Randive, Nilesh; Langford, Richard M.; Kyriacou, Panayiotis A.
2010-03-01
Splanchnic organs are particularly vulnerable to hypoperfusion. Currently, there is no technique that allows for the continuous estimation of splanchnic blood oxygen saturation (SpO2). As a preliminary to developing a suitable splanchnic SpO2 sensor, a new reflectance fiber optic photoplethysmographic (PPG) sensor and processing system are developed. An experimental procedure to examine the effect of fiber source detector separation distance on acquired PPG signals is carried out before finalizing the sensor design. PPG signals are acquired from four volunteers for separation distances of 1 to 8 mm. The separation range of 3 to 6 mm provides the best quality PPG signals with large amplitudes and the highest signal-to-noise ratios (SNRs). Preliminary calculation of SpO2 shows that distances of 3 and 4 mm provide the most realistic values. Therefore, it is suggested that the separation distance in the design of a fiber optic reflectance pulse oximeter be in the range of 3 to 4 mm. Preliminary PPG signals from various splanchnic organs and the periphery are obtained from six anaesthetized patients. The normalized amplitudes of the splanchnic PPGs are, on average, approximately the same as those obtained simultaneously from the periphery. These observations suggest that fiber optic pulse oximetry may be a valid monitoring technique for splanchnic organs.
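Estimating SpO2 from two-wavelength PPG signals such as those above conventionally uses the "ratio of ratios" of the pulsatile (AC) to non-pulsatile (DC) components. A toy sketch with synthetic red/infrared channels; the linear calibration SpO2 ≈ 110 − 25R is a commonly quoted empirical approximation, not the authors' calibration.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 5000)
heart = np.sin(2 * np.pi * 1.2 * t)          # ~72 bpm pulsatile component

# Synthetic PPG channels: DC level 1.0 plus a small pulsatile part.
red = 1.0 + 0.010 * heart
infrared = 1.0 + 0.020 * heart

def perfusion(x):
    """AC/DC ratio of a PPG channel (standard deviation as the AC measure)."""
    return np.std(x) / np.mean(x)

R = perfusion(red) / perfusion(infrared)     # "ratio of ratios", here 0.5
spo2 = 110.0 - 25.0 * R                      # empirical linear calibration
```

Adequate AC amplitude and SNR at both wavelengths, which the 3-4 mm separation maximizes, is what makes R (and hence SpO2) reliable.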
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodds, W. K.; Collins, S. M.; Hamilton, S. K.
2014-10-01
Analyses of 21 15N stable isotope tracer experiments, designed to examine food web dynamics in streams around the world, indicated that the isotopic composition of food resources assimilated by primary consumers (mostly invertebrates) poorly reflected the presumed food sources. Modeling indicated that consumers assimilated only 33–50% of the N available in sampled food sources such as decomposing leaves, epilithon, and fine particulate detritus over feeding periods of weeks or more. Thus, common methods of sampling food sources consumed by animals in streams do not sufficiently reflect the pool of N they assimilate. Finally, isotope tracer studies, combined with modeling and food separation techniques, can improve estimation of N pools in food sources that are assimilated by consumers.
Proceedings of the Numerical Modeling for Underground Nuclear Test Monitoring Symposium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, S.R.; Kamm, J.R.
1993-11-01
The purpose of the meeting was to discuss the state-of-the-art in numerical simulations of nuclear explosion phenomenology with applications to test ban monitoring. We focused on the uniqueness of model fits to data, the measurement and characterization of material response models, advanced modeling techniques, and applications of modeling to monitoring problems. The second goal of the symposium was to establish a dialogue between seismologists and explosion-source code calculators. The meeting was divided into five main sessions: explosion source phenomenology, material response modeling, numerical simulations, the seismic source, and phenomenology from near source to far field. We feel the symposium reached many of its goals. Individual papers submitted at the conference are indexed separately on the data base.
External control of electron energy distributions in a dual tandem inductively coupled plasma
NASA Astrophysics Data System (ADS)
Liu, Lei; Sridhar, Shyam; Zhu, Weiye; Donnelly, Vincent M.; Economou, Demetre J.; Logue, Michael D.; Kushner, Mark J.
2015-08-01
The control of electron energy probability functions (EEPFs) in low pressure partially ionized plasmas is typically accomplished through the format of the applied power. For example, through the use of pulsed power, the EEPF can be modulated to produce shapes not possible under continuous wave excitation. This technique uses internal control. In this paper, we discuss a method for external control of EEPFs by transport of electrons between separately powered inductively coupled plasmas (ICPs). The reactor incorporates dual ICP sources (main and auxiliary) in a tandem geometry whose plasma volumes are separated by a grid. The auxiliary ICP is continuously powered while the main ICP is pulsed. Langmuir probe measurements of the EEPFs during the afterglow of the main ICP suggest that transport of hot electrons from the auxiliary plasma provided what is effectively an external source of energetic electrons. The tail of the EEPF and the bulk electron temperature were then elevated in the afterglow of the main ICP by this external source of power. Results from a computer simulation of the evolution of the EEPFs concur with the measured trends.
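Langmuir probe EEPF measurements of the kind cited above conventionally use the Druyvesteyn relation, in which the EEPF is proportional to the second derivative of the electron retardation current with respect to probe voltage. A sketch on a synthetic, noise-free Maxwellian trace (real probe data need smoothing before double differentiation); all numbers are illustrative.

```python
import numpy as np

Te = 2.0                              # electron temperature, eV
Vp = 0.0                              # plasma potential, V
V = np.linspace(-12.0, 0.0, 2000)     # probe bias in the retardation region

# Maxwellian electron retardation current: I ~ exp(-(Vp - V)/Te).
I = 1e-3 * np.exp(-(Vp - V) / Te)

# Druyvesteyn: the EEPF g(eps) is proportional to d^2I/dV^2 at eps = Vp - V.
d2I = np.gradient(np.gradient(I, V), V)
eps = Vp - V

# For a Maxwellian, ln g(eps) is linear in eps with slope -1/Te.
mask = (eps > 1.0) & (eps < 8.0)
slope = np.polyfit(eps[mask], np.log(d2I[mask]), 1)[0]
Te_est = -1.0 / slope
```

The slope of the log-EEPF tail is how changes in the afterglow electron temperature, as reported in the abstract, are read off.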
Use of global ionospheric maps for HF Doppler measurements interpretation
NASA Astrophysics Data System (ADS)
Petrova, I. R.; Bochkarev, V. V.; Latypov, R. R.
2018-04-01
The HF Doppler technique, a method of measuring the Doppler frequency shift of an ionospherically propagated signal, is one of the well-known and widely used techniques of ionosphere research. It allows investigation of various disturbances in the ionosphere. There are different sources of disturbances in the ionosphere, such as geomagnetic storms, solar flares, meteorological effects and atmospheric waves. The HF Doppler technique also allows us to detect the influence on the ionosphere of earthquakes, explosions and other processes that occur near the Earth's surface. The HF Doppler technique has high sensitivity to small frequency variations and high time resolution, but interpretation of the results is difficult. In this paper, we attempt to use GPS data for the interpretation of Doppler measurements. Modeling Doppler frequency shift variations with the use of TEC allows medium-scale ionospheric disturbances to be separated.
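The TEC-based modeling mentioned above rests on the first-order ionospheric phase term: the phase path contribution is approximately -40.3*TEC/f^2, so a time-varying TEC induces a Doppler shift of about (40.3/(c*f))*dTEC/dt. The sketch below shows that conversion for a single transionospheric pass; the geometry (and sign convention) of an obliquely reflected HF sounding differs, so this is illustrative only.

```python
import numpy as np

C = 2.998e8        # speed of light, m/s
K = 40.3           # first-order ionospheric refraction constant, m^3/s^2

def doppler_from_tec(tec, dt, f):
    """Doppler shift (Hz) induced by a time-varying TEC (electrons/m^2)
    along a single transionospheric path at carrier frequency f (Hz)."""
    dtec_dt = np.gradient(tec, dt)
    return K / (C * f) * dtec_dt

# A 0.01 TECU/s ramp (1 TECU = 1e16 electrons/m^2) observed at 10 MHz.
t = np.arange(0.0, 60.0, 1.0)
tec = 50e16 + 1e14 * t
shift = doppler_from_tec(tec, 1.0, 10e6)    # ~1.3 Hz for this ramp
```

Comparing such model shifts against measured HF Doppler records is the kind of interpretation aid the paper pursues with GPS-derived TEC maps.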
Investigation to advance prediction techniques of the low-speed aerodynamics of V/STOL aircraft
NASA Technical Reports Server (NTRS)
Maskew, B.; Strash, D.; Nathman, J.; Dvorak, F. A.
1985-01-01
A computer program, VSAERO, has been applied to a number of V/STOL configurations with a view to advancing prediction techniques for the low-speed aerodynamic characteristics. The program couples a low-order panel method with surface streamline calculation and integral boundary layer procedures. The panel method--which uses piecewise constant source and doublet panels--includes an iterative procedure for wake shape and models the boundary layer displacement effect using the source transpiration technique. Certain improvements to a basic vortex tube jet model were installed in the code prior to evaluation. Very promising results were obtained for surface pressures near a jet issuing at 90 deg from a flat plate. A solid core model was used in the initial part of the jet with a simple entrainment model. Preliminary representation of the downstream separation zone significantly improved the correlation. The program accurately predicted the pressure distribution inside the inlet on the Grumman 698-411 design over a range of flight conditions. Furthermore, coupled viscous/potential flow calculations gave very close correlation with experimentally determined operational boundaries dictated by the onset of separation inside the inlet. The experimentally observed degradation of these operational boundaries between nacelle-alone tests and tests on the full configuration was also indicated by the calculation. Application of the program to the General Dynamics STOL fighter design was equally encouraging. Very close agreement was observed between experiment and calculation for the effects of power on pressure distribution, lift, and lift curve slope.
Resistive loads powered by separate or by common electrical sources
NASA Technical Reports Server (NTRS)
Appelbaum, J.
1989-01-01
In designing a multiple load electrical system, the designer may wish to compare the performance of two setups: a common electrical source powering all loads, or separate electrical sources powering individual loads. Three types of electrical sources: an ideal voltage source, an ideal current source, and solar cell source powering resistive loads were analyzed for their performances in separate and common source systems. A mathematical proof is given, for each case, indicating the merit of the separate or common source system. The main conclusions are: (1) identical resistive loads powered by ideal voltage sources perform the same in both system setups, (2) nonidentical resistive loads powered by ideal voltage sources perform the same in both system setups, (3) nonidentical resistive loads powered by ideal current sources have higher performance in separate source systems, and (4) nonidentical resistive loads powered by solar cells have higher performance in a common source system for a wide range of load resistances.
Capillary electrophoresis in two-dimensional separation systems: Techniques and applications.
Kohl, Felix J; Sánchez-Hernández, Laura; Neusüß, Christian
2015-01-01
The analysis of complex samples requires powerful separation techniques. Here, 2D chromatographic separation techniques (e.g. LC-LC, GC-GC) are increasingly applied in many fields. Electrophoretic separation techniques show a different selectivity in comparison to LC and GC and very high separation efficiency. Thus, 2D separation systems containing at least one CE-based separation technique are an interesting alternative featuring potentially a high degree of orthogonality. However, the generally small volumes and strong electrical fields in CE require special coupling techniques. These technical developments are reviewed in this work, discussing benefits and drawbacks of offline and online systems. Emphasis is placed on the design of the systems, their coupling, and the detector used. Moreover, the employment of strategies to improve peak capacity, resolution, or sensitivity is highlighted. Various applications of 2D separations with CE are summarized. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Scalar mixtures in porous media
NASA Astrophysics Data System (ADS)
Kree, Mihkel; Villermaux, Emmanuel
2017-10-01
Using a technique allowing for in situ measurement of concentration fields, the evolution of scalar mixtures flowing within a porous medium made of a three-dimensional random stack of solid spheres is addressed. Two distinct fluorescent dyes are injected from separate sources. Their evolution as they disperse and mix through the medium is directly observed and quantified, which is made possible by matching the refractive indices of the spheres and the flowing interstitial liquid. We decipher the nature of the interaction rule between the scalar sources, explaining the phenomenon that alters the concentration distribution of the overall mixture as it decays toward uniformity. Any residual correlation of the initially merged sources is progressively hidden, leading to an effective fully random interaction rule of the two distinct subfields.
NASA Technical Reports Server (NTRS)
Hartfield, Roy
1996-01-01
Raman scattering is a powerful technique for quantitatively probing high temperature and high speed flows. However, this technique has typically been limited to clean hydrogen flames because of the broadband fluorescence interference which occurs in hydrocarbon flames. Fluorescence can also interfere with the Raman signal in clean hydrogen flames when broadband UV lasers are used as the scattering source. A solution to this problem has been demonstrated. The solution to the fluorescence interference lies in the fact that the vibrational Q-branch Raman signal is highly polarized for 90 deg. signal collection and the fluorescence background is essentially unpolarized. Two basic schemes are available for separating the Raman from the background. One scheme involves using a polarized laser and collecting a signal with both horizontal and vertical laser polarizations separately. The signal with the vertical polarization will contain both the Raman and the fluorescence while the signal with the horizontal polarization will contain only the fluorescence. The second scheme involves polarization discrimination on the collection side of the optical setup. For vertical laser polarization, the scattered Q-branch Raman signal will be vertically polarized; hence the two polarizations can be collected separately and the difference between the two is the Raman signal. This approach has been used for the work found herein and has the advantage of allowing the data to be collected from the same laser shot(s). This makes it possible to collect quantitative Raman data with single shot resolution in conditions where interference cannot otherwise be eliminated.
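The polarization-discrimination scheme described above reduces to a per-channel subtraction: the vertically polarized collection contains Raman plus fluorescence, the cross-polarized one fluorescence only. A minimal synthetic illustration, assuming a fully polarized Q-branch line and fully unpolarized fluorescence (in practice the Q branch has a small but nonzero depolarization ratio, so a scaled subtraction may be needed):

```python
import numpy as np

wavenumber = np.linspace(500.0, 4500.0, 2000)   # Raman shift axis, 1/cm

# Broadband, unpolarized fluorescence appears equally in both channels.
fluorescence = 100.0 * np.exp(-((wavenumber - 2500.0) / 1500.0) ** 2)
# Polarized Q-branch Raman line (e.g. N2 near 2331 1/cm), vertical channel only.
raman = 40.0 * np.exp(-((wavenumber - 2331.0) / 15.0) ** 2)

vertical = fluorescence + raman       # parallel-polarized collection
horizontal = fluorescence             # cross-polarized collection

recovered = vertical - horizontal     # fluorescence cancels, Raman survives
```

Collecting both channels from the same laser shots, as the text notes, is what lets the subtraction work at single-shot resolution.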
Chen, Haibin; Yang, Yan; Jiang, Wei; Song, Mengjie; Wang, Ying; Xiang, Tiantian
2017-02-01
A case study on the source separation of municipal solid waste (MSW) was performed in Changsha, the capital city of Hunan Province, China. The objective of this study is to analyze the effects of different separation methods and compare their effects with citizens' attitudes and inclination. An effect evaluation method based on accuracy rate and miscellany rate was proposed to study the performance of different separation methods. A large-scale questionnaire survey was conducted to determine citizens' attitudes and inclination toward source separation. Survey result shows that the vast majority of respondents hold consciously positive attitudes toward participation in source separation. Moreover, the respondents ignore the operability of separation methods and would rather choose the complex separation method involving four or more subclassed categories. For the effects of separation methods, the site experiment result demonstrates that the relatively simple separation method involving two categories (food waste and other waste) achieves the best effect with the highest accuracy rate (83.1%) and the lowest miscellany rate (16.9%) among the proposed experimental alternatives. The outcome reflects the inconsistency between people's environmental awareness and behavior. Such inconsistency and conflict may be attributed to the lack of environmental knowledge. Environmental education is assumed to be a fundamental solution to improve the effect of source separation of MSW in Changsha. Important management tips on source separation, including the reformation of the current pay-as-you-throw (PAYT) system, are presented in this work.
The proposed evaluation method can be extended to other cities to determine the most effective separation method during the planning stage or to evaluate the performance of running source separation systems.
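The accuracy-rate/miscellany-rate evaluation can be sketched as follows; the definitions and masses here are assumptions inferred from the complementary percentages quoted in the abstract (83.1% and 16.9%), not the paper's exact formulas.

```python
# Assumed definitions: for a collection bin, the accuracy rate is the
# fraction of collected waste mass that belongs in that bin, and the
# miscellany rate is the fraction that does not, so the two sum to 1.

def evaluate_bin(correct_mass_kg, misplaced_mass_kg):
    total = correct_mass_kg + misplaced_mass_kg
    accuracy = correct_mass_kg / total
    return accuracy, 1.0 - accuracy

acc, misc = evaluate_bin(83.1, 16.9)  # hypothetical masses summing to 100 kg
print(f"accuracy={acc:.1%}, miscellany={misc:.1%}")
# accuracy=83.1%, miscellany=16.9%
```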
Contributions of Organic Sources to Atmospheric Aerosol Particle Concentrations and Growth
NASA Astrophysics Data System (ADS)
Russell, L. M.
2017-12-01
Organic molecules are important contributors to aerosol particle mass and number concentrations through primary emissions as well as secondary growth in the atmosphere. New techniques for measuring organic aerosol components in atmospheric particles have improved measurements of this contribution over the last 20 years, including Scanning Transmission X-ray Microscopy Near Edge X-ray Absorption Fine Structure (STXM-NEXAFS), Fourier Transform Infrared spectroscopy (FTIR), and High-Resolution Aerosol Mass Spectrometry (AMS). STXM-NEXAFS individual aerosol particle composition illustrated the variety of morphology of organic components in marine aerosols, the inherent relationships between organic composition and shape, and the links between atmospheric aerosol composition and particles produced in smog chambers. This type of single-particle microscopy has also added to size distribution measurements by providing evidence of how surface-controlled and bulk-controlled processes contribute to the growth of particles in the atmosphere. FTIR analysis of organic functional groups is sufficient to distinguish combustion, marine, and terrestrial organic particle sources and to show that each of those types of sources has a surprisingly similar organic functional group composition over four different oceans and four different continents. Augmenting the limited sampling of these off-line techniques with side-by-side inter-comparisons to online AMS provides complementary composition information and consistent quantitative attribution to sources (despite some clear method differences). Single-particle AMS techniques using light scattering and event trigger modes have now also characterized the types of particles found in urban, marine, and ship emission aerosols.
Most recently, by combining with off-line techniques, single particle composition measurements have separated and quantified the contributions of organic, sulfate and salt components from ocean biogenic and sea spray emissions to particles, addressing the persistent question of the sources of cloud condensation nuclei in clean marine conditions.
NASA Astrophysics Data System (ADS)
Abdo, Aws Ahmad
2007-08-01
Very high energy gamma-rays can be used to probe some of the most powerful astrophysical objects in the universe, such as active galactic nuclei, supernova remnants and pulsar-powered nebulae. The diffuse gamma radiation arising from the interaction of cosmic-ray particles with matter and radiation in the Galaxy is one of the few probes available to study the origin of cosmic rays. Milagro is a water Cherenkov detector that continuously views the entire overhead sky. The large field of view combined with the long observation time makes Milagro the most sensitive instrument available for the study of large, low surface brightness sources such as the diffuse gamma radiation arising from interactions of cosmic radiation with interstellar matter. In this thesis I present a new background rejection technique for the Milagro detector through the development of a new gamma-hadron separation variable. The Abdo variable, A4, coupled with the weighting analysis technique significantly improves the sensitivity of the Milagro detector. This new analysis technique resulted in the first discoveries in Milagro. Four localized sources of TeV gamma-ray emission have been discovered, three of which are in the Cygnus region of the Galaxy and one closer to the Galactic center. In addition to these localized sources, a diffuse emission of TeV gamma-rays has been discovered from the Cygnus region of the Galaxy as well. However, the TeV gamma-ray flux as measured at ~12 TeV from the Cygnus region exceeds that predicted from a conventional model of cosmic-ray production and propagation. This observation indicates the existence of either hard-spectrum cosmic-ray sources and/or other sources of TeV gamma rays in the region. Other TeV gamma-ray source candidates with post-trial statistical significances of >4σ have also been observed in the Galactic plane.
In situ surface/interface x-ray diffractometer for oxide molecular beam epitaxy
Lee, J. H.; Tung, I. C.; Chang, S. -H.; ...
2016-01-05
In situ studies of oxide molecular beam epitaxy (MBE) by synchrotron x-ray scattering have been made possible by upgrading an existing UHV/MBE six-circle diffractometer system. For oxide MBE growth, pure ozone delivery to the chamber has been made available, and several new deposition sources have been added on a new 12 in. CF (ConFlat, a registered trademark of Varian, Inc.) flange. X-ray diffraction is used as the major probe of film growth and structure for the system. In the original design, electron diffraction was intended as the secondary diagnostic, available without the necessity of the x-ray and located at a separate position. Deposition of films is possible at the two diagnostic positions, and the aiming of the evaporation sources is fixed to the point between the two locations. Ozone can be supplied through two separate nozzles, one for each location, and two separate thickness monitors are installed. Finally, additional features of the equipment are presented together with data taken during typical oxide film growth to illustrate the depth of information available via in situ x-ray techniques.
ERP denoising in multichannel EEG data using contrasts between signal and noise subspaces.
Ivannikov, Andriy; Kalyakin, Igor; Hämäläinen, Jarmo; Leppänen, Paavo H T; Ristaniemi, Tapani; Lyytinen, Heikki; Kärkkäinen, Tommi
2009-06-15
In this paper, a new method intended for ERP denoising in multichannel EEG data is discussed. The denoising is done by separating the ERP/noise subspaces in multidimensional EEG data by a linear transformation, followed by dimension reduction in which noise components are ignored during the inverse transformation. The separation matrix is found based on the assumption that ERP sources are deterministic across all repetitions of the same type of stimulus within the experiment, while the other noise sources do not obey this determinacy property. A detailed derivation of the technique is given, together with an analysis of the results of its application to a real high-density EEG data set. The interpretation of the results and the performance of the proposed method under conditions when the basic assumptions are violated, e.g. when the problem is underdetermined, are also discussed. Moreover, we study how the number of channels and trials used by the method influences the effectiveness of ERP/noise subspace separation. In addition, we explore the impact of different data resampling strategies on the performance of the considered algorithm. The results can help in determining the optimal parameters of the equipment/methods used to elicit and reliably estimate ERPs.
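A minimal illustration of the determinacy assumption underlying the method (not the paper's full separation-matrix estimation): since the ERP is taken to repeat identically across trials while the noise varies, the trial average estimates the ERP and the per-trial residual estimates the noise.

```python
# Simplified sketch: split repeated-trial EEG epochs into a deterministic
# ERP estimate (trial average) and per-trial noise residuals.

def split_erp_noise(trials):
    """trials: list of equal-length sample lists, one per stimulus repetition."""
    n = len(trials)
    erp = [sum(samples) / n for samples in zip(*trials)]
    noise = [[x - e for x, e in zip(trial, erp)] for trial in trials]
    return erp, noise

trials = [[1.0, 2.0, 1.0], [3.0, 2.0, 1.0], [2.0, 2.0, 1.0]]
erp, noise = split_erp_noise(trials)
print(erp)  # [2.0, 2.0, 1.0]
```

As the abstract notes, the number of trials matters: with more repetitions the average converges to the deterministic ERP and the residual better isolates the noise subspace.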
Structural changes in loaded equine tendons can be monitored by a novel spectroscopic technique
Kostyuk, Oksana; Birch, Helen L; Mudera, Vivek; Brown, Robert A
2004-01-01
This study aimed to investigate the preferential collagen fibril alignment in unloaded and loaded tendons using elastic scattering spectroscopy. The device consisted of an optical probe, a pulsed light source (320–860 nm), a spectrometer and a PC. Two probes with either 2.75 mm or 300 μm source-detector separations were used to monitor deep and superficial layers, respectively. Equine superficial digital flexor tendons were subjected to ex vivo progressive tensional loading. Seven times more backscattered light was detected parallel rather than perpendicular to the tendon axis with the 2.75 mm separation probe in unloaded tendons. In contrast, using the 300 μm separation probe the plane of maximum backscatter (3-fold greater) was perpendicular to the tendon axis. There was no optical anisotropy in the cross-sectional plane of the tendon (i.e. the transversely cut tendon surface), with no structural anisotropy. During mechanical loading (9–14% strain) backscatter anisotropy increased 8.5- to 18.5-fold along the principal strain axis for 2.75 mm probe separation, but almost disappeared in the perpendicular plane (measured using the 300 μm probe separation). Optical (anisotropy) and mechanical (strain) measurements were highly correlated. We conclude that spatial anisotropy of backscattered light can be used for quantitative monitoring of collagen fibril alignment and tissue reorganization during loading, with the potential for minimally invasive real-time structural monitoring of fibrous tissues in normal, pathological or repairing tissues and in tissue engineering. PMID:14578479
Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics.
Chen, Xun; Liu, Aiping; Chen, Qiang; Liu, Yu; Zou, Liang; McKeown, Martin J
2017-09-01
Electroencephalography (EEG) recordings are frequently contaminated by both ocular and muscle artifacts. These are normally dealt with separately, by employing blind source separation (BSS) techniques relying on either second-order or higher-order statistics (SOS and HOS, respectively). When HOS-based methods are used, it is usually under the assumption that artifacts are statistically independent of the EEG. When SOS-based methods are used, it is assumed that artifacts have autocorrelation characteristics distinct from the EEG. In reality, ocular and muscle artifacts neither follow strict temporal independence from the EEG nor have completely unique autocorrelation characteristics, suggesting that exploiting HOS or SOS alone may be insufficient to remove these artifacts. Here we employ a novel BSS technique, independent vector analysis (IVA), to jointly exploit HOS and SOS to remove ocular and muscle artifacts simultaneously. Numerical simulations and application to real EEG recordings were used to explore the utility of the IVA approach. IVA was superior in isolating both ocular and muscle artifacts, especially for raw EEG data with a low signal-to-noise ratio, and also integrated the usually separate SOS and HOS steps into a single unified step. Copyright © 2017 Elsevier Ltd. All rights reserved.
Column-coupling strategies for multidimensional electrophoretic separation techniques.
Kler, Pablo A; Sydes, Daniel; Huhn, Carolin
2015-01-01
Multidimensional electrophoretic separations represent one of the most common strategies for dealing with the analysis of complex samples. In recent years we have been witnessing the explosive growth of separation techniques for the analysis of complex samples in applications ranging from life sciences to industry. In this sense, electrophoretic separations offer several strategic advantages, such as excellent separation efficiency, different methods with a broad range of separation mechanisms, and low liquid consumption generating fewer waste effluents and lower costs per analysis, among others. Despite their impressive separation efficiency, multidimensional electrophoretic separations present some drawbacks that have delayed their extensive use: the volumes of the columns, and consequently of the injected sample, are significantly smaller compared to other analytical techniques, thus the coupling interfaces between two separation stages must be very efficient in terms of providing geometrical precision with low dead volume. Likewise, very sensitive detection systems are required. Additionally, in electrophoretic separation techniques, the surface properties of the columns play a fundamental role for electroosmosis as well as the unwanted adsorption of proteins or other complex biomolecules. In this sense the requirements for an efficient coupling for electrophoretic separation techniques involve several aspects related to microfluidics and physicochemical interactions of the electrolyte solutions and the solid capillary walls. It is interesting to see how these multidimensional electrophoretic separation techniques have been used jointly with different detection techniques, for intermediate detection as well as for final identification and quantification, particularly important in the case of mass spectrometry.
In this work we present a critical review about the different strategies for coupling two or more electrophoretic separation techniques and the different intermediate and final detection methods implemented for such separations.
Porcaro, Camillo; Cottone, Carlo; Cancelli, Andrea; Salustri, Carlo; Tecchio, Franca
2018-04-01
High time resolution techniques are crucial for investigating the brain in action. Here, we propose a method to identify a section of the upper-limb motor area representation (FS_M1) by means of electroencephalographic (EEG) signals recorded during a completely passive condition (FS_M1bySS). We delivered a galvanic stimulation to the median nerve and applied to the EEG the semi-blind source separation (s-BSS) algorithm named Functional Source Separation (FSS). To prove that FS_M1bySS is part of FS_M1, we also collected EEG in a motor condition, i.e. during a voluntary movement task (isometric handgrip), and in a rest condition, i.e. at rest with eyes open and closed. In the motor condition, we show that the cortico-muscular coherence (CMC) of FS_M1bySS does not differ from the FS_M1 CMC (0.04 for both sources). Moreover, we show that the ongoing whole-band activity of FS_M1bySS during the motor and both rest conditions displays high mutual information and time correlation with FS_M1 (above 0.900 and 0.800, respectively), whereas much smaller values with the primary somatosensory cortex [Formula: see text] (about 0.300 and 0.500, [Formula: see text]). Obtaining FS_M1bySS as a marker of the upper-limb FS_M1 representation without the execution of an active motor task is a great achievement of the FSS algorithm, relevant in most experimental, neurological and psychiatric protocols.
NASA Astrophysics Data System (ADS)
Daellenbach, Kaspar R.; El-Haddad, Imad; Karvonen, Lassi; Vlachou, Athanasia; Corbin, Joel C.; Slowik, Jay G.; Heringa, Maarten F.; Bruns, Emily A.; Luedin, Samuel M.; Jaffrezo, Jean-Luc; Szidat, Sönke; Piazzalunga, Andrea; Gonzalez, Raquel; Fermo, Paola; Pflueger, Valentin; Vogel, Guido; Baltensperger, Urs; Prévôt, André S. H.
2018-02-01
We assess the benefits of offline laser-desorption/ionization mass spectrometry in understanding ambient particulate matter (PM) sources. The technique was optimized for measuring PM collected on quartz-fiber filters, using silver nitrate as an internal standard for m/z calibration. This is the first application of this technique to samples collected at nine sites in central Europe throughout the entire year of 2013 (819 samples). Different PM sources were identified by positive matrix factorization (PMF), which also included concomitant measurements (such as NOx, levoglucosan, and temperature). By comparison to reference mass spectral signatures from laboratory wood burning experiments as well as samples from a traffic tunnel, three biomass burning factors and two traffic factors were identified. The wood burning factors could be linked to the burning conditions; the factors related to inefficient burns had a larger impact on air quality in southern Alpine valleys than in northern Switzerland. The traffic factors were identified as primary tailpipe exhaust and most probably aged/secondary traffic emissions. The latter attribution was supported by radiocarbon analyses of both the organic and elemental carbon. Besides these sources, factors related to secondary organic aerosol were also separated. The contribution of the wood burning emissions based on LDI-PMF (laser-desorption/ionization PMF) correlates well with that based on AMS-PMF (aerosol mass spectrometer PMF) analyses, while the comparison between the two techniques for other components is more complex.
Timothy Callahan; Austin E. Morrison
2016-01-01
Interpreting storm-event runoff in coastal plain watersheds is challenging because of the space- and time-variable nature of the different sources that contribute to stream flow. These flow vectors and the magnitude of water flux are dependent on the pre-storm soil moisture (as estimated from the depth to the water table) in the lower coastal plain (LCP) region.
Parallel Implementation of the Wideband DOA Algorithm on the IBM Cell BE Processor
2010-05-01
The Multiple Signal Classification (MUSIC) algorithm is a powerful technique for determining the Direction of Arrival (DOA) of signals... Broadband Engine Processor (Cell BE). The process of adapting the serial MUSIC algorithm to the Cell BE will be analyzed in terms of parallelism and... using the Multiple Signal Classification (MUSIC) algorithm [4] • Computation of the focus matrix • Computation of the number of sources • Separation of signal
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break-LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and data bases and special applications.
Yang, Zeyu; Hollebone, Bruce P; Wang, Zhendi; Yang, Chun; Brown, Carl; Landriault, Mike
2013-06-01
A case study is presented on the forensic identification of several spilled biodiesels and their blends with petroleum oil using integrated forensic oil fingerprinting techniques. The integrated fingerprinting techniques combined SPE with GC/MS to obtain individual petroleum hydrocarbons (aliphatic hydrocarbons, polyaromatic hydrocarbons and their alkylated derivatives, and biomarkers) and biodiesel compounds (fatty acid methyl esters, free fatty acids, glycerol, monoacylglycerides, and free sterols). HPLC equipped with an evaporative light scattering detector was also used to identify the compounds that conventional GC/MS could not resolve. The three environmental samples (E1, E2, and E3) and one suspected source sample (S2) were dominated by vegetable oil with high acid values and a low concentration of fatty acid methyl esters. The suspected source sample S2 was responsible for the three spilled samples, although E1 was slightly contaminated by petroleum oil with light hydrocarbons. The suspected source sample S1 exhibited a high content of glycerol, a low content of glycerides, and high polarity, indicating its difference from the other samples. These samples may be separated byproducts of biodiesel production. Canola oil is the most likely feedstock for the three environmental samples and the suspected source sample S2. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Optical detection of gold nanoparticles in a prostate-shaped porcine phantom.
Grabtchak, Serge; Tonkopi, Elena; Whelan, William M
2013-07-01
Gold nanoparticles can be used as molecular contrast agents that bind specifically to cancer sites and thus delineate tumor regions. Imaging gold nanoparticles deeply embedded in tissues with optical techniques poses significant challenges due to multiple scattering of optical photons, which blurs the obtained images. Both diagnostic and therapeutic applications can benefit from a minimally invasive technique that can identify, localize, and quantify the payloads of gold nanoparticles deeply embedded in biological tissues. An optical radiance technique is applied to map localized inclusions of gold nanorods in the 650- to 900-nm spectral range in a porcine phantom that mimics prostate geometry. Optical radiance describes the variation in the angular density of photons impinging on a selected point in the tissue from various directions. The inclusions were formed by immersing a capillary filled with gold nanorods in the phantom at increasing distances from the detecting fiber. The technique allows the isolation of the spectroscopic signatures of the inclusions from the background and the identification of inclusion locations in the angular domain. Detection of ∼4×10¹⁰ gold nanoparticles or 0.04 mg Au/mL (detector-inclusion separation 10 mm, source-detector separation 15 mm) in the porcine tissue is demonstrated. These encouraging results indicate a promising potential of radiance spectroscopy in early prostate cancer diagnostics with gold nanoparticles.
A source number estimation method for single optical fiber sensor
NASA Astrophysics Data System (ADS)
Hu, Junpeng; Huang, Zhiping; Su, Shaojing; Zhang, Yimeng; Liu, Chunwu
2015-10-01
The single-channel blind source separation (SCBSS) technique is of great significance in many fields, such as optical fiber communication, sensor detection, and image processing. Realizing blind source separation (BSS) from data received by a single optical fiber sensor has a wide range of applications. The performance of many BSS algorithms and signal processing methods degrades with inaccurate source number estimation. Many excellent algorithms have been proposed for source number estimation in array signal processing with multiple sensors, but they cannot be applied directly to the single-sensor case. This paper presents a source number estimation method for data received by a single optical fiber sensor. By delay processing, the single-sensor data are converted to a multi-dimensional form, and the data covariance matrix is constructed. The estimation algorithms used in array signal processing can then be utilized. Information theoretic criteria (ITC) based methods, represented by AIC and MDL, and Gerschgorin's disk estimation (GDE) are introduced to estimate the source number of the signal received by the single optical fiber sensor. To improve the performance of these estimation methods at low signal-to-noise ratio (SNR), a smoothing process is applied to the data covariance matrix, reducing the fluctuation and uncertainty of its eigenvalues. Simulation results show that ITC-based methods cannot estimate the source number effectively under colored noise. The GDE method, although its performance is poor at low SNR, is able to accurately estimate the number of sources under colored noise. The experiments also show that the proposed method can be applied to estimate the source number from single-sensor data.
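The MDL criterion referenced above can be sketched as follows, assuming the eigenvalues of the (smoothed, delay-embedded) data covariance matrix have already been computed; the penalty constant follows the standard form, and the eigenvalue lists are illustrative.

```python
# Hedged sketch of the MDL source-number criterion: the estimated source
# count is the k minimizing MDL(k), which trades off how "flat" the m - k
# smallest eigenvalues are (geometric vs. arithmetic mean) against a
# model-complexity penalty.

import math

def mdl(eigvals, n_snapshots):
    """eigvals: covariance eigenvalues sorted in descending order."""
    m = len(eigvals)
    scores = []
    for k in range(m):
        tail = eigvals[k:]
        geo = math.exp(sum(math.log(v) for v in tail) / len(tail))
        arith = sum(tail) / len(tail)
        likelihood = -n_snapshots * (m - k) * math.log(geo / arith)
        penalty = 0.5 * k * (2 * m - k) * math.log(n_snapshots)
        scores.append(likelihood + penalty)
    return min(range(m), key=scores.__getitem__)

# Two dominant eigenvalues over a near-flat noise floor -> two sources.
print(mdl([9.0, 7.5, 1.01, 0.99, 1.0, 1.0], n_snapshots=1000))  # 2
```

Under white noise the m - k smallest eigenvalues are nearly equal, so the likelihood term vanishes at the true k; colored noise breaks this flatness, which is consistent with the abstract's finding that ITC-based methods fail there while GDE does not.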
Ishikawa, Masayori; Nagase, Naomi; Matsuura, Taeko; Hiratsuka, Junichi; Suzuki, Ryusuke; Miyamoto, Naoki; Sutherland, Kenneth Lee; Fujita, Katsuhisa; Shirato, Hiroki
2015-01-01
Abstract The scintillator with optical fiber (SOF) dosimeter consists of a miniature scintillator mounted on the tip of an optical fiber. The scintillator of the current SOF dosimeter is a 1-mm diameter hemisphere. For a scintillation dosimeter coupled with an optical fiber, measurement accuracy is influenced by signals due to Cerenkov radiation in the optical fiber. We have implemented a spectral filtering technique for compensating for the Cerenkov radiation effect specifically for our plastic scintillator-based dosimeter, using a wavelength-separated counting method. A dichroic mirror was used for separating input light signals. Individual signal counting was performed for high- and low-wavelength light signals. To confirm the accuracy, measurements with various amounts of Cerenkov radiation were performed by changing the incident direction while keeping the Ir-192 source-to-dosimeter distance constant, resulting in a fluctuation of <5%. Optical fiber bending was also addressed; no bending effect was observed for our wavelength-separated SOF dosimeter. PMID:25618136
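The compensation idea can be sketched as below; the channel roles, counts, and ratio constant are hypothetical illustrations, not the paper's calibration values. If the Cerenkov light keeps a fixed count ratio between the two dichroic-separated channels while the scintillation light falls predominantly in one of them, the Cerenkov contribution can be subtracted channel-wise.

```python
# Hedged sketch of chromatic Cerenkov removal with two counting channels.
# channel_a: scintillation + Cerenkov counts; channel_b: mostly Cerenkov;
# cerenkov_ratio_k: assumed fixed Cerenkov ratio between the two channels,
# determined in a separate calibration without scintillation light.

def scintillation_counts(channel_a, channel_b, cerenkov_ratio_k):
    return channel_a - cerenkov_ratio_k * channel_b

# Example with assumed counts: 5000 total in channel A, 800 in B, k = 1.5.
print(scintillation_counts(5000.0, 800.0, 1.5))  # 3800.0
```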
Blind source separation in retinal videos
NASA Astrophysics Data System (ADS)
Barriga, Eduardo S.; Truitt, Paul W.; Pattichis, Marios S.; Tüso, Dan; Kwon, Young H.; Kardon, Randy H.; Soliz, Peter
2003-05-01
An optical imaging device of retina function (OID-RF) has been developed to measure changes in blood oxygen saturation due to neural activity resulting from visual stimulation of the photoreceptors in the human retina. The video data that are collected represent a mixture of the functional signal in response to the retinal activation and other signals from undetermined physiological activity. Measured changes in reflectance in response to the visual stimulus are on the order of 0.1% to 1.0% of the total reflected intensity level which makes the functional signal difficult to detect by standard methods since it is masked by the other signals that are present. In this paper, we apply principal component analysis (PCA), blind source separation (BSS), using Extended Spatial Decorrelation (ESD) and independent component analysis (ICA) using the Fast-ICA algorithm to extract the functional signal from the retinal videos. The results revealed that the functional signal in a stimulated retina can be detected through the application of some of these techniques.
Long, Zhiying; Chen, Kewei; Wu, Xia; Reiman, Eric; Peng, Danling; Yao, Li
2009-02-01
Spatial independent component analysis (sICA) has been widely used to analyze functional magnetic resonance imaging (fMRI) data. The well-accepted implicit assumption is the spatial statistical independence of the intrinsic sources identified by sICA, which makes sICA difficult to apply to data containing interdependent sources and confounding factors. Such interdependency can arise, for instance, from fMRI studies investigating two tasks in a single session. In this study, we introduced a linear projection approach and considered its utilization as a tool to separate task-related components from two-task fMRI data. The robustness and feasibility of the method are substantiated through simulations on computer-generated data and real resting-state fMRI data. Both simulated and real two-task fMRI experiments demonstrated that sICA in combination with the projection method succeeded in separating spatially dependent components and had better detection power than a purely model-based method when estimating the activation induced by each task as well as by both tasks.
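The core of a linear projection step can be illustrated with a Gram-Schmidt-style orthogonalization (an illustrative sketch; the paper's exact projection construction may differ): to reduce the dependence between two spatial components, one component is projected onto the orthogonal complement of the other.

```python
# Illustrative sketch: remove from vector v its projection onto vector u,
# leaving a component orthogonal to (i.e. linearly independent of) u.

def project_out(v, u):
    dot_uv = sum(a * b for a, b in zip(u, v))
    dot_uu = sum(a * a for a in u)
    return [b - (dot_uv / dot_uu) * a for a, b in zip(u, v)]

v = [2.0, 1.0, 0.0]   # hypothetical spatial map of task 1
u = [1.0, 0.0, 0.0]   # hypothetical spatial map of task 2
w = project_out(v, u)
print(w)                                 # [0.0, 1.0, 0.0]
print(sum(a * b for a, b in zip(w, u)))  # 0.0 (now orthogonal to u)
```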
Use of UV Sources for Detection and Identification of Explosives
NASA Technical Reports Server (NTRS)
Hug, William; Reid, Ray; Bhartia, Rohit; Lane, Arthur
2009-01-01
Measurement of Raman and native fluorescence emission using ultraviolet (UV) sources (<400 nm) on targeted materials is suitable for both sensitive detection and accurate identification of explosive materials. When the UV emission data are analyzed using a combination of Principal Component Analysis (PCA) and cluster analysis, chemical and biological samples can be differentiated based on the geometric arrangement of molecules, the number of repeating aromatic rings, associated functional groups (nitrogen, sulfur, hydroxyl, and methyl), microbial life cycles (spores vs. vegetative cells), and the number of conjugated bonds. Explosive materials can be separated from one another as well as from a range of possible background materials, including microbes, car doors, motor oil, and fingerprints on car doors. Many explosives are composed of atomic constituents similar to those found in potential background samples such as fingerprint oils/skin, motor oil, and soil. This technique is sensitive to the chemical bonds between the elements, which leads to the discriminating separability between backgrounds and explosive materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGrath, Christopher A.
2015-04-01
The presence of radioactive xenon isotopes indicates that fission events have occurred, and is used to help enforce the Comprehensive Test Ban Treaty. Idaho National Laboratory (INL) produces 135Xe, 133mXe, 133Xe, and 131mXe standards used for the calibration and testing of collection equipment and analytical techniques used to monitor radioxenon emissions. At INL, xenon is produced and collected as one of several spontaneous fission products from a 252Cf source. Further chromatographic purification of the fission gases ensures the separation of the xenon fraction for selective collection. An explanation of the fission gas collection, separation and purification is presented. Additionally, the range of 135Xe to 133Xe ratios that can be isolated is explained. This is an operational update on the work introduced previously, now that the system is in operation and has been recharged with a second 252Cf source.
New approaches for metabolomics by mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vertes, Akos
Small molecules constitute a large part of the world around us, including fossil and some renewable energy sources. Solar energy harvested by plants and bacteria is converted into energy-rich small molecules on a massive scale. Some of the worst contaminants of the environment and compounds of interest for national security also fall in the category of small molecules. The development of large-scale metabolomic analysis methods lags behind the state of the art established for genomics and proteomics. This is commonly attributed to the diversity of molecular classes included in a metabolome. Unlike nucleic acids and proteins, metabolites do not have standard building blocks, and, as a result, their molecular properties exhibit a wide spectrum. This impedes the development of dedicated separation and spectroscopic methods. Mass spectrometry (MS) is a strong contender in the quest for a quantitative analytical tool with extensive metabolite coverage. Although various MS-based techniques are emerging for metabolomics, many of these approaches include extensive sample preparation that makes large-scale studies resource intensive and slow. New ionization methods are redefining the range of analytical problems that can be solved using MS. This project developed new approaches for the direct analysis of small molecules in unprocessed samples, as well as pushed the limits of ultratrace analysis in volume-limited complex samples. The project resulted in techniques that enabled metabolomics investigations with enhanced molecular coverage, as well as the study of cellular response to stimuli on a single-cell level. Effectively, individual cells became reaction vessels, where we followed the response of a complex biological system to external perturbation. We established two new analytical platforms for the direct study of metabolic changes in cells and tissues following external perturbation.
For this purpose we developed a novel technique, laser ablation electrospray ionization (LAESI), for metabolite profiling of functioning cells and tissues. The technique was based on microscopic sampling of biological specimens by mid-infrared laser ablation followed by electrospray ionization of the plume and MS analysis. The two main shortcomings of this technique had been limited specificity due to the lack of a separation step, and limited molecular coverage, especially for nonpolar chemical species. To improve specificity and the coverage of the metabolome, we implemented the LAESI ion source on a mass spectrometer with ion mobility separation (IMS). In this system, the gas phase ions produced by the LAESI source were first sorted according to their collisional cross sections in a mobility cell. These separated ion packets were then subjected to MS analysis. By combining the atmospheric pressure ionization with IMS, we improved the metabolite coverage. Further enhancement of the non-polar metabolite coverage resulted from the combination of laser ablation with vacuum UV irradiation of the ablation plume. Our results indicated that this new ionization modality provided improved detection for neutral and non-polar compounds. Based on rapid progress in photonics, we had introduced another novel ion source that utilized the interaction of a laser pulse with silicon nanopost arrays (NAPA). In these nanophotonic ion sources, the structural features were commensurate with the wavelength of the laser light. The enhanced interaction resulted in high ion yields. This ultrasensitive analytical platform enabled the MS analysis of single yeast cells. We extended these NAPA studies from yeast to other microorganisms, including green algae (Chlamydomonas reinhardtii) that captured energy from sunlight on a massive scale. 
Combining cellular perturbations, e.g., through environmental changes, with the newly developed single cell analysis methods enabled us to follow dynamic changes induced in the cells. In effect, we were able to use individual cells as a “laboratory,” and approached the long-standing goal of establishing a “lab-in-a-cell.” Model systems for these studies included cells of cyanobacteria (Anabaena), yeast (Saccharomyces cerevisiae), green algae (C. reinhardtii) and Arabidopsis thaliana.
Beam uniformity analysis of infrared laser illuminators
NASA Astrophysics Data System (ADS)
Allik, Toomas H.; Dixon, Roberta E.; Proffitt, R. Patrick; Fung, Susan; Ramboyong, Len; Soyka, Thomas J.
2015-02-01
Uniform near-infrared (NIR) and short-wave infrared (SWIR) illuminators are desired for low-ambient-light detection, recognition, and identification in military applications. Factors that contribute to laser illumination image degradation are high-frequency coherent laser speckle and low-frequency nonuniformities created by the laser or external laser cavity optics. Laser speckle analysis and beam uniformity improvements have been independently studied by numerous authors, but an analysis that separates these two effects from a single measurement technique has not been published. In this study, profiles of compact diode laser NIR and SWIR illuminators were measured and evaluated. Digital 12-bit images were recorded with a flat-field calibrated InGaAs camera, with measurements at F/1.4 and F/16. Separating beam uniformity components from laser speckle was approximated by filtering the original image. The goal of this paper is to identify and quantify the beam quality variation of illumination prototypes, draw awareness to its impact on range performance modeling, and develop measurement techniques and methodologies for military, industry, and vendors of active sources.
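The separation idea above, splitting low-frequency beam nonuniformity from high-frequency speckle by filtering the image, can be sketched as follows. This is a minimal illustration, not the authors' measurement pipeline: the Gaussian envelope, exponential speckle statistics, and filter width are all assumed values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_speckle(image, sigma=8.0):
    """Estimate the low-frequency beam envelope with a Gaussian
    low-pass filter; the residual carries the high-frequency speckle."""
    envelope = gaussian_filter(image, sigma)
    return envelope, image - envelope

def rms_contrast(img):
    """RMS contrast (std/mean), a common speckle/uniformity metric."""
    return img.std() / img.mean()

rng = np.random.default_rng(0)
y, x = np.mgrid[-1:1:256j, -1:1:256j]
beam = np.exp(-(x**2 + y**2) / 0.5)               # smooth beam envelope
noisy = beam * rng.exponential(1.0, beam.shape)   # multiplicative speckle
envelope, speckle = split_speckle(noisy)
```

Uniformity metrics can then be computed on `envelope` and speckle contrast on the residual, quantifying the two degradation mechanisms separately from one image.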
Investigation on improved Gabor order tracking technique and its applications
NASA Astrophysics Data System (ADS)
Pan, Min-Chun; Chiu, Chun-Ching
2006-08-01
The study proposes an improved Gabor order tracking (GOT) technique to cope with crossing-order/spectral components that cannot be effectively separated by using the original GOT scheme. The improvement aids both the reconstruction and interpretation of two crossing orders/spectra, such as a transmission-element-related order and a structural resonance. The dual function of the Gabor elementary function can affect the precision of tracked orders; in the paper, its influence on the computed Gabor expansion coefficients is investigated. To apply the improved scheme in practice, the separation and extraction of close-order components of vibration signals measured from a transmission-element test bench are illustrated using both the GOT and Vold-Kalman filtering OT methods, and comparisons between these two schemes are summarized from the processing results. A further experimental study demonstrates the ranking of noise components from a riding electric scooter; the dominant noise sources singled out can then inform subsequent design-remodeling tasks.
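The core order-extraction step can be illustrated with a plain STFT band mask. This is a simplified stand-in for the Gabor expansion/reconstruction machinery of the paper, with an assumed 50 Hz "order" and a 200 Hz "resonance" chosen arbitrarily:

```python
import numpy as np
from scipy.signal import stft, istft

fs = 1000
t = np.arange(0, 2, 1 / fs)
order = np.sin(2 * np.pi * 50 * t)             # "order" to be tracked
resonance = 0.5 * np.sin(2 * np.pi * 200 * t)  # fixed structural resonance
x = order + resonance

# Time-frequency mask: keep only a band around the order frequency
f, _, Z = stft(x, fs=fs, nperseg=256)
mask = np.abs(f[:, None] - 50.0) < 20.0
_, order_hat = istft(Z * mask, fs=fs, nperseg=256)
n = min(len(order_hat), len(order))
```

A true GOT implementation would instead select Gabor expansion coefficients along the (possibly time-varying) order track before synthesis, which is what lets it follow orders whose frequency changes with shaft speed.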
Hongtao, Li; Shichao, Chen; Yanjun, Han; Yi, Luo
2013-01-14
A feedback method combined with a fitting technique based on variable separation mapping is proposed to design freeform optical systems for an extended LED source with prescribed illumination patterns, especially with uniform illuminance distribution. The feedback process performs well with extended sources, while the fitting technique contributes not only to a decrease in the number of sub-surfaces in discontinuous freeform lenses, which may cause losses in manufacture, but also to a reduction in the number of feedback iterations. It is shown that light control efficiency can be improved by 5%, while a high uniformity of 82% is maintained, with only two feedback iterations and one fitting operation. Furthermore, the polar angle θ and azimuthal angle φ are used to specify the light direction from the light source, and the (θ,φ)-(x,y) based mapping and feedback strategy ensures that even if a few discontinuous sections along the equi-φ plane exist in the system, they are perpendicular to the base plane, making the surfaces eligible for manufacture by injection molding.
HEKATE-A novel grazing incidence neutron scattering concept for the European Spallation Source.
Glavic, Artur; Stahn, Jochen
2018-03-01
Structure and magnetism at surfaces and buried interfaces on the nanoscale can only be accessed by few techniques, one of which is grazing incidence neutron scattering. While the technique has its strongest limitation in a low signal and large background, due to the low scattering probability and need for high resolution, it can be expected that the high intensity of the European Spallation Source in Lund, Sweden, will make many more such studies possible, warranting a dedicated beamline for this technique. We present an instrument concept, Highly Extended K range And Tunable Experiment (HEKATE), for surface scattering that combines the advantages of two Selene neutron guides with unique capabilities of spatially separated distinct wavelength frames. With this combination, it is not only possible to measure large specular reflectometry ranges, even on free liquid surfaces, but also to use two independent incident beams with tunable sizes and resolutions that can be optimized for the specifics of the investigated samples. Further the instrument guide geometry is tuned for reduction of high energy particle background and only uses low to moderate supermirror coatings for high reliability and affordable cost.
NASA Astrophysics Data System (ADS)
Capuano, Paolo; De Lauro, Enza; De Martino, Salvatore; Falanga, Mariarosaria; Petrosino, Simona
2015-04-01
One of the main challenges in the volcano-seismological literature is to locate and characterize the source of volcano-tectonic seismic activity. This requires identifying at least the onsets of the main phases, i.e. the body waves. Many efforts have been made to solve the problem of a clear separation of P and S phases, both from a theoretical point of view and by developing numerical algorithms suitable for specific cases (see, e.g., Küperkoch et al., 2012). Recently, a robust automatic procedure has been implemented for extracting the prominent seismic waveforms from continuously recorded signals, thus allowing the main phases to be picked. The intuitive notion of maximum non-gaussianity is achieved by adopting techniques that involve higher-order statistics in the frequency domain, i.e., Convolutive Independent Component Analysis (CICA). This technique is successful in the case of blind source separation of convolutive mixtures. In the seismological framework, seismic signals are indeed modeled as the convolution of a source function with the path, site, and instrument responses. In addition, time-delayed versions of the same source exist, due to multipath propagation typically caused by reverberations from some obstacle. In this work, we focus on the Volcano-Tectonic (VT) activity at Campi Flegrei Caldera (Italy) during the 2006 ground uplift (Ciaramella et al., 2011). The activity was characterized by approximately 300 low-magnitude VT earthquakes (Md < 2; for the definition of duration magnitude, see Petrosino et al., 2008). Most of them were concentrated in distinct seismic sequences with hypocenters mainly clustered beneath the Solfatara-Accademia area, at depths ranging between 1 and 4 km b.s.l. The obtained results show the clear separation of P and S phases: the technique not only allows the identification of the S-P time delay, giving the timing of both phases, but also provides the independent waveforms of the P and S phases.
This is an enormous advantage for all the problems related to source inversion and location. In addition, the VT seismicity was accompanied by hundreds of LP events (characterized by spectral peaks in the 0.5-2 Hz frequency band) that were concentrated in a 7-day interval. The main interest is to establish whether the occurrence of LPs is limited to the swarm that reached a climax on days 26-28 October, as indicated by Saccorotti et al. (2007), or extends over a longer period. The automatically extracted waveforms with improved signal-to-noise ratio via CICA, coupled with automatic phase picking, allowed us to compile a more complete seismic catalog and to better quantify the seismic energy release, including the presence of LP events from the beginning of October until mid-November. Finally, a further check of the volcanic nature of the extracted signals is achieved by looking at the seismological properties and the entropy content of the traces (Falanga and Petrosino, 2012; De Lauro et al., 2012). Our results allow us to move towards a full description of the complexity of the source, which can be used for hazard-model development and forecast-model testing, showing an illustrative example of the applicability of the CICA method to regions with low seismicity in high ambient noise.
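The blind source separation step can be illustrated with instantaneous ICA (here scikit-learn's FastICA), a simpler relative of the convolutive, frequency-domain CICA used in the paper; the sources and mixing matrix below are arbitrary assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 4000)
s1 = np.sign(np.sin(3 * t))      # impulsive, strongly non-Gaussian source
s2 = np.sin(7 * t)               # oscillatory source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6],        # unknown mixing (sensor responses)
              [0.4, 1.0]])
X = S @ A.T                      # two-sensor recordings

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)
# Recovered components match the sources up to order, sign and scale
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
```

CICA extends this idea to convolutive mixtures by performing the separation per frequency bin, which is what makes it applicable to seismic signals filtered by path, site, and instrument responses.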
Thibodeau, C; Monette, F; Glaus, M; Laflamme, C B
2011-01-01
The black water and grey water source-separation sanitation system aims at efficient use of energy (biogas), water and nutrients, but currently lacks evidence of economic viability to be considered a credible alternative to the conventional system. This study intends to demonstrate economic viability, identify the main cost contributors and assess critical influencing factors. A technico-economic model was built based on a new neighbourhood in a Canadian context. Three implementation scales of the source-separation system are defined: 500, 5,000 and 50,000 inhabitants. The results show that the source-separation system is 33% to 118% more costly than the conventional system, with the larger cost differential occurring at the smaller implementation scales. A sensitivity analysis demonstrates that reducing the vacuum toilet flow from 1.0 to 0.25 L/flush decreases the source-separation system cost by 23 to 27%. It also shows that high resource costs can be beneficial or unfavourable to the source-separation system depending on whether the vacuum toilet flow is low or normal. Therefore, the future of this configuration of the source-separation system lies mainly in vacuum toilet flow reduction or the introduction of new efficient effluent volume reduction processes (e.g. reverse osmosis).
Localization from near-source quasi-static electromagnetic fields
NASA Astrophysics Data System (ADS)
Mosher, J. C.
1993-09-01
A wide range of research has been published on the problem of estimating the parameters of electromagnetic and acoustical sources from measurements of signals measured at an array of sensors. In the quasi-static electromagnetic cases examined here, the signal variation from a point source is relatively slow with respect to the signal propagation and the spacing of the array of sensors. As such, the location of the point sources can only be determined from the spatial diversity of the received signal across the array. The inverse source localization problem is complicated by unknown model order and strong local minima. The nonlinear optimization problem is posed for solving for the parameters of the quasi-static source model. The transient nature of the sources can be exploited to allow subspace approaches to separate out the signal portion of the spatial correlation matrix. Decomposition techniques are examined for improved processing, and an adaptation of MUltiple SIgnal Characterization (MUSIC) is presented for solving the source localization problem. Recent results on calculating the Cramer-Rao error lower bounds are extended to the multidimensional problem here. This thesis focuses on the problem of source localization in magnetoencephalography (MEG), with a secondary application to thunderstorm source localization. Comparisons are also made between MEG and its electrical equivalent, electroencephalography (EEG). The error lower bounds are examined in detail for several MEG and EEG configurations, as well as localizing thunderstorm cells over Cape Canaveral and Kennedy Space Center. Time-eigenspectrum is introduced as a parsing technique for improving the performance of the optimization problem.
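The subspace idea described above can be conveyed with a textbook narrowband MUSIC sketch for a uniform linear array (not the MEG/EEG adaptation of the thesis); the array geometry, source directions, and noise level are arbitrary assumptions:

```python
import numpy as np

def music_spectrum(R, n_sources, angles, d=0.5):
    """1-D MUSIC pseudo-spectrum for a uniform linear array.
    R: sensor covariance; d: element spacing in wavelengths."""
    m = R.shape[0]
    _, vecs = np.linalg.eigh(R)            # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]          # noise-subspace eigenvectors
    k = np.arange(m)
    steer = np.exp(-2j * np.pi * d * np.outer(k, np.sin(angles)))
    # Peaks occur where steering vectors are orthogonal to the noise subspace
    return 1.0 / np.linalg.norm(En.conj().T @ steer, axis=0) ** 2

rng = np.random.default_rng(2)
m, n = 8, 2000
theta = np.deg2rad([-20.0, 30.0])                       # true directions
A = np.exp(-2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(theta)))
S = rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))
N = 0.1 * (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
X = A @ S + N
R = X @ X.conj().T / n                                  # sample covariance

grid = np.deg2rad(np.linspace(-90, 90, 721))
P = music_spectrum(R, 2, grid)
```

The dipole-localization problem in the thesis replaces the steering vectors with quasi-static lead-field models, but the eigendecomposition of the spatial correlation matrix into signal and noise subspaces is the same.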
Hydrograph separation techniques in snowmelt-dominated watersheds
NASA Astrophysics Data System (ADS)
Miller, S.; Miller, S. N.
2017-12-01
This study integrates hydrological, geochemical, and isotopic data for a better understanding of different streamflow generation pathways and residence times in a snowmelt-dominated region. A nested watershed design with ten stream gauging sites recording sub-hourly stream stage has been deployed in a snowmelt-dominated region in southeastern Wyoming, heavily impacted by the recent bark beetle epidemic. LiDAR-derived digital elevation models help elucidate effects from topography and watershed metrics. At each stream gauging site, sub-hourly stream water conductivity and temperature data are also recorded. Hydrograph separation is a useful technique for determining different sources of runoff and how volumes from each source vary over time. Following previous methods, diurnal cycles from sub-hourly recorded streamflow and specific conductance data are analyzed and used to separate hydrographs into overland flow and baseflow components, respectively. A final component, vadose-zone flow, is assumed to be the remaining water from the total hydrograph. With access to snowmelt and precipitation data from nearby instruments, runoff coefficients are calculated for the different mechanisms, providing information on watershed response. Catchments are compared to understand how different watershed characteristics translate snowmelt or precipitation events into runoff. Portable autosamplers were deployed at two of the gauging sites for high-frequency analysis of stream water isotopic composition during peak flow to compare methods of hydrograph separation. Sampling rates of one or two hours can detect the diurnal streamflow cycle common during peak snowmelt. Prior research suggests the bark beetle epidemic has had little effect on annual streamflow patterns; however, several results show an earlier shift in the day of year in which peak annual streamflow is observed. 
The diurnal cycle is likely to comprise a larger percentage of daily streamflow during snowmelt in post-epidemic forests, as more solar radiation is available to penetrate to the ground surface and induce snowmelt, contributing to the effect of an earlier observed peak annual streamflow.
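The conductance-based two-component separation mentioned above can be sketched with a simple mass balance; the end-member conductivities and the synthetic diurnal hydrograph below are assumed values, for illustration only:

```python
import numpy as np

def ec_mass_balance(q, ec, ec_runoff, ec_baseflow):
    """Two-component hydrograph separation from specific conductance:
    dilute event water (low EC) vs. groundwater baseflow (high EC)."""
    frac = (ec - ec_runoff) / (ec_baseflow - ec_runoff)
    return np.clip(frac, 0.0, 1.0) * q            # baseflow discharge

# Synthetic diurnal snowmelt signal
t = np.linspace(0.0, 3.0, 72)                          # 3 days, hourly
q = 2.0 + np.maximum(0.0, np.sin(2 * np.pi * t))       # streamflow, m3/s
ec = 300.0 - 60.0 * (q - q.min()) / (q.max() - q.min())  # melt dilutes EC
qb = ec_mass_balance(q, ec, ec_runoff=50.0, ec_baseflow=300.0)
```

The overland-flow component follows as `q - qb`; in the study design, the remaining vadose-zone component requires the additional diurnal-cycle analysis described in the abstract.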
Advanced techniques and technology for efficient data storage, access, and transfer
NASA Technical Reports Server (NTRS)
Rice, Robert F.; Miller, Warner
1991-01-01
Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
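The Rice coding family underlying this adaptive coder can be sketched as a plain Golomb-Rice code; this is a minimal textbook version, not the exact chip algorithm, and it assumes the preprocessor has already mapped prediction residuals to non-negative integers (e.g. by zigzag mapping):

```python
def rice_encode(values, k):
    """Golomb-Rice code: unary quotient, '0' separator, k-bit remainder.
    Assumes non-negative integers (e.g. zigzag-mapped residuals)."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits += [1] * q + [0]                        # unary part
        bits += [(r >> i) & 1 for i in range(k - 1, -1, -1)]
    return bits

def rice_decode(bits, k, count):
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bits[i] == 1:                          # read unary quotient
            q += 1
            i += 1
        i += 1                                       # skip separator '0'
        r = 0
        for _ in range(k):                           # read k remainder bits
            r = (r << 1) | bits[i]
            i += 1
        out.append((q << k) | r)
    return out

residuals = [0, 3, 7, 12, 5, 1]
code = rice_encode(residuals, k=2)
```

The "adaptive" part of the NASA scheme chooses the best code option (effectively the parameter k) per block of samples based on local statistics, which is what keeps the efficiency near optimum as the data vary.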
Identification of sources of aerosol particles in three locations in eastern Botswana
NASA Astrophysics Data System (ADS)
Chimidza, S.; Moloi, K.
2000-07-01
Airborne particles were collected using a dichotomous virtual impactor at three different locations in the eastern part of Botswana: Serowe, Selibe-Phikwe, and Francistown. The particles were separated into two fractions (fine and coarse). Sampling at the three locations was done consecutively during the months of July and August, which are usually dry and stable. The sampling time for each sample was 12 hours during the day. For elemental composition, an energy-dispersive X-ray fluorescence technique was used. Correlations and principal component analysis with varimax rotation were used to identify major sources of aerosol particles. At all three locations, soil was found to be the main source of aerosol particles. The copper-nickel mine and smelter at Selibe-Phikwe was found to be not only a source of copper and nickel particles in Selibe-Phikwe but also a source of these particles in locations as distant as Serowe. In Selibe-Phikwe and Francistown, car exhaust was found to be the major source of fine lead and bromine particles.
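The varimax rotation used in this kind of source apportionment can be sketched with a standard SVD-based implementation (a generic textbook algorithm, not the authors' specific software); the loading matrix below is random, standing in for element-by-component PCA loadings:

```python
import numpy as np

def varimax(L, tol=1e-6, max_iter=100):
    """Orthogonal varimax rotation of a loading matrix L (p x k),
    via the standard SVD-based ascent on the varimax criterion."""
    p, k = L.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p))
        R = u @ vt                       # orthogonal rotation update
        if s.sum() < crit * (1.0 + tol):
            break
        crit = s.sum()
    return L @ R

rng = np.random.default_rng(5)
loadings = rng.standard_normal((10, 3))  # stand-in element loadings
rotated = varimax(loadings)
```

Because the rotation is orthogonal, each element's communality is unchanged; the rotation only redistributes loadings so that each component is dominated by a few elements, making sources such as soil or smelter emissions easier to name.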
Common source-multiple load vs. separate source-individual load photovoltaic system
NASA Technical Reports Server (NTRS)
Appelbaum, Joseph
1989-01-01
A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.
Layout-aware simulation of soft errors in sub-100 nm integrated circuits
NASA Astrophysics Data System (ADS)
Balbekov, A.; Gorbunov, M.; Bobkov, S.
2016-12-01
A Single Event Transient (SET) caused by a charged particle traveling through the sensitive volume of an integrated circuit (IC) may in some cases lead to errors in digital circuits. In technologies below 180 nm, a single particle can affect multiple devices, causing multiple SETs. This adds complexity to fault-tolerant design, because schematic-level design techniques become useless without consideration of the layout. The most common layout mitigation technique is spatial separation of the sensitive nodes of hardened circuits. Spatial separation decreases circuit performance and increases power consumption. Spacing should thus be reasonable, and its scaling follows the device dimensions' scaling trend. This paper presents the development of an SET simulation approach comprising SPICE simulation with a "double exponential" current source as the SET model. The technique uses the layout in GDSII format to locate nearby devices that can be affected by a single particle and that can share the generated charge. The developed software tool automates multiple simulations and gathers the produced data to present it as a sensitivity map. Examples of conducted simulations of fault-tolerant cells and their sensitivity maps are presented in this paper.
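The "double exponential" current source is a standard SET model; a sketch with assumed time constants and deposited charge (illustrative values, not the paper's calibration) is:

```python
import numpy as np

def set_current(t, q, tau_rise=5e-12, tau_fall=200e-12):
    """Double-exponential SET current pulse delivering total charge q:
    I(t) = q/(tau_fall - tau_rise) * (exp(-t/tau_fall) - exp(-t/tau_rise)),
    for t >= 0; zero before the strike."""
    i0 = q / (tau_fall - tau_rise)
    return np.where(
        t >= 0.0,
        i0 * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise)),
        0.0)

t = np.linspace(0.0, 2e-9, 20001)
i = set_current(t, q=100e-15)                # 100 fC deposited charge
# Trapezoid-rule integral: the pulse should deliver the full charge
collected = np.sum(0.5 * (i[1:] + i[:-1]) * np.diff(t))
```

The normalization by `tau_fall - tau_rise` guarantees the pulse integrates to the deposited charge, which is what a SPICE current source attached to the struck node needs to reproduce.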
NASA Technical Reports Server (NTRS)
Worrall, Diana M.
1994-01-01
This report summarizes the activities related to two ROSAT investigations: (1) x-ray properties of radio galaxies thought to contain BL Lac type nuclei; and (2) x-ray spectra of a complete sample of flat-spectrum radio sources. The following papers describing the research are provided as attachments: Multiple X-ray Emission Components in Low Power Radio Galaxies; New X-ray Results on Radio Galaxies; Analysis Techniques for a Multiwavelength Study of Radio Galaxies; Separation of X-ray Emission Components in Radio Galaxies; X-ray Emission in Powerful Radio Galaxies and Quasars; Extended and Compact X-ray Emission in Powerful Radio Galaxies; and X-ray Spectra of a Complete Sample of Extragalactic Core-dominated Radio Sources.
NASA Technical Reports Server (NTRS)
Schmidt, Gordon S.; Mueller, Thomas J.
1987-01-01
The use of flow visualization to study separation bubbles is evaluated. The wind tunnel, two NACA 66(3)-018 airfoil models, and kerosene vapor, titanium tetrachloride, and surface flow visualizations techniques are described. The application of the three visualization techniques to the two airfoil models reveals that the smoke and vapor techniques provide data on the location of laminar separation and the onset of transition, and the surface method produces information about the location of turbulent boundary layer separation. The data obtained with the three flow visualization techniques are compared to pressure distribution data and good correlation is detected. It is noted that flow visualization is an effective technique for examining separation bubbles.
Isolating Discriminant Neural Activity in the Presence of Eye Movements and Concurrent Task Demands
Touryan, Jon; Lawhern, Vernon J.; Connolly, Patrick M.; Bigdely-Shamlo, Nima; Ries, Anthony J.
2017-01-01
A growing number of studies use the combination of eye-tracking and electroencephalographic (EEG) measures to explore the neural processes that underlie visual perception. In these studies, fixation-related potentials (FRPs) are commonly used to quantify early and late stages of visual processing that follow the onset of each fixation. However, FRPs reflect a mixture of bottom-up (sensory-driven) and top-down (goal-directed) processes, in addition to eye movement artifacts and unrelated neural activity. At present there is little consensus on how to separate this evoked response into its constituent elements. In this study we sought to isolate the neural sources of target detection in the presence of eye movements and over a range of concurrent task demands. Here, participants were asked to identify visual targets (Ts) amongst a grid of distractor stimuli (Ls), while simultaneously performing an auditory N-back task. To identify the discriminant activity, we used independent components analysis (ICA) for the separation of EEG into neural and non-neural sources. We then further separated the neural sources, using a modified measure-projection approach, into six regions of interest (ROIs): occipital, fusiform, temporal, parietal, cingulate, and frontal cortices. Using activity from these ROIs, we discriminated target from non-target fixations in all participants at a level similar to other state-of-the-art classification techniques. Importantly, we isolated the time course and spectral features of this discriminant activity in each ROI. In addition, we were able to quantify the effect of cognitive load on both fixation-locked potential and classification performance across regions. Together, our results show the utility of a measure-projection approach for separating task-relevant neural activity into meaningful ROIs within more complex contexts that include eye movements. PMID:28736519
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2017-12-01
Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. We describe the development of an Informed Non-Negative Matrix Factorization (INMF) spectral unmixing method to exploit this spectral information and separate atmospheric and surface signals based on their physical sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO), with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric and surface conditions. These include atmospheres with varying aerosol optical thicknesses and cloud cover. HICO images also provide a range of surface conditions including deep ocean regions, with only minor contributions from the ocean surfaces; and more complex shallow coastal regions with contributions from the seafloor or suspended sediments. We provide extensive comparison of INMF decomposition results against independent measurements of physical properties. 
These include comparison against traditional model-based retrievals of water-leaving, aerosol, and molecular scattering radiances and other satellite products, such as aerosol optical thickness from the Moderate Resolution Imaging Spectroradiometer (MODIS).
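The non-negative matrix factorization at the heart of INMF can be sketched with scikit-learn's plain NMF on synthetic two-source spectra, omitting the paper's spectral/spatial constraints and library-spectra initialization; the endmember shapes and abundances below are assumed for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
wl = np.linspace(0.0, 1.0, 120)                  # normalized wavelength axis
surface = np.exp(-(wl - 0.7) ** 2 / 0.01)        # surface-like spectrum
atmosphere = np.exp(-4.0 * wl)                   # scattering-like spectrum
W_true = np.c_[surface, atmosphere]              # 120 channels x 2 sources
H_true = rng.random((2, 300))                    # abundances for 300 pixels
V = (W_true @ H_true).T                          # pixels x channels, non-negative

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
H = model.fit_transform(V)                       # per-pixel abundances
W = model.components_                            # recovered source spectra
rel_err = np.linalg.norm(V - H @ W) / np.linalg.norm(V)
```

Non-negativity is the key property here: both the recovered spectra and abundances stay physically meaningful, which is the advantage the abstract cites over unconstrained decompositions.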
NASA Astrophysics Data System (ADS)
Gao, Shuang; Yang, Wen; Zhang, Hui; Sun, Yanling; Mao, Jian; Ma, Zhenxing; Cong, Zhiyuan; Zhang, Xian; Tian, Shasha; Azzi, Merched; Chen, Li; Bai, Zhipeng
2018-02-01
The determination of the background concentration of PM2.5 is important to understand the contribution of local emission sources to the total PM2.5 concentration. The purpose of this study was to examine the performance of baseline separation techniques for estimating the PM2.5 background concentration. Five separation methods, which included recursive digital filters (Lyne-Hollick, one-parameter algorithm, and Boughton two-parameter algorithm), sliding interval and smoothed minima, were applied to one-year PM2.5 time-series data in two heavily polluted cities, Tianjin and Jinan. To obtain the proper filter parameters and recession constants for the separation techniques, we conducted regression analysis at a background site during the emission reduction period enforced by the Government for the 2014 Asia-Pacific Economic Cooperation (APEC) meeting in Beijing. Background concentrations in Tianjin and Jinan were then estimated by applying the determined filter parameters and recession constants. The chemical mass balance (CMB) model was also applied to ascertain the effectiveness of the new approach. Our results showed that the contribution of the background PM concentration to ambient pollution was comparable to that obtained in a previous study. The best performance was achieved using the Boughton two-parameter algorithm. The background concentrations were estimated at (27 ± 2) μg/m3 for the whole year, (34 ± 4) μg/m3 for the heating period (winter), (21 ± 2) μg/m3 for the non-heating period (summer), and (25 ± 2) μg/m3 for the sandstorm period in Tianjin. The corresponding values in Jinan were (30 ± 3) μg/m3, (40 ± 4) μg/m3, (24 ± 5) μg/m3, and (26 ± 2) μg/m3, respectively. The study revealed that these baseline separation techniques are valid for estimating levels of PM2.5 air pollution, and that our proposed method has great potential for estimating the background level of other air pollutants.
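A single forward pass of the Lyne-Hollick recursive digital filter, one of the five techniques compared, can be sketched as follows; the synthetic PM2.5 series and the filter parameter are assumed values (hydrology practice also uses multiple forward/backward passes, omitted here):

```python
import numpy as np

def lyne_hollick_baseline(y, a=0.925):
    """Single forward pass of the Lyne-Hollick recursive digital filter.
    The high-frequency 'quickflow' component is constrained to
    0 <= quick <= y; the remainder y - quick is the slow baseline."""
    y = np.asarray(y, dtype=float)
    quick = np.zeros_like(y)
    for k in range(1, len(y)):
        q = a * quick[k - 1] + 0.5 * (1.0 + a) * (y[k] - y[k - 1])
        quick[k] = min(max(q, 0.0), y[k])
    return y - quick

t = np.arange(500)
background = 25.0 + 5.0 * np.sin(2 * np.pi * t / 365.0)  # slow seasonal level
spikes = 40.0 * np.maximum(0.0, np.sin(2 * np.pi * t / 7.0)) ** 8  # episodes
pm25 = background + spikes
baseline = lyne_hollick_baseline(pm25)
```

The analogy the study exploits is direct: pollution episodes play the role of stormflow peaks, and the filtered baseline plays the role of baseflow, i.e. the background concentration.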
A Comparative Study on Fetal Heart Rates Estimated from Fetal Phonography and Cardiotocography
Ibrahim, Emad A.; Al Awar, Shamsa; Balayah, Zuhur H.; Hadjileontiadis, Leontios J.; Khandoker, Ahsan H.
2017-01-01
The aim of this study is to investigate whether fetal heart rates (fHR) extracted from fetal phonocardiography (fPCG) convey information similar to fHR obtained from cardiotocography (CTG). Four-channel fPCG sensors made of low-cost (<$1) ceramic piezo vibration sensors within 3D-printed casings were used to collect abdominal phonogram signals from 20 pregnant mothers (>34 weeks of gestation). A novel multi-lag covariance matrix-based eigenvalue decomposition technique was used to separate maternal breathing, fetal heart sounds (fHS) and maternal heart sounds (mHS) from the abdominal phonogram signals. Prior to the fHR estimation, the fPCG signals were denoised using a multi-resolution wavelet-based filter. The proposed source separation technique was first tested on synthetically mixed signals and then on raw abdominal phonogram signals. fHR signals extracted from fPCG signals were validated using simultaneously recorded CTG-based fHR recordings. The experimental results have shown that the fHR derived from the acquired fPCG can be used to detect periods of acceleration and deceleration, which are critical indications of the fetus's well-being. Moreover, a comparative analysis demonstrated that fHRs from CTG and fPCG signals were in good agreement (Bland-Altman plot: mean = −0.21 BPM, ±2 SD = ±3 BPM) with statistical significance (p < 0.001 and Spearman correlation coefficient ρ = 0.95). The study findings show that fHR estimated from fPCG could be a reliable substitute for fHR from the CTG, opening up the possibility of a low-cost monitoring tool for fetal well-being. PMID:29089896
Hydrogen as an atomic beam standard
NASA Technical Reports Server (NTRS)
Peters, H. E.
1972-01-01
After a preliminary discussion of feasibility, new experimental work with a hydrogen beam is described. A space focused magnetic resonance technique with separated oscillatory fields is used with a monochromatic beam of cold hydrogen atoms which are selected from a higher temperature source. The first resonance curves and other experimental results are presented. These results are interpreted from the point of view of accuracy potential and frequency stability, and are compared with hydrogen maser and cesium beam capabilities.
A CMB foreground study in WMAP data: Extragalactic point sources and zodiacal light emission
NASA Astrophysics Data System (ADS)
Chen, Xi
The Cosmic Microwave Background (CMB) radiation is the remnant heat from the Big Bang. It serves as a primary tool to understand the global properties, content and evolution of the universe. Since 2001, NASA's Wilkinson Microwave Anisotropy Probe (WMAP) satellite has been mapping the full-sky anisotropy with unprecedented accuracy, precision and reliability. The CMB angular power spectrum calculated from the WMAP full-sky maps not only enables accurate testing of cosmological models, but also places significant constraints on model parameters. The CMB signal in the WMAP sky maps is contaminated by microwave emission from the Milky Way and from extragalactic sources. Therefore, in order to use the maps reliably for cosmological studies, the foreground signals must be well understood and removed from the maps. This thesis focuses on the separation of two foreground contaminants from the WMAP maps: extragalactic point sources and zodiacal light emission. Extragalactic point sources constitute the most important foreground on small angular scales. Various methods have been applied to the WMAP single-frequency maps to extract sources. However, due to the limited angular resolution of WMAP, it is possible to confuse positive CMB excursions with point sources or miss sources that are embedded in negative CMB fluctuations. We present a novel CMB-free source-finding technique that utilizes the spectral difference between point sources and the CMB to form internal linear combinations of multifrequency maps, suppressing the CMB and better revealing sources. When applied to the WMAP 41, 61, and 94 GHz maps, this technique has not only detected sources previously cataloged by independent methods, but has also revealed new sources. Without the noise contribution from the CMB, the method's detection count grows rapidly with integration time.
The number of detections grows as t^0.72 in the two-band search and t^0.70 in the three-band search from one year to five years, compared with t^0.40 for the WMAP catalogs. Our source catalogs are a good supplement to the existing WMAP source catalogs, and the method itself is proven to be both complementary to and competitive with all current source-finding techniques in WMAP maps. Scattered light and thermal emission from the interplanetary dust (IPD) within our Solar System are major contributors to the diffuse sky brightness at most infrared wavelengths. For wavelengths longer than 3.5 μm, the thermal emission of the IPD dominates over scattering, and this emission is often referred to as the Zodiacal Light Emission (ZLE). To set a limit on the ZLE contribution to the WMAP data, we have performed a simultaneous fit of the yearly WMAP time-ordered data to the time variation of the ZLE predicted by the DIRBE IPD model (Kelsall et al. 1998) evaluated at 240 μm, plus ℓ = 1-4 CMB components. Although this fitting procedure successfully recovers the CMB dipole to 0.5% accuracy, it is not sensitive enough to determine the ZLE signal or the other multipole moments very accurately.
Nowak, Jeremy A; Weber, Robert J; Goldstein, Allen H
2018-03-12
The ability to structurally characterize and isomerically quantify crude oil hydrocarbons relevant to refined fuels such as motor oil, diesel, and gasoline represents an extreme challenge for chromatographic and mass spectrometric techniques. This work incorporates two-dimensional gas chromatography coupled to a tunable vacuum ultraviolet soft photoionization source, the Chemical Dynamics Beamline 9.0.2 of the Advanced Light Source at the Lawrence Berkeley National Laboratory, with a time-of-flight mass spectrometer (GC × GC-VUV-TOF) to directly characterize and isomerically sum the contributions of aromatic and aliphatic species to hydrocarbon classes of four crude oils. When the VUV beam is tuned to 10.5 ± 0.2 eV, both aromatic and aliphatic crude oil hydrocarbons are ionized to reveal the complete chemical abundance of C9-C30 hydrocarbons. When the VUV beam is tuned to 9.0 ± 0.2 eV only aromatic hydrocarbons are ionized, allowing separation of the aliphatic and aromatic fractions of the crude oil hydrocarbon chemical classes in an efficient manner while maintaining isomeric quantification. This technique provides an effective tool to determine the isomerically summed aromatic and aliphatic hydrocarbon compositions of crude oil, providing information that goes beyond typical GC × GC separations of the most dominant hydrocarbon isomers.
Evidence for Highly Inhomogeneous mm-Wave Sources During the Impulsive Flare of May 9, 1991
NASA Technical Reports Server (NTRS)
Hermann, R.; Magun, A.; Kaufmann, P.; Correia, E.; Costa, J. E. R.; Machado, M. E.; Fishman, G.
1997-01-01
In this paper, multiwavelength observations of an impulsive flare of May 9, 1991 are presented. This event was observed with the 48 GHz multibeam focal array used at the Itapetinga radio telescope, the microwave patrol telescopes at Bern, and the BATSE high-time-resolution hard X-ray spectrometer on board CGRO. While spatially unresolved low-sensitivity observations show two major impulsive peaks, the mm-wave observations, which allow spatially resolved tracking of the emission centroids, suggest a primarily bipolar source configuration. For the first time, two mm-wave sources with a spacing below the HPBW could be separated with the multibeam technique. The general features of the observations are explained as emission of partially trapped electrons. Furthermore, we present evidence for highly inhomogeneous substructures within one of the two mm-wave sources, for which the positional scatter of the emission center, within 2 s, is less than 2".
Informed Source Separation: A Bayesian Tutorial
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.
2005-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea of informed source separation, where the algorithm design incorporates relevant information about the specific problem. This approach promises to enable researchers to design their own high-quality algorithms that are specifically tailored to the problem at hand.
NASA Astrophysics Data System (ADS)
Adler, Ronald S.; Swanson, Scott D.; Yeung, Hong N.
1996-01-01
A projection-operator technique is applied to a general three-component model for magnetization transfer, extending our previous two-component model [R. S. Adler and H. N. Yeung, J. Magn. Reson. A 104, 321 (1993), and H. N. Yeung, R. S. Adler, and S. D. Swanson, J. Magn. Reson. A 106, 37 (1994)]. The PO technique provides an elegant means of deriving a simple, effective rate equation in which there is natural separation of relaxation and source terms and allows incorporation of Redfield-Provotorov theory without any additional assumptions or restrictive conditions. The PO technique is extended to incorporate more general, multicomponent models. The three-component model is used to fit experimental data from samples of human hyaline cartilage and fibrocartilage. The fits of the three-component model are compared to the fits of the two-component model.
Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions
NASA Astrophysics Data System (ADS)
Nottrott, A.; Tan, S. M.; He, Y.
2016-12-01
There is a global movement toward urbanization. Approximately 7% of the global population lives in just 28 megacities, occupying less than 0.1% of the total land area used by human activity worldwide. These cities contribute a significant fraction of the global budget of anthropogenic primary pollutants and greenhouse gases. The 27 largest cities consume 9.9%, 9.3%, 6.7% and 3.0% of global gasoline, electricity, energy and water use, respectively. This impact motivates novel approaches to quantify and mitigate the growing contribution of megacity emissions to global climate change. Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban-scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source-level dynamics, local measurements, and urban-scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is an open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided design (CAD) and GIS tools, OpenFOAM generates a detailed 3D representation of urban wind fields. OpenFOAM was applied to model methane (CH4) emissions from various components of the natural gas distribution system, to investigate the impact of urban meteorology on mobile CH4 measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings.
Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of the plume due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments.
Measures and Relative Motions of Some Mostly F. G. W. Struve Doubles
NASA Astrophysics Data System (ADS)
Wiley, E. O.
2012-04-01
Measures of 59 pairs of double stars with long observational histories, obtained using "lucky imaging" techniques, are reported. Relative motions of the 59 pairs are investigated using observation histories, scatter plots of relative motion, and ordinary least-squares (OLS) and total proper motion analyses performed in "R," an open source programming language. A scatter plot of the coefficients of determination derived from the OLS y|epoch and OLS x|epoch fits clearly separates common proper motion pairs from optical pairs and what are termed "long-period binary candidates." Differences in proper motion separate optical pairs from long-period binary candidates. An Appendix is provided that details how to use known rectilinear pairs as calibration pairs for the program REDUC.
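The OLS-versus-epoch screening described above can be sketched in a few lines. This is a generic coefficient-of-determination calculation, not the paper's R script; the idea is that an optical pair drifting steadily in one coordinate yields R² near 1, while a bound pair with no secular drift yields R² near 0.

```python
def ols_r2(epochs, values):
    """Ordinary least-squares fit of one relative-position coordinate
    against epoch; returns (slope, r_squared)."""
    n = len(epochs)
    mx = sum(epochs) / n
    my = sum(values) / n
    sxx = sum((t - mx) ** 2 for t in epochs)
    sxy = sum((t - mx) * (v - my) for t, v in zip(epochs, values))
    slope = sxy / sxx
    ss_res = sum((v - (my + slope * (t - mx))) ** 2
                 for t, v in zip(epochs, values))
    ss_tot = sum((v - my) ** 2 for v in values)
    r2 = 1.0 - ss_res / ss_tot if ss_tot else 0.0
    return slope, r2
```

Running this on both the x|epoch and y|epoch histories of a pair and plotting the two R² values reproduces the kind of scatter plot the abstract uses to separate the motion classes.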
Cannon shredding of municipal solid waste for the preparation of biological feedstock
NASA Astrophysics Data System (ADS)
Burke, J.
1981-04-01
Explosive decompression as a method of size reduction of materials found in municipal solid waste (MSW) was studied and preliminary data related to the handling and wet separation of exploded material was gathered. Steam was emphasized as the source of pressure. Municipal refuse was placed in an 8-ft long, 10.75-in. ID steel cannon which was sealed and pressurized. After an appropriate time, the cannon muzzle closure was opened and the test material expelled from the cannon through a constrictive orifice, resulting in explosive decompression. Flash evaporation of pressurized saturated water, expansion of steam, and the strong turbulence at the cannon muzzle accomplished size reduction. Hydraulic processing is shown to be an effective technique for separating heavy and light fractions.
NASA Astrophysics Data System (ADS)
Liu, X.; Beroza, G. C.; Nakata, N.
2017-12-01
Cross-correlation of fully diffuse wavefields provides the Green's function between receivers; however, the real-world ambient noise field contains both diffuse and non-diffuse components, and the non-diffuse field can degrade the correlation functions. We attempt to blindly separate the diffuse and non-diffuse components from cross-correlations of ambient seismic noise and analyze the potential bias caused by the non-diffuse components. We compute the 9-component noise cross-correlations for 17 stations in southern California. For the Rayleigh wave components, we assume that the cross-correlation of multiply scattered waves (the diffuse component) is independent of the cross-correlation of ocean microseismic quasi-point-source responses (the non-diffuse component), and that the cross-correlation function of ambient seismic data is the sum of both. We can thus blindly separate the non-diffuse component, due to physical point sources, from the more diffuse component, due to cross-correlation of multiply scattered noise, based on their statistical independence. We also perform beamforming over different frequency bands for the cross-correlations before and after the separation, and find that the decomposed Rayleigh wave shows more coherent features across all Rayleigh-wave polarization cross-correlation components. We show that after separating the non-diffuse component, the Frequency-Time Analysis results are less ambiguous. In addition, we estimate the bias in phase velocity in the raw cross-correlation data due to the non-diffuse component. We also apply this technique to a few borehole stations in Groningen, the Netherlands, to demonstrate its applicability in different instrument and geology settings.
SU-E-T-404: Simple Field-In-Field Technique for Total Body Irradiation in Large Patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chi, P; Pinnix, C; Dabaja, B
2014-06-01
Purpose: A simple Field-in-Field technique for Total Body Irradiation (TBI) was developed for traditional AP/PA TBI treatments to improve dosimetric uniformity in patients with large separation. Methods: TBI at our institution currently utilizes an AP/PA technique at an extended source-to-surface distance (SSD) of 380 cm, with patients in the left decubitus position during the AP beam and in the right decubitus position during the PA beam. Patients who have differences in thickness (separation) between the abdomen and head greater than 10 cm undergo CT simulation in both left and right decubitus treatment positions. One plan for each CT is generated to evaluate dose to patient midline with both AP and PA fields, but only the corresponding AP fields are exported for treatment in the left decubitus position and the PA fields for the right decubitus position. Subfields are added by collimating with the x-ray jaws according to separation changes in 5-7% steps to minimize hot regions to less than 10%. Finally, the monitor units (MUs) for the plans are verified with hand calculation and water phantom measurements. Results: Dose uniformity (+/−10%) is achieved with field-in-field using only asymmetric jaws. The technique is dosimetrically robust with respect to the minor setup and patient variations that are inevitable given patient conditions. MUs calculated with Pinnacle were verified in 3 clinical cases, and only a 2% difference was found compared to the homogeneous calculation. In-vivo dosimeters were also used to verify the doses received by each patient and confirmed dose variations of less than 10%. Conclusion: We encountered several cases with separation differences that raised uniformity concerns, based on the assumption of a 1% dose difference per cm of separation difference. This could result in an unintended hot spot, often in the head/neck, of up to 25%.
This method allows dose modulation without adding treatment complexity or introducing radiobiological variations, providing a reasonable solution for this unique TBI situation.
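The arithmetic behind the abstract's numbers (a 1% dose difference per cm of separation difference, trimmed in 5-7% subfield steps toward a ±10% uniformity target) can be sketched as a toy calculation. This is only an illustration of the stated rule of thumb, not a clinical dose calculation; the 6% step value is an assumed midpoint of the quoted 5-7% range.

```python
def estimated_hot_spot(sep_ref_cm, sep_region_cm):
    """Assumed rule of thumb from the abstract: roughly 1% dose difference
    per cm of separation difference relative to the prescription region."""
    return abs(sep_ref_cm - sep_region_cm) * 1.0  # percent

def subfields_needed(hot_spot_pct, step_pct=6.0, target_pct=10.0):
    """Count field-in-field jaw steps (each trimming ~step_pct of dose)
    needed to bring a hot region under the uniformity target."""
    n = 0
    while hot_spot_pct > target_pct:
        hot_spot_pct -= step_pct
        n += 1
    return n
```

For the worst case quoted in the abstract (a 25% head/neck hot spot, i.e., roughly a 25 cm separation difference), a handful of subfields suffices to restore ±10% uniformity under these assumptions.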
NASA Astrophysics Data System (ADS)
Wang, Jinhai; Liu, Dongyuan; Sun, Jinggong; Zhang, Yanjun; Sun, Qiuming; Ma, Jun; Zheng, Yu; Wang, Huiquan
2016-10-01
Near-infrared (NIR) brain imaging is one of the most promising techniques for brain research in recent years. As a significant supplement to clinical imaging techniques such as CT and MRI, NIR can achieve fast, non-invasive, and low-cost imaging of the brain, and is widely used for brain functional imaging and hematoma detection. Because of optical attenuation, NIR imaging can reach a depth of only several centimeters. The structure of the human brain is particularly complex: from the perspective of optical detection, the measurement light must pass through the skin, skull, cerebrospinal fluid (CSF), grey matter, and white matter, and then travel back through these layers in reverse order to the detector. The more photons from the depth of interest (DOI) in the brain the detector captures, the better the detection accuracy and stability that can be obtained. In this study, the Equivalent Signal-to-Noise Ratio (ESNR), defined as the proportion of detected photons originating from the DOI to the total detected photons, was used to evaluate the best source-detector (SD) separation. A Monte Carlo (MC) simulation of a multi-layer brain model was used to analyze the distribution of the ESNR along the radial direction for different DOIs and several basic optical and structural brain parameters. A map between the best detection SD separation, i.e., the distance at which the ESNR is highest, and the brain parameters was established for choosing the best detection point in NIR brain imaging applications. The results showed that the ESNR is very sensitive to the SD separation, so choosing the best SD separation based on the ESNR is important for NIR brain imaging applications. This provides a useful reference and a new perspective for brain imaging in the near infrared.
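The ESNR calculation can be illustrated with a toy Monte Carlo photon random walk. This sketch uses a single homogeneous half-space, isotropic scattering, and surface-exit detection bins, not the study's multi-layer brain model; all optical coefficients and geometry values are invented for the demo.

```python
import math, random

def esnr_vs_separation(n_photons=20000, mu_a=0.1, mu_s=10.0,
                       doi_depth=1.5, seps=(1.0, 2.0, 3.0),
                       bin_w=0.5, seed=1):
    """ESNR per SD separation: among photons re-emerging near each detector
    position, the fraction whose path reached the depth of interest."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    hits = {s: [0, 0] for s in seps}      # sep -> [deep hits, total hits]
    for _ in range(n_photons):
        x, z = 0.0, 0.0
        theta = rng.uniform(0.0, math.pi)  # launch into the medium
        deep = False
        while True:
            step = -math.log(rng.random()) / mu_t   # exponential free path
            x += step * math.cos(theta)
            z += step * math.sin(theta)
            if z >= doi_depth:
                deep = True
            if z <= 0.0:                   # photon re-emerges at the surface
                for s in seps:
                    if abs(x - s) <= bin_w / 2:
                        hits[s][1] += 1
                        hits[s][0] += deep
                break
            if rng.random() < mu_a / mu_t: # absorbed
                break
            theta = rng.uniform(0.0, 2 * math.pi)   # isotropic scatter
    return {s: (h[0] / h[1] if h[1] else 0.0) for s, h in hits.items()}
```

Even in this crude model, the detected photon paths sample deeper tissue as the SD separation grows, which is the qualitative behavior the study quantifies with its layered model.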
Single-channel mixed signal blind source separation algorithm based on multiple ICA processing
NASA Astrophysics Data System (ADS)
Cheng, Xiefeng; Li, Ji
2017-01-01
Motivated by the problem of separating the fetal heart sound signal from the mixed signal obtained with an electronic stethoscope, this paper proposes a single-channel blind source separation algorithm based on repeated ICA processing. First, empirical mode decomposition (EMD) splits the single-channel mixed signal into multiple orthogonal signal components, which are processed by ICA; the resulting independent signal components are called independent sub-components of the mixed signal. Then, by combining these independent sub-components with the single-channel mixed signal, the single channel is expanded into a multichannel signal, which turns the under-determined blind source separation problem into a well-posed one. An estimate of the source signal is then obtained by ICA processing. Finally, if the separation is not satisfactory, the previous separation result is combined with the single-channel mixed signal and the ICA processing is repeated until the desired estimate of the source signal is obtained. Simulation results show that the algorithm separates single-channel mixed physiological signals effectively.
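The ICA stage of the pipeline can be sketched with a minimal deflation FastICA. This is a generic illustration of the ICA step only; the EMD expansion is assumed to have already produced the multichannel matrix `X`, and the test below substitutes a synthetic two-channel mixture for it. It is not the paper's implementation.

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Deflation FastICA with the tanh nonlinearity on X
    (n_channels x n_samples); returns estimated independent components."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # whiten with the covariance eigendecomposition
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n, N = Z.shape
    W = np.zeros((n, n))
    for i in range(n):
        w = rng.standard_normal(n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z.T @ w)
            w_new = Z @ g / N - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)  # deflation: stay orthogonal
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(np.dot(w_new, w)) - 1.0) < 1e-10
            w = w_new
            if converged:
                break
        W[i] = w
    return W @ Z
```

In the paper's scheme this routine would be applied repeatedly to the expanded multichannel signal until the fetal heart sound estimate is satisfactory.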
Separation of cells from the rat anterior pituitary gland
NASA Technical Reports Server (NTRS)
Hymer, Wesley C.; Hatfield, J. Michael
1983-01-01
Various techniques for separating the hormone-producing cell types from the rat anterior pituitary gland are examined. The purity, viability, and responsiveness of the separated cells depend on the physiological state of the donor, the tissue dissociation procedures, the staining technique used for identification of cell type, and the cell separation technique. The chamber-gradient setup and operation, the characteristics of the gradient materials, and the separated cell analysis of velocity sedimentation techniques (in particular Staput and Celsep) are described. Consideration is given to the various types of materials used in density gradient centrifugation and the operation of a gradient generating device. The use of electrophoresis to separate rat pituitary cells is discussed.
Detection of a very bright source close to the LMC supernova SN 1987A
NASA Technical Reports Server (NTRS)
Nisenson, P.; Papaliolios, C.; Karovska, M.; Noyes, R.
1987-01-01
High angular resolution observations of the supernova in the Large Magellanic Cloud, SN 1987A, have revealed a bright source separated from the SN by approximately 60 mas with a magnitude difference of 2.7 at 656 nm (H-alpha). Speckle imaging techniques were applied to data recorded with the CfA two-dimensional photon counting detector on the CTIO 4 m telescope on March 25 and April 2 to allow measurements in H-alpha on both nights and at 533 nm and 450 nm on the second night. The nature of this object is as yet unknown, though it is almost certainly a phenomenon related to the SN.
NASA Technical Reports Server (NTRS)
Baker, John; Thorpe, Ira
2012-01-01
Thoroughly studied classic space-based gravitational-wave mission concepts such as the Laser Interferometer Space Antenna (LISA) are based on laser-interferometry techniques. Ongoing developments in atom-interferometry techniques have spurred recently proposed alternative mission concepts. These different approaches can be understood on a common footing. We present a comparative analysis of how each type of instrument responds to some of the noise sources that may limit gravitational-wave mission concepts. Sensitivity to laser frequency instability is essentially the same for either approach. Spacecraft acceleration reference stability sensitivities are different, allowing smaller spacecraft separations in the atom interferometry approach, but acceleration noise requirements are nonetheless similar. Each approach has distinct additional measurement noise issues.
Khutorianskiĭ, V A; Smirnov, A I; Matveev, D A
2014-01-01
The method of microcolumn reversed-phase high performance liquid chromatography (rp-HPLC) was employed to determine the content of elemental sulphur in mineral waters. The study involved the analysis of samples of the sulphide-containing mineral waters Novonukutskaya and Matsesta obtained by the solid-phase extraction technique. Based on these data, the authors discuss the origin and circulation of sulphur in the hydrogen sulphide sources. The elution conditions selected in this study ensured high-resolution separation of the octasulphur peak from the peaks of the allotropic components of the extract, whereas the two-wavelength detection technique allowed identification of the peaks of molecular sulphur.
Steinheimer, T.R.; Pereira, W.E.; Johnson, S.M.
1981-01-01
A bed sediment sample taken from an area impacted by heavy industrial activity was analyzed for organic compounds of environmental significance. Extraction was effected on a Soxhlet apparatus using a freeze-dried sample. The Soxhlet extract was fractionated by silica gel micro-column adsorption chromatography. Separation and identification of the organic compounds were accomplished by capillary gas chromatography/mass spectrometry techniques. More than 50 compounds were identified; these include saturated hydrocarbons, olefins, aromatic hydrocarbons, alkylated polycyclic aromatic hydrocarbons, and oxygenated compounds such as aldehydes and ketones. The role of bed sediments as a source or sink for organic pollutants is discussed. © 1981.
Turbulent flow separation control through passive techniques
NASA Technical Reports Server (NTRS)
Lin, J. C.; Howard, F. G.; Selby, G. V.
1989-01-01
Several passive separation control techniques for controlling moderate two-dimensional turbulent flow separation over a backward-facing ramp are studied. Small transverse and swept grooves, passive porous surfaces, large longitudinal grooves, and vortex generators were among the techniques used. It was found that, unlike the transverse and longitudinal grooves of an equivalent size, the 45-deg swept-groove configurations tested tended to enhance separation.
Separation of Migration and Tomography Modes of Full-Waveform Inversion in the Plane Wave Domain
NASA Astrophysics Data System (ADS)
Yao, Gang; da Silva, Nuno V.; Warner, Michael; Kalinicheva, Tatiana
2018-02-01
Full-waveform inversion (FWI) includes both migration and tomography modes. The migration mode acts like a nonlinear least-squares migration to map model interfaces with reflections, while the tomography mode behaves as tomography to build a background velocity model. The migration mode is the main response when inverting reflections, whereas the tomography mode arises when inverting both reflections and refractions. To emphasize one of the two modes in FWI, especially for inverting reflections, the two modes must be separated in the gradient of FWI. Here we present a new method to achieve this separation with an angle-dependent filtering technique in the plane wave domain. We first transform the source and residual wavefields into the plane wave domain with the Fourier transform and then decompose them into the migration and tomography components using the opening angles between the transformed source and residual plane waves. Opening angles close to 180° contribute to the tomography component, while the others correspond to the migration component. We find that this approach is very effective and robust even when the medium is relatively complicated, with strong lateral heterogeneities, highly dipping reflectors, and strong anisotropy. This is well demonstrated by theoretical analysis and numerical tests with a synthetic data set and a field data set.
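The opening-angle classification at the heart of the filter can be sketched directly. This is an illustration of the geometric criterion only, not the full plane-wave-domain implementation; the 160° cutoff is an invented illustrative threshold, not a value from the paper.

```python
import math

def opening_angle(k_src, k_res):
    """Angle in degrees between a source plane-wave direction and a
    residual plane-wave direction."""
    dot = sum(a * b for a, b in zip(k_src, k_res))
    na = math.sqrt(sum(a * a for a in k_src))
    nb = math.sqrt(sum(b * b for b in k_res))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def classify_pair(k_src, k_res, cutoff_deg=160.0):
    """Plane-wave pairs with opening angles near 180 degrees feed the
    tomography component of the gradient; the rest feed migration."""
    if opening_angle(k_src, k_res) >= cutoff_deg:
        return "tomography"
    return "migration"
```

Counter-propagating source and residual plane waves (transmission-like paths) are tagged as tomography, while reflection-like pairs with smaller opening angles are tagged as migration, mirroring the decomposition the abstract describes.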
Advances in audio source separation and multisource audio content retrieval
NASA Astrophysics Data System (ADS)
Vincent, Emmanuel
2012-06-01
Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
Household food waste separation behavior and the importance of convenience.
Bernstad, Anna
2014-07-01
Two different strategies aimed at increasing household source-separation of food waste were assessed through a case study in a Swedish residential area: (a) use of written information, distributed as leaflets among households, and (b) installation of equipment for source-segregation of waste, with the aim of making food waste sorting in kitchens more convenient. Weighing of separately collected food waste before and after distribution of the written information suggests that it resulted in neither a significantly increased amount of separately collected food waste nor an increased source-separation ratio. After installation of sorting equipment in the households, both the amount of separately collected food waste and the source-separation ratio increased vastly. Long-term monitoring shows that the results were long-lasting. The results emphasize the importance of convenience and of the infrastructure necessary for source-segregation of waste as factors in household waste recycling, but also highlight the need to address these aspects where the waste is generated, i.e. inside the household. Copyright © 2014 Elsevier Ltd. All rights reserved.
Probing quantum correlation functions through energy-absorption interferometry
NASA Astrophysics Data System (ADS)
Withington, S.; Thomas, C. N.; Goldie, D. J.
2017-08-01
An interferometric technique is described for determining the spatial forms of the individual degrees of freedom through which a many-body system can absorb energy from its environment. The method separates out the spatial forms of the coherent excitations present at any single frequency; it is not necessary to sweep the frequency and then infer the spatial forms of possible excitations from resonant absorption features. The system under test is excited with two external sources, which create generalized forces, and the fringe in the total power dissipated is measured as the relative phase between the sources is varied. If the complex fringe visibility is measured for different pairs of source locations, the anti-Hermitian part of the complex-valued nonlocal correlation tensor can be determined, which can then be decomposed to give the natural dynamical modes of the system and their relative responsivities. If each source in the interferometer creates a different kind of force, the spatial forms of the individual excitations that are responsible for cross-correlated response can be found. The technique is related to holography, but measures the state of coherence to which the system is maximally sensitive. It can be applied across a wide range of wavelengths, in a variety of ways, to homogeneous media, thin films, patterned structures, and components such as sensors, detectors, and energy-harvesting absorbers.
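The decomposition step described above, extracting the anti-Hermitian part of the measured correlation tensor and diagonalizing it to obtain the natural dynamical modes and their relative responsivities, can be sketched on synthetic data. This is a generic linear-algebra illustration of that step, not the authors' measurement pipeline.

```python
import numpy as np

def dynamical_modes(C):
    """Rescaled anti-Hermitian part of a complex correlation tensor C is
    Hermitian; its eigenvectors are the natural dynamical modes and its
    eigenvalues their relative responsivities (strongest first)."""
    A = (C - C.conj().T) / 2j        # Hermitian by construction
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(vals)[::-1]   # strongest response first
    return vals[order], vecs[:, order]
```

Adding an arbitrary Hermitian part to `C` (the part the fringe measurement is insensitive to) leaves the recovered modes and responsivities unchanged, which is why only the anti-Hermitian part matters.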
A Photometric Technique for Determining Fluid Concentration using Consumer-Grade Hardware
NASA Technical Reports Server (NTRS)
Leslie, F.; Ramachandran, N.
1999-01-01
In support of a separate study to produce an exponential concentration gradient in a magnetic fluid, a noninvasive technique for determining species concentration from off-the-shelf hardware has been developed. The approach uses a backlighted fluid test cell photographed with a commercial digital camcorder. Because the light extinction coefficient is wavelength dependent, tests were conducted to determine the best filter color to use, with some guidance also provided by an absorption spectrophotometer. With the appropriate filter in place, the attenuation of the light passing through the test cell was captured by the camcorder. The digital image was analyzed for intensity using software from Scion Image Corp. downloaded from the Internet. The analysis provides a two-dimensional array of concentration with an average error of 0.0095 ml/ml. This technique is superior to invasive techniques, which require extraction of a sample that disturbs the concentration distribution in the test cell. Refinements of this technique using a true monochromatic laser light source are also discussed.
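The intensity-to-concentration mapping underlying this kind of photometric measurement is a Beer-Lambert inversion, which can be sketched per pixel. This is a generic illustration, not the paper's calibration; the absorptivity and path length below are assumed calibration values.

```python
import math

def concentration_from_intensity(i_sample, i_reference, epsilon, path_cm):
    """Invert the Beer-Lambert law: recover concentration from transmitted
    intensity relative to a reference (no-absorber) intensity.
    epsilon is the decadic absorptivity for the chosen filter band."""
    absorbance = -math.log10(i_sample / i_reference)
    return absorbance / (epsilon * path_cm)
```

Applying this pixel-by-pixel to the filtered camcorder image yields the two-dimensional concentration array the abstract describes, provided `epsilon` has been calibrated for the filter's passband.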
Very-low-energy-spread ion sources
NASA Astrophysics Data System (ADS)
Lee, Y.
1997-05-01
Ion beams with low axial energy spread are required in many applications such as ion projection lithography, isobaric separation in radioactive ion beam experiments, and ion beam deposition processes. In an ion source, the spread of the axial ion energy is caused by the nonuniformity of the plasma potential distribution along the source axis. Multicusp ion sources are capable of producing positive and negative ions with good beam quality and relatively low energy spread. By introducing a magnetic filter inside the multicusp source chamber, the axial plasma potential distribution is modified and the energy spread of positive hydrogen ions can be reduced to as low as 1 eV. The energy spread measurements of multicusp sources have been conducted by employing three different techniques: an electrostatic energy analyzer at the source exit; a magnetic deflection spectrometer; and a retarding-field energy analyzer for the accelerated beam. These different measurements confirmed that the axial energy spread of positive and negative ions generated in the filter-equipped multicusp sources is small. New ion source configurations are now being investigated at LBNL with the purpose of achieving even lower energy spread (<1 eV) and of maximizing source performance characteristics such as reliability and lifetime.
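The retarding-field analyzer, the third technique listed, recovers the energy distribution as the negative derivative of the collector current with respect to the retarding voltage. A minimal numerical sketch of that reduction on synthetic data (not the LBNL analysis code):

```python
import numpy as np

def energy_distribution(voltages, currents):
    """Ion energy distribution from a retarding-field analyzer I-V curve.

    Ions reach the collector only if their axial energy exceeds the
    retarding potential, so I(V) is the integral of the distribution
    above e*V, and f(E) is proportional to -dI/dV.
    """
    return -np.gradient(currents, voltages)

# Synthetic I-V curve for a beam centred at 10 eV with ~1 eV spread:
v = np.linspace(0.0, 20.0, 401)
f_true = np.exp(-0.5 * (v - 10.0) ** 2)           # Gaussian energy spectrum
currents = np.cumsum(f_true[::-1])[::-1]          # I(V) ~ integral over E > V
f_est = energy_distribution(v, currents)
peak_energy = v[np.argmax(f_est)]                 # recovered beam energy
```

The width of `f_est` (e.g. its FWHM) is the measured energy spread; real data would additionally need smoothing before differentiation.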
Full-Scale Turbofan-Engine Turbine-Transfer Function Determination Using Three Internal Sensors
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
Noise-source separation techniques, using three engine-internal sensors, are applied to existing static-engine test data to determine the turbine transfer function for the currently subdominant combustion noise. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP), and an improvement to the combustion-noise module GECOR is suggested. The work was carried out in response to the NASA Fundamental Aeronautics Subsonic Fixed Wing Program's Reduced-Perceived-Noise Technical Challenge.
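The abstract does not spell out the estimator, but the classic three-signal technique estimates the auto-spectrum of the component common to all three sensors from cross-spectra alone, since noise uncorrelated between channels averages out of the cross-spectra. The sketch below assumes that formulation (all names invented; not necessarily the paper's exact method):

```python
import numpy as np

def common_source_power(x1, x2, x3, seg_len=256):
    """Three-signal cross-spectral estimate of the power of the source
    common to all three sensors:

        S_cc(f) = |S12(f)| * |S13(f)| / |S23(f)|

    Channel-independent noise averages out of the cross-spectra, which
    is what lets a subdominant common component be recovered.
    """
    nseg = len(x1) // seg_len
    S12 = np.zeros(seg_len // 2 + 1, dtype=complex)
    S13 = np.zeros_like(S12)
    S23 = np.zeros_like(S12)
    for k in range(nseg):
        sl = slice(k * seg_len, (k + 1) * seg_len)
        A, B, C = np.fft.rfft(x1[sl]), np.fft.rfft(x2[sl]), np.fft.rfft(x3[sl])
        S12 += A * np.conj(B)
        S13 += A * np.conj(C)
        S23 += B * np.conj(C)
    return np.abs(S12) * np.abs(S13) / np.maximum(np.abs(S23), 1e-30)

# A common tone buried in independent sensor noise:
rng = np.random.default_rng(0)
n = 256 * 64
s = np.sin(2 * np.pi * 0.125 * np.arange(n))   # lands on bin 32 of each segment
x1 = s + rng.standard_normal(n)
x2 = s + rng.standard_normal(n)
x3 = s + rng.standard_normal(n)
p = common_source_power(x1, x2, x3)
peak_bin = int(np.argmax(p))
```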
Special opportunities in helicopter aerodynamics
NASA Technical Reports Server (NTRS)
Mccroskey, W. J.
1983-01-01
Aerodynamic research relating to modern helicopters includes the study of three-dimensional, unsteady, nonlinear flow fields. A selective review is made of some of the phenomena that hamper the development of satisfactory engineering prediction techniques, but which provide a rich source of research opportunities: flow separations, compressibility effects, complex vortical wakes, and aerodynamic interference between components. Several examples of work in progress are given, including dynamic stall alleviation, the development of computational methods for transonic flow, rotor-wake predictions, and blade-vortex interactions.
Techniques for the conversion to carbon dioxide of oxygen from dissolved sulfate in thermal waters
Nehring, N.L.; Bowen, P.A.; Truesdell, A.H.
1977-01-01
The fractionation of oxygen isotopes between dissolved sulfate ions and water provides a useful geothermometer for geothermal waters. The oxygen isotope composition of dissolved sulfate may also be used to indicate the source of the sulfate and processes of formation. The methods described here for separation, purification and reduction of sulfate to prepare carbon dioxide for mass spectrometric analysis are modifications of methods by Rafter (1967), Mizutani (1971), Sakai and Krouse (1971), and Mizutani and Rafter (1969). © 1976.
Implementation of trinary logic in a polarization encoded optical shadow-casting scheme.
Rizvi, R A; Zaheer, K; Zubairy, M S
1991-03-10
The design of various multioutput trinary combinational logic units by a polarization encoded optical shadow-casting (POSC) technique is presented. The POSC modified algorithm is employed to design and implement these logic elements in a trinary number system with separate and simultaneous generation of outputs. A detailed solution of the POSC logic equations for a fixed source plane and a fixed decoding mask is given to obtain input pixel coding for a trinary half-adder, full adder, and subtractor.
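The optical shadow-casting hardware cannot be reproduced in software here, but the logic it implements can. A minimal model of a trinary half adder and full adder, assuming an unbalanced ternary system with trits in {0, 1, 2}:

```python
def trinary_half_adder(a, b):
    """Half adder for trits in {0, 1, 2} (unbalanced ternary).

    Returns (sum_trit, carry_trit) such that a + b == sum + 3 * carry.
    """
    total = a + b
    return total % 3, total // 3

def trinary_full_adder(a, b, cin):
    """Full adder built by cascading two half adders on the carry path."""
    s1, c1 = trinary_half_adder(a, b)
    s2, c2 = trinary_half_adder(s1, cin)
    return s2, c1 + c2   # at most one of c1, c2 exceeds zero per stage

digit, carry = trinary_full_adder(2, 2, 1)   # 5 = 1*3 + 2
```

In the POSC scheme each of these truth-table entries corresponds to a polarization-encoded source/mask combination; the model above just checks the arithmetic those tables must satisfy.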
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Liu, J.; Angelopoulos, V.; Chu, X.; McPherron, R. L.
2016-12-01
Although Earth's Region 1 and 2 currents are related to activities such as substorm initiation, their magnetospheric origin remains unclear. Utilizing the triangular configuration of THEMIS probes at 8-12 RE downtail, we seek the origin of nightside Region 1 and 2 currents. The triangular configuration allows a curlometer-like technique that does not rely on active-time boundary crossings, so we can examine the current distribution in quiet times as well as active times. Our statistical study reveals that both Region 1 and 2 currents exist in the plasma sheet during quiet and active times. Notably, this is the first unequivocal, in-situ evidence of the existence of Region 2 currents in the plasma sheet. Farther away from the neutral sheet than the Region 2 currents lie the Region 1 currents, which extend at least to the plasma sheet boundary layer. At geomagnetically quiet times, the separation between the two currents is located 2.5 RE from the neutral sheet. These findings suggest that the plasma sheet is a source of Region 1 and 2 currents regardless of geomagnetic activity level. During substorms, the separation between Region 1 and 2 currents migrates toward (away from) the neutral sheet as the plasma sheet thins (thickens). This migration indicates that the deformation of Region 1 and 2 currents is associated with redistribution of FAC sources in the magnetotail. In some substorms when the THEMIS probes encounter a dipolarization, a substorm current wedge (SCW) can be inferred from our technique, and it shows a distinctly larger current density than the pre-existing Region 1 currents. This difference suggests that the SCW is not just an enhancement of the pre-existing Region 1 current; the SCW and the Region 1 currents have different sources.
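A curlometer-like estimate from a triangular (three-probe) configuration can only constrain the current component normal to the spacecraft plane. A simplified, hypothetical planar sketch (not the authors' pipeline) that fits a linear field model through the three probes and applies Ampère's law:

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability [T m / A]

def planar_curlometer(positions, b_fields):
    """Current density normal to the plane of three probes.

    positions : (3, 2) probe x, y coordinates in the plane [m]
    b_fields  : (3, 2) Bx, By measured at each probe [T]

    An affine fit B(x, y) = B0 + G @ [x, y] gives the in-plane gradient
    tensor G; Ampere's law then yields J_z = (dBy/dx - dBx/dy) / mu0.
    """
    A = np.column_stack([np.ones(3), positions])      # design matrix [1, x, y]
    coef, *_ = np.linalg.lstsq(A, b_fields, rcond=None)
    dBx_dy = coef[2, 0]   # coefficient of y in the Bx fit
    dBy_dx = coef[1, 1]   # coefficient of x in the By fit
    return (dBy_dx - dBx_dy) / MU0

# Field of a uniform out-of-plane current: B = k * (-y, x), curl_z = 2k
pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
b = 1e-9 * np.array([[0.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
jz = planar_curlometer(pos, b)   # expected 2e-9 / mu0
```

The real four-spacecraft curlometer recovers the full current vector; with three probes the linear fit is exactly determined and only one curl component is accessible.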
NASA Astrophysics Data System (ADS)
McKee, K. F.; Fee, D.; Haney, M. M.; Lyons, J. J.; Matoza, R. S.
2016-12-01
A ground-coupled airwave (GCA) occurs when an incident atmospheric pressure wave encounters the Earth's surface and part of the energy of the wave is transferred to the ground (i.e. coupled to the ground) as a seismic wave. This seismic wave propagates as a surface Rayleigh wave evidenced by the retrograde particle motion detected on a three-component seismometer. Acoustic waves recorded on a collocated microphone and seismometer can be coherent and have a 90-degree phase difference, predicted by theory and in agreement with observations. If the sensors are separated relative to the frequencies of interest, usually 10s to 100s of meters, then recorded wind noise becomes incoherent and an additional phase shift is present due to the separation distance. These characteristics of GCAs have been used to distinguish wind noise from other sources as well as to determine the acoustic contribution to seismic recordings. Here we aim to develop a minimalist infrasound signal detection and characterization technique requiring just one microphone and one three-component seismometer. Based on GCA theory, determining a source azimuth should be possible using a single seismo-acoustic sensor pair by utilizing the phase difference and exploiting the characteristic particle motion. We will use synthetic seismo-acoustic data generated by a coupled Earth-atmosphere 3D finite difference code to test and tune the detection and characterization method. The method will then be further tested using various well-constrained sources (e.g. Chelyabinsk meteor, Pagan Volcano, Cleveland Volcano). Such a technique would be advantageous in situations where resources are limited and large sensor networks are not feasible.
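The proposed single-pair characterization relies on the 90-degree phase relation and the retrograde particle motion. A sketch of one way a back-azimuth could be recovered from a single three-component record, under sign conventions assumed for this synthetic example (not the authors' method):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (stand-in for scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def rayleigh_back_azimuth(z, north, east):
    """Back-azimuth from one 3-C seismometer via retrograde particle motion.

    For a retrograde Rayleigh wave the radial displacement leads the
    vertical by 90 degrees, so correlating the horizontals against the
    90-degree-shifted vertical recovers the radial direction with its
    sign, resolving the usual 180-degree polarization ambiguity.
    """
    z90 = np.imag(analytic_signal(z))     # vertical shifted by 90 degrees
    cn = np.dot(z90, north)
    ce = np.dot(z90, east)
    return np.degrees(np.arctan2(ce, cn)) % 360.0

# Synthetic retrograde Rayleigh wave propagating toward azimuth 60 deg,
# i.e. arriving from a back-azimuth of 240 deg:
t = np.linspace(0.0, 2.0 * np.pi * 32, 4096, endpoint=False)
uz = np.cos(t)
ur = -np.sin(t)                           # retrograde: radial leads vertical
phi = np.radians(60.0)
baz = rayleigh_back_azimuth(uz, np.cos(phi) * ur, np.sin(phi) * ur)
```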
Stormflow-hydrograph separation based on isotopes: the thrill is gone--what's next?
Burns, Douglas A.
2002-01-01
Beginning in the 1970s, the promise of a new method for separating stormflow hydrographs using ¹⁸O, ²H, and ³H proved an irresistible temptation, and was a vast improvement over the graphical separation and solute tracer methods that were prevalent at the time. Eventually, hydrologists realized that this new method entailed a plethora of assumptions about temporal and spatial homogeneity of isotopic composition (many of which were commonly violated). Nevertheless, hydrologists forged ahead with dozens of isotope-based hydrograph-separation studies that were published in the 1970s and 1980s. Hortonian overland flow was presumed dead. By the late 1980s, the new isotope-based hydrograph separation technique had moved into adolescence, accompanied by typical adolescent problems such as confusion and a search for identity. As experienced hydrologists continued to use the isotope technique to study stormflow hydrology in forested catchments in humid climates, their younger peers followed obligingly—again and again. Was Hortonian overland flow really dead and forgotten, though? What about catchments in which people live and work? And what about catchments in dry climates and the tropics? How useful were study results when several of the assumptions about the homogeneity of source waters were commonly violated? What if two components could not explain the variation of isotopic composition measured in the stream during stormflow? And what about uncertainty? As with many new tools, once the initial shine wore off, the limitations of the method became a concern—one of which was that isotope-based hydrograph separations alone could not reveal much about the flow paths by which water arrives at a stream channel during storms.
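The workhorse behind all of these studies is the two-component mixing model: water and tracer mass balance between pre-event (old) and event (new) water. For reference, a minimal implementation (variable names invented):

```python
def event_water_fraction(delta_stream, delta_pre, delta_event):
    """Two-component isotope hydrograph separation.

    Mass balance of water and tracer (e.g. delta-18O, in permil):
        Q_s = Q_pre + Q_event
        Q_s * d_s = Q_pre * d_pre + Q_event * d_event
    =>  f_event = (d_s - d_pre) / (d_event - d_pre)

    Returns the fraction of streamflow that is new (event) water.
    The assumptions flagged in the abstract apply: each end-member must
    be constant in time and space, and the two end-members must differ
    measurably in isotopic composition.
    """
    return (delta_stream - delta_pre) / (delta_event - delta_pre)

# Stream at -9.0 permil, pre-event groundwater at -10.0, rain at -6.0:
f_new = event_water_fraction(-9.0, -10.0, -6.0)
```

When the two end-members converge, the denominator vanishes and the separation becomes unstable, which is one concrete face of the uncertainty question the abstract raises.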
A selection of giant radio sources from NVSS
Proctor, D. D.
2016-06-01
Results of the application of pattern-recognition techniques to the problem of identifying giant radio sources (GRSs) from the data in the NVSS catalog are presented, and issues affecting the process are explored. Decision-tree pattern-recognition software was applied to training-set source pairs developed from known NVSS large-angular-size radio galaxies. The full training set consisted of 51,195 source pairs, 48 of which were known GRSs for which each lobe was primarily represented by a single catalog component. The source pairs had a maximum separation of 20 arcmin and a minimum component area of 1.87 square arcmin at the 1.4 mJy level. The importance of comparing the resulting probability distributions of the training and application sets for cases of unknown class ratio is demonstrated. The probability of correctly ranking a randomly selected (GRS, non-GRS) pair from the best of the tested classifiers was determined to be 97.8 ± 1.5%. The best classifiers were applied to the over 870,000 candidate pairs from the entire catalog. Images of higher-ranked sources were visually screened, and a table of over 1600 candidates, including morphological annotation, is presented. These systems include doubles and triples, wide-angle tail and narrow-angle tail, S- or Z-shaped systems, and core-jets and resolved cores. In conclusion, while some resolved-lobe systems are recovered with this technique, generally it is expected that such systems would require a different approach.
Tasci, Tonguc O; Johnson, William P; Fernandez, Diego P; Manangon, Eliana; Gale, Bruce K
2014-10-24
Compared to other sub-techniques of field flow fractionation (FFF), cyclical electrical field flow fractionation (CyElFFF) is a relatively new method with many opportunities remaining for improvement. One of the most important limitations of this method is the separation of particles smaller than 100 nm. For such small particles, the diffusion rate becomes very high, resulting in severe reductions in the CyElFFF separation efficiency. To address this limitation, we modified the electrical circuitry of the ElFFF system. In all earlier ElFFF reports, electrical power sources have been directly connected to the ElFFF channel electrodes, and no alteration has been made in the electrical circuitry of the system. In this work, by using discrete electrical components, such as resistors and diodes, we improved the effective electric field in the system to allow high resolution separations. By modifying the electrical circuitry of the ElFFF system, high resolution separations of 15 and 40 nm gold nanoparticles were achieved. The effects of applying different frequencies, amplitudes and voltage shapes have been investigated and analyzed through experiments. Copyright © 2014 Elsevier B.V. All rights reserved.
Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi
2014-01-01
A compound fault signal usually contains multiple characteristic signals and strong background noise, which makes it difficult to separate weak fault signals through conventional means such as FFT-based envelope detection, wavelet transform, or empirical mode decomposition used individually. In order to improve the diagnosis of compound faults in rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix for ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in compound-fault separation, which works not only for the outer-race defect, but also for the roller defect and the unbalance fault of the experimental system. PMID:25289644
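The cross-correlation criterion for choosing ICA inputs can be sketched as follows (the threshold value and names are illustrative assumptions; EEMD itself is assumed to have been computed elsewhere):

```python
import numpy as np

def select_imfs(signal, imfs, threshold=0.3):
    """Keep the IMFs that correlate with the raw signal.

    An IMF whose normalized correlation coefficient with the measured
    signal exceeds `threshold` is retained as an input channel for ICA;
    low-correlation IMFs are treated as noise or decomposition residue.
    """
    keep = []
    for imf in imfs:
        r = np.corrcoef(signal, imf)[0, 1]
        if abs(r) >= threshold:
            keep.append(imf)
    return np.array(keep)

# Toy check: two tone 'IMFs' that build the signal, plus one pure-noise IMF
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
tone1 = np.sin(2 * np.pi * 13 * t)
tone2 = 0.8 * np.sin(2 * np.pi * 89 * t)
mix = tone1 + tone2
imfs = [tone2, tone1, rng.standard_normal(t.size)]
selected = select_imfs(mix, imfs)   # the noise IMF is rejected
```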
Shear velocity of the Rotokawa geothermal field using ambient noise
NASA Astrophysics Data System (ADS)
Civilini, F.; Savage, M. K.; Townend, J.
2014-12-01
Ambient noise correlation is an increasingly popular seismological technique that uses the ambient seismic noise recorded at two stations to construct an empirical Green's function. Applications of this technique include determining shear velocity structure and attenuation. An advantage of ambient noise is that it does not rely on external sources of seismic energy such as local or teleseismic earthquakes. This method has been used in the geothermal industry to determine the depths at which magmatic processes occur, to distinguish between production and non-production areas, and to observe seismic velocity perturbations associated with fluid extraction. We will present a velocity model for the Rotokawa geothermal field near Taupo, New Zealand, produced from ambient noise cross correlations. Production at Rotokawa is based on the "Rotokawa A" combined cycle power station established in 1997 and the "Nga Awa Purua" triple flash power plant established in 2010. Rotokawa Joint Venture, a partnership between Mighty River Power and Tauhara North No. 2 Trust currently operates 174 MW of generation at Rotokawa. An array of short period seismometers was installed in 2008 and occupies an area of roughly 5 square kilometers around the site. Although both cultural and natural noise sources are recorded at the stations, the instrument separation distance provides a unique challenge for analyzing cross correlations produced by both signal types. The inter-station spacing is on the order of a few kilometers, so waves from cultural sources generally are not coherent from one station to the other, while the wavelength produced by natural noise is greater than the station separation. Velocity models produced from these two source types will be compared to known geological models of the site. Depending on the amount of data needed to adequately construct cross-correlations, a time-dependent model of velocity will be established and compared with geothermal production processes.
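The core operation, segment-wise cross-correlation and stacking of two stations' noise records, can be sketched as follows on synthetic data; the lag of the stacked peak gives the inter-station travel time, which with the known station separation yields a surface-wave speed:

```python
import numpy as np

def noise_cross_correlation(u1, u2, seg_len=256):
    """Stack of segment-wise cross-correlations between two stations.

    Under diffuse-noise assumptions the stack converges toward the
    inter-station Green's function; segmenting and stacking suppresses
    contributions from any single transient source.
    """
    nseg = len(u1) // seg_len
    acc = np.zeros(2 * seg_len - 1)
    for k in range(nseg):
        a = u1[k * seg_len:(k + 1) * seg_len]
        b = u2[k * seg_len:(k + 1) * seg_len]
        acc += np.correlate(a, b, mode="full")
    lags = np.arange(-(seg_len - 1), seg_len)
    return lags, acc

# Synthetic 'diffuse' noise: station 2 records station 1's wavefield 5 samples late
rng = np.random.default_rng(2)
noise = rng.standard_normal(256 * 64 + 5)
u1 = noise[5:]
u2 = noise[:-5]
lags, cc = noise_cross_correlation(u1, u2)
delay = lags[np.argmax(cc)]     # magnitude 5; the sign encodes direction
```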
NASA Astrophysics Data System (ADS)
Chen, Qiang; Xu, Qian; Zhang, Yijun; Yang, Yinghui; Yong, Qi; Liu, Guoxiang; Liu, Xianwen
2018-03-01
A single satellite geodetic technique has weaknesses in mapping the sequence of ground deformation associated with serial seismic events; for example, InSAR with a long revisit period readily yields mixed, complex deformation signals from multiple events. This challenges the capability of any single satellite geodetic technique to accurately recognize individual surface deformation and earthquake models. The rapidly increasing availability of various satellite observations provides a good solution for overcoming this issue. In this study, we explore a sequential combination of multiple overlapping datasets from ALOS/PALSAR, ENVISAT/ASAR and GPS observations to separate surface deformation associated with the 2011 Mw 9.0 Tohoku-Oki main shock and two strong aftershocks, the Mw 6.6 Iwaki and Mw 5.8 Ibaraki events. We first estimate the fault slip model of the main shock with ASAR interferometry and GPS displacements as constraints. Because the PALSAR interferogram used spans the period of all the events, we then remove the surface deformation of the main shock through a forward-calculated prediction, thus obtaining the PALSAR InSAR deformation associated with the two strong aftershocks. The inversion for source parameters of the Iwaki aftershock is conducted using the refined PALSAR deformation, considering that the higher-magnitude Iwaki quake has a dominant deformation contribution relative to the Ibaraki event. After removal of the deformation component of the Iwaki event, we determine the fault slip distribution of the Ibaraki shock using the remaining PALSAR InSAR deformation. Finally, the complete source models for the serial seismic events are clearly identified from the sequential combination of multi-source satellite observations, which suggest that the main shock is a predominantly mega-thrust rupture, whereas the two aftershocks are normal faulting motions. The estimated seismic moment magnitudes for the Tohoku-Oki, Iwaki and Ibaraki events are Mw 9.0, Mw 6.85 and Mw 6.11, respectively.
Blind separation of incoherent and spatially disjoint sound sources
NASA Astrophysics Data System (ADS)
Dong, Bin; Antoni, Jérôme; Pereira, Antonio; Kellermann, Walter
2016-11-01
Blind separation of sound sources aims at reconstructing the individual sources which contribute to the overall radiation of an acoustical field. The challenge is to reach this goal using distant measurements when all sources are operating concurrently. The working assumption is usually that the sources of interest are incoherent - i.e. statistically orthogonal - so that their separation can be approached by decorrelating a set of simultaneous measurements, which amounts to diagonalizing the cross-spectral matrix. Principal Component Analysis (PCA) is traditionally used to this end. This paper reports two new findings in this context. First, a sufficient condition is established under which "virtual" sources returned by PCA coincide with true sources; it stipulates that the sources of interest should be not only incoherent but also spatially orthogonal. A particular case of this instance is met by spatially disjoint sources - i.e. with non-overlapping support sets. Second, based on this finding, a criterion that enforces both statistical and spatial orthogonality is proposed to blindly separate incoherent sound sources which radiate from disjoint domains. This criterion can be easily incorporated into acoustic imaging algorithms such as beamforming or acoustical holography to identify sound sources of different origins. The proposed methodology is validated on laboratory experiments. In particular, the separation of aeroacoustic sources is demonstrated in a wind tunnel.
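The decorrelation step described here amounts to an eigendecomposition of the cross-spectral matrix at each frequency. A minimal sketch with toy steering vectors (the paper's caveat applies: the eigen "virtual sources" match the true ones only when the sources are both incoherent and spatially orthogonal):

```python
import numpy as np

def pca_virtual_sources(csm):
    """Diagonalize a cross-spectral matrix (CSM) at one frequency.

    csm : (M, M) Hermitian matrix of cross-spectra between M microphones.
    Returns eigenvalues (virtual-source powers, descending) and the
    corresponding eigenvectors (virtual-source shapes at the array).
    """
    w, v = np.linalg.eigh(csm)
    order = np.argsort(w)[::-1]
    return w[order], v[:, order]

# Two incoherent sources with orthogonal steering vectors a1 and a2:
a1 = np.array([1.0, 1.0, 1.0, 1.0]) / 2.0
a2 = np.array([1.0, -1.0, 1.0, -1.0]) / 2.0
csm = 4.0 * np.outer(a1, a1) + 1.0 * np.outer(a2, a2)
powers, shapes = pca_virtual_sources(csm)   # recovers powers 4 and 1
```

If `a1` and `a2` were not orthogonal, the eigenvectors would mix the two sources even though the sources are statistically uncorrelated, which is exactly the sufficient condition the paper establishes.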
A comparison between DART-MS and DSA-MS in the forensic analysis of writing inks.
Drury, Nicholas; Ramotowski, Robert; Moini, Mehdi
2018-05-23
Ambient ionization mass spectrometry is gaining momentum in forensic science laboratories because of its high speed of analysis, minimal sample preparation, and information-rich results. One such application of ambient ionization methodology is the analysis of writing inks from questioned documents, where colorants of interest may not be soluble in common solvents, rendering thin-layer chromatography (TLC) and separation-mass spectrometry methods such as LC/MS(-MS) impractical. Ambient ionization mass spectrometry uses a variety of ionization techniques, such as Penning ionization in Direct Analysis in Real Time (DART), atmospheric-pressure chemical ionization in Direct Sample Analysis (DSA), and electrospray ionization in Desorption Electrospray Ionization (DESI). In this manuscript, two of the commonly used ambient ionization techniques are compared: Perkin Elmer DSA-MS and IonSense DART in conjunction with a JEOL AccuTOF MS. Both technologies were equally successful in analyzing writing inks and produced similar spectra. DSA-MS produced less background signal, likely because of its closed source configuration; however, the open source configuration of DART-MS provided more flexibility in sample positioning for optimum sensitivity, thereby allowing smaller pieces of paper containing writing ink to be analyzed. Under these conditions, the minimum sample required for DART-MS was 1 mm strokes of ink on paper, whereas DSA-MS required a minimum of 3 mm. Moreover, both techniques showed comparable repeatability. Evaluation of the analytical figures of merit, including sensitivity, linear dynamic range, and repeatability, for DSA-MS and DART-MS analysis is provided. To place the techniques in a forensic context, DART-MS was applied to the analysis of United States Secret Service ink samples directly on a sampling mesh, and the results were compared with DSA-MS of the same inks on paper.
Unlike analysis using separation mass spectrometry, which requires sample preparation, both DART-MS and DSA-MS successfully analyzed writing inks with minimal sample preparation. Copyright © 2018 Elsevier B.V. All rights reserved.
Ishikawa, Masayori; Nagase, Naomi; Matsuura, Taeko; Hiratsuka, Junichi; Suzuki, Ryusuke; Miyamoto, Naoki; Sutherland, Kenneth Lee; Fujita, Katsuhisa; Shirato, Hiroki
2015-03-01
The scintillator with optical fiber (SOF) dosimeter consists of a miniature scintillator mounted on the tip of an optical fiber. The scintillator of the current SOF dosimeter is a 1-mm diameter hemisphere. For a scintillation dosimeter coupled with an optical fiber, measurement accuracy is influenced by signals due to Cerenkov radiation in the optical fiber. We have implemented a spectral filtering technique for compensating for the Cerenkov radiation effect specifically for our plastic scintillator-based dosimeter, using a wavelength-separated counting method. A dichroic mirror was used for separating input light signals. Individual signal counting was performed for high- and low-wavelength light signals. To confirm the accuracy, measurements with various amounts of Cerenkov radiation were performed by changing the incident direction while keeping the Ir-192 source-to-dosimeter distance constant, resulting in a fluctuation of <5%. Optical fiber bending was also addressed; no bending effect was observed for our wavelength-separated SOF dosimeter. © The Author 2015. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
Comparison of Frequency-Domain Array Methods for Studying Earthquake Rupture Process
NASA Astrophysics Data System (ADS)
Sheng, Y.; Yin, J.; Yao, H.
2014-12-01
Seismic array methods, in both the time and frequency domains, have been widely used to study the rupture process and energy radiation of earthquakes. With better spatial resolution, high-resolution frequency-domain methods, such as Multiple Signal Classification (MUSIC) (Schmidt, 1986; Meng et al., 2011) and the recently developed Compressive Sensing (CS) technique (Yao et al., 2011, 2013), are revealing new features of earthquake rupture processes. We have performed various tests on the methods of MUSIC, CS, minimum-variance distortionless response (MVDR) Beamforming and conventional Beamforming in order to better understand the advantages and features of these methods for studying earthquake rupture processes. We use the Ricker wavelet to synthesize seismograms and use these frequency-domain techniques to relocate the synthetic sources we set, for instance, two sources separated in space whose waveforms completely overlap in the time domain. We also test the effects of the sliding-window scheme on the recovery of a series of input sources, in particular, some artifacts that are caused by the sliding-window scheme. Based on our tests, we find that CS, which is built on the theory of sparse inversion, has higher spatial resolution than the other frequency-domain methods and performs better at lower frequencies. In high-frequency bands, MUSIC, as well as MVDR Beamforming, is more stable, especially in the multi-source situation. Meanwhile, CS tends to produce more artifacts when data have a poor signal-to-noise ratio. Although these techniques can distinctly improve the spatial resolution, they still produce some artifacts as the time window slides. Furthermore, we propose a new method, which combines both time-domain and frequency-domain techniques, to suppress these artifacts and obtain more reliable earthquake rupture images.
Finally, we apply this new technique to study the 2013 Okhotsk deep mega earthquake in order to better capture the rupture characteristics (e.g., rupture area and velocity) of this earthquake.
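For reference, the MUSIC step applied to exactly the scenario described above, two spatially separated sources whose waveforms fully overlap in time, can be sketched as follows (a uniform linear array and all parameter values are assumed here for illustration):

```python
import numpy as np

def music_spectrum(R, n_sources, scan_deg, d=0.5):
    """MUSIC pseudospectrum for an M-element uniform linear array.

    R         : (M, M) sample covariance matrix of the array snapshots
    n_sources : assumed number of sources
    scan_deg  : candidate arrival angles in degrees
    d         : element spacing in wavelengths

    Steering vectors of true sources are orthogonal to the noise
    subspace, so 1 / ||E_n^H a(theta)||^2 peaks at the source angles,
    even when the source waveforms overlap completely in time.
    """
    M = R.shape[0]
    _, v = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = v[:, : M - n_sources]            # noise-subspace eigenvectors
    spec = np.empty(len(scan_deg))
    for i, theta in enumerate(np.radians(scan_deg)):
        a = np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))
        proj = En.conj().T @ a
        spec[i] = 1.0 / np.real(np.vdot(proj, proj))
    return spec

# Two sources at -20 and +25 degrees, waveforms fully overlapping in time:
rng = np.random.default_rng(3)
M, N = 8, 400
doas = np.radians([-20.0, 25.0])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(M), np.sin(doas)))
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
R = X @ X.conj().T / N
scan = np.arange(-90.0, 90.0, 0.5)
spec = music_spectrum(R, 2, scan)
neg, pos = scan < 0.0, scan >= 0.0
doa1 = scan[neg][np.argmax(spec[neg])]    # near -20 degrees
doa2 = scan[pos][np.argmax(spec[pos])]    # near +25 degrees
```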
NASA Astrophysics Data System (ADS)
Prasad, S.; Bruce, L. M.
2007-04-01
There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. 
The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
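The confidence-based fusion rule described above, weighting each subspace classifier by its training accuracy, can be sketched as follows (the names and numbers are illustrative, not from the study):

```python
import numpy as np

def confidence_weighted_fusion(class_probs, train_accuracies):
    """Decision-level fusion across classifiers, one per band subspace.

    class_probs      : (n_classifiers, n_classes) posteriors for one pixel
    train_accuracies : (n_classifiers,) training accuracy of each
                       classifier, used as its confidence weight

    Returns the fused class index from the accuracy-weighted sum of
    the per-classifier posteriors.
    """
    w = np.asarray(train_accuracies, dtype=float)
    w = w / w.sum()                               # normalize the weights
    fused = w @ np.asarray(class_probs, dtype=float)
    return int(np.argmax(fused))

# Three classifiers, two classes; the most accurate one favors class 1:
probs = [[0.60, 0.40],
         [0.55, 0.45],
         [0.10, 0.90]]
label = confidence_weighted_fusion(probs, train_accuracies=[0.60, 0.55, 0.95])
```

A simple unweighted vote over these three classifiers would be split 2-1 toward class 0; the accuracy weighting lets the most reliable subspace dominate, which is the point of the proposed scheme.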
Lee, Hae-Lee; Kim, Sue-Hee; Ji, Dong-Beom; Kim, Yong-Jun
2009-09-01
The aim of this study was to compare the effects of spermatozoa separation techniques on sperm quality and in-vitro fertilization (IVF) results for cryopreserved bovine semen. Sephadex, glass wool and Percoll gradient separation techniques were used for sperm separation, and sperm motility, morphology and membrane integrity were evaluated before and after separation. Cleavage and blastocyst developmental rates were also investigated after IVF with sperm recovered by each separation technique. The motility of samples obtained by the three separation techniques was greater than that of the control samples (p < 0.05). The percentage of spermatozoa with intact plasma-membrane integrity, identified by 6-carboxyfluorescein diacetate/propidium iodide fluorescent staining and the hypo-osmotic swelling test, was highest in the glass wool filtration samples (p < 0.05). The cleavage and blastocyst rates of total oocytes produced from glass wool filtration samples were also higher than those of the control and Sephadex filtration samples (p < 0.05), but were not significantly different from Percoll separation samples. However, a significantly greater number of cleaved embryos produced by glass wool filtration developed to the blastocyst stage than those produced by Percoll separation (p < 0.05). These results indicate that spermatozoa of good quality can be obtained by all three separation techniques and can be used for bovine IVF. In particular, the results suggest that glass wool filtration would be the most effective of the three methods for improving sperm quality and embryo production for cryopreserved bovine spermatozoa.
Portable microcontroller-based instrument for near-infrared spectroscopy
NASA Astrophysics Data System (ADS)
Giardini, Mario E.; Corti, Mario; Lago, Paolo; Gelmetti, Andrea
2000-05-01
Near-IR spectroscopy (NIRS) can be employed to noninvasively and continuously measure in-vivo local changes in the haemodynamics and oxygenation of human tissues. In particular, the technique can be especially useful for muscular functional monitoring. We present a portable NIRS research-grade acquisition system prototype, strictly dedicated to low-noise measurements during muscular exercise. The prototype is able to control four LED sources and a detector. Such a number of sources allows for multipoint measurements or for multi-wavelength spectroscopy of tissue constituents other than oxygen, such as cytochrome aa3 oxidation. The LEDs and the detector are mounted on separate probes, which also carry the relevant drivers and preamplifiers. By employing surface-mount technologies, probe size and weight are kept to a minimum. A single-chip mixed-signal RISC microcontroller performs source-to-detector multiplexing with a digital correlation technique. The acquired data are stored in an on-board 64 K EEPROM bank and can subsequently be uploaded to a personal computer via serial port for further analysis. The resulting instrument is compact and lightweight. Preliminary tests of the prototype on oxygen consumption during tourniquet-induced forearm ischaemia show adequate detectivity and time response.
Core-shifts and proper-motion constraints in the S5 polar cap sample at the 15 and 43 GHz bands
NASA Astrophysics Data System (ADS)
Abellán, F. J.; Martí-Vidal, I.; Marcaide, J. M.; Guirado, J. C.
2018-06-01
We have studied a complete radio sample of active galactic nuclei with the very-long-baseline-interferometry (VLBI) technique and for the first time successfully obtained high-precision phase-delay astrometry at Q band (43 GHz) from observations acquired in 2010. We have compared our astrometric results with those obtained with the same technique at U band (15 GHz) from data collected in 2000. The differences in source separations among all the source pairs observed in common at the two epochs are compatible at the 1σ level between U and Q bands. With the benefit of quasi-simultaneous U and Q band observations in 2010, we have studied chromatic effects (core-shift) at the radio source cores with three different methods. The magnitudes of the core-shifts are of the same order (about 0.1 mas) for all methods. However, some discrepancies arise in the orientation of the core-shifts determined through the different methods. In some cases these discrepancies are due to insufficient signal for the method used. In others, the discrepancies reflect assumptions of the methods and could be explained by curvatures in the jets and departures from conical jets.
Time-resolved multicolor two-photon excitation fluorescence microscopy of cells and tissues
NASA Astrophysics Data System (ADS)
Zheng, Wei
2014-11-01
Multilabeling, which maps the distribution of different targets, is an indispensable technique in many biochemical and biophysical studies. Two-photon excitation fluorescence (TPEF) microscopy of endogenous fluorophores, combined with conventional fluorescence labeling techniques such as genetically encoded fluorescent proteins (FPs) and fluorescent dye staining, can be a powerful tool for imaging living cells. The challenge, however, is that the excitation and emission wavelengths of these endogenous fluorophores and fluorescent labels are very different: a multi-color ultrafast source is required for the excitation of multiple fluorescent molecules. In this study, we developed a two-photon imaging system with excitations from the pump femtosecond laser and the selected supercontinuum generated in a photonic crystal fiber (PCF). Multiple endogenous fluorophores, fluorescent proteins and fluorescent dyes were excited at their optimal wavelengths simultaneously. A time- and spectral-resolved detection system was used to record the TPEF signals. This detection technique separated the TPEF signals from multiple sources in the time and wavelength domains. Cellular organelles such as the nucleus, mitochondria, microtubules and endoplasmic reticulum were clearly revealed in the TPEF images. The simultaneous imaging of multiple fluorophores in cells will greatly aid the study of sub-cellular compartments and protein localization.
Médioni, R; Asselineau, B; Verrey, B; Trompier, F; Itié, C; Texier, C; Muller, H; Pelcot, G; Clairand, I; Jacquet, X; Pochat, J L
2004-01-01
In criticality accident dosimetry, and more generally for high-dose measurements, special techniques are used to measure separately the gamma-ray and neutron components of the dose. To improve these techniques and to check their dosimetry systems (physical and/or biological), a total of 60 laboratories from 29 countries (the Americas, Europe, Asia) participated in an international intercomparison, which took place in France from 9 to 21 June 2002 at the SILENE reactor in Valduc and at a pure gamma source in Fontenay-aux-Roses. This intercomparison was jointly organised by the IRSN and the CEA with the help of the NEA/OECD and was partly supported by the European Communities. This paper describes the aim of this intercomparison, the techniques used by the participants, and the two radiation sources and their characteristics. The experimental arrangements of the dosemeters for the irradiations in free air or on phantoms are given. Then the dosimetric quantities measured and reported by the participants are summarised, analysed and compared with the reference values. The present paper concerns only the physical dosimetry and essentially the experiments performed at the SILENE facility. The results obtained with the biological dosimetry are published in two other papers in this issue.
Trapp, Oliver
2010-02-12
Highly efficient and sophisticated separation techniques are available to analyze complex compound mixtures with superior sensitivities and selectivities, often enhanced by a 2nd dimension, e.g. another separation technique or spectroscopic and spectrometric techniques. For enantioselective separations, numerous chiral stationary phases (CSPs) exist to cover a broad range of chiral compounds. Despite these advances, enantioselective separations can become very challenging for mixtures of stereolabile constitutional isomers, because on-column interconversion can lead to completely overlapping peak profiles. Typically, multidimensional separation techniques, e.g. multidimensional GC (MDGC) using an achiral 1st separation dimension and transferring selected analytes to a chiral 2nd separation dimension, are the method of choice for such problems. However, this procedure is very time consuming, and only predefined sections of peaks can be transferred by column switching to the second dimension. Here we demonstrate, for stereolabile 1,2-dialkylated diaziridines, a technique to experimentally deconvolute overlapping gas chromatographic elution profiles of constitutional isomers based on multiple-reaction-monitoring MS (MRM-MS). The technique presented here takes advantage of different fragmentation probabilities and pathways to isolate the elution profiles of configurational isomers.
Nanotechnology, resources, and pollution control
NASA Astrophysics Data System (ADS)
Gillett, Stephen L.
1996-09-01
The separation of different kinds of atoms or molecules from each other is a fundamental technological problem. Current techniques of resource extraction, which use the ancient paradigm of the differential partitioning of elements into coexisting phases, are simple but extremely wasteful and require feedstocks (`ores') that are already anomalously enriched. This is impractical for pollution control and desalination, which require extraction of low concentrations; instead, atomistic separation, typically by differential motion through semipermeable membranes, is used. The present application of such membranes is seriously limited, however, mostly because of limitations in their fabrication by conventional bulk techniques. The capabilities of biological systems, such as vertebrate kidneys, are vastly better, largely because they are intrinsically structured at a molecular scale. Nanofabrication of semipermeable membranes promises capabilities on the order of those of biological systems, and this in turn could provide much financial incentive for the development of molecular assemblers, as well-established markets already exist. Continued incentives would exist, moreover, as markets expanded with decreasing costs, leading to such further applications as remediation of polluted sites, cheap desalination, and resource extraction from very low-grade sources.
Al Harrach, M; Afsharipour, B; Boudaoud, S; Carriou, V; Marin, F; Merletti, R
2016-08-01
The Brachialis (BR) lies deep to the Biceps Brachii (BB) in the upper arm. Therefore, the detection of the corresponding surface electromyogram (sEMG) is a complex task. The BR is an important elbow flexor, but it is usually not considered in sEMG-based force estimation. The aim of this study was to attempt to separate the sEMG activities of the BR and the BB by using a high-density sEMG (HD-sEMG) grid placed on the upper arm and the Canonical Correlation Analysis (CCA) technique. For this purpose, we recorded sEMG signals from seven subjects with two 8 × 4 electrode grids placed over the BB and BR. Four isometric voluntary contraction levels were recorded (5, 10, 30 and 50 %MVC) at a 90° elbow angle. Then, using CCA and image-processing tools, the sources of each muscle's activity were separated. Finally, the corresponding sEMG signals were reconstructed using the remaining canonical components in order to retrieve the activity of the BB and the BR muscles.
Liquid rocket engine self-cooled combustion chambers
NASA Technical Reports Server (NTRS)
1977-01-01
Self-cooled combustion chambers are chambers in which the chamber wall temperature is controlled by methods other than fluid flow within the chamber wall supplied from an external source. In such chambers, adiabatic wall temperature may be controlled by use of upstream fluid components such as the injector or a film-coolant ring, or by internal flow of self-contained materials; e.g. pyrolysis gas flow in charring ablators, and the flow of infiltrated liquid metals in porous matrices. Five types of self-cooled chambers are considered in this monograph. The name identifying the chamber is indicative of the method (mechanism) by which the chamber is cooled, as follows: ablative; radiation cooled; internally regenerative (Interegen); heat sink; adiabatic wall. Except for the Interegen and heat sink concepts, each chamber type is discussed separately. A separate and final section of the monograph deals with heat transfer to the chamber wall and treats Stanton number evaluation, film cooling, and film-coolant injection techniques, since these subjects are common to all chamber types. Techniques for analysis of gas film cooling and liquid film cooling are presented.
A field-deployable GC-EI-HRTOF-MS for in situ characterization of volatile organic compounds
NASA Astrophysics Data System (ADS)
Lerner, B. M.; Herndon, S. C.; Yacovitch, T. I.; Roscioli, J. R.; Fortner, E.; Knighton, W. B.; Sueper, D.; Isaacman-VanWertz, G. A.; Jayne, J. T.; Worsnop, D. R.
2017-12-01
Previous authors have demonstrated the value of coupling conventional gas chromatograph (GC) separation techniques with the new generation of electron-impact high-resolution time-of-flight mass spectrometry (EI-HRToF-MS) detectors for the measurement of halocarbons and semi-volatile organic species. Here, we present new instrumentation, analytical techniques and field data from the deployment of a GC-EI-HRToF-MS system in the mini Aerodyne mobile laboratory at sites upwind and downwind of San Antonio, Texas in May 2017. The instrument employed a multi-component adsorbent-trap pre-concentration system followed by single-column separation. We will show results from the field work, including intercomparison with other VOC measurements and characterization of C5-C10 hydrocarbon mixing ratios to distinguish urban and oil/gas emission sources in the sampled air. We will discuss practical aspects of deploying the GC-EI-HRToF-MS in a mobile laboratory and system performance in the field. We will also present further development of Aerodyne's TERN software package for chromatographic data analysis for processing HRToF-MS datasets.
NASA Technical Reports Server (NTRS)
Dutta, Soumyo; Braun, Robert D.; Russell, Ryan P.; Clark, Ian G.; Striepe, Scott A.
2012-01-01
Flight data from an entry, descent, and landing (EDL) sequence can be used to reconstruct the vehicle's trajectory, its aerodynamic coefficients, and the atmospheric profile experienced by the vehicle. Past Mars missions have carried instruments that do not provide direct measurement of the freestream atmospheric conditions, so the uncertainties in the atmospheric reconstruction and in the aerodynamic database knowledge could not be separated. The upcoming Mars Science Laboratory (MSL) will take measurements of the pressure distribution on the aeroshell forebody during entry, allowing the freestream atmospheric conditions to be partially observable. These data provide a means to separate atmospheric and aerodynamic uncertainties and are part of the MSL EDL Instrumentation (MEDLI) project. Methods to estimate the flight performance statistically using on-board measurements are demonstrated here through the use of simulated Mars data. Different statistical estimators are used to demonstrate which estimator best quantifies the uncertainties in the flight parameters. The techniques demonstrated herein are planned for application to the MSL flight dataset after the spacecraft lands on Mars in August 2012.
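As an illustration of the kind of statistical estimator compared in such work, here is a minimal linear Kalman filter on a toy one-dimensional descent. The dynamics, noise levels and variable names are hypothetical and far simpler than the MSL formulation; the point is only how a recursive estimator fuses a dynamics model with noisy on-board measurements.

```python
import numpy as np

def kalman_1d(zs, dt=0.1, q=0.5, r=25.0):
    """Minimal constant-velocity Kalman filter: state = [altitude, rate]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
    H = np.array([[1.0, 0.0]])                 # only altitude is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])        # process-noise covariance
    R = np.array([[r]])                        # measurement-noise covariance
    x = np.array([zs[0], 0.0])
    P = np.eye(2) * 100.0
    out = []
    for z in zs:
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (np.array([z]) - H @ x)    # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

rng = np.random.default_rng(1)
t = np.arange(0, 20, 0.1)
truth = 2000.0 - 40.0 * t                      # toy linear descent profile
zs = truth + rng.normal(0, 5.0, t.size)        # noisy altimeter readings
est = kalman_1d(zs)
```

After the initial transient, the filtered altitude should track the truth more closely than the raw measurements do.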
NASA Astrophysics Data System (ADS)
Pieper, Michael; Manolakis, Dimitris; Truslow, Eric; Cooley, Thomas; Brueggeman, Michael; Jacobson, John; Weisner, Andrew
2017-08-01
Accurate estimation or retrieval of surface emissivity from long-wave infrared or thermal infrared (TIR) hyperspectral imaging data acquired by airborne or spaceborne sensors is necessary for many scientific and defense applications. This process consists of two interwoven steps: atmospheric compensation and temperature-emissivity separation (TES). The most widely used TES algorithms for hyperspectral imaging data assume that the emissivity spectra for solids are smooth compared to the atmospheric transmission function. We develop a model to explain and evaluate the performance of TES algorithms using a smoothing approach. Based on this model, we identify three sources of error: the smoothing error of the emissivity spectrum, the emissivity error from using the incorrect temperature, and the errors caused by sensor noise. For each TES smoothing technique, we analyze the bias and variability of the temperature errors, which translate to emissivity errors. The performance model explains how the errors interact to generate temperature errors. Since we assume exact knowledge of the atmosphere, the presented results provide an upper bound on the performance of TES algorithms based on the smoothness assumption.
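A minimal sketch of the smoothness-based TES idea, under the simplifying assumption of perfect atmospheric compensation apart from a reflected downwelling term; the spectra, line positions and temperature grid are synthetic, not from the paper. For each candidate temperature, invert for emissivity and keep the temperature whose emissivity spectrum is smoothest: at the wrong temperature, residual sharp atmospheric lines appear in the retrieved emissivity.

```python
import numpy as np

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam, T):
    # Spectral radiance at wavelength lam (m) and temperature T (K).
    return 2 * H * C**2 / lam**5 / np.expm1(H * C / (lam * KB * T))

def tes_smooth(lam, L, ldown, t_grid):
    # Invert for emissivity at each candidate T and score its roughness;
    # the true temperature cancels the sharp downwelling lines exactly.
    best_t, best_r = None, np.inf
    for T in t_grid:
        eps = (L - ldown) / (planck(lam, T) - ldown)
        rough = np.sum(np.diff(eps, 2) ** 2)
        if rough < best_r:
            best_t, best_r = T, rough
    return best_t

# Synthetic LWIR scene: smooth emissivity at 300 K, downwelling sky
# radiance with two artificially sharp lines (all values hypothetical).
lam = np.linspace(8e-6, 13e-6, 200)
true_eps = 0.95 + 0.03 * np.sin(2 * np.pi * (lam - 8e-6) / 5e-6)
ldown = 0.3 * planck(lam, 260.0) \
    + 2e6 * np.exp(-((lam - 9.0e-6) / 3e-8) ** 2) \
    + 2e6 * np.exp(-((lam - 11.5e-6) / 3e-8) ** 2)
L = true_eps * planck(lam, 300.0) + (1 - true_eps) * ldown

t_hat = tes_smooth(lam, L, ldown, np.arange(290.0, 310.5, 0.5))
```

The roughness penalty (summed squared second differences) is one common choice; the paper's error model describes how smoothing error, temperature error and sensor noise interact in exactly this kind of retrieval.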
NASA Astrophysics Data System (ADS)
Pfeifer, Thorben; Janzen, Rasmus; Steingrobe, Tobias; Sperling, Michael; Franze, Bastian; Engelhard, Carsten; Buscher, Wolfgang
2012-10-01
A novel ion source/sampling cone device for inductively coupled plasma mass spectrometry (ICP-MS), especially when operated in hyphenated mode as a detection system coupled with different separation modules, is presented. Its technical setup is described in detail. Its main feature is the very low total argon consumption of less than 1.5 L min−1, leading to a significant reduction of operational costs, especially when time-consuming speciation analysis is performed. The figures of merit of the new system with respect to sensitivity, detection power, long-term stability and working range were explored. Despite the profound difference in argon consumption between the new system and the conventional ICP-MS system, many of the characteristic features of the conventional ICP-MS could be maintained to a great extent. To demonstrate the ion source's capabilities, it was used as an element-selective detector for gas chromatography (GC) and high-performance liquid chromatography (HPLC), where organic compounds of mercury and cobalt, respectively, were separated and detected with the new low-flow ICP-MS detection system. The corresponding chromatograms are shown. The applicability for trace element analysis has been validated with the certified reference material NIST 1643e.
Exemplar-Based Image Inpainting Using a Modified Priority Definition.
Deng, Liang-Jian; Huang, Ting-Zhu; Zhao, Xi-Le
2015-01-01
Exemplar-based algorithms are a popular technique for image inpainting. They have two main phases: deciding the filling-in order and selecting good exemplars. Traditional exemplar-based algorithms search for suitable patches in source regions to fill in the missing parts, but they face a problem: improper selection of exemplars. To address this problem, we introduce an independent strategy by investigating the process of patch propagation in this paper. We first define a new separated priority definition to propagate geometry and then synthesize image textures, aiming to recover image geometry and textures well. In addition, an automatic algorithm is designed to estimate the steps for the new separated priority definition. Compared with some competitive approaches, the new priority definition can recover image geometry and textures well.
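For context, the classical exemplar-based priority that this line of work modifies multiplies a confidence term by a data term, P(p) = C(p)·D(p), evaluated on the fill-front. The sketch below computes such priorities on a toy image; the patch size, the crude gradient data term and the test image are hypothetical simplifications, not the paper's separated definition.

```python
import numpy as np

def priorities(mask, image, half=4):
    # Criminisi-style filling priority on the fill-front:
    # P(p) = C(p) * D(p), confidence times a simple edge-strength term.
    known = ~mask
    gy, gx = np.gradient(np.where(known, image, 0.0))
    edge = np.hypot(gx, gy)
    out = {}
    H, W = mask.shape
    for y in range(H):
        for x in range(W):
            if not mask[y, x]:
                continue
            # fill-front: unknown pixel with at least one known neighbour
            y0, y1 = max(0, y - 1), min(H, y + 2)
            x0, x1 = max(0, x - 1), min(W, x + 2)
            if not known[y0:y1, x0:x1].any():
                continue
            py0, py1 = max(0, y - half), min(H, y + half + 1)
            px0, px1 = max(0, x - half), min(W, x + half + 1)
            C = known[py0:py1, px0:px1].mean()   # confidence term
            D = edge[py0:py1, px0:px1].max()     # crude data term
            out[(y, x)] = C * D
    return out

img = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))  # horizontal ramp image
hole = np.zeros((32, 32), bool)
hole[10:20, 10:20] = True                          # 10x10 missing region
P = priorities(hole, img)
```

The highest-priority front pixel is filled first; the paper's contribution is to decouple the geometry-propagation and texture-synthesis roles that are entangled in this product form.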
Beam position monitor gate functionality implementation and applications
Cheng, Weixing; Ha, Kiman; Li, Yongjun; ...
2018-06-14
We introduce a novel technique to implement gate functionality for the beam position monitors (BPMs) at the National Synchrotron Light Source II (NSLS-II). The functionality, now implemented in an FPGA, allows us to acquire synchronized turn-by-turn (TBT) data from two separated bunch trains simultaneously with the NSLS-II in-house developed BPM system. The gated position resolution is improved by about 3 times by narrowing the sampling width. Experimentally, we demonstrated that the machine lattice could be transparently characterized with the gated TBT data of a short diagnostic bunch train (Cheng et al., 2017; Li et al., 2017). Other applications, such as precise characterization of the storage-ring impedance/wake-field through recording the beam positions of two separated bunch trains, have also been experimentally demonstrated.
Primary Energy Efficiency Analysis of Different Separate Sensible and Latent Cooling Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdelaziz, Omar
2015-01-01
Separate Sensible and Latent Cooling (SSLC) has been discussed in the open literature as a means to improve air-conditioning system efficiency. The main benefit of SSLC is that it enables heat-source optimization for the different forms of load, sensible vs. latent, and as such maximizes the cycle efficiency. In this paper I use a thermodynamic analysis tool to analyse the performance of various SSLC technologies, including a multi-evaporator two-stage compression system, a vapour compression system with heat-activated desiccant dehumidification, and an integrated vapour compression system with desiccant dehumidification. A primary coefficient of performance is defined and used to judge the performance of the different SSLC technologies at the design conditions. Results showed the trade-off in performance for different sensible heat factors and regeneration temperatures.
Abd Aziz, Mohd Aizudin; Md Isa, Khairuddin; Ab Rashid, Radzuwan
2017-06-01
This article aims to provide insights into the factors that contribute to the separation efficiency of solid particles. In this study, a pneumatic jigging technique was used to assess the separation of solid waste materials consisting of copper, glass and rubber insulator. Several initial experiments were carried out to evaluate the strengths and limitations of the technique. It was found that, despite some limitations of the technique, all the samples prepared for the experiments were successfully separated. Follow-up experiments were then carried out to further assess the separation of copper wire and rubber insulator. The effects of air flow and pulse rates on the separation process were examined. The data for these follow-up experiments were analysed using a sink-float analysis technique. The analysis shows that the air flow rate was very important in determining the separation efficiency. However, the separation efficiency may also be influenced by the type of materials used.
NASA Astrophysics Data System (ADS)
Nihill, Kevin John
This thesis details a range of experiments and techniques that use the scattering of atomic beams from surfaces to both characterize a variety of interfaces and harness mass-specific scattering conditions to separate and enrich isotopic components in a mixture of gases. Helium atom scattering has been used to characterize the surface structure and vibrational dynamics of methyl-terminated Ge(111), thereby elucidating the effects of organic termination on a rigid semiconductor interface. Helium atom scattering was employed as a surface-sensitive, non-destructive probe of the surface. By means of elastic gas-surface diffraction, this technique is capable of providing measurements of atomic spacing, step height, average atomic displacement as a function of surface temperature, gas-surface potential well depth, and surface Debye temperature. Inelastic time-of-flight studies provide highly resolved energy exchange measurements between helium atoms and collective lattice vibrations, or phonons; a collection of these measurements across a range of incident kinematic parameters allowed for a thorough mapping of low-energy phonons (e.g., the Rayleigh wave) across the surface Brillouin zone and subsequent comparison with complementary theoretical calculations. The scattering of molecular beams - here, hydrogen and deuterium from methyl-terminated Si(111) - enables the measurement of the anisotropy of the gas-surface interaction potential through rotationally inelastic diffraction (RID), whereby incident atoms can exchange internal energy between translational and rotational modes and diffract into unique angular channels as a result. 
The probabilities of rotational excitation as a function of incident energy and angle were measured and compared with electronic structure and scattering calculations to provide insight into the gas-surface interaction potential and hence the surface charge density distribution, revealing important details regarding the interaction of H2 with an organic-functionalized semiconductor interface. Aside from their use as probes for surface structure and dynamics, atomic beam sources are also demonstrated to enable the efficient separation of gaseous mixtures of isotopes by means of diffraction and differential condensation. In the former method, the kinematic conditions for elastic diffraction result in an incident beam of natural abundance neon diffracting into isotopically distinct angles, resulting in the enrichment of a desired isotope; this purification can be improved by exploiting the difference in arrival times of the two isotopes at a given final angle. In the latter method, the identical incident velocities of coexpanded isotopes lead to minor but important differences in their incident kinetic energies, and thus their probability of adsorbing on a sufficiently cold surface, resulting in preferential condensation of a given isotope that depends on the energy of the incident beam. Both of these isotope separation techniques are made possible by the narrow velocity distribution and velocity seeding effect offered only by high-Mach-number supersonic beam sources. These experiments underscore the utility of supersonically expanded atomic and molecular beam sources as both extraordinarily precise probes of surface structure and dynamics and as a means for high-throughput, non-dissociative isotopic enrichment methods.
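The diffraction-based enrichment relies on the de Broglie wavelength being mass-dependent at fixed velocity: coexpanded isotopes share one velocity, so the heavier isotope has a shorter wavelength and diffracts to a slightly different angle. A small sketch with hypothetical beam and lattice parameters (the velocity, incidence angle and grating period are illustrative, not the thesis values):

```python
import numpy as np

H_PLANCK = 6.62607015e-34   # J s
AMU = 1.66053907e-27        # kg

def diff_angle(mass_amu, v, theta_i_deg, a, n=1):
    # n-th order diffraction from a 1-D grating of period a (m):
    # sin(theta_f) = sin(theta_i) + n * lambda_dB / a, lambda_dB = h / (m v)
    lam = H_PLANCK / (mass_amu * AMU * v)
    s = np.sin(np.radians(theta_i_deg)) + n * lam / a
    return np.degrees(np.arcsin(s))

# Hypothetical numbers: 800 m/s neon beam, 45 deg incidence, 3.5 A period.
t20 = diff_angle(19.992, 800.0, 45.0, 3.5e-10)   # 20Ne
t22 = diff_angle(21.991, 800.0, 45.0, 3.5e-10)   # 22Ne
```

With these parameters the two isotopes emerge roughly half a degree apart, which is the angular separation a detector aperture can exploit for enrichment.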
Effective separation technique for small diameter whiskers.
NASA Technical Reports Server (NTRS)
Westfall, L. J.
1972-01-01
Description of a technique for separating small-diameter whiskers from the as-grown mat by gently agitating the whisker mats in a solution of deionized or distilled water for six to eight hours. High-strength Al2O3 whiskers were effectively separated by this technique, comprising an average of 48% of the original weight of the whisker mat. It is estimated that more than 90% of the separated whiskers had diameters between 0.7 and 2.0 microns.
New diagnostic technique for the study of turbulent boundary-layer separation
NASA Technical Reports Server (NTRS)
Horstman, C. C.; Owen, F. K.
1974-01-01
Description of a diagnostic technique for determining the unsteady character of turbulent boundary-layer separation. The technique uses thin platinum films mounted flush with the model surface. Voltages from these films provide measurements related to the flow character above the film. For illustration, results obtained by this technique are presented for the interaction of a hypersonic shock wave and a turbulent boundary layer, with and without separation.
NASA Astrophysics Data System (ADS)
Bi, Chuan-Xing; Geng, Lin; Zhang, Xiao-Zheng
2016-05-01
In the sound field with multiple non-stationary sources, the measured pressure is the sum of the pressures generated by all sources, and thus cannot be used directly for studying the vibration and sound radiation characteristics of every source alone. This paper proposes a separation model based on the interpolated time-domain equivalent source method (ITDESM) to separate the pressure field belonging to every source from the non-stationary multi-source sound field. In the proposed method, ITDESM is first extended to establish the relationship between the mixed time-dependent pressure and all the equivalent sources distributed on every source with known location and geometry information, and all the equivalent source strengths at each time step are solved by an iterative solving process; then, the corresponding equivalent source strengths of one interested source are used to calculate the pressure field generated by that source alone. Numerical simulation of two baffled circular pistons demonstrates that the proposed method can be effective in separating the non-stationary pressure generated by every source alone in both time and space domains. An experiment with two speakers in a semi-anechoic chamber further evidences the effectiveness of the proposed method.
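A minimal frequency-domain sketch of the equivalent-source separation idea. The paper works in the time domain with an iterative solver (ITDESM); here a single-frequency least-squares version with hypothetical geometry illustrates the core step: solve for all equivalent source strengths from the mixed pressure, then reconstruct the field of one source from its own equivalent sources alone.

```python
import numpy as np

def greens(xm, xs, k):
    # Free-field monopole transfer matrix between equivalent-source
    # points xs (S, 3) and field points xm (M, 3).
    r = np.linalg.norm(xm[:, None, :] - xs[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

rng = np.random.default_rng(2)
k = 2 * np.pi * 2000.0 / 343.0          # wavenumber at 2 kHz in air

# Equivalent sources retreated slightly behind two hypothetical "pistons".
g = np.linspace(-0.05, 0.05, 3)
gx, gy = np.meshgrid(g, g)
src_a = np.c_[gx.ravel(), gy.ravel(), np.full(9, -0.01)]
src_b = src_a + np.array([0.5, 0.0, 0.0])
xs = np.vstack([src_a, src_b])

# Microphone plane above both sources.
xm = np.c_[rng.uniform(-0.2, 0.7, 40), rng.uniform(-0.2, 0.2, 40),
           np.full(40, 0.3)]

q_true = np.r_[rng.normal(size=9), np.zeros(9)]   # only source A radiates
p_meas = greens(xm, xs, k) @ q_true               # mixed measured pressure

# Solve for all equivalent source strengths, then attribute to source A
# only the field radiated by its own equivalent sources.
q_est, *_ = np.linalg.lstsq(greens(xm, xs, k), p_meas, rcond=None)
p_a = greens(xm, xs[:9], k) @ q_est[:9]
```

Since only source A is active in this toy case, the separated field should reproduce the total measured field, which is the consistency check the simulations in the abstract perform for non-stationary signals.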
Photophoretic velocimetry for the characterization of aerosols.
Haisch, Christoph; Kykal, Carsten; Niessner, Reinhard
2008-03-01
Aerosols are particles, in a size range from a few nanometers to a few micrometers, suspended in air or other gases. Their relevance is as varied as their origins and compositions. In the Earth's atmosphere they influence the global radiation balance and affect human health. Artificially produced aerosols are applied, e.g., for drug administration, as paint and print pigments, or in rubber tire production. In all these fields, an exact characterization of single particles as well as of the particle ensemble is essential. Beyond characterization, continuous separation is often required. State-of-the-art separation techniques are based on electrical, thermal, or flow fields. In this work we present an approach that applies light, in the form of photophoretic (PP) forces, for the characterization and separation of aerosol particles according to their optical properties. Such a separation technique would allow, e.g., the separation of organic from inorganic particles of the same aerodynamic size. We present a system which automatically records velocities induced by PP forces and performs a statistical evaluation in order to characterize the particle ensemble properties. The experimental system essentially consists of a flow cell with rectangular cross section (1 cm², length 7 cm), through which the aerosol stream is pumped in the vertical direction at ambient pressure. In the cell, a laser beam is directed orthogonally to the particle flow direction, which results in a lateral displacement of the particles. In an alternative configuration, the beam is directed in the opposite direction to the aerosol flow; hence, the particles are slowed down by the PP force. In either case, the photophoretically induced variations of speed and position are visualized by a second laser illumination and a camera system, feeding a mathematical particle-tracking algorithm. The light source inducing the PP force is a diode laser (λ = 806 nm, P = 0.5 W).
Kim, Jin Yeong; Balderas-Xicohténcatl, Rafael; Zhang, Linda; Kang, Sung Gu; Hirscher, Michael; Oh, Hyunchul; Moon, Hoi Ri
2017-10-25
Deuterium plays a pivotal role in industrial and scientific research and is irreplaceable for various applications such as isotope tracing, neutron moderation, and neutron scattering. In addition, deuterium is a key energy source for fusion reactions. Thus, the isolation of deuterium from a physico-chemically almost identical isotopic mixture is a seminal challenge in modern separation technology. However, current commercial approaches suffer from extremely low separation efficiency (cryogenic distillation, for instance, has a selectivity of only 1.5 at 24 K), so a cost-effective and large-scale separation technique is required. Herein, we report a highly effective hydrogen isotope separation system based on metal-organic frameworks (MOFs), exhibiting the highest reported separation factor, as high as ∼26 at 77 K, by maximizing the synergistic effects of chemical affinity quantum sieving (CAQS) and kinetic quantum sieving (KQS). For this purpose, the MOF-74 system, which has high hydrogen adsorption enthalpies due to strong open metal sites, is chosen for CAQS functionality, and imidazole molecules (IM) are introduced into the system to enhance the KQS effect. To the best of our knowledge, this work is not only the first attempt to implement the two quantum sieving effects, KQS and CAQS, in one system, but it also provides experimental validation of the utility of this system for practical industrial usage by isolating high-purity D2 through direct selective separation studies using 1:1 D2/H2 mixtures.
Wireless Power Transfer for Space Applications
NASA Technical Reports Server (NTRS)
Ramos, Gabriel Vazquez; Yuan, Jiann-Shiun
2011-01-01
This paper introduces an implementation of magnetic-resonance wireless power transfer for space applications. The analysis includes an equivalent-impedance study, loop material characterization, a source/load resonance-coupling technique, and system response behavior under load variability. System characterization is accomplished by carrying out circuit design from analytical equations and simulations using Matlab and SPICE. The theory was validated by a combination of experiments covering loop material considerations, resonance-coupling circuit considerations, electrical load considerations, and a small-scale proof-of-concept prototype. Experimental results show successful wireless power transfer for all the cases studied. The prototype delivered about 4.5 W of power to the load at a separation of ~5 cm from the source using a power amplifier rated for 7 W.
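The resonance-coupling behavior can be sketched with two identical magnetically coupled series-resonant loops. The component values and drive voltage below are hypothetical, chosen only to show the characteristic behavior of the load power as the coupling coefficient varies around critical coupling:

```python
import numpy as np

def load_power(f, k, Vs=10.0, L=24e-6, C=44e-9, R=0.5, RL=10.0):
    # Two identical magnetically coupled series-resonant loops with
    # mutual inductance M = k * L, driven by Vs at frequency f; the
    # 2x2 impedance matrix couples primary and secondary currents.
    w = 2 * np.pi * f
    M = k * L
    Z = np.array([[R + 1j * (w * L - 1 / (w * C)), 1j * w * M],
                  [1j * w * M, R + RL + 1j * (w * L - 1 / (w * C))]])
    i1, i2 = np.linalg.solve(Z, np.array([Vs, 0.0]))
    return RL * abs(i2) ** 2        # power dissipated in the load

f0 = 1 / (2 * np.pi * np.sqrt(24e-6 * 44e-9))   # ~155 kHz resonance
powers = [load_power(f0, k) for k in (0.01, 0.05, 0.2, 0.6)]
```

At resonance the reactances cancel and the load power peaks near the critical coupling k where (ωM)² = R(R + RL); both under- and over-coupling deliver less power, which is why source/load coupling must be tuned as the abstract describes.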
Low-density InP-based quantum dots emitting around the 1.5 μm telecom wavelength range
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yacob, M.; Reithmaier, J. P.; Benyoucef, M., E-mail: m.benyoucef@physik.uni-kassel.de
The authors report on low-density InAs quantum dots (QDs) grown on AlGaInAs surfaces lattice-matched to InP using post-growth annealing by solid-source molecular beam epitaxy. Clearly spatially separated QDs with a dot density of about 5 × 10⁸ cm⁻² are obtained by using a special capping technique after the dot formation process. High-resolution micro-photoluminescence performed on optimized QD structures grown on a distributed Bragg reflector exhibits single-QD emission around 1.5 μm with narrow excitonic linewidths below 50 μeV, suitable for use as a single-photon source in the telecom wavelength range.
A First Look at the DGEN380 Engine Acoustic Data from a Core-Noise Perspective
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2015-01-01
This work is a first look at acoustic data acquired in the NASA Glenn Research Center Aero-Acoustic Propulsion Laboratory using the Price Induction DGEN380 small turbofan engine, with particular emphasis on broadband combustor (core) noise. Combustor noise is detected by using a two-signal source separation technique employing one engine-internal sensor and one semi-far-field microphone. Combustor noise is an important core-noise component and is likely to become a more prominent contributor to overall airport community noise due to turbofan design trends, expected aircraft configuration changes, and advances in fan-noise-mitigation techniques. This work was carried out under the NASA Fundamental Aeronautics Program, Fixed Wing Project, Quiet Performance Subproject.
NASA Technical Reports Server (NTRS)
Laird, Philip
1992-01-01
We distinguish static and dynamic optimization of programs: whereas static optimization modifies a program before runtime and is based only on its syntactical structure, dynamic optimization is based on the statistical properties of the input source and examples of program execution. Explanation-based generalization is a commonly used dynamic optimization method, but its effectiveness as a speedup-learning method is limited, in part because it fails to separate the learning process from the program transformation process. This paper describes a dynamic optimization technique called a learn-optimize cycle that first uses a learning element to uncover predictable patterns in the program execution and then uses an optimization algorithm to map these patterns into beneficial transformations. The technique has been used successfully for dynamic optimization of pure Prolog.
Joint Blind Source Separation by Multi-set Canonical Correlation Analysis
Li, Yi-Ou; Adalı, Tülay; Wang, Wei; Calhoun, Vince D
2009-01-01
In this work, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multi-set canonical correlation analysis (M-CCA) [1]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions, and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by M-CCA through maximization of correlation among the extracted sources. We compare source separation performance of the M-CCA scheme with other joint BSS methods and demonstrate the superior performance of the M-CCA scheme in achieving joint BSS for a large number of datasets, group of corresponding sources with heterogeneous correlation values, and complex-valued sources with circular and non-circular distributions. We apply M-CCA to analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. PMID:20221319
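For two datasets, the correlation maximization at the heart of M-CCA reduces to classical CCA, which can be sketched as whitening each dataset and taking an SVD of the whitened cross-covariance. This is a simplified two-dataset illustration under a noise-free linear mixing assumption, not the authors' full multi-set algorithm; the function name `cca_joint_bss` is hypothetical.

```python
import numpy as np

def cca_joint_bss(X, Y):
    """Extract maximally correlated component pairs from two datasets
    (channels x samples) via CCA, the two-dataset case of M-CCA."""
    X = X - X.mean(1, keepdims=True)
    Y = Y - Y.mean(1, keepdims=True)

    def whiten(Z):
        n = Z.shape[1]
        U, s, _ = np.linalg.svd(Z, full_matrices=False)
        W = np.sqrt(n) * (U / s).T        # whitening map: cov(W @ Z) = I
        return W @ Z

    Xw, Yw = whiten(X), whiten(Y)
    # SVD of the whitened cross-covariance: singular values are the
    # canonical correlations, singular vectors give the extraction maps.
    U, corr, Vt = np.linalg.svd(Xw @ Yw.T / X.shape[1])
    Sx = U.T @ Xw                         # canonical variates of X
    Sy = Vt @ Yw                          # canonical variates of Y
    return Sx, Sy, corr
```

When both datasets are exact mixtures of the same latent sources, the canonical correlations approach one; heterogeneous correlation values across datasets are what the full M-CCA scheme is designed to exploit.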
[Detection of Heart Rate of Fetal ECG Based on STFT and BSS].
Wang, Xu; Cai, Kun
2016-01-01
Fetal heart-rate variation reflects the regulatory function of the circulatory system and the central nervous system, so detecting the fetal heart rate in the perinatal period is clinically significant. This paper presents a fetal heart-rate detection method based on the short-time Fourier transform (STFT) and blind source separation. First, the mixed ECG signal is preprocessed, and wavelet-transform denoising is used to separate a noisy fetal ECG signal from the mixture. The STFT and blind separation are then applied, correlation coefficients are calculated, and the independent component with the strongest correlation to the original signal is selected for fetal ECG (FECG) R-peak detection, from which the instantaneous fetal heart rate is computed. Experimental results show that the method improves the detection rate of the FECG R-peaks and locates them with high accuracy even at a low signal-to-noise ratio.
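A minimal sketch of this kind of pipeline follows, with FastICA standing in for the paper's STFT-assisted separation step and a simple correlation rule for component selection. The function name, the reference-signal argument, and all thresholds are illustrative assumptions, not the paper's method.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.decomposition import FastICA

def extract_fetal_hr(mixed, fs, reference):
    """Separate a fetal-ECG-like component from a channel x sample mixture,
    pick it by correlation with a reference, and estimate heart rate."""
    ica = FastICA(n_components=mixed.shape[0], random_state=0)
    comps = ica.fit_transform(mixed.T).T                  # components x samples
    # Select the independent component most correlated with the reference.
    corr = [abs(np.corrcoef(c, reference)[0, 1]) for c in comps]
    fecg = comps[int(np.argmax(corr))]
    fecg = fecg * np.sign(np.corrcoef(fecg, reference)[0, 1])  # fix ICA sign
    # R-peak detection on the selected component.
    peaks, _ = find_peaks(fecg, height=0.5 * fecg.max(),
                          distance=int(0.25 * fs))
    rr = np.diff(peaks) / fs                              # R-R intervals (s)
    return 60.0 / rr.mean(), fecg                         # heart rate (bpm)
```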
Rapid fusion method for the determination of Pu, Np, and Am in large soil samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2015-02-14
A new rapid sodium hydroxide fusion method for the preparation of 10-20 g soil samples has been developed by the Savannah River National Laboratory (SRNL). The method enables lower detection limits for plutonium, neptunium, and americium in environmental soil samples. The method also significantly reduces sample processing time and acid fume generation compared to traditional soil digestion techniques using hydrofluoric acid. Ten gram soil aliquots can be ashed and fused using the new method in 1-2 hours, completely dissolving samples, including refractory particles. Pu, Np and Am are separated using stacked 2 mL cartridges of TEVA and DGA Resin and measured using alpha spectrometry. The method can be adapted for measurement by inductively-coupled plasma mass spectrometry (ICP-MS). Two 10 g aliquots of fused soil may be combined prior to chromatographic separations to further improve detection limits. Total sample preparation time, including chromatographic separations and alpha spectrometry source preparation, is less than 8 hours.
Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems
Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao
2016-01-01
In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple input and multiple output (VMIMO) system. In real applications, the signal that the BS receives is a distributed source because of the scattering, reflection, diffraction and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional estimating signal parameters via the rotational invariance technique (ESPRIT)-based algorithm is valid only for one-dimensional (1D) DOA estimation for the ID source. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used for estimating the eigenvalues of the two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to traditional 2D DOA estimation algorithms, the proposed algorithm imposes significantly lower computational complexity. The intersecting point of two rays, which come from two different directions measured by two uniform rectangular arrays (URA), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm. PMID:26985896
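SOBI works by jointly diagonalizing several time-lagged covariance matrices of the whitened data. Its single-lag special case (the AMUSE algorithm) already conveys the idea and fits in a few lines; the sketch below assumes sources with distinct lagged autocovariances and is not the paper's implementation.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE: single-lag simplification of SOBI.
    X is channels x samples; returns estimated sources (same shape)."""
    X = X - X.mean(1, keepdims=True)
    # Whitening: decorrelate and normalize the sensor signals.
    d, E = np.linalg.eigh(np.cov(X))
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = W @ X
    # Symmetrized lagged covariance; its eigenvectors give the remaining
    # rotation, provided its eigenvalues (source autocovariances) differ.
    C = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    C = (C + C.T) / 2
    _, V = np.linalg.eigh(C)
    return V.T @ Z
```

Full SOBI improves on this by averaging the diagonalization over many lags, which makes it robust when a single lag does not discriminate all sources.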
Viner, Tabitha C; Kagan, Rebecca A; Johnson, Jennifer L
2014-01-01
Mortality due to electrical injury in wildlife may occur in the form of lightning strike or power line contact. Evidence of electrical contact may be grossly obvious, with extensive singeing, curling, and blackening of feathers, fur, or skin. Occasionally, changes may be subtle, owing to lower current or reduced conductivity, making a definitive diagnosis of electrocution more difficult. We describe the use of an alternate light source in the examination of cases of lightning strike and power line contact in wildlife, and the enhanced detection of changes due to electrical currents in the hair and feathers of affected animals. Subtle changes in the wing feathers of 12 snow geese and in the fur of 1 wolf, struck in separate lightning events, were made obvious by the use of an alternate light source. Similarly, this technique can be used to strengthen the evidence for power line exposure in birds. Published by Elsevier Ireland Ltd.
Verification of high efficient broad beam cold cathode ion source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel Reheem, A. M., E-mail: amreheem2009@yahoo.com; Radiation Physics Department, National Center for Radiation Research and Technology; Ahmed, M. M.
2016-08-15
An improved form of cold cathode ion source has been designed and constructed. It consists of a stainless steel hollow cylindrical anode and a stainless steel cathode disc, which are separated by a Teflon flange. The electrical discharge and output characteristics have been measured at different pressures using argon, nitrogen, and oxygen gases. The ion exit aperture shape and the optimum distance between the ion collector plate and the cathode disc are studied. Stable discharge current and maximum output ion beam current have been obtained using a grid exit aperture. It was found that the optimum distance between the ion collector plate and the ion exit aperture is 6.25 cm. The cold cathode ion source was used to deposit an aluminum coating layer on AZ31 magnesium alloy using an argon ion beam current of 600 μA. Scanning electron microscopy and X-ray diffraction techniques were used to characterize the samples before and after aluminum deposition.
Adaptive Sparse Representation for Source Localization with Gain/Phase Errors
Sun, Ke; Liu, Yimin; Meng, Huadong; Wang, Xiqin
2011-01-01
Sparse representation (SR) algorithms can be implemented for high-resolution direction of arrival (DOA) estimation. Additionally, SR can effectively separate coherent signal sources because the spectrum estimation is based on an optimization technique, such as L1-norm minimization, rather than on subspace orthogonality. However, in an actual source localization scenario, an unknown gain/phase error between the array sensors is inevitable. Due to this nonideal factor, the predefined overcomplete basis mismatches the actual array manifold, degrading the estimation performance of SR. In this paper, an adaptive SR algorithm is proposed to improve robustness with respect to the gain/phase error, where the overcomplete basis is dynamically adjusted using multiple snapshots and the sparse solution is adaptively acquired to match the actual scenario. The simulation results demonstrate the estimation robustness to the gain/phase error using the proposed method. PMID:22163875
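The L1-based spectrum estimation mentioned here can be sketched with a plain ISTA (iterative shrinkage-thresholding) solver over an overcomplete steering dictionary. The half-wavelength uniform linear array, the angle grid, and an error-free (ideal) dictionary are assumptions for illustration; the paper's contribution is precisely the adaptation of the dictionary when gain/phase errors are present.

```python
import numpy as np

def sparse_doa(y, M, grid_deg, lam=0.1, n_iter=500):
    """Single-snapshot sparse DOA spectrum via L1-regularized least squares
    (ISTA). y: M-sensor snapshot; returns |x| over the angle grid."""
    theta = np.deg2rad(grid_deg)
    # Overcomplete steering dictionary for a half-wavelength ULA (M x G).
    A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(theta)))
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of gradient
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        g = x + A.conj().T @ (y - A @ x) / L      # gradient step
        mag = np.abs(g)
        # Complex soft-thresholding: shrink magnitudes by lam/L.
        x = np.where(mag > lam / L, (1 - lam / (L * mag + 1e-12)) * g, 0)
    return np.abs(x)
```

Because the spectrum comes from an optimization rather than subspace orthogonality, coherent sources do not break the estimator, which is the property the abstract highlights.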
Chromatographic Techniques for Rare Earth Elements Analysis
NASA Astrophysics Data System (ADS)
Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin
2017-04-01
The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.
Ghost interactions in MEG/EEG source space: A note of caution on inter-areal coupling measures.
Palva, J Matias; Wang, Sheng H; Palva, Satu; Zhigalov, Alexander; Monto, Simo; Brookes, Matthew J; Schoffelen, Jan-Mathijs; Jerbi, Karim
2018-06-01
When combined with source modeling, magneto- (MEG) and electroencephalography (EEG) can be used to study long-range interactions among cortical processes non-invasively. Estimation of such inter-areal connectivity is nevertheless hindered by instantaneous field spread and volume conduction, which artificially introduce linear correlations and impair source separability in cortical current estimates. To overcome the inflating effects of linear source mixing inherent to standard interaction measures, alternative phase- and amplitude-correlation based connectivity measures, such as imaginary coherence and orthogonalized amplitude correlation have been proposed. Being by definition insensitive to zero-lag correlations, these techniques have become increasingly popular in the identification of correlations that cannot be attributed to field spread or volume conduction. We show here, however, that while these measures are immune to the direct effects of linear mixing, they may still reveal large numbers of spurious false positive connections through field spread in the vicinity of true interactions. This fundamental problem affects both region-of-interest-based analyses and all-to-all connectome mappings. Most importantly, beyond defining and illustrating the problem of spurious, or "ghost" interactions, we provide a rigorous quantification of this effect through extensive simulations. Additionally, we further show that signal mixing also significantly limits the separability of neuronal phase and amplitude correlations. We conclude that spurious correlations must be carefully considered in connectivity analyses in MEG/EEG source space even when using measures that are immune to zero-lag correlations. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
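The zero-lag insensitivity of imaginary coherence, and the way genuinely lagged coupling still registers, can be demonstrated in a few lines with a Welch-based estimate. The signals, mixing weights, and parameters below are illustrative, not from the study's simulations.

```python
import numpy as np
from scipy.signal import csd, welch

def imag_coherence(x, y, fs, nperseg=256):
    """Imaginary part of coherency: insensitive to instantaneous
    (zero-lag) linear mixing, as used in MEG/EEG connectivity."""
    f, Sxy = csd(x, y, fs=fs, nperseg=nperseg)     # cross-spectrum
    _, Sxx = welch(x, fs=fs, nperseg=nperseg)      # auto-spectra
    _, Syy = welch(y, fs=fs, nperseg=nperseg)
    return f, np.imag(Sxy / np.sqrt(Sxx * Syy))
```

With two channels that share a source instantaneously (pure field spread), the imaginary coherence hovers near zero; introduce a time lag between the channels and it rises, which is the behavior these measures exploit. The "ghost interaction" problem described above arises because mixing in the *vicinity* of a true lagged interaction also inherits a nonzero lag.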
Multiple-component Decomposition from Millimeter Single-channel Data
NASA Astrophysics Data System (ADS)
Rodríguez-Montoya, Iván; Sánchez-Argüelles, David; Aretxaga, Itziar; Bertone, Emanuele; Chávez-Dagostino, Miguel; Hughes, David H.; Montaña, Alfredo; Wilson, Grant W.; Zeballos, Milagros
2018-03-01
We present an implementation of a blind source separation algorithm to remove foregrounds from millimeter surveys made by single-channel instruments. In order to make such a decomposition over single-wavelength data possible, we generate levels of artificial redundancy, then perform a blind decomposition, calibrate the resulting maps, and lastly measure physical information. We simulate the reduction pipeline using mock data: atmospheric fluctuations, extended astrophysical foregrounds, and point-like sources, but we apply the same methodology to the Aztronomical Thermal Emission Camera/ASTE survey of the Great Observatories Origins Deep Survey–South (GOODS-S). In both applications, our technique robustly decomposes redundant maps into their underlying components, reducing flux bias, improving signal-to-noise ratio, and minimizing information loss. In particular, GOODS-S is decomposed into four independent physical components: one of them is the already-known map of point sources, two are atmospheric and systematic foregrounds, and the fourth component is an extended emission that can be interpreted as the confusion background of faint sources.
NASA Astrophysics Data System (ADS)
Haga, Ken-ichi; Kamiya, Yuusuke; Tokumitsu, Eisuke
2018-02-01
We report on a new fabrication process for thin-film transistors (TFTs) with a new structure and a new operation principle. In this process, the channel and the source/drain electrodes are formed simultaneously from the same oxide material in a single nano-rheology printing (n-RP) step, without any conventional lithography. N-RP is a direct thermal imprint technique that deforms an oxide precursor gel. To reduce the source/drain resistance, the material common to the channel and electrodes is conductive indium tin oxide (ITO). The gate insulator is made of a ferroelectric material whose high charge density can deplete the channel of the thin ITO film, which realizes the proposed operation principle. First, we examined the n-RP conditions required for channel and source/drain patterning and found that the patterning properties are strongly affected by the cooling rate before separating the mold. Second, we fabricated the TFTs as proposed and confirmed their operation.
Zakaria, Ammar; Shakaff, Ali Yeon Md.; Adom, Abdul Hamid; Ahmad, Mohd Noor; Masnan, Maz Jamilah; Aziz, Abdul Hallis Abdul; Fikri, Nazifah Ahmad; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah
2010-01-01
An improved classification of Orthosiphon stamineus using a data fusion technique is presented. Five different commercial sources along with freshly prepared samples were discriminated using an electronic nose (e-nose) and an electronic tongue (e-tongue). Samples from the different commercial brands were evaluated by the e-tongue and then followed by the e-nose. Applying Principal Component Analysis (PCA) separately on the respective e-tongue and e-nose data, only five distinct groups were projected. However, by employing a low level data fusion technique, six distinct groupings were achieved. Hence, this technique can enhance the ability of PCA to analyze the complex samples of Orthosiphon stamineus. Linear Discriminant Analysis (LDA) was then used to further validate and classify the samples. It was found that the LDA performance was also improved when the responses from the e-nose and e-tongue were fused together. PMID:22163381
DOE Office of Scientific and Technical Information (OSTI.GOV)
Counselman, C.C. III
1973-09-01
Very-long-baseline interferometry (VLBI) techniques have already been used to determine the vector separations between antennas thousands of kilometers apart to within 2 m and the directions of extragalactic radio sources to 0.1'', and to track an artificial satellite of the Earth and the Apollo Lunar Rover on the surface of the Moon. The relative locations of the Apollo Lunar Surface Experiment Package (ALSEP) transmitters on the lunar surface are being measured to within 1 m, and the Moon's libration is being measured to 1'' of selenocentric arc. Attempts are under way to measure the solar gravitational deflection of radio waves more accurately than previously possible by means of VLBI. A wide variety of scientific problems is being attacked by VLBI techniques, which may soon be two orders of magnitude more accurate than at present.
Source separation of household waste: a case study in China.
Zhuang, Ying; Wu, Song-Wei; Wang, Yun-Long; Wu, Wei-Xiang; Chen, Ying-Xu
2008-01-01
A pilot program concerning source separation of household waste was launched in Hangzhou, capital city of Zhejiang province, China. Detailed investigations on the composition and properties of household waste in the experimental communities revealed that high water content and high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper, from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, according to which household waste was classified as food waste, dry waste and harmful waste, was proposed and performed in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were established to promote source separation activity. Performance data and questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system of household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and used by other cities in China as a source of reference.
Recent advances in microparticle continuous separation.
Kersaudy-Kerhoas, M; Dhariwal, R; Desmulliez, M P Y
2008-03-01
Recent advances in microparticle separation in continuous flow are presented. It is intended for scientists in the field of separation science in biology, chemistry and microsystems engineering. Recent techniques of micron-sized particle separation within microsystems are described with emphasis on five different categories: optical, magnetic, fluidic-only, electrical and minor separation methods. Examples from the growing literature are explained with insights on separation efficiency and microengineering challenges. Current applications of the techniques are discussed.
Zuckerman, Scott L; Laufer, Ilya; Sahgal, Arjun; Yamada, Yoshiya J; Schmidt, Meic H; Chou, Dean; Shin, John H; Kumar, Naresh; Sciubba, Daniel M
2016-10-15
Systematic review. The aim of this study was to review the techniques, indications, and outcomes of minimally invasive surgery (MIS) and separation surgery with subsequent radiosurgery in the treatment of patients with metastatic spine disease. The utilization of MIS techniques in patients with spine metastases is a growing area within spinal oncology. Separation surgery represents a novel paradigm where radiosurgery provides long-term control after tumor is surgically separated from the neural elements. PubMed, Embase, and CINAHL databases were systematically queried for literature reporting MIS techniques or separation surgery in patients with metastatic spine disease. PRISMA guidelines were followed. Of the initial 983 articles found, 29 met inclusion criteria. Twenty-five articles discussed MIS techniques and were grouped according to the primary objective: percutaneous stabilization (8), tubular retractors (4), mini-open approach (8), and thoracoscopy/endoscopy (5). The remaining 4 studies reported separation surgery. Indications were similar across all studies and included patients with instability, refractory pain, or neurologic compromise. Intraoperative variables, outcomes, and complications were similar in MIS studies compared to traditional approaches, and some MIS studies showed a statistically significant improvement in outcomes. Studies of mini-open techniques had the strongest evidence for superiority. Low-quality evidence currently exists for MIS techniques and separation surgery in the treatment of metastatic spine disease. Given the early promising results, the next iteration of research should include higher-quality studies with sufficient power, and will be able to provide higher-level evidence on the outcomes of MIS approaches and separation surgery.
Capillary electrophoresis of inorganic anions.
Kaniansky, D; Masár, M; Marák, J; Bodor, R
1999-02-26
This review deals with the separation mechanisms applied to the separation of inorganic anions by capillary electrophoresis (CE) techniques. It covers various CE techniques that are suitable for the separation and/or determination of inorganic anions in various matrices, including capillary zone electrophoresis, micellar electrokinetic chromatography, electrochromatography and capillary isotachophoresis. Detection and sample preparation techniques used in CE separations are also reviewed. An extensive part of this review deals with applications of CE techniques in various fields (environmental, food and plant materials, biological and biomedical, technical materials and industrial processes). Attention is paid to speciation of anions of arsenic, selenium, chromium, phosphorus, sulfur and halogen elements by CE.
Song, Hyung Keun; Yoo, Je Hyun; Byun, Young Soo; Yang, Kyu Hyun
2014-05-01
Among patients over 50 years of age, separate vertical wiring alone may be insufficient for fixation of fractures of the inferior pole of the patella. Therefore, mechanical and clinical studies were performed in patients over the age of 50 to test the strength of augmenting separate vertical wiring with cerclage wire (the combined technique). Multiple osteotomies were performed to create four-part fractures in the inferior poles of eight pairs of cadaveric patellae. One patella from each pair was fixed with the separate vertical wiring technique, while the other was fixed with the combined technique. The ultimate load to failure and stiffness of each fixation were subsequently measured. In a clinical study of 21 patients (average age, 64 years), comminuted fractures of the inferior pole of the patella were treated using the combined technique. Operative parameters were recorded, and post-operative outcomes were evaluated. For the cadaveric patellae (mean donor age, 69 years), the mean ultimate loads to failure for the separate vertical wiring technique and the combined technique were 216.4±72.4 N and 324.9±50.6 N, respectively (p=0.012). The mean stiffness for the separate vertical wiring technique and the combined technique was 241.1±68.5 N/mm and 340.8±45.3 N/mm, respectively (p=0.012). In the clinical study, the mean clinical score at final follow-up was 28.1 points. Augmentation of separate vertical wiring with cerclage wire provides enough strength for protected early exercise of the knee joint and uneventful healing.
NASA Technical Reports Server (NTRS)
Fox, T. A.
1973-01-01
An experimental reflector reactivity study was made with a compact cylindrical reactor using a uranyl fluoride - water fuel solution. The reactor was axially unreflected and radially reflected with segments of molybdenum. The reflector segments were displaced incrementally in both the axial and radial dimensions, and the shutdown of each configuration was measured by using the pulsed-neutron source technique. The reactivity effects for axial and radial displacement of reflector segments are tabulated separately and compared. The experiments provide data for control-system studies of compact-space-power-reactor concepts.
Back-trajectory modeling of high time-resolution air measurement data to separate nearby sources
Strategies to isolate air pollution contributions from sources is of interest as voluntary or regulatory measures are undertaken to reduce air pollution. When different sources are located in close proximity to one another and have similar emissions, separating source emissions ...
Integrality and separability of multitouch interaction techniques in 3D manipulation tasks.
Martinet, Anthony; Casiez, Géry; Grisoni, Laurent
2012-03-01
Multitouch displays represent a promising technology for the display and manipulation of data. While the manipulation of 2D data has been widely explored, 3D manipulation with multitouch displays remains largely unexplored. Based on an analysis of the integration and separation of degrees of freedom, we propose a taxonomy for 3D manipulation techniques with multitouch displays. Using that taxonomy, we introduce Depth-Separated Screen-Space (DS3), a new 3D manipulation technique based on the separation of translation and rotation. In a controlled experiment, we compared DS3 with Sticky Tools and Screen-Space. Results show that separating the control of translation and rotation significantly affects performance for 3D manipulation, with DS3 performing faster than the two other techniques.
Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.
NASA Astrophysics Data System (ADS)
Wang, Avery Li-Chun
This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramér-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals, such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal.
Finally, a new methodology is given for designing linear-phase FIR filters which require a small fraction of the computational power of conventional FIR implementations. This design strategy is based on truncated and stabilized IIR filters. These signal-processing methods have been applied to the problem of auditory source separation, resulting in voice separation from complex music that is significantly better than previous results at far lower computational cost.
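The average-winding-rate idea above can be sketched in a few lines. This is an illustrative simplification, not the thesis's tracker: the function name and the test signal are ours, and no noise handling or tracking loop is shown.

```python
import numpy as np

def winding_rate(p):
    """Average winding rate of a complex phasor p(n): the mean phase
    advance between consecutive samples, used here as an estimate of
    the instantaneous frequency in radians per sample."""
    # Phase increment between consecutive samples, wrapped to (-pi, pi]
    dphi = np.angle(p[1:] * np.conj(p[:-1]))
    return dphi.mean()

# A clean phasor winding at 0.3 rad/sample is recovered exactly
n = np.arange(1000)
est = winding_rate(np.exp(1j * 0.3 * n))
```

A full tracker would feed this winding error back to update a demodulating frequency estimate, as the Frequency-Locked Loop described above does.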
NASA Astrophysics Data System (ADS)
Pires, Carlos A. L.; Ribeiro, Andreia F. S.
2017-02-01
We develop an expansion of space-distributed time series into statistically independent uncorrelated subspaces (statistical sources) of low-dimension and exhibiting enhanced non-Gaussian probability distributions with geometrically simple chosen shapes (projection pursuit rationale). The method relies upon a generalization of the principal component analysis that is optimal for Gaussian mixed signals and of the independent component analysis (ICA), optimized to split non-Gaussian scalar sources. The proposed method, supported by information theory concepts and methods, is the independent subspace analysis (ISA) that looks for multi-dimensional, intrinsically synergetic subspaces such as dyads (2D) and triads (3D), not separable by ICA. Basically, we optimize rotated variables maximizing certain nonlinear correlations (contrast functions) coming from the non-Gaussianity of the joint distribution. As a by-product, it provides nonlinear variable changes `unfolding' the subspaces into nearly Gaussian scalars of easier post-processing. Moreover, the new variables still work as nonlinear data exploratory indices of the non-Gaussian variability of the analysed climatic and geophysical fields. The method (ISA, followed by nonlinear unfolding) is tested into three datasets. The first one comes from the Lorenz'63 three-dimensional chaotic model, showing a clear separation into a non-Gaussian dyad plus an independent scalar. The second one is a mixture of propagating waves of random correlated phases in which the emergence of triadic wave resonances imprints a statistical signature in terms of a non-Gaussian non-separable triad. Finally the method is applied to the monthly variability of a high-dimensional quasi-geostrophic (QG) atmospheric model, applied to the Northern Hemispheric winter. 
We find that the enhanced non-Gaussian dyads of parabolic shape perform much better than the unrotated variables with regard to the separation of the four model centroid regimes (positive and negative phases of the Arctic Oscillation and of the North Atlantic Oscillation). Triads also appear in the QG model, but with weaker expression than dyads due to the imposed shape and dimension. The study emphasizes the existence of nonlinear dyadic and triadic teleconnections.
NASA Astrophysics Data System (ADS)
Gao, Lingli; Pan, Yudi
2018-05-01
The correct estimation of the seismic source signature is crucial to exploration geophysics. Based on seismic interferometry, the virtual real source (VRS) method provides a model-independent way for source signature estimation. However, when encountering multimode surface waves, which are commonly seen in shallow seismic surveys, strong spurious events appear in seismic interferometric results. These spurious events introduce errors in the virtual-source recordings and reduce the accuracy of the source signature estimated by the VRS method. In order to estimate a correct source signature from multimode surface waves, we propose a mode-separated VRS method. In this method, multimode surface waves are mode separated before seismic interferometry. Virtual-source recordings are then obtained by applying seismic interferometry to each mode individually. Therefore, artefacts caused by cross-mode correlation are excluded in the virtual-source recordings and the estimated source signatures. A synthetic example showed that a correct source signature can be estimated with the proposed method, while strong spurious oscillation occurs in the estimated source signature if we do not apply mode separation first. We also applied the proposed method to a field example, which verified its validity and effectiveness in estimating seismic source signature from shallow seismic shot gathers containing multimode surface waves.
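The interferometric core of the VRS idea is a cross-correlation of two receiver recordings, which turns one receiver into a virtual source for the other. A minimal single-mode toy (synthetic wavelet, sample counts and function name ours; the paper's mode separation step is not shown):

```python
import numpy as np

def crosscorr_lag(a, b):
    """Lag (in samples) at which trace b best matches trace a, from the
    peak of their full cross-correlation. In interferometry, this is
    the traveltime of the virtual source at a recorded at b."""
    xc = np.correlate(b, a, mode="full")
    return int(np.argmax(xc)) - (len(a) - 1)

# Toy single-mode arrival: a wavelet reaching receiver A at sample 40
# and receiver B at sample 55, so the virtual traveltime is 15 samples.
n = np.arange(200)
wavelet = np.exp(-0.5 * ((n - 100) / 5.0) ** 2)
recA = np.roll(wavelet, -60)  # peak at sample 40
recB = np.roll(wavelet, -45)  # peak at sample 55
lag = crosscorr_lag(recA, recB)
```

With several modes present, each propagating at its own velocity, this correlation also produces cross-mode peaks; that is the spurious-event problem the mode-separated method removes by correlating each mode individually.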
The analysis of complex mixed-radiation fields using near real-time imaging.
Beaumont, Jonathan; Mellor, Matthew P; Joyce, Malcolm J
2014-10-01
A new mixed-field imaging system has been constructed at Lancaster University using the principles of collimation and back projection to passively locate and assess sources of neutron and gamma-ray radiation. The system was set up at the University of Manchester where three radiation sources: (252)Cf, a lead-shielded (241)Am/Be and a (22)Na source were imaged. Real-time discrimination was used to find the respective components of the neutron and gamma-ray fields detected by a single EJ-301 liquid scintillator, allowing separate images of neutron and gamma-ray emitters to be formed. (252)Cf and (22)Na were successfully observed and located in the gamma-ray image; however, the (241)Am/Be was not seen owing to surrounding lead shielding. The (252)Cf and (241)Am/Be neutron sources were seen clearly in the neutron image, demonstrating the advantage of this mixed-field technique over a gamma-ray-only image where the (241)Am/Be source would have gone undetected.
NASA Astrophysics Data System (ADS)
Reed, Joshua L.
Permanent implants of low-energy photon-emitting brachytherapy sources are used to treat a variety of cancers. Individual source models must be separately characterized due to their unique geometry, materials, and radionuclides, which all influence their dose distributions. Thermoluminescent dosimeters (TLDs) are often used for dose measurements around low-energy photon-emitting brachytherapy sources. TLDs are typically calibrated with higher energy sources such as 60Co, which requires a correction for the change in the response of the TLDs as a function of photon energy. These corrections have historically been based on TLD response to x-ray bremsstrahlung spectra instead of to brachytherapy sources themselves. This work determined the TLD intrinsic energy dependence for 125I and 103Pd sources relative to 60Co, which allows for correction of TLD measurements of brachytherapy sources with factors specific to their energy spectra. Traditional brachytherapy sources contain mobile internal components and large amounts of high-Z material such as radio-opaque markers and titanium encapsulations. These all contribute to perturbations and uncertainties in the dose distribution around the source. The CivaString is a new elongated 103Pd brachytherapy source with a fixed internal geometry, polymer encapsulation, and lengths ranging from 1 to 6 cm, which offers advantages over traditional source designs. This work characterized the CivaString source and the results facilitated the formal approval of this source for use in clinical treatments. Additionally, the accuracy of a superposition technique for dose calculation around the sources with lengths >1 cm was verified. Advances in diagnostic techniques are paving the way for focal brachytherapy in which the dose is intentionally modulated throughout the target volume to focus on subvolumes that contain cancer cells. 
Brachytherapy sources with variable longitudinal strength (VLS) are a promising candidate for use in focal brachytherapy treatments given their customizable activity distributions, although they are not yet commercially available. This work characterized five prototype VLS sources, developed methods for clinical calibration and verification of these sources, and developed an analytical dose calculation algorithm that scales with both source length and VLS.
Zhou, Guoxu; Yang, Zuyuan; Xie, Shengli; Yang, Jun-Mei
2011-04-01
Online blind source separation (BSS) is proposed to overcome the high computational cost problem, which limits the practical applications of traditional batch BSS algorithms. However, the existing online BSS methods are mainly used to separate independent or uncorrelated sources. Recently, nonnegative matrix factorization (NMF) has shown great potential to separate correlated sources, where some constraints are often imposed to overcome the non-uniqueness of the factorization. In this paper, an incremental NMF with volume constraint is derived and utilized for solving online BSS. The volume constraint on the mixing matrix enhances the identifiability of the sources, while the incremental learning mode reduces the computational cost. The proposed method takes advantage of the natural-gradient-based multiplicative update rule, and it performs especially well in the recovery of dependent sources. Simulations in BSS for dual-energy X-ray images, online encrypted speech signals, and highly correlated face images show the validity of the proposed method.
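To make the NMF core concrete, here is a plain batch Lee-Seung multiplicative-update sketch. This is not the paper's algorithm: the volume constraint on the mixing matrix and the incremental (online) update are omitted, and all names and sizes are ours.

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    """Plain multiplicative-update NMF: factor nonnegative V as W @ H
    with W, H >= 0. The multiplicative form preserves nonnegativity
    at every step, which is why it underlies many BSS variants."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Recover an exact rank-2 nonnegative factorization
rng = np.random.default_rng(1)
V = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf(V, 2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the BSS reading, V holds the mixed observations, W plays the role of the mixing matrix (the object the paper's volume constraint acts on), and the rows of H are the recovered sources.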
Estimating the Earthquake Source Time Function by Markov Chain Monte Carlo Sampling
NASA Astrophysics Data System (ADS)
Dȩbski, Wojciech
2008-07-01
Many aspects of earthquake source dynamics like dynamic stress drop, rupture velocity and directivity, etc. are currently inferred from the source time functions obtained by a deconvolution of the propagation and recording effects from seismograms. The question of the accuracy of obtained results remains open. In this paper we address this issue by considering two aspects of the source time function deconvolution. First, we propose a new pseudo-spectral parameterization of the sought function which explicitly takes into account the physical constraints imposed on the sought functions. Such parameterization automatically excludes non-physical solutions and so improves the stability and uniqueness of the deconvolution. Secondly, we demonstrate that the Bayesian approach to the inverse problem at hand, combined with an efficient Markov Chain Monte Carlo sampling technique, is a method which allows efficient estimation of the source time function uncertainties. The key point of the approach is the description of the solution of the inverse problem by the a posteriori probability density function constructed according to the Bayesian (probabilistic) theory. Next, the Markov Chain Monte Carlo sampling technique is used to sample this function so the statistical estimator of a posteriori errors can be easily obtained with minimal additional computational effort with respect to modern inversion (optimization) algorithms. The methodological considerations are illustrated by a case study of a mining-induced seismic event of magnitude M_L ≈ 3.1 that occurred at Rudna (Poland) copper mine. The seismic P-wave records were inverted for the source time functions, using the proposed algorithm and the empirical Green function technique to approximate Green functions. The obtained solutions seem to suggest some complexity of the rupture process with double pulses of energy release. 
However, the error analysis shows that the hypothesis of source complexity is not justified at the 95% confidence level. On the basis of the analyzed event we also show that the separation of the source inversion into two steps introduces limitations on the completeness of the a posteriori error analysis.
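The uncertainty-estimation step above rests on sampling the a posteriori density rather than locating a single optimum. A minimal random-walk Metropolis sketch on a toy one-parameter posterior (not the paper's pseudo-spectral parameterization; the target, step size, and chain length are ours):

```python
import numpy as np

def metropolis(logpost, x0, n, step, seed=0):
    """Random-walk Metropolis sampler: draws from a posterior known only
    up to a constant. Moments and credible intervals of the solution
    then come directly from the chain, with no extra inversions."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logpost(x0)
    chain = np.empty(n)
    for i in range(n):
        xp = x + step * rng.standard_normal()   # propose a move
        lpp = logpost(xp)
        if np.log(rng.random()) < lpp - lp:     # accept with min(1, ratio)
            x, lp = xp, lpp
        chain[i] = x
    return chain

# Toy posterior: Gaussian with mean 2.0 and standard deviation 0.5
logpost = lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2
chain = metropolis(logpost, 0.0, 20000, 0.8)
post_mean = chain[5000:].mean()   # discard burn-in
post_std = chain[5000:].std()
```

The chain's standard deviation is exactly the kind of a posteriori error estimate used above to test, at a stated confidence level, whether the apparent double-pulse complexity is significant.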
Particle image velocimetry based on wavelength division multiplexing
NASA Astrophysics Data System (ADS)
Tang, Chunxiao; Li, Enbang; Li, Hongqiang
2018-01-01
This paper introduces a technical approach to wavelength division multiplexing (WDM) based particle image velocimetry (PIV). It is designed to measure transient flows with different scales of velocity by capturing multiple particle images in one exposure. These images are separated by different wavelengths, and thus the pulse separation time is not influenced by the frame rate of the camera. A triple-pulsed PIV system has been created in order to prove the feasibility of WDM-PIV. This is demonstrated in a sieve plate extraction column model by simultaneously measuring the fast flow in the downcomer and the slow vortices inside the plates. A simple displacement/velocity field combination method has also been developed. The main constraints of WDM-PIV are the limited wavelength choices of available light sources and cameras. The use of the WDM technique represents a feasible way to realize multiple-pulsed PIV.
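Once the wavelength-separated exposures are extracted, each pair is processed like any PIV pair: the displacement is the peak of their cross-correlation. A minimal single-window FFT-based sketch on synthetic data (integer-pixel only, no sub-pixel fit; names and sizes ours):

```python
import numpy as np

def piv_displacement(frame1, frame2):
    """Integer-pixel displacement between two particle images from the
    peak of their circular cross-correlation, computed via FFT."""
    f1 = frame1 - frame1.mean()
    f2 = frame2 - frame2.mean()
    xc = np.fft.irfft2(np.conj(np.fft.rfft2(f1)) * np.fft.rfft2(f2),
                       s=frame1.shape)
    dy, dx = np.unravel_index(np.argmax(xc), xc.shape)
    # Map the circular peak position to a signed displacement
    ny, nx = frame1.shape
    dy = dy - ny if dy > ny // 2 else dy
    dx = dx - nx if dx > nx // 2 else dx
    return dy, dx

# Synthetic particle image shifted by (3, -2) pixels
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (3, -2), axis=(0, 1))
dy, dx = piv_displacement(img, shifted)
```

With three wavelength-coded exposures, two such pairs give two displacement fields at different pulse separations, which is what allows fast and slow regions to be resolved simultaneously.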
Ultra-thin layer chromatography with integrated silver colloid-based SERS detection.
Wallace, Ryan A; Lavrik, Nickolay V; Sepaniak, Michael J
2017-01-01
Simplified lab-on-a-chip techniques are desirable for quick and efficient detection of analytes of interest in the field. The following work involves the use of deterministic pillar arrays on the micro-scale as a platform to separate compounds, and the use of Ag colloid within the arrays as a source of increased signal via surface enhanced Raman spectroscopy (SERS). One problem traditionally seen with SERS surfaces containing Ag colloid is oxidation; however, our platforms are superhydrophobic, reducing the amount of oxidation taking place on the surface of the Ag colloid. This work includes the successful separation and SERS detection of fluorescent dye compounds (resorufin and sulforhodamine 640), fluorescent anti-tumor drugs (Adriamycin and Daunomycin), and purine and pyrimidine bases (adenine, cytosine, guanine, hypoxanthine, and thymine).
Analysis of transitional separation bubbles on infinite swept wings
NASA Technical Reports Server (NTRS)
Davis, R. L.; Carter, J. E.
1986-01-01
A previously developed two-dimensional local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation), has been extended for the calculation of transitional separation bubbles over infinite swept wings. As part of this effort, Roberts' empirical correlation, which is interpreted as a separated flow empirical extension of Mack's stability theory for attached flows, has been incorporated into the ALESEP procedure for the prediction of the transition location within the separation bubble. In addition, the viscous procedure used in the ALESEP techniques has been modified to allow for wall suction. A series of two-dimensional calculations is presented as a verification of the prediction capability of the interaction techniques with Roberts' transition model. Numerical tests have shown that this two-dimensional natural transition correlation may also be applied to transitional separation bubbles over infinite swept wings. Results of the interaction procedure are compared with Horton's detailed experimental data for separated flow over a swept plate, which demonstrates the accuracy of the present technique. Wall suction has been applied to a similar interaction calculation to demonstrate its effect on the separation bubble. The principal conclusion of this paper is that the prediction of transitional separation bubbles over two-dimensional or infinite swept geometries is now possible using the present interacting boundary layer approach.
Audio visual speech source separation via improved context dependent association model
NASA Astrophysics Data System (ADS)
Kazemi, Alireza; Boostani, Reza; Sobhanmanesh, Fariborz
2014-12-01
In this paper, we exploit the non-linear relation between a speech source and its associated lip video as a source of extra information to propose an improved audio-visual speech source separation (AVSS) algorithm. The audio-visual association is modeled using a neural associator which estimates the visual lip parameters from a temporal context of acoustic observation frames. We define an objective function based on the mean square error (MSE) measure between estimated and target visual parameters. This function is minimized for estimation of the de-mixing vector/filters to separate the relevant source from linear instantaneous or time-domain convolutive mixtures. We have also proposed a hybrid criterion which uses AV coherency together with kurtosis as a non-Gaussianity measure. Experimental results are presented and compared in terms of visually relevant speech detection accuracy and output signal-to-interference ratio (SIR) of source separation. The suggested audio-visual model significantly improves relevant speech classification accuracy compared to the existing GMM-based model, and the proposed AVSS algorithm improves the speech separation quality compared to reference ICA- and AVSS-based methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krechmer, Jordan E.; Groessl, Michael; Zhang, Xuan
Measurement techniques that provide molecular-level information are needed to elucidate the multiphase processes that produce secondary organic aerosol (SOA) species in the atmosphere. Here we demonstrate the application of ion mobility spectrometry-mass spectrometry (IMS–MS) to the simultaneous characterization of the elemental composition and molecular structures of organic species in the gas and particulate phases. Molecular ions of gas-phase organic species are measured online with IMS–MS after ionization with a custom-built nitrate chemical ionization (CI) source. This CI–IMS–MS technique is used to obtain time-resolved measurements (5 min) of highly oxidized organic molecules during the 2013 Southern Oxidant and Aerosol Study (SOAS) ambient field campaign in the forested SE US. The ambient IMS–MS signals are consistent with laboratory IMS–MS spectra obtained from single-component carboxylic acids and multicomponent mixtures of isoprene and monoterpene oxidation products. Mass-mobility correlations in the 2-D IMS–MS space provide a means of identifying ions with similar molecular structures within complex mass spectra and are used to separate and identify monoterpene oxidation products in the ambient data that are produced from different chemical pathways. Water-soluble organic carbon (WSOC) constituents of fine aerosol particles that are not resolvable with standard analytical separation methods, such as liquid chromatography (LC), are shown to be separable with IMS–MS coupled to an electrospray ionization (ESI) source. The capability to use ion mobility to differentiate between isomers is demonstrated for organosulfates derived from the reactive uptake of isomers of isoprene epoxydiols (IEPOX) onto wet acidic sulfate aerosol. 
Controlled fragmentation of precursor ions by collisionally induced dissociation (CID) in the transfer region between the IMS and the MS is used to validate MS peak assignments, elucidate structures of oligomers, and confirm the presence of the organosulfate functional group.
NASA Astrophysics Data System (ADS)
Krechmer, Jordan E.; Groessl, Michael; Zhang, Xuan; Junninen, Heikki; Massoli, Paola; Lambe, Andrew T.; Kimmel, Joel R.; Cubison, Michael J.; Graf, Stephan; Lin, Ying-Hsuan; Budisulistiorini, Sri H.; Zhang, Haofei; Surratt, Jason D.; Knochenmuss, Richard; Jayne, John T.; Worsnop, Douglas R.; Jimenez, Jose-Luis; Canagaratna, Manjula R.
2016-07-01
Measurement techniques that provide molecular-level information are needed to elucidate the multiphase processes that produce secondary organic aerosol (SOA) species in the atmosphere. Here we demonstrate the application of ion mobility spectrometry-mass spectrometry (IMS-MS) to the simultaneous characterization of the elemental composition and molecular structures of organic species in the gas and particulate phases. Molecular ions of gas-phase organic species are measured online with IMS-MS after ionization with a custom-built nitrate chemical ionization (CI) source. This CI-IMS-MS technique is used to obtain time-resolved measurements (5 min) of highly oxidized organic molecules during the 2013 Southern Oxidant and Aerosol Study (SOAS) ambient field campaign in the forested SE US. The ambient IMS-MS signals are consistent with laboratory IMS-MS spectra obtained from single-component carboxylic acids and multicomponent mixtures of isoprene and monoterpene oxidation products. Mass-mobility correlations in the 2-D IMS-MS space provide a means of identifying ions with similar molecular structures within complex mass spectra and are used to separate and identify monoterpene oxidation products in the ambient data that are produced from different chemical pathways. Water-soluble organic carbon (WSOC) constituents of fine aerosol particles that are not resolvable with standard analytical separation methods, such as liquid chromatography (LC), are shown to be separable with IMS-MS coupled to an electrospray ionization (ESI) source. The capability to use ion mobility to differentiate between isomers is demonstrated for organosulfates derived from the reactive uptake of isomers of isoprene epoxydiols (IEPOX) onto wet acidic sulfate aerosol. 
Controlled fragmentation of precursor ions by collisionally induced dissociation (CID) in the transfer region between the IMS and the MS is used to validate MS peak assignments, elucidate structures of oligomers, and confirm the presence of the organosulfate functional group.
scarlet: Source separation in multi-band images by Constrained Matrix Factorization
NASA Astrophysics Data System (ADS)
Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert
2018-03-01
SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
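The factorization SCARLET is built on can be illustrated directly: each source is an SED vector times a morphology image, and the flattened scene matrix is their sum, A @ S. A toy two-source, three-band construction (all shapes, positions, and SED values here are hypothetical, and no fitting or constraints are shown, only the model structure):

```python
import numpy as np

# Hypothetical 2-source scene in 3 bands on a 16x16 pixel grid:
# each source is an SED (length 3) times a nonparametric morphology.
ny = nx = 16
y, x = np.mgrid[0:ny, 0:nx]
morph1 = np.exp(-((y - 5) ** 2 + (x - 5) ** 2) / 8.0)     # compact blob
morph2 = np.exp(-((y - 10) ** 2 + (x - 11) ** 2) / 20.0)  # extended blob
sed1 = np.array([0.7, 0.2, 0.1])  # blue source
sed2 = np.array([0.1, 0.3, 0.6])  # red source

# The scene cube is the sum of SED (outer) morphology products
scene = (sed1[:, None, None] * morph1[None] +
         sed2[:, None, None] * morph2[None])

# Matrix form: scene matrix = A @ S, with A the SEDs, S the morphologies
A = np.stack([sed1, sed2], axis=1)              # (bands, sources)
S = np.stack([morph1.ravel(), morph2.ravel()])  # (sources, pixels)
resid = np.linalg.norm(scene.reshape(3, -1) - A @ S)
```

Deblending inverts this construction: given only the scene cube, constrained matrix factorization recovers A and S, which is well-posed here precisely because the two sources have different colors.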
NASA Astrophysics Data System (ADS)
Jiang, Wei; Gao, Zhiyong; Khoso, Sultan Ahmed; Gao, Jiande; Sun, Wei; Pu, Wei; Hu, Yuehua
2018-03-01
Fluorite, a chief source of fluorine in nature, usually coexists with calcite in ore deposits. Worldwide, flotation techniques with a selective collector and/or a selective depressant are commonly preferred for the separation of fluorite from calcite. In the present study, an attempt was made to use benzhydroxamic acid (BHA) as a collector for the selective separation of fluorite from calcite without using any depressant. Results obtained from the flotation experiments for single mineral and mixed binary minerals revealed that BHA has a good selective collecting ability for fluorite when 50 mg/L of BHA is used at pH 9. The results from zeta potential and X-ray photoelectron spectroscopy (XPS) measurements indicated that BHA chemisorbs more readily onto fluorite than onto calcite. Crystal chemistry calculations showed that the larger Ca density and the higher Ca activity on the fluorite surface mainly account for the selective adsorption of BHA on fluorite, leading to the selective separation of fluorite from calcite. Moreover, stronger hydrogen bonding with BHA and weaker electrostatic repulsion with BHA- also contribute to the stronger interaction of BHA species with the fluorite surface.
HIGH POWER BEAM DUMP AND TARGET / ACCELERATOR INTERFACE PROCEDURES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blokland, Willem; Plum, Michael A; Peters, Charles C
Satisfying operational procedures and limits for the beam-target interface is a critical concern for high-power operation at spallation neutron sources. At the Oak Ridge Spallation Neutron Source (SNS) a number of protective measures are instituted to ensure that the beam position, beam size and peak intensity are within acceptable limits at the target and the high power Ring Injection Dump (RID). The high power beam dump typically handles 50-100 kW of beam power, and its setup is complicated by the fact that there are two separate beam components simultaneously directed to the dump. The beam on target is typically at the 800-1000 kW average power level, delivered in sub-µs pulses at 60 Hz. Setup techniques using beam measurements to quantify the beam parameters at the target and dump will be described. However, not all the instrumentation used for the setup and initial qualification is available during high power operation. Additional techniques are used to monitor the beam during high power operation to ensure the setup conditions are maintained, and these are also described.
Green, W. Reed; Haggard, Brian E.
2001-01-01
A water-quality sampling strategy consisting of routine bimonthly sampling and storm-event sampling (six storms annually) is used to estimate annual phosphorus and nitrogen loads at the Illinois River south of Siloam Springs, Arkansas. Hydrograph separation allowed assessment of base-flow and surface-runoff nutrient relations and yield. Discharge and nutrient relations indicate that water quality at the Illinois River south of Siloam Springs, Arkansas, is affected by both point and nonpoint sources of contamination. Base-flow phosphorus concentrations decreased with increasing base-flow discharge, indicating the dilution of phosphorus in water from point sources. Nitrogen concentrations increased with increasing base-flow discharge, indicating a predominant ground-water source. Nitrogen concentrations at higher base-flow discharges often were greater than median concentrations reported for ground water (from wells and springs) in the Springfield Plateau aquifer. Total estimated phosphorus and nitrogen annual loads for calendar years 1997-1999 using the regression techniques presented in this paper (35 samples) were similar to estimated loads derived from integration techniques (1,033 samples). Flow-weighted nutrient concentrations and nutrient yields at the Illinois River site were about 10 to 100 times greater than national averages for undeveloped basins and at North Sylamore Creek and Cossatot River (considered to be undeveloped basins in Arkansas). Total phosphorus and soluble reactive phosphorus were greater than 10 times and total nitrogen and dissolved nitrite plus nitrate were greater than 10 to 100 times the national and regional averages for undeveloped basins. These results demonstrate the utility of a strategy whereby samples are collected every other month and during selected storm events annually, with use of regression models to estimate nutrient loads. 
Annual loads of phosphorus and nitrogen estimated using regression techniques could provide similar results to estimates using integration techniques, with much less investment.
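The regression-based load estimate can be sketched with a generic log-log rating curve: regress ln(concentration) on ln(discharge) from the sparse samples, then integrate predicted concentration times discharge over the continuous record. This is only the core idea under simplified assumptions (a single power-law relation, no retransformation bias correction); the report's actual models are more involved, and all names and data here are synthetic.

```python
import numpy as np

def rating_curve_load(Q_samples, C_samples, Q_daily, dt=86400.0):
    """Log-log rating-curve load estimate: fit ln(C) = a + b*ln(Q) to
    the sparse samples, then integrate predicted C*Q over the daily
    discharge record (dt = seconds per record)."""
    b, a = np.polyfit(np.log(Q_samples), np.log(C_samples), 1)
    C_pred = np.exp(a) * Q_daily ** b
    return float(np.sum(C_pred * Q_daily * dt))

# Synthetic check: if C = 2 * Q**0.5 exactly, 20 samples suffice to
# recover the load integrated over a 365-day discharge record.
rng = np.random.default_rng(0)
Qs = rng.uniform(1.0, 10.0, 20)
Cs = 2.0 * Qs ** 0.5
Qd = rng.uniform(1.0, 10.0, 365)
load = rating_curve_load(Qs, Cs, Qd)
true_load = float(np.sum(2.0 * Qd ** 0.5 * Qd * 86400.0))
```

The comparison in the abstract (35 regression samples versus 1,033 integration samples) is exactly this trade: a fitted C-Q relation substitutes for dense concentration sampling.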
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, J; Szczykutowicz, T; Bayouth, J
Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with <1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials necessitate consideration in radiation therapy treatment planning.
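The iodine-nulling idea behind VNC imaging can be sketched with a toy two-material calculation: choose a weighting of the low- and high-energy images so that iodine maps to the water CT number. This is only an illustrative sketch; the HU values and the simple linear blend below are invented assumptions, not the scanner's actual VNC algorithm.

```python
# Toy virtual-non-contrast (VNC) weighting. The HU values below are
# invented for illustration; the high/low pair is chosen so the
# dual-energy ratio is ~2.18, as reported for the sequential scan.
hu_iodine_low, hu_iodine_high = 400.0, 183.5   # iodine vial, low/high-energy images
hu_water = 0.0

# Solve w*low + (1-w)*high = hu_water for the iodine material:
w = (hu_water - hu_iodine_high) / (hu_iodine_low - hu_iodine_high)

# The blended image nulls iodine by construction:
vnc_iodine = w * hu_iodine_low + (1 - w) * hu_iodine_high
```

Water is unaffected by the blend (0 HU in both inputs stays 0 HU); the cost, as the abstract notes, is that materials like dense bone are shifted away from their true CT numbers.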
Life cycle assessment of a household solid waste source separation programme: a Swedish case study.
Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik
2011-10-01
The environmental impact of an extended property-close source-separation system for solid household waste (i.e., a system for collection of recyclables from domestic properties) is investigated in a residential area in southern Sweden. Since 2001, households have been able to source-separate waste into six dry recyclable fractions as well as food waste. The current system was evaluated using the EASEWASTE life cycle assessment tool. Current status is compared with an ideal scenario in which households display perfect source-separation behaviour and a scenario without any material recycling. Results show that current recycling provides substantial environmental benefits compared to a non-recycling alternative. The environmental benefit varies greatly between recyclable fractions, and the recyclables currently most frequently source-separated by households are often not the most beneficial from an environmental perspective. With optimal source-separation of all recyclables, the current net contribution to global warming could be changed to a net avoidance, while the current avoidance of nutrient enrichment, acidification and photochemical ozone formation could be doubled. Sensitivity analyses show that the type of energy substituted by incineration of non-recycled waste, as well as the energy used in recycling processes and in the production of materials substituted by waste recycling, is of high relevance for the attained results.
Valdes, Claudia P.; Varma, Hari M.; Kristoffersen, Anna K.; Dragojevic, Tanja; Culver, Joseph P.; Durduran, Turgut
2014-01-01
We introduce a new, non-invasive, diffuse optical technique, speckle contrast optical spectroscopy (SCOS), for probing deep tissue blood flow using the statistical properties of laser speckle contrast and the photon diffusion model for a point source. The feasibility of the method is tested using liquid phantoms which demonstrate that SCOS is capable of measuring the dynamic properties of turbid media non-invasively. We further present an in vivo measurement in a human forearm muscle using SCOS in two modalities: one with the dependence of the speckle contrast on the source-detector separation and another on the exposure time. In doing so, we also introduce crucial corrections to the speckle contrast that account for the variance of the shot and sensor dark noises. PMID:25136500
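The statistic at the heart of SCOS is the local speckle contrast, K = σ/⟨I⟩ computed over small pixel windows. A minimal sketch of that computation follows; the `speckle_contrast` helper and the synthetic test images are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

def speckle_contrast(image, win=7):
    """Local speckle contrast K = std/mean over win x win sliding
    windows ('valid' region), the statistic used in SCOS-type
    analyses of laser speckle images."""
    image = np.asarray(image, dtype=float)
    # Sliding-window view: shape (H-win+1, W-win+1, win, win)
    patches = np.lib.stride_tricks.sliding_window_view(image, (win, win))
    return patches.std(axis=(-1, -2)) / patches.mean(axis=(-1, -2))

# Fully developed speckle (exponentially distributed intensity) has
# K close to 1; a static, uniform image has K = 0.
rng = np.random.default_rng(0)
static = np.full((64, 64), 100.0)
speckle = rng.exponential(scale=100.0, size=(64, 64))
k_static = speckle_contrast(static).mean()
k_speckle = speckle_contrast(speckle).mean()
```

In a flow measurement, faster-moving scatterers blur the speckle over the exposure time and drive K below 1, which is what links the contrast statistic to deep-tissue blood flow.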
Using artificial neural networks (ANN) for open-loop tomography
NASA Astrophysics Data System (ADS)
Osborn, James; De Cos Juez, Francisco Javier; Guzman, Dani; Butterley, Timothy; Myers, Richard; Guesalaga, Andres; Laine, Jesus
2011-09-01
The next generation of adaptive optics (AO) systems requires tomographic techniques in order to correct for atmospheric turbulence along lines of sight separated from the guide stars. Multi-object adaptive optics (MOAO) is one such technique. Here, we present a method which uses an artificial neural network (ANN) to reconstruct the target phase given off-axis reference sources. This method does not require any input of the turbulence profile and is therefore less susceptible to changing conditions than some existing methods. We compare our ANN method with a standard least-squares-type matrix multiplication method (MVM) in simulation and find that the tomographic error is similar to that of the MVM method. In changing conditions the tomographic error increases for MVM but remains constant with the ANN model, and no large matrix inversions are required.
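The least-squares MVM baseline the authors compare against can be illustrated on toy data: fit a linear reconstructor from off-axis measurements to target phase modes, then apply it to new measurements. The forward model, dimensions, and variable names below are invented for illustration and are far simpler than a real AO tomography problem.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy tomography: the target phase (4 modes) is an unknown linear
# function of 8 off-axis guide-star measurements, plus noise.
n_train, n_meas, n_modes = 500, 8, 4
A_true = rng.normal(size=(n_modes, n_meas))     # hidden forward mapping
S = rng.normal(size=(n_train, n_meas))          # off-axis measurements
P = S @ A_true.T + 0.01 * rng.normal(size=(n_train, n_modes))

# Least-squares (MVM-style) reconstructor fitted to training data
R, *_ = np.linalg.lstsq(S, P, rcond=None)

# Relative reconstruction error on held-out measurements
S_test = rng.normal(size=(100, n_meas))
P_test = S_test @ A_true.T
err = np.linalg.norm(S_test @ R - P_test) / np.linalg.norm(P_test)
```

The ANN approach in the abstract replaces the fixed matrix `R` with a trained network, which is why it can track changing turbulence conditions without refitting a reconstructor.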
NASA Astrophysics Data System (ADS)
Ezbiri, A.; Tatam, R. P.
1995-09-01
A passive signal-processing technique for addressing a miniature low-finesse fiber Fabry-Perot interferometric sensor with a multimode laser diode is reported. Two modes of a multimode laser diode separated by 3 nm are used to obtain quadrature outputs from a ∼20 μm cavity. Wavelength-division demultiplexing combined with digital signal processing is used to recover the measurand-induced phase change. The technique is demonstrated for the measurement of vibration. The signal-to-noise ratio is ∼70 dB at 500 Hz for a ∼π/2 rad displacement of the mirror, which results in a minimum detectable signal of ∼200 μrad/√Hz. A quantitative discussion of miscalibration and systematic errors is presented.
Quantification of Soluble Sugars and Sugar Alcohols by LC-MS/MS.
Feil, Regina; Lunn, John Edward
2018-01-01
Sugars are simple carbohydrates composed primarily of carbon, hydrogen, and oxygen. They play a central role in metabolism as sources of energy and as building blocks for synthesis of structural and nonstructural polymers. Many different techniques have been used to measure sugars, including refractometry, colorimetric and enzymatic assays, gas chromatography, high-performance liquid chromatography, and nuclear magnetic resonance spectroscopy. In this chapter we describe a method that combines an initial separation of sugars by high-performance anion-exchange chromatography (HPAEC) with detection and quantification by tandem mass spectrometry (MS/MS). This combination of techniques provides exquisite specificity, allowing measurement of a diverse range of high- and low-abundance sugars in biological samples. This method can also be used for isotopomer analysis in stable-isotope labeling experiments to measure metabolic fluxes.
Diameter Tuning of β-Ga2O3 Nanowires Using Chemical Vapor Deposition Technique.
Kumar, Mukesh; Kumar, Vikram; Singh, R
2017-12-01
Diameter tuning of β-Ga2O3 nanowires using the chemical vapor deposition technique has been investigated under various experimental conditions. The diameter of root-grown β-Ga2O3 nanowires having a monoclinic crystal structure is tuned by varying the separation distance between the metal source and the substrate. The effect of gas flow rate and mixing ratio on the morphology and diameter of the nanowires has been studied. Nanowire diameter depends on growth temperature, and it is independent of catalyst nanoparticle size at higher growth temperatures (850-900 °C) as compared to the lower growth temperature (800 °C). These nanowires show changes in structural strain value with change in diameter. The band gap of the nanowires increases with decreasing diameter.
Zhu, Yumin; Zhang, Hua; Shao, Liming; He, Pinjing
2015-01-01
Excessive inter-contamination with heavy metals hampers the application of biological treatment products derived from mixed or mechanically-sorted municipal solid waste (MSW). In this study, we investigated fine particles of <2mm, which are small fractions in MSW but constitute a significant component of the total heavy metal content, using bulk detection techniques. A total of 17 individual fine particles were evaluated using synchrotron radiation-based micro-X-ray fluorescence and micro-X-ray diffraction. We also discussed the association, speciation and source apportionment of heavy metals. Metals were found to exist in a diffuse distribution with heterogeneous intensities and intense hot-spots of <10 μm within the fine particles. Zn-Cu, Pb-Fe and Fe-Mn-Cr had significant correlations in terms of spatial distribution. The overlapped enrichment, spatial association, and the mineral phases of metals revealed the potential sources of fine particles from size-reduced waste fractions (such as scraps of organic wastes or ceramics) or from the importation of other particles. The diverse sources of heavy metal pollutants within the fine particles suggested that separate collection and treatment of the biodegradable waste fraction (such as food waste) is a preferable means of facilitating the beneficial utilization of the stabilized products. Copyright © 2014. Published by Elsevier B.V.
Song, Hyung Keun; Yoo, Je Hyun; Byun, Young Soo
2014-01-01
Purpose: Among patients over 50 years of age, separate vertical wiring alone may be insufficient for fixation of fractures of the inferior pole of the patella. Therefore, mechanical and clinical studies were performed in patients over the age of 50 to test the strength of augmentation of separate vertical wiring with cerclage wire (i.e., combined technique). Materials and Methods: Multiple osteotomies were performed to create four-part fractures in the inferior poles of eight pairs of cadaveric patellae. One patella from each pair was fixed with the separate wiring technique, while the other patella was fixed with a combined technique. The ultimate load to failure and stiffness of the fixation were subsequently measured. In a clinical study of 21 patients (average age of 64 years), comminuted fractures of the inferior pole of the patellae were treated using the combined technique. Operative parameters were recorded from which post-operative outcomes were evaluated. Results: For cadaveric patellae, whose mean age was 69 years, the mean ultimate loads to failure for the separate vertical wiring technique and the combined technique were 216.4±72.4 N and 324.9±50.6 N, respectively (p=0.012). The mean stiffness for the separate vertical wiring technique and the combined technique was 241.1±68.5 N/mm and 340.8±45.3 N/mm, respectively (p=0.012). In the clinical study, the mean clinical score at final follow-up was 28.1 points. Conclusion: Augmentation of separate vertical wiring with cerclage wire provides enough strength for protected early exercise of the knee joint and uneventful healing. PMID:24719149
Structured illumination diffuse optical tomography for noninvasive functional neuroimaging in mice.
Reisman, Matthew D; Markow, Zachary E; Bauer, Adam Q; Culver, Joseph P
2017-04-01
Optical intrinsic signal (OIS) imaging has been a powerful tool for capturing functional brain hemodynamics in rodents. Recent wide field-of-view implementations of OIS have provided efficient maps of functional connectivity from spontaneous brain activity in mice. However, OIS requires scalp retraction and is limited to superficial cortical tissues. Diffuse optical tomography (DOT) techniques provide noninvasive imaging, but previous DOT systems for rodent neuroimaging have been limited either by sparse spatial sampling or by slow speed. Here, we develop a DOT system with asymmetric source-detector sampling that combines the high-density spatial sampling (0.4 mm) detection of a scientific complementary metal-oxide-semiconductor camera with the rapid (2 Hz) imaging of a few structured illumination (SI) patterns. Analysis techniques are developed to take advantage of the system's flexibility and optimize trade-offs among spatial sampling, imaging speed, and signal-to-noise ratio. An effective source-detector separation for the SI patterns was developed and compared with light intensity for a quantitative assessment of data quality. The light fall-off versus effective distance was also used for in situ empirical optimization of our light model. We demonstrated the feasibility of this technique by noninvasively mapping the functional response in the somatosensory cortex of the mouse following electrical stimulation of the forepaw.
Mamolen, M; Breiman, R F; Barbaree, J M; Gunn, R A; Stone, K M; Spika, J S; Dennis, D T; Mao, S H; Vogt, R L
1993-01-01
A multistate outbreak of Legionnaires' disease occurred among nine tour groups of senior citizens returning from stays at one of two lodges in a Vermont resort in October 1987. Interviews and serologic studies of 383 (85%) of the tour members revealed 17 individuals (attack rate, 4.4%) with radiologically documented pneumonia and laboratory evidence of legionellosis. A survey of tour groups staying at four nearby lodges and of Vermont-area medical facilities revealed no additional cases. Environmental investigation of common tour stops revealed no likely aerosol source of Legionella infection outside the lodges. Legionella pneumophila serogroup 1 was isolated from water sources at both implicated lodges, and the monoclonal antibody subtype matched those of the isolates from six patients from whom clinical isolates were obtained. The cultures reacted with monoclonal antibodies MAB1, MAB2, 33G2, and 144C2 to yield a 1,2,5,7 or a Benidorm 030E pattern. The strains were also identical by alloenzyme electrophoresis and DNA ribotyping techniques. The epidemiologic and laboratory data suggest that concurrent outbreaks occurred following exposures to the same L. pneumophila serogroup 1 strain at two separate lodges. Multiple molecular subtyping techniques can provide essential information for epidemiologic investigations of Legionnaires' disease. PMID:8253953
Bright field segmentation tomography (BFST) for use as surface identification in stereomicroscopy
NASA Astrophysics Data System (ADS)
Thiesse, Jacqueline R.; Namati, Eman; de Ryk, Jessica; Hoffman, Eric A.; McLennan, Geoffrey
2004-07-01
Stereomicroscopy is an important method for image acquisition because it provides a 3D image of an object when other microscopic techniques can only provide the image in 2D. One challenge with this type of imaging is determining the top surface of a sample that has otherwise indistinguishable surface and planar characteristics. We have developed a system that creates oblique illumination and, in conjunction with image processing, allows the top surface to be viewed. The BFST consists of the Leica MZ12 stereomicroscope with a unique attached lighting source. The lighting source consists of eight light-emitting diodes (LEDs) separated by 45-degree angles. Each LED in this system illuminates with a 20-degree viewing angle once per cycle, casting a shadow over the rest of the sample. Subsequently, eight segmented images are taken per cycle. After the images are captured, they are stacked through image addition to achieve the full field of view, and the surface is then easily identified. Image processing techniques, such as skeletonization, can be used for further enhancement and measurement. With the use of BFST, advances can be made in detecting surface features from metals to tissue samples, such as in the analytical assessment of pulmonary emphysema using the technique of mean linear intercept.
Mesh-based phase contrast Fourier transform imaging
NASA Astrophysics Data System (ADS)
Tahir, Sajjad; Bashir, Sajid; MacDonald, C. A.; Petruccelli, Jonathan C.
2017-04-01
Traditional x-ray radiography is limited by low attenuation contrast in materials of low electron density. Phase contrast imaging offers the potential to improve the contrast between such materials, but due to the requirements on the spatial coherence of the x-ray beam, practical implementation of such systems with tabletop (i.e. non-synchrotron) sources has been limited. One phase imaging technique employs multiple fine-pitched gratings. However, the strict manufacturing tolerances and precise alignment requirements have limited the widespread adoption of grating-based techniques. In this work, we have investigated a recently developed technique that utilizes a single grid of much coarser pitch. Our system consisted of a low power 100 μm spot Mo source, a CCD with 22 μm pixel pitch, and either a focused mammography linear grid or a stainless steel woven mesh. Phase is extracted from a single image by windowing and comparing data localized about harmonics of the mesh in the Fourier domain. The effects on the diffraction phase contrast and scattering amplitude images of varying grid types and periods, and of varying the width of the window function used to separate the harmonics were investigated. Using the wire mesh, derivatives of the phase along two orthogonal directions were obtained and combined to form improved phase contrast images.
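The Fourier-domain step described here, windowing a harmonic of the mesh and inverting it to recover a phase map, can be sketched on a synthetic carrier image. The `extract_harmonic` helper, the carrier frequency, and the test phase below are illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np

def extract_harmonic(img, kx, ky, halfwidth):
    """Window the Fourier component of `img` near spatial frequency
    (kx, ky) (cycles/image), shift it to DC, and invert. The angle of
    the returned complex map approximates the phase modulating the
    mesh carrier, as in single-grid Fourier phase imaging."""
    F = np.fft.fftshift(np.fft.fft2(img))
    cy, cx = np.array(F.shape) // 2
    win = np.zeros_like(F)
    sl = (slice(cy + ky - halfwidth, cy + ky + halfwidth + 1),
          slice(cx + kx - halfwidth, cx + kx + halfwidth + 1))
    win[sl] = F[sl]                    # keep only one harmonic's sidebands
    win = np.roll(win, (-ky, -kx), axis=(0, 1))  # shift harmonic to DC
    return np.fft.ifft2(np.fft.ifftshift(win))

# Synthetic 'mesh' image: a 16 cycles/image carrier modulated by a
# slowly varying phase phi; the method should recover phi.
N = 128
y, x = np.mgrid[0:N, 0:N]
phi = 0.5 * np.sin(2 * np.pi * x / N)
img = 1 + 0.5 * np.cos(2 * np.pi * 16 * x / N + phi)
rec = np.angle(extract_harmonic(img, kx=16, ky=0, halfwidth=8))
```

The window half-width trades resolution against noise, mirroring the abstract's investigation of how the window function used to separate the harmonics affects the phase and scattering images.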
A Review of Heating and Temperature Control in Microfluidic Systems: Techniques and Applications
Miralles, Vincent; Huerre, Axel; Malloggi, Florent; Jullien, Marie-Caroline
2013-01-01
This review presents an overview of the different techniques developed over the last decade to regulate the temperature within microfluidic systems. A variety of different approaches has been adopted, from external heating sources to Joule heating, microwaves or the use of lasers to cite just a few examples. The scope of the technical solutions developed to date is impressive and encompasses for instance temperature ramp rates ranging from 0.1 to 2,000 °C/s leading to homogeneous temperatures from −3 °C to 120 °C, and constant gradients from 6 to 40 °C/mm with a fair degree of accuracy. We also examine some recent strategies developed for applications such as digital microfluidics, where integration of a heating source to generate a temperature gradient offers control of a key parameter, without necessarily requiring great accuracy. Conversely, Temperature Gradient Focusing requires high accuracy in order to control both the concentration and separation of charged species. In addition, the Polymerase Chain Reaction requires both accuracy (homogeneous temperature) and integration to carry out demanding heating cycles. The spectrum of applications requiring temperature regulation is growing rapidly with increasingly important implications for the physical, chemical and biotechnological sectors, depending on the relevant heating technique. PMID:26835667
An EEG blind source separation algorithm based on a weak exclusion principle.
Ma, Lan; Blu, Thierry; Wang, William S-Y
2016-08-01
The question of how to separate individual brain and non-brain signals, mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings, is a significant problem in contemporary neuroscience. This study proposes and evaluates a novel EEG Blind Source Separation (BSS) algorithm based on a weak exclusion principle (WEP). The chief point in which it differs from most previous EEG BSS algorithms is that the proposed algorithm is not based upon the hypothesis that the sources are statistically independent. Our first step was to investigate algorithm performance on simulated signals which have ground truth. The purpose of this simulation is to illustrate the proposed algorithm's efficacy. The results show that the proposed algorithm has good separation performance. Then, we used the proposed algorithm to separate real EEG signals from a memory study using a revised version of Sternberg Task. The results show that the proposed algorithm can effectively separate the non-brain and brain sources.
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
Bibliography of articles and reports on mineral-separation techniques, processes, and applications
NASA Technical Reports Server (NTRS)
Harmon, R. S.
1971-01-01
A bibliography of published articles and reports on mineral-separation techniques, processes, and applications is presented along with an author and subject index. This information is intended for use in the mineral-separation facility of the Lunar Receiving Laboratory at the NASA Manned Spacecraft Center and as an aid and reference to persons involved or interested in mineral separation.
Kawai, Kosuke; Huong, Luong Thi Mai
2017-03-01
Proper management of food waste, a major component of municipal solid waste (MSW), is needed, especially in developing Asian countries where most MSW is disposed of in landfill sites without any pretreatment. Source separation can contribute to solving problems derived from the disposal of food waste. An organic waste source separation and collection programme has been operated in model areas in Hanoi, Vietnam, since 2007. This study proposed three key parameters (participation rate, proper separation rate and proper discharge rate) for behaviour related to source separation of household organic waste, and monitored the progress of the programme based on the physical composition of household waste sampled from 558 households in model programme areas of Hanoi. The results showed that 13.8% of 558 households separated organic waste, and 33.0% discharged mixed (unseparated) waste improperly. About 41.5% (by weight) of the waste collected as organic waste was contaminated by inorganic waste, and one-third of the waste disposed of as organic waste by separators was inorganic waste. We proposed six hypothetical future household behaviour scenarios to help local officials identify a final or midterm goal for the programme. We also suggested that the city government take further actions to increase the number of people participating in separating organic waste, improve the accuracy of separation and prevent non-separators from discharging mixed waste improperly.
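The three behaviour parameters can be illustrated with the figures reported in the abstract. The raw counts below, and the exact mapping from counts to each rate, are assumptions made for illustration; only the 558-household sample size and the three percentages come from the abstract.

```python
# Hypothetical counts reproducing the abstract's reported rates.
n_households = 558
n_separating = 77                  # households separating organic waste (~13.8%)
participation_rate = n_separating / n_households

frac_improper_discharge = 0.330    # share discharging mixed waste improperly
proper_discharge_rate = 1 - frac_improper_discharge

contamination_by_weight = 0.415    # inorganic share in the 'organic' stream
proper_separation_rate = 1 - contamination_by_weight
```

Tracking the three rates separately lets officials see whether a programme is failing at recruitment (participation), at sorting quality (proper separation), or at compliance by non-participants (proper discharge).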
Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis.
Akwei-Sekyere, Samuel
2015-01-01
The distortion of biomedical signals by powerline noise from recording biomedical devices has the potential to reduce the quality and convolute the interpretations of the data. Usually, powerline noise in biomedical recordings are extinguished via band-stop filters. However, due to the instability of biomedical signals, the distribution of signals filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that utilizes this approach by decomposing the recorded signal and extracting powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th order band-stop Butterworth filter, empirical mode decomposition, independent component analysis and, a combination of empirical mode decomposition with independent component analysis. The proposed method was able to expel sinusoidal signals within powerline noise frequency range with higher fidelity in comparison with the mentioned techniques, especially at low signal-to-noise ratio.
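The "model the additive noise, then subtract it" idea can be shown with a much simpler stand-in: estimate a single sinusoidal line-noise component by least-squares projection and subtract it. This is only a minimal illustration of the additive-subtraction principle, not the paper's blind source separation plus wavelet pipeline, and all signal parameters below are invented.

```python
import numpy as np

def remove_line_noise(signal, fs, f0=50.0):
    """Estimate one sinusoidal line-noise component by least-squares
    projection onto sin/cos at frequency f0, and subtract it."""
    t = np.arange(len(signal)) / fs
    basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t)])
    coeffs, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return signal - basis @ coeffs

fs = 1000.0
t = np.arange(0, 2, 1 / fs)
clean = np.sin(2 * np.pi * 7 * t)                     # toy 'biomedical' signal
noisy = clean + 0.8 * np.sin(2 * np.pi * 50 * t + 0.3)  # add powerline noise
denoised = remove_line_noise(noisy, fs)
```

The fragility of this stand-in, a fixed sinusoidal model at exactly 50/60 Hz, is precisely the limitation the abstract's BSS-plus-wavelet approach is designed to overcome when the noise distribution drifts off the nominal frequency.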
Induced natural convection thermal cycling device
Heung, Leung Kit [Aiken, SC
2002-08-13
A device for separating gases, especially isotopes, by thermal cycling of a separation column using a pressure vessel mounted vertically and having baffled sources for cold and heat. Coils at the top are cooled with a fluid such as liquid nitrogen. Coils at the bottom are either electrical resistance coils or a tubular heat exchanger. The sources are shrouded with an insulated "top hat" and simultaneously opened and closed at the outlets to cool or heat the separation column. Alternatively, the sources for cold and heat are mounted separately outside the vessel and an external loop is provided for each circuit.
46 CFR 129.395 - Radio installations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... INSTALLATIONS Power Sources and Distribution Systems § 129.395 Radio installations. A separate circuit, with... radios, if installed, may be powered from a local lighting power source, such as the pilothouse lighting panel, provided each radio power source has a separate overcurrent protection device. ...
46 CFR 129.395 - Radio installations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... INSTALLATIONS Power Sources and Distribution Systems § 129.395 Radio installations. A separate circuit, with... radios, if installed, may be powered from a local lighting power source, such as the pilothouse lighting panel, provided each radio power source has a separate overcurrent protection device. ...
46 CFR 129.395 - Radio installations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... INSTALLATIONS Power Sources and Distribution Systems § 129.395 Radio installations. A separate circuit, with... radios, if installed, may be powered from a local lighting power source, such as the pilothouse lighting panel, provided each radio power source has a separate overcurrent protection device. ...
46 CFR 129.395 - Radio installations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... INSTALLATIONS Power Sources and Distribution Systems § 129.395 Radio installations. A separate circuit, with... radios, if installed, may be powered from a local lighting power source, such as the pilothouse lighting panel, provided each radio power source has a separate overcurrent protection device. ...
Looking inside the microseismic cloud using seismic interferometry
NASA Astrophysics Data System (ADS)
Matzel, E.; Rhode, A.; Morency, C.; Templeton, D. C.; Pyle, M. L.
2015-12-01
Microseismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Thousands of microquakes are often associated with an active site, and this cloud of microseismicity helps define the tectonically active region. By processing the data with novel geophysical techniques, we can isolate the energy sensitive to the faulting region itself. The virtual seismometer method (VSM) is a technique of seismic interferometry that provides precise estimates of the Green's function (GF) between earthquakes. In many ways the converse of ambient noise correlation, it is very sensitive to the source parameters (location, mechanism and magnitude) and to the Earth structure in the source region. In a region with 1000 microseisms, we can calculate roughly 500,000 waveforms sampling the active zone. At the same time, VSM collapses the computation domain down to the size of the cloud of microseismicity, often by 2-3 orders of magnitude. In simple terms, VSM involves correlating the waveforms from a pair of events recorded at an individual station and then stacking the results over all stations to obtain the final result. In the far field, when most of the stations in a network fall along a line between the two events, the result is an estimate of the GF between the two, modified by the source terms. In this geometry each earthquake is effectively a virtual seismometer recording all the others. When applied to microquakes, this alignment is often not met, and we also need to address the effects of the geometry between the two microquakes relative to each seismometer. Nonetheless, the technique is quite robust and highly sensitive to the microseismic cloud. Using data from the Salton Sea geothermal region, we demonstrate the power of the technique, illustrating our ability to scale it from the far field, where sources are well separated, to the near field, where their locations fall within each other's uncertainty ellipse.
VSM provides better illumination of the complex subsurface by generating precise, high frequency estimates of the GF and resolution of seismic properties between every pair of events. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344
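The correlate-then-stack step described above can be sketched in a few lines. The `virtual_seismometer` helper, array shapes, and synthetic event records below are illustrative assumptions; real VSM additionally corrects for source terms and station geometry, which this toy omits.

```python
import numpy as np

def virtual_seismometer(waveforms_a, waveforms_b):
    """Sketch of the VSM correlate-then-stack step: cross-correlate the
    records of two events at each station, then stack over stations.
    Inputs are (n_stations, n_samples) arrays."""
    n_stations, n = waveforms_a.shape
    stack = np.zeros(2 * n - 1)
    for a, b in zip(waveforms_a, waveforms_b):
        stack += np.correlate(a, b, mode="full")
    return stack / n_stations

# Toy data: event B is event A delayed by 5 samples at every station,
# plus incoherent noise. The coherent delay survives the stack as a
# correlation peak at a 5-sample lag.
rng = np.random.default_rng(2)
n_sta, n = 20, 256
src = rng.normal(size=(n_sta, n))
A = src
B = np.roll(src, 5, axis=1) + 0.5 * rng.normal(size=(n_sta, n))
xc = virtual_seismometer(A, B)
lags = np.arange(-(n - 1), n)
best_lag = lags[np.argmax(xc)]
```

Stacking is what makes the estimate robust: the inter-event travel-time signal is coherent across stations while station-specific noise is not, so it averages away.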
Review of anhydrous zirconium-hafnium separation techniques. Information circular/1984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skaggs, R.L.; Rogers, D.T.; Hunter, D.B.
1983-12-01
Sixteen nonaqueous techniques conceived to replace the current aqueous scheme for separating hafnium and zirconium tetrachlorides were reviewed and evaluated by the Bureau of Mines. The methods are divided into two classes: separation by fractional volatilization of the tetrachlorides, which takes advantage of the higher volatility of hafnium tetrachloride; and separation by chemical techniques, based on differences in chemical behavior of the two tetrachlorides. The criteria used to evaluate separation methods were temperature, pressure, separation factor per equilibrium stage, complexity, compatibility with existing technology, and potential for continuous operation. Three processes were selected as being most promising: (1) high-pressure distillation, (2) extractive distillation from a molten salt, and (3) preferential reduction of gaseous ZrCl4. Any of the proposed nonaqueous Hf-Zr separation schemes must be supplemented with additional purification to remove trace impurities.
Bullard, K M; Hietpas, P B; Ewing, A G
1998-01-01
Polymerase chain reaction (PCR)-amplified short tandem repeat (STR) samples from the HUMVWF locus have been analyzed using a unique sample introduction and separation technique. A single capillary is used to transfer samples onto an ultrathin slab gel (57 μm thick). This ultrathin nondenaturing polyacrylamide gel is used to separate the amplified fragments, and laser-induced fluorescence with ethidium bromide is used for detection. The feasibility of performing STR analysis using this system has been investigated by examining the reproducibility for repeated samples. Reproducibility is examined by comparing the migration of the 14 and 17 HUMVWF alleles on three consecutive separations on the ultrathin slab gel. Using one locus, separations match in migration time, with the two alleles 42 s apart, for each of the three consecutive separations. This technique shows potential to increase sample throughput in STR analysis, although separation resolution still needs to be improved.
Improved FastICA algorithm in fMRI data analysis using the sparsity property of the sources.
Ge, Ruiyang; Wang, Yubao; Zhang, Jipeng; Yao, Li; Zhang, Hang; Long, Zhiying
2016-04-01
As a blind source separation technique, independent component analysis (ICA) has many applications in functional magnetic resonance imaging (fMRI). Although either temporal or spatial prior information has been introduced into the constrained ICA and semi-blind ICA methods to improve the performance of ICA in fMRI data analysis, certain types of additional prior information, such as sparsity, have seldom been added to ICA algorithms as constraints. In this study, we proposed a SparseFastICA method that adds source sparsity as a constraint to the FastICA algorithm to improve the performance of the widely used FastICA. The source sparsity is estimated through a smoothed ℓ0 norm method. We performed experimental tests on both simulated data and real fMRI data to investigate the feasibility and robustness of SparseFastICA, and made a performance comparison among SparseFastICA, FastICA, and Infomax ICA. Results on both the simulated and real fMRI data demonstrated the feasibility and robustness of SparseFastICA for source separation in fMRI data, and showed that SparseFastICA has better robustness to noise and better spatial detection power than FastICA. Although the spatial detection power of SparseFastICA and Infomax did not differ significantly, SparseFastICA had a faster computation speed than Infomax. More importantly, SparseFastICA outperformed FastICA in robustness and spatial detection power and can be used to identify more accurate brain networks than the FastICA algorithm. Copyright © 2016 Elsevier B.V. All rights reserved.
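The sparsity constraint at the heart of this abstract is the smoothed ℓ0 norm, which replaces the non-differentiable count of nonzero elements with a sum of Gaussian-shaped penalties. A minimal numpy sketch of the measure (the function name and the σ value are illustrative, not taken from the paper):

```python
import numpy as np

def smoothed_l0(s, sigma=0.1):
    """Smooth approximation of the l0 'norm' (count of nonzeros).

    Each element contributes 1 - exp(-s_i^2 / (2 sigma^2)): near 0 for
    elements close to zero, near 1 for elements well above sigma, so
    the sum approaches the exact nonzero count as sigma -> 0.
    """
    s = np.asarray(s, dtype=float)
    return float(np.sum(1.0 - np.exp(-s**2 / (2.0 * sigma**2))))

# A sparse and a dense vector of equal energy: the sparse one scores lower,
# so minimizing this measure drives the estimated sources toward sparsity.
sparse = np.array([0.0, 0.0, 0.0, 2.0])
dense = np.array([1.0, 1.0, 1.0, 1.0])
print(smoothed_l0(sparse, 0.1) < smoothed_l0(dense, 0.1))  # True
```

Because the measure is differentiable, it can be folded into a FastICA-style fixed-point update as an additive gradient term, which is presumably what allows the constraint to be enforced during unmixing.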
Improving the Nulling Beamformer Using Subspace Suppression.
Rana, Kunjan D; Hämäläinen, Matti S; Vaina, Lucia M
2018-01-01
Magnetoencephalography (MEG) captures the magnetic fields generated by neuronal current sources with sensors outside the head. In MEG analysis these current sources are estimated from the measured data to identify the locations and time courses of neural activity. Since there is no unique solution to this so-called inverse problem, multiple source estimation techniques have been developed. The nulling beamformer (NB), a modified form of the linearly constrained minimum variance (LCMV) beamformer, is specifically used in the process of inferring interregional interactions and is designed to eliminate shared signal contributions, or cross-talk, between regions of interest (ROIs) that would otherwise interfere with the connectivity analyses. The nulling beamformer applies the truncated singular value decomposition (TSVD) to remove small signal contributions from a ROI to the sensor signals. However, ROIs with strong crosstalk will have high separating power in the weaker components, which may be removed by the TSVD operation. To address this issue we propose a new method, the nulling beamformer with subspace suppression (NBSS). This method, controlled by a tuning parameter, reweights the singular values of the gain matrix mapping from source to sensor space such that components with high overlap are reduced. By doing so, we are able to measure signals between nearby source locations with limited cross-talk interference, allowing for reliable cortical connectivity analysis between them. In two simulations, we demonstrated that NBSS reduces cross-talk while retaining ROIs' signal power, and has higher separating power than both the minimum norm estimate (MNE) and the nulling beamformer without subspace suppression. We also showed that NBSS successfully localized the auditory M100 event-related field in primary auditory cortex, measured from a subject undergoing an auditory localizer task, and suppressed cross-talk in a nearby region in the superior temporal sulcus.
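The contrast between the TSVD used by the original nulling beamformer and the singular-value reweighting of NBSS can be sketched as follows. The abstract does not give the exact reweighting function, so a Tikhonov-style weight s²/(s² + λ) with tuning parameter λ is assumed here purely for illustration:

```python
import numpy as np

def tsvd_filter(G, k):
    """Truncated SVD: keep only the k largest singular components,
    discarding the weak ones entirely (the original NB approach)."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    s_trunc = np.where(np.arange(len(s)) < k, s, 0.0)
    return U @ np.diag(s_trunc) @ Vt

def reweighted_filter(G, lam):
    """Smoothly down-weight weak singular components instead of
    discarding them; lam is the tuning parameter. The weight
    s^2/(s^2 + lam) is an illustrative assumption, not the paper's
    exact reweighting function."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    w = s**2 / (s**2 + lam)
    return U @ np.diag(s * w) @ Vt

rng = np.random.default_rng(0)
G = rng.standard_normal((8, 4))  # toy gain matrix: 8 sensors x 4 sources
# TSVD zeroes the weak components entirely; reweighting only shrinks them,
# preserving the weaker components that carry the separating power.
print(np.linalg.matrix_rank(tsvd_filter(G, 2)))          # 2
print(np.linalg.matrix_rank(reweighted_filter(G, 0.1)))  # 4
```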
Variational Bayesian Learning for Wavelet Independent Component Analysis
NASA Astrophysics Data System (ADS)
Roussos, E.; Roberts, S.; Daubechies, I.
2005-11-01
In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuro-scientific goal of extracting relevant "maps" from the data. This can be stated as a `blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.
Cohen, Michael X
2017-09-27
The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
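The two ingredients described here, a time-delay-embedding matrix and a generalized eigendecomposition of covariance matrices, can be sketched with numpy alone. The toy signal and the choice of a noise reference covariance are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

def delay_embed(x, n_delays):
    """Stack time-delayed copies of a 1-D signal into a matrix:
    row k holds x shifted by k samples."""
    n = len(x) - n_delays + 1
    return np.array([x[k:k + n] for k in range(n_delays)])

def gevd(S, R):
    """Generalized eigendecomposition S w = lambda R w via Cholesky
    whitening (numpy only). Returns eigenvalues in descending order
    and the corresponding generalized eigenvectors as columns."""
    L = np.linalg.cholesky(R)
    Linv = np.linalg.inv(L)
    M = Linv @ S @ Linv.T              # symmetric whitened problem
    evals, U = np.linalg.eigh(M)
    order = np.argsort(evals)[::-1]
    W = Linv.T @ U                     # map back to the original space
    return evals[order], W[:, order]

# Toy example: a 10 Hz component buried in broadband noise.
rng = np.random.default_rng(1)
t = np.arange(0, 4, 1 / 250.0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
E = delay_embed(x, n_delays=20)
S = np.cov(E)                                 # covariance of embedded data
R = np.cov(rng.standard_normal(E.shape))      # reference (noise) covariance
evals, W = gevd(S, R)
y = W[:, 0] @ E          # temporal filter: weighted sum of delayed copies
print(evals[0] > evals[-1])  # True: a dominant component exists
```

The first eigenvector is exactly the "weighted combination of successive time points" the abstract describes: applied to the embedded matrix it acts as a data-driven temporal filter, with no sinusoidal narrowband filter specified in advance.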
Improved Multiple-Species Cyclotron Ion Source
NASA Technical Reports Server (NTRS)
Soli, George A.; Nichols, Donald K.
1990-01-01
Use of pure isotope 86Kr instead of natural krypton in multiple-species ion source enables source to produce krypton ions separated from argon ions by tuning the cyclotron with which the source is used. Added capability to produce and separate krypton ions at kinetic energies of 150 to 400 MeV is necessary for simulation of worst-case ions occurring in outer space.
Binary Oscillatory Crossflow Electrophoresis
NASA Technical Reports Server (NTRS)
Molloy, Richard F.; Gallagher, Christopher T.; Leighton, David T., Jr.
1996-01-01
We present preliminary results of our implementation of a novel electrophoresis separation technique: Binary Oscillatory Crossflow Electrophoresis (BOCE). The technique utilizes the interaction of two driving forces, an oscillatory electric field and an oscillatory shear flow, to create an active binary filter for the separation of charged species. Analytical and numerical studies have indicated that this technique is capable of separating proteins with electrophoretic mobilities differing by less than 10%. With an experimental device containing a separation chamber 20 cm long, 5 cm wide, and 1 mm thick, an order of magnitude increase in throughput over commercially available electrophoresis devices is theoretically possible.
Sukholthaman, Pitchayanin; Sharp, Alice
2016-06-01
Municipal solid waste has been considered one of the most immediate and serious problems confronting urban governments in most developing and transitional economies. Solid waste service performance depends highly on the effectiveness of the waste collection and transportation process. Generally, this process involves a large amount of expenditure and has very complex and dynamic operational problems. Source separation has a major impact on the effectiveness of a waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend cause-and-effect interactions of different variables in the waste management system. A system dynamics model that envisages the relationships between source separation and the effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the result of a change in perception of waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. 'Scenario 5' gives the most promising opportunities, as 40% of residents are willing to conduct organic and recyclable waste separation. The results show that better waste collection and transportation service, lower monthly expense, extended landfill life, and a satisfactory efficiency of the provided service of 60.48% will be achieved at the end of the simulation period. Implications for how to get the public involved in source separation are proposed. Copyright © 2016 Elsevier Ltd. All rights reserved.
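The stock-and-flow logic of a system dynamics model can be illustrated with a minimal Euler-integration sketch of how the source-separation rate shrinks the mixed-waste flow that the collection fleet must haul to disposal. All numbers and the linear coupling are illustrative assumptions, not parameters from the Bangkok study:

```python
# Minimal system-dynamics sketch (Euler integration): a single stock
# (cumulative waste landfilled) fed by a flow (monthly mixed waste),
# which the source-separation rate reduces at the source.

def simulate(separation_rate, months=120, generation=10000.0):
    """Return the landfill stock (tonnes) after `months`, given the
    fraction of generated waste diverted at the source each month."""
    landfill = 0.0
    for _ in range(months):
        mixed = generation * (1.0 - separation_rate)  # flow to collection
        landfill += mixed                             # stock accumulates
    return landfill

base = simulate(separation_rate=0.0)
scenario = simulate(separation_rate=0.40)  # 40% of residents separating
print(scenario / base)  # 0.6: landfill inflow cut in proportion
```

A full system dynamics model would add feedback loops (e.g., service quality influencing participation), which is exactly the cause-and-effect structure the study explores.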
NASA Astrophysics Data System (ADS)
Pires, Carlos; Ribeiro, Andreia
2016-04-01
An efficient nonlinear method of statistical source separation of space-distributed non-Gaussian data is proposed. The method relies on so-called Independent Subspace Analysis (ISA) and is tested on a long time-series of the stream-function field of an atmospheric quasi-geostrophic 3-level model (QG3) simulating the winter monthly variability of the Northern Hemisphere. ISA generalizes Independent Component Analysis (ICA) by looking for multidimensional, minimally dependent, uncorrelated and non-Gaussian statistical sources among the rotated projections or subspaces of the multivariate probability distribution of the leading principal components of the working field, whereas ICA is restricted to scalar sources. The rationale relies upon projection pursuit: looking for data projections of enhanced interest. In order to accomplish the decomposition, we maximize measures of the sources' non-Gaussianity through contrast functions given by squares of nonlinear, cross-cumulant-based correlations involving the variables spanning the sources. Sources are therefore sought that match certain nonlinear data structures. The maximized contrast function is built in such a way that it provides the minimization of the mean square of the residuals of certain nonlinear regressions. The resulting residuals, after spherization, provide a new set of nonlinear variable changes that are at once uncorrelated, quasi-independent and quasi-Gaussian, an advantage with respect to the independent components (scalar sources) obtained by ICA, where the non-Gaussianity is concentrated into the non-Gaussian scalar sources. The new scalar sources obtained by the above process encompass the attractor's curvature, thus providing improved nonlinear model indices of the low-frequency atmospheric variability, which is useful since large circulation indices are nonlinearly correlated.
The non-Gaussian tested sources (dyads and triads, of two and three dimensions respectively) lead to a dense data concentration along certain curves or surfaces, near which the clusters' centroids of the joint probability density function tend to be located. That favors a better splitting of the QG3 atmospheric model's weather regimes: the positive and negative phases of the Arctic Oscillation and the positive and negative phases of the North Atlantic Oscillation. The leading model's non-Gaussian dyad is associated with a positive correlation between: 1) the squared anomaly of the extratropical jet-stream and 2) the meridional jet-stream meandering. Triadic sources coming from maximized third-order cross-cumulants between pairwise uncorrelated components reveal situations of triadic wave resonance and nonlinear triadic teleconnections, only possible thanks to joint non-Gaussianity. Such triadic synergies are accounted for by an information-theoretic measure: the interaction information. The dominant model's triad occurs between anomalies of: 1) the North Pole pressure, 2) the jet-stream intensity at the eastern North American boundary, and 3) the jet-stream intensity at the eastern Asian boundary. Publication supported by project FCT UID/GEO/50019/2013 - Instituto Dom Luiz.
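The third-order cross-cumulant behind these triadic sources reduces, for zero-mean pairwise-uncorrelated variables, to E[xyz]. A numpy sketch of a purely triadic dependence that no pairwise correlation can detect (the construction c = a·b is an illustrative toy, not model data):

```python
import numpy as np

def triadic_cross_cumulant(x, y, z):
    """Third-order cross-cumulant of three zero-mean variables.
    For pairwise-uncorrelated variables it reduces to E[x y z], which
    vanishes when any one variable is independent of the other two."""
    x, y, z = (np.asarray(v) - np.mean(v) for v in (x, y, z))
    return float(np.mean(x * y * z))

rng = np.random.default_rng(2)
n = 200000
a = rng.standard_normal(n)
b = rng.standard_normal(n)
c = a * b    # pairwise uncorrelated with both a and b, yet triadically tied
d = rng.standard_normal(n)  # fully independent control

print(abs(np.corrcoef(a, c)[0, 1]) < 0.02)          # True: no pairwise link
print(triadic_cross_cumulant(a, b, c) > 0.5)        # True: strong triadic link
print(abs(triadic_cross_cumulant(a, b, d)) < 0.05)  # True: independent triad
```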
Applications of seismic spatial wavefield gradient and rotation data in exploration seismology
NASA Astrophysics Data System (ADS)
Schmelzbach, C.; Van Renterghem, C.; Sollberger, D.; Häusler, M.; Robertsson, J. O. A.
2017-12-01
Seismic spatial wavefield gradient and rotation data have the potential to open up new ways to address long-standing problems in land-seismic exploration such as identifying and separating P-, S-, and surface waves. Gradient-based acquisition and processing techniques could enable replacing large arrays of densely spaced receivers by sparse, spatially compact receiver layouts or even one single multicomponent station with dedicated instruments (e.g., rotational seismometers). Such approaches to maximize the information content of single-station recordings are also of significant interest for seismic measurements at sites with limited access such as boreholes, the sea bottom, and extraterrestrial seismology. Arrays of conventional three-component (3C) geophones enable measuring not only the particle velocity in three dimensions but also estimating its spatial gradients. Because the free-surface condition allows vertical derivatives to be expressed in terms of horizontal derivatives, the full gradient tensor and, hence, the curl and divergence of the wavefield can be computed. In total, three particle-velocity components, three rotational components, and the divergence result in seven-component (7C) seismic data. Combined particle velocity and gradient data can be used to isolate the incident P- or S-waves at the land surface or the sea bottom using filtering techniques based on the elastodynamic representation theorem. Alternatively, since only S-waves exhibit rotational motion, rotational measurements can directly be used to identify S-waves. We discuss the derivations of the gradient-based filters as well as their application to synthetic and field data, demonstrating that rotational data can be of particular interest for S-wave reflection and P-to-S-wave conversion imaging. The concept of array-derived gradient estimation can be extended to source arrays as well; source arrays therefore allow us to emulate rotational (curl) and dilatational (divergence) sources.
Combined with 7C recordings, a total of 49 components of the seismic wavefield can be excited and recorded. Such data potentially allow further improvements in wavefield separation and may find application in directional imaging and coherent noise suppression.
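The horizontal part of this gradient-based decomposition can be sketched with finite differences over a regular receiver grid: the horizontal divergence is sensitive to compressional motion, while the vertical rotation component picks out S-wave (rotational) motion. This toy omits the free-surface substitution of vertical derivatives that the full 7C workflow uses; the grid and test field are illustrative:

```python
import numpy as np

def div_and_rot_z(vx, vy, dx):
    """Horizontal divergence and vertical rotation component from
    horizontal particle-velocity fields vx, vy sampled on a regular
    grid with spacing dx (rows = y, columns = x)."""
    dvx_dy, dvx_dx = np.gradient(vx, dx)
    dvy_dy, dvy_dx = np.gradient(vy, dx)
    div_h = dvx_dx + dvy_dy   # sensitive to compressional (P) motion
    rot_z = dvy_dx - dvx_dy   # sensitive to rotational (S) motion
    return div_h, rot_z

# Purely rotational test field: vx = -y, vy = x  ->  divergence 0, curl 2.
n, dx = 21, 1.0
y, x = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
div_h, rot_z = div_and_rot_z(-y, x, dx)
print(np.allclose(div_h, 0.0), np.allclose(rot_z, 2.0))  # True True
```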
Kurtosis Approach for Nonlinear Blind Source Separation
NASA Technical Reports Server (NTRS)
Duong, Vu A.; Stubberud, Allen R.
2005-01-01
In this paper, we introduce a new algorithm for blind source signal separation for post-nonlinear mixtures. The mixtures are assumed to be linearly mixed from unknown sources first and then distorted by memoryless nonlinear functions. The nonlinear functions are assumed to be smooth and can be approximated by polynomials. Both the coefficients of the unknown mixing matrix and the coefficients of the approximating polynomials are estimated by the gradient descent method, subject to higher-order statistical requirements. The results of simulation experiments presented in this paper demonstrate the validity and usefulness of our approach for nonlinear blind source signal separation.
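The higher-order statistic named in the title is typically the excess kurtosis, kurt(x) = E[x⁴] − 3(E[x²])² for zero-mean x, used as the non-Gaussianity contrast to be extremized by gradient descent. A numpy sketch of the measure itself (the distributions are illustrative):

```python
import numpy as np

def excess_kurtosis(x):
    """kurt(x) = E[x^4] - 3 (E[x^2])^2 for zero-mean x: zero for a
    Gaussian, negative for sub-Gaussian sources (e.g. uniform),
    positive for super-Gaussian sources (e.g. Laplacian)."""
    x = np.asarray(x) - np.mean(x)
    return float(np.mean(x**4) - 3.0 * np.mean(x**2) ** 2)

rng = np.random.default_rng(3)
n = 200000
gauss = rng.standard_normal(n)
uniform = rng.uniform(-1, 1, n)  # sub-Gaussian (flat) distribution
laplace = rng.laplace(size=n)    # super-Gaussian (peaky) distribution

print(abs(excess_kurtosis(gauss)) < 0.1)  # True: ~0 for Gaussian
print(excess_kurtosis(uniform) < -0.1)    # True: sub-Gaussian
print(excess_kurtosis(laplace) > 1.0)     # True: super-Gaussian
```

Since any linear mixture of independent sources is closer to Gaussian than the sources themselves, driving |kurt| away from zero recovers the original signals, which is presumably the role the kurtosis contrast plays in the post-nonlinear setting as well.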
Synchronization Tomography: Modeling and Exploring Complex Brain Dynamics
NASA Astrophysics Data System (ADS)
Fieseler, Thomas
2002-03-01
Phase synchronization (PS) plays an important role under both physiological and pathological conditions. With standard averaging techniques of MEG data, it is difficult to reliably detect cortico-cortical and cortico-muscular PS processes that are not time-locked to an external stimulus. For this reason, novel synchronization analysis techniques were developed and directly applied to MEG signals. Of course, due to the lack of inverse modeling (i.e. source localization), the spatial resolution of this approach was limited. To detect and localize cerebral PS, we here present synchronization tomography (ST): we first estimate the cerebral current source density by means of magnetic field tomography (MFT), and then apply the single-run PS analysis to the current source density in each voxel of the reconstruction space. In this way we study simulated PS, voxel by voxel, in order to determine the spatio-temporal resolution of the ST. To this end, different generators of ongoing rhythmic cerebral activity are simulated by current dipoles at different locations and directions, which are modeled by slightly detuned chaotic oscillators. MEG signals for these generators are simulated for a spherical head model and a whole-head MEG system. MFT current density solutions are calculated from these simulated signals within a hemispherical source space. We compare the spatial resolution of the ST with that of the MFT. Our results show that adjacent sources which are indistinguishable for the MFT can nevertheless be separated with the ST, provided they are not strongly phase synchronized. This clearly demonstrates the potential of combining spatial information (i.e. source localization) with temporal information for the anatomical localization of phase synchronization in the human brain.
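A common single-run phase-synchronization index is the phase-locking value (PLV), computed from the instantaneous phases of the analytic signal; the specific index used by the synchronization tomography may differ, so this numpy-only sketch is illustrative:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV in [0, 1]: 1 for a constant phase difference between the
    signals, near 0 when the phase difference drifts freely."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return float(abs(np.mean(np.exp(1j * dphi))))

t = np.arange(0, 2, 1 / 500.0)
x = np.sin(2 * np.pi * 10 * t)
y_locked = np.sin(2 * np.pi * 10 * t + 0.8)  # fixed phase lag: synchronized
y_free = np.sin(2 * np.pi * 13 * t)          # different frequency: drifting

print(phase_locking_value(x, y_locked) > 0.99)  # True
print(phase_locking_value(x, y_free) < 0.1)     # True
```

Applied voxel by voxel to MFT current-density time courses, such an index yields a tomographic map of phase synchronization.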
King, Brendon; Fanok, Stella; Phillips, Renae; Swaffer, Brooke; Monis, Paul
2015-05-15
Cryptosporidium continues to be problematic for the water industry, with risk assessments often indicating that treatment barriers may fail under extreme conditions. However, risk analyses have historically used oocyst densities and not considered either oocyst infectivity or species/genotype, which can result in an overestimation of risk if the oocysts are not human infective. We describe an integrated assay for determining oocyst density, infectivity, and genotype from a single-sample concentrate, an important advance that overcomes the need for processing multiple-grab samples or splitting sample concentrates for separate analyses. The assay incorporates an oocyst recovery control and is compatible with standard primary concentration techniques. Oocysts were purified from primary concentrates using immunomagnetic separation prior to processing by an infectivity assay. Plate-based cell culture was used to detect infectious foci, with a monolayer washing protocol developed to allow recovery and enumeration of oocysts. A simple DNA extraction protocol was developed to allow typing of any wells containing infectious Cryptosporidium. Water samples from a variety of source water and wastewater matrices, including a semirural catchment, wastewater, an aquifer recharge site, and storm water, were analyzed using the assay. Results demonstrate that the assay can reliably determine oocyst densities, infectivity, and genotype from single-grab samples for a variety of water matrices and emphasize the varying nature of Cryptosporidium risk extant throughout source waters and wastewaters. This assay should therefore enable a more comprehensive understanding of Cryptosporidium risk for different water sources, assisting in the selection of appropriate risk mitigation measures. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Ostrovski, Fernanda; McMahon, Richard G.; Connolly, Andrew J.; Lemon, Cameron A.; Auger, Matthew W.; Banerji, Manda; Hung, Johnathan M.; Koposov, Sergey E.; Lidman, Christopher E.; Reed, Sophie L.; Allam, Sahar; Benoit-Lévy, Aurélien; Bertin, Emmanuel; Brooks, David; Buckley-Geer, Elizabeth; Carnero Rosell, Aurelio; Carrasco Kind, Matias; Carretero, Jorge; Cunha, Carlos E.; da Costa, Luiz N.; Desai, Shantanu; Diehl, H. Thomas; Dietrich, Jörg P.; Evrard, August E.; Finley, David A.; Flaugher, Brenna; Fosalba, Pablo; Frieman, Josh; Gerdes, David W.; Goldstein, Daniel A.; Gruen, Daniel; Gruendl, Robert A.; Gutierrez, Gaston; Honscheid, Klaus; James, David J.; Kuehn, Kyler; Kuropatkin, Nikolay; Lima, Marcos; Lin, Huan; Maia, Marcio A. G.; Marshall, Jennifer L.; Martini, Paul; Melchior, Peter; Miquel, Ramon; Ogando, Ricardo; Plazas Malagón, Andrés; Reil, Kevin; Romer, Kathy; Sanchez, Eusebio; Santiago, Basilio; Scarpine, Vic; Sevilla-Noarbe, Ignacio; Soares-Santos, Marcelle; Sobreira, Flavia; Suchyta, Eric; Tarle, Gregory; Thomas, Daniel; Tucker, Douglas L.; Walker, Alistair R.
2017-03-01
We present the discovery and preliminary characterization of a gravitationally lensed quasar with a source redshift z_s = 2.74 and image separation of 2.9 arcsec lensed by a foreground z_l = 0.40 elliptical galaxy. Since optical observations of gravitationally lensed quasars show the lens system as a superposition of multiple point sources and a foreground lensing galaxy, we have developed a morphology-independent multi-wavelength approach to the photometric selection of lensed quasar candidates based on Gaussian Mixture Models (GMM) supervised machine learning. Using this technique and gi multicolour photometric observations from the Dark Energy Survey (DES), near-IR JK photometry from the VISTA Hemisphere Survey (VHS) and WISE mid-IR photometry, we have identified a candidate system with two catalogue components with I_AB = 18.61 and I_AB = 20.44 comprising an elliptical galaxy and two blue point sources. Spectroscopic follow-up with NTT and the use of an archival AAT spectrum show that the point sources can be identified as a lensed quasar with an emission line redshift of z = 2.739 ± 0.003 and a foreground early-type galaxy with z = 0.400 ± 0.002. We model the system as a single isothermal ellipsoid and find the Einstein radius θ_E ~ 1.47 arcsec, enclosed mass M_enc ~ 4 × 10^11 M_⊙ and a time delay of ~52 d. The relatively wide separation, month-scale time delay and high redshift make this an ideal system for constraining the expansion rate beyond a redshift of 1.
Separation of GRACE geoid time-variations using Independent Component Analysis
NASA Astrophysics Data System (ADS)
Frappart, F.; Ramillien, G.; Maisongrande, P.; Bonnet, M.
2009-12-01
Independent Component Analysis (ICA) is a blind separation method based on the simple assumptions of the independence of the sources and the non-Gaussianity of the observations. An approach based on this numerical method is used here to extract hydrological signals over land and oceans from the polluting striping noise, due to orbit repetitiveness, present in the GRACE global mass anomalies. We took advantage of the availability of monthly Level-2 solutions from three official providers (i.e., CSR, JPL and GFZ) that can be considered as different observations of the same phenomenon. The efficiency of the methodology is first demonstrated on a synthetic case. Applied to one month of GRACE solutions, it clearly separates the total water storage change from the meridional-oriented spurious gravity signals on the continents, but not on the oceans. For continental water storage, this technique gives results equivalent to those of the destriping method, with less smoothing of the hydrological patterns. This methodology is then used to filter the complete series of the 2002-2009 GRACE solutions.
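The core idea, treating the three providers' solutions as three observations of the same signal-plus-stripes mixture, can be sketched with a minimal FastICA implementation on synthetic data. The mixing matrix, the toy "stripe" waveform, and the use of a symmetric tanh-based FastICA are all illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity), numpy only.
    X: (n_observations, n_samples). Returns the unmixed sources."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))          # PCA whitening
    keep = d > 1e-10
    K = (E[:, keep] / np.sqrt(d[keep])).T
    Z = K @ X
    n = Z.shape[0]
    W = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W_new = G @ Z.T / Z.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)       # symmetric decorrelation
        W = u @ vt
    return W @ Z

# Synthetic test: a smooth 'hydrology' signal and a high-frequency
# 'stripe' signal, mixed with different weights into three providers.
t = np.linspace(0, 1, 2000)
signal = np.sin(2 * np.pi * 2 * t)
stripes = np.sign(np.sin(2 * np.pi * 60 * t))
A = np.array([[1.0, 0.6], [0.9, 0.8], [1.1, 0.4]])  # 3 providers x 2 sources
S_hat = fastica(A @ np.vstack([signal, stripes]))
# Each true source should correlate strongly with one recovered component.
c = np.abs(np.corrcoef(np.vstack([S_hat, signal, stripes]))[:2, 2:])
print(np.all(c.max(axis=0) > 0.9))  # True
```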
Experimental investigation of differential confinement effects in a rotating helicon plasma
NASA Astrophysics Data System (ADS)
Gueroult, Renaud; Evans, Eugene; Zweben, Stewart J.; Fisch, Nathaniel J.; Levinton, Fred
2014-10-01
Although plasmas have long been considered for isotope separation, challenges presented by nuclear waste remediation and nuclear spent fuel reprocessing have recently sparked a renewed interest in high-throughput plasma-based mass separation techniques. Different filter concepts relying on rotating plasmas have been proposed to address these needs. However, one of the challenges common to these concepts is the need to control the plasma rotation profile, which is generally assumed to be provided by means of dedicated electrodes. An experimental effort aiming to evaluate the practicality of these plasma filter concepts has recently been started at PPPL. For this purpose, a linear helicon plasma source is used in combination with concentric ring electrodes. Preliminary biasing experiment results indicate floating potential profiles locally suitable for mass discrimination for different gas mixtures (Ar/Ne, Ar/N2, Ar/Kr). Radially resolved spectroscopic measurements and neutral gas composition analysis at two different axial positions are being planned to assess the mass separation effect. Work supported by US DOE under Contract No. DE-AC02-09CH11466.
Phansalkar, Rasika S; Nam, Joo-Won; Chen, Shao-Nong; McAlpine, James B; Leme, Ariene A; Aydin, Berdan; Bedran-Russo, Ana-Karina; Pauli, Guido F
2018-02-02
Proanthocyanidins (PACs) find wide applications for human use including food, cosmetics, dietary supplements, and pharmaceuticals. The chemical complexity associated with PACs has triggered the development of various chromatographic techniques, with countercurrent separation (CCS) gaining in popularity. This study applied the recently developed DESIGNER (Depletion and Enrichment of Select Ingredients Generating Normalized Extract Resources) approach for the selective enrichment of trimeric and tetrameric PACs using centrifugal partition chromatography (CPC). This CPC method aims at developing PAC based biomaterials, particularly for their application in restoring and repairing dental hard tissue. A general separation scheme beginning with the depletion of polymeric PACs, followed by the removal of monomeric flavan-3-ols and a final enrichment step produced PAC trimer and tetramer enriched fractions. A successful application of this separation scheme is demonstrated for four polyphenol rich plant sources: grape seeds, pine bark, cinnamon bark, and cocoa seeds. Minor modifications to the generic DESIGNER CCS method were sufficient to accommodate the varying chemical complexities of the individual source materials. The step-wise enrichment of PAC trimers and tetramers was monitored using normal phase TLC and Diol-HPLC-UV analyses. CPC proved to be a reliable tool for the selective enrichment of medium size oligomeric PACs (OPACs). This method plays a key role in the development of dental biomaterials considering its reliability and reproducibility, as well as its scale-up capabilities for possible larger-scale manufacturing. Copyright © 2017 Elsevier B.V. All rights reserved.
Recovery of biotechnological products using aqueous two phase systems.
Phong, Win Nee; Show, Pau Loke; Chow, Yin Hui; Ling, Tau Chuan
2018-04-16
Aqueous two-phase system (ATPS) has been suggested as a promising separation tool in the biotechnological industry. This liquid-liquid extraction technique represents an interesting advance in downstream processing due to several advantages such as simplicity, rapid separation, efficiency, economy, flexibility and biocompatibility. To date, a range of biotechnological products have been successfully recovered from different sources with high yield using ATPS-based strategies. In view of the important potential contribution of ATPS to downstream processing, this review article aims to provide the latest information about the application of ATPS in the recovery of various biotechnological products over the past 7 years (2010-2017). Apart from that, the challenges as well as possible future work and the outlook of the ATPS-based recovery method are also presented in this review article. Copyright © 2018 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Supramolecular complexation for environmental control.
Albelda, M Teresa; Frías, Juan C; García-España, Enrique; Schneider, Hans-Jörg
2012-05-21
Supramolecular complexes offer a new and efficient way for the monitoring and removal of many substances emanating from technical processes, fertilization, plant and animal protection, or e.g. chemotherapy. Such pollutants range from toxic or radioactive metal ions and anions to chemical side products, herbicides, pesticides to drugs including steroids, and include degradation products from natural sources. The applications involve usually fast and reversible complex formation, due to prevailing non-covalent interactions. This is of importance for sensing as well as for separation techniques, where the often expensive host compounds can then be reused almost indefinitely. Immobilization of host compounds, e.g. on exchange resins or on membranes, and their implementation in smart new materials hold particular promise. The review illustrates how the design of suitable host compounds in combination with modern sensing and separation methods can contribute to solve some of the biggest problems facing chemistry, which arise from the everyday increasing pollution of the environment.
Local determination of thin liquid film profiles using colour interferometry.
Butler, Calum S; Seeger, Zoe L E; Bell, Toby D M; Bishop, Alexis I; Tabor, Rico F
2016-02-01
We explore theoretically the interference of white light between two interfaces as a function of the optical conditions, using separately: (a) idealised conditions where the light is composed of three discrete wavelengths; (b) a more typical experimentally realisable case where the light comprises a sum of three Gaussian wavelength distributions; and (c) unfiltered white light from a broadband source comprising a broad distribution of wavelengths. It is demonstrated that the latter case is not only optically simple to arrange, but also provides unambiguous absolute separation information over the range 0-1 μm (a useful range in studies of cell adhesion, thin liquid films and lubrication) when coupled to detection using a typical colour camera. The utility of this technique is verified experimentally by exploring the air film between a cylinder and a surface, as well as arbitrary liquid films beneath air bubbles interacting with solid surfaces.
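Case (b) above can be sketched numerically. The following Python sketch computes a two-beam interference signature for three Gaussian wavelength channels as a function of film thickness; the film index, channel centres and bandwidths are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def reflected_intensity(thickness_nm, wavelength_nm, n_film=1.33):
    # Two-beam interference between the front and back film interfaces.
    # Near-normal-incidence phase difference: delta = 4*pi*n*d/lambda;
    # the pi phase shift at one interface is folded into the sin^2 form.
    delta = 4.0 * np.pi * n_film * thickness_nm / wavelength_nm
    return np.sin(delta / 2.0) ** 2

def channel_response(thickness_nm, center_nm, sigma_nm, n_samples=51):
    # A Gaussian wavelength distribution per colour channel, as in case (b).
    wl = np.linspace(center_nm - 3 * sigma_nm, center_nm + 3 * sigma_nm, n_samples)
    weights = np.exp(-0.5 * ((wl - center_nm) / sigma_nm) ** 2)
    weights /= weights.sum()
    return float(np.sum(weights * reflected_intensity(thickness_nm, wl)))

def rgb_signature(thickness_nm):
    # Nominal camera channel centres (nm); illustrative values only.
    return tuple(channel_response(thickness_nm, c, 25.0)
                 for c in (610.0, 540.0, 460.0))
```

Scanning `rgb_signature` over 0-1000 nm shows how the three-channel colour varies with separation, which is the basis for reading an absolute film thickness off a colour camera image.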
NASA Astrophysics Data System (ADS)
Bocher, Thomas; Beuthan, Juergen; Scheller, M.; Hopf, Juergen U. G.; Linnarz, Marietta; Naber, Rolf-Dieter; Minet, Olaf; Becker, Wolfgang; Mueller, Gerhard J.
1995-12-01
Conventional laser-induced fluorescence spectroscopy (LIFS) of endogenous chromophores such as NADH (nicotinamide adenine dinucleotide, reduced form) and PP IX (protoporphyrin IX) provides information about the relative amounts of these metabolites in the observed cells. For diagnostic applications, however, the concentrations of these chromophores have to be determined quantitatively to establish tissue-independent differentiation criteria. It is well known that the individually and locally varying optical tissue parameters are major obstacles to determining the true chromophore concentrations by simple fluorescence spectroscopy. To overcome these problems, a fiber-based, two-channel technique comprising a rescaled NADH channel (delivering quantitative values) and a relative PP IX channel was developed. Using the combined information of both channels can provide good separation of tissue states. Ex-vivo studies were performed with resected samples of squamous cells, frozen in LN2, in the histologically confirmed states normal, tumor border, inflammation and hyperplasia. Each state was represented in this series by at least 7 samples. At the identical tissue spot, both the rescaled NADH fluorescence and the relative PP IX fluorescence were determined, using a nitrogen laser (337 nm, 500 ps, 200 μJ, 10 Hz) as the excitation source in the first case and a diode laser (633 nm, 15 mW, cw) in the second. In this ex-vivo study a good separation between the different tissue states was achieved. With a device constructed for clinical use, one quantitative in-vivo NADH measurement was recently performed, showing similar separation capabilities.
Systematic study of target localization for bioluminescence tomography guided radiation therapy
Yu, Jingjing; Zhang, Bin; Iordachita, Iulian I.; Reyes, Juvenal; Lu, Zhihao; Brock, Malcolm V.; Patterson, Michael S.; Wong, John W.
2016-01-01
Purpose: To overcome the limitation of CT/cone-beam CT (CBCT) in guiding radiation for soft tissue targets, the authors developed a spectrally resolved bioluminescence tomography (BLT) system for the small animal radiation research platform. The authors systematically assessed the performance of the BLT system in terms of target localization and the ability to resolve two neighboring sources in simulations, a tissue-mimicking phantom, and in vivo environments. Methods: Multispectral measurements acquired in a single projection were used for the BLT reconstruction. The incomplete variables truncated conjugate gradient algorithm with an iterative permissible-region shrinking strategy was employed as the optimization scheme to reconstruct source distributions. Simulation studies were conducted for single spherical sources with sizes from 0.5 to 3 mm radius at depths of 3–12 mm. The same configuration was also applied for the double-source simulations, with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two self-illuminated sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, two sources at 3 and 5 mm separation at a depth of 5 mm, or three sources in the abdomen were also used to illustrate the localization capability of the BLT system for multiple targets in vivo. Results: In the simulation study, approximately 1 mm accuracy was achieved in localizing the center of mass (CoM) for single sources and the grouped CoM for double-source cases. For the case of a 1.5 mm radius source, a common tumor size in preclinical studies, the simulations show that for all source separations considered, except for the 3 mm separation at 9 and 12 mm depth, the two neighboring sources can be resolved at depths from 3 to 12 mm.
Phantom experiments illustrated that 2D bioluminescence imaging failed to distinguish the two sources, but BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging: 1 and 1.7 mm accuracy were attained for the single-source case at 6 and 9 mm depth, respectively. For the two-source in vivo study, both sources could be distinguished at 3 and 5 mm separations, and approximately 1 mm localization accuracy was also achieved. Conclusions: This study demonstrated that the multispectral BLT/CBCT system could potentially be applied to localize and resolve multiple sources over a wide range of source sizes, depths, and separations. The average accuracy of localizing the CoM for single sources and the grouped CoM for double sources is approximately 1 mm, except for deep-seated targets. The information provided in this study can be instructive for devising treatment margins for BLT-guided irradiation. These results also suggest that the 3D BLT system could guide radiation in situations with multiple targets, such as metastatic tumor models. PMID:27147371
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patil, S.B.; Srivastava, P.; Mishra, S.K.
2013-07-01
Radioactive waste management is a vital aspect of any nuclear program. The commercial feasibility of the nuclear program largely depends on the efficiency of the waste management techniques. One such technique is the separation of high-yield radionuclides from the waste, making them suitable for medical and industrial applications. This provides societal benefit in addition to revenue generation. Co-60, the isotope presently used for medical applications, needs frequent replacement because of its short half-life. Cs-137, a major constituent of nuclear waste, is a suitable substitute for Co-60 as a radioactive source because of its longer half-life (28 years). The Indian nuclear waste management program has placed special emphasis on utilizing Cs-137 for such applications. In view of this, a demonstration facility has been designed for vitrification of Cs-137 in borosilicate glass, cast in stainless steel pencils, to be used as source pencils of 300 Ci strength for blood irradiation. An induction-heated metallic melter of suitable capacity has been custom designed for the application and employed in the Cs-137 pencil fabrication facility. This article describes the various systems, design features, experiments and resulting modifications, observations and remote handling features necessary for the actual operation of such a facility. The layout of the facility has been planned in such a way that it can be adopted in a hot cell for commercial production of source pencils. (authors)
Separation techniques: Chromatography
Coskun, Ozlem
2016-01-01
Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406
Using dynamic mode decomposition for real-time background/foreground separation in video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven
The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
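A minimal numpy sketch of the DMD background/foreground split described above; the rank truncation, zero-frequency tolerance and mode-amplitude fit are generic illustrative choices, not the patented implementation:

```python
import numpy as np

def dmd_background_foreground(X, rank=None, tol=1e-2):
    """Split a video matrix X (pixels x frames) into a low-rank background
    and a sparse foreground via dynamic mode decomposition (DMD).
    Modes with near-zero continuous-time frequency are treated as background."""
    X1, X2 = X[:, :-1], X[:, 1:]                       # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)  # the single SVD
    if rank is not None:                               # optional truncation
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    evals, W = np.linalg.eig(Atilde)
    Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W      # DMD modes
    omega = np.log(evals)                              # frequencies (frame spacing dt = 1)
    bg = np.abs(omega) < tol                           # near-stationary modes
    b = np.linalg.lstsq(Phi, X[:, 0], rcond=None)[0]   # mode amplitudes (one linear solve)
    t = np.arange(X.shape[1])
    dynamics = b[bg, None] * np.exp(np.outer(omega[bg], t))
    background = np.real(Phi[:, bg] @ dynamics)
    return background, X - background                  # (low-rank, sparse) parts
```

On a static scene the zero-frequency mode reconstructs every frame, so the foreground component vanishes; moving objects fall into the residual.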
Application of separable parameter space techniques to multi-tracer PET compartment modeling.
Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J
2016-02-07
Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.
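The separable parameter space idea above can be illustrated on a toy problem: a sum of two exponentials in which the amplitudes enter linearly and are solved in closed form, so only the nonlinear rates are searched exhaustively (matching the paper's exhaustive-search guarantee of the global minimum to within the grid precision). This is a generic variable-projection sketch, not the authors' PET compartment model:

```python
import numpy as np
from itertools import product

def separable_fit(t, y, k_grid):
    """Fit y(t) = a1*exp(-k1*t) + a2*exp(-k2*t) by exhaustive search over
    the nonlinear rates (k1, k2); the linear amplitudes (a1, a2) are
    obtained in closed form per candidate, reducing the fit dimension."""
    best = None
    for k1, k2 in product(k_grid, repeat=2):
        if k1 >= k2:                       # enforce ordering; skip duplicate pairs
            continue
        B = np.column_stack([np.exp(-k1 * t), np.exp(-k2 * t)])  # linear basis
        a, *_ = np.linalg.lstsq(B, y, rcond=None)                # closed-form amplitudes
        r = y - B @ a
        sse = float(r @ r)
        if best is None or sse < best[0]:
            best = (sse, (k1, k2), a)
    return best                            # (sse, rates, amplitudes)
```

Because the objective is evaluated on every grid node of the (reduced) nonlinear space, the returned minimum is global to within the grid spacing, which a gradient-descent fit of the full four-parameter problem cannot guarantee.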
Yuan, Yalin; Yabe, Mitsuyasu
2014-01-01
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences regarding an improved management strategy for household kitchen waste source separation. We determined the preferences of residents in an ad hoc sample, according to their age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show that preferences for the service attributes differ between young, middle-aged, and old residents. Low compensation is not a major factor in promoting acceptance of the proposed separation services among young and middle-aged residents. However, on average, most of them prefer services with frequent collection, evening collection, and plastic-bag attributes, and without an instructor. This study indicates that there is potential for the local government to improve the current separation services accordingly. PMID:25546279
Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model
NASA Astrophysics Data System (ADS)
Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.
2000-02-01
The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow-beam attenuation coefficients, which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other elongated 192Ir source designs, are in good agreement with corresponding accurate Monte Carlo results reported by our group and other authors.
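A minimal numerical sketch of the classical Sievert integral that the abstract starts from: the active length is divided into segments, each attenuated through the oblique filtration path of the capsule wall and weighted by inverse square distance. All dimensions and the attenuation coefficient are illustrative, not those of a real 192Ir design:

```python
import numpy as np

def sievert_dose_rate(y, z, length=3.5, mu=0.3, wall=0.25, n=200):
    """Classical Sievert integral for an idealized line source along the
    z-axis (centered at the origin), evaluated at field point (y, z) in mm.
    `mu` is an effective narrow-beam attenuation coefficient (1/mm) of the
    encapsulation and `wall` its radial thickness; illustrative values only.
    Returns a relative (unnormalized) dose rate."""
    zs = np.linspace(-length / 2, length / 2, n)   # source segment positions
    d2 = y**2 + (z - zs)**2                        # squared segment-to-point distance
    sin_theta = np.abs(y) / np.sqrt(d2)            # obliquity w.r.t. the source axis
    path = wall / np.clip(sin_theta, 1e-6, None)   # oblique filtration path in the wall
    return float(np.sum(np.exp(-mu * path) / d2) / n)
```

The longitudinal anisotropy the abstract discusses appears directly: points toward the source tips see a longer filtration path (small sin θ) and hence a lower dose rate than transverse points at the same distance.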
Fire extinguishment in oxygen enriched atmospheres
NASA Technical Reports Server (NTRS)
Robertson, A. F.; Rappaport, M. W.
1973-01-01
The current state of the art of fire suppression and extinguishment techniques in oxygen-enriched atmospheres is reviewed. Four classes of extinguishment action are considered: cooling, separation of reactants, dilution or removal of fuel, and use of chemically reactive agents. Current practice seems to favor very fast-acting water spray applied to all interior surfaces of earth-based chambers. In space, reliance has been placed on fire prevention through the removal of ignition sources and the use of nonflammable materials. Recommendations are made for further work related to fire suppression and extinguishment in oxygen-enriched atmospheres, and an extensive bibliography is appended.
Aligned and Unaligned Coherence: A New Diagnostic Tool
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
The study of combustion noise from turbofan engines has become important again as the noise from other sources, such as the fan and jet, is reduced. A method has been developed to help identify combustion noise spectra using an aligned and unaligned coherence technique. When used with the well-known three-signal coherent power method and the coherent power method, it provides new information by separating tonal information from random-process information. Examples are presented showing the underlying tonal structure buried under broadband noise and jet noise. The method is applied to data from a Pratt & Whitney PW4098 turbofan engine.
Method and apparatus for detecting and measuring trace impurities in flowing gases
Taylor, Gene W.; Dowdy, Edward J.
1979-01-01
Trace impurities in flowing gases may be detected and measured by a dynamic atomic molecular emission spectrograph utilizing as its energy source the energy transfer reactions of metastable species, atomic or molecular, with the impurities in the flowing gas. An electronically metastable species which maintains a stable afterglow is formed and mixed with the flowing gas in a region downstream from and separate from the region in which the metastable species is formed. Impurity levels are determined quantitatively by the measurement of line and/or band intensity as a function of concentration employing emission spectroscopic techniques.
Inverse boundary-layer theory and comparison with experiment
NASA Technical Reports Server (NTRS)
Carter, J. E.
1978-01-01
Inverse boundary layer computational procedures, which permit nonsingular solutions at separation and reattachment, are presented. In the first technique, which is for incompressible flow, the displacement thickness is prescribed; in the second technique, for compressible flow, a perturbation mass flow is the prescribed condition. The pressure is deduced implicitly along with the solution in each of these techniques. Laminar and turbulent computations, which are typical of separated flow, are presented and comparisons are made with experimental data. In both inverse procedures, finite difference techniques are used along with Newton iteration. The resulting procedure is no more complicated than conventional boundary layer computations. These separated boundary layer techniques appear to be well suited for complete viscous-inviscid interaction computations.
Multiple Cosmic Sources for Meteorite Macromolecules?
Watson, Jonathan S.; Meredith, William; Love, Gordon D.; Gilmour, Iain; Snape, Colin E.
2015-01-01
The major organic component in carbonaceous meteorites is an organic macromolecular material. The Murchison macromolecular material comprises aromatic units connected by aliphatic and heteroatom-containing linkages or occluded within the wider structure. The source environment of the macromolecular material remains elusive. Traditionally, attempts to determine the source have sought to identify a single environment. Here, we apply a highly efficient hydrogenolysis method to liberate units from the macromolecular material and use mass spectrometric techniques to determine their chemical structures and individual stable carbon isotope ratios. We confirm that the macromolecular material comprises a labile fraction with small aromatic units enriched in 13C and a refractory fraction made up of large aromatic units depleted in 13C. Our findings suggest that the macromolecular material may be derived from at least two separate environments. Compound-specific carbon isotope trends for aromatic compounds with carbon number may reflect mixing of the two sources. The story of the quantitatively dominant macromolecular material in meteorites appears to comprise more than one chapter. Key Words: Abiotic organic synthesis—Carbonaceous chondrite—Cosmochemistry—Meteorites. Astrobiology 15, 779–786. PMID:26418568
Locating sources within a dense sensor array using graph clustering
NASA Astrophysics Data System (ADS)
Gerstoft, P.; Riahi, N.
2017-12-01
We develop a model-free technique to identify weak sources within dense sensor arrays using graph clustering. No knowledge about the propagation medium is needed except that signal strengths decay to insignificant levels within a scale that is shorter than the aperture. We then reinterpret the spatial coherence matrix of a wave field as a matrix whose support is the connectivity matrix of a graph with sensors as vertices. In a dense network, well-separated sources induce clusters in this graph. The geographic spread of these clusters can serve to localize the sources. The support of the covariance matrix is estimated from limited-time data using a hypothesis test with a robust phase-only coherence test statistic, combined with a physical distance criterion. The latter criterion ensures graph sparsity and thus prevents clusters from forming by chance. We verify the approach and quantify its reliability on a simulated dataset. The method is then applied to data from a dense 5200-element geophone array that blanketed the city of Long Beach (CA). The analysis exposes a helicopter traversing the array and oil production facilities.
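The pipeline above (coherence threshold plus physical distance criterion, then clusters as connected components whose centroids localize sources) can be sketched with a breadth-first search over the thresholded coherence matrix. All thresholds here are illustrative placeholders, not the paper's test statistic:

```python
import numpy as np
from collections import deque

def locate_sources(coh, positions, coh_thresh=0.5, max_dist=2.0, min_size=3):
    """Localize sources from a sensor-pair coherence matrix `coh` (N x N)
    and sensor `positions` (N x 2). An edge requires both high coherence and
    physical proximity (the distance criterion keeps the graph sparse).
    Connected components with >= min_size sensors are clusters; each
    cluster's centroid estimates a source location."""
    n = len(positions)
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    adj = (coh > coh_thresh) & (d < max_dist)        # graph connectivity matrix
    np.fill_diagonal(adj, False)
    seen, centroids = np.zeros(n, bool), []
    for start in range(n):
        if seen[start]:
            continue
        comp, queue = [], deque([start])             # BFS over one component
        seen[start] = True
        while queue:
            i = queue.popleft()
            comp.append(i)
            for j in np.flatnonzero(adj[i]):
                if not seen[j]:
                    seen[j] = True
                    queue.append(j)
        if len(comp) >= min_size:                    # drop chance singletons/pairs
            centroids.append(positions[comp].mean(axis=0))
    return centroids
```

Sensors far from any source form only singleton components and are discarded by `min_size`, which mirrors the paper's sparsity argument against clusters forming by chance.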
Ceccuzzi, Silvio; Jandieri, Vakhtang; Baccarelli, Paolo; Ponti, Cristina; Schettini, Giuseppe
2016-04-01
The beam-shaping effect on the field radiated by a line source has been compared, by means of two different methods, between an ideal infinite structure constituted by two photonic crystals and an actual finite one. The lattice sums technique combined with the generalized reflection matrix method is used to rigorously investigate the radiation from the infinite photonic crystals, whereas radiation from crystals composed of a finite number of rods along the layers is analyzed using the cylindrical-wave approach. A directive radiation is observed with the line source embedded in the structure. With an increased separation distance between the crystals, a significant edge diffraction appears that provides the main radiation mechanism in the finite layout. Suitable absorbers are implemented to reduce the above-mentioned diffraction and the reflections at the boundaries, thus obtaining good agreement between radiation patterns of a localized line source coupled to finite and infinite photonic crystals, when the number of periods of the finite structure is properly chosen.
NASA Astrophysics Data System (ADS)
Zwack, Leonard M.; Paciorek, Christopher J.; Spengler, John D.; Levy, Jonathan I.
2011-05-01
Traffic within urban street canyons can contribute significantly to ambient concentrations of particulate air pollution. In these settings, it is challenging to separate within-canyon source contributions from urban and regional background concentrations given the highly variable and complex emissions and dispersion characteristics. In this study, we used continuous mobile monitoring of traffic-related particulate air pollutants to assess the contribution to concentrations, above background, of traffic in the street canyons of midtown Manhattan. Concentrations of both ultrafine particles (UFP) and fine particles (PM2.5) were measured at street level using portable instruments. Statistical modeling techniques accounting for autocorrelation were used to investigate the presence of spatial heterogeneity of pollutant concentrations as well as to quantify the contribution of within-canyon traffic sources. Measurements were also made within Central Park, to examine the impact of offsets from major roadways in this urban environment. On average, an approximate 11% increase in concentrations of UFP and 8% increase in concentrations of PM2.5 over urban background was estimated during high-traffic periods in street canyons as opposed to low traffic periods. Estimates were 8% and 5%, respectively, after accounting for temporal autocorrelation. Within Central Park, concentrations were 40% higher than background (5% after accounting for temporal autocorrelation) within the first 100 m from the nearest roadway for UFP, with a smaller but statistically significant increase for PM2.5. Our findings demonstrate the viability of a mobile monitoring protocol coupled with spatiotemporal modeling techniques in characterizing local source contributions in a setting with street canyons.
Ion current detector for high pressure ion sources for monitoring separations
Smith, R.D.; Wahl, J.H.; Hofstadler, S.A.
1996-08-13
The present invention relates generally to any application involving the monitoring of signal arising from ions produced by electrospray or other high pressure (>100 torr) ion sources. The present invention relates specifically to an apparatus and method for the detection of ions emitted from a capillary electrophoresis (CE) system, liquid chromatography, or other small-scale separation methods. And further, the invention provides a very simple diagnostic as to the quality of the separation and the operation of an electrospray source. 7 figs.
High-accuracy peak picking of proteomics data using wavelet techniques.
Lange, Eva; Gröpl, Clemens; Reinert, Knut; Kohlbacher, Oliver; Hildebrandt, Andreas
2006-01-01
A new peak picking algorithm for the analysis of mass spectrometric (MS) data is presented. It is independent of the underlying machine or ionization method, and is able to resolve highly convoluted and asymmetric signals. The method uses the multiscale nature of spectrometric data by first detecting the mass peaks in the wavelet-transformed signal before a given asymmetric peak function is fitted to the raw data. In an optional third stage, the resulting fit can be further improved using techniques from nonlinear optimization. In contrast to currently established techniques (e.g. SNAP, Apex) our algorithm is able to separate overlapping peaks of multiply charged peptides in ESI-MS data of low resolution. Its improved accuracy with respect to peak positions makes it a valuable preprocessing method for MS-based identification and quantification experiments. The method has been validated on a number of different annotated test cases, where it compares favorably in both runtime and accuracy with currently established techniques. An implementation of the algorithm is freely available in our open source framework OpenMS.
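A single-scale sketch in the spirit of the algorithm's first stage (detecting mass peaks in the wavelet-transformed signal): the spectrum is correlated with a Ricker (Mexican-hat) wavelet and local maxima of the response above a robust noise level are kept. The wavelet choice and SNR rule are generic illustrative assumptions, not the paper's multiscale algorithm:

```python
import numpy as np

def ricker(points, a):
    # Mexican-hat (Ricker) wavelet, a common choice for symmetric peak shapes.
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_peaks(signal, width=5.0, min_snr=3.0):
    """Detect peak positions by correlating the spectrum with a Ricker
    wavelet of characteristic `width`, then keeping local maxima whose
    wavelet response exceeds `min_snr` times a robust noise estimate."""
    w = ricker(int(10 * width) + 1, width)       # odd length keeps alignment exact
    resp = np.convolve(signal, w, mode='same')   # single-scale wavelet response
    noise = np.median(np.abs(resp)) + 1e-12      # robust noise-level estimate
    peaks = []
    for i in range(1, len(resp) - 1):
        if resp[i] > resp[i - 1] and resp[i] >= resp[i + 1] and resp[i] > min_snr * noise:
            peaks.append(i)
    return peaks
```

The wavelet's zero mean suppresses slowly varying baseline, which is why maxima of the transformed signal mark peak positions more robustly than maxima of the raw data.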
Fehr, M
2014-09-01
Business opportunities in the household waste sector in emerging economies still revolve around the activities of bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging-economy context. Experimentation constructed and confirmed the recycling trilogy of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions: raw waste contained 70% biodegradable and 30% inert matter, whereas source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic: only farmers would take the material, as animal feed, and no composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% source-separated biodegradable discards from landfills. © The Author(s) 2014.
Dual-band frequency selective surface with large band separation and stable performance
NASA Astrophysics Data System (ADS)
Zhou, Hang; Qu, Shao-Bo; Peng, Wei-Dong; Lin, Bao-Qin; Wang, Jia-Fu; Ma, Hua; Zhang, Jie-Qiu; Bai, Peng; Wang, Xu-Hua; Xu, Zhuo
2012-05-01
A new technique for designing a dual-band frequency selective surface with large band separation is presented. The technique is based on a delicately designed topology of L- and Ku-band microwave filters. The two band-pass responses are generated by a capacitively-loaded square-loop frequency selective surface and an aperture-coupled frequency selective surface, respectively. A Faraday cage is located between the two frequency selective surface structures to eliminate undesired couplings. Based on this technique, a dual-band frequency selective surface is designed that possesses large band separation, high selectivity, and stable performance under various incident angles and different polarizations.
NASA Technical Reports Server (NTRS)
Misra, Prabhakar; She, Yong-Bo; Zhu, Xin-Ming; King, Michael
1997-01-01
Combustion studies under both normal gravity and microgravity conditions depend a great deal on the availability and quality of the diagnostic systems used for such investigations. Microgravity phenomena are especially susceptible to even small perturbations, and therefore non-intrusive diagnostic techniques are of paramount importance for a successful understanding of reduced-gravity combustion phenomena. Several non-intrusive diagnostic techniques are available for probing and delineating normal as well as reduced-gravity combustion processes, such as Rayleigh scattering, Raman scattering, Mie scattering, velocimetry, interferometric and Schlieren techniques, emission spectroscopy and laser-induced fluorescence (LIF) spectroscopy. Our approach is to use the LIF technique as a non-intrusive diagnostic tool for the study of combustion-associated free radicals and to use the concomitant optogalvanic transitions to accomplish precise calibration of the laser wavelengths used for recording the excitation spectra of transient molecular species. In attempting to perform spectroscopic measurements on chemical intermediates, we have used conventional laser sources as well as new and novel platforms employing rare-earth-doped solid-state lasers. Conventional (commercially available) sources of tunable UV laser radiation are extremely cumbersome and energy-consuming devices that are not well suited for in-space, in-flight, or microgravity drop-tower experiments. Traditional LIF sources of tunable UV laser radiation involve, in addition to a pump laser (usually an Nd:YAG laser with an attached frequency-doubling stage), a tunable dye laser.
In turn, the dye laser has to be provided with a dye circulation system and a subsequent stage for frequency-doubling of the dye laser radiation, together with a servo-tuning system (termed the 'Autotracker') to follow the wavelength changes and also an optical system (called the 'Frequency Separator') for separation of the emanating visible and UV beams. In contrast to this approach, we have devised an alternate arrangement for recording LIF excitation spectra of free radicals (following appropriate precursor fragmentation) that utilizes a tunable rare-earth doped solid state laser system with direct UV pumping. We have designed a compact and portable tunable UV laser system incorporating features necessary for both in-space and in-flight spectroscopy experiments. For the purpose of LIF excitation, we have developed an all-solid-state tunable UV laser that employs direct pumping of the solid-state UV-active medium employing UV harmonics from a Nd:YAG laser. An optical scheme with counterpropagating photolysis and excitation beams focused by suitable lenses into a reaction vacuum chamber was employed.
Gu, Dongyu; Lazo-Portugal, Rodrigo; Fang, Chen; Wang, Zhantong; Ma, Ying; Knight, Martha; Ito, Yoichiro
2018-06-15
Centrifugal precipitation chromatography (CpC) is a powerful chromatographic technique invented in the year 2000 but so far rarely applied. The method combines dialysis, counter-current and salting-out processes. The separation rotor consists of two identical spiral channels separated by a dialysis membrane (6-8 kDa MW cut-off), in which the upper channel is eluted with an ammonium sulfate gradient and the lower channel with water, and the mixtures are separated chromatographically according to their solubility in ammonium sulfate. In the present study, the method was successfully applied to the separation and purification of R-phycoerythrin (R-PE), a protein widely used as a fluorescent probe, from the red alga Gracilaria lemaneiformis. The separation was performed with elution of ammonium sulfate from 50% to 0% in 21.5 h at a flow rate of 0.5 ml/min, while the lower channel was eluted with water at a flow rate of 0.05 ml/min after sample loading, and the column was rotated at 200 rpm. After a single run, the absorbance ratio A565/A280 (a criterion for the purity of R-PE) increased from 0.5 for the crude extract to 6.5. The purified R-PE exhibited a typical "three peaks" spectrum with absorbance maxima at 497, 538 and 565 nm. Native-PAGE showed a single protein band, and bands at 20 kDa (subunits α and β) and 30 kDa (subunit γ) were observed in SDS-PAGE analysis, consistent with the (αβ)6γ subunit composition of R-PE. The results indicate that CpC is an efficient method for obtaining high-purity protein from a complex source. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Holden, C.; Kaneko, Y.; D'Anastasio, E.; Benites, R.; Fry, B.; Hamling, I. J.
2017-11-01
The 2016 Kaikōura (New Zealand) earthquake generated large ground motions and resulted in multiple onshore and offshore fault ruptures, a profusion of triggered landslides, and a regional tsunami. Here we examine the rupture evolution using two kinematic modeling techniques based on analysis of local strong-motion and high-rate GPS data. Our kinematic models capture a complex pattern of slowly (Vr < 2 km/s) propagating rupture from south to north, with over half of the moment release occurring in the northern source region, mostly on the Kekerengu fault, 60 s after the origin time. Both models indicate rupture reactivation on the Kekerengu fault, with a time separation of 11 s between the start of the original failure and the start of the subsequent one. We further conclude that most near-source waveforms can be explained by slip on the crustal faults, with little (<8%) or no contribution from the subduction interface.
Highland, Matthew J.; Fong, Dillon D.; Ju, Guangxu; ...
2015-08-28
In-situ synchrotron x-ray scattering has been used to monitor and control the synthesis of LaGaO3 epitaxial thin films by 90° off-axis RF-magnetron sputtering. We compared films deposited from a single LaGaO3 source with those prepared by alternating deposition from separate La2O3 and Ga2O3 sources. The conditions for growth of stoichiometric films were determined by real-time monitoring of secondary phase formation as well as from features in the diffuse scatter from island formation during synthesis. Our results provide atomic-scale insight into the mechanisms taking place during reactive epitaxial growth and demonstrate how in-situ techniques can be utilized to achieve stoichiometric control in ultrathin films.
Cathode Priming vs. RF Priming for Relativistic Magnetrons
NASA Astrophysics Data System (ADS)
White, W. M.; Spencer, T. A.; Price, D.
2005-10-01
Magnetron start-oscillation time, pulsewidth and pi-mode locking are experimentally compared for RF priming versus cathode priming on the Michigan-Titan relativistic magnetron (-300 kV, 2-10 kA, 300-500 ns). Cathode priming [1, 2] is an innovative technique first demonstrated experimentally at UM. In this technique, the cathode is fabricated with N/2 emitting strips or N/2 separate cathodes (for an N-cavity magnetron), which generate the desired number of spokes for pi-mode. Cathode priming yields 13% faster startup with more reproducible pi-mode oscillation. Radio frequency (RF) priming is investigated as the baseline priming technique for magnetrons. The external priming source is a 100 kW, 3 μs pulsewidth magnetron on loan from AFRL. RF priming reduced startup delay by 15% and increased pulsewidth by 9%. [1] M.C. Jones, V.B. Neculaes, R.M. Gilgenbach, W.M. White, M.R. Lopez, Y.Y. Lau, T.A. Spencer, and D. Price, Rev. Sci. Inst., 75, 2976 (2004) [2] M.C. Jones, Doctoral Dissertation, University of Michigan, 2005
Harwood, Valerie J.; Whitlock, John; Withington, Victoria
2000-01-01
The antibiotic resistance patterns of fecal streptococci and fecal coliforms isolated from domestic wastewater and animal feces were determined using a battery of antibiotics (amoxicillin, ampicillin, cephalothin, chlortetracycline, oxytetracycline, tetracycline, erythromycin, streptomycin, and vancomycin) at four concentrations each. The sources of animal feces included wild birds, cattle, chickens, dogs, pigs, and raccoons. Antibiotic resistance patterns of fecal streptococci and fecal coliforms from known sources were grouped into two separate databases, and discriminant analysis of these patterns was used to establish the relationship between the antibiotic resistance patterns and the bacterial source. The fecal streptococcus and fecal coliform databases classified isolates from known sources with similar accuracies. The average rate of correct classification for the fecal streptococcus database was 62.3%, and that for the fecal coliform database was 63.9%. The sources of fecal streptococci and fecal coliforms isolated from surface waters were identified by discriminant analysis of their antibiotic resistance patterns. Both databases identified the source of indicator bacteria isolated from surface waters directly impacted by septic tank discharges as human. At sample sites selected for relatively low anthropogenic impact, the dominant sources of indicator bacteria were identified as various animals. The antibiotic resistance analysis technique promises to be a useful tool in assessing sources of fecal contamination in subtropical waters, such as those in Florida. PMID:10966379
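As an illustrative sketch only (the study used discriminant analysis of antibiotic resistance patterns; here a simplified nearest-centroid classifier stands in for it, and all isolate profiles and class labels are synthetic, not the paper's data), the classify-then-score workflow might look like:

```python
import numpy as np

def train_centroids(profiles, labels):
    """Mean resistance profile per source class (e.g., human, animal)."""
    classes = sorted(set(labels))
    return {c: np.mean([p for p, l in zip(profiles, labels) if l == c], axis=0)
            for c in classes}

def classify(profile, centroids):
    """Assign an isolate to the source class with the nearest centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(profile - centroids[c]))

def rate_correct_classification(profiles, labels, centroids):
    hits = sum(classify(p, centroids) == l for p, l in zip(profiles, labels))
    return hits / len(labels)

# Toy data: rows are isolates, columns are antibiotic/concentration
# combinations (1 = resistant, 0 = susceptible). Resistance rates per class
# are hypothetical.
rng = np.random.default_rng(0)
human = rng.random((20, 9)) < 0.7   # human-source isolates: high resistance
animal = rng.random((20, 9)) < 0.3  # animal-source isolates: low resistance
X = np.vstack([human, animal]).astype(float)
y = ["human"] * 20 + ["animal"] * 20

cent = train_centroids(X, y)
print(f"average rate of correct classification: "
      f"{rate_correct_classification(X, y, cent):.1%}")
```

A real analysis would hold out the isolate being classified and use linear discriminant functions rather than raw centroid distance; the structure of the database-then-classify step is the same.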
Robotic vision. [process control applications
NASA Technical Reports Server (NTRS)
Williams, D. S.; Wilf, J. M.; Cunningham, R. T.; Eskenazi, R.
1979-01-01
Robotic vision, involving the use of a vision system to control a process, is discussed. Design and selection of active sensors, which employ radio waves, sound waves, and laser light, respectively, to light up unobservable features in the scene, are considered, as are design and selection of passive sensors, which rely on external sources of illumination. The segmentation technique, by which an image is separated into different collections of contiguous picture elements having such common characteristics as color, brightness, or texture, is examined, with emphasis on the edge detection technique. The IMFEX (image feature extractor) system, which performs edge detection and thresholding at 30 frames/sec television frame rates, is described. The template matching and discrimination approaches to recognizing objects are noted. Applications of robotic vision in industry, for tasks too monotonous or too dangerous for human workers, are mentioned.
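To make the edge-detection-plus-thresholding idea concrete, here is a minimal software sketch (not the IMFEX hardware pipeline): a plain Sobel gradient followed by a threshold, applied to a synthetic bright-square image.

```python
import numpy as np

def sobel_edges(img, thresh):
    """Edge detection as two steps: Sobel gradient magnitude, then a
    threshold that marks pixels whose gradient exceeds it."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    mag = np.hypot(gx, gy)           # gradient magnitude
    return mag > thresh              # boolean edge map

# Synthetic scene: a bright square on a dark background. Edges should appear
# only along the square's boundary, not in its flat interior.
img = np.zeros((12, 12))
img[3:9, 3:9] = 1.0
edges = sobel_edges(img, thresh=1.0)
print(edges.sum(), "edge pixels")
```

The output map separates the image into edge and non-edge pixel collections, which is the first stage of the segmentation described above.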
Tavlarides, Lawrence L.; Bae, Jae-Heum
1991-01-01
A laser capillary spectrophotometric technique measures real time or near real time bivariate drop size and concentration distribution for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug from which the drop free diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes.
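The two conversions the abstract describes, slug length to free drop diameter and absorbance to solute concentration, reduce to simple formulas. A sketch follows; the bore radius, molar absorptivity, and optical path length are hypothetical values for illustration, not numbers from the paper.

```python
import math

def drop_diameter_mm(slug_length_mm, capillary_radius_mm):
    """Free diameter of the drop whose volume equals the cylindrical slug:
    (pi/6) d^3 = pi r^2 L  =>  d = (6 r^2 L)^(1/3)."""
    return (6.0 * capillary_radius_mm**2 * slug_length_mm) ** (1.0 / 3.0)

def concentration_molar(absorbance, molar_absorptivity, path_length_cm):
    """Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Hypothetical numbers: 0.25 mm bore radius, a 2 mm organic-phase slug, and
# an absorbance pulse of 0.42 through a 0.05 cm path.
d = drop_diameter_mm(2.0, 0.25)
c = concentration_molar(0.42, molar_absorptivity=1.2e3, path_length_cm=0.05)
print(f"drop free diameter {d:.3f} mm, solute concentration {c:.2e} M")
```

In the instrument, the slug length itself comes from the measured suction velocity and the transit time between the two detection points.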
A Markov model for blind image separation by a mean-field EM algorithm.
Tonazzini, Anna; Bedini, Luigi; Salerno, Emanuele
2006-02-01
This paper deals with blind separation of images from noisy linear mixtures with unknown coefficients, formulated as a Bayesian estimation problem. This is a flexible framework, where any kind of prior knowledge about the source images and the mixing matrix can be accounted for. In particular, we describe local correlation within the individual images through the use of Markov random field (MRF) image models. These are naturally suited to express the joint pdf of the sources in a factorized form, so that the statistical independence requirements of most independent component analysis approaches to blind source separation are retained. Our model also includes edge variables to preserve intensity discontinuities. MRF models have proved very effective in many visual reconstruction problems, such as blind image restoration, and allow separation and edge detection to be performed simultaneously. We propose an expectation-maximization algorithm with the mean-field approximation to derive a procedure for estimating the mixing matrix, the sources, and their edge maps. We tested this procedure on both synthetic and real images, in the fully blind case (i.e., no prior information on mixing is exploited), and found that a source model accounting for local autocorrelation is able to increase robustness against noise, even when the noise is space-variant. Furthermore, when the model closely fits the source characteristics, independence is no longer a strict requirement, and cross-correlated sources can be separated as well.
NASA Astrophysics Data System (ADS)
Miura, Hitoshi
The development of compact separation and recovery methods using selective ion-exchange techniques is very important for reprocessing and the treatment of high-level liquid wastes (HLLWs) in the nuclear backend field. Selective nuclide separation techniques are effective for the volume reduction of wastes and the utilization of valuable nuclides, and are expected to support the construction of an advanced nuclear fuel cycle system and the rationalization of waste treatment. In order to accomplish selective nuclide separation, the design and synthesis of novel adsorbents are essential for the development of compact and precise separation processes. The present paper deals with the preparation of highly functional and selective hybrid microcapsules enclosing nano-adsorbents in alginate gel polymer matrices by sol-gel methods, their characterization, and the clarification of their selective adsorption properties by batch and column methods. The selective separation of Cs, Pd and Re in real HLLW was further accomplished by using the novel microcapsules, and an advanced nuclide separation system is proposed based on the combination of selective processes using microcapsules.
Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.
Rashid, Mamoon; Stingl, Ulrich
2015-12-01
Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology. Copyright © 2015 Elsevier Inc. All rights reserved.
Kurtosis Approach Nonlinear Blind Source Separation
NASA Technical Reports Server (NTRS)
Duong, Vu A.; Stubberud, Allen R.
2005-01-01
In this paper, we introduce a new algorithm for blind source signal separation for post-nonlinear mixtures. The mixtures are assumed to be linearly mixed from unknown sources first and then distorted by memoryless nonlinear functions. The nonlinear functions are assumed to be smooth and can be approximated by polynomials. Both the coefficients of the unknown mixing matrix and the coefficients of the approximating polynomials are estimated by the gradient descent method subject to higher-order statistical requirements. The results of simulation experiments presented in this paper demonstrate the validity and usefulness of our approach for nonlinear blind source signal separation. Keywords: independent component analysis, kurtosis, higher-order statistics.
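As a rough, runnable sketch of the kurtosis-driven idea (linear mixing only; the paper's polynomial compensation of the post-nonlinear distortion is omitted, and all signals are synthetic), one extraction vector can be found by gradient ascent on the magnitude of excess kurtosis after whitening:

```python
import numpy as np

def kurtosis(y):
    """Excess kurtosis E[y^4] - 3 of standardized data (0 for a Gaussian)."""
    y = (y - y.mean()) / y.std()
    return np.mean(y**4) - 3.0

rng = np.random.default_rng(1)
n = 20000
# Two independent non-Gaussian sources: Laplacian (kurt > 0), uniform (kurt < 0)
s = np.vstack([rng.laplace(size=n), rng.uniform(-1, 1, size=n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown mixing matrix
x = A @ s

# Whiten the mixtures, then gradient-ascend |kurtosis| of w^T z over unit w
z = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(z @ z.T / n)
z = (E / np.sqrt(d)) @ E.T @ z                   # whitened: cov(z) ~ I

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    y = w @ z
    grad = 4 * (z * y**3).mean(axis=1) - 12 * w  # gradient of E[y^4] - 3|w|^4
    sign = np.sign(kurtosis(y)) or 1.0           # ascend |kurtosis|
    w += 0.01 * sign * grad
    w /= np.linalg.norm(w)                       # stay on the unit sphere

print(f"kurtosis of extracted signal: {kurtosis(w @ z):+.2f}")
```

The extracted signal converges toward one source direction, where |kurtosis| is locally maximal; deflation would then remove it and repeat for the next source.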
Dong, Jun; Ni, Mingjiang; Chi, Yong; Zou, Daoan; Fu, Chao
2013-08-01
In China, the continuously increasing amount of municipal solid waste (MSW) has created an urgent need to change the current municipal solid waste management (MSWM) system, which is based on mixed collection. A pilot program focusing on source-separated MSW collection was therefore launched in Hangzhou, China, in 2010 to lessen the related environmental loads, with greenhouse gas (GHG) emissions (as defined by the Kyoto Protocol) singled out in particular. This paper uses life cycle assessment modeling to evaluate the potential environmental improvement with regard to GHG emissions. The pre-existing MSWM system is assessed as the baseline, against which the source separation scenario is compared. Results show that source-separated collection reduces GHG emissions by 23% compared with the base scenario. In addition, composting and anaerobic digestion (AD) are suggested for further optimizing the management of food waste: landfill, composting, and AD of food waste emit 260.79, 82.21, and -86.21 thousand tonnes of GHG, respectively, demonstrating the emission reduction potential of advanced food waste treatment technologies. Accordingly, a modified MSWM system with AD as the food waste treatment option is proposed, saving an additional 44% of GHG emissions over the current source separation scenario. A preliminary economic assessment further shows that both source separation scenarios offer good cost reduction potential compared with mixed collection, with the proposed new system being the most cost-effective.
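The food-waste comparison above is simple arithmetic over the quoted figures; a sketch (using only the three emission numbers stated in the abstract, in thousand tonnes of GHG):

```python
# GHG emissions per treatment route for the food-waste fraction, as quoted
# in the abstract (thousand tonnes; AD is net-negative, i.e., a saving).
emissions = {"landfill": 260.79, "composting": 82.21, "anaerobic digestion": -86.21}

def saving_vs_landfill(option):
    """GHG saved by diverting food waste from landfill to the given route."""
    return emissions["landfill"] - emissions[option]

for opt in ("composting", "anaerobic digestion"):
    print(f"{opt}: {saving_vs_landfill(opt):.2f} thousand tonnes GHG saved")
```

This is why substituting AD for landfill delivers the largest share of the additional savings in the proposed system.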
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Xueyun; Wojcik, Roza; Zhang, Xing
Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. IMS alone is useful, but its coupling with mass spectrometry (MS) and front-end separations has been extremely beneficial for increasing measurement sensitivity, peak capacity for complex mixtures, and the scope of molecular information in biological and environmental sample analyses. Multiple studies in disease screening and environmental evaluations have even shown that these IMS-based multidimensional separations extract information not obtainable with each technique individually. This review highlights 3-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography (GC), supercritical fluid chromatography (SFC), liquid chromatography (LC), solid phase extraction (SPE), capillary electrophoresis (CE), field asymmetric ion mobility spectrometry (FAIMS), and microfluidic devices. The origination, current state, various applications, and future capabilities of these multidimensional approaches are described to provide insight into the utility and potential of each technique.
Ammonia producing engine utilizing oxygen separation
Easley, Jr., William Lanier; Coleman, Gerald Nelson [Petersborough, GB; Robel, Wade James [Peoria, IL
2008-12-16
A power system is provided having a power source, a first power source section with a first intake passage and a first exhaust passage, a second power source section with a second intake passage and a second exhaust passage, and an oxygen separator. The second intake passage may be fluidly isolated from the first intake passage.
NASA Technical Reports Server (NTRS)
Wolgemuth, D. J.; Gizang-Ginsberg, E.; Engelmyer, E.; Gavin, B. J.; Ponzetto, C.
1985-01-01
The use of a self-contained unit-gravity cell separation apparatus for the separation of populations of mouse testicular cells is described. The apparatus, a Celsep (TM), maximizes the unit area over which sedimentation occurs, reduces the amount of separation medium employed, and is quite reproducible. Cells thus isolated have been good sources for the isolation of DNA and, notably, of high-molecular-weight RNA.
Instrument intercomparison of glyoxal, methyl glyoxal and NO2 under simulated atmospheric conditions
NASA Astrophysics Data System (ADS)
Thalman, R.; Baeza-Romero, M. T.; Ball, S. M.; Borrás, E.; Daniels, M. J. S.; Goodall, I. C. A.; Henry, S. B.; Karl, T.; Keutsch, F. N.; Kim, S.; Mak, J.; Monks, P. S.; Muñoz, A.; Orlando, J.; Peppe, S.; Rickard, A. R.; Ródenas, M.; Sánchez, P.; Seco, R.; Su, L.; Tyndall, G.; Vázquez, M.; Vera, T.; Waxman, E.; Volkamer, R.
2015-04-01
The α-dicarbonyl compounds glyoxal (CHOCHO) and methyl glyoxal (CH3C(O)CHO) are produced in the atmosphere by the oxidation of hydrocarbons and emitted directly from pyrogenic sources. Measurements of ambient concentrations inform about the rate of hydrocarbon oxidation, oxidative capacity, and secondary organic aerosol (SOA) formation. We present results from a comprehensive instrument comparison effort at two simulation chamber facilities in the US and Europe that included nine instruments, and seven different measurement techniques: broadband cavity enhanced absorption spectroscopy (BBCEAS), cavity-enhanced differential optical absorption spectroscopy (CE-DOAS), white-cell DOAS, Fourier transform infrared spectroscopy (FTIR, two separate instruments), laser-induced phosphorescence (LIP), solid-phase micro extraction (SPME), and proton transfer reaction mass spectrometry (PTR-ToF-MS, two separate instruments; for methyl glyoxal only because no significant response was observed for glyoxal). Experiments at the National Center for Atmospheric Research (NCAR) compare three independent sources of calibration as a function of temperature (293-330 K). Calibrations from absorption cross-section spectra at UV-visible and IR wavelengths are found to agree within 2% for glyoxal, and 4% for methyl glyoxal at all temperatures; further calibrations based on ion-molecule rate constant calculations agreed within 5% for methyl glyoxal at all temperatures. At the European Photoreactor (EUPHORE) all measurements are calibrated from the same UV-visible spectra (either directly or indirectly), thus minimizing potential systematic bias. We find excellent linearity under idealized conditions (pure glyoxal or methyl glyoxal, R2 > 0.96), and in complex gas mixtures characteristic of dry photochemical smog systems (o-xylene/NOx and isoprene/NOx, R2 > 0.95; R2 ∼ 0.65 for offline SPME measurements of methyl glyoxal). 
The correlations are more variable in humid ambient air mixtures (RH > 45%) for methyl glyoxal (0.58 < R2 < 0.68) than for glyoxal (0.79 < R2 < 0.99). The intercepts of correlations were insignificant for the most part (below the instruments' experimentally determined detection limits); slopes further varied by less than 5% for instruments that could also simultaneously measure NO2. For glyoxal and methyl glyoxal the slopes varied by less than 12 and 17% (both 3-σ) between direct absorption techniques (i.e., calibration from knowledge of the absorption cross section). We find a larger variability among in situ techniques that employ external calibration sources (75-90%, 3-σ), and/or techniques that employ offline analysis. Our intercomparison reveals existing differences in reports about precision and detection limits in the literature, and enables comparison on a common basis by observing a common air mass. Finally, we evaluate the influence of interfering species (e.g., NO2, O3 and H2O) of relevance in field and laboratory applications. Techniques now exist to conduct fast and accurate measurements of glyoxal at ambient concentrations, and methyl glyoxal under simulated conditions. However, techniques to measure methyl glyoxal at ambient concentrations remain a challenge, and would be desirable.
Cultured Cortical Neurons Can Perform Blind Source Separation According to the Free-Energy Principle
Isomura, Takuya; Kotani, Kiyoshi; Jimbo, Yasuhiko
2015-01-01
Blind source separation is the computation underlying the cocktail party effect: a partygoer can distinguish a particular talker's voice from the ambient noise. Early studies indicated that the brain might use blind source separation as a signal processing strategy for sensory perception and numerous mathematical models have been proposed; however, it remains unclear how the neural networks extract particular sources from a complex mixture of inputs. We discovered that neurons in cultures of dissociated rat cortical cells could learn to represent particular sources while filtering out other signals. Specifically, the distinct classes of neurons in the culture learned to respond to the distinct sources after repeating training stimulation. Moreover, the neural network structures changed to reduce free energy, as predicted by the free-energy principle, a candidate unified theory of learning and memory, and by Jaynes' principle of maximum entropy. This implicit learning can only be explained by some form of Hebbian plasticity. These results are the first in vitro (as opposed to in silico) demonstration of neural networks performing blind source separation, and the first formal demonstration of neuronal self-organization under the free-energy principle. PMID:26690814
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be determined exactly with the proposed approach, regardless of how many active sources there are, as long as N ≤ 2M-1. Further discussion of the extraction of auto-term TF points is provided, and finally numerical simulation results are presented to show the superiority of the proposed algorithm in comparison with existing ones.
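For readers unfamiliar with the WVD that this method builds on, here is a minimal discrete sketch (a pseudo-WVD of a sampled analytic signal; the TF-BSS machinery itself, mixing-matrix estimation and auto-term selection, is not reproduced):

```python
import numpy as np

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution of a complex signal x:
    W[n, k] = sum_m x[n+m] * conj(x[n-m]) * exp(-2j*pi*k*m/N).
    The instantaneous correlation kernel is Hermitian in m, so W is real."""
    n_len = len(x)
    W = np.zeros((n_len, n_len), dtype=complex)
    for n in range(n_len):
        m_max = min(n, n_len - 1 - n)            # stay inside the signal
        kernel = np.zeros(n_len, dtype=complex)
        for m in range(-m_max, m_max + 1):
            kernel[m % n_len] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel)
    return W.real

# A single complex exponential concentrates along one frequency line of the
# TF plane (at bin 2*f*N because of the doubled lag in the kernel).
N = 64
t = np.arange(N)
x = np.exp(2j * np.pi * 0.25 * t)   # normalized frequency 0.25
W = wigner_ville(x)
print("peak frequency bin at mid-signal:", np.argmax(W[N // 2]))
```

For a mixture, the auto-terms of each source sit on such concentrated TF regions, which is what lets auto-term points be attributed to individual sources.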
Feasibility of Higher-Order Differential Ion Mobility Separations Using New Asymmetric Waveforms
Shvartsburg, Alexandre A.; Mashkevich, Stefan V.; Smith, Richard D.
2011-01-01
Technologies for separating and characterizing ions based on their transport properties in gases have been around for three decades. The early method of ion mobility spectrometry (IMS) distinguished ions by absolute mobility that depends on the collision cross section with buffer gas atoms. The more recent technique of field asymmetric waveform IMS (FAIMS) measures the difference between mobilities at high and low electric fields. Coupling IMS and FAIMS to soft ionization sources and mass spectrometry (MS) has greatly expanded their utility, enabling new applications in biomedical and nanomaterials research. Here, we show that time-dependent electric fields comprising more than two intensity levels could, in principle, effect an infinite number of distinct differential separations based on the higher-order terms of expression for ion mobility. These analyses could employ the hardware and operational procedures similar to those utilized in FAIMS. Methods up to the 4th or 5th order (where conventional IMS is 1st order and FAIMS is 2nd order) should be practical at field intensities accessible in ambient air, with still higher orders potentially achievable in insulating gases. Available experimental data suggest that higher-order separations should be largely orthogonal to each other and to FAIMS, IMS, and MS. PMID:16494377
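The role of waveform moments can be illustrated with the classic two-level FAIMS waveform: its time average (first moment) is zero, so there is no net first-order drift, while its nonzero third moment is what drives the second-order (FAIMS) separation. A sketch with exact rational arithmetic (the specific levels/durations are the textbook 2:1 rectangular waveform, not a waveform from this paper):

```python
from fractions import Fraction

def moments(levels_durations, orders):
    """Time-averaged <F^k> of a piecewise-constant waveform, given as
    (level, duration) pairs, for each requested order k."""
    total = sum(d for _, d in levels_durations)
    return {k: sum(Fraction(d, total) * Fraction(f) ** k
                   for f, d in levels_durations)
            for k in orders}

# Rectangular FAIMS waveform: +2 for one third of the period, -1 for two
# thirds. <F> = 0 (zero mean), <F^3> != 0 (drives 2nd-order separation).
# A higher-order method as proposed above would use more levels to also
# null <F^3> while keeping a still-higher odd moment nonzero.
faims = [(2, 1), (-1, 2)]
print(moments(faims, (1, 3, 5)))
```

Adding intensity levels gives enough degrees of freedom to zero out successive lower moments, which is the sense in which multi-level waveforms enable 3rd-, 4th-, and 5th-order separations.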
Systematic study of target localization for bioluminescence tomography guided radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Jingjing; Zhang, Bin; Reyes, Juvenal
Purpose: To overcome the limitation of CT/cone-beam CT (CBCT) in guiding radiation for soft tissue targets, the authors developed a spectrally resolved bioluminescence tomography (BLT) system for the small animal radiation research platform. The authors systematically assessed the performance of the BLT system in terms of target localization and the ability to resolve two neighboring sources in simulations, tissue-mimicking phantom, and in vivo environments. Methods: Multispectral measurements acquired in a single projection were used for the BLT reconstruction. The incomplete variables truncated conjugate gradient algorithm with an iterative permissible region shrinking strategy was employed as the optimization scheme to reconstruct source distributions. Simulation studies were conducted for single spherical sources with sizes from 0.5 to 3 mm radius at depths of 3–12 mm. The same configuration was also applied for the double-source simulations, with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two self-illuminated sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, two sources at 3 and 5 mm separation at a depth of 5 mm, or three sources in the abdomen were also used to illustrate the localization capability of the BLT system for multiple targets in vivo. Results: For the simulation study, approximately 1 mm accuracy can be achieved in localizing the center of mass (CoM) for single-source cases and the grouped CoM for double-source cases. For the case of a 1.5 mm radius source, a common tumor size used in preclinical studies, the simulations show that for all the source separations considered, except for the 3 mm separation at 9 and 12 mm depth, the two neighboring sources can be resolved at depths from 3 to 12 mm.
Phantom experiments illustrated that 2D bioluminescence imaging failed to distinguish the two sources, but BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging: 1 and 1.7 mm accuracy can be attained for the single-source case at 6 and 9 mm depth, respectively. For the two-source in vivo study, both sources can be distinguished at 3 and 5 mm separations, and approximately 1 mm localization accuracy can also be achieved. Conclusions: This study demonstrated that the multispectral BLT/CBCT system can potentially be applied to localize and resolve multiple sources over a wide range of source sizes, depths, and separations. The average accuracy of localizing the CoM for single sources and the grouped CoM for double sources is approximately 1 mm, except for deep-seated targets. The information provided in this study can be instructive for devising treatment margins for BLT-guided irradiation. These results also suggest that the 3D BLT system could guide radiation in situations with multiple targets, such as metastatic tumor models.
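The localization metric used above, the offset between the reconstruction's center of mass and the known source position, is straightforward to compute; a sketch with a hypothetical reconstructed voxel distribution (coordinates and intensities are invented for illustration):

```python
import numpy as np

def center_of_mass(coords, weights):
    """Intensity-weighted center of mass of a reconstructed source
    distribution; its offset from the implanted-source position is the
    localization error reported in BLT accuracy studies."""
    w = np.asarray(weights, float)
    return (np.asarray(coords, float) * w[:, None]).sum(axis=0) / w.sum()

# Hypothetical reconstruction: voxel centers (mm) and recovered intensities
# for a source nominally at (10, 10, 6).
voxels = [(9.5, 10.0, 6.0), (10.5, 10.0, 6.0), (10.0, 9.0, 6.5), (10.0, 11.0, 5.5)]
vals = [2.0, 2.0, 1.0, 1.0]
com = center_of_mass(voxels, vals)
err = np.linalg.norm(com - np.array([10.0, 10.0, 6.0]))
print(f"CoM {com}, localization error {err:.2f} mm")
```

For double-source cases, the same computation over the voxels assigned to each source (or over both jointly, for the grouped CoM) yields the reported metrics.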
SU-E-T-564: Multi-Helix Rotating Shield Brachytherapy for Cervical Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dadkhah, H; Wu, X; Flynn, R
Purpose: To present a novel and practical brachytherapy technique, called multi-helix rotating shield brachytherapy (H-RSBT), for the precise positioning of a partial shield in a curved applicator. H-RSBT enables RSBT delivery using only translational motion of the radiation source/shield combination. H-RSBT overcomes the challenges associated with previously proposed RSBT approaches based on a serial (S-RSBT) step-and-shoot delivery technique, which required independent translational and rotational motion. Methods: A Fletcher-type applicator, compatible with the combination of a Xoft Axxent™ electronic brachytherapy source and a 0.5 mm thick tungsten shield, is proposed. The wall of the applicator contains six evenly-spaced helical keyways that rigidly define the emission direction of the shield as a function of depth. The shield contains three protruding keys and is attached to the source such that it rotates freely. S-RSBT and H-RSBT treatment plans with 180° and 45° azimuthal emission angles were generated for five cervical cancer patients representative of a wide range of high-risk clinical target volume (HR-CTV) shapes and applicator positions. The number of beamlets used in the treatment planning process was kept nearly constant for S-RSBT and H-RSBT by using dwell positions separated by 5 and 1.7 mm, respectively, and emission directions separated by 22.5° and 60°, respectively. For all the treatment plans, the EQD2 of the HR-CTV was escalated until the EQD2cc tolerance of either the bladder, rectum, or sigmoid colon was reached. Results: Treatment times for H-RSBT tended to be shorter than for S-RSBT, with changes of −38.47% to 1.12% and an average of −8.34%. The HR-CTV D90 changed by −8.81% to 2.08%, with an average of −2.46%. Conclusion: H-RSBT is a mechanically feasible technique for the curved applicators needed in cervical cancer brachytherapy.
S-RSBT and H-RSBT dose distributions were clinically equivalent for all patients considered, with the H-RSBT deliveries tending to be faster. Ryan Flynn has ownership interest in pxAlpha, LLC, a startup company developing a rotating shield brachytherapy system.
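The claim that the beamlet count was kept nearly constant follows directly from the quoted dwell spacings and emission-direction steps; a small check of that bookkeeping:

```python
# Beamlets per unit dwell length = (emission directions per full rotation)
# divided by the dwell-position spacing, using the parameters quoted above.
def beamlets_per_mm(dwell_spacing_mm, emission_step_deg):
    return (360.0 / emission_step_deg) / dwell_spacing_mm

s_rsbt = beamlets_per_mm(5.0, 22.5)   # S-RSBT: 5 mm dwell steps, 22.5 deg
h_rsbt = beamlets_per_mm(1.7, 60.0)   # H-RSBT: 1.7 mm dwell steps, 60 deg
print(f"S-RSBT {s_rsbt:.2f} vs H-RSBT {h_rsbt:.2f} beamlets/mm")
```

Both come out near 3.2-3.5 beamlets per mm of dwell length, which makes the two planning problems comparable in size and the dosimetric comparison fair.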
NASA Astrophysics Data System (ADS)
Kwon, Hyuk Taek
Propylene/propane separation is one of the most challenging separations, currently achieved by energy-intensive cryogenic distillation. Despite the great potential for energy-efficient membrane-based propylene/propane separation processes, no commercial membranes are available due to the limitations (i.e., low selectivity) of current polymeric materials. Zeolitic imidazolate frameworks (ZIFs) are promising membrane materials, primarily due to their well-defined ultra-micropores with controllable surface chemistry along with their relatively high thermal/chemical stabilities. In particular, ZIF-8, with an effective aperture size of ~4.0 Å, has been shown to be very promising for propylene/propane separation. Despite the extensive research on ZIF-8 membranes, only a few ZIF-8 membranes have displayed good propylene/propane separation performance, presumably due to the challenges of controlling the microstructures of polycrystalline membranes. Since membrane microstructures are greatly influenced by processing techniques, it is critically important to develop new techniques. In this dissertation, three state-of-the-art ZIF membrane synthesis techniques are developed. The first is a one-step in-situ synthesis technique based on the concept of counter diffusion. The technique enabled us to obtain highly propylene-selective ZIF-8 membranes in less than a couple of hours with exceptional mechanical strength. Most importantly, due to the nature of the counter-diffusion concept, the new method offered unique opportunities such as healing defective (i.e., poorly-intergrown) membranes as well as significantly reducing the consumption of costly ligands and organic solvents. The second is a microwave-assisted seeding technique. Using this new seeding technique, we were able to prepare seeded supports with a high packing density in a couple of minutes, which were subsequently grown into highly propylene-selective ZIF-8 membranes with an average propylene/propane selectivity of ~40.
The last is a heteroepitaxial growth technique. The first well-intergrown ZIF-67 (Co-substituted ZIF-8) membranes were prepared by heteroepitaxially growing ZIF-67 on ZIF-8 seed layers. The ZIF-67 membranes exhibited impressively high propylene/propane separation capabilities. The presence of a methanol co-solvent in the growth solution was critically important for reproducibly preparing high-quality ZIF-67 membranes. Furthermore, when a tertiary growth of ZIF-8 layers was applied to the ZIF-67 membranes, the membranes exhibited unprecedentedly high propylene/propane separation factors of ~200, possibly due to an enhanced grain boundary structure.
Application of separable parameter space techniques to multi-tracer PET compartment modeling
Zhang, Jeff L; Morey, A Michael; Kadrmas, Dan J
2016-01-01
Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg–Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models. PMID:26788888
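The separable least-squares idea above can be sketched on a toy model: in a sum-of-exponentials curve the amplitudes enter linearly and can be solved in closed form for every candidate set of decay rates, so the exhaustive (global) search runs only over the nonlinear rates. The two-exponential model, grid, and variable names below are illustrative, not the paper's actual compartment equations.

```python
import numpy as np
from itertools import combinations

def fit_separable(t, y, lam_grid):
    """Exhaustive search over the nonlinear decay rates; for each candidate
    pair, the linear amplitudes are obtained in closed form by least squares."""
    best_res, best_lams, best_amps = np.inf, None, None
    for lams in combinations(lam_grid, 2):        # ordered pairs from the grid
        B = np.exp(-np.outer(t, lams))            # basis matrix, shape (nt, 2)
        amps, *_ = np.linalg.lstsq(B, y, rcond=None)
        res = float(np.sum((B @ amps - y) ** 2))
        if res < best_res:
            best_res, best_lams, best_amps = res, lams, amps
    return best_res, best_lams, best_amps

# noise-free toy time-activity curve whose true rates lie on the search grid
grid = np.linspace(0.05, 2.0, 40)                 # step 0.05
t = np.linspace(0.0, 10.0, 50)
y = 2.0 * np.exp(-grid[5] * t) + 1.0 * np.exp(-grid[29] * t)   # rates 0.3, 1.5
res, lams, amps = fit_separable(t, y, grid)
```

Because the fit is linear once the rates are fixed, the grid search visits only ~800 candidate pairs here, each solved exactly; this is what makes exhaustive global search tractable in the reduced nonlinear space.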
Design methodology for integrated downstream separation systems in an ethanol biorefinery
NASA Astrophysics Data System (ADS)
Mohammadzadeh Rohani, Navid
Energy security and environmental concerns have been the main drivers of a historic shift to biofuel production in the transportation fuel industry. Biofuels should not only offer environmental advantages over the petroleum fuels they replace but should also be economically sustainable and viable. The so-called second-generation biofuels such as ethanol, the most produced biofuel, are mostly derived from lignocellulosic biomass. These biofuels are more difficult to produce than first-generation ones, mainly due to the recalcitrance of the feedstocks when extracting their sugar content. Costly pre-treatment and fractionation stages are required to break down lignocellulosic feedstocks into their constituent elements. On the other hand, the mixture produced in the fermentation step of a biorefinery contains a very low amount of product, which makes the subsequent separation step more difficult and more energy consuming. In an ethanol biorefinery, recovering the product from the dilute fermentation broth by conventional distillation incurs a huge operating cost in downstream separation. Moreover, the non-ideal nature of the ethanol-water mixture, which forms an azeotrope at almost 95 wt%, hinders the attainment of fuel-grade ethanol (99.5 wt%). Therefore, an additional dehydration stage is necessary to purify the ethanol from its azeotropic composition to fuel-grade purity. In order to overcome the constraint pertaining to the vapor-liquid equilibrium of ethanol-water separation, several techniques have been investigated and proposed in the industry. These techniques, such as membrane-based technologies and extraction, have not only sought to produce pure fuel-grade ethanol but have also aimed at decreasing the energy consumption of this energy-intensive separation.
Decreasing the energy consumption of an ethanol biorefinery is of paramount importance in improving its overall economics, in facilitating the displacement of petroleum transportation fuels and in attaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, ensure that heat sources within a plant supply the demand internally and decrease external utility consumption. Therefore, adopting energy integration is one of the options biorefinery technology owners can consider in their process development, as well as in their business model, in order to improve their overall economics. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration, and integrated with a Kraft process. The energy consumption and capital cost of the separation techniques are assessed in each scenario, the costs and benefits of integration are determined, and the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis.
This systematic method, originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In the stand-alone scenario with internal integration, the conventional pinch method is used for energy analysis. The results reveal the importance of energy integration in reducing energy consumption. They also show that in an ethanol biorefinery, adopting energy integration in the conventional distillation separation achieves greater energy savings than the other alternative techniques. This in turn suggests that new alternative technologies, which imply big risks for the company, might not be needed for reducing energy consumption as long as internal and external integration is incorporated in the business model of an ethanol biorefinery. It is also noteworthy that the methodology developed in this work can be extended in future work to cover a whole biorefinery system. (Abstract shortened by UMI.)
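The pinch analysis mentioned above can be illustrated with the problem-table cascade, which computes minimum hot- and cold-utility targets from stream data. This is a generic textbook sketch, not the thesis's Bridge analysis; the four-stream example is a classic one from the pinch literature.

```python
def problem_table(hot_streams, cold_streams, dTmin=10.0):
    """Minimum hot/cold utility targets via the problem-table cascade.
    Streams are (T_supply, T_target, CP) with CP in MW/K."""
    shifted = []
    for Ts, Tt, CP in hot_streams:                 # hot streams shift down
        shifted.append((Ts - dTmin / 2, Tt - dTmin / 2, CP, +1))
    for Ts, Tt, CP in cold_streams:                # cold streams shift up
        shifted.append((Ts + dTmin / 2, Tt + dTmin / 2, CP, -1))
    temps = sorted({T for s in shifted for T in s[:2]}, reverse=True)
    cascade = [0.0]
    for T_hi, T_lo in zip(temps, temps[1:]):
        dH = 0.0
        for Ts, Tt, CP, sign in shifted:
            lo, hi = min(Ts, Tt), max(Ts, Tt)
            if lo <= T_lo and hi >= T_hi:          # stream spans this interval
                dH += sign * CP * (T_hi - T_lo)    # + surplus, - deficit
        cascade.append(cascade[-1] + dH)
    q_hot = max(0.0, -min(cascade))                # heat added at the top
    q_cold = cascade[-1] + q_hot                   # heat rejected at the bottom
    return q_hot, q_cold

# classic four-stream example from the pinch-analysis literature (dTmin = 10 K)
hot = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]
cold = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]
q_hot, q_cold = problem_table(hot, cold)
```

For this example the cascade's most negative entry is −7.5 MW, giving minimum utilities of 7.5 MW hot and 10 MW cold; internal heat exchange supplies the rest of the demand, which is exactly the saving the thesis exploits.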
Noise suppression in surface microseismic data
Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth S.; Davidson, Michael
2012-01-01
We introduce a passive noise suppression technique, based on the τ − p transform. In the τ − p domain, one can separate microseismic events from surface noise based on distinct characteristics that are not visible in the time-offset domain. By applying the inverse τ − p transform to the separated microseismic event, we suppress the surface noise in the data. Our technique significantly improves the signal-to-noise ratios of the microseismic events and is superior to existing techniques for passive noise suppression in the sense that it preserves the waveform.
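The separation property of the τ − p (slant-stack) domain can be sketched directly: a plane-wave event with slowness p0 stacks coherently only in the column p = p0, concentrating its energy there while other arrivals spread out. The gather size, slowness grid, and spike event below are illustrative, not the paper's field data.

```python
import numpy as np

def tau_p(data, dt, offsets, slownesses):
    """Forward slant stack: sum every trace along the lines t = tau + p*x."""
    nt = data.shape[0]
    t = np.arange(nt) * dt
    out = np.zeros((nt, len(slownesses)))
    for j, p in enumerate(slownesses):
        for k, x in enumerate(offsets):
            # linearly interpolate the shifted trace onto the tau axis
            out[:, j] += np.interp(t + p * x, t, data[:, k], left=0.0, right=0.0)
    return out

# synthetic gather: a single plane-wave event with slowness p0
nt, dt = 256, 0.004
offsets = np.arange(24) * 10.0                    # metres
p0, t0 = 0.002, 0.1                               # s/m, s
data = np.zeros((nt, offsets.size))
for k, x in enumerate(offsets):
    data[int(round((t0 + p0 * x) / dt)), k] = 1.0

slownesses = np.linspace(0.0, 0.004, 21)          # p0 sits at index 10
tp = tau_p(data, dt, offsets, slownesses)
energy = (tp ** 2).sum(axis=0)                    # stack energy per slowness
```

Muting everything except the slowness band around the event and applying the inverse transform is the suppression step described in the abstract; an inverse operator is omitted here for brevity.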
Baktemur, Gökhan; Taşkın, Hatıra; Büyükalaca, Saadet
2013-01-01
Irradiated pollen technique is the most successful haploidization technique within Cucurbitaceae. After harvesting fruits pollinated with irradiated pollen, the classical method called “inspecting the seeds one by one” is used to find haploid embryos in the seeds. In this study, different methods were used to extract the embryos more easily, quickly, economically, and effectively. “Inspecting the seeds one by one” was used as the control treatment. The other four methods tested were “sowing seeds directly in nutrient media,” “inspecting seeds in the light source,” “floating seeds on liquid media,” and “floating seeds on liquid media after surface sterilization.” Y2 and Y3 melon genotypes selected from the third backcross population of Yuva were used as plant material. Results of this study show that there is no statistically significant difference among the methods “inspecting the seeds one by one,” “sowing seeds directly in CP nutrient media,” and “inspecting seeds in the light source,” although the average number of embryos per fruit is slightly different. No embryo production was obtained from liquid culture because of infection. When considered together with labor costs and time required for embryo rescue, the best methods were “sowing seeds directly in the CP nutrient media” and “inspecting seeds in the light source.” PMID:23818825
Hemasa, Ayman L.; Maher, William A.; Ghanem, Ashraf
2017-01-01
Carbon nanotubes (CNTs) possess unique mechanical, physical, electrical and absorbability properties, coupled with their nanometer dimensional scale, that render them extremely valuable for applications in many fields including nanotechnology and chromatographic separation. The aim of this review is to provide an updated overview of the applications of CNTs in chiral and achiral separations of pharmaceuticals, biologics and chemicals. Chiral single-walled carbon nanotubes (SWCNTs) and multi-walled carbon nanotubes (MWCNTs) have been directly applied to the enantioseparation of pharmaceuticals and biologicals by using them as stationary or pseudostationary phases in chromatographic separation techniques such as high-performance liquid chromatography (HPLC), capillary electrophoresis (CE) and gas chromatography (GC). Achiral MWCNTs have been used for achiral separations as efficient sorbents in solid-phase extraction of biochemicals and drugs. Achiral SWCNTs have been applied in the achiral separation of biological samples. Achiral SWCNTs and MWCNTs have also been successfully used to separate achiral mixtures of pharmaceuticals and chemicals. Collectively, functionalized CNTs have been indirectly applied in separation science by enhancing the enantioseparation of different chiral selectors, whereas non-functionalized CNTs have shown efficient capabilities for chiral separations through techniques such as encapsulation or immobilization in polymer monolithic columns. PMID:28718832
Blecha, Kevin A.; Alldredge, Mat W.
2015-01-01
Animal space use studies using GPS collar technology are increasingly incorporating behavior-based analysis of spatio-temporal data in order to expand inferences of resource use. GPS location cluster analysis is one such technique, applied to large carnivores to identify the timing and location of feeding events. For logistical and financial reasons, researchers often implement predictive models for identifying these events. We present two separate improvements for predictive models that future practitioners can implement. Thus far, feeding prediction models have incorporated a small range of covariates, usually limited to spatio-temporal characteristics of the GPS data. Using GPS-collared cougars (Puma concolor), we include activity-sensor data as an additional covariate to increase the prediction performance for feeding presence/absence. Integral to the predictive modeling of feeding events is a ground-truthing component, in which GPS location clusters are visited by human observers to confirm the presence or absence of feeding remains. Failing to account for sources of ground-truthing false absences can bias the number of predicted feeding events low. We therefore account for some ground-truthing error sources directly in the model with covariates and when applying model predictions. Accounting for these errors resulted in a 10% increase in the number of clusters predicted to be feeding events. Using a double-observer design, we show that the ground-truthing false-absence rate is relatively low (4%) with a search delay of 2–60 days. Overall, we provide two separate improvements to GPS cluster analysis techniques that can be expanded upon and implemented in future studies interested in identifying feeding behaviors of large carnivores. PMID:26398546
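The kind of feeding-event prediction described above can be sketched with a logistic regression that includes an activity-sensor covariate, followed by a simple inflation of the predicted event count for a known ground-truthing false-absence rate. The covariates, effect sizes, and correction form below are hypothetical illustrations, not the study's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 800
# hypothetical GPS-cluster covariates (invented for illustration)
duration_h = rng.exponential(12.0, n)     # hours spent at the cluster
night_frac = rng.uniform(0.0, 1.0, n)     # fraction of fixes at night
activity = rng.normal(0.0, 1.0, n)        # collar activity-sensor summary
logit = -2.5 + 0.15 * duration_h + 1.2 * night_frac + 0.8 * activity
feeding = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([duration_h, night_frac, activity])
model = LogisticRegression(max_iter=1000).fit(X, feeding)
p_hat = model.predict_proba(X)[:, 1]

# inflate the predicted event count for a known ground-truthing
# false-absence rate (the paper reports roughly 4%)
false_absence = 0.04
n_events_adjusted = p_hat.sum() / (1.0 - false_absence)
```

The division by (1 − false-absence rate) is one simple way to correct a count for imperfect detection at ground-truthed clusters; the paper's own correction enters both the model covariates and the prediction step.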
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2007-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having their strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running each algorithm a number of times with different initializations and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
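The BSS setting can be illustrated with scikit-learn's FastICA, one of the algorithms compared above, on a toy two-source mixture; real fMRI analyses operate on spatial maps through a group ICA pipeline, which this sketch does not attempt.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0.0, 8.0, 2000)
s1 = np.sign(np.sin(3.0 * t))               # square-wave source
s2 = np.sin(5.0 * t)                        # sinusoidal source
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.5], [0.4, 1.0]])      # mixing matrix
X = S @ A.T                                  # two observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                 # recovered up to order/sign/scale

# match each estimate to its best-correlated true source
corr = np.corrcoef(S_hat.T, S.T)[:2, 2:]
```

The order, sign, and scale of the recovered components are indeterminate, which is why the correlation matching step is needed; this same indeterminacy is why iterative algorithms such as FastICA must be checked for run-to-run consistency.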
Performance of blind source separation algorithms for fMRI analysis using a group ICA method.
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D
2007-06-01
Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
Rigamonti, L; Grosso, M; Giugliano, M
2009-02-01
This life cycle assessment study analyses material and energy recovery within integrated municipal solid waste (MSW) management systems, and, in particular, the recovery of the source-separated materials (packaging and organic waste) and the energy recovery from the residual waste. The recovery of materials and energy are analysed together, with the final aim of evaluating possible optimum levels of source-separated collection that lead to the most favourable energetic and environmental results; this method allows identification of an optimum configuration of the MSW management system. The results show that the optimum level of source-separated collection is about 60% when all the materials are recovered with high efficiency; it decreases to about 50% when the 60% level is reached as a result of a very high recovery efficiency for organic fractions at the expense of the packaging materials, or when this implies an appreciable reduction in the quality of the collected materials. The optimum MSW management system is thus characterized by source-separated collection levels in the above-indicated range, with subsequent recycling of the separated materials and energy recovery from the residual waste in a large-scale incinerator operating in combined heat and power mode.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haeberli, W.
1981-04-01
This paper presents a survey of methods, commonly in use or under development, to produce beams of polarized negative ions for injection into accelerators. A short summary recalls how the hyperfine interaction is used to obtain nuclear polarization in beams of atoms. Atomic-beam sources for light ions are discussed. If the best presently known techniques are incorporated in all stages of the source, polarized H⁻ and D⁻ beams in excess of 10 μA can probably be achieved. Production of polarized ions from fast (keV) beams of polarized atoms is treated separately for atoms in the H(2S) excited state (Lamb-shift source) and atoms in the H(1S) ground state. The negative ion beam from Lamb-shift sources has reached a plateau just above 1 μA, but this beam current is adequate for many applications and the somewhat lower beam current is compensated by other desirable characteristics. Sources using fast polarized ground-state atoms are in a stage of intense development. The next sections summarize the production of polarized heavy ions by the atomic beam method, which is well established, and by optical pumping, which has recently been demonstrated to yield very large nuclear polarization. A short discussion of proposed ion sources for polarized ³He⁻ ions is followed by some concluding remarks.
Separating Turbofan Engine Noise Sources Using Auto and Cross Spectra from Four Microphones
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2008-01-01
The study of core noise from turbofan engines has become more important as noise from other sources, such as the fan and jet, has been reduced. A multiple-microphone and acoustic-source modeling method to separate correlated and uncorrelated sources is discussed. The auto- and cross spectra in the frequency range below 1000 Hz are fitted with a noise propagation model based on a source couplet, consisting of a single incoherent monopole source with a single coherent monopole source, or a source triplet, consisting of a single incoherent monopole source with two coherent monopole point sources. Examples are presented using data from a Pratt & Whitney PW4098 turbofan engine. The method separates the low-frequency jet noise from the core noise at the nozzle exit. It is shown that at low power settings the core noise is a major contributor to the noise; even at higher power settings, it can be more important than jet noise. However, at low frequencies, uncorrelated broadband noise and jet noise become the dominant factors as the engine power setting is increased.
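The underlying idea, that the two microphones share a correlated (core-like) component while their local noise is uncorrelated, can be sketched with magnitude-squared coherence estimated from auto- and cross spectra. This is a simplification of the paper's couplet/triplet model fitting; the signals and levels below are illustrative.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, n = 4096, 1 << 17
core = rng.standard_normal(n)                # correlated source at both mics
x1 = core + rng.standard_normal(n)           # mic 1: core + local noise
x2 = core + rng.standard_normal(n)           # mic 2: core + local noise

f, Gxx = signal.welch(x1, fs=fs, nperseg=1024)
_, Gyy = signal.welch(x2, fs=fs, nperseg=1024)
_, Gxy = signal.csd(x1, x2, fs=fs, nperseg=1024)
coh = np.abs(Gxy) ** 2 / (Gxx * Gyy)         # magnitude-squared coherence
coherent_power = coh * Gxx                   # correlated part of the mic-1 PSD
```

With equal core and local-noise power, the expected coherence is S²/(S+N)² = 0.25, so the coherent output power isolates roughly a quarter of each auto-spectrum here; the paper's propagation-model fit refines this separation below 1000 Hz.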
Advanced Background Subtraction Applied to Aeroacoustic Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Bahr, Christopher J.; Horne, William C.
2015-01-01
An advanced form of background subtraction is presented and applied to aeroacoustic wind tunnel data. A variant of this method has seen use in other fields such as climatology and medical imaging. The technique, based on an eigenvalue decomposition of the background noise cross-spectral matrix, is robust in situations where isolated background auto-spectral levels are measured to be higher than the levels of combined source and background signals. It also provides an alternate estimate of the cross-spectrum, which previously might have had poor definition for low signal-to-noise ratio measurements. Simulated results indicate similar performance to conventional background subtraction when the subtracted spectra are weaker than the true contaminating background levels. Superior performance is observed when the subtracted spectra are stronger than the true contaminating background levels. Experimental results show limited success in recovering signal behavior for data where conventional background subtraction fails. They also demonstrate the new subtraction technique's ability to maintain a proper coherence relationship in the modified cross-spectral matrix. Beamforming and deconvolution results indicate that the method can successfully separate sources. Results also show a reduced need for the use of diagonal removal in phased array processing, at least for the limited data sets considered.
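One plausible formulation of eigenvalue-based cross-spectral matrix (CSM) background subtraction, sketched below, subtracts the background CSM and then clips negative eigenvalues so the result stays positive semidefinite; this is a hedged illustration of the idea, not necessarily the authors' exact algorithm.

```python
import numpy as np

def psd_background_subtract(G_total, G_bkg):
    """Subtract a background CSM, then project the difference back onto the
    positive-semidefinite cone by clipping negative eigenvalues."""
    D = G_total - G_bkg
    D = (D + D.conj().T) / 2.0                 # enforce Hermitian symmetry
    w, V = np.linalg.eigh(D)
    w = np.clip(w, 0.0, None)                  # discard nonphysical negatives
    return (V * w) @ V.conj().T

# toy CSM: one coherent source (rank-1) plus uncorrelated sensor noise
rng = np.random.default_rng(0)
v = rng.standard_normal(8) + 1j * rng.standard_normal(8)   # source steering vector
G_sig = np.outer(v, v.conj())
G_bkg = 0.5 * np.eye(8)
G_est = psd_background_subtract(G_sig + G_bkg, G_bkg)
```

Unlike element-wise subtraction, the eigenvalue clipping guarantees that the corrected CSM remains a valid covariance even when measured background levels exceed the combined levels in some bins, which is the failure mode the abstract highlights.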
An Analysis of Fundamental Mode Surface Wave Amplitude Measurements
NASA Astrophysics Data System (ADS)
Schardong, L.; Ferreira, A. M.; van Heijst, H. J.; Ritsema, J.
2014-12-01
Seismic tomography is a powerful tool to decipher the Earth's interior structure at various scales. Traveltimes of seismic waves are widely used to build velocity models, whereas amplitudes are still only seldom accounted for. This mainly results from our limited ability to separate the various physical effects responsible for observed amplitude variations, such as focussing/defocussing, scattering and source effects. We present new measurements from 50 global earthquakes of fundamental-mode Rayleigh and Love wave amplitude anomalies measured in the period range 35-275 seconds using two different schemes: (i) a standard time-domain amplitude power ratio technique; and (ii) a mode-branch stripping scheme. For minor-arc data, we observe amplitude anomalies with respect to PREM in the range of 0-4, for which the two measurement techniques show very good overall agreement. We present here a statistical analysis and comparison of these datasets, as well as comparisons with theoretical calculations for a variety of 3-D Earth models. We assess the geographical coherency of the measurements, and investigate the impact of source, path and receiver effects on surface wave amplitudes, as well as their variations with frequency in a wider range than previously studied.
NASA Astrophysics Data System (ADS)
Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty
2014-06-01
Small-to-medium sized businesses lack resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, but evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach to detect emerging malware threats using open-source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under attack or has been attacked. When malware was discovered, we analyzed and reverse-engineered it to determine how it could be detected and prevented. Results have shown that with minimal resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
NASA Astrophysics Data System (ADS)
Laassiri, M.; Hamzaoui, E.-M.; Cherkaoui El Moursli, R.
2018-02-01
Inside nuclear reactors, gamma-rays emitted from nuclei together with the neutrons introduce unwanted backgrounds in neutron spectra. For this reason, powerful extraction methods are needed to extract the useful neutron signal from the recorded mixture and thus to obtain a clearer neutron flux spectrum. Several techniques have been developed to discriminate between neutrons and gamma-rays in a mixed radiation field. Most of these techniques tackle the problem using analogue discrimination methods; others use organic scintillators to achieve the discrimination task. Recently, systems based on digital signal processors have become commercially available to replace the analog systems. As an alternative to these systems, we aim in this work to verify the feasibility of using Nonnegative Tensor Factorization (NTF) to blindly extract the neutron component from mixture signals recorded at the output of a fission chamber (WL-7657). The latter was simulated with Geant4 linked to Garfield++ using a 252Cf neutron source. To achieve our objective of obtaining the best possible neutron-gamma discrimination, we applied two different NTF algorithms, which were found to be the best suited to analysing this kind of nuclear data.
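NTF generalizes nonnegative matrix factorization (NMF) to higher-order arrays; the matrix case already shows the blind-extraction idea on synthetic nonnegative spectra. The templates and mixing below are invented for illustration, not simulated fission-chamber data.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
bins = np.arange(200)
# invented, well-separated nonnegative templates (not real detector spectra)
neutron = np.exp(-((bins - 60) / 15.0) ** 2)
gamma = np.exp(-((bins - 130) / 10.0) ** 2)
W_true = np.column_stack([neutron, gamma])        # (200, 2) spectral shapes
H_true = rng.uniform(0.1, 1.0, (2, 50))           # mixing over 50 recordings
V = W_true @ H_true                                # observed nonnegative data

model = NMF(n_components=2, init="nndsvda", max_iter=2000)
W = model.fit_transform(V)                         # estimated component spectra
```

Because both the spectra and the mixing weights are nonnegative, the factorization recovers the two component shapes up to permutation and scale; an NTF treats the recordings as a higher-order tensor instead of flattening them into a matrix.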
Ishii, Stephanie K L; Boyer, Treavor H
2015-08-01
Alternative approaches to wastewater management, including urine source separation, have the potential to simultaneously improve multiple aspects of wastewater treatment, including reduced use of potable water for waste conveyance and improved contaminant removal, especially of nutrients. In order to pursue such radical changes, system-level evaluations of urine source separation in community contexts are required. The focus of this life cycle assessment (LCA) is managing nutrients from urine produced in a residential setting with urine source separation and struvite precipitation, as compared with a centralized wastewater treatment approach. The life cycle impacts evaluated in this study pertain to construction of the urine source separation system and operation of drinking water treatment, decentralized urine treatment, and centralized wastewater treatment. System boundaries include fertilizer offsets resulting from the production of urine-based struvite fertilizer. As calculated by the Tool for the Reduction and Assessment of Chemical and Other Environmental Impacts (TRACI), urine source separation with MgO addition for subsequent struvite precipitation with high P recovery (Scenario B) has the smallest environmental cost relative to existing centralized wastewater treatment (Scenario A) and urine source separation with MgO and Na3PO4 addition for subsequent struvite precipitation with concurrent high P and N recovery (Scenario C). Preliminary economic evaluations show that the three urine management scenarios are relatively equal on a monetary basis (<13% difference). The impacts of each urine management scenario are most sensitive to the assumed urine composition, the selected urine storage time, and the assumed electricity required to treat influent urine and toilet water used to convey urine at the centralized wastewater treatment plant.
The importance of full nutrient recovery from urine in combination with the substantial chemical inputs required for N recovery via struvite precipitation indicate the need for alternative methods of N recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
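Since struvite (MgNH4PO4·6H2O) binds Mg, N and P at a 1:1:1 molar ratio, the MgO dose in a scenario like B scales with the urine's molar P content. A worked example follows, with an assumed P concentration and dosing ratio that are illustrative rather than taken from the study.

```python
# struvite (MgNH4PO4·6H2O) removes Mg, N and P at a 1:1:1 molar ratio
M_P, M_MGO = 30.97, 40.30        # molar masses, g/mol
p_conc = 0.3                     # g P per litre of urine (assumed, illustrative)
mg_to_p = 1.3                    # assumed molar Mg:P dosing ratio (slight excess)

mol_p = p_conc / M_P             # mol P per litre
mgo_dose = mol_p * mg_to_p * M_MGO   # ≈ 0.51 g MgO per litre of urine
```

The 1:1:1 stoichiometry is also why N recovery via struvite alone is limited: urine contains far more molar N than P, so capturing the excess N requires the additional phosphate dosing of Scenario C, with the chemical-input burden the abstract notes.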
Albals, Dima; Heyden, Yvan Vander; Schmid, Martin G; Chankvetadze, Bezhan; Mangelings, Debby
2016-03-20
The screening part of an earlier defined chiral separation strategy in capillary electrochromatography (CEC) was used for the separation of ten cathinone and amphetamine derivatives. They were analyzed using 4 polysaccharide-based chiral stationary phases (CSPs), containing cellulose tris(3,5-dimethylphenylcarbamate) (ODRH), amylose tris(3,5-dimethylphenylcarbamate) (ADH), amylose tris(5-chloro-2-methylphenylcarbamate) (LA2), and cellulose tris(4-chloro-3-methylphenylcarbamate) (LC4) as chiral selectors. After applying the screening to each compound, ADH and LC4 showed the highest success rates. In a second part of the study, a comparison was made between CEC and other analytical techniques used for chiral separations, i.e., supercritical fluid chromatography (SFC), polar organic solvent chromatography (POSC), reversed-phase (RPLC) and normal-phase liquid chromatography (NPLC). For this purpose, earlier defined screening approaches for each technique were applied to separate the 10 test substances. This allowed an overall comparison of the success rates of the screening steps of the 5 techniques for these compounds. The results showed that CEC had a similar enantioselectivity rate as NPLC and RPLC, producing the highest number of separations (9 out of 10 racemates). SFC resolved 7 compounds, while POSC gave only 2 separations. On the other hand, the baseline separation success rates for NPLC and RPLC were better than for CEC. For a second comparison, the same chiral stationary phases as in the CEC screening were also tested with all techniques at their specific screening conditions, which allowed a direct comparison of the performance of CEC versus the same CSPs in the other techniques. This comparison revealed that RPLC was able to separate all tested compounds, and also produced the highest number of baseline separations on the CSPs that were used in the CEC screening step. CEC and NPLC showed the same success rate: nine out of ten substances were separated.
When CEC and NPLC are combined, separation of the ten compounds can be achieved. SFC and POSC resolved eight and three compounds, respectively. POSC was the least attractive option as it expressed only limited enantioselectivity toward these compounds. Copyright © 2015 Elsevier B.V. All rights reserved.
Procedure for Separating Noise Sources in Measurements of Turbofan Engine Core Noise
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
The study of core noise from turbofan engines has become more important as noise from other sources, like the fan and jet, has been reduced. A multiple-microphone and acoustic-source modeling method to separate correlated and uncorrelated sources has been developed. The auto- and cross-spectra in the frequency range below 1000 Hz are fitted with a noise propagation model based on a source couplet, consisting of a single incoherent source with a single coherent source, or a source triplet, consisting of a single incoherent source with two coherent point sources. Examples are presented using data from a Pratt & Whitney PW4098 turbofan engine. The method works well.
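The coherence-based idea underlying this kind of two-sensor separation can be illustrated with a small sketch (a generic coherent-output-power estimate, not the paper's couplet/triplet propagation model; the sample rate, noise levels, and signals below are invented for the demo):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 4096                              # assumed sample rate, Hz
n = 1 << 16
s = rng.standard_normal(n)             # common (correlated) source
x1 = s + 0.5 * rng.standard_normal(n)  # mic 1: source + independent noise
x2 = s + 0.5 * rng.standard_normal(n)  # mic 2: source + independent noise

f, Cxy = signal.coherence(x1, x2, fs=fs, nperseg=1024)
f, Pxx = signal.welch(x1, fs=fs, nperseg=1024)
coherent_power = Cxy * Pxx             # coherent output power at mic 1

# With a unit-variance source and 0.5-sigma independent noise, the theoretical
# coherence is 1 / 1.25^2 = 0.64 at every frequency.
print(np.round(float(Cxy.mean()), 2))
```

The coherent output power isolates the part of the mic-1 autospectrum attributable to the correlated source; the remainder is the uncorrelated contribution.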
Transverse Dimensions of Chorus in the Source Region
NASA Technical Reports Server (NTRS)
Santolik, O.; Gurnett, D. A.
2003-01-01
We report measurements of whistler-mode chorus by the four Cluster spacecraft at close separations. We focus our analysis on the generation region close to the magnetic equatorial plane at a radial distance of 4.4 Earth radii. We use both linear and rank correlation analysis to define the perpendicular dimensions of the sources of chorus elements below one half of the electron cyclotron frequency. Correlation is significant throughout the range of separation distances of 60-260 km parallel to the field line and 7-100 km in the perpendicular plane. At these scales, the correlation coefficient is independent of parallel separation and decreases with perpendicular separation. The observations are consistent with a statistical model of the source region assuming individual sources as Gaussian peaks of radiated power with a common half-width of 35 km perpendicular to the magnetic field. This characteristic scale is comparable to the wavelength of the observed waves.
Statistics of natural reverberation enable perceptual separation of sound and space
Traer, James; McDermott, Josh H.
2016-01-01
In everyday listening, sound reaches our ears directly from a source as well as indirectly via reflections known as reverberation. Reverberation profoundly distorts the sound from a source, yet humans can both identify sound sources and distinguish environments from the resulting sound, via mechanisms that remain unclear. The core computational challenge is that the acoustic signatures of the source and environment are combined in a single signal received by the ear. Here we ask whether our recognition of sound sources and spaces reflects an ability to separate their effects and whether any such separation is enabled by statistical regularities of real-world reverberation. To first determine whether such statistical regularities exist, we measured impulse responses (IRs) of 271 spaces sampled from the distribution encountered by humans during daily life. The sampled spaces were diverse, but their IRs were tightly constrained, exhibiting exponential decay at frequency-dependent rates: Mid frequencies reverberated longest whereas higher and lower frequencies decayed more rapidly, presumably due to absorptive properties of materials and air. To test whether humans leverage these regularities, we manipulated IR decay characteristics in simulated reverberant audio. Listeners could discriminate sound sources and environments from these signals, but their abilities degraded when reverberation characteristics deviated from those of real-world environments. Subjectively, atypical IRs were mistaken for sound sources. The results suggest the brain separates sound into contributions from the source and the environment, constrained by a prior on natural reverberation. This separation process may contribute to robust recognition while providing information about spaces around us. PMID:27834730
NASA Astrophysics Data System (ADS)
Vernstrom, T.; Scott, Douglas; Wall, J. V.; Condon, J. J.; Cotton, W. D.; Perley, R. A.
2016-09-01
This is the first of two papers describing the observations and cataloguing of deep 3-GHz observations of the Lockman Hole North using the Karl G. Jansky Very Large Array. The aim of this paper is to investigate, through the use of simulated images, the uncertainties and accuracy of source-finding routines, as well as to quantify systematic effects due to resolution, such as source confusion and source size. While these effects are not new, this work is intended as a particular case study that can be scaled and translated to other surveys. We use the simulations to derive uncertainties in the fitted parameters, as well as bias corrections for the actual catalogue (presented in Paper II). We compare two different source-finding routines, OBIT and AEGEAN, and two different effective resolutions, 8 and 2.75 arcsec. We find that the two routines perform comparably well, with OBIT being slightly better at de-blending sources, but slightly worse at fitting resolved sources. We show that 30-70 per cent of sources are missed or fit inaccurately once the source size becomes larger than the beam, possibly explaining source count errors in high-resolution surveys. We also investigate the effect of blending, finding that any sources with separations smaller than the beam size are fit as single sources. We show that the use of machine-learning techniques can correctly identify blended sources up to 90 per cent of the time, and prior-driven fitting can lead to a 70 per cent improvement in the number of de-blended sources.
Sandra, Koen; Moshir, Mahan; D'hondt, Filip; Tuytten, Robin; Verleysen, Katleen; Kas, Koen; François, Isabelle; Sandra, Pat
2009-04-15
Multidimensional liquid-based separation techniques are described for maximizing the resolution of the enormous number of peptides generated upon tryptic digestion of proteomes, and hence, reduce the spatial and temporal complexity of the sample to a level that allows successful mass spectrometric analysis. This review complements the previous contribution on unidimensional high performance liquid chromatography (HPLC). Both chromatography and electrophoresis will be discussed albeit with reversed-phase HPLC (RPLC) as the final separation dimension prior to MS analysis.
NASA Astrophysics Data System (ADS)
Elangasinghe, M. A.; Dirks, K. N.; Singhal, N.; Costello, S. B.; Longley, I.; Salmond, J. A.
2014-02-01
Air pollution from the transport sector has a marked effect on human health, so isolating the pollutant contribution from a roadway is important in understanding its impact on the local neighbourhood. This paper proposes a novel technique based on a semi-empirical air pollution model to quantify the impact from a roadway on the air quality of a local neighbourhood using ambient records of a single air pollution monitor. We demonstrate the proposed technique using a case study, in which we quantify the contribution from a major highway with respect to the local background concentration in Auckland, New Zealand. Comparing the diurnal variation of the model-separated background contribution with real measurements from a site upwind of the highway shows that the model estimates are reliable. Amongst all of the pollutants considered, the best estimations of the background were achieved for nitrogen oxides. Although the multi-pronged approach worked well for predominantly vehicle-related pollutants, it could not be used effectively to isolate emissions of PM10 due to the complex and less predictable influence of natural sources (such as marine aerosols). The proposed approach is useful in situations where ambient records from an upwind background station are not available (as required by other techniques) and is potentially transferable to situations such as intersections and arterial roads. Applying this technique to longer time series could help to understand the changes in pollutant concentrations from the road and background sources for different emission scenarios, for different years or seasons. Modelling results also show the potential of such hybrid semi-empirical models to contribute to our understanding of the physical parameters determining air quality and to validate emissions inventory data.
Detection, Source Location, and Analysis of Volcano Infrasound
NASA Astrophysics Data System (ADS)
McKee, Kathleen F.
The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth-to-source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid-search technique and a common source-location method in infrasound studies as well as in seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. 
We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible in volcanic environments. The fumarolic jet noise was found to have a sustained, low-amplitude signal with a spectral peak between 7 and 10 Hz. From thermal imagery we measured the jet temperature (~260 °C) and estimated the jet diameter (~2.5 m). From the estimated jet diameter, an assumed Strouhal number of 0.19, and the jet noise peak frequency, we estimated the jet velocity to be 79-132 m/s. We then used published gas data to estimate the volatile flux at 160-270 kg/s (14,000-23,000 t/d). These estimates are typically difficult to obtain in volcanic environments, but provide valuable information on the eruption. At regional and global length scales we use infrasound arrays to detect signals and determine their source back-azimuths. A ground-coupled airwave (GCA) occurs when an incident acoustic pressure wave encounters the Earth's surface and part of the energy of the wave is transferred to the ground. GCAs are commonly observed from sources such as volcanic eruptions, bolides, meteors, and explosions. They have been observed to have retrograde particle motion. When recorded on collocated seismo-acoustic sensors, the phase between the infrasound and seismic signals is 90°. If the sensors are separated, wind noise is usually incoherent and an additional phase is added due to the sensor separation. We utilized the additional phase and the characteristic particle motion to determine a unique back-azimuth solution to an acoustic source. The additional phase will be different depending on the direction from which a wave arrives. Our technique was tested using synthetic seismo-acoustic data from a coupled Earth-atmosphere 3D finite-difference code and then applied to two well-constrained volcano datasets: Mount St. Helens, USA, and Mount Pagan, Commonwealth of the Northern Mariana Islands. 
The back-azimuths determined by our method are within <1° to 5° of the actual values and of those determined by traditional infrasound array processing. Ours is a new method to detect and determine the back-azimuth to infrasonic signals, which will be useful when financial and spatial resources are limited.
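The fumarole jet-velocity estimate above follows directly from the Strouhal relation; a quick check with the quoted numbers (the diameter, Strouhal number, and peak-frequency band are from the abstract; the published lower bound of 79 m/s presumably also folds in uncertainty in the diameter and Strouhal estimates):

```python
# Strouhal relation for jet noise: St = f * D / U, so U = f * D / St.
D = 2.5    # estimated jet diameter, m
St = 0.19  # assumed Strouhal number
for f in (7.0, 10.0):              # spectral peak band, Hz
    U = f * D / St                 # inferred jet velocity, m/s
    print(f"f = {f:4.1f} Hz -> U = {U:.0f} m/s")
# The f = 10 Hz end reproduces the quoted upper bound of ~132 m/s.
```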
Ju, Wei-Na; Wang, Cheng-Xue; Wang, Tie-Jun; Qi, Bao-Chang
2017-11-01
Clavicle fractures are common, and mostly occur in the midshaft. Methods for operative treatment of midshaft clavicle fractures are evolving, as they improve clinical outcomes compared with traditional conservative management. However, fixation of comminuted midshaft clavicle fractures with bone fragments separated by soft tissue remains a challenge. Here, we present a case of comminuted midshaft clavicle fracture with a bone fragment separated from the main fracture by soft tissue. The diagnosis was a left comminuted midshaft clavicle fracture. We treated this patient with a novel double ligature technique using absorbable suturing. In the past 7 years, we have treated >50 patients with this technique. We have achieved good clinical outcomes with no complications. We recommend widespread use of our novel double ligature technique for treating comminuted midshaft clavicle fractures with bone fragments separated by soft tissue.
Ju, Wei-Na; Wang, Cheng-Xue; Wang, Tie-Jun; Qi, Bao-Chang
2017-01-01
Abstract Rationale: Clavicle fractures are common, and mostly occur in the midshaft. Methods for operative treatment of midshaft clavicle fractures are evolving, as they improve clinical outcomes compared with traditional conservative management. However, fixation of comminuted midshaft clavicle fractures with bone fragments separated by soft tissue remains a challenge. Patient concerns: Here, we present a case of comminuted midshaft clavicle fracture with a bone fragment separated from the main fracture by soft tissue. Diagnosis: Left comminuted midshaft clavicle fracture. Interventions: We treated this patient with a novel double ligature technique using absorbable suturing. Outcomes: In the past 7 years, we have treated >50 patients with this technique. We have achieved good clinical outcomes with no complications. Lessons: We recommend widespread use of our novel double ligature technique for treating comminuted midshaft clavicle fractures with bone fragments separated by soft tissue. PMID:29137088
Method for improving the angular resolution of a neutron scatter camera
Mascarenhas, Nicholas; Marleau, Peter; Gerling, Mark; Cooper, Robert Lee; Mrowka, Stanley; Brennan, James S.
2012-12-25
An instrument that directly images the fast fission neutrons from a special nuclear material source, with increased neutron detection efficiency, has been described. Instead of the previous technique, which uses the time-of-flight (TOF) between 2 widely spaced fixed planes of neutron detectors to measure scattered-neutron kinetic energy, we now use the recoil proton energy deposited in the second of the 2 scatter planes, which can now be repositioned either much closer together or further apart. By doubling the separation distance between the 2 planes from 20 cm to 40 cm, we improved the angular resolution of the detector from about 12° to about 10°. A further doubling of the separation distance to 80 cm provided an additional improvement in the angular resolution of the detector to about 6° without adding additional detectors or ancillary electronics. The distance between planes also may be dynamically changed using a suitable common technique such as a gear or motor drive to toggle between the various positions. The angular resolution of this new configuration, therefore, is increased at the expense of detection sensitivity. However, the diminished sensitivity may be acceptable for those applications where the detector is able to interrogate a particular site for an extended period.
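The trend of angular resolution improving with plane separation can be illustrated with a toy geometric model (entirely an illustrative assumption, not the patent's error analysis; the per-plane position uncertainty below is invented): the scatter direction is the line joining the interaction points in the two planes, so a fixed transverse position error blurs the reconstructed angle less as the lever arm grows.

```python
import math

# Toy lever-arm model: angular blur ~ atan(sqrt(2) * sigma / D) for a
# transverse position uncertainty sigma in each of two planes a distance
# D apart.  Sigma is an assumed value, chosen only to show the trend.
sigma = 4.0                       # assumed per-plane position uncertainty, cm
for D in (20.0, 40.0, 80.0):      # plane separations from the abstract, cm
    blur = math.degrees(math.atan(math.sqrt(2) * sigma / D))
    print(f"D = {D:4.0f} cm -> ~{blur:.0f} deg angular blur")
```

This reproduces only the qualitative trend; in the real instrument, energy resolution and scattering kinematics also contribute to the quoted 12°, 10°, and 6° figures.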
Ostrovski, Fernanda; McMahon, Richard G.; Connolly, Andrew J.; ...
2016-11-17
In this paper, we present the discovery and preliminary characterization of a gravitationally lensed quasar with a source redshift z_s = 2.74 and image separation of 2.9 arcsec lensed by a foreground z_l = 0.40 elliptical galaxy. Since optical observations of gravitationally lensed quasars show the lens system as a superposition of multiple point sources and a foreground lensing galaxy, we have developed a morphology-independent multi-wavelength approach to the photometric selection of lensed quasar candidates based on Gaussian Mixture Models (GMM) supervised machine learning. Using this technique and gi multicolour photometric observations from the Dark Energy Survey (DES), near-IR JK photometry from the VISTA Hemisphere Survey (VHS) and WISE mid-IR photometry, we have identified a candidate system with two catalogue components with i_AB = 18.61 and i_AB = 20.44 comprising an elliptical galaxy and two blue point sources. Spectroscopic follow-up with NTT and the use of an archival AAT spectrum show that the point sources can be identified as a lensed quasar with an emission line redshift of z = 2.739 ± 0.003 and a foreground early-type galaxy with z = 0.400 ± 0.002. We model the system as a single isothermal ellipsoid and find the Einstein radius θ_E ~ 1.47 arcsec, enclosed mass M_enc ~ 4 × 10^11 M_⊙ and a time delay of ~52 d. Finally, the relatively wide separation, month-scale time delay and high redshift make this an ideal system for constraining the expansion rate beyond a redshift of 1.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Scope. 246.100 Section 246.100 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES SOURCE SEPARATION FOR... source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Scope. 246.100 Section 246.100 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES SOURCE SEPARATION FOR... source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Scope. 246.100 Section 246.100 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES SOURCE SEPARATION FOR... source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Scope. 246.100 Section 246.100 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES SOURCE SEPARATION FOR... source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Scope. 246.100 Section 246.100 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES SOURCE SEPARATION FOR... source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are...
NASA Astrophysics Data System (ADS)
Munafo, I.; Malagnini, L.; Tinti, E.; Chiaraluce, L.; Di Stefano, R.; Valoroso, L.
2014-12-01
The Alto Tiberina Fault (ATF) is a 60 km long, east-dipping, low-angle normal fault, located in a sector of the Northern Apennines (Italy) undergoing active extension since the Quaternary. The ATF has been imaged by analyzing active-source seismic reflection profiles and the instrumentally recorded persistent background seismicity. The present study is an attempt to separate the contributions of source, site, and crustal attenuation, in order to focus on the mechanics of the seismic sources on the ATF, as well as on the synthetic and antithetic structures within the ATF hanging wall (i.e. the Colfiorito fault, Gubbio fault and Umbria Valley fault). In order to compute source spectra, we perform a set of regressions over the seismograms of 2000 small earthquakes (-0.8 < ML < 4) recorded between 2010 and 2014 at 50 permanent seismic stations deployed in the framework of the Alto Tiberina Near Fault Observatory project (TABOO) and equipped with three-component seismometers, three of which are located in shallow boreholes. Because we deal with some very small earthquakes, we maximize the signal-to-noise ratio (SNR) with a technique based on the analysis of peak values of bandpass-filtered time histories, in addition to the same processing performed on Fourier amplitudes. We rely on a tool called Random Vibration Theory (RVT) to switch from peak values in the time domain to Fourier spectral amplitudes. The low-frequency spectral plateaus of the source terms are used to compute moment magnitudes (Mw) of all the events, whereas a source spectral ratio technique is used to estimate the corner frequencies (Brune spectral model) of a subset of events chosen based on an analysis of the noise affecting the spectral ratios. So far, the described approach provides high accuracy for the spectral parameters of localized seismicity, and may be used to gain insights into the underlying mechanics of faulting and the earthquake processes.
Haji Gholizadeh, Mohammad; Melesse, Assefa M; Reddi, Lakshmi
2016-10-01
In this study, principal component analysis (PCA), factor analysis (FA), and the absolute principal component score-multiple linear regression (APCS-MLR) receptor modeling technique were used to assess the water quality and to identify and quantify the potential pollution sources affecting the water quality of three major rivers of South Florida. For this purpose, a 15-year (2000-2014) dataset of 12 water quality variables covering 16 monitoring stations, with approximately 35,000 observations, was used. The PCA/FA method identified five and four potential pollution sources in the wet and dry seasons, respectively, and the effective mechanisms, rules and causes were explained. The APCS-MLR apportioned their contributions to each water quality variable. Results showed that point-source pollution discharges from anthropogenic factors, due to the discharge of agricultural waste and domestic and industrial wastewater, were the major sources of river water contamination. The studied variables were categorized into three groups: nutrients (total Kjeldahl nitrogen, total phosphorus, total phosphate, and ammonia-N), water-murkiness parameters (total suspended solids, turbidity, and chlorophyll-a), and salt ions (magnesium, chloride, and sodium), and the average contributions of different potential pollution sources to these categories were considered separately. The data matrix was also subjected to the PMF receptor model using the EPA PMF-5.0 program, and the two-way model was used for the PMF analyses. Comparison of the results of the PMF and APCS-MLR models showed some significant differences in the estimated contribution of each potential pollution source, especially in the wet season. It was concluded that the APCS-MLR receptor modeling approach appears to be more physically plausible for the current study. 
The apportionment results should be very useful to local authorities for the control and management of pollution and for better protection of riverine water quality. Copyright © 2016 Elsevier B.V. All rights reserved.
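The APCS-MLR recipe described above (PCA scores made "absolute" against an artificial zero-concentration sample, then regressed onto each measured variable) can be sketched on synthetic data; the two hidden sources, their signatures, and the noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
src = rng.lognormal(size=(n, 2))                 # two latent source strengths
profile = np.array([[1.0, 0.2, 0.8],             # source 1 signature
                    [0.1, 1.0, 0.5]])            # source 2 signature
X = src @ profile + 0.05 * rng.standard_normal((n, 3))  # measured variables

# 1) z-score the data and take the top-2 PCA directions via SVD
mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T

# 2) absolute scores: subtract the score of a zero-concentration sample
z0 = (0.0 - mu) / sd
apcs = scores - z0 @ Vt[:2].T

# 3) regress each variable on the APCS; contribution = coef * mean(APCS)
A = np.column_stack([np.ones(n), apcs])
for j in range(X.shape[1]):
    b, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    contrib = b[1:] * apcs.mean(0)
    explained = contrib.sum() / X[:, j].mean()
    print(f"var {j}: sources explain {100 * explained:.0f}% of the mean")
```

On this nearly noise-free toy the apportioned contributions account for close to all of each variable's mean, which is the sanity check usually reported for APCS-MLR.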
Indirect current control with separate IZ drop compensation for voltage source converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanetkar, V.R.; Dawande, M.S.; Dubey, G.K.
1995-12-31
Indirect Current Control (ICC) of boost-type Voltage Source Converters (VSCs) using separate compensation of the line IZ voltage drop is presented. A separate bi-directional VSC is used to produce the compensation voltage. This simplifies the ICC regulator scheme, as the power flow is controlled through a single modulation index. Experimental verification is provided for bi-directional control of the power flow.
Independent component analysis algorithm FPGA design to perform real-time blind source separation
NASA Astrophysics Data System (ADS)
Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke
2015-05-01
The conditions that arise in the cocktail party problem prevail across many fields, creating a need for Blind Source Separation (BSS). These fields include array processing, communications, wireless communication, audio and acoustics, speech processing, and medical and biomedical signal processing. The concept of the cocktail party problem and BSS led to the development of Independent Component Analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of Independent Component Analysis algorithms to perform blind source separation on mixed signals in software, and of their implementation in hardware with a Field Programmable Gate Array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA algorithms were examined and compared. The best algorithm was defined as the one requiring the least complexity and the fewest resources while effectively separating mixed sources; this was the EASI algorithm. The EASI ICA was implemented in hardware with FPGAs to analyze its performance in real time.
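The EASI update rule (a serial, equivariant relative-gradient step) is compact enough to sketch; the mixing matrix, sources, step size, and cubic nonlinearity below are generic demo choices (the cubic suits these sub-Gaussian demo sources), not the fixed-point FPGA implementation studied here:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
t = np.arange(n)
s = np.vstack([np.sin(2 * np.pi * t / 23),       # sub-Gaussian source 1
               rng.uniform(-1, 1, n)])           # sub-Gaussian source 2
A = np.array([[0.8, 0.6], [0.4, 0.9]])           # unknown mixing matrix
x = A @ s                                        # observed mixtures

B = np.eye(2)                                    # separating-matrix estimate
mu = 2e-3                                        # learning rate
I = np.eye(2)
for k in range(3 * n):                           # three passes over the data
    y = B @ x[:, k % n]
    gy = y ** 3                                  # cubic nonlinearity
    # EASI serial update: symmetric part whitens, skew part rotates
    B -= mu * ((np.outer(y, y) - I) + np.outer(gy, y) - np.outer(y, gy)) @ B

y = B @ x                                        # recovered sources
# Each output should align (up to sign and order) with one source:
C = np.abs(np.corrcoef(np.vstack([y, s]))[:2, 2:])
print(np.round(C, 2))
```

Because every update is a matrix-times-vector step with no divisions, sketches like this map naturally onto the division-free, fixed-point FPGA pipelines the abstract is concerned with.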
Separation of concurrent broadband sound sources by human listeners
NASA Astrophysics Data System (ADS)
Best, Virginia; van Schaik, André; Carlile, Simon
2004-01-01
The effect of spatial separation on the ability of human listeners to resolve a pair of concurrent broadband sounds was examined. Stimuli were presented in a virtual auditory environment using individualized outer ear filter functions. Subjects were presented with two simultaneous noise bursts that were either spatially coincident or separated (horizontally or vertically), and responded as to whether they perceived one or two source locations. Testing was carried out at five reference locations on the audiovisual horizon (0°, 22.5°, 45°, 67.5°, and 90° azimuth). Results from experiment 1 showed that at more lateral locations, a larger horizontal separation was required for the perception of two sounds. The reverse was true for vertical separation. Furthermore, it was observed that subjects were unable to separate stimulus pairs if they delivered the same interaural differences in time (ITD) and level (ILD). These findings suggested that the auditory system exploited differences in one or both of the binaural cues to resolve the sources, and could not use monaural spectral cues effectively for the task. In experiments 2 and 3, separation of concurrent noise sources was examined upon removal of low-frequency content (and ITDs), onset/offset ITDs, both of these in conjunction, and all ITD information. While onset and offset ITDs did not appear to play a major role, differences in ongoing ITDs were robust cues for separation under these conditions, including those in the envelopes of high-frequency channels.
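The interaural time difference invoked above can be put into numbers with Woodworth's classic spherical-head approximation, ITD(θ) = (r/c)(θ + sin θ) (a standard textbook model used here only for illustration; the head radius is an assumed typical value):

```python
import math

r = 0.0875   # assumed head radius, m
c = 343.0    # speed of sound, m/s
for az_deg in (0, 22.5, 45, 67.5, 90):   # reference azimuths from the study
    th = math.radians(az_deg)
    itd_us = 1e6 * (r / c) * (th + math.sin(th))   # ITD in microseconds
    print(f"{az_deg:4.1f} deg -> ITD ~ {itd_us:3.0f} us")
# The cue grows toward roughly 650-660 us at 90 deg azimuth.
```

Source pairs delivering the same ITD and ILD lie on a "cone of confusion", consistent with the finding that such pairs could not be resolved.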
Magnetic resonance separation imaging using a divided inversion recovery technique (DIRT).
Goldfarb, James W
2010-04-01
The divided inversion recovery technique is an MRI separation method based on tissue T1 relaxation differences. When tissue T1 relaxation times are longer than the time between inversion pulses in a segmented inversion recovery pulse sequence, longitudinal magnetization does not pass through the null point. Prior to additional inversion pulses, longitudinal magnetization may have an opposite polarity. Spatial displacement of tissues in inversion recovery balanced steady-state free-precession imaging has been shown to be due to this magnetization phase change resulting from incomplete magnetization recovery. In this paper, it is shown how this phase change can be used to provide image separation. A pulse sequence parameter, the time between inversion pulses (T180), can be adjusted to provide water-fat or fluid separation. Example water-fat and fluid separation images of the head, heart, and abdomen are presented. The water-fat separation performance was investigated by comparing image intensities in short-axis divided inversion recovery technique images of the heart. Fat, blood, and fluid signal was suppressed to the background noise level. Additionally, the separation performance was not affected by main magnetic field inhomogeneities.
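The polarity mechanism described above can be illustrated with the single-inversion recovery formula Mz(t) = M0(1 − 2e^(−t/T1)) (a simplification: the paper's point is precisely that recovery between inversions is incomplete, which shifts the numbers but not the sign logic; the T1 values and T180 below are assumed typical values):

```python
import math

T180 = 0.3   # assumed time between inversion pulses, s
M0 = 1.0
for tissue, T1 in (("fat", 0.25), ("muscle", 0.9), ("fluid", 3.0)):
    mz = M0 * (1 - 2 * math.exp(-T180 / T1))   # Mz at t = T180 after inversion
    sign = "+" if mz > 0 else "-"
    print(f"{tissue:6s} T1 = {T1:4.2f} s -> Mz = {mz:+.2f} ({sign})")
# Short-T1 fat has recovered past its null and is positive, while the
# long-T1 water-based tissues are still negative: opposite polarities that
# the technique turns into image separation.
```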
Method for the chemical separation of Ge-68 from its daughter Ga-68
Fitzsimmons, Jonathan M.; Atcher, Robert W.
2010-06-01
The present invention is directed to a generator apparatus for separating a daughter gallium-68 radioisotope substantially free of impurities from a parent germanium-68 radioisotope, including a first resin-containing column containing parent germanium-68 radioisotope and daughter gallium-68 radioisotope, a source of first eluent connected to said first resin-containing column for separating daughter gallium-68 radioisotope from the first resin-containing column, said first eluent including citrate whereby the separated gallium is in the form of gallium citrate, a mixing space connected to said first resin-containing column for admixing a source of hydrochloric acid with said separated gallium citrate whereby gallium citrate is converted to gallium tetrachloride, a second resin-containing column for retention of gallium-68 tetrachloride, and, a source of second eluent connected to said second resin-containing column for eluting the daughter gallium-68 radioisotope from said second resin-containing column.
NASA Astrophysics Data System (ADS)
Brewick, P. T.; Smyth, A. W.
2014-12-01
The accurate and reliable estimation of modal damping from output-only vibration measurements of structural systems is a continuing challenge in the fields of operational modal analysis (OMA) and system identification. In this paper a modified version of the blind source separation (BSS)-based Second-Order Blind Identification (SOBI) method was used to perform modal damping identification on a model bridge structure under varying loading conditions. The bridge model was created with finite elements and consisted of a series of stringer beams supported by a larger girder. The excitation was separated into two categories, ambient noise and traffic loads: noise was modeled with random forcing vectors, and traffic was simulated with moving loads for cars and partially distributed moving masses for trains. The acceleration responses were treated as the mixed output signals for the BSS algorithm. The modified SOBI method used a windowing technique to maximize the amount of information used for blind identification from the responses. The modified SOBI method accurately recovered the mode shapes for both types of excitation, but power spectral densities (PSDs) of the recovered modal responses showed signs of distortion for the traffic simulations. The distortion had an adverse effect on the damping ratio estimates for some of the modes, but no correlation could be found between the accuracy of the damping estimates and the accuracy of the recovered mode shapes. The responses and their PSDs were compared to real-world collected data, and similar distortion patterns were observed, implying that this issue likely affects real-world estimates.
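SOBI exploits the fact that uncorrelated modal responses have diagonal covariances at all time lags, so diagonalizing whitened, time-lagged covariance matrices recovers the modal coordinates (and the mixing matrix approximates the mode shapes). A minimal single-lag sketch (the AMUSE simplification, not the authors' windowed multi-lag variant), with synthetic damped modes and an assumed mixing matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 20.0, 0.01)
# two synthetic damped modal responses (1.0 Hz and 2.5 Hz)
s = np.vstack([np.exp(-0.1 * t) * np.sin(2 * np.pi * 1.0 * t),
               np.exp(-0.3 * t) * np.sin(2 * np.pi * 2.5 * t)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # mode-shape mixing matrix
x = A @ s + 0.01 * rng.standard_normal(s.shape)  # measured accelerations

# step 1: whiten the zero-mean observations
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / x.shape[1])
z = np.diag(d ** -0.5) @ E.T @ x

# step 2: diagonalize one symmetrized time-lagged covariance matrix
tau = 10                                         # 0.1 s lag
C = z[:, :-tau] @ z[:, tau:].T / (z.shape[1] - tau)
_, V = np.linalg.eigh(0.5 * (C + C.T))
y = V.T @ z    # recovered modal responses, up to sign/scale/order
```

Damping ratios can then be estimated from the decay envelopes (or PSD half-power bandwidths) of the rows of `y`.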
Nanoflow Separation of Amino Acids for the Analysis of Cosmic Dust
NASA Technical Reports Server (NTRS)
Martin, M. P.; Glavin, D. P.; Dworkin, Jason P.
2008-01-01
The delivery of amino acids to the early Earth by interplanetary dust particles, comets, and carbonaceous meteorites could have been a significant source of the early Earth's prebiotic organic inventory. Amino acids are central to modern terrestrial biochemistry as major components of proteins and enzymes and were probably vital in the origin of life. A variety of amino acids have been detected in the CM carbonaceous meteorite Murchison, many of which are exceptionally rare in the terrestrial biosphere, including α-aminoisobutyric acid (AIB) and isovaline. AIB has also been detected in a small percentage of Antarctic micrometeorite grains believed to be related to the CM meteorites. We report on progress in optimizing a nanoflow liquid chromatography separation system with dual detection via laser-induced fluorescence and time-of-flight mass spectrometry (nLC-LIF/ToF-MS) for the analysis of o-phthaldialdehyde/N-acetyl-L-cysteine (OPA/NAC) labeled amino acids in cosmic dust grains. The very low flow rates of nLC (<3 micro-L/min, versus >0.1 mL/min for analytical LC) combined with <2 micron column bead sizes have the potential to produce efficient analyte ionization and chromatograms with very sharp peaks; both increase sensitivity. The combination of selectivity (only primary amines are derivatized), sensitivity (detection limits >4 orders of magnitude lower than traditional GC-MS techniques), and specificity (compound identities are determined by both retention time and exact mass) makes this a compelling technique. However, the development of an analytical method to achieve separation of compounds as structurally similar as amino acid monomers and produce the sharp peaks required for maximum sensitivity is challenging.
Fetal source extraction from magnetocardiographic recordings by dependent component analysis
NASA Astrophysics Data System (ADS)
de Araujo, Draulio B.; Kardec Barros, Allan; Estombelo-Montesco, Carlos; Zhao, Hui; Roque da Silva Filho, A. C.; Baffa, Oswaldo; Wakai, Ronald; Ohnishi, Noboru
2005-10-01
Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
Applications of Skylab data to land use and climatological analysis
NASA Technical Reports Server (NTRS)
Alexander, R. H. (Principal Investigator); Lewis, J. E., Jr.; Lins, H. F., Jr.; Jenner, C. B.; Outcalt, S. I.; Pease, R. W.
1976-01-01
The author has identified the following significant results. The Skylab study in the Central Atlantic Regional Ecological Test Site encompassed two separate but related tasks: (1) evaluation of the photographic sensors S190A and B as sources of land use data for planning and managing land resources in major metropolitan regions, and (2) evaluation of the multispectral scanner S192, used in conjunction with associated data and analytical techniques, as a data source on urban climates and the surface energy balance. Photographs from the Skylab S190B earth terrain camera were of greatest interest in the land use analysis task; they were of sufficiently high resolution to identify and map many level 2 and 3 land use categories. After being corrected for atmospheric effects, output from the thermal and visible bands of the S192 was employed in constructing computer map plots of albedo and surface temperature.
Early results from Magsat. [studies of near-earth magnetic fields
NASA Technical Reports Server (NTRS)
Langel, R. A.; Estes, R. H.; Mayhew, M. A.
1981-01-01
Papers presented at the May 27, 1981 meeting of the American Geophysical Union concerning early results from the Magsat satellite program, which was designed to study the near-earth magnetic fields originating in the core and lithosphere, are discussed. The satellite was launched on October 30, 1979 into a sun-synchronous (twilight) orbit, and re-entered the atmosphere on June 11, 1980. Instruments carried included a cesium vapor magnetometer to measure field magnitudes, a fluxgate magnetometer to measure field components, and an optical system to measure the fluxgate magnetometer orientation. Early results concerned spherical harmonic models, fields due to ionospheric and magnetospheric currents, and the identification and interpretation of fields from lithospheric sources. The preliminary results confirm the possibility of separating the measured field into core, crustal and external components, and represent significant developments in analytical techniques in main-field modelling and the physics of the field sources.
Nucleon Charges from 2+1+1-flavor HISQ and 2+1-flavor clover lattices
Gupta, Rajan
2016-07-24
Precise estimates of the nucleon charges g_A, g_S and g_T are needed in many phenomenological analyses of SM and BSM physics. In this talk, we present results from two sets of calculations using clover fermions on 9 ensembles of 2+1+1-flavor HISQ and 4 ensembles of 2+1-flavor clover lattices. In addition, we show that high statistics can be obtained cost-effectively using the truncated solver method with bias correction and the coherent source sequential propagator technique. By performing simulations at 4-5 values of the source-sink separation t_sep, we demonstrate control over excited-state contamination using 2- and 3-state fits. Using the high-precision 2+1+1-flavor data, we perform a simultaneous fit in a, M_π and M_π L to obtain our final results for the charges.
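Excited-state contamination is typically controlled by fitting the time dependence of correlators with a multi-state ansatz, e.g. C(t) = A0 e^(-M0 t) + A1 e^(-(M0+ΔM) t) for a 2-state fit. A toy illustration with synthetic data and assumed parameter values (not the paper's lattice data):

```python
import numpy as np
from scipy.optimize import curve_fit

def two_state(t, A0, M0, A1, dM):
    """Ground state plus one excited state; dM = M1 - M0 > 0."""
    return A0 * np.exp(-M0 * t) + A1 * np.exp(-(M0 + dM) * t)

t = np.arange(1.0, 20.0)
rng = np.random.default_rng(2)
C = two_state(t, 1.0, 0.45, 0.7, 0.5)            # synthetic correlator
C *= 1.0 + 0.002 * rng.standard_normal(t.size)   # small multiplicative noise
p, _ = curve_fit(two_state, t, C, p0=[1.0, 0.4, 0.5, 0.6])
# p[1] estimates the ground-state mass M0 despite the excited-state term
```

Repeating such fits at several source-sink separations is what lets the ground-state matrix element be isolated from the contaminated short-distance data.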
Data registration for automated non-destructive inspection with multiple data sets
NASA Astrophysics Data System (ADS)
Tippetts, T.; Brierley, N.; Cawley, P.
2013-01-01
In many NDE applications, multiple sources of data are available covering the same region of a part under inspection. These overlapping data can come from intersecting scan patterns, sensors in an array configuration, or repeated inspections at different times. In many cases these data sets are analysed independently, with separate assessments for each channel or data file. It should be possible to improve the overall reliability of the inspection by combining multiple sources of information, simultaneously increasing the Probability of Detection (POD) and decreasing the Probability of False Alarm (PFA). Data registration, i.e. mapping the data to matching coordinates in space, is both an essential prerequisite and a challenging obstacle to this type of data fusion. This paper describes optimization techniques for matching and aligning features in NDE data. Examples from automated ultrasound inspection of aircraft engine discs illustrate the approach.
Single-Donor Leukophoretic Technique
NASA Technical Reports Server (NTRS)
Eberhardt, R. N.
1977-01-01
Leukocyte separation-and-retrieval device utilizes granulocyte and monocyte property of leukoadhesion to glass surfaces as basis of their separation from whole blood. Device is used with single donor technique and has application in biological and chemical processing, veterinary research and clinical care.
NASA Technical Reports Server (NTRS)
Ashour-Abdalla, Maha
1998-01-01
A fundamental goal of magnetospheric physics is to understand the transport of plasma through the solar wind-magnetosphere-ionosphere system. To attain such an understanding, we must determine the sources of the plasma, the trajectories of the particles through the magnetospheric electric and magnetic fields to the point of observation, and the acceleration processes they undergo en route. This study employed plasma distributions observed in the near-Earth plasma sheet by the Interball and Geotail spacecraft together with theoretical techniques to investigate the ion sources and the transport of plasma. We used ion trajectory calculations in magnetic and electric fields from a global magnetohydrodynamics (MHD) simulation to investigate the transport and to identify common ion sources for ions observed in the near-Earth magnetotail by the Interball and Geotail spacecraft. Our first step was to examine a number of distribution functions and identify distinct boundaries in both configuration and phase space that are indicative of different plasma sources and transport mechanisms. We examined events from October 26, 1995, November 29-30, 1996, and December 22, 1996. During the first event Interball and Geotail were separated by approximately 10 R(sub E) in z, and during the second event the spacecraft were separated by approximately 4 R(sub E). Both of these events had a strong IMF B(sub Y) component pointing toward the dawnside. On October 26, 1995, the IMF B(sub Z) component was northward, and on November 29-30, 1996, the IMF B(sub Z) component was near 0. During the first event, Geotail was located near the equator on the dawn flank, while Interball was for the most part in the lobe region. The distribution function from the Coral instrument on Interball showed less structure and resembled a drifting Maxwellian. The observed distribution on Geotail, on the other hand, included a great number of structures at both low and high energies.
During the third event (December 22, 1996) both spacecraft were in the plasma sheet and were separated by approximately 20 R(sub E) in the y direction. During this event the IMF was southward.
Decomposition Technique for Remaining Useful Life Prediction
NASA Technical Reports Server (NTRS)
Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)
2014-01-01
The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
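The two-map decomposition can be sketched as follows: fit a feature-to-damage map and a conditions-to-damage-rate map off-line, then on-line combine a current damage estimate with the expected accumulation rate to extrapolate to the failure threshold. All data, functional forms, and parameter values below are synthetic placeholders, not the patented tool's actual models:

```python
import numpy as np

# --- off-line: learn the two maps from ground-truth run data (synthetic here)
feat = np.linspace(0.0, 1.0, 50)
damage = 0.9 * feat**2 + 0.05            # hypothetical feature -> damage truth
f2d = np.polyfit(feat, damage, 2)        # feature-to-damage map (regression)

load = np.linspace(0.5, 2.0, 50)
rate = 0.01 * load**1.5                  # hypothetical conditions -> damage rate
c2r = np.polyfit(load, rate, 2)          # conditions-to-damage-rate map

# --- on-line: current feature and operating condition from run-time sensors
d_now = np.polyval(f2d, 0.6)             # current damage estimate
r_now = np.polyval(c2r, 1.2)             # expected damage accumulation per cycle
threshold = 0.8
rul = (threshold - d_now) / r_now        # cycles until the failure threshold
```

Splitting the problem this way lets the uncertainty of each map be quantified and propagated separately, as the abstract emphasizes.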
Bayes filter modification for drivability map estimation with observations from stereo vision
NASA Astrophysics Data System (ADS)
Panchenko, Aleksei; Prun, Viktor; Turchenkov, Dmitri
2017-02-01
Reconstruction of a drivability map for a moving vehicle is a well-known research topic in applied robotics. Here, creating such a map for an autonomous truck on a generally planar surface containing separate obstacles is considered. The source of measurements for the truck is a calibrated pair of cameras. The stereo system detects and reconstructs several types of objects, such as road borders, other vehicles, pedestrians and general tall or highly saturated objects (e.g. road cones). To create a robust mapping module we use a modification of Bayes filtering that introduces some novel techniques for the occupancy map update step. Specifically, our modified version is robust to false positive measurement errors, stereo shading and obstacle occlusion. We implemented the technique and achieved real-time 15 FPS computation on an industrial shake-proof PC. Our real-world experiments show the positive effect of the filtering step.
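The occupancy update at the core of such a Bayes filter is commonly implemented in log-odds form, where clamping the log-odds bounds the impact of spurious (false positive or false negative) stereo detections. A generic sketch with assumed sensor-model constants, not the authors' specific modification:

```python
import math

L_OCC, L_FREE = 0.6, -0.4   # log-odds increments (assumed sensor model)
L_MIN, L_MAX = -2.0, 3.5    # clamping limits over-confidence from bad detections

def update_cell(l, hit):
    """One Bayes-filter update of a cell's occupancy log-odds."""
    l += L_OCC if hit else L_FREE
    return max(L_MIN, min(L_MAX, l))

def prob(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

l = 0.0
for hit in [True, True, False, True, True]:  # one spurious miss in the stream
    l = update_cell(l, hit)
```

Because updates are additive in log-odds, a single false measurement only nudges the cell rather than resetting it, which is why this form degrades gracefully under stereo noise.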
Rehle, D; Leleux, D; Erdelyi, M; Tittel, F; Fraser, M; Friedfeld, S
2001-01-01
A laser spectrometer based on difference-frequency generation in periodically poled LiNbO3 (PPLN) has been used to quantify atmospheric formaldehyde with a detection limit of 0.32 parts per billion by volume (ppbV) using specifically developed data-processing techniques. With state-of-the-art fiber-coupled diode-laser pump sources at 1083 nm and 1561 nm, difference-frequency radiation has been generated in the 3.53-micrometer (2832-cm-1) spectral region. Formaldehyde in ambient air in the 1- to 10-ppbV range has been detected continuously for nine and five days at two separate field sites in the Greater Houston area operated by the Texas Natural Resource Conservation Commission (TNRCC) and the Houston Regional Monitoring Corporation (HRM). The acquired spectroscopic data are compared with results obtained by a well-established wet-chemical o-(2,3,4,5,6-pentafluorobenzyl) hydroxylamine (PFBHA) technique.
Estimation of global anthropogenic dust aerosol using CALIOP satellite
NASA Astrophysics Data System (ADS)
Chen, B.; Huang, J.; Liu, J.
2014-12-01
Anthropogenic dust aerosols are those produced by human activity; in this paper they mainly come from cropland, pasture, and urban areas. Because understanding of the emissions of anthropogenic dust is still very limited, a new technique for separating anthropogenic dust from natural dust, using CALIPSO dust and planetary boundary layer height retrievals along with a land use dataset, is introduced. Using this technique, the global distribution of dust is analyzed and the relative contributions of anthropogenic and natural dust sources to regional and global emissions are estimated. Local anthropogenic dust aerosol due to human activity, such as agriculture, industrial activity, transportation, and overgrazing, accounts for about 22.3% of the global continental dust load. Of these anthropogenic dust aerosols, more than 52.5% come from semi-arid and semi-wet regions. On the whole, anthropogenic dust emissions from East China and India are higher than those from other regions.
NASA Technical Reports Server (NTRS)
Brackett, Robert A.; Arvidson, Raymond E.
1993-01-01
A technique is presented that allows extraction of compositional and textural information from visible, near and thermal infrared remotely sensed data. Using a library of both emissivity and reflectance spectra, endmember abundances and endmember thermal inertias are extracted from AVIRIS (Airborne Visible and Infrared Imaging Spectrometer) and TIMS (Thermal Infrared Multispectral Scanner) data over Lunar Crater Volcanic Field, Nevada, using a dual inversion. The inversion technique is motivated by upcoming Mars Observer data and the need to separate composition and texture parameters from subpixel mixtures of bedrock and dust. The model employed offers the opportunity to extract compositional and textural information for a variety of endmembers within a given pixel. Geologic inferences concerning grain size, abundance, and source of endmembers can be made directly from the inverted data. These parameters are of direct relevance to Mars exploration, both for Mars Observer and for follow-on missions.
Marine environment pollution: The contribution of mass spectrometry to the study of seawater.
Magi, Emanuele; Di Carro, Marina
2016-09-09
The study of marine pollution has traditionally addressed persistent chemicals, generally known as priority pollutants; a current trend in environmental analysis is a shift toward "emerging pollutants," defined as newly identified or previously unrecognized contaminants. The present review is focused on the peculiar contribution of mass spectrometry (MS) to the study of pollutants in the seawater compartment. The work is organized into five sections in which the most relevant groups of pollutants, both "classical" and "emerging," are presented and discussed, highlighting the relevant data obtained by means of different MS techniques. The hyphenation of MS and separative techniques, together with the development of different ion sources, makes MS and tandem MS the analytical tools of choice for the determination of trace organic contaminants in seawater. © 2016 Wiley Periodicals, Inc.
Single photon energy dispersive x-ray diffraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higginbotham, Andrew; Patel, Shamim; Ciricosta, Orlando
2014-03-15
With the pressure range accessible to laser driven compression experiments on solid material rising rapidly, new challenges in the diagnosis of samples in harsh laser environments are emerging. When driving to TPa pressures (conditions highly relevant to planetary interiors), traditional x-ray diffraction techniques are plagued by increased sources of background and noise, as well as a potential reduction in signal. In this paper we present a new diffraction diagnostic designed to record x-ray diffraction in low signal-to-noise environments. By utilising single photon counting techniques we demonstrate the ability to record diffraction patterns on nanosecond timescales, and subsequently separate, photon-by-photon, signal from background. In doing this, we mitigate many of the issues surrounding the use of high intensity lasers to drive samples to extremes of pressure, allowing for structural information to be obtained in a regime which is currently largely unexplored.
Tavlarides, L.L.; Bae, J.H.
1991-12-24
A laser capillary spectrophotometric technique measures real-time or near real-time bivariate drop size and concentration distributions for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary, and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug, from which the free drop diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes. 17 figures.
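The geometry behind the measurement is simple: a drop drawn into the capillary becomes a cylindrical slug, so equating volumes gives the free drop diameter, while the pulse height gives concentration through the Beer-Lambert law. A sketch with illustrative numbers (all values assumed, not from the patent):

```python
def free_drop_diameter(capillary_radius_mm, slug_length_mm):
    """Equate the cylindrical slug volume to a sphere:
    pi*d**3/6 = pi*r**2*L  =>  d = (6*r**2*L)**(1/3)."""
    return (6.0 * capillary_radius_mm**2 * slug_length_mm) ** (1.0 / 3.0)

def molar_concentration(absorbance, molar_absorptivity, path_length_cm):
    """Beer-Lambert law: A = eps*c*l  =>  c = A/(eps*l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

d = free_drop_diameter(0.25, 2.0)              # mm, illustrative values
c = molar_concentration(0.42, 5000.0, 0.05)    # mol/L, illustrative values
```

Pairing each slug's derived diameter with its derived concentration is what yields the bivariate drop size-concentration distribution.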
Gamma radiation effects on the dynamic fatigue measurements of glass discs
NASA Technical Reports Server (NTRS)
Ananaba, T. O. J.; Kinser, D. L.
1985-01-01
Circular specimens of low iron soda lime silicate glass were blasted with grit after having a circular notch etched into their centers. After separation into two groups, one group was exposed to gamma radiation. The fracture strengths of all samples were then tested by the biaxial technique, i.e., specimens were balanced on three balls and loaded in the center by a piston. The irradiated samples had received a 140,000 Gy dose from a Co-60 source. An enhanced interaction between the ambient moisture and the grit-blasted central notch was observed in the irradiated samples, which displayed accelerated corrosion.
Performance of a Diaphragmed Microlens for a Packaged Microspectrometer
Lo, Joe; Chen, Shih-Jui; Fang, Qiyin; Papaioannou, Thanassis; Kim, Eun-Sok; Gundersen, Martin; Marcu, Laura
2009-01-01
This paper describes the design, fabrication, packaging and testing of a microlens integrated in a multi-layered MEMS microspectrometer. The microlens was fabricated using modified PDMS molding to form a suspended lens diaphragm. A Gaussian beam propagation model was used to measure the focal length and quantify the M2 value of the microlens. A tunable calibration source was set up to measure the response of the packaged device. Dual-wavelength separation by the packaged device was demonstrated by CCD imaging and beam profiling of the spectroscopic output. We demonstrated specific techniques to measure critical parameters of micro-optics systems for future optimization of spectroscopic devices. PMID:22399943
NASA Astrophysics Data System (ADS)
Aubry, Alexandre; Derode, Arnaud; Padilla, Frédéric
2008-03-01
We present local measurements of the diffusion constant for ultrasonic waves undergoing multiple scattering. The experimental setup uses a coherent array of programmable transducers. By achieving Gaussian beamforming at emission and reception, an array of virtual sources and receivers located in the near field is constructed. A matrix treatment is proposed to separate the incoherent intensity from the coherent backscattering peak. Local measurements of the diffusion constant D are then achieved. This technique is applied to a real case: a sample of human trabecular bone for which the ultrasonic characterization of multiple scattering is an issue.
Assessment of the Impacts of Radio Frequency Interference on SMAP Radar and Radiometer Measurements
NASA Technical Reports Server (NTRS)
Chen, Curtis W.; Piepmeier, Jeffrey R.; Johnson, Joel T.; Ghaemi, Hirad
2012-01-01
The NASA Soil Moisture Active and Passive (SMAP) mission will measure soil moisture with a combination of L-band radar and radiometer measurements. We present an assessment of the expected impact of radio frequency interference (RFI) on SMAP performance, incorporating projections based on recent data collected by the Aquarius and SMOS missions. We discuss the impacts of RFI on the radar and radiometer separately, given the differences in (1) the RFI environment between the shared radar band and the protected radiometer band, (2) the mitigation techniques available for the different measurements, and (3) the existing data sources that can inform predictions for SMAP.
Assessing contribution of DOC from sediments to a drinking-water reservoir using optical profiling
Downing, Bryan D.; Bergamaschi, Brian A.; Evans, David G.; Boss, Emmanuel
2008-01-01
Understanding the sources of dissolved organic carbon (DOC) in drinking-water reservoirs is an important management issue because DOC may form disinfection by-products, interfere with disinfection, or increase treatment costs. DOC may be derived from a host of sources: algal production within the reservoir, marginal production from mucks and vascular plants at the reservoir margins, and release from reservoir sediments. The purpose of this study was to assess whether release of DOC from reservoir sediments containing ferric chloride coagulant was a significant source of DOC to the reservoir. We examined the source-specific contributions of DOC using a profiling system to measure the in situ distribution of the optical properties of absorption and fluorescence at various locations in the reservoir. Vertical optical profiles were coupled with discrete water samples measured in the laboratory for DOC concentration and optical properties: absorption spectra and excitation-emission matrix spectra (EEMs). Modeling the in situ optical data permitted estimation of the bulk DOC profile in the reservoir as well as its separation into source-specific contributions. Analysis of the source-specific profiles and their associated optical characteristics indicated that the sedimentary source of DOC to the reservoir is significant and that this DOC is labile in the reservoir. We conclude that optical profiling is a useful technique for understanding complex biogeochemical processes in a reservoir.
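Source-specific separation of a bulk optical profile can be cast as linear unmixing: the measured absorption spectrum is modeled as a non-negative combination of endmember spectra, one per source. A noise-free sketch with hypothetical exponential (CDOM-like) endmembers, not the study's actual spectra:

```python
import numpy as np
from scipy.optimize import nnls

wl = np.arange(300.0, 501.0, 10.0)   # wavelengths, nm
# hypothetical endmember absorption spectra for three DOC sources
E = np.column_stack([np.exp(-0.005 * (wl - 300)),   # algal
                     np.exp(-0.015 * (wl - 300)),   # marginal plants
                     np.exp(-0.030 * (wl - 300))])  # sediment release
true = np.array([0.3, 0.2, 0.5])
measured = E @ true                  # noise-free bulk absorption spectrum
contrib, resid = nnls(E, measured)   # non-negative source contributions
```

Repeating the unmixing at each depth of the optical profile yields source-specific vertical profiles like those the study interprets.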
Mass transfer apparatus and method for separation of gases
Blount, Gerald C.
2015-10-13
A process and apparatus for separating components of a source gas is provided in which more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase solubility of the components of the source gas. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to gas phase. Separated components can be recovered for use in a value added application or can be processed for long-term storage, for instance in an underwater reservoir.
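The role of hydrostatic pressure can be illustrated with Henry's law: the dissolved concentration of a gas component scales with its partial pressure, which rises with depth. A rough sketch under ideal-gas, dilute-solution assumptions; the Henry constant is an approximate literature value for CO2 at 25 °C, not a figure from the patent:

```python
RHO, G, P_ATM = 1000.0, 9.81, 101325.0  # water density (kg/m^3), g (m/s^2), Pa
KH_CO2 = 29.4                           # L*atm/mol, approximate Henry constant

def dissolved_co2(depth_m):
    """mol/L of CO2 in water when the gas contacts the liquid at the
    hydrostatic pressure of the given depth."""
    p_atm_units = (P_ATM + RHO * G * depth_m) / 101325.0
    return p_atm_units / KH_CO2

surface = dissolved_co2(0.0)    # ~0.034 mol/L at 1 atm
deep = dissolved_co2(100.0)     # roughly 10x higher at ~10.7 atm
```

This pressure leverage is why operating the absorber at depth lets the more soluble components be stripped from the source gas with far less solvent.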
Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik
2011-03-01
Through an agreement with EEE producers, Swedish municipalities are responsible for the collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated at waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres, and in most cases a car is needed to reach them. A full-scale experiment was performed in a residential area in southern Sweden to evaluate the effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The system resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% for WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved, and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems. Copyright © 2010 Elsevier Ltd. All rights reserved.
Rupture Propagation Imaging of Fluid Induced Events at the Basel EGS Project
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn; Shapiro, Serge A.
2014-05-01
The analysis of rupture properties using rupture propagation imaging techniques is a fast developing field of research in global seismology. Usually the rupture fronts of large to megathrust earthquakes are the subject of such studies, e.g. the 2004 Sumatra-Andaman earthquake or the 2011 Tohoku, Japan earthquake. The back projection technique is the most prominent technique in this field: the seismograms recorded at an array or at a seismic network are back-shifted to a grid of possible source locations via a special stacking procedure. This can provide information on the energy release and energy distribution of the rupture, which can then be used to estimate event properties such as location, rupture direction, rupture speed or length. The procedure is fast and direct, and it relies only on a reasonable velocity model. Thus it is a good way to rapidly estimate rupture properties, and it can be used to confirm independently obtained event information. We adopted the back projection technique and put it in a microseismic context, and demonstrated its use for multiple synthetic ruptures within a reservoir model of microseismic scale in earlier work. Our motivation is the occurrence of relatively large induced seismic events, with magnitudes ML ≥ 3.4 and rupture lengths of several hundred meters, at a number of stimulated geothermal reservoirs and waste disposal sites. We use the configuration of the seismic network and the reservoir properties of the Basel Geothermal Site to build a synthetic model of a rupture by modeling the wave field of multiple spatio-temporally separated single sources using finite-difference modeling. The focus of this work is the application of the back projection technique and the demonstration of its feasibility for retrieving the rupture properties of real fluid-induced events. We take four microseismic events with magnitudes from ML 3.1 to 3.4 and reconstruct source parameters such as location, orientation and length.
By comparison with our synthetic results as well as independent localization studies and source mechanism studies in this area we can show, that the obtained results are reasonable and that the application of back projection imaging is not only possible for microseismic datasets of respective quality, but that it provides important additional insights in the rupture process.
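The shift-and-stack core of back projection can be sketched in a few lines. Below is a toy 2-D example with a homogeneous velocity model; the station geometry, grid, and velocity are illustrative assumptions, not the Basel configuration or the authors' implementation:

```python
import numpy as np

def back_project(traces, st_xy, grid_xy, t, v=3.0):
    """Stack station traces after back-shifting by travel time to each grid node.

    traces : (n_sta, n_t) waveforms; st_xy : (n_sta, 2) station coords [km];
    grid_xy: (n_grid, 2) candidate source nodes [km]; t: sample times [s];
    v: assumed wave speed [km/s]."""
    dt = t[1] - t[0]
    bp = np.zeros(len(grid_xy))
    for g, gx in enumerate(grid_xy):
        stack = np.zeros_like(t)
        for s, sx in enumerate(st_xy):
            shift = int(round(np.linalg.norm(gx - sx) / v / dt))
            stack[:len(t) - shift] += traces[s, shift:]   # undo propagation delay
        bp[g] = np.max(stack ** 2)                        # stacked energy
    return bp

# Synthetic test: one impulsive source at (5, 5) km recorded by 4 stations.
dt = 0.01
t = np.arange(0, 10, dt)
src = np.array([5.0, 5.0])
st_xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
traces = np.zeros((4, len(t)))
for s, sx in enumerate(st_xy):
    traces[s, int(round(np.linalg.norm(src - sx) / 3.0 / dt))] = 1.0
grid_xy = np.array([[x, y] for x in range(11) for y in range(11)], float)
best = grid_xy[np.argmax(back_project(traces, st_xy, grid_xy, t))]
print(best)   # brightest grid node coincides with the true source at (5, 5)
```

Only when the assumed node matches the true source do all back-shifted arrivals align coherently, so the stacked energy peaks there; this is the imaging condition the abstract exploits.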
Supercritical fluids in separation science--the dreams, the reality and the future.
Smith, R M
1999-09-24
The last 20 years have seen an intense interest in the use of supercritical fluids in separation science. This started with the introduction of commercial instruments first for packed and then for capillary chromatography and it looked as if this would be a technique to rival gas-liquid chromatography and HPLC. The activity developed quite rapidly into packed column supercritical fluid separations then into supercritical fluid extraction. However, in recent years there has been a decline in publications. These later techniques continue to be used but are now principally applied to a limited group of applications where they offer significant advantages over alternative techniques. This review looks back over this period and analyses how these methods were developed and the fluids, detectors and applications that were examined. It suggests why many of the initial applications have vanished and why the initial apparent promise was not fulfilled. The rise and fall of supercritical fluids represents a lesson in the way analysts approach new techniques and how we might view other new separation developments at the end of this millennium. The review looks forward to the future of supercritical fluids and their role at the end of the first century of separation science. Probably the most important idea that supercritical fluids have brought to separation science is a recognition that there is unity in the separation methods and that a continuum exists from gases to liquids.
Psychophysical investigation of an auditory spatial illusion in cats: the precedence effect.
Tollin, Daniel J; Yin, Tom C T
2003-10-01
The precedence effect (PE) describes several spatial perceptual phenomena that occur when similar sounds are presented from two different locations, separated by a delay. The mechanisms that produce the effect are thought to be responsible for the ability to localize sounds in reverberant environments. Although the physiological bases for the PE have been studied, little is known about how these sounds are localized by species other than humans. Here we used the search coil technique to measure the eye positions of cats trained to saccade to the apparent locations of sounds. To study the PE, brief broadband stimuli were presented from two locations with a delay between their onsets, the delayed sound being meant to simulate a single reflection. Although the cats accurately localized single sources, the apparent locations of the paired sources depended on the delay. First, the cats exhibited summing localization, the perception of a "phantom" sound located between the sources, for delays < ±400 μs for sources positioned in azimuth along the horizontal plane, but not for sources positioned in elevation along the sagittal plane. Second, consistent with localization dominance, for delays from 400 μs to about 10 ms, the cats oriented toward the leading source location only, with little influence of the lagging source, for both horizontally and vertically placed sources. Finally, the echo threshold was reached for delays >10 ms, where the cats first began to orient to the lagging source on some trials. These data reveal that cats experience the PE phenomena similarly to humans.
Blind source deconvolution for deep Earth seismology
NASA Astrophysics Data System (ADS)
Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.
2007-12-01
We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal in this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses and permitting better constraints in high-resolution waveform analyses. Our method consists of two stages: (1) we estimate the empirical source by automatically registering traces to their first principal component, using a weighting scheme based on their deviation from this shape, and take this shape as an estimate of the earthquake source; (2) we compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which exploits the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case, impulsive onsets of seismic arrivals. We show several examples of deep-focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water-level deconvolution, Tikhonov deconvolution, and L1-norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications and waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
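Of the deconvolution variants compared above, water-level deconvolution is the simplest to sketch: divide in the frequency domain, but clamp the denominator to a floor so spectral notches do not blow up. The wavelet, spike positions, and water-level fraction below are illustrative choices, not values from the study:

```python
import numpy as np

def water_level_deconv(trace, wavelet, level=0.01):
    """Frequency-domain deconvolution with a water-level floor on |W(f)|^2
    to stabilise the division near spectral notches."""
    n = len(trace)
    W = np.fft.rfft(wavelet, n)
    D = np.fft.rfft(trace, n)
    P = np.abs(W) ** 2
    P = np.maximum(P, level * P.max())        # clamp small spectral power
    return np.fft.irfft(D * np.conj(W) / P, n)

# Synthetic: two impulsive arrivals blurred by a one-sided damped wavelet.
t = np.arange(512)
wavelet = np.exp(-t / 20.0) * np.sin(2 * np.pi * t / 40.0)
refl = np.zeros(512)
refl[100] = 1.0
refl[180] = -0.5
trace = np.convolve(refl, wavelet)[:512]
est = water_level_deconv(trace, wavelet)
print(np.argmax(est), np.argmin(est))   # spikes recovered near samples 100 and 180
```

The deblurred trace is far more impulsive than the input, which is exactly the property the abstract is after; TV or L1 regularization sharpens this further at the cost of an iterative solve.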
NASA Technical Reports Server (NTRS)
Brooks, D. E.
1979-01-01
Technique utilizing electric field to promote biological cell separation from suspending medium in zero gravity increases speed, reduces sedimentation, and improves efficiency of separation in normal gravity.
Safeguard monitoring of direct electrolytic reduction
NASA Astrophysics Data System (ADS)
Jurovitzki, Abraham L.
Nuclear power is regaining global prominence as a sustainable energy source as the world faces the consequences of depending on limited fossil-based, CO2-emitting fuels. A key component of achieving this sustainability is implementing a closed nuclear fuel cycle; without it, only a relatively small fraction of the energy value in nuclear fuel is actually utilized. This involves recycling spent nuclear fuel (SNF): separating fissile actinides from waste products and using them to fabricate fresh fuel. Pyroprocessing is a viable option being developed for this purpose, with a host of benefits compared to other recycling options such as PUREX. Notably, pyroprocessing is ill-suited to separating pure plutonium from spent fuel and thus has non-proliferation benefits. Pyroprocessing involves high-temperature electrochemical and chemical processing of SNF in a molten salt electrolyte. During this batch process, several intermediate and final streams are produced that contain radioactive material. While pyroprocessing is ineffective at separating pure plutonium, various process misuse scenarios could result in the diversion of impure plutonium into one or more of these streams. This is a proliferation risk that should be addressed with innovative safeguards technology. One approach to meeting this challenge is to develop real-time monitoring techniques that can be implemented in the hot cells and coupled with the various unit operations involved in pyroprocessing. Current state-of-the-art monitoring techniques involve external chemical assaying, which requires sample removal from these unit operations; these methods do not meet the International Atomic Energy Agency's (IAEA) timeliness requirements. In this work, a number of monitoring techniques were assessed for their viability as online monitoring tools, and a hypothetical diversion scenario for the direct electrolytic reduction process was experimentally verified (using Nd2O3 as a surrogate for PuO2).
Electrochemical analysis was demonstrated to be effective at detecting even very dilute concentrations of actinides as evidence for a diversion attempt.
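Quantitative detection in such electroanalytical monitoring typically rests on a calibration curve relating a voltammetric peak current to analyte concentration. A minimal sketch with invented numbers (these are illustrative values, not data from this dissertation):

```python
import numpy as np

# Invented calibration data: peak current vs. oxide-surrogate concentration
# in the melt (illustrative only; not measurements from this work).
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])         # wt% surrogate oxide
i_peak = np.array([0.21, 0.41, 0.79, 1.62, 3.18])  # A, with small scatter

slope, intercept = np.polyfit(conc, i_peak, 1)     # linear calibration fit

def estimate_conc(i):
    """Invert the calibration line: concentration implied by a peak current."""
    return (i - intercept) / slope

print(round(estimate_conc(1.0), 2))   # concentration implied by a 1.0 A peak
```

In an online-monitoring setting, a measured peak current well above the expected baseline would flag a concentration anomaly, i.e. possible evidence of a diversion attempt.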
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorhout, Jacquelyn Marie
This dissertation covers several distinct projects relating to the fields of nuclear forensics and basic actinide science. Post-detonation nuclear forensics, in particular the study of fission products resulting from a nuclear device to determine device attributes and information, often depends on the comparison of fission products to a library of known ratios. The expansion of this library is imperative as technology advances. Rapid separation of fission products from a target material, without the need to dissolve the target, is an important technique to develop to improve the library and to provide a means of developing samples and standards for testing separations. Several materials were studied as a proof of concept that fission products can be extracted from a solid target, including microparticulate (< 10 μm diameter) dUO2, porous metal organic frameworks (MOFs) synthesized from depleted uranium (dU), and other organic-based frameworks containing dU. The targets were irradiated with fast neutrons from one of two different neutron sources, contacted with dilute acids to facilitate the separation of fission products, and analyzed via gamma spectroscopy for separation yields. The results indicate that smaller particles of dUO2 in contact with the secondary matrix KBr give higher separation yields than particles without a secondary matrix. It was also discovered that using 0.1 M HNO3 as a contact acid leads to dissolution of the target material; lower concentrations of acid were used for subsequent experiments. In the case of the MOFs, a larger pore size in the framework leads to higher separation yields when contacted with 0.01 M HNO3, and different types of frameworks also yield different results.
Magnetic techniques for the isolation and purification of proteins and peptides
Safarik, Ivo; Safarikova, Mirka
2004-01-01
Isolation and separation of specific molecules is used in almost all areas of biosciences and biotechnology. Diverse procedures can be used to achieve this goal. Recently, increased attention has been paid to the development and application of magnetic separation techniques, which employ small magnetic particles. The purpose of this review paper is to summarize various methodologies, strategies and materials which can be used for the isolation and purification of target proteins and peptides with the help of magnetic field. An extensive list of realised purification procedures documents the efficiency of magnetic separation techniques. PMID:15566570
Maguregui, M I; Alonso, R M; Barandiaran, M; Jimenez, R M; García, N
2007-06-22
The identification of organic colorants used in artistic paintings is an important information source for reconstructing the working techniques found in a particular work and for defining a programme for the restoration and conservation of the painting. In this work, sodium dodecyl sulfate (SDS) was used as a surfactant in micellar electrokinetic chromatography (MEKC) for separating a broad range of red organic pigments, based on their colouring matters: madder (colouring matters: alizarin, quinizarin and purpurin), cochineal (colouring matter: carminic acid), red sandalwood (colouring matter: santalin), brazilwood (colouring matter: brazilin), lac dye (colouring matter: laccaic acid) and dragon's blood (colouring matter: dracorhodin). The running electrolyte was 20 mM borax (pH 9) containing 20 mM SDS and 10% acetonitrile as organic modifier. Separation was carried out by applying a +20 kV voltage at the injection end, at 25 °C, with detection at 214 nm and 254 nm. All colorants were separated in less than 13 min with good baseline resolution. The method was applied to the analysis of paint samples obtained from the Diocesan Museum of Holy Art of Bilbao.
Discovery of a dual active galactic nucleus with ˜8 kpc separation
NASA Astrophysics Data System (ADS)
Ellison, Sara L.; Secrest, Nathan J.; Mendel, J. Trevor; Satyapal, Shobita; Simard, Luc
2017-09-01
Targeted searches for dual active galactic nuclei (AGNs), with separations 1-10 kpc, have yielded relatively few successes. A recent pilot survey by Satyapal et al. has demonstrated that mid-infrared (mid-IR) pre-selection has the potential to significantly improve the success rate for dual AGN confirmation in late-stage galaxy mergers. In this Letter, we combine mid-IR selection with spatially resolved optical AGN diagnostics from the Mapping Nearby Galaxies at Apache Point Observatory survey to identify a candidate dual AGN in the late-stage major galaxy merger SDSS J140737.17+442856.2 at z = 0.143. The nature of the dual AGN is confirmed with Chandra X-ray observations that identify two hard X-ray point sources with intrinsic (absorption-corrected) 2-10 keV luminosities of 4 × 10^41 and 3.5 × 10^43 erg s^-1, separated by 8.3 kpc. The neutral hydrogen absorption (~10^22 cm^-2) towards the two AGNs is lower than in duals selected solely on their mid-IR colours, indicating that strategies combining optical and mid-IR diagnostics may complement techniques that identify the highly obscured dual phase, such as at high X-ray energies or in the mid-IR only.
Separation of Doppler radar-based respiratory signatures.
Lee, Yee Siong; Pathirana, Pubudu N; Evans, Robin J; Steinfort, Christopher L
2016-08-01
Respiration detection using microwave Doppler radar has attracted significant interest, primarily due to its unobtrusive form of measurement. Requiring less preparation than attaching physical sensors to the body or wearing special clothing, Doppler radar for respiration detection and monitoring is particularly useful for long-term monitoring applications such as sleep studies (e.g. sleep apnoea, SIDS). However, motion artefacts and interference from multiple sources limit the widespread use and the scope of potential applications of this technique. Utilising recent advances in independent component analysis (ICA) and multiple-antenna configuration schemes, this work investigates the feasibility of decomposing the Doppler-based measurements into per-subject respiratory signatures. Experimental results demonstrated that FastICA is capable of separating two distinct respiratory signatures from two subjects adjacent to each other, even in the presence of apnoea. In each test scenario, the separated respiratory patterns correlate closely with the reference respiration strap readings. The effectiveness of FastICA in dealing with mixed Doppler radar respiration signals confirms its applicability in healthcare, especially in long-term home-based monitoring, which usually involves at least two people in the same environment (e.g. two people sleeping next to each other). Further, the use of FastICA to separate involuntary movements, such as an arm swing, from the respiratory signature of a single subject was explored in a multiple-antenna environment. The separated respiratory signal indeed demonstrated a high correlation with measurements made by a respiratory strap as currently used in clinical settings.
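The separation step itself can be sketched with a minimal FastICA implementation (tanh contrast, symmetric decorrelation) applied to two synthetic "respiratory" waveforms. The waveforms and the mixing matrix below are invented stand-ins for the two-antenna measurements, not the paper's experimental pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "respiratory" sources with different breathing rates.
t = np.arange(0, 60, 0.05)
s1 = np.sin(2 * np.pi * 0.25 * t)                # subject 1: ~15 breaths/min
s2 = np.sign(np.sin(2 * np.pi * 0.4 * t))        # subject 2: non-sinusoidal pattern
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.5, 1.0]])           # unknown antenna mixing
X = A @ S                                        # observed two-channel mixture

# Minimal FastICA: whiten, then fixed-point iteration with tanh nonlinearity.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X           # whitened data
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = G @ Z.T / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)
    W = u @ vt                                   # symmetric orthogonalisation
Y = W @ Z                                        # recovered sources (up to order/sign/scale)

corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))                          # each estimate matches one true source
```

Because ICA recovers sources only up to permutation, sign, and scale, the match is checked via absolute correlation with the references, mirroring the paper's comparison against respiration-strap readings.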
NASA Astrophysics Data System (ADS)
Subudhi, Sudhakar; Sreenivas, K. R.; Arakeri, Jaywant H.
2013-01-01
This work is concerned with the removal of unwanted fluid through a source-sink pair. The source consists of fluid issuing out of a nozzle in the form of a jet, and the sink is a pipe kept some distance from the source pipe. Of concern is the percentage of source fluid sucked through the sink. The experiments were carried out in a large glass water tank. The source nozzle diameter is 6 mm and the sink pipe diameter is either 10 or 20 mm; the horizontal and vertical separations and the angles between the source and sink pipes are adjustable. The flow was visualized using KMnO4 dye, planar laser-induced fluorescence and particle streak photographs. To obtain the effectiveness (that is, the percentage of source fluid entering the sink pipe), a titration method is used. The velocity profiles with and without the sink were obtained using particle image velocimetry. The sink flow rate needed to obtain a certain effectiveness increases dramatically with lateral separation. The sink diameter and the angle between the source and sink axes do not influence effectiveness as much as the lateral separation does.
Rosenberg, Erwin
2003-06-06
The use of mass spectrometry based on atmospheric pressure ionisation techniques (atmospheric pressure chemical ionisation, APCI, and electrospray ionisation, ESI) for speciation analysis is reviewed with emphasis on the literature published in and after 1999. This report accounts for the increasing interest that atmospheric pressure ionisation techniques, and in particular ESI, have found in the past years for qualitative and quantitative speciation analysis. In contrast to element-selective detectors, organic mass spectrometric techniques provide information on the intact metal species which can be used for the identification of unknown species (particularly with MS-MS detection) or the confirmation of the actual presence of species in a given sample. Due to the complexity of real samples, it is inevitable in all but the simplest cases to couple atmospheric pressure MS detection to a separation technique. Separation in the liquid phase (capillary electrophoresis or liquid chromatography in reversed phase, ion chromatographic or size-exclusion mode) is particularly suitable since the available techniques cover a very wide range of analyte polarities and molecular mass. Moreover, derivatisation can normally be avoided in liquid-phase separation. Particularly in complex environmental or biological samples, separation in one dimension is not sufficient for obtaining adequate resolution for all relevant species. In this case, multi-dimensional separation, based on orthogonal separation techniques, has proven successful. ESI-MS is also often used in parallel with inductively coupled plasma MS detection. This review is structured in two parts. In the first, the fundamentals of atmospheric pressure ionisation techniques are briefly reviewed. 
The second part of the review discusses recent applications, including redox species, the use of ESI-MS for structural elucidation of metal complexes, and the characterisation and quantification of small organometallic species with relevance to the environment, health and food. Particular attention is given to the characterisation of biomolecules and metalloproteins (metallothioneins and phytochelatins) and to the investigation of the interaction of metals and biomolecules. Particularly in the latter field, ESI-MS is the ideal technique due to the softness of the ionisation process, which allows one to assume that the detected gas-phase ions are a true representation of the ions or ion-biomolecule complexes prevalent in solution. It is particularly this field, important to biochemistry, physiology and medical chemistry, where we can expect significant developments in the future.
Investigation of foam flotation and phase partitioning techniques
NASA Technical Reports Server (NTRS)
Currin, B. L.
1985-01-01
The present status of foam flotation as a separation process is evaluated and its limitations for cells and proteins are determined. Possible applications of foam flotation to separations in microgravity are discussed. The fluid-mechanical aspects of foam separation techniques are applied to phase partitioning in order to investigate the viscous drag forces that may affect the partitioning of cells in a two-phase poly(ethylene glycol)-dextran system.
On-road and wind-tunnel measurement of motorcycle helmet noise.
Kennedy, J; Carley, M; Walker, I; Holt, N
2013-09-01
The noise source mechanisms involved in motorcycling include various aerodynamic sources and engine noise. The problem of noise source identification requires extensive data acquisition of a type and level that have not previously been applied. Data acquisition on track and on road is problematic due to rider safety constraints and the portability of appropriate instrumentation. One way to address this problem is to use data from wind tunnel tests, but the validity of such measurements for noise source identification must first be demonstrated. To achieve this, extensive wind tunnel tests were conducted and compared with the results from on-track measurements. Sound pressure levels as a function of speed were compared between on-track and wind tunnel tests and found to be comparable. Spectral conditioning techniques were applied to separate engine and wind tunnel noise from aerodynamic noise, and showed that the aerodynamic components were equivalent in both cases. The spectral conditioning of on-track data showed that the contribution of engine noise to the overall noise is a function of speed and is more significant than had previously been thought. These procedures form a basis for accurate experimental measurement of motorcycle noise.
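Spectral conditioning in this context means removing from the at-ear spectrum the part coherent with a reference channel (here, engine noise). A minimal sketch using the ordinary coherence function with scipy; the signal levels, bands, and sample rate are invented, not the study's instrumentation:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 8192
n = fs * 8

engine = rng.standard_normal(n)                      # reference: engine-noise proxy
b, a = signal.butter(4, [0.05, 0.15], btype="bandpass")
aero = signal.lfilter(b, a, rng.standard_normal(n))  # independent "aerodynamic" noise
helmet = 2.0 * engine + aero                         # at-ear measurement (mixture)

f, Sxy = signal.csd(engine, helmet, fs=fs, nperseg=1024)
_, Sxx = signal.welch(engine, fs=fs, nperseg=1024)
_, Syy = signal.welch(helmet, fs=fs, nperseg=1024)
coh = np.abs(Sxy) ** 2 / (Sxx * Syy)      # ordinary coherence estimate
Syy_cond = Syy * (1 - coh)                # conditioned spectrum: engine part removed

_, Saero = signal.welch(aero, fs=fs, nperseg=1024)
band = (f > 0.05 * fs / 2) & (f < 0.15 * fs / 2)     # aero-dominated band
ratio = Syy_cond[band].mean() / Saero[band].mean()
print(ratio)   # ≈ 1: the conditioned spectrum recovers the aero-only level
```

Multiplying the measured autospectrum by (1 - coherence) leaves only the power that cannot be linearly predicted from the reference, which is how engine contamination is stripped from the aerodynamic component.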
Improved charge breeding efficiency of light ions with an electron cyclotron resonance ion source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondrasek, R.; Kutsaev, Sergey; Delahaye, P.
2012-11-15
The Californium Rare Isotope Breeder Upgrade is a new radioactive beam facility for the Argonne Tandem Linac Accelerator System (ATLAS). The facility utilizes a 252Cf fission source coupled with an electron cyclotron resonance ion source to provide radioactive beam species for the ATLAS experimental program. The californium fission fragment distribution provides nuclei in the mid-mass range which are difficult to extract from production targets using the isotope separation on line technique and are not well populated by low-energy fission of uranium. To date the charge breeding program has focused on optimizing these mid-mass beams, achieving high charge breeding efficiencies for both gaseous and solid species, including 14.7% for the radioactive species 143Ba27+. In an effort to better understand the charge breeding mechanism, we have recently focused on the low-mass species sodium and potassium, which up to now have been difficult to charge breed efficiently. Unprecedented charge breeding efficiencies of 10.1% for 23Na7+ and 17.9% for 39K10+ were obtained injecting stable Na+ and K+ beams from a surface ionization source.
Coelho, Marcelo Santos; Card, Steven John; Tawil, Peter Zahi
2017-03-01
The aim of this study was to retrospectively assess the safety potential of a hybrid technique combining nickel-titanium (NiTi) reciprocating and rotary instruments as used by third- and fourth-year dental students in the predoctoral endodontics clinic at one U.S. dental school. For the study, 3,194 root canal treatments performed by 317 dental students from 2012 through 2015 were evaluated for the incidence of ledge creation and instrument separation. The hybrid reciprocating and rotary technique (RRT) consisted of glide path creation with stainless steel hand files up to size 15/02, a crown-down preparation with a NiTi reciprocating instrument, and an apical preparation with NiTi rotary instruments. The control was a traditional rotary and hand technique (RHT), which consisted of the same glide path procedure followed by a crown-down preparation with NiTi rotary instruments and an apical preparation with NiTi hand instruments. The results showed a rate of ledge creation of 1.4% per root for the RHT technique versus 0.5% per root for the RRT technique (p<0.05). Three stainless steel hand files separated: two in the RHT group and one in the RRT group. No NiTi file separated in either technique. The use of the reciprocating and rotary technique for root canal instrumentation by these dental students thus proved safe, offering a low rate of ledge creation along with no NiTi instrument separation.
Radiochemistry in the twenty-first century: Strengths, weaknesses, opportunities and threats
NASA Astrophysics Data System (ADS)
de Goeij, J. J. M.
2003-01-01
Strengths, weaknesses, opportunities and threats of radiochemistry and associated nuclear chemistry are discussed. For that purpose, radiochemistry is subdivided into three categories. The first category covers fundamental aspects, e.g. nuclear reaction cross-sections, production routes with associated yields and radionuclidic impurities, decay schemes, radiochemical separations, recoil and hot-atom chemistry, isotope effects and fractionation, and the interaction of radiation with matter and its detection. The second category covers topics where radioactivity is inextricably involved, e.g. the nuclear fuel cycle, very heavy elements and other actinides, primordial and cosmogenic radioactivity, and radionuclide techniques for dating. The third category involves radioactivity as an essential part of a technique: on the one hand, radioactivity is used as a source of ionising radiation for food conservation, polymerisation of plastics, sterilisation, radiotherapy and pain palliation; on the other hand, it is used to obtain information on systems and materials via radiotracer methods and nuclear activation techniques. The latter field in particular is experiencing strong competition from other, non-nuclear methods. In this context, it is indicated what is required for nuclear analytical techniques to be successfully exploited to the full extent of their potential, particularly in providing valuable and sometimes unique information.
Reduction of polyatomic interferences in ICP-MS by collision/reaction cell (CRC-ICP-MS) techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eiden, Greg C; Barinaga, Charles J; Koppenaal, David W
2012-05-01
Polyatomic and other spectral interferences in plasma source mass spectrometry (PSMS) can be dramatically reduced using collision and reaction cells (CRC). These devices have been used for decades in fundamental studies of ion-molecule chemistry, but have only recently been applied to PSMS. Benefits of this approach as applied in inductively coupled plasma MS (ICP-MS) include interference reduction, isobar separation, and thermalization/focusing of ions. Novel ion-molecule chemistry schemes are now routinely designed and empirically evaluated with relative ease. These "chemical resolution" techniques can avert interferences that would otherwise require mass spectral resolutions of >600,000 (m/Δm). Purely physical ion beam processes, including collisional damping and collisional dissociation, are also employed to provide improved sensitivity, resolution, and spectral simplicity. CRC techniques are now firmly entrenched in current-day ICP-MS technology, enabling unprecedented flexibility and freedom from many spectral interferences. A significant body of applications has now been reported in the literature. CRC techniques are found to be most useful for specialized or difficult analytical needs and situations, and are employed in both single- and multi-element determination modes.
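The resolving-power argument is easy to quantify for a textbook polyatomic interference, 40Ar16O+ on 56Fe+. The atomic masses below are standard values quoted from memory to five decimals and should be treated as approximate; this well-known pair needs only a modest resolution, while the same arithmetic shows why some interference pairs demand the >600,000 figure cited above:

```python
# Required mass resolving power R = m / Δm to separate an analyte from a
# polyatomic interference at the same nominal mass (here, nominal mass 56).
m_Ar40 = 39.96238        # 40Ar (approximate standard atomic mass)
m_O16 = 15.99491         # 16O
m_Fe56 = 55.93494        # 56Fe
m_ArO = m_Ar40 + m_O16   # 40Ar16O+ polyatomic ion

R = m_Fe56 / abs(m_ArO - m_Fe56)
print(round(R))          # ≈ 2500: within reach of sector instruments, yet CRC
                         # "chemical resolution" sidesteps the requirement entirely
```

Reacting away the ArO+ ion in the cell (or reactively shifting the analyte to a clean mass) achieves the same discrimination chemically, with no loss of transmission from high-resolution slits.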
No-search algorithm for direction of arrival estimation
NASA Astrophysics Data System (ADS)
Tuncer, T. Engin; Özgen, M. Tankut
2009-10-01
Direction of arrival (DOA) estimation is an important problem in ionospheric research and electromagnetics, as well as in many other fields. When superresolution techniques are used, a computationally expensive search must generally be performed. In this paper, a no-search algorithm is presented. The idea is to separate the source signals in the time-frequency plane by using the Short-Time Fourier Transform. The direction vector for each source is found by coherent summation over the instantaneous frequency (IF) tracks of the individual sources, which are found automatically by employing morphological image processing. Both overlapping and non-overlapping source IF tracks can be processed and identified by the proposed approach; the CLEAN algorithm is adopted in order to isolate the IF tracks of overlapping sources with different powers. The proposed method is very effective in finding the IF tracks and can be applied to signals with arbitrary IF characteristics. While the proposed method can be applied to any sensor geometry, planar uniform circular arrays (UCA) bring additional advantages. Different properties of the UCA are presented, and it is shown that the DOA angles can be found as the mean-square-error-optimum solution of a linear matrix equation. Several simulations show that the proposed approach performs significantly better than conventional methods.
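The separate-the-sources-in-the-TF-plane step can be illustrated with a simple STFT masking sketch on two synthetic signals with distinct IF tracks. The morphological IF tracking, CLEAN isolation, and UCA direction-finding stages of the actual algorithm are omitted; signals and frequencies are invented:

```python
import numpy as np
from scipy import signal

fs = 1000
t = np.arange(0, 2, 1 / fs)
s1 = np.sin(2 * np.pi * 100 * t)                    # source 1: constant 100 Hz
s2 = np.sin(2 * np.pi * (150 * t + 25 * t ** 2))    # source 2: chirp, IF = 150 + 50 t
x = s1 + s2                                         # single-sensor mixture

f, tt, Z = signal.stft(x, fs=fs, nperseg=256)
# Crude TF-plane separation: assign each bin to the nearer known IF track.
if1 = np.full_like(tt, 100.0)
if2 = 150.0 + 50.0 * tt
mask1 = np.abs(f[:, None] - if1) < np.abs(f[:, None] - if2)
_, y1 = signal.istft(Z * mask1, fs=fs, nperseg=256)
_, y2 = signal.istft(Z * (~mask1), fs=fs, nperseg=256)

n1 = min(len(y1), len(s1))
n2 = min(len(y2), len(s2))
c1 = np.corrcoef(y1[:n1], s1[:n1])[0, 1]
c2 = np.corrcoef(y2[:n2], s2[:n2])[0, 1]
print(round(c1, 2), round(c2, 2))    # each masked reconstruction tracks one source
```

Once each source occupies its own TF support, coherent summation along its IF track (the paper's next step) can be performed per source, which is what removes the need for a DOA grid search.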
Microwave Assisted Helicon Plasmas
NASA Astrophysics Data System (ADS)
McKee, John; Caron, David; Jemiolo, Andrew; Scime, Earl
2017-10-01
The use of two (or more) rf sources at different frequencies is a common technique in the plasma processing industry to control ion energy characteristics separately from plasma generation. A similar approach is presented here, with the focus on modifying the electron population in argon and helium plasmas. The plasma is generated by a helicon source at a frequency f0 = 13.56 MHz. Microwaves of frequency f1 = 2.45 GHz are then injected into the helicon source chamber perpendicular to the background magnetic field. The microwaves damp on the electrons via X-mode Electron Cyclotron Heating (ECH) at the upper hybrid resonance, providing additional energy input into the electrons. The effects of this secondary-source heating on electron density, temperature, and energy distribution function are examined and compared to helicon-only single-source plasmas, as well as to numerical models suggesting that the heating is not evenly distributed. Optical Emission Spectroscopy (OES) is used to examine the impact of the energetic tail of the electron distribution on ion and neutral species via collisional excitation. Large enhancements of neutral spectral lines are observed in both Ar and He. While only a small enhancement of ion lines is seen in Ar, ion lines not normally present in He are observed during microwave injection. U.S. National Science Foundation Grant No. PHY-1360278.
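Where the 2.45 GHz microwaves damp is set by the upper hybrid resonance condition, f_uh^2 = f_ce^2 + f_pe^2. A quick check with textbook formulas; the field and density values below are illustrative choices, not measurements from this experiment:

```python
import math

E_CHARGE = 1.602e-19   # C
M_E = 9.109e-31        # kg
EPS0 = 8.854e-12       # F/m

def f_ce(B):
    """Electron cyclotron frequency [Hz]; B in tesla (≈ 28 GHz per tesla)."""
    return E_CHARGE * B / (2 * math.pi * M_E)

def f_pe(n_e):
    """Electron plasma frequency [Hz]; n_e in m^-3."""
    return math.sqrt(n_e * E_CHARGE ** 2 / (EPS0 * M_E)) / (2 * math.pi)

def f_uh(B, n_e):
    """Upper hybrid resonance frequency [Hz]."""
    return math.hypot(f_ce(B), f_pe(n_e))

# Illustrative helicon-like parameters: B = 0.05 T, n_e = 5e16 m^-3
# place the upper hybrid layer near the injected 2.45 GHz.
print(f_uh(0.05, 5e16) / 1e9)   # ≈ 2.45 (GHz)
```

Because f_pe depends on density, the resonance layer moves through the plasma's density profile, which is one reason the secondary heating need not be evenly distributed.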
Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.
Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran
2017-04-01
Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). We therefore propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of an event of interest and the reporting on it. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
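The underlying idea, that the overlap between sources identifies their reporting rates, can be illustrated with the classic two-source capture-recapture estimator, a much simpler relative of the proposed MLE (it assumes independent sources and no covariates; the simulation parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Ground truth: 20,000 events; two outlets report each event independently.
n_events = 20_000
r1, r2 = 0.6, 0.5                  # per-source reporting rates (unknown in practice)
rep1 = rng.random(n_events) < r1   # events covered by source 1
rep2 = rng.random(n_events) < r2   # events covered by source 2

n1, n2 = rep1.sum(), rep2.sum()
m12 = (rep1 & rep2).sum()          # events appearing in BOTH sources
n_hat = n1 * n2 / m12              # Lincoln-Petersen estimate of the true total
print(int(n_hat))                  # ≈ 20,000, despite neither source seeing all
print(int((rep1 | rep2).sum()))    # naive union undercounts (~80% coverage here)
```

The overlap count m12 plays the role of the misclassification model in the paper: if the two sources rarely agree, each must be missing a lot, and the estimate scales the observed counts up accordingly.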
Wang, Rui; Jin, Xin; Wang, Ziyuan; Gu, Wantao; Wei, Zhechao; Huang, Yuanjie; Qiu, Zhuang; Jin, Pengkang
2018-01-01
This paper proposes a new system of multilevel reuse with source separation in printing and dyeing wastewater (PDWW) treatment in order to dramatically improve the water reuse rate to 35%. By analysing the characteristics of the sources and concentrations of pollutants produced in different printing and dyeing processes, special, highly, and less contaminated wastewaters (SCW, HCW, and LCW, respectively) were collected and treated separately. Specifically, a large quantity of LCW was sequentially reused at multiple levels to meet the water quality requirements for different production processes. Based on this concept, a multilevel reuse system with a source separation process was established in a typical printing and dyeing enterprise. The water reuse rate increased dramatically to 62%, and the reclaimed water was reused in different printing and dyeing processes based on the water quality. This study provides promising leads in water management for wastewater reclamation. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, John Howard; Alvare, Javier
A reactor has two chambers, namely an oil feedstock chamber and a source chamber. An ion separator separates the oil feedstock chamber from the source chamber, wherein the ion separator allows alkali metal ions to pass from the source chamber, through the ion separator, and into the oil feedstock chamber. A cathode is at least partially housed within the oil feedstock chamber and an anode is at least partially housed within the source chamber. A quantity of an oil feedstock is within the oil feedstock chamber, the oil feedstock comprising at least one carbon atom and a heteroatom and/or one or more heavy metals, the oil feedstock further comprising naphthenic acid. When the alkali metal ion enters the oil feedstock chamber, the alkali metal reacts with the heteroatom, the heavy metals and/or the naphthenic acid, wherein the reaction with the alkali metal forms inorganic products.
Teaching Separations: Why, What, When, and How?
ERIC Educational Resources Information Center
Wankat, Phillip C.
2001-01-01
Describes how and when to teach separation science to chemical engineering students. Separation science is important for industrial businesses involving the manufacture of adsorption systems, distillation columns, extractors, and other separation equipment and techniques. (Contains 13 references.) (YDS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B; Reyes, J; Wong, J
Purpose: To overcome the limitation of CT/CBCT in guiding radiation for soft tissue targets, we developed a bioluminescence tomography (BLT) system for preclinical radiation research. We systematically assessed the system performance in target localization and the ability to resolve two sources in simulations, phantom, and in vivo environments. Methods: Multispectral images acquired in a single projection were used for the BLT reconstruction. Simulation studies were conducted for a single spherical source with radius from 0.5 to 3 mm at depths of 3 to 12 mm. The same configuration was also applied for the double-source simulation, with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, 2 sources with 3 and 5 mm separation at a depth of 5 mm, or 3 sources in the abdomen were also used to illustrate the in vivo localization capability of the BLT system. Results: Simulation and phantom results illustrate that our BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging in that 1 and 1.7 mm accuracy can be attained for the single-source case at 6 and 9 mm depth, respectively. For the 2-source study, both sources can be distinguished at 3 and 5 mm separations with approximately 1 mm accuracy using 3D BLT but not 2D bioluminescence imaging. Conclusion: Our BLT/CBCT system can potentially be applied to localize and resolve targets over a wide range of target sizes, depths, and separations. The information provided in this study can be instructive for devising margins for BLT-guided irradiation and suggests that the BLT could guide radiation for multiple targets, such as metastases. Drs. John W. Wong and Iulian I. Iordachita receive royalty payment from a licensing agreement between Xstrahl Ltd and Johns Hopkins University.
Apparatus for the liquefaction of natural gas and methods relating to same
Wilding, Bruce M [Idaho Falls, ID; Bingham, Dennis N [Idaho Falls, ID; McKellar, Michael G [Idaho Falls, ID; Turner, Terry D [Ammon, ID; Raterman, Kevin T [Idaho Falls, ID; Palmer, Gary L [Shelley, ID; Klingler, Kerry M [Idaho Falls, ID; Vranicar, John J [Concord, CA
2007-05-22
An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.
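The expander/compressor coupling described above can be illustrated with a back-of-the-envelope work balance. The sketch below assumes ideal-gas, isentropic behavior with approximate methane properties; the flow rates, split, and pressure ratio are hypothetical illustrations, not figures from the patent (a real plant design would use real-gas properties and machine efficiencies).

```python
CP = 2220.0     # J/(kg K), approximate cp of methane near ambient
GAMMA = 1.31    # approximate heat-capacity ratio of methane

def isentropic_temperature(t_in, pressure_ratio):
    """Outlet temperature for an isentropic pressure change of an ideal gas."""
    return t_in * pressure_ratio ** ((GAMMA - 1.0) / GAMMA)

# Cooling stream expanded 5:1 in the turbo expander: it cools and yields work.
m_cool = 2.0                                       # kg/s, hypothetical
t_cool_out = isentropic_temperature(300.0, 1.0 / 5.0)
w_expander = m_cool * CP * (300.0 - t_cool_out)    # W, work output

# The expander's work output drives the process-stream compressor; for a
# given process flow, this fixes the compressor's temperature rise, which
# the expanded cooling stream must then remove before final expansion.
m_proc = 1.5                                       # kg/s, hypothetical
dt_compressor = w_expander / (m_proc * CP)         # K
```

The point of the cycle is visible in the numbers: the cooling stream drops roughly 95 K across the expander, and the recovered work (not electricity from the grid) pays for compressing the process stream.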
Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same
Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.
2005-11-08
An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.
Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same
Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.
2005-05-03
An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.
Apparatus For The Liquefaction Of Natural Gas And Methods Relating To Same
Wilding, Bruce M.; Bingham, Dennis N.; McKellar, Michael G.; Turner, Terry D.; Raterman, Kevin T.; Palmer, Gary L.; Klingler, Kerry M.; Vranicar, John J.
2003-06-24
An apparatus and method for producing liquefied natural gas. A liquefaction plant may be coupled to a source of unpurified natural gas, such as a natural gas pipeline at a pressure letdown station. A portion of the gas is drawn off and split into a process stream and a cooling stream. The cooling stream passes through a turbo expander creating work output. A compressor is driven by the work output and compresses the process stream. The compressed process stream is cooled, such as by the expanded cooling stream. The cooled, compressed process stream is divided into first and second portions with the first portion being expanded to liquefy the natural gas. A gas-liquid separator separates the vapor from the liquid natural gas. The second portion of the cooled, compressed process stream is also expanded and used to cool the compressed process stream. Additional features and techniques may be integrated with the liquefaction process including a water clean-up cycle and a carbon dioxide (CO2) clean-up cycle.
Measurements of the Free-Stream Fluctuations above a Turbulent Boundary Layer
NASA Technical Reports Server (NTRS)
Wood, D. H.; Westphal, R. V.
1988-01-01
In this paper an investigation of the velocity fluctuations in the free stream above an incompressible turbulent boundary layer developing at constant pressure is described. It is assumed that the fluctuations receive contributions from three statistically independent sources: (1) one-dimensional unsteadiness, (2) free-stream turbulence, and (3) the irrotational motion induced by the turbulent boundary layer. Measurements were made in a wind tunnel with a root-mean-square level of the axial velocity fluctuations of about 0.2%. All three velocity components were measured using an X-wire probe. The unsteadiness was determined from the spanwise covariance of the axial velocity fluctuations, measured using two single-wire probes. The results show that it is possible to separate the contributions to the rms level of the velocity fluctuations without resorting to the dubious technique of high-pass filtering. This separation could be extended to the spectral densities of the contributions if measurements of sufficient accuracy were available.
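The separation described above rests on a simple statistical identity: with independent sources, variances add, and the one-dimensional unsteadiness is the only component correlated between two widely separated spanwise probes, so their covariance isolates it. The synthetic sketch below illustrates that decomposition; the fluctuation levels and the lumping of turbulence and induced motion into a single "local" term are illustrative assumptions, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Synthetic probe signals: a one-dimensional unsteadiness component common
# to both probes, plus fluctuations (turbulence + induced irrotational
# motion) uncorrelated between spanwise-separated probes.
unsteadiness = 0.002 * rng.standard_normal(n)   # shared by both probes
local_a = 0.001 * rng.standard_normal(n)        # probe A only
local_b = 0.001 * rng.standard_normal(n)        # probe B only
u_a = unsteadiness + local_a
u_b = unsteadiness + local_b

# The spanwise covariance of the two probes recovers the unsteadiness
# variance; subtracting it from one probe's total variance leaves the
# local contribution -- no high-pass filtering needed.
var_unsteady = np.mean((u_a - u_a.mean()) * (u_b - u_b.mean()))
var_local = u_a.var() - var_unsteady
```

The same identity applied frequency by frequency is what would extend the separation to spectral densities, as the final sentence of the abstract suggests.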