Feasibility of Rapid Multitracer PET Tumor Imaging
NASA Astrophysics Data System (ADS)
Kadrmas, D. J.; Rust, T. C.
2005-10-01
Positron emission tomography (PET) can characterize different aspects of tumor physiology using various tracers. PET scans are usually performed using only one tracer since there is no explicit signal for distinguishing multiple tracers. We tested the feasibility of rapidly imaging multiple PET tracers using dynamic imaging techniques, where the signals from each tracer are separated based upon differences in tracer half-life, kinetics, and distribution. Time-activity curve populations for FDG, acetate, ATSM, and PTSM were simulated using appropriate compartment models, and noisy dual-tracer curves were computed by shifting and adding the single-tracer curves. Single-tracer components were then estimated from dual-tracer data using two methods: principal component analysis (PCA)-based fits of single-tracer components to multitracer data, and parallel multitracer compartment models estimating single-tracer rate parameters from multitracer time-activity curves. The PCA analysis found that there is information content present for separating multitracer data, and that tracer separability depends upon tracer kinetics, injection order and timing. Multitracer compartment modeling recovered rate parameters for individual tracers with good accuracy but somewhat higher statistical uncertainty than single-tracer results when the injection delay was >10 min. These approaches to processing rapid multitracer PET data may potentially provide a new tool for characterizing multiple aspects of tumor physiology in vivo.
Application of separable parameter space techniques to multi-tracer PET compartment modeling.
Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J
2016-02-07
Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.
Application of separable parameter space techniques to multi-tracer PET compartment modeling
NASA Astrophysics Data System (ADS)
Zhang, Jeff L.; Morey, A. Michael; Kadrmas, Dan J.
2016-02-01
Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.
Application of separable parameter space techniques to multi-tracer PET compartment modeling
Zhang, Jeff L; Morey, A Michael; Kadrmas, Dan J
2016-01-01
Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg–Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models. PMID:26788888
Rapid Multi-Tracer PET Tumor Imaging With F-FDG and Secondary Shorter-Lived Tracers.
Black, Noel F; McJames, Scott; Kadrmas, Dan J
2009-10-01
Rapid multi-tracer PET, where two to three PET tracers are rapidly scanned with staggered injections, can recover certain imaging measures for each tracer based on differences in tracer kinetics and decay. We previously showed that single-tracer imaging measures can be recovered to a certain extent from rapid dual-tracer (62)Cu - PTSM (blood flow) + (62)Cu - ATSM (hypoxia) tumor imaging. In this work, the feasibility of rapidly imaging (18)F-FDG plus one or two of these shorter-lived secondary tracers was evaluated in the same tumor model. Dynamic PET imaging was performed in four dogs with pre-existing tumors, and the raw scan data was combined to emulate 60 minute long dual- and triple-tracer scans, using the single-tracer scans as gold standards. The multi-tracer data were processed for static (SUV) and kinetic (K(1), K(net)) endpoints for each tracer, followed by linear regression analysis of multi-tracer versus single-tracer results. Static and quantitative dynamic imaging measures of FDG were both accurately recovered from the multi-tracer scans, closely matching the single-tracer FDG standards (R > 0.99). Quantitative blood flow information, as measured by PTSM K(1) and SUV, was also accurately recovered from the multi-tracer scans (R = 0.97). Recovery of ATSM kinetic parameters proved more difficult, though the ATSM SUV was reasonably well recovered (R = 0.92). We conclude that certain additional information from one to two shorter-lived PET tracers may be measured in a rapid multi-tracer scan alongside FDG without compromising the assessment of glucose metabolism. Such additional and complementary information has the potential to improve tumor characterization in vivo, warranting further investigation of rapid multi-tracer techniques.
Rapid Multi-Tracer PET Tumor Imaging With 18F-FDG and Secondary Shorter-Lived Tracers
Black, Noel F.; McJames, Scott; Kadrmas, Dan J.
2009-01-01
Rapid multi-tracer PET, where two to three PET tracers are rapidly scanned with staggered injections, can recover certain imaging measures for each tracer based on differences in tracer kinetics and decay. We previously showed that single-tracer imaging measures can be recovered to a certain extent from rapid dual-tracer 62Cu – PTSM (blood flow) + 62Cu — ATSM (hypoxia) tumor imaging. In this work, the feasibility of rapidly imaging 18F-FDG plus one or two of these shorter-lived secondary tracers was evaluated in the same tumor model. Dynamic PET imaging was performed in four dogs with pre-existing tumors, and the raw scan data was combined to emulate 60 minute long dual- and triple-tracer scans, using the single-tracer scans as gold standards. The multi-tracer data were processed for static (SUV) and kinetic (K1, Knet) endpoints for each tracer, followed by linear regression analysis of multi-tracer versus single-tracer results. Static and quantitative dynamic imaging measures of FDG were both accurately recovered from the multi-tracer scans, closely matching the single-tracer FDG standards (R > 0.99). Quantitative blood flow information, as measured by PTSM K1 and SUV, was also accurately recovered from the multi-tracer scans (R = 0.97). Recovery of ATSM kinetic parameters proved more difficult, though the ATSM SUV was reasonably well recovered (R = 0.92). We conclude that certain additional information from one to two shorter-lived PET tracers may be measured in a rapid multi-tracer scan alongside FDG without compromising the assessment of glucose metabolism. Such additional and complementary information has the potential to improve tumor characterization in vivo, warranting further investigation of rapid multi-tracer techniques. PMID:20046800
Fourier analysis of multitracer cosmological surveys
NASA Astrophysics Data System (ADS)
Abramo, L. Raul; Secco, Lucas F.; Loureiro, Arthur
2016-02-01
We present optimal quadratic estimators for the Fourier analysis of cosmological surveys that detect several different types of tracers of large-scale structure. Our estimators can be used to simultaneously fit the matter power spectrum and the biases of the tracers - as well as redshift-space distortions (RSDs), non-Gaussianities (NGs), or any other effects that are manifested through differences between the clusterings of distinct species of tracers. Our estimators reduce to the one by Feldman, Kaiser & Peacock (FKP) in the case of a survey consisting of a single species of tracer. We show that the multitracer estimators are unbiased, and that their covariance is given by the inverse of the multitracer Fisher matrix. When the biases, RSDs and NGs are fixed to their fiducial values, and one is only interested in measuring the underlying power spectrum, our estimators are projected into the estimator found by Percival, Verde & Peacock. We have tested our estimators on simple (lognormal) simulated galaxy maps, and we show that it performs as expected, being either equivalent or superior to the FKP method in all cases we analysed. Finally, we have shown how to extend the multitracer technique to include the one-halo term of the power spectrum.
Methodology for quantitative rapid multi-tracer PET tumor characterizations.
Kadrmas, Dan J; Hoffman, John M
2013-10-04
Positron emission tomography (PET) can image a wide variety of functional and physiological parameters in vivo using different radiotracers. As more is learned about the molecular basis for disease and treatment, the potential value of molecular imaging for characterizing and monitoring disease status has increased. Characterizing multiple aspects of tumor physiology by imaging multiple PET tracers in a single patient provides additional complementary information, and there is a significant body of literature supporting the potential value of multi-tracer PET imaging in oncology. However, imaging multiple PET tracers in a single patient presents a number of challenges. A number of techniques are under development for rapidly imaging multiple PET tracers in a single scan, where signal-recovery processing algorithms are employed to recover various imaging endpoints for each tracer. Dynamic imaging is generally used with tracer injections staggered in time, and kinetic constraints are utilized to estimate each tracers' contribution to the multi-tracer imaging signal. This article summarizes past and ongoing work in multi-tracer PET tumor imaging, and then organizes and describes the main algorithmic approaches for achieving multi-tracer PET signal-recovery. While significant advances have been made, the complexity of the approach necessitates protocol design, optimization, and testing for each particular tracer combination and application. Rapid multi-tracer PET techniques have great potential for both research and clinical cancer imaging applications, and continued research in this area is warranted.
Methodology for Quantitative Rapid Multi-Tracer PET Tumor Characterizations
Kadrmas, Dan J.; Hoffman, John M.
2013-01-01
Positron emission tomography (PET) can image a wide variety of functional and physiological parameters in vivo using different radiotracers. As more is learned about the molecular basis for disease and treatment, the potential value of molecular imaging for characterizing and monitoring disease status has increased. Characterizing multiple aspects of tumor physiology by imaging multiple PET tracers in a single patient provides additional complementary information, and there is a significant body of literature supporting the potential value of multi-tracer PET imaging in oncology. However, imaging multiple PET tracers in a single patient presents a number of challenges. A number of techniques are under development for rapidly imaging multiple PET tracers in a single scan, where signal-recovery processing algorithms are employed to recover various imaging endpoints for each tracer. Dynamic imaging is generally used with tracer injections staggered in time, and kinetic constraints are utilized to estimate each tracers' contribution to the multi-tracer imaging signal. This article summarizes past and ongoing work in multi-tracer PET tumor imaging, and then organizes and describes the main algorithmic approaches for achieving multi-tracer PET signal-recovery. While significant advances have been made, the complexity of the approach necessitates protocol design, optimization, and testing for each particular tracer combination and application. Rapid multi-tracer PET techniques have great potential for both research and clinical cancer imaging applications, and continued research in this area is warranted. PMID:24312149
Cheng, Xiaoyin; Li, Zhoulei; Liu, Zhen; Navab, Nassir; Huang, Sung-Cheng; Keller, Ulrich; Ziegler, Sibylle; Shi, Kuangyu
2015-02-12
The separation of multiple PET tracers within an overlapping scan based on intrinsic differences of tracer pharmacokinetics is challenging, due to limited signal-to-noise ratio (SNR) of PET measurements and high complexity of fitting models. In this study, we developed a direct parametric image reconstruction (DPIR) method for estimating kinetic parameters and recovering single tracer information from rapid multi-tracer PET measurements. This is achieved by integrating a multi-tracer model in a reduced parameter space (RPS) into dynamic image reconstruction. This new RPS model is reformulated from an existing multi-tracer model and contains fewer parameters for kinetic fitting. Ordered-subsets expectation-maximization (OSEM) was employed to approximate log-likelihood function with respect to kinetic parameters. To incorporate the multi-tracer model, an iterative weighted nonlinear least square (WNLS) method was employed. The proposed multi-tracer DPIR (MTDPIR) algorithm was evaluated on dual-tracer PET simulations ([18F]FDG and [11C]MET) as well as on preclinical PET measurements ([18F]FLT and [18F]FDG). The performance of the proposed algorithm was compared to the indirect parameter estimation method with the original dual-tracer model. The respective contributions of the RPS technique and the DPIR method to the performance of the new algorithm were analyzed in detail. For the preclinical evaluation, the tracer separation results were compared with single [18F]FDG scans of the same subjects measured 2 days before the dual-tracer scan. The results of the simulation and preclinical studies demonstrate that the proposed MT-DPIR method can improve the separation of multiple tracers for PET image quantification and kinetic parameter estimations.
Expansion of all multitrace tree level EYM amplitudes
NASA Astrophysics Data System (ADS)
Du, Yi-Jian; Feng, Bo; Teng, Fei
2017-12-01
In this paper, we investigate the expansion of tree level multitrace Einstein-Yang-Mills (EYM) amplitudes. First, we propose two types of recursive expansions of tree level EYM amplitudes with an arbitrary number of gluons, gravitons and traces by those amplitudes with fewer traces or/and gravitons. Then we give many support evidence, including proofs using the Cachazo-He-Yuan (CHY) formula and Britto-Cachazo-Feng-Witten (BCFW) recursive relation. As a byproduct, two types of generalized BCJ relations for multitrace EYM are further proposed, which will be useful in the BCFW proof. After one applies the recursive expansions repeatedly, any multitrace EYM amplitudes can be given in the Kleiss-Kuijf (KK) basis of tree level color ordered Yang-Mills (YM) amplitudes. Thus the Bern-Carrasco-Johansson (BCJ) numerators, as the expansion coefficients, for all multitrace EYM amplitudes are naturally constructed.
The excretion of biotrace elements using the multitracer technique in tumour-bearing mice.
Wang, X; Tian, J; Yin, X M; Zhang, X; Wang, Q Z
2000-12-01
A radioactive multitracer solution obtained from the nuclear reaction of selenium with 25 MeV/nucleon 40Ar ions was used for investigation of trace element excretion into the faeces and urine of cancerous mice. The excretion rates of 22 elements (Na, K, Rb, Mg, Ca, Sr, Ga, As, Sc, V, Cr, Mn, Co, Fe, Y, Zr, Mo, Nb, Tc, Ru, Ag and In) were simultaneously measured under strictly identical experimental conditions, in order to clarify the excretion behavior of these elements in cancerous mice. The faecal and urinary excretion rates of Mg, Sr, Ga, As, Sc, V, Cr, Mn, Co, Fe, Y, Zr, Nb, Ru and Mo in cancerous mice, showed the in highest value at 0-8 hours. The accumulative excretion of Ca, Mo, Y and Zr was decreased and Na, Fe, Mn and Co increased in tumour-bearing mice, when compared to normal mice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapuyade-Lahorgue, J; Ruan, S; Li, H
Purpose: Multi-tracer PET imaging is getting more attention in radiotherapy by providing additional tumor volume information such as glucose and oxygenation. However, automatic PET-based tumor segmentation is still a very challenging problem. We propose a statistical fusion approach to joint segment the sub-area of tumors from the two tracers FDG and FMISO PET images. Methods: Non-standardized Gamma distributions are convenient to model intensity distributions in PET. As a serious correlation exists in multi-tracer PET images, we proposed a new fusion method based on copula which is capable to represent dependency between different tracers. The Hidden Markov Field (HMF) model ismore » used to represent spatial relationship between PET image voxels and statistical dynamics of intensities for each modality. Real PET images of five patients with FDG and FMISO are used to evaluate quantitatively and qualitatively our method. A comparison between individual and multi-tracer segmentations was conducted to show advantages of the proposed fusion method. Results: The segmentation results show that fusion with Gaussian copula can receive high Dice coefficient of 0.84 compared to that of 0.54 and 0.3 of monomodal segmentation results based on individual segmentation of FDG and FMISO PET images. In addition, high correlation coefficients (0.75 to 0.91) for the Gaussian copula for all five testing patients indicates the dependency between tumor regions in the multi-tracer PET images. Conclusion: This study shows that using multi-tracer PET imaging can efficiently improve the segmentation of tumor region where hypoxia and glucidic consumption are present at the same time. Introduction of copulas for modeling the dependency between two tracers can simultaneously take into account information from both tracers and deal with two pathological phenomena. Future work will be to consider other families of copula such as spherical and archimedian copulas, and to eliminate partial volume effect by considering dependency between neighboring voxels.« less
NASA Astrophysics Data System (ADS)
Ogée, Jerome; Wehr, Richard; Commane, Roisin; Launois, Thomas; Meredith, Laura; Munger, Bill; Nelson, David; Saleska, Scott; Zahniser, Mark; Wofsy, Steve; Wingate, Lisa
2016-04-01
The net flux of carbon dioxide between the land surface and the atmosphere is dominated by photosynthesis and soil respiration, two of the largest gross CO2 fluxes in the carbon cycle. More robust estimates of these gross fluxes could be obtained from the atmospheric budgets of other valuable tracers, such as carbonyl sulfide (COS) or the carbon and oxygen isotope compositions (δ13C and δ18O) of atmospheric CO2. Over the past decades, the global atmospheric flask network has measured the inter-annual and intra-annual variations in the concentrations of these tracers. However, knowledge gaps and a lack of high-resolution multi-tracer ecosystem-scale measurements have hindered the development of process-based models that can simulate the behaviour of each tracer in response to environmental drivers. We present novel datasets of net ecosystem COS, 13CO2 and CO18O exchange and vertical profile data collected over 3 consecutive growing seasons (2011-2013) at the Harvard forest flux site. We then used the process-based model MuSICA (multi-layer Simulator of the Interactions between vegetation Canopy and the Atmosphere) to include the transport, reaction, diffusion and production of each tracer within the forest and exchanged with the atmosphere. Model simulations over the three years captured well the impact of diurnally and seasonally varying environmental conditions on the net ecosystem exchange of each tracer. The model also captured well the dynamic vertical features of tracer behaviour within the canopy. This unique dataset and model sensitivity analysis highlights the benefit in the collection of multi-tracer high-resolution field datasets and the developement of multi-tracer land surface models to provide valuable constraints on photosynthesis and respiration across scales in the near future.
NASA Astrophysics Data System (ADS)
Jerez-Hanckes, Carlos; Pérez-Arancibia, Carlos; Turc, Catalin
2017-12-01
We present Nyström discretizations of multitrace/singletrace formulations and non-overlapping Domain Decomposition Methods (DDM) for the solution of Helmholtz transmission problems for bounded composite scatterers with piecewise constant material properties. We investigate the performance of DDM with both classical Robin and optimized transmission boundary conditions. The optimized transmission boundary conditions incorporate square root Fourier multiplier approximations of Dirichlet to Neumann operators. While the multitrace/singletrace formulations as well as the DDM that use classical Robin transmission conditions are not particularly well suited for Krylov subspace iterative solutions of high-contrast high-frequency Helmholtz transmission problems, we provide ample numerical evidence that DDM with optimized transmission conditions constitute efficient computational alternatives for these type of applications. In the case of large numbers of subdomains with different material properties, we show that the associated DDM linear system can be efficiently solved via hierarchical Schur complements elimination.
Fusion of multi-tracer PET images for dose painting.
Lelandais, Benoît; Ruan, Su; Denœux, Thierry; Vera, Pierre; Gardin, Isabelle
2014-10-01
PET imaging with FluoroDesoxyGlucose (FDG) tracer is clinically used for the definition of Biological Target Volumes (BTVs) for radiotherapy. Recently, new tracers, such as FLuoroThymidine (FLT) or FluoroMisonidazol (FMiso), have been proposed. They provide complementary information for the definition of BTVs. Our work is to fuse multi-tracer PET images to obtain a good BTV definition and to help the radiation oncologist in dose painting. Due to the noise and the partial volume effect leading, respectively, to the presence of uncertainty and imprecision in PET images, the segmentation and the fusion of PET images is difficult. In this paper, a framework based on Belief Function Theory (BFT) is proposed for the segmentation of BTV from multi-tracer PET images. The first step is based on an extension of the Evidential C-Means (ECM) algorithm, taking advantage of neighboring voxels for dealing with uncertainty and imprecision in each mono-tracer PET image. Then, imprecision and uncertainty are, respectively, reduced using prior knowledge related to defects in the acquisition system and neighborhood information. Finally, a multi-tracer PET image fusion is performed. The results are represented by a set of parametric maps that provide important information for dose painting. The performances are evaluated on PET phantoms and patient data with lung cancer. Quantitative results show good performance of our method compared with other methods. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sirocko, Frank; Garbe-Schönberg, Dieter; Devey, Colin
2000-11-01
Thirty seven deep-sea sediment cores from the Arabian Sea were studied geochemically (49 major and trace elements) for four time slices during the Holocene and the last glacial, and in one high sedimentation rate core (century scale resolution) to detect tracers of past variations in the intensity of the atmospheric monsoon circulation and its hydrographic expression in the ocean surface. This geochemical multi-tracer approach, coupled with additional information on the grain size composition of the clastic fraction, the bulk carbonate and biogenic opal contents makes it possible to characterize the sedimentological regime in detail. Sediments characterized by a specific elemental composition (enrichment) originated from the following sources: river suspensions from the Tapti and Narbada, draining the Indian Deccan traps (Ti, Sr); Indus sediments and dust from Rajasthan and Pakistan (Rb, Cs); dust from Iran and the Persian Gulf (Al, Cr); dust from central Arabia (Mg); dust from East Africa and the Red Sea (Zr/Hf, Ti/Al). C org, Cd, Zn, Ba, Pb, U, and the HREE are associated with the intensity of upwelling in the western Arabian Sea, but only those patterns that are consistently reproduced by all of these elements can be directly linked with the intensity of the southwest monsoon. Relying on information from a single element can be misleading, as each element is affected by various other processes than upwelling intensity and nutrient content of surface water alone. The application of the geochemical multi-tracer approach indicates that the intensity of the southwest monsoon was low during the LGM, declined to a minimum from 15,000-13,000 14C year BP, intensified slightly at the end of this interval, was almost stable during the Bölling, Alleröd and the Younger Dryas, but then intensified in two abrupt successions at the end of the Younger Dryas (9900 14C year BP) and especially in a second event during the early Holocene (8800 14C year BP). Dust discharge by northwesterly winds from Arabia exhibited a similar evolution, but followed an opposite course: high during the LGM with two primary sources—the central Arabian desert and the dry Persian Gulf region. Dust discharge from both regions reached a pronounced maximum at 15,000-13,000 14C year. At the end of this interval, however, the dust plumes from the Persian Gulf area ceased dramatically, whereas dust discharge from central Arabia decreased only slightly. Dust discharge from East Africa and the Red Sea increased synchronously with the two major events of southwest monsoon intensification as recorded in the nutrient content of surface waters. In addition to the tracers of past dust flux and surface water nutrient content, the geochemical multi-tracer approach provides information on the history of deep sea ventilation (Mo, S), which was much lower during the last glacial maximum than during the Holocene. The multi-tracer approach—i.e. a few sedimentological parameters plus a set of geochemical tracers widely available from various multi-element analysis techniques—is a highly applicable technique for studying the complex sedimentation patterns of an ocean basin, and, specifically in the case of the Arabian Sea, can even reveal the seasonal structure of climate change.
Re, V; Sacchi, E; Mas-Pla, J; Menció, A; El Amrani, N
2014-12-01
Groundwater pollution from anthropogenic sources is a serious concern affecting several coastal aquifers worldwide. Increasing groundwater exploitation, coupled with point and non-point pollution sources, are the main anthropogenic impacts on coastal environments and are responsible for severe health and food security issues. Adequate management strategies to protect groundwater from contamination and overexploitation are of paramount importance, especially in arid prone regions, where coastal aquifers often represent the main freshwater resource to sustain human needs. The Bou-Areg Aquifer (Morocco) is a perfect example of a coastal aquifer constantly exposed to all the negative externalities associated with groundwater use for agricultural purposes, which lead to a general increase in aquifer salinization. In this study data on 61 water samples, collected in June and November 2010, were used to: (i) track groundwater composition changes related to the use of irrigation water from different sources, (ii) highlight seasonal variations to assess aquifer vulnerability, and (iii) present a reproducible example of multi-tracer approach for groundwater management in rural coastal areas. Hydrogeochemical results show that Bou-Areg groundwater is characterized by - high salinity, associated with a remarkable increase in bicarbonate content in the crop growing season, due to more intense biological activity in irrigated soils. The coupled multi-tracer and statistical analysis confirms the strong dependency on irrigation activities as well as a clear identification of the processes governing the aquifer's hydrochemistry in the different seasons. Water Rock Interaction (WRI) dominates the composition of most of groundwater samples in the Low Irrigation season (L-IR) and Agricultural Return Flow (ARF) mainly affects groundwater salinization in the High Irrigation season (H-IR) in the same areas naturally affected by WRI. In the central part of the plain River Recharge (RR) from the Selouane River is responsible for the high groundwater salinity whilst Mixing Processes (MIX) occur in absence of irrigation activities. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Pratama Wahyu Hidayat, Putra; Hary Murti, Antonius; Sudarmaji; Shirly, Agung; Tiofan, Bani; Damayanti, Shinta
2018-03-01
Geometry is an important parameter for the field of hydrocarbon exploration and exploitation, it has significant effect to the amount of resources or reserves, rock spreading, and risk analysis. The existence of geological structure or fault becomes one factor affecting geometry. This study is conducted as an effort to enhance seismic image quality in faults dominated area namely offshore Madura Strait. For the past 10 years, Oligo-Miocene carbonate rock has been slightly explored on Madura Strait area, the main reason because migration and trap geometry still became risks to be concern. This study tries to determine the boundary of each fault zone as subsurface image generated by converting seismic data into variance attribute. Variance attribute is a multitrace seismic attribute as the derivative result from amplitude seismic data. The result of this study shows variance section of Madura Strait area having zero (0) value for seismic continuity and one (1) value for discontinuity of seismic data. Variance section shows the boundary of RMKS fault zone with Kendeng zone distinctly. Geological structure and subsurface geometry for Oligo-Miocene carbonate rock could be identified perfectly using this method. Generally structure interpretation to identify the boundary of fault zones could be good determined by variance attribute.
Asante, Kwadwo Ansong; Agusa, Tetsuro; Biney, Charles Augustus; Agyekum, William Atuobi; Bello, Mohammed; Otsuka, Masanari; Itai, Takaaki; Takahashi, Shin; Tanabe, Shinsuke
2012-05-01
To understand human contamination by multi-trace elements (TEs) in electrical and electronic waste (e-waste) recycling site at Agbogbloshie, Accra in Ghana, this study analyzed TEs and As speciation in urine of e-waste recycling workers. Concentrations of Fe, Sb, and Pb in urine of e-waste recycling workers were significantly higher than those of reference sites after consideration of interaction by age, indicating that the recycling workers are exposed to these TEs through the recycling activity. Urinary As concentration was relatively high, although the level in drinking water was quite low. Speciation analysis of As in human urine revealed that arsenobetaine and dimethylarsinic acid were the predominant As species and concentrations of both species were positively correlated with total As concentration as well as between each other. These results suggest that such compounds may be derived from the same source, probably fish and shellfish and greatly influence As exposure levels. To our knowledge, this is the first study on human contamination resulting from the primitive recycling of e-waste in Ghana. This study will contribute to the knowledge about human exposure to trace elements from an e-waste site in a less industrialized region so far scantly covered in the literature. Copyright © 2012 Elsevier B.V. All rights reserved.
Obara, S; Nagai, T
1983-01-01
The instantaneous frequency display of single unit discharges provides a useful measure of neuronal activities. Such a device must produce voltage outputs proportional to the reciprocal of each inter-spike interval by on-line computation of the hyperbola of V = a/t. Segment approximation of the required hyperbola can be made by a series of exponential functions which increase in time constants by a factor of m. Numerical analysis of a normalized function indicates possible error maxima of 3.4, 2.4 and 1.1% for m of 2, 1.8 and 1.5, respectively. This prediction is fully confirmed by the actual performance where m of 1.5 is adopted. The test circuit combines only readily available ICs and other components, to give a linear F-V conversion over a dynamic range of 4-600 Hz with error maxima of approximately 1%. The outputs are square pulses of approximately 1.5 ms in duration through the use of a flexible sample-hold circuit. Compared with that of earlier models, this display mode gives better photographic records with the base-line in simultaneous multi-trace display. Simple and systematic methods are described for designing a circuit to one's own specifications, and also for compensating for component variations.
A multitracer system for multizone ventilation measurement
NASA Astrophysics Data System (ADS)
Sherman, Max
1990-09-01
Mass transfer due to pressure-driven air flow is one of the most important processes for determining both environmental quality and energy requirements in buildings. Heat, moisture, and contaminants are all transported by air movement between indoors and outdoors as well as between different zones within a building. Measurement of these air flows is critical to understanding the performance of buildings. Virtually all measurements of ventilation are made using the dilution of a tracer gas. The vast majority of such measurements have been made in a single zone, using a single tracer gas. For the past several years LBL has been developing the MultiTracer Measurement System (MTMS) to provide full multizone air flow information in an accurate, real-time manner. MTMS is based on a quadrupole mass spectrometer to provide high-speed concentration analysis of multiple tracer gases in the (low) ppm level that are injected into multiple zones using mass-flow controllers. The measurement and injection system is controlled by a PC and can measure all concentrations in all zones (and adjust the injected tracer flows) within 2 min and can operate unattended for weeks. The resulting injection rate and concentration data can be analyzed to infer the bulk air movement between zones. The system also measures related quantities such as weather and zonal temperature to assist in the data interpretation. Using MTMS, field measurements have been made for the past two years.
Combining multitracing and 2D-modelling to identify the dynamic of heavy metals during flooding.
NASA Astrophysics Data System (ADS)
Hissler, C.; Hostache, R.; Matgen, P.; Tosheva, Z.; David, E.; Bates, P.; Stille, P.
2012-04-01
Recent years have seen a growing awareness of the wider environmental significance of the sediment loads transported by rivers and streams. This includes the importance of suspended sediment in transporting heavy metals and the potential for these trace elements to be desorbed from the particles to the solution. That threaten the water quality and can cause severe impacts in downstream areas like wetlands and floodplains. Contemporary data on the sediment loads of rivers provide clear evidence of significant recent changes in the sediment fluxes and of several rivers in response to human activities. For instance, Trace elements (including heavy metals) that are currently considered to be undisturbed by human activities and used as tracers of continental crust derived material have become more and more involved in industrial processes. Mathematical models validated by in situ experimentations are the only available tool to predict the consequences of natural as well as man-induced environmental changes and impacts on sediment dynamics. They are approximate representations of complex natural systems and the evaluation of a model with respect to its ability to reproduce multiple criteria and behaviour of a real system is still problematic. Interactions between modellers and experimentalists improve significantly the interpretation of the modelling output and led to formulate more realistic assumptions on the behaviour of natural systems. The geochemical information, which appeared to be non-correlated with the hydrological standard parameters, provides new information and contributes to give an "orthogonal view" on the hydrologic system behaviour. Regarding the recent development in geochemical tracer applications in models, the multi-tracer approach (natural vs anthropogenic; elemental concentration-isotopic signature-radionuclide activity) may be a necessity to decrease significantly the uncertainties in sediment transport modelling. The objective of this study is to assess the risk of floodplain contamination in heavy metal due to river sediment deposition and to heavy metal partitioning between particulate and dissolved phases. We focus on a multidisciplinary approach combining environmental geochemistry (multitracing) and hydraulic modelling (using TELEMAC-2D). One important single flood event was selected to illustrate this innovative approach. During the entire flood, the river water was sampled every hour in order to collect the particulate and the dissolved fractions. All the tracers were analyzed in both fractions. An important set of hydrological and sedimentological data are used to reach a more efficient calibration of the TELEMAC modelling system. In addition to standard techniques of hydrochemistry, new approaches of in situ suspended sediment transport monitoring will help getting new insights on the hydraulic system behaviour.
Preparing CAM-SE for Multi-Tracer Applications: CAM-SE-Cslam
NASA Astrophysics Data System (ADS)
Lauritzen, P. H.; Taylor, M.; Goldhaber, S.
2014-12-01
The NCAR-DOE spectral element (SE) dynamical core comes from the HOMME (High-Order Modeling Environment; Dennis et al., 2012) and it is available in CAM. The CAM-SE dynamical core is designed with intrinsic mimetic properties guaranteeing total energy conservation (to time-truncation errors) and mass-conservation, and has demonstrated excellent scalability on massively parallel compute platforms (Taylor, 2011). For applications involving many tracers such as chemistry and biochemistry modeling, CAM-SE has been found to be significantly more computationally costly than the current "workhorse" model CAM-FV (Finite-Volume; Lin 2004). Hence a multi-tracer efficient scheme, called the CSLAM (Conservative Semi-Lagrangian Multi-tracer; Lauritzen et al., 2011) scheme, has been implemented in the HOMME (Erath et al., 2012). The CSLAM scheme has recently been cast in flux-form in HOMME so that it can be coupled to the SE dynamical core through conventional flux-coupling methods where the SE dynamical core provides background air mass fluxes to CSLAM. Since the CSLAM scheme makes use of a finite-volume gnomonic cubed-sphere grid and hence does not operate on the SE quadrature grid, the capability of running tracer advection, the physical parameterization suite and dynamics on separate grids has been implemented in CAM-SE. The default CAM-SE-CSLAM setup is to run physics on the quasi-equal area CSLAM grid. The capability of running physics on a different grid than the SE dynamical core may provide a more consistent coupling since the physics grid option operates with quasi-equal-area cell average values rather than non-equi-distant grid-point (SE quadrature point) values. Preliminary results on the performance of CAM-SE-CSLAM will be presented.
NASA Astrophysics Data System (ADS)
Schilling, Oliver S.; Gerber, Christoph; Partington, Daniel J.; Purtschert, Roland; Brennwald, Matthias S.; Kipfer, Rolf; Hunkeler, Daniel; Brunner, Philip
2017-12-01
To provide a sound understanding of the sources, pathways, and residence times of groundwater water in alluvial river-aquifer systems, a combined multitracer and modeling experiment was carried out in an important alluvial drinking water wellfield in Switzerland. 222Rn, 3H/3He, atmospheric noble gases, and the novel 37Ar-method were used to quantify residence times and mixing ratios of water from different sources. With a half-life of 35.1 days, 37Ar allowed to successfully close a critical observational time gap between 222Rn and 3H/3He for residence times of weeks to months. Covering the entire range of residence times of groundwater in alluvial systems revealed that, to quantify the fractions of water from different sources in such systems, atmospheric noble gases and helium isotopes are tracers suited for end-member mixing analysis. A comparison between the tracer-based mixing ratios and mixing ratios simulated with a fully-integrated, physically-based flow model showed that models, which are only calibrated against hydraulic heads, cannot reliably reproduce mixing ratios or residence times of alluvial river-aquifer systems. However, the tracer-based mixing ratios allowed the identification of an appropriate flow model parametrization. Consequently, for alluvial systems, we recommend the combination of multitracer studies that cover all relevant residence times with fully-coupled, physically-based flow modeling to better characterize the complex interactions of river-aquifer systems.
Trophic niche partitioning of littoral fish species from the rocky intertidal of Helgoland, Germany
NASA Astrophysics Data System (ADS)
Hielscher, N. N.; Malzahn, A. M.; Diekmann, R.; Aberle, N.
2015-12-01
During a 3-year field study, interspecific and interannual differences in the trophic ecology of littoral fish species were investigated in the rocky intertidal of Helgoland island (North Sea). We investigated trophic niche partitioning of common coexisting littoral fish species based on a multi-tracer approach using stable isotope and fatty acids in order to show differences and similarities in resource use and feeding modes. The results of the dual-tracer approach showed clear trophic niche partitioning of the five target fish species, the goldsinny wrasse Ctenolabrus rupestris, the sand goby Pomatoschistus minutus, the painted goby Pomatoschistus pictus, the short-spined sea scorpion Myoxocephalus scorpius and the long-spined sea scorpion Taurulus bubalis. Both stable isotopes and fatty acids showed distinct differences in the trophic ecology of the studied fish species. However, the combined use of the two techniques added an additional resolution on the interannual scale. The sand goby P. minutus showed the largest trophic plasticity with a pronounced variability between years. The present data analysis provides valuable information on trophic niche partitioning of fish species in the littoral zones of Helgoland and on complex benthic food webs in general.
Tracing sediment movement on semi-arid watershed using Rare Earth Elements 1988
USDA-ARS?s Scientific Manuscript database
A multi-tracer method employing rare earth elements (REE) was used to determine sediment yield and to track sediment movement in a small semiarid watershed. A 0.33 ha watershed near Tombstone, AZ was divided into five morphological units, each tagged with one of five REE oxides. Relative contributi...
NASA Astrophysics Data System (ADS)
Cullin, J. A.; Ward, A. S.; Cwiertny, D. M.; Barber, L. B.; Kolpin, D. W.; Bradley, P. M.; Keefe, S. H.; Hubbard, L. E.
2013-12-01
Contaminants of emerging concern (CECs) are an unregulated suite of constituents possessing the potential to cause a host of reproductive and developmental problems in humans and wildlife. CECs are frequently detected in environmental waters. Degradation pathways of several CECs are well-characterized in idealized laboratory settings, but CEC fate and transport in complex field settings is poorly understood. In the present study we used a multi-tracer solute injection study to quantify physical transport, photodegradation, and sorption in a wastewater effluent-impacted stream. Conservative tracers were used to quantify physical transport processes in the stream. Use of reactive fluorescent tracers allows for isolation of the relative contribution of photodegradation and sorption within the system. Field data was used to calibrate a one-dimensional transport model allowing us to use forward modeling to predict the transport of sulfamethoxazole, an antibiotic documented to be present in the wastewater effluent and in Fourmile Creek which is susceptible to both sorption and photolysis. Forward modeling will predict both temporal persistence and spatial extent of sulfamethoxazole in Fourmile Creek
Optimization of spectroscopic surveys for testing non-Gaussianity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raccanelli, Alvise; Doré, Olivier; Dalal, Neal, E-mail: alvise@caltech.edu, E-mail: Olivier.P.Dore@jpl.nasa.gov, E-mail: dalaln@illinois.edu
We investigate optimization strategies to measure primordial non-Gaussianity with future spectroscopic surveys. We forecast measurements coming from the 3D galaxy power spectrum and compute constraints on primordial non-Gaussianity parameters f{sub NL} and n{sub NG}. After studying the dependence on those parameters upon survey specifications such as redshift range, area, number density, we assume a reference mock survey and investigate the trade-off between number density and area surveyed. We then define the observational requirements to reach the detection of f{sub NL} of order 1. Our results show that power spectrum constraints on non-Gaussianity from future spectroscopic surveys can improve on currentmore » CMB limits, but the multi-tracer technique and higher order correlations will be needed in order to reach an even better precision in the measurements of the non-Gaussianity parameter f{sub NL}.« less
Single-scan dual-tracer FLT+FDG PET tumor characterization.
Kadrmas, Dan J; Rust, Thomas C; Hoffman, John M
2013-02-07
Rapid multi-tracer PET aims to image two or more tracers in a single scan, simultaneously characterizing multiple aspects of physiology and function without the need for repeat imaging visits. Using dynamic imaging with staggered injections, constraints on the kinetic behavior of each tracer are applied to recover individual-tracer measures from the multi-tracer PET signal. The ability to rapidly and reliably image both (18)F-fluorodeoxyglucose (FDG) and (18)F-fluorothymidine (FLT) would provide complementary measures of tumor metabolism and proliferative activity, with important applications in guiding oncologic treatment decisions and assessing response. However, this tracer combination presents one of the most challenging dual-tracer signal-separation problems--both tracers have the same radioactive half-life, and the injection delay is short relative to the half-life and tracer kinetics. This work investigates techniques for single-scan dual-tracer FLT+FDG PET tumor imaging, characterizing the performance of recovering static and dynamic imaging measures for each tracer from dual-tracer datasets. Simulation studies were performed to characterize dual-tracer signal-separation performance for imaging protocols with both injection orders and injection delays of 10-60 min. Better performance was observed when FLT was administered first, and longer delays before administration of FDG provided more robust signal-separation and recovery of the single-tracer imaging measures. An injection delay of 30 min led to good recovery (R > 0.96) of static image values (e.g. SUV), K(net), and K(1) as compared to values from separate, single-tracer time-activity curves. Recovery of higher order rate parameters (k(2), k(3)) was less robust, indicating that information regarding these parameters was harder to recover in the presence of statistical noise and dual-tracer effects. Performance of the dual-tracer FLT(0 min)+FDG(32 min) technique was further evaluated using PET/CT imaging studies in five patients with primary brain tumors where the data from separate scans of each tracer were combined to synthesize dual-tracer scans with known single-tracer components; results demonstrated similar dual-tracer signal recovery performance. We conclude that rapid dual-tracer FLT+FDG tumor imaging is feasible and can provide quantitative tumor imaging measures comparable to those from conventional separate-scan imaging.
Single-scan dual-tracer FLT+FDG PET tumor characterization
NASA Astrophysics Data System (ADS)
Kadrmas, Dan J.; Rust, Thomas C.; Hoffman, John M.
2013-02-01
Rapid multi-tracer PET aims to image two or more tracers in a single scan, simultaneously characterizing multiple aspects of physiology and function without the need for repeat imaging visits. Using dynamic imaging with staggered injections, constraints on the kinetic behavior of each tracer are applied to recover individual-tracer measures from the multi-tracer PET signal. The ability to rapidly and reliably image both 18F-fluorodeoxyglucose (FDG) and 18F-fluorothymidine (FLT) would provide complementary measures of tumor metabolism and proliferative activity, with important applications in guiding oncologic treatment decisions and assessing response. However, this tracer combination presents one of the most challenging dual-tracer signal-separation problems—both tracers have the same radioactive half-life, and the injection delay is short relative to the half-life and tracer kinetics. This work investigates techniques for single-scan dual-tracer FLT+FDG PET tumor imaging, characterizing the performance of recovering static and dynamic imaging measures for each tracer from dual-tracer datasets. Simulation studies were performed to characterize dual-tracer signal-separation performance for imaging protocols with both injection orders and injection delays of 10-60 min. Better performance was observed when FLT was administered first, and longer delays before administration of FDG provided more robust signal-separation and recovery of the single-tracer imaging measures. An injection delay of 30 min led to good recovery (R > 0.96) of static image values (e.g. SUV), Knet, and K1 as compared to values from separate, single-tracer time-activity curves. Recovery of higher order rate parameters (k2, k3) was less robust, indicating that information regarding these parameters was harder to recover in the presence of statistical noise and dual-tracer effects. Performance of the dual-tracer FLT(0 min)+FDG(32 min) technique was further evaluated using PET/CT imaging studies in five patients with primary brain tumors where the data from separate scans of each tracer were combined to synthesize dual-tracer scans with known single-tracer components; results demonstrated similar dual-tracer signal recovery performance. We conclude that rapid dual-tracer FLT+FDG tumor imaging is feasible and can provide quantitative tumor imaging measures comparable to those from conventional separate-scan imaging.
Single-scan dual-tracer FLT+FDG PET tumor characterization
Kadrmas, Dan J; Rust, Thomas C; Hoffman, John M
2013-01-01
Rapid multi-tracer PET aims to image two or more tracers in a single scan, simultaneously characterizing multiple aspects of physiology and function without the need for repeat imaging visits. Using dynamic imaging with staggered injections, constraints on the kinetic behavior of each tracer are applied to recover individual-tracer measures from the multi-tracer PET signal. The ability to rapidly and reliably image both 18F-fluorodeoxyglucose (FDG) and 18F-fluorothymidine (FLT) would provide complementary measures of tumor metabolism and proliferative activity, with important applications in guiding oncologic treatment decisions and assessing response. However, this tracer combination presents one of the most challenging dual-tracer signal-separation problems—both tracers have the same radioactive half-life, and the injection delay is short relative to the half-life and tracer kinetics. This work investigates techniques for single-scan dual-tracer FLT+FDG PET tumor imaging, characterizing the performance of recovering static and dynamic imaging measures for each tracer from dual-tracer datasets. Simulation studies were performed to characterize dual-tracer signal-separation performance for imaging protocols with both injection orders and injection delays of 10–60 min. Better performance was observed when FLT was administered first, and longer delays before administration of FDG provided more robust signal-separation and recovery of the single-tracer imaging measures. An injection delay of 30 min led to good recovery (R > 0.96) of static image values (e.g. SUV), Knet, and K1 as compared to values from separate, single-tracer time-activity curves. Recovery of higher order rate parameters (k2, k3) was less robust, indicating that information regarding these parameters was harder to recover in the presence of statistical noise and dual-tracer effects. Performance of the dual-tracer FLT(0 min)+FDG(32 min) technique was further evaluated using PET/CT imaging studies in five patients with primary brain tumors where the data from separate scans of each tracer were combined to synthesize dual-tracer scans with known single-tracer components; results demonstrated similar dual-tracer signal recovery performance. We conclude that rapid dual-tracer FLT+FDG tumor imaging is feasible and can provide quantitative tumor imaging measures comparable to those from conventional separate-scan imaging. PMID:23296314
Arnon, Shai; Ronen, Zeev; Adar, Eilon; Yakirevich, Alexander; Nativ, Ronit
2005-10-01
The two-dimensional distribution of flow patterns and their dynamic change due to microbial activity were investigated in naturally fractured chalk cores. Long-term biodegradation experiments were conducted in two cores ( approximately 20 cm diameter, 31 and 44 cm long), intersected by a natural fracture. 2,4,6-tribromophenol (TBP) was used as a model contaminant and as the sole carbon source for aerobic microbial activity. The transmissivity of the fractures was continuously reduced due to biomass accumulation in the fracture concurrent with TBP biodegradation. From multi-tracer experiments conducted prior to and following the microbial activity, it was found that biomass accumulation causes redistribution of the preferential flow channels. Zones of slow flow near the fracture inlet were clogged, thus further diverting the flow through zones of fast flow, which were also partially clogged. Quantitative evaluation of biodegradation and bacterial counts supported the results of the multi-tracer tests, indicating that most of the bacterial activity occurs close to the inlet. The changing flow patterns, which control the nutrient supply, resulted in variations in the concentrations of the chemical constituents (TBP, bromide and oxygen), used as indicators of biodegradation.
Thomas D. Bullen; Scott W. Bailey
2005-01-01
Depletion of calcium from forest soils has important implications for forest productivity and health. Ca is available to fine feeder roots from a number of soil organic and mineral sources. but identifying the primary source or changes of sources in response to environmental change is problematic. We used strontium isotope and alkaline earth element concentration...
NASA Astrophysics Data System (ADS)
Xu, Wei; Su, Xiaosi; Dai, Zhenxue; Yang, Fengtian; Zhu, Pucheng; Huang, Yong
2017-11-01
Environmental tracers (such as major ions, stable and radiogenic isotopes, and heat) monitored in natural waters provide valuable information for understanding the processes of river-groundwater interactions in arid areas. An integrated framework is presented for interpreting multi-tracer data (major ions, stable isotopes (2H, 18O), the radioactive isotope 222Rn, and heat) for delineating the river-groundwater interactions in Nalenggele River basin, northwest China. Qualitative and quantitative analyses were undertaken to estimate the bidirectional water exchange associated with small-scale interactions between groundwater and surface water. Along the river stretch, groundwater and river water exchange readily. From the high mountain zone to the alluvial fan, groundwater discharge to the river is detected by tracer methods and end-member mixing models, but the river has also been identified as a losing river using discharge measurements, i.e. discharge is bidirectional. On the delta-front of the alluvial fan and in the alluvial plain, in the downstream area, the characteristics of total dissolved solids values, 222Rn concentrations and δ18O values in the surface water, and patterns derived from a heat-tracing method, indicate that groundwater discharges into the river. With the environmental tracers, the processes of river-groundwater interaction have been identified in detail for better understanding of overall hydrogeological processes and of the impacts on water allocation policies.
Biasing and the search for primordial non-Gaussianity beyond the local type
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleyzes, Jérôme; De Putter, Roland; Doré, Olivier
Primordial non-Gaussianity encodes valuable information about the physics of inflation, including the spectrum of particles and interactions. Significant improvements in our understanding of non-Gaussanity beyond Planck require information from large-scale structure. The most promising approach to utilize this information comes from the scale-dependent bias of halos. For local non-Gaussanity, the improvements available are well studied but the potential for non-Gaussianity beyond the local type, including equilateral and quasi-single field inflation, is much less well understood. In this paper, we forecast the capabilities of large-scale structure surveys to detect general non-Gaussianity through galaxy/halo power spectra. We study how non-Gaussanity can bemore » distinguished from a general biasing model and where the information is encoded. For quasi-single field inflation, significant improvements over Planck are possible in some regions of parameter space. We also show that the multi-tracer technique can significantly improve the sensitivity for all non-Gaussianity types, providing up to an order of magnitude improvement for equilateral non-Gaussianity over the single-tracer measurement.« less
Palm, Eric; Dotson, Bryan
2015-11-01
Drug shortages in the United States, including parenteral nutrition (PN) components, have been common in recent years and can adversely affect patient care. Here we report a case of copper and zinc deficiency in a patient receiving PN during a shortage of parenteral trace element products. The management of the patient's deficiencies, including the use of an imported parenteral multi-trace element product, is described. © 2014 American Society for Parenteral and Enteral Nutrition.
Birkigt, Jan; Stumpp, Christine; Małoszewski, Piotr; Nijenhuis, Ivonne
2018-04-15
In recent years, constructed wetland systems have become into focus as means of cost-efficient organic contaminant management. Wetland systems provide a highly reactive environment in which several removal pathways of organic chemicals may be present at the same time; however, specific elimination processes and hydraulic conditions are usually separately investigated and thus not fully understood. The flow system in a three dimensional pilot-scale horizontal subsurface constructed wetland was investigated applying a multi-tracer test combined with a mathematical model to evaluate the flow and transport processes. The results indicate the existence of a multiple flow system with two distinct flow paths through the gravel bed and a preferential flow at the bottom transporting 68% of tracer mass resulting from the inflow design of the model wetland system. There the removal of main contaminant chlorobenzene was up to 52% based on different calculation approaches. Determined retention times in the range of 22d to 32.5d the wetland has a heterogeneous flow pattern. Differences between simulated and measured tracer concentrations in the upper sediment indicate diffusion dominated processes due to stagnant water zones. The tracer study combining experimental evaluation with mathematical modeling demonstrated the complexity of flow and transport processes in the constructed wetlands which need to be taken into account during interpretation of the determining attenuation processes. Copyright © 2017 Elsevier B.V. All rights reserved.
Osland, Emma J; Ali, Azmat; Isenring, Elizabeth; Ball, Patrick; Davis, Melvyn; Gillanders, Lyn
2014-01-01
This work represents the first part of a progressive review of AuSPEN's 1999 Guidelines for Provision of Micronutrient Supplementation in Adult Patients receiving Parenteral Nutrition, in recognition of the developments in the literature on this topic since that time. A systematic literature review was undertaken and recommendations were made based on the available evidence and with consideration to specific elements of the Australian and New Zealand practice environment. The strength of evidence underpinning each recommendation was assessed. External reviewers provided feedback on the guidelines using the AGREE II tool. Reduced doses of manganese, copper, chromium and molybdenum, and an increased dose of selenium are recommended when compared with the 1999 guidelines. Currently the composition of available multi-trace element formulations is recognised as an obstacle to aligning these guidelines with practice. A paucity of available literature and limitations with currently available methods of monitoring trace element status are acknowledged. The currently unknown clinical impact of changes to trace element contamination of parenteral solutions with contemporary practices highlights need for research and clinical vigilance in this area of nutrition support practice. Trace elements are essential and should be provided daily to patients receiving parenteral nutrition. Monitoring is generally only required in longer term parenteral nutrition, however should be determined on an individual basis. Industry is encouraged to modify existing multi-trace element solutions available in Australia and New Zealand to reflect changes in the literature outlined in these guidelines. Areas requiring research are highlighted.
NASA Astrophysics Data System (ADS)
Hillebrand, O.; Nödler, K.; Licha, T.; Geyer, T.
2012-04-01
The application of organic micro-contaminants as indicators for contamination sources in aquifers and surface-water bodies has been increasingly discussed in the literature over the last years. One of the proposed substances was caffeine. It served as indicator for wastewater-leakage to various systems. As well, wastewater volumes could be estimated from caffeine concentrations. Although caffeine is known to be degradable, the degradation rates are normally only determined from mass balances or laboratory experiments. Degradation rates obtained from mass balances are relatively uncertain, as the input-function is difficult to be assessed. Laboratory experiments are hardly capable to consider the full complexity of natural systems and can rarely be transferred to those. To solve this problem, in-situ degradation rates of reactive indicators have to be determined. Especially multitracer tests can be used to access compound-specific transport parameters and degradation rates, relative to conservative tracers. A multitracer test with caffeine and uranine has been performed in a karst system (catchment of the Gallusquelle spring, SW Germany). From the breakthrough curves of the tracers, the transport behavior and the in-situ degradation rate of caffeine could be deduced. The tracers were injected into a sinkhole with a linear distance of 3000 m to the spring. The mean residence time of the tracers was found to be 84 h at a flow velocity of 35 m/h. Throughout the whole experiment, the spring discharge was constant at 187 L/s. Uranine served as conservative reference-tracer for the calibration of a one-dimensional transport model with respect to solute-unspecific parameters. Relative to that, the tracer breakthrough curve of caffeine was interpreted. As solute-specific parameters the retardation coefficient as well as degradation rate of caffeine in the investigated karst aquifer could be determined. The results indicate, that caffeine is slightly retarded in the investigated aquifer (R= 1.031-1.046) and is readily degradable (half-life t1/2= 90-105 h; temperature of the spring water T= 8-9 °C). The degradation rate is surprisingly high. In general, no significant degradation is believed to occur, during the rapid transport in karst systems. The high degradation rates of caffeine illustrate the potential to use this substance as reactive tracer to indicate biological activity within the aquifer. Due to the good degradability of caffeine it does not pose a threat as long-time contamination and can therefore safely be used as reactive tracer in aquifer systems.
Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion
NASA Astrophysics Data System (ADS)
Witten, B.; Shragge, J. C.
2016-12-01
The recent increased focus on small-scale seismicity, Mw < 4 has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations primarily for wastewater disposal and hydraulic fracturing for oil and gas recovery and for geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small magnitude seismicity: a greater number of seismometers are being deployed in large N-arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.
NASA Astrophysics Data System (ADS)
Suckow, Axel; Taylor, Andrew; Davies, Phil; Leaney, Fred
2017-04-01
Depressurisation of coal seams in the Walloon Coal Measures in Queensland, Australia, may influence aquifers both over- and underlying the formation. The Gubberamunda Sandstone aquifer, which overlies the Walloon Coal Measures, is the starting point of the Great Artesian Basin (GAB) flow system and has been the focus of numerous recharge studies. In comparison, the Hutton Sandstone aquifer, which underlies the Walloon Coal Measures, has received much less attention. This aquifer however, is the main supply of stock water for the beef industry in the area. A multi-environmental tracer study of the Hutton Sandstone aquifer was undertaken at the Mimosa Syncline and was complemented by a few samples taken from the underlying Precipice Sandstone aquifer. This multi-tracer study (comprising 18O, 2H, 3H, CFCs, SF6, 14C, 36Cl, and 4He) demonstrated that the Hutton Sandstone aquifer behaves as a double porosity system. At the regional scale, the system features a relatively small fraction of conductive rock within a fairly large fraction of low permeability rock. Tracer migration therefore occurs mainly by advection in the conductive fraction and mainly by diffusion in the low-permeability fraction of the aquifer. Groundwater flow velocities, derived from exponential decrease of 14C and 36Cl concentrations with distance, differ by a factor of ten and therefore do not indicate the real groundwater flow velocity. However, accounting for a double porosity interpretation of the tracer data leads to a single groundwater flow velocity that is consistent with all observed data. Advective velocity in this double porosity model differs from face value flow velocities derived from 14C and 36Cl by a factor of 4 and 40 respectively. As a consequence of this interpretation, the deeper groundwater flow system of the Hutton Sandstone aquifer is estimated to receive only 3% of the recharge previously estimated using the Chloride Mass Balance approach at the intake beds. The other 97% is assumed to be rejected recharge which discharges through spring complexes in the Surat Basin and contributes to base flow of the Dawson River. This interpretation also suggests: 1) that the Hutton Sandstone aquifer is potentially more vulnerable to impacts from groundwater abstraction, including from stock and domestic water supply and coal seam gas production, than previously anticipated; 2) that other "groundwater age records" around the world likely observe similar double porosity effects and their apparent ages may be similarly distorted; and 3) that the multi-tracer approach used here is a suitable method for identifying other previously unknown double porosity aquifer systems and can potentially quantify deep effective recharge where important water resources are subject of economic development.
NASA Astrophysics Data System (ADS)
Kendall, C.; Silva, S. R.; Young, M. B.
2013-12-01
While nutrient isotopes are a well-established tool for quantifying nutrients inputs from agricultural vs wastewater treatment plant (WWTP) sources, we have found that combining nutrient isotopes with the C, N, and S isotopic compositions of dissolved and particulate organic matter, as part of a comprehensive multi-isotope and multi-tracer approach, is a much more diagnostic approach. The main reasons why organic matter C-N-S isotopes are a useful adjunct to studies of nutrient sources and biogeochemical processes are that the dissolved and particulate organic matter associated with (1) different kinds of animals (e.g., humans vs cows) often have distinctive isotopic compositions reflecting the different diets of the animals, and (2) the different processes associated with the different land uses (e.g., in the WWTP or associated with different crop types) often result in significant differences in the isotopic compositions of the organics. The analysis of the δ34S of particulate organic matter (POM) and dissolved organic matter (DOM) has been found to be especially useful for distinguishing and quantifying water, nutrient, and organic contributions from different land uses in aquatic systems where much of the organic matter is aquatic in origin. In such environments, the bacteria and algae incorporate S from sulfate and sulfide that is isotopically labeled by the different processes associated with different land uses. We have found that there is ~35 permil range in δ34S of POM along the river-estuary continuum in the San Joaquin/Sacramento River basin, with low values associated with sulfate reduction in the upstream wetlands and high values associated with tidal inputs of marine water into the estuary. Furthermore, rice agriculture results in relatively low δ34S values whereas WWTP effluent in the Sacramento River produces distinctly higher values than upstream of the WWTP, presumably because SO2 is used to treat chlorinated effluent. The fish living downstream of these different land uses become isotopically labeled by the environments, making δ34S a useful tracer of fish derived from these different environments. This presentation will use examples from several large-scale river and wetlands studies to demonstrate useful applications of POM and DOM isotopes for environmental monitoring studies, and will discuss the relative merits of different methods for the collection and analysis of POM and DOM samples for C, N, and S isotopes.
Timescales for nitrate contamination of spring waters, northern Florida, USA
Katz, B.G.; Böhlke, J.K.; Hornsby, H.D.
2001-01-01
Residence times of groundwater, discharging from springs in the middle Suwannee River Basin, were estimated using chlorofluorocarbons (CFCs), tritium (3H), and tritium/helium-3 (3H/3He) age-dating methods to assess the chronology of nitrate contamination of spring waters in northern Florida. During base-flow conditions for the Suwannee River in 1997–1999, 17 water samples were collected from 12 first, second, and third magnitude springs discharging groundwater from the Upper Floridan aquifer. Extending age-dating techniques, using transient tracers to spring waters in complex karst systems, required an assessment of several models [piston-flow (PFM), exponential mixing (EMM), and binary-mixing (BMM)] to account for different distributions of groundwater age. Multi-tracer analyses of four springs yielded generally concordant PFM ages of around 20±2 years from CFC-12, CFC-113, 3H, and 3He, with evidence of partial CFC-11 degradation. The EMM gave a reasonable fit to CFC-113, CFC-12, and 3H data, but did not reproduce the observed 3He concentrations or 3H/3He ratios, nor did a combination PFM–EMM. The BMM could reproduce most of the multi-tracer data set only if both endmembers had 3H concentrations not much different from modern values. CFC analyses of 14 additional springs yielded apparent PFM ages from about 10 to 20 years from CFC-113, with evidence of partial CFC-11 degradation and variable CFC-12 contamination. While it is not conclusive, with respect to the age distribution within each spring, the data indicate that the average residence times were in the order of 10–20 years and were roughly proportional to spring magnitude. Applying similar models to recharge and discharge of nitrate based on historical nitrogen loading data yielded contrasting trends for Suwanee County and Lafayette County. In Suwanee County, spring nitrate trends and nitrogen isotope data were consistent with a peak in fertilizer input in the 1970s and a relatively high overall ratio of artificial fertilizer/manure; whereas in Lafayette County, spring nitrate trends and nitrogen isotope data were consistent with a more monotonic increase in fertilizer input and relatively low overall ratio of artificial fertilizer/manure. The combined results of this study indicate that the nitrate concentrations of springs in the Suwannee River basin have responded to increased nitrogen loads from various sources in the watersheds over the last few decades; however, the responses have been subdued and delayed because the average residence time of groundwater discharging from springs are in the order of decades.
Recent and ancient recharge deciphered by multi-dating tracer technique
NASA Astrophysics Data System (ADS)
Dogramaci, Shawan; Cook, Peter; Mccallum, Jimes; Purtchert, Roland
2017-04-01
Determining groundwater residence time from environmental tracer concentrations obtained from open bores or long screened intervals is fraught with difficulty because the sampled water represents variety of ages. Information on the distribution of groundwater age is commonly obtained by measuring more than one tracer. We examined the use of the multi-tracer technique representing different time frames (39Ar, 85Kr, 14C, 3H, CFC 11- CFC-12 CFC-113, SF6 and Cl,) to decipher the groundwater ages sampled from long screened bores in a regional aquifer in the Pilbara region of northwest Australia. We then applied a technique that assumes limited details of the form of the age distribution. Tracer concentrations suggest that groundwater samples are a mixture of young and old water - the former is inferred to represent localised recharge from an adjacent creek, and the latter to be diffuse recharge. Using our method, we were able to identify distinct age components in the groundwater. The results suggest the presence of four distinct age groups; zero and 20 years, 50 to 100 years, 100 to 600 years and approximately 1000 years old. These relatively high recharge events were consistent with local recharge sources (50-100 years) and confirmed by palaeo-climate record obtained from lake sediments. We found that although the ages of these components were well constrained, the relative proportions of each component was highly sensitive to errors of environmental tracer data. Our results show that the method we implemented can identify distinct age groups in groundwater samples without prior knowledge of the age distribution. The presence of distinct recharge times gives insight into groundwater flow conditions over long periods of time.
Multitracer CMB delensing maps from Planck and WISE data
NASA Astrophysics Data System (ADS)
Yu, Byeonghee; Hill, J. Colin; Sherwin, Blake D.
2017-12-01
Delensing, the removal of the limiting lensing B -mode background, is crucial for the success of future cosmic microwave background (CMB) surveys in constraining inflationary gravitational waves (IGWs). In recent work, delensing with large-scale structure tracers has emerged as a promising method both for improving constraints on IGWs and for testing delensing methods for future use. However, the delensing fractions (i.e., the fraction of the lensing-B mode power removed) achieved by recent efforts have been only 20%-30%. In this work, we provide a detailed characterization of a full-sky, dust-cleaned cosmic infrared background (CIB) map for delensing and construct a further-improved delensing template by adding additional tracers to increase delensing performance. In particular, we build a multitracer delensing template by combining the dust-cleaned Planck CIB map with a reconstructed CMB lensing map from Planck and a galaxy number density map from the Wide-field Infrared Survey Explorer (WISE) satellite. For this combination, we calculate the relevant weightings by fitting smooth templates to measurements of all the cross-spectra and autospectra of these maps. On a large fraction of the sky (fsky=0.43 ), we demonstrate that our maps are capable of providing a delensing factor of 43 ±1 % ; using a more restrictive mask (fsky=0.11 ), the delensing factor reaches 48 ±1 % . For low-noise surveys, our delensing maps, which cover much of the sky, can thus improve constraints on the tensor-to-scalar ratio (r ) by nearly a factor of 2. The delensing tracer maps are made publicly available, and we encourage their use in ongoing and upcoming B -mode surveys.
Hillebrand, Olav; Nödler, Karsten; Sauter, Martin; Licha, Tobias
2015-02-15
The increasing pressure on drinking water resources necessitates an efficient management of potential and actual drinking water resources. Karst aquifers play a key role in the supply of the world's population with drinking water. Around one quarter of all drinking water is produced from these types of aquifers. Unfortunately due to the aquifer characteristics with extremely high hydraulic conductivities and short residence times, these systems are vulnerable to contamination. For successful management, a fundamental understanding of mass transport and attenuation processes with respect to potential contaminants is vital. In this study, a multitracer experiment was performed in a karst aquifer in SW-Germany for determining the attenuation capacity of a karst environment by assessing the environmental fate of selected relevant micropollutants. Uranine, acesulfame and carbamazepine were injected into a sinkhole as reference tracers together with the reactive compounds atenolol, caffeine, cyclamate, ibuprofen and paracetamol (also known as acetaminophen). The breakthrough of the tracers was monitored at a karst spring at a distance of ca. 3 km. The breakthrough curves of the reactive compounds were interpreted relative to the reference substances. No significant retardation was found for any of the investigated micropollutants. The determined half-lives of the reactive compounds range from 38 to 1,400 h (i.e. persistent within the investigation period) in the following order (from high to no observed attenuation): paracetamol>atenolol≈ibuprofen>caffeine≫cyclamate. The attenuation rates are generally in agreement with studies from other environmental compartments. The occurrence of the biotransformation product atenolol acid served as evidence for in-situ biodegradation within the aquifer system. Copyright © 2014 Elsevier B.V. All rights reserved.
Neural mechanism underlying autobiographical memory modulated by remoteness and emotion
NASA Astrophysics Data System (ADS)
Ge, Ruiyang; Fu, Yan; Wang, DaHua; Yao, Li; Long, Zhiying
2012-03-01
Autobiographical memory is the ability to recollect past events from one's own life. Both emotional tone and memory remoteness can influence autobiographical memory retrieval along the time axis of one's life. Although numerous studies have been performed to investigate brain regions involved in retrieving processes of autobiographical memory, the effect of emotional tone and memory age on autobiographical memory retrieval remains to be clarified. Moreover, whether the involvement of hippocampus in consolidation of autobiographical events is time dependent or independent has been controversial. In this study, we investigated the effect of memory remoteness (factor1: recent and remote) and emotional valence (factor2: positive and negative) on neural correlates underlying autobiographical memory by using functional magnetic resonance imaging (fMRI) technique. Although all four conditions activated some common regions known as "core" regions in autobiographical memory retrieval, there are some other regions showing significantly different activation for recent versus remote and positive versus negative memories. In particular, we found that bilateral hippocampal regions were activated in the four conditions regardless of memory remoteness and emotional valence. Thus, our study confirmed some findings of previous studies and provided further evidence to support the multi-trace theory which believes that the role of hippocampus involved in autobiographical memory retrieval is time-independent and permanent in memory consolidation.
NASA Astrophysics Data System (ADS)
Creech, L. T.; Donahoe, R. J.
2009-12-01
This paper documents water quality conditions of the Lake Tuscaloosa, Alabama water-supply reservoir and its watershed under two end-members of hydrologic and climatic variability. These data afford the opportunity to view water quality in the context of both land use and drought, facilitating the development of coupled hydrologic and water-quality forecast models to guide watershed management decisions. This study demonstrates that even the region’s normal 10-year drought cycle holds the capacity to significantly impact water quality and should be incorporated into watershed models and decision-making. To accomplish the goals of this project, a multi-tracer approach has been adopted to assess solute sources and water-quality impairments induced by land use. The biogeochemical tracers include: Major- and minor-ions, trace metals, nutrient speciation and stable-isotope tracers at natural abundance levels. These tracers are also vital to understand the role of climate variability in the context of a heterogeneous landscape. Eight seasonal sampling events across 23 sample locations and two water years yield 184 discrete water-quality samples representative of a range of landscape variability and climatological conditions. Each sample was analyzed for 27 solute species and relevant indicators of water quality. Climatological data was obtained from public repositories (NCDC, USDA); hydrologic data from stream and precipitation gages within the watershed (USGS). Multivariate statistics are used to facilitate the numerical analysis and interpretation of the resulting data. Measurements of nitrogen speciation were collected to document patterns of nutrient loading and nitrogen cycling. These data are augmented by the analysis of nitrogen and oxygen isotopes of nitrate. These data clarify the extent to which nitrogen is being loaded in the non-growing season as well as the capacity of the lake to assimilate nutrients. Under drought conditions the lake becomes nitrogen-limited at most locations. Yet, despite these low concentrations of dissolved nitrogen, Diel measurements reveal that the lake achieves a eutrophic state (due to algal productivity and decomposition). This ecological state is also associated with elevated coliform bacteria in the lake, at times exceeding regulatory limits. Although the lake assimilates excess dissolved nitrogen via enhanced productivity, the process constitutes a water-resource impairment. In this context, the stable-isotope tracer component of the project both: 1) accounts for nitrogen sources and mixing, and 2) clarifies the relative importance of nitrogen assimilation vs. biogeochemical cycling. Multivariate analyses of nutrient data, plus that of metals and rock-weathering solutes further clarify the fate of nitrogen at times and locations that nitrogen flux is less than in most river basins, and less than existing models might predict. By extension, these data may also afford deeper understanding of the larger Mobile River Basin’s 'missing' nitrogen loads under variable flow conditions. This phenomenon offers a protective effect against even faster eutrophication rates (than already exist) in our coastal waters, yet is incompletely understood.
Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment
NASA Astrophysics Data System (ADS)
David, S.; Visvikis, D.; Roux, C.; Hatt, M.
2011-09-01
In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods, proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation of the evaluation leading to more pertinent measurements. Future work will consist in extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on the biological tumor volume definition for radiotherapy applications.
Trace element and isotope deposition across the air–sea interface: progress and research needs
Landing, W. M.; Bucciarelli, E.; Cheize, M.; Fietz, S.; Hayes, C. T.; Kadko, D.; Morton, P. L.; Rogan, N.; Sarthou, G.; Shelley, R. U.; Shi, Z.; Shiller, A.; van Hulten, M. M. P.
2016-01-01
The importance of the atmospheric deposition of biologically essential trace elements, especially iron, is widely recognized, as are the difficulties of accurately quantifying the rates of trace element wet and dry deposition and their fractional solubility. This paper summarizes some of the recent progress in this field, particularly that driven by the GEOTRACES, and other, international research programmes. The utility and limitations of models used to estimate atmospheric deposition flux, for example, from the surface ocean distribution of tracers such as dissolved aluminium, are discussed and a relatively new technique for quantifying atmospheric deposition using the short-lived radionuclide beryllium-7 is highlighted. It is proposed that this field will advance more rapidly by using a multi-tracer approach, and that aerosol deposition models should be ground-truthed against observed aerosol concentration data. It is also important to improve our understanding of the mechanisms and rates that control the fractional solubility of these tracers. Aerosol provenance and chemistry (humidity, acidity and organic ligand characteristics) play important roles in governing tracer solubility. Many of these factors are likely to be influenced by changes in atmospheric composition in the future. Intercalibration exercises for aerosol chemistry and fractional solubility are an essential component of the GEOTRACES programme. This article is part of the themed issue ‘Biological and climatic impacts of ocean trace element chemistry’. PMID:29035268
Trace element and isotope deposition across the air-sea interface: progress and research needs
NASA Astrophysics Data System (ADS)
Baker, A. R.; Landing, W. M.; Bucciarelli, E.; Cheize, M.; Fietz, S.; Hayes, C. T.; Kadko, D.; Morton, P. L.; Rogan, N.; Sarthou, G.; Shelley, R. U.; Shi, Z.; Shiller, A.; van Hulten, M. M. P.
2016-11-01
The importance of the atmospheric deposition of biologically essential trace elements, especially iron, is widely recognized, as are the difficulties of accurately quantifying the rates of trace element wet and dry deposition and their fractional solubility. This paper summarizes some of the recent progress in this field, particularly that driven by the GEOTRACES, and other, international research programmes. The utility and limitations of models used to estimate atmospheric deposition flux, for example, from the surface ocean distribution of tracers such as dissolved aluminium, are discussed and a relatively new technique for quantifying atmospheric deposition using the short-lived radionuclide beryllium-7 is highlighted. It is proposed that this field will advance more rapidly by using a multi-tracer approach, and that aerosol deposition models should be ground-truthed against observed aerosol concentration data. It is also important to improve our understanding of the mechanisms and rates that control the fractional solubility of these tracers. Aerosol provenance and chemistry (humidity, acidity and organic ligand characteristics) play important roles in governing tracer solubility. Many of these factors are likely to be influenced by changes in atmospheric composition in the future. Intercalibration exercises for aerosol chemistry and fractional solubility are an essential component of the GEOTRACES programme. This article is part of the themed issue 'Biological and climatic impacts of ocean trace element chemistry'.
Trace element and isotope deposition across the air-sea interface: progress and research needs.
Baker, A R; Landing, W M; Bucciarelli, E; Cheize, M; Fietz, S; Hayes, C T; Kadko, D; Morton, P L; Rogan, N; Sarthou, G; Shelley, R U; Shi, Z; Shiller, A; van Hulten, M M P
2016-11-28
The importance of the atmospheric deposition of biologically essential trace elements, especially iron, is widely recognized, as are the difficulties of accurately quantifying the rates of trace element wet and dry deposition and their fractional solubility. This paper summarizes some of the recent progress in this field, particularly that driven by the GEOTRACES, and other, international research programmes. The utility and limitations of models used to estimate atmospheric deposition flux, for example, from the surface ocean distribution of tracers such as dissolved aluminium, are discussed and a relatively new technique for quantifying atmospheric deposition using the short-lived radionuclide beryllium-7 is highlighted. It is proposed that this field will advance more rapidly by using a multi-tracer approach, and that aerosol deposition models should be ground-truthed against observed aerosol concentration data. It is also important to improve our understanding of the mechanisms and rates that control the fractional solubility of these tracers. Aerosol provenance and chemistry (humidity, acidity and organic ligand characteristics) play important roles in governing tracer solubility. Many of these factors are likely to be influenced by changes in atmospheric composition in the future. Intercalibration exercises for aerosol chemistry and fractional solubility are an essential component of the GEOTRACES programme.This article is part of the themed issue 'Biological and climatic impacts of ocean trace element chemistry'. © 2015 The Authors.
Mixing of shallow and deep groundwater as indicated by the chemistry and age of karstic springs
NASA Astrophysics Data System (ADS)
Toth, David J.; Katz, Brian G.
2006-06-01
Large karstic springs in east-central Florida, USA were studied using multi-tracer and geochemical modeling techniques to better understand groundwater flow paths and mixing of shallow and deep groundwater. Spring water types included Ca-HCO3 (six), Na-Cl (four), and mixed (one). The evolution of water chemistry for Ca-HCO3 spring waters was modeled by reactions of rainwater with soil organic matter, calcite, and dolomite under oxic conditions. The Na-Cl and mixed-type springs were modeled by reactions of either rainwater or Upper Floridan aquifer water with soil organic matter, calcite, and dolomite under oxic conditions and mixed with varying proportions of saline Lower Floridan aquifer water, which represented 4-53% of the total spring discharge. Multiple-tracer data—chlorofluorocarbon CFC-113, tritium (3H), helium-3 (3Hetrit), sulfur hexafluoride (SF6)—for four Ca-HCO3 spring waters were consistent with binary mixing curves representing water recharged during 1980 or 1990 mixing with an older (recharged before 1940) tracer-free component. Young-water mixing fractions ranged from 0.3 to 0.7. Tracer concentration data for two Na-Cl spring waters appear to be consistent with binary mixtures of 1990 water with older water recharged in 1965 or 1975. Nitrate-N concentrations are inversely related to apparent ages of spring waters, which indicated that elevated nitrate-N concentrations were likely contributed from recent recharge.
Multitracing Experiment With Solved and Particulate Tracers In An Unsaturated Field Soil
NASA Astrophysics Data System (ADS)
Burkhardt, M.; Kasteel, R.; Vereecken, H.
Solute movement and colloid migration follow preferential flow paths in structured soils at the field scale. The use of microsphreres is a possible option to mimic colloid transport through the vadose zone into the groundwater. We present results of multi- tracing experiments conducted in an Orthic Luvisol using bromide (Br-), the reactive dye tracer Brilliant Blue (BB) and microspheres. The fluorescent microspheres (1 and 10 µm in diameter) were functionalized with a negative surface charge. Eight field plots (about 2 m2) were irrigated with 10 mm and 40 mm during 6 h. Four field plots were sampled directly after the irrgation, the others were exposed for 90 days to natural wheather conditions. Photographs of horizontal cross-sections and disturbed soil sam- ples were taken every 5 to 10 cm down to a depth of 160 cm. Image analysis was used to derive concentration distributions of BB using a calibration relationship between concentration and color spectra. The microspheres were quantified after desorption of the soil samples by fluorescent microscopy and image analysis. We used moment analysis to characterize transport phenomena. We found that transport through the soil matrix was affected by sorption, but all of the applied compounds were transported through preferential flow paths (earthworm burrows) down to a depth of 160 cm irre- spective of their chemical properties. Furthermore, this study shows that microspheres can be used to mimic colloid facilitated transport under unsaturated conditions in a field soil.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Guoping; Mayes, Melanie; Parker, Jack C
2010-01-01
We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) couldmore » be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.« less
Mixing of shallow and deep groundwater as indicated by the chemistry and age of karstic springs
Toth, D.J.; Katz, B.G.
2006-01-01
Large karstic springs in east-central Florida, USA were studied using multi-tracer and geochemical modeling techniques to better understand groundwater flow paths and mixing of shallow and deep groundwater. Spring water types included Ca-HCO3 (six), Na-Cl (four), and mixed (one). The evolution of water chemistry for Ca-HCO3 spring waters was modeled by reactions of rainwater with soil organic matter, calcite, and dolomite under oxic conditions. The Na-Cl and mixed-type springs were modeled by reactions of either rainwater or Upper Floridan aquifer water with soil organic matter, calcite, and dolomite under oxic conditions and mixed with varying proportions of saline Lower Floridan aquifer water, which represented 4-53% of the total spring discharge. Multiple-tracer data-chlorofluorocarbon CFC-113, tritium (3H), helium-3 (3Hetrit), sulfur hexafluoride (SF6) - for four Ca-HCO3 spring waters were consistent with binary mixing curves representing water recharged during 1980 or 1990 mixing with an older (recharged before 1940) tracer-free component. Young-water mixing fractions ranged from 0.3 to 0.7. Tracer concentration data for two Na-Cl spring waters appear to be consistent with binary mixtures of 1990 water with older water recharged in 1965 or 1975. Nitrate-N concentrations are inversely related to apparent ages of spring waters, which indicated that elevated nitrate-N concentrations were likely contributed from recent recharge. ?? Springer-Verlag 2006.
NASA Astrophysics Data System (ADS)
Toth, David J.; Katz, Brian G.
2006-09-01
Large karstic springs in east-central Florida, USA were studied using multi-tracer and geochemical modeling techniques to better understand groundwater flow paths and mixing of shallow and deep groundwater. Spring water types included Ca-HCO3 (six), Na-Cl (four), and mixed (one). The evolution of water chemistry for Ca-HCO3 spring waters was modeled by reactions of rainwater with soil organic matter, calcite, and dolomite under oxic conditions. The Na-Cl and mixed-type springs were modeled by reactions of either rainwater or Upper Floridan aquifer water with soil organic matter, calcite, and dolomite under oxic conditions and mixed with varying proportions of saline Lower Floridan aquifer water, which represented 4-53% of the total spring discharge. Multiple-tracer data—chlorofluorocarbon CFC-113, tritium (3H), helium-3 (3Hetrit), sulfur hexafluoride (SF6)—for four Ca-HCO3 spring waters were consistent with binary mixing curves representing water recharged during 1980 or 1990 mixing with an older (recharged before 1940) tracer-free component. Young-water mixing fractions ranged from 0.3 to 0.7. Tracer concentration data for two Na-Cl spring waters appear to be consistent with binary mixtures of 1990 water with older water recharged in 1965 or 1975. Nitrate-N concentrations are inversely related to apparent ages of spring waters, which indicated that elevated nitrate-N concentrations were likely contributed from recent recharge.
Hunt, Charles D.; Rosa, Sarah N.
2009-01-01
Municipal wastewater plumes discharging from aquifer to ocean were detected by nearshore wading surveys at Kihei and Lahaina, on the island of Maui in Hawaii. Developed in cooperation with the Hawaii State Department of Health, the survey methodology included instrument trolling to detect submarine groundwater discharge, followed by analysis of water and macroalgae for a suite of chemical and isotopic constituents that constitute a 'multitracer' approach. Surveys were conducted May 6-28, 2008, during fair-weather conditions and included: (1) wading and kayak trolling with a multiparameter water-quality sonde, (2) marine water-column sampling, and (3) collection of benthic algae samples. Instrument trolling helped guide the water sampling strategy by providing dense, continuous transects of water properties on which groundwater discharge zones could be identified. Water and algae samples for costly chemical and isotopic laboratory analyses were last to be collected but were highly diagnostic of wastewater presence and nutrient origin because of low detection levels and confirmation across multiple tracers. Laboratory results confirmed the presence of wastewater constituents in marine water-column samples at both locales and showed evidence of modifying processes such as denitrification and mixing of effluent with surrounding groundwater and seawater. Carbamazepine was the most diagnostic pharmaceutical, detected in several marine water-column samples and effluent at both Kihei and Lahaina. Heavy nitrogen-isotope compositions in water and algae were highly diagnostic of effluent, particularly where enriched to even heavier values than effluent source compositions by denitrification. Algae provided an added advantage of time-integrating their nitrogen source during growth. The measured Kihei plume coincided almost exactly with prior model predictions, but the Lahaina plume was detected well south of the expected direct path from injection wells to shore and may be guided by a buried valley fill from an ancestral course of Honokowai Stream. Nutrient concentrations in upland wells at Lahaina were comparable to concentrations in wastewater but originate instead from agricultural fertilizers. A key factor in detecting and mapping the wastewater plumes was sampling very close to shore (mostly within 20 m or so) and in very shallow water (mostly 0.5 to 2 m depth). Effluent probably discharges somewhat offshore as well, although prior attempts to detect an injected fluorescent tracer at Lahaina in the 1990s were inconclusive, having focused farther offshore in water mostly 10-30 m deep. Sampling of benthic porewater and algae would offer the best chances for further effluent detection and mapping offshore, and sampling of onland monitor wells could provide additional understanding of geochemical processes that take place in the effluent plumes and bring about some degree of natural attenuation of nutrients.
A Multi-tracer Approach to Determining the Fate of Wastewater in Groundwater
NASA Astrophysics Data System (ADS)
Moran, J. E.; Beller, H. R.; Leif, R.; Singleton, M. J.
2006-12-01
In California, demand for limited fresh water supplies for use as drinking water has increased, and recycled water is increasingly used for irrigation or for groundwater recharge. In this study, analysis of multiple tracers, including general minerals, stable isotopes of the water molecule (for source water identification and evidence for evaporation) and of nitrate (wastewater denitrification indicators), and tritium-helium groundwater age, allow identification and quantification of the fraction of water produced at a well that originated as applied wastewater effluent. Wastewater target compounds include metabolites of alkylphenol ethoxylate nonionic surfactants, pharmaceuticals such as ibuprofen and carbamazepine, personal care products such as triclosan and polycyclic musk fragrance compounds, the insect repellent DEET, and caffeine. In spite of a high fraction (up to 70 percent) of wastewater recharge produced at monitoring wells from two sites (in Livermore, CA and Gilroy, CA), the only detections greater than 50 ng/L were of alkylphenol carboxylic acids and the anti-seizure pharmaceuticals carbamazepine and primadone. However, even these compounds occurred at concentrations in groundwater that were significantly lower than concentrations observed in treated wastewater effluent. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-ENG-48.
NASA Astrophysics Data System (ADS)
Chabaux, F. J.; Prunier, J.; Pierret, M.; Stille, P.
2012-12-01
The characterization of the present-day weathering processes controlling the chemical composition of waters and soils in natural ecosystems is an important issue to predict and to model the response of ecosystems to recent environmental changes. It is proposed here to highlight the interest of a multi-tracer geochemical approach combining measurement of major and trace element concentrations along with U and Sr isotopic ratios to progress in this topic. This approach has been applied to the small granitic Strengbah Catchment, located in the Vosges Mountain (France), used and equipped as a hydro-geochemical observatory since 1986 (Observatoire Hydro-Géochimique de l'Environnement; http://ohge.u-strasbg.fr). This study includes the analysis of major and trace element concentrations and (U-Sr) isotope ratios in soil solutions collected within two soil profiles located on two experimental plots of this watershed, as well as the analysis of soil samples and vegetation samples from these two plots The depth variation of elemental concentration of soil solutions confirms the important influence of the vegetation cycling on the budget of Ca, K, Rb and Sr, whereas Mg and Si budget in soil solutions are quasi exclusively controlled by weathering processes. Variation of Sr, and U isotopic ratios with depth also demonstrates that the sources and biogeochemical processes controlling the Sr budget of soil solutions is different in the uppermost soil horizons and in the deeper ones, and clearly influence by the vegetation cycling.
NASA Technical Reports Server (NTRS)
Shi, Fang; Basinger, Scott A.; Redding, David C.
2006-01-01
Dispersed Fringe Sensing (DFS) is an efficient and robust method for coarse phasing of a segmented primary mirror such as the James Webb Space Telescope (JWST). In this paper, modeling and simulations are used to study the effect of segmented mirror aberrations on the fringe image, DFS signals and DFS detection accuracy. The study has shown due to the pixilation spatial filter effect from DFS signal extraction the effect of wavefront error is reduced and DFS algorithm will be more robust against wavefront aberration by using multi-trace DFS approach. We also studied the JWST Dispersed Hartmann Sensor (DHS) performance in presence of wavefront aberrations caused by the gravity sag and we use the scaled gravity sag to explore the JWST DHS performance relationship with the level of the wavefront aberration. This also includes the effect from line-of-sight jitter.
Gardner, W.P.; Susong, D.D.; Solomon, D.K.; Heasler, H.P.
2011-01-01
Multiple environmental tracers are used to investigate age distribution, evolution, and mixing in local- to regional-scale groundwater circulation around the Norris Geyser Basin area in Yellowstone National Park. Springs ranging in temperature from 3??C to 90??C in the Norris Geyser Basin area were sampled for stable isotopes of hydrogen and oxygen, major and minor element chemistry, dissolved chlorofluorocarbons, and tritium. Groundwater near Norris Geyser Basin is comprised of two distinct systems: a shallow, cool water system and a deep, high-temperature hydrothermal system. These two end-member systems mix to create springs with intermediate temperature and composition. Using multiple tracers from a large number of springs, it is possible constrain the distribution of possible flow paths and refine conceptual models of groundwater circulation in and around a large, complex hydrothermal system. Copyright 2011 by the American Geophysical Union.
NASA Astrophysics Data System (ADS)
Kolbe, T.; Abbott, B. W.; Marçais, J.; Thomas, Z.; Aquilina, L.; Labasque, T.; Pinay, G.; De Dreuzy, J. R.
2016-12-01
Groundwater transit time and flow path are key factors controlling nitrogen retention and removal capacity at the catchment scale (Abbott et al., 2016), but the relative importance of hydrogeological and topographical factors in determining these parameters remains uncertain (Kolbe et al., 2016). To address this unknown, we used numerical modelling techniques calibrated with CFC groundwater age data to quantify transit time and flow path in an unconfined aquifer in Brittany, France. We assessed the relative importance of parameters (aquifer depth, porosity, arrangement of geological layers, and permeability profile), hydrology (recharge rate), and topography in determining characteristic flow distances (Leray et al., 2016). We found that groundwater flow was highly local (mean travel distance of 350 m) but also relatively old (mean CFC age of 40 years). Sensitivity analysis revealed that groundwater travel distances were not sensitive to geological parameters within the constraints of the CFC age data. However, circulation was sensitive to topography in lowland areas where the groundwater table was close to the land surface, and to recharge rate in upland areas where water input modulated the free surface of the aquifer. We quantified these differences with a local groundwater ratio (rGW-LOCAL) defined as the mean groundwater travel distance divided by the equivalent surface distance water would have traveled along the land surface. Lowland rGW-LOCAL was near 1, indicating primarily topographic controls. Upland rGW-LOCALwas 1.6, meaning the groundwater recharge area was substantially larger than the topographically-defined catchment. This ratio was applied to other catchments in Brittany to test its relevance in comparing controls on groundwater circulation within and among catchments. REFERENCES Abbott et al., 2016, Using multi-tracer inference to move beyond single-catchment ecohydrology. Earth-Science Reviews. Kolbe et al., 2016, Coupling 3D groundwater modeling with CFC-based age dating to classify local groundwater circulation in an unconfined crystalline aquifer. J. Hydrol. Leray et al., 2016, Residence time distributions for hydrologic systems: Mechanistic foundations and steady-state analytical solutions. J. Hydrol.
Rodriguez-Vieitez, Elena; Saint-Aubert, Laure; Carter, Stephen F; Almkvist, Ove; Farid, Karim; Schöll, Michael; Chiotis, Konstantinos; Thordardottir, Steinunn; Graff, Caroline; Wall, Anders; Långström, Bengt; Nordberg, Agneta
2016-03-01
Alzheimer's disease is a multifactorial dementia disorder characterized by early amyloid-β, tau deposition, glial activation and neurodegeneration, where the interrelationships between the different pathophysiological events are not yet well characterized. In this study, longitudinal multitracer positron emission tomography imaging of individuals with autosomal dominant or sporadic Alzheimer's disease was used to quantify the changes in regional distribution of brain astrocytosis (tracer (11)C-deuterium-L-deprenyl), fibrillar amyloid-β plaque deposition ((11)C-Pittsburgh compound B), and glucose metabolism ((18)F-fluorodeoxyglucose) from early presymptomatic stages over an extended period to clinical symptoms. The 52 baseline participants comprised autosomal dominant Alzheimer's disease mutation carriers (n = 11; 49.6 ± 10.3 years old) and non-carriers (n = 16; 51.1 ± 14.2 years old; 10 male), and patients with sporadic mild cognitive impairment (n = 17; 61.9 ± 6.4 years old; nine male) and sporadic Alzheimer's disease (n = 8; 63.0 ± 6.5 years old; five male); for confidentiality reasons, the gender of mutation carriers is not revealed. The autosomal dominant Alzheimer's disease participants belonged to families with known mutations in either presenilin 1 (PSEN1) or amyloid precursor protein (APPswe or APParc) genes. Sporadic mild cognitive impairment patients were further divided into (11)C-Pittsburgh compound B-positive (n = 13; 62.0 ± 6.4; seven male) and (11)C-Pittsburgh compound B-negative (n = 4; 61.8 ± 7.5 years old; two male) groups using a neocortical standardized uptake value ratio cut-off value of 1.41, which was calculated with respect to the cerebellar grey matter. All baseline participants underwent multitracer positron emission tomography scans, cerebrospinal fluid biomarker analysis and neuropsychological assessment. Twenty-six of the participants underwent clinical and imaging follow-up examinations after 2.8 ± 0.6 years. By using linear mixed-effects models, fibrillar amyloid-β plaque deposition was first observed in the striatum of presymptomatic autosomal dominant Alzheimer's disease carriers from 17 years before expected symptom onset; at about the same time, astrocytosis was significantly elevated and then steadily declined. Diverging from the astrocytosis pattern, amyloid-β plaque deposition increased with disease progression. Glucose metabolism steadily declined from 10 years after initial amyloid-β plaque deposition. Patients with sporadic mild cognitive impairment who were (11)C-Pittsburgh compound B-positive at baseline showed increasing amyloid-β plaque deposition and decreasing glucose metabolism but, in contrast to autosomal dominant Alzheimer's disease carriers, there was no significant longitudinal decline in astrocytosis over time. The prominent initially high and then declining astrocytosis in autosomal dominant Alzheimer's disease carriers, contrasting with the increasing amyloid-β plaque load during disease progression, suggests astrocyte activation is implicated in the early stages of Alzheimer's disease pathology. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain.
NASA Astrophysics Data System (ADS)
Hanna, Andrea J. M.; Allison, Mead A.; Bianchi, Thomas S.; Marcantonio, Franco; Goff, John A.
2014-02-01
Arctic coastal environments near major river outfalls, like Simpson Lagoon, Alaska and the adjacent Colville River Delta, potentially contain high-resolution sediment records useful in elucidating late Holocene Arctic sediment transport pathways and coupled terrestrial-ocean evidence of paleoclimate variability. This study utilizes a multi-tracer geochronology approach (137Cs, 239,240Pu, and 14C) tailored for high-latitude environments to determine the age models for cores collected from Simpson Lagoon, and to date seismic boundaries in shallow acoustic reflection data (CHIRP) to examine late Holocene infill patterns. Modern (~100 y) sediment accumulation rates range from <0.02 to 0.46±0.04 cm y-1, with a primary depocenter in western Simpson Lagoon adjacent to the Colville Delta and a secondary depocenter in eastern Simpson Lagoon. CHIRP reflectors, age-constrained by 14C analysis, reveal rapid late Holocene (0-3500 y BP) transgression consistent with high modern shoreline retreat rates. The western depocenter contains >5 m of late Holocene interbedded sediments, likely derived primarily from the Colville River, with onset of accumulation occurring prior to ~3500 y BP. A paleo-high in central Simpson Lagoon, separating the two depocenters, was subaerially exposed prior to ~600 y BP. The millimeters-per-year sedimentation rates across the lagoon, coupled with the undisturbed, interbedded sediment record, indicate that these settings hold great potential to develop new Arctic paleoenvironmental records.
NASA Astrophysics Data System (ADS)
Mirkamali, M. S.; Keshavarz FK, N.; Bakhtiari, M. R.
2013-02-01
Faults, as main pathways for fluids, play a critical role in creating regions of high porosity and permeability, in cutting cap rock and in the migration of hydrocarbons into the reservoir. Therefore, accurate identification of fault zones is very important in maximizing production from petroleum traps. Image processing and modern visualization techniques are provided for better mapping of objects of interest. In this study, the application of fault mapping in the identification of fault zones within the Mishan and Aghajari formations above the Guri base unconformity surface in the eastern part of Persian Gulf is investigated. Seismic single- and multi-trace attribute analyses are employed separately to determine faults in a vertical section, but different kinds of geological objects cannot be identified using individual attributes only. A mapping model is utilized to improve the identification of the faults, giving more accurate results. This method is based on combinations of all individual relevant attributes using a neural network system to create combined attributes, which gives an optimal view of the object of interest. Firstly, a set of relevant attributes were separately calculated on the vertical section. Then, at interpreted positions, some example training locations were manually selected in each fault and non-fault class by an interpreter. A neural network was trained on combinations of the attributes extracted at the example training locations to generate an optimized fault cube. Finally, the results of the fault and nonfault probability cube were estimated, which the neural network applied to the entire data set. The fault probability cube was obtained with higher mapping accuracy and greater contrast, and with fewer disturbances in comparison with individual attributes. The computed results of this study can support better understanding of the data, providing fault zone mapping with reliable results.
NASA Astrophysics Data System (ADS)
Kopf, S.; McGlynn, S.; Cowley, E.; Green, A.; Newman, D. K.; Orphan, V. J.
2014-12-01
Metabolic rates of microbial communities constitute a key physiological parameter for understanding the in situ growth constraints for life in any environment. Isotope labeling techniques provide a powerful approach for measuring such biological activity, due to the use of isotopically enriched substrate tracers whose incorporation into biological materials can be detected with high sensitivity by isotope-ratio mass spectrometry. Nano-meter scale secondary ion mass spectrometry (NanoSIMS) combined with stable isotope labeling provides a unique tool for studying the spatiometabolic activity of microbial populations at the single cell level in order to assess both community structure and population diversity. However, assessing the distribution and range of microbial activity in complex environmental systems with slow-growing organisms, diverse carbon and nitrogen sources, or heterotrophic subpopulations poses a tremendous technical challenge because the introduction of isotopically labeled substrates frequently changes the nutrient availability and can inflate or bias measures of activity. Here, we present the use of hydrogen isotope labeling with deuterated water as an important new addition to the isotopic toolkit and apply it for the determination of single cell microbial activities by NanoSIMS imaging. This tool provides a labeling technique that minimally alters any aquatic chemical environment, can be administered with strong labels even in minimal addition (natural background is very low), is an equally universal substrate for all forms of life even in complex, carbon and nitrogen saturated systems, and can be combined with other isotopic tracers. The combination of heavy water labeling with the most commonly used NanoSIMS tracer, 15N, is technically challenging but opens up a powerful new set of multi-tracer experiments for the study of microbial activity in complex communities. We present the first truly simultaneous single cell triple isotope system measurements of 2H/1H, 13C/12C and 15N/14N and apply it to study of microbial metabolic heterogeneity and nitrogen metabolism in a continuous culture case study. Our data provide insight into both the diversity of microbial activity rates, as well as patterns of ammonium utilization at the single cell level.
NASA Astrophysics Data System (ADS)
Chatton, Eliot; Labasque, Thierry; Guillou, Aurélie; Béthencourt, Lorine; de La Bernardie, Jérôme; Boisson, Alexandre; Koch, Florian; Aquilina, Luc
2017-04-01
Identification of biogeochemical reactions in aquifers and determining kinetics is important for the prediction of contaminant transport in aquifers and groundwater management. Therefore, experiments accounting for both conservative and reactive transport are essential to understand the biogeochemical reactivity at field scale. This study presents the results of a groundwater tracer test using the combined injection of dissolved conservative and reactive tracers (He, Xe, Ar, Br-, O2 and NO3-) in order to evaluate the transport properties of a fractured media in Brittany, France. Dissolved gas concentrations were continuously monitored in situ with a CF-MIMS (Chatton et al, 2016) allowing a high frequency (1 gas every 2 seconds) multi-tracer analysis (N2, O2, CO2, CH4, N2O, H2, He, Ne, Ar, Kr, Xe) over a large resolution (6 orders of magnitude). Along with dissolved gases, groundwater biogeochemistry was monitored through the sampling of major anions and cations, trace elements and microbiological diversity. The results show breakthrough curves allowing the combined quantification of conservative and reactive transport properties. This ongoing work is an original approach investigating the link between heterogeneity of porous media and biogeochemical reactions at field scale. Eliot Chatton, Thierry Labasque, Jérôme de La Bernardie, Nicolas Guihéneuf, Olivier Bour and Luc Aquilina; Field Continuous Measurement of Dissolved Gases with a CF-MIMS: Applications to the Physics and Biogeochemistry of Groundwater Flow; Environmental Science & Technology, in press, 2016.
NASA Astrophysics Data System (ADS)
Dwivedi, R.; McIntosh, J. C.; Meixner, T.; Ferré, T. P. A.; Chorover, J.
2016-12-01
Mountain systems are critical sources of recharge to adjacent alluvial basins in dryland regions. Yet, mountain systems face poorly defined threats due to climate change in terms of reduced snowpack, precipitation changes, and increased temperatures. Fundamentally, the climate risks to mountain systems are uncertain due to our limited understanding of natural recharge processes. Our goal is to combine measurements and models to provide improved spatial and temporal descriptions of groundwater flow paths and transit times in a headwater catchment located in a sub-humid region. This information is important to quantifying groundwater age and, thereby, to providing more accurate assessments of the vulnerability of these systems to climate change. We are using: (a) combination of geochemical composition, along with 2H/18O and 3H isotopes to improve an existing conceptual model for mountain block recharge (MBR) for the Marshall Gulch Catchment (MGC) located within the Santa Catalina Mountains. The current model only focuses on shallow flow paths through the upper unconfined aquifer with no representation of the catchment's fractured-bedrock aquifer. Groundwater flow, solute transport, and groundwater age will be modeled throughout MGC using COMSOL Multiphysics® software. Competing models in terms of spatial distribution of required hydrologic parameters, e.g. hydraulic conductivity and porosity, will be proposed and these models will be used to design discriminatory data collection efforts based on multi-tracer methods. Initial end-member mixing results indicate that baseflow in MGC, if considered the same as the streamflow during the dry periods, is not represented by the chemistry of deep groundwater in the mountain system. In the ternary mixing space, most of the samples plot outside the mixing curve. Therefore, to further constrain the contributions of water from various reservoirs we are collecting stable water isotopes, tritium, and solute chemistry of precipitation, shallow groundwater, local spring water, MGC streamflow, and at a drainage location much lower than MGC outlet to better define and characterize each end-member of the ternary mixing model. Consequently, the end-member mixing results are expected to facilitate us in better understanding the MBR processes in and beyond MGC. Mountain systems are critical sources of recharge to adjacent alluvial basins in dryland regions. Yet, mountain systems face poorly defined threats due to climate change in terms of reduced snowpack, precipitation changes, and increased temperatures. Fundamentally, the climate risks to mountain systems are uncertain due to our limited understanding of natural recharge processes. Our goal is to combine measurements and models to provide improved spatial and temporal descriptions of groundwater flow paths and transit times in a headwater catchment located in a sub-humid region. This information is important to quantifying groundwater age and, thereby, to providing more accurate assessments of the vulnerability of these systems to climate change. We are using: (a) combination of geochemical composition, along with 2H/18O and 3H isotopes to improve an existing conceptual model for mountain block recharge (MBR) for the Marshall Gulch Catchment (MGC) located within the Santa Catalina Mountains. The current model only focuses on shallow flow paths through the upper unconfined aquifer with no representation of the catchment's fractured-bedrock aquifer. Groundwater flow, solute transport, and groundwater age will be modeled throughout MGC using COMSOL Multiphysics® software. Competing models in terms of spatial distribution of required hydrologic parameters, e.g. hydraulic conductivity and porosity, will be proposed and these models will be used to design discriminatory data collection efforts based on multi-tracer methods. Initial end-member mixing results indicate that baseflow in MGC, if considered the same as the streamflow during the dry periods, is not represented by the chemistry of deep groundwater in the mountain system. In the ternary mixing space, most of the samples plot outside the mixing curve. Therefore, to further constrain the contributions of water from various reservoirs we are collecting stable water isotopes, tritium, and solute chemistry of precipitation, shallow groundwater, local spring water, MGC streamflow, and at a drainage location much lower than MGC outlet to better define and characterize each end-member of the ternary mixing model. Consequently, the end-member mixing results are expected to facilitate us in better understanding the MBR processes in and beyond MGC.
Groundwater Recharge Processes Revealed By Multi-Tracers Approach in a Headwater, North China Plain
NASA Astrophysics Data System (ADS)
Sakakibara, K.; Tsujimura, M.; Song, X.; Zhang, J.
2014-12-01
Groundwater recharge variation in space and time is crucial for effective water management especially in arid/ semi-arid regions. In order to reveal comprehensive groundwater recharge processes in a catchment with a large topographical relief and seasonal hydrological variations, intensive field surveys were conducted at 4 times in different seasons in Wangkuai watershed, Taihang Mountains, which is a main groundwater recharge zone of North China Plain. The groundwater, spring, stream water and lake water were sampled, and inorganic solute constituents and stable isotopes of oxygen-18 and deuterium were determined on all water samples. Also, the stream flow rate was observed in stable state condition. The stable isotopic compositions, silica and bicarbonate concentrations in the groundwater show close values as those in the surface water, suggesting main groundwater recharge occurs from surface water at mountain-plain transitional zone throughout a year. Also, the deuterium and oxgen-18 in the Wangkuai reservoir and the groundwater in the vicinity of the reservoir show higher values, suggesting the reservoir water, affected by evaporation effect, seems to have an important role for the groundwater recharge in alluvial plain. For specifying the groundwater recharge area and quantifying groundwater recharge rate from the reservoir, an inversion analysis and a simple mixing model were applied in Wangkuai watershed using stable isotopes of oxygen-18 and deuterium. The model results show that groundwater recharge occurs dominantly at the altitude from 357 m to 738 m corresponding to mountain-plain transitional zone, and groundwater recharge rate by Wangkuai reservoir is estimated to be 2.4 % of total groundwater recharge in Wangkuai watershed.
NASA Astrophysics Data System (ADS)
Sakakibara, Koichi; Tsujimura, Maki; Song, Xianfang; Zhang, Jie
2014-05-01
Groundwater recharge is a crucial hydrological process for effective water management especially in arid/ semi-arid regions. However, the insufficient number of specific research regarding groundwater recharge process has been reported previously. Intensive field surveys were conducted during rainy season, mid dry season, and end of dry season, in order to clarify comprehensive groundwater recharge and flow regime of Wangkuai watershed in a headwater, which is a main recharge zone of North China Plain. The groundwater, spring, stream water and lake water were sampled, and inorganic solute constituents and stable isotopes of oxygen 18 and deuterium were determined on all water samples. Also the stream flow rate was observed. The solute ion concentrations and stable isotopic compositions show that the most water of this region can be characterized by Ca-HCO3 type and the main water source is precipitation which is affected by altitude effect of stable isotopes. In addition, the river and reservoir of the area seem to recharge the groundwater during rainy season, whereas interaction between surface water and groundwater does not become dominant gradually after the rainy season. The inversion analysis applied in Wangkuai watershed using simple mixing model represents an existing multi-flow systems which shows a distinctive tracer signal and flow rate. In summary, the groundwater recharged at different locations in the upper stream of Wangkuai reservoir flows downward to alluvial fan with a certain amount of mixing together, also the surface water recharges certainly the groundwater in alluvial plain in the rainy season.
Temporal dynamics in dominant runoff sources and flow paths in the Andean Páramo
NASA Astrophysics Data System (ADS)
Correa, Alicia; Windhorst, David; Tetzlaff, Doerthe; Crespo, Patricio; Célleri, Rolando; Feyen, Jan; Breuer, Lutz
2017-07-01
The relative importance of catchment's water provenance and flow paths varies in space and time, complicating the conceptualization of the rainfall-runoff responses. We assessed the temporal dynamics in source areas, flow paths, and age by End Member Mixing Analysis (EMMA), hydrograph separation, and Inverse Transit Time Proxies (ITTPs) estimation within a headwater catchment in the Ecuadorian Andes. Twenty-two solutes, stable isotopes, pH, and electrical conductivity from a stream and 12 potential sources were analyzed. Four end-members were required to satisfactorily represent the hydrological system, i.e., rainfall, spring water, and water from the bottom layers of Histosols and Andosols. Water from Histosols in and near the riparian zone was the highest source contributor to runoff throughout the year (39% for the drier season, 45% for the wetter season), highlighting the importance of the water that is stored in the riparian zone. Spring water contributions to streamflow tripled during the drier season, as evidenced by geochemical signatures that are consistent with deeper flow paths rather than shallow interflow through Andosols. Rainfall exhibited low seasonal variation in this contribution. Hydrograph separation revealed that 94% and 84% is preevent water in the drier and wetter seasons, respectively. From low-flow to high-flow conditions, all the sources increased their contribution except spring water. The relative age of stream water decreased during wetter periods, when the contributing area of the riparian zone expands. The multimethod and multitracer approach enabled to closely study the interchanging importance of flow processes and water source dynamics from an interannual perspective.
The mysteries of remote memory.
Albo, Zimbul; Gräff, Johannes
2018-03-19
Long-lasting memories form the basis of our identity as individuals and lie central in shaping future behaviours that guide survival. Surprisingly, however, our current knowledge of how such memories are stored in the brain and retrieved, as well as the dynamics of the circuits involved, remains scarce despite seminal technical and experimental breakthroughs in recent years. Traditionally, it has been proposed that, over time, information initially learnt in the hippocampus is stored in distributed cortical networks. This process-the standard theory of memory consolidation-would stabilize the newly encoded information into a lasting memory, become independent of the hippocampus, and remain essentially unmodifiable throughout the lifetime of the individual. In recent years, several pieces of evidence have started to challenge this view and indicate that long-lasting memories might already ab ovo be encoded, and subsequently stored in distributed cortical networks, akin to the multiple trace theory of memory consolidation. In this review, we summarize these recent findings and attempt to identify the biologically plausible mechanisms based on which a contextual memory becomes remote by integrating different levels of analysis: from neural circuits to cell ensembles across synaptic remodelling and epigenetic modifications. From these studies, remote memory formation and maintenance appear to occur through a multi-trace, dynamic and integrative cellular process ranging from the synapse to the nucleus, and represent an exciting field of research primed to change quickly as new experimental evidence emerges.This article is part of a discussion meeting issue 'Of mice and mental health: facilitating dialogue between basic and clinical neuroscientists'. © 2018 The Authors.
NASA Astrophysics Data System (ADS)
Saffer, Demian M.; Kopf, Achim J.
2016-12-01
At many subduction zones, pore water geochemical anomalies at seafloor seeps and in shallow boreholes indicate fluid flow and chemical transport from depths of several kilometers. Identifying the source regions for these fluids is essential toward quantifying flow pathways and volatile fluxes through fore arcs, and in understanding their connection to the loci of excess pore pressure at depth. Here we develop a model to track the coupled effects of boron desorption, smectite dehydration, and progressive consolidation within sediment at the top of the subducting slab, where such deep fluid signals likely originate. Our analysis demonstrates that the relative timing of heating and consolidation is a dominant control on pore water composition. For cold slabs, pore water freshening is maximized because dehydration releases bound water into low porosity sediment, whereas boron concentrations and isotopic signatures are modest because desorption is strongly sensitive to temperature and is only partially complete. For warmer slabs, freshening is smaller, because dehydration occurs earlier and into larger porosities, but the boron signatures are larger. The former scenario is typical of nonaccretionary margins where insulating sediment on the subducting plate is commonly thin. This result provides a quantitative explanation for the global observation that signatures of deeply sourced fluids are generally strongest at nonaccretionary margins. Application of our multitracer approach to the Costa Rica, N. Japan, N. Barbados, and Mediterranean Ridge subduction zones illustrates that desorption and dehydration are viable explanations for observed geochemical signals, and suggest updip fluid migration from these source regions over tens of km.
The mysteries of remote memory
2018-01-01
Long-lasting memories form the basis of our identity as individuals and lie central in shaping future behaviours that guide survival. Surprisingly, however, our current knowledge of how such memories are stored in the brain and retrieved, as well as the dynamics of the circuits involved, remains scarce despite seminal technical and experimental breakthroughs in recent years. Traditionally, it has been proposed that, over time, information initially learnt in the hippocampus is stored in distributed cortical networks. This process—the standard theory of memory consolidation—would stabilize the newly encoded information into a lasting memory, become independent of the hippocampus, and remain essentially unmodifiable throughout the lifetime of the individual. In recent years, several pieces of evidence have started to challenge this view and indicate that long-lasting memories might already ab ovo be encoded, and subsequently stored in distributed cortical networks, akin to the multiple trace theory of memory consolidation. In this review, we summarize these recent findings and attempt to identify the biologically plausible mechanisms based on which a contextual memory becomes remote by integrating different levels of analysis: from neural circuits to cell ensembles across synaptic remodelling and epigenetic modifications. From these studies, remote memory formation and maintenance appear to occur through a multi-trace, dynamic and integrative cellular process ranging from the synapse to the nucleus, and represent an exciting field of research primed to change quickly as new experimental evidence emerges. This article is part of a discussion meeting issue ‘Of mice and mental health: facilitating dialogue between basic and clinical neuroscientists’. PMID:29352028
Colour-dressed hexagon tessellations for correlation functions and non-planar corrections
NASA Astrophysics Data System (ADS)
Eden, Burkhard; Jiang, Yunfeng; le Plat, Dennis; Sfondrini, Alessandro
2018-02-01
We continue the study of four-point correlation functions by the hexagon tessellation approach initiated in [38] and [39]. We consider planar tree-level correlation functions in N=4 supersymmetric Yang-Mills theory involving two non-protected operators. We find that, in order to reproduce the field theory result, it is necessary to include SU( N) colour factors in the hexagon formalism; moreover, we find that the hexagon approach as it stands is naturally tailored to the single-trace part of correlation functions, and does not account for multi-trace admixtures. We discuss how to compute correlators involving double-trace operators, as well as more general 1 /N effects; in particular we compute the whole next-to-leading order in the large- N expansion of tree-level BMN two-point functions by tessellating a torus with punctures. Finally, we turn to the issue of "wrapping", Lüscher-like corrections. We show that SU( N) colour-dressing reproduces an earlier empirical rule for incorporating single-magnon wrapping, and we provide a direct interpretation of such wrapping processes in terms of N=2 supersymmetric Feynman diagrams.
NASA Astrophysics Data System (ADS)
Pasten-Zapata, Ernesto; Ledesma-Ruiz, Rogelio; Ramirez, Aldo; Harter, Thomas; Mahlknecht, Jürgen
2014-05-01
To effectively manage groundwater quality it is essential to understand sources of contamination and underground processes. The objective of the study was to identify sources and fate of nitrate pollution occurring in an aquifer underneath a sub-humid to humid region in NE Mexico which provides 10% of national citrus production. Nitrate isotopes and halide ratios were applied to understand nitrate sources and transformations in relation to land use/land cover. It was found that the study area is subject to diverse nitrate sources including organic waste and wastewater, synthetic fertilizers and soil processes. Animal manure and sewage from septic tanks were the causes of groundwater nitrate pollution within orchards and vegetable agriculture. Dairy activities within a radius of 1,000m from a sampling point increased nitrate pollution. Leachates from septic tanks incited nitrate pollution in residential areas. Soil nitrogen and animal waste were the sources of nitrate in groundwater under shrubland and grassland. Partial denitrification processes were evidenced. The denitrification process helped to attenuate nitrate concentration in the agricultural lands and grassland particularly during summer months.
Holograms of a dynamical top quark
NASA Astrophysics Data System (ADS)
Clemens, Will; Evans, Nick; Scott, Marc
2017-09-01
We present holographic descriptions of dynamical electroweak symmetry breaking models that incorporate the top mass generation mechanism. The models allow computation of the spectrum in the presence of large anomalous dimensions due to walking and strong Nambu-Jona-Lasinio interactions. Technicolor and QCD dynamics are described by the bottom-up Dynamic AdS/QCD model for arbitrary gauge groups and numbers of quark flavors. An assumption about the running of the anomalous dimension of the quark bilinear operator is input, and the model then predicts the spectrum and decay constants for the mesons. We add Nambu-Jona-Lasinio interactions responsible for flavor physics from extended technicolor, top-color, etc., using Witten's multitrace prescription. We show the key behaviors of a top condensation model can be reproduced. We study generation of the top mass in (walking) one doublet and one family technicolor models and with strong extended technicolor interactions. The models clearly reveal the tensions between the large top mass and precision data for δ ρ . The necessary tunings needed to generate a model compatible with precision constraints are simply demonstrated.
[Selenium deficiency in an organic extensive water buffalo farm].
Große, Reinhard; Binici, Cagri; Pieper, Robert; Müller, Kerstin E
2018-06-01
This case report presents investigations of muscle problems in three male water buffaloes (1-2 years) kept extensively (loose housing, pasture). The bulls were presented because of listlessness and increased lying periods. They displayed difficulties to stand up, a stilted gait, and tremor in the legs. The determination of the selenium concentration by the measurement of glutathione peroxidase activity in whole blood samples (EDTA) demonstrated selenium deficiency in all three buffaloes. This confirmed the tentative diagnosis of nutritive myodystrophy due to selenium deficiency. Following a single injection of 1500 mg all-rac-alpha-tocopherol acetate and 11 mg sodium selenite, the bulls recovered clinically. The whole blood samples taken subsequently from seven adult water buffaloes on the farm showed selenium deficiency in all animals. Consequently, slow-release multi-trace element boluses were administered once orally - as far as possible - to all adult animals of the herd. After 1 year, a good to very good selenium supply was observed in all these buffaloes, except for one cow, in which bolus application had failed. Schattauer GmbH.
Ground water pollution by roof runoff infiltration evidenced with multi-tracer experiments.
Ammann, Adrian A; Hoehn, Eduard; Koch, Sabine
2003-03-01
The infiltration of urban roof runoff into well permeable subsurface material may have adverse effects on the ground water quality and endanger drinking water resources. Precipitation water from three different roofs of an industrial complex was channelled to a pit and infiltrated into a perialpine glaciofluvial gravel-and-sand aquifer. A shaft was constructed at the bottom of the pit and equipped with an array of TDR probes, lysimeters and suction cups that allowed measuring and sampling soil water at different depths. A fast infiltration flow was observed during natural rainfall events and during artificial infiltration experiments. For a better understanding of the behaviour of contaminants, experiments were conducted with cocktails of compounds of different reactivity (ammonium, strontium, atratone) and of non-reactive tracers (uranine, bromide, naphthionate), which represent different classes of pollutants. The experiment identified cation exchange reactions influencing the composition of the infiltrating water. These processes occurred under preferential flow conditions in macropores of the material. Measuring concentration changes under the controlled inflow of tracer experiments, the pollution potential was found to be high. Non-reactive tracers exhibited fast breakthrough and little sorption.
Multimodal Imaging of Alzheimer Pathophysiology in the Brain's Default Mode Network
Shin, Jonghan; Kepe, Vladimir; Small, Gary W.; ...
2011-01-01
The spatial correlations between the brain's default mode network (DMN) and the brain regions known to develop pathophysiology in Alzheimer's disease (AD) have recently attracted much attention. In this paper, we compare results of different functional and structural imaging modalities, including MRI and PET, and highlight different patterns of anomalies observed within the DMN. Multitracer PET imaging in subjects with and without dementia has demonstrated that [C-11]PIB- and [F-18]FDDNP-binding patterns in patients with AD overlap within nodes of the brain's default network including the prefrontal, lateral parietal, lateral temporal, and posterior cingulate cortices, with the exception of the medial temporalmore » cortex (especially, the hippocampus) where significant discrepancy between increased [F-18]FDDNP binding and negligible [C-11]PIB-binding was observed. [F-18]FDDNP binding in the medial temporal cortex—a key constituent of the DMN—coincides with both the presence of amyloid and tau pathology, and also with cortical areas with maximal atrophy as demonstrated by T1-weighted MR imaging of AD patients.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, K.A.; Neumann, A.C.; Haddad, R.I.
The stable-isotope composition ({delta}{sup 13}C) of total organic carbon (TOC) was measured as a function of depth throughout a 217-cm-thick sequence of Holocene carbonate sediment within the Bight of Abaco lagoon, Little Bahama Bank. Biofacies and lithofacies analyses indicate progressive banktop submergence and paleoenvironmental response during Holocene sea-level rise. Stable-isotope values shift markedly from {minus}27.7{per thousand} within the 7900 B.P. paleosol at the base of the core to {minus}11.1{per thousand} at the present-day sediment-water interface. An abrupt excursion toward heavy-isotope values records the first establishment of Thalassia seagrass upon open-marine flooding. A multitracer approach, combining biofacies, lithofacies, and stable-isotope analysismore » of TOC confirms that the dramatic +17{per thousand} shift observed in {delta}{sup 13}C was a direct result of sea-level rise and associated environmental changes over the banktop; there is little evidence of spurious diagenetic overprint. Stable-isotope analyses of organic carbon may enhance the reconstruction of carbonate sequences by revealing a distinctive geochemical signature of banktop flooding, including the onset of growth of otherwise unpreservable Thalassia seagrass.« less
NASA Astrophysics Data System (ADS)
Shiel, Alyssa E.; Weis, Dominique; Orians, Kristin J.
2012-01-01
Environmental monitoring and remediation require techniques to identify the source and fate of metals emissions. The measurement of heavy metal isotopic signatures, made possible by the advent of the MC-ICP-MS, is a powerful new geochemical tool, which may be used to trace the source of these metals in the environment. In a multi-tracer study, Cd, Zn and Pb isotopic compositions (MC-ICP-MS) and elemental concentrations (HR-ICP-MS) are used to distinguish between natural and anthropogenic sources of these metals in bivalves collected from western Canada (British Columbia), Hawaii, and the USA East Coast. Variability in the δ 114/110Cd values of bivalves (-1.20‰ to -0.09‰) is attributed to differences in the relative contributions of Cd from natural and anthropogenic sources between sites. Cadmium isotopic compositions (δ 114/110Cd = -0.69‰ to -0.09‰) identify high Cd levels in B.C. oysters as primarily natural (i.e., upwelling of Cd rich intermediate waters in the North Pacific), with some variability attributed to anthropogenic sources (e.g., mining and smelting). Variability in the δ 66/64Zn values exhibited by the B.C. bivalves is relatively small (0.28-0.36‰). Despite the low Pb levels found in B.C. oysters, Pb isotopes are used to identify emissions from industrial processes and the consumption of unleaded gasoline and diesel fuel as significant metal sources. Although the Cd concentrations of the USA East Coast bivalves are primarily lower than those of B.C. oysters, their relatively light Cd isotopic compositions (δ 114/110Cd = -1.20‰ to -0.54‰) indicate the significance of anthropogenic Cd sources and are attributed to the high prevalence of industry on this coast. The δ 114/110Cd values of USA East Coast bivalves include the lightest ever reported, with the exception of values reported for extraterrestrial materials. In addition, the Pb isotopic compositions of bivalves from the USA East Coast indicate Pb emissions from the combustion of coal are an important source of Pb, consistent with the high consumption of coal for power production on this coast. This study demonstrates the effective use of Cd and Zn isotopes to trace anthropogenic sources in the environment and the benefit of combining these tools with Pb "fingerprinting" techniques.
NASA Astrophysics Data System (ADS)
Sardenne, Fany; Bodin, Nathalie; Chassot, Emmanuel; Amiel, Aurélien; Fouché, Edwin; Degroote, Maxime; Hollanda, Stéphanie; Pethybridge, Heidi; Lebreton, Benoit; Guillou, Gaël; Ménard, Frédéric
2016-08-01
This study examined the trophic ecology of three sympatric tropical tuna species (bigeye BET, skipjack SKJ, and yellowfin YFT) sampled in the Western Indian Ocean throughout 2013. Specifically we explored inter-specific resource partitioning and ontogenetic variability using neutral fatty acids and stable isotope analysis of liver and muscle from small (⩽100 cm fork length, FL) and large (>100 cm FL) tuna collected in mixed schools at the surface by purse-seine. Both biochemical tracers were used to calculate trophic niche indices that collectively revealed high potential for resource overlap, especially among small tuna. Resource overlap appeared strongest between BET and YFT, with SKJ tissues having high carbon isotope (δ13C) values (-17 ± 0.3‰), lower nitrogen isotope (δ15N) values (11.4 ± 0.6‰), and higher relative proportion of poly-unsaturated fatty acids (PUFA) than the two other species, indicating a different diet. Size was found to be a strong predictor for most biochemical tracers in the three species with δ13C, δ15N and total lipid content in the liver. In the larger species (YFT and BET), proportions of mono-unsaturated fatty acids typically increased with size, while quantities of PUFA decreased. In addition to ontogenetic variability, trophic markers were shown to vary between sampling area and season: higher lipid reserves and δ15N values, and lower δ13C values occurred during monsoon periods around Seychelles than in the Mozambique Channel (parted from about 1500 km). Our multi-tracer approach reveals the magnitude of potential competitive interactions in mixed tropical tuna schools at both small and large sizes and demonstrates that ontogenetic niche differentiation acts as a major factor of coexistence in tropical tuna.
NASA Astrophysics Data System (ADS)
Priestley, Stacey C.; Wohling, Daniel L.; Keppel, Mark N.; Post, Vincent E. A.; Love, Andrew J.; Shand, Paul; Tyroller, Lina; Kipfer, Rolf
2017-11-01
The investigation of regionally extensive groundwater systems in remote areas is hindered by a shortage of data due to a sparse observation network, which limits our understanding of the hydrogeological processes in arid regions. The study used a multidisciplinary approach to determine hydraulic connectivity between the Great Artesian Basin (GAB) and the underlying Arckaringa Basin in the desert region of Central Australia. In order to manage the impacts of groundwater abstraction from the Arckaringa Basin, it is vital to understand its connectivity with the GAB (upper aquifer), as the latter supports local pastoral stations and groundwater-dependent springs with unique endemic flora and fauna. The study is based on the collation of available geological information, a detailed analysis of hydraulic data, and data on environmental tracers. Enhanced inter-aquifer leakage in the centre of the study area was identified, as well as recharge to the GAB from ephemeral rivers and waterholes. Throughout the rest of the study area, inter-aquifer leakage is likely controlled by diffuse inter-aquifer leakage, but the coarse spatial resolution means that the presence of additional enhanced inter-aquifer leakage sites cannot be excluded. This study makes the case that a multi-tracer approach along with groundwater hydraulics and geology provides a tool-set to investigate enhanced inter-aquifer leakage even in a groundwater basin with a paucity of data. A particular problem encountered in this study was the ambiguous interpretation of different age tracers, which is attributed to diffusive transport across flow paths caused by low recharge rates.
Runoff sources and flow paths dynamics in the Andean Páramo.
NASA Astrophysics Data System (ADS)
Correa, Alicia; Windhorst, David; Tetzlaff, Doerthe; Silva, Camila; Crespo, Patricio; Celleri, Rolando; Feyen, Jan; Breuer, Lutz
2017-04-01
The dynamics of runoff sources and flow paths in headwater catchments are still poorly understood. This is even more the case for remote areas such as the Páramo (Alpine grasslands) in the Andes, where these ecosystems act as water towers for a large fraction of the society. Temporal dynamics in water source areas, flow paths and relative age were assessed in a small catchment in the Ecuadorian Andes using data from the Zhurucay Ecohydrological Observatory (7.53 km2). We applied End Member Mixing Analysis, Hydrograph Separation and Inverse Transit Time Proxies to a multi-tracer set of solutes, stable isotopes, pH and electrical conductivity sampled from stream and twelve potential sources during two years. Rainfall, spring water and water from the bottom layers of Histosols (located at the foot of the hillslopes and in the riparian zone) and Andosols (located at the hillslopes) represented the dominant sources for runoff generation. Water coming from Histosols was the main contributor to stream water year-round, in line with a hydrological system that is dominated by pre-event water. Rainfall presented a uniform contribution during the year, while in drier conditions the spring water tripled in contribution. In wetter conditions, the relative age of stream water decreases, when the contributing area of the riparian zone expands, increasing the connectivity with lateral flow from hillslopes to the channel network. Being one of the earliest in the region, this multi-method study improved the understanding of the hydrological processes of headwater catchments and allowed to demonstrate that catchments with relatively homogeneous hydro-climatic conditions are characterized by inter-annual varying source contributions.
NASA Astrophysics Data System (ADS)
Feng, Xiaojuan; Gustafsson, Örjan; Holmes, R. Max; Vonk, Jorien E.; van Dongen, Bart E.; Semiletov, Igor P.; Dudarev, Oleg V.; Yunker, Mark B.; Macdonald, Robie W.; Wacker, Lukas; Montluçon, Daniel B.; Eglinton, Timothy I.
2015-11-01
Distinguishing the sources, ages, and fate of various terrestrial organic carbon (OC) pools mobilized from heterogeneous Arctic landscapes is key to assessing climatic impacts on the fluvial release of carbon from permafrost. Through molecular 14C measurements, including novel analyses of suberin- and/or cutin-derived diacids (DAs) and hydroxy fatty acids (FAs), we compared the radiocarbon characteristics of a comprehensive suite of terrestrial markers (including plant wax lipids, cutin, suberin, lignin, and hydroxy phenols) in the sedimentary particles from nine major arctic and subarctic rivers in order to establish a benchmark assessment of the mobilization patterns of terrestrial OC pools across the pan-Arctic. Terrestrial lipids, including suberin-derived longer-chain DAs (C24,26,28), plant wax FAs (C24,26,28), and n-alkanes (C27,29,31), incorporated significant inputs of aged carbon, presumably from deeper soil horizons. Mobilization and translocation of these "old" terrestrial carbon components was dependent on nonlinear processes associated with permafrost distributions. By contrast, shorter-chain (C16,18) DAs and lignin phenols (as well as hydroxy phenols in rivers outside eastern Eurasian Arctic) were much more enriched in 14C, suggesting incorporation of relatively young carbon supplied by runoff processes from recent vegetation debris and surface layers. Furthermore, the radiocarbon content of terrestrial markers is heavily influenced by specific OC sources and degradation status. Overall, multitracer molecular 14C analysis sheds new light on the mobilization of terrestrial OC from arctic watersheds. Our findings of distinct ages for various terrestrial carbon components may aid in elucidating fate of different terrestrial OC pools in the face of increasing arctic permafrost thaw.
NASA Astrophysics Data System (ADS)
Willem Foppen, Jan; Bogaard, Thom; van Osnabrugge, Bart; Puddu, Michela; Grass, Robert
2015-04-01
With tracer experiments, knowledge on solute transport, travel times, flow pathways, source areas, and linkages between infiltration and exfiltration zones in subsurface hydrological studies can be obtained. To overcome the well-known limitations of artificial tracers, we report here the development and application of an inexpensive method to produce large quantities of environmentally friendly 150-200 nm microparticles composed of a magnetite core to which small fragments of synthetic 80 nt ssDNA were adsorbed, which were then covered by a layer of inert silica (acronym: SiDNAMag). The main advantages of using DNA are the theoretically unlimited amount of different DNA tracers and the low DNA detection limit using the quantitative polymerase chain reaction (qPCR); the main advantage of the silica layer is to prevent DNA decay, while the magnetite core facilitates magnetic separation, recovery and up-concentration. In 10 cm columns of saturated quartz sand, we first injected NaCl, a conservative salt tracer, and measured the breakthrough. Then, we injected SiDNAMag suspended in water of known composition, harvested the SiDNAMag in column effluent samples, and measured the DNA concentration via qPCR after dissolving the SiDNAMag. The results indicated that the timing of the rising limb of the DNA breakthrough curve, the plateau phase and the falling limb were identical to the NaCl breakthrough curve. However, the relative maximum DNA concentration reached during the plateau phase was around 0.3, indicating that around 70% of the SiDNAMag mass was retained in the column. From these results we inferred that SiDNAMag was not retarded and therefore not subject to equilibrium sorption. Instead, first order irreversible kinetic attachment appeared to be the dominant retention mechanism. Based on our results, we speculate that, despite significant retention, due to the low DNA detection limit and the possibility of magnetic up-concentration, the use of SiDNAMag is a very promising technique to determine complex flow patterns, travel times, and flow pathways in many different subsurface hydrological applications.
NASA Astrophysics Data System (ADS)
Sakakibara, Koichi; Tsujimura, Maki; Song, Xianfang; Zhang, Jie
2017-02-01
Groundwater recharge variations in time and space are crucial for effective water management, especially in low-precipitation regions. To determine comprehensive groundwater recharge processes in a catchment with large seasonal hydrological variations, intensive field surveys were conducted in the Wangkuai Reservoir watershed located in the Taihang Mountains, North China, during three different times of the year: beginning of the rainy season (June 2011), mid-rainy season (August 2012), and dry season (November 2012). Oxygen and hydrogen isotope and chemical analyses were conducted on the groundwater, spring water, stream water, and reservoir water of the Wangkuai Reservoir watershed. The results were processed using endmember mixing analysis to determine the amount of contribution of the groundwater recharging processes. Similar isotopic and chemical signatures between the surface water and groundwater in the target area indicate that the surface water in the mountain-plain transitional area and the Wangkuai Reservoir are the principal groundwater recharge sources, which result from the highly permeable geological structure of the target area and perennial large-scale surface water, respectively. Additionally, the widespread and significant effect of the diffuse groundwater recharge on the Wangkuai Reservoir was confirmed with the deuterium (d) excess indicator and the high contribution throughout the year, calculated using endmember mixing analysis. Conversely, the contribution of the stream water to the groundwater recharge in the mountain-plain transitional area clearly decreases from the beginning of the rainy season to the mid-rainy season, whereas that of the precipitation increases. This suggests that the main groundwater recharge source shifts from stream water to episodic/continuous heavy precipitation in the mid-rainy season. In other words, the surface water and precipitation commonly affect the groundwater recharge in the rainy season, whereas the reservoir and stream water play important roles in the groundwater recharge in the low-precipitation period. The results should contribute not only to the understanding of the mountain hydrology but also to groundwater resource management in the North China Plain.
A preliminary assessment of sources of nitrate in springwaters, Suwannee River basin, Florida
Katz, B.G.; Hornsby, H.D.
1998-01-01
A cooperative study between the Suwannee River Water Management District (SRWMD) and the U.S. Geological Survey (USGS) is evaluating sources of nitrate in water from selected springs and zones in the Upper Floridan aquifer in the Suwannee River Basin. A multi-tracer approach, which consists of the analysis of water samples for naturally occurring chemical and isotopic indicators, is being used to better understand sources and chronology of nitrate contamination in the middle Suwannee River region. In July and August 1997, water samples were collected and analyzed from six springs and two wells for major ions, nutrients, and dissolved organic carbon. These water samples also were analyzed for environmental isotopes [18O/16O, D/H, 13C/12C, 15N/14N] to determine sources of water and nitrate. Chlorofluorocarbons (CCl3F, CCl2F2, and C2Cl3F3) and tritium (3H) were analyzed to assess the apparent ages (residence time) of springwaters and water from the Upper Floridan aquifer. Delta 15N-NO3 values in water from the six springs range from 3.94 per mil (Little River Springs) to 8.39 per mil (Lafayette Blue Spring). The range of values indicates that nitrate in the sampled springwaters most likely originates from a mixture of inorganic (fertilizers) and organic (animal wastes) sources, although the higher delta 15N-NO3 value for Lafayette Blue Spring indicates that an organic source of nitrogen is likely at this site. Water samples from the two wells sampled in Lafayette County have high delta 15N-NO3 values of 10.98 and 12.1 per mil, indicating the likelihood of an organic source of nitrate. These two wells are located near dairy and poultry farms, where leachate from animal wastes may contribute nitrate to ground water. Based on analysis of chlorofluorocarbons in ground water, the mean residence time of water in springs ranges from about 12 to 25 years. Chlorofluorocarbons-modeled recharge dates for water samples from the two shallow zones in the Upper Floridan aquifer range from 1985 to 1989.
NASA Astrophysics Data System (ADS)
Land, Lewis; Huff, G. F.
2010-03-01
Several natural and anthropogenic tracers have been used to evaluate groundwater residence time within a karstic limestone aquifer in southeastern New Mexico, USA. Natural groundwater discharge occurs in the lower Pecos Valley from a region of karst springs, wetlands and sinkhole lakes at Bitter Lakes National Wildlife Refuge, on the northeast margin of the Roswell Artesian Basin. The springs and sinkholes are formed in gypsum bedrock that serves as a leaky confining unit for an artesian aquifer in the underlying San Andres limestone. Because wetlands on the Refuge provide habitat for threatened and endangered species, there is concern about the potential for contamination by anthropogenic activity in the aquifer recharge area. Estimates of the time required for groundwater to travel through the artesian aquifer vary widely because of uncertainties regarding karst conduit flow. A better understanding of groundwater residence time is required to make informed decisions about management of water resources and wildlife habitat at Bitter Lakes. Results indicate that the artesian aquifer contains a significant component of water recharged within the last 10-50 years, combined with pre-modern groundwater originating from deeper underlying aquifers, some of which may be indirectly sourced from the high Sacramento Mountains to the west.
NASA Astrophysics Data System (ADS)
Land, L. A.; Huff, R.
2009-12-01
Several natural and anthropogenic tracers are used to evaluate groundwater residence time within the karstic limestone aquifer of the Roswell Artesian Basin, southeastern New Mexico, USA. Natural groundwater discharge occurs in the lower Pecos Valley from a region of karst springs, wetlands and sinkhole lakes at Bitter Lakes National Wildlife Refuge. The springs and sinkholes are formed in gypsum bedrock that serves as a leaky confining unit for an artesian aquifer in the underlying San Andres limestone. Because wetlands on the Refuge provide habitat for a number of threatened and endangered species, Refuge managers have expressed concern about the potential for contamination by anthropogenic activity in the aquifer recharge area. Estimates of the time required for groundwater to travel through the artesian aquifer vary widely because of uncertainties regarding the role of karst conduit flow. A better understanding of groundwater residence time is thus required to make informed decisions about management of water resources and wildlife habitat at Bitter Lakes. Results of tracer investigations indicate that the artesian aquifer contains a significant component of water recharged within the last 10 to 50 years, combined with pre-modern groundwater originating from deeper underlying aquifers, some of which may be indirectly sourced from the high Sacramento Mountains to the west.
Pastén-Zapata, Ernesto; Ledesma-Ruiz, Rogelio; Harter, Thomas; Ramírez, Aldo I; Mahlknecht, Jürgen
2014-02-01
Nitrate isotopic values are often used as a tool to understand sources of contamination in order to effectively manage groundwater quality. However, recent literature describes that biogeochemical reactions may modify these values. Therefore, data interpretation is difficult and often vague. We provide a discussion on this topic and complement the study using halides as comparative tracers assessing an aquifer underneath a sub-humid to humid region in NE Mexico. Hydrogeological information and stable water isotopes indicate that active groundwater recharge occurs in the 8000km(2) study area under present-day climatic and hydrologic conditions. Nitrate isotopes and halide ratios indicate a diverse mix of nitrate sources and transformations. Nitrate sources include organic waste and wastewater, synthetic fertilizers and soil processes. Animal manure and sewage from septic tanks were the causes of groundwater nitrate pollution within orchards and vegetable agriculture. Dairy activities within a radius of 1,000 m from a sampling point significantly contributed to nitrate pollution. Leachates from septic tanks caused nitrate pollution in residential areas. Soil nitrogen and animal waste were the sources of nitrate in groundwater under shrubland and grassland. Partial denitrification processes helped to attenuate nitrate concentration underneath agricultural lands and grassland, especially during summer months. © 2013. Published by Elsevier B.V. All rights reserved.
Rhodamine-WT dye losses in a mountain stream environment
Bencala, Kenneth E.; Rathburn, Ronald E.; Jackman, Alan P.; Kennedy, Vance C.; Zellweger, Gary W.; Avanzino, Ronald J.
1983-01-01
A significant fraction of rhodamine WT dye was lost during a short term multitracer injection experiment in a mountain stream environment. The conservative anion chloride and the sorbing cation lithium were concurrently injected. In-stream rhodamine WT concentrations were as low as 45 percent of that expected, based on chloride data. Concentration data were available from shallow‘wells’dug near the stream course and from a seep of suspected return flow. Both rhodamine WT dye and lithium were nonconservative with respect to the conservative chloride, with rhodamine WT dye closely following the behavior of the sorbing lithium.Nonsorption and sorption mechanisms for rhodamine WT loss in a mountain stream were evaluated in laboratory experiments. Experiments evaluating nonsorption losses indicated minimal losses by such mechanisms. Laboratory experiments using sand and gravel size streambed sediments show an appreciable capacity for rhodamine WT sorption.The detection of tracers in the shallow wells and seep indicates interaction between the stream and the flow in the surrounding subsurface, intergravel water, system. The injected tracers had ample opportunity for intimate contact with materials shown in the laboratory experiments to be potentially sorptive. It is suggested that in the study stream system, interaction with streambed gravel was a significant mechanism for the attenuation of rhodamine WT dye (relative to chloride).
Addressing the Sustainability of Groundwater Extraction in California Using Hydrochronology
NASA Astrophysics Data System (ADS)
Moran, J. E.; Visser, A.; Singleton, M. J.; Esser, B. K.
2017-12-01
In urban and agricultural settings in California, intense pressure on water supplies has led to extensive managed aquifer recharge and extensive overdraft in these areas, respectively. The California Sustainable Groundwater Management Act (SGMA) includes criteria for pumping that maintains groundwater levels and basin storage, and avoids stream depletion and degradation of water quality. Most sustainability plans will likely use water level monitoring and water budget balancing based on integrated flow models as evidence of compliance. However, hydrochronology data are applicable to several of the criteria, and provide an independent method of addressing questions related to basin turnover time, recharge rate, surface water-groundwater interaction, and the age distribution at pumping wells. We have applied hydrochronology (mainly tritium-helium groundwater age dating and extrinsic tracers) in urban areas to delineate flowpaths of artificially recharged water, to identify stagnant zones bypassed by the engineered flow system, and to predict vulnerability of drinking water sources to contamination. In agricultural areas, we have applied multi-tracer hydrochronology to delineate groundwater stratigraphy, to identify paleowater, and to project future nitrate concentrations in long-screened wells. This presentation will describe examples in which groundwater dating and other tracer methods can be applied to directly address the SGMA criteria for sustainable groundwater pumping.
Review of Copper Provision in the Parenteral Nutrition of Adults [Formula: see text].
Livingstone, Callum
2017-04-01
The essential trace element copper (Cu) is required for a range of physiologic processes, including wound healing and functioning of the immune system. The correct amount of Cu must be provided in parenteral nutrition (PN) if deficiency and toxicity are to be avoided. While provision in line with the standard recommendations should suffice for most patients, Cu requirements may be higher in patients with increased gastrointestinal losses and severe burns and lower in those with cholestasis. The tests of Cu status that are currently available for clinical use are unreliable. Serum Cu concentration is the most commonly ordered test but is insensitive to Cu deficiency and toxicity and is misleadingly increased during the acute phase response. These limitations make it difficult for prescribers to assess Cu status and to decide how much Cu to provide. There is a need for better tests of Cu status to be developed to decrease uncertainty and improve individualization of Cu dosing. More information is needed on Cu requirements in disease and Cu contamination of PN components and other intravenous fluids. New multi-trace element products should be developed that provide Cu doses in line with the 2012 American Society for Parenteral and Enteral Nutrition recommendations. This article discusses the evaluation and treatment of Cu deficiency and toxicity in patients treated with PN.
NASA Astrophysics Data System (ADS)
Wilske, Cornelia; Rödiger, Tino; Suckow, Axel; Geyer, Stefan; Weise, Stephan; Merchel, Silke; Rugel, Georg; Pavetich, Stefan; Merkel, Broder; Siebert, Christian
2017-04-01
The water supply in semi-arid Israel and Palestine, predominantly relies on groundwater as freshwater resource, stressed by increasing demand and low recharge rates. Sustainable management of such resources requires a sound understanding of its groundwater migration through space and time, particularly in structurally complex multi-aquifer systems as the Eastern Mountain Aquifer, affected by salting. To differentiate between the flow paths of the different water bodies and their respective residence times, a multi-tracer approach, combining age dating isotopes (36Cl/Cl; 3H) with rock specific isotopes like 87Sr/86Sr and δ34S-SO4 was applied. As a result, the investigated groundwaters from the two Cretaceous aquifers and their respective flow paths are differentiable by e.g. their 87Sr/86Sr signatures, resembling the intensity of the rock-water interaction and hence indirectly residence times. In the discharge areas within the Jordan Valley and along the Dead Sea shore, δ34S-SO4 ratios reveal the different sources of salinity (ascending brines, interstitial brines and dissolved salts). Based on 36Cl and 3H and the atmospheric input functions, very heterogeneous infiltration times and effective flow velocities, respectively, indicate an at least dual porosity system, resulting in distinctly different regimes of matrix and pipe flow.
Long-term decay and possible reactivation of induced seismicity at the Basel EGS site
NASA Astrophysics Data System (ADS)
Kraft, Toni; Herrmann, Marcus; Karvounis, Dimitrios; Tormann, Thessa; Deichmann, Nicolas; Wiemer, Stefan
2016-04-01
In December 2006, an extensive fluid injection was carried out below the city of Basel, Switzerland, to stimulate a reservoir for an Enhanced Geothermal System (EGS). After six days of gradual increase of flow rate (and thus seismicity), a strongly felt ML3.4 earthquakes led to the immediate termination of the project. The well was opened subsequently and seismicity declined rapidly. The Basel EGS project might be an unsuccessful attempt in terms of energy supply, but a chance to advance the physical understanding of EGSs. The well-monitored and well-studied induced sequence allowed many new insights in terms of reservoir creation. A special observation in the nine years of monitoring is the revive of seismic activity six years after prolonged seismic decay. This renewed activity increase might relate to a gradual pressure increase due to the ultimate shut-in (closure) of the borehole about one year before. Until now, a detailed analysis of the long-term behaviour remained unexplored since a consistent catalogue did not exist. In the current study, we took advantage of the high waveform similarity within a seismic sequence and applied a multi-trace template-matching (i.e. cross-correlation) procedure to detect seismic events about one order of magnitude below the detection threshold. We detected about 100,000 events within the six-day long stimulation alone - previously, only 13,000 microearthquakes were detected. We only scanned the recordings of the deepest borehole station (2.7km). This station is very close to the 5km-deep reservoir and has the highest signal-to-noise ratio among all (borehole-)stations. Our newly obtained catalogue spans over more than nine years and features a uniform (and low) detection threshold and a uniform magnitude determination. The improved resolution of the long-term behaviour and the later seismicity increase will help to understand involved mechanisms better. More induced or natural sequences can be investigated with our procedure.
NASA Astrophysics Data System (ADS)
Dwivedi, R.; Meixner, T.; McIntosh, J. C.; Ferre, T. P. A.; Eastoe, C. J.; Minor, R. L.; Barron-Gafford, G.; Chorover, J.
2017-12-01
The composition of natural mountainous waters maintains important control over the water quality available to downstream users. Furthermore, the geochemical constituents of stream water in the mountainous catchments represent the result of the spatial and temporal evolution of critical zone structure and processes. A key problem is that high elevation catchments involve rugged terrain and are subject to extreme climate and landscape gradients; therefore, high density or high spatial resolution hydro-geochemical observations are rare. Despite such difficulties, the Santa Catalina Mountains Critical Zone Observatory (SCM-CZO), Tucson, AZ, generates long-term hydrogeochemical data for understanding not only hydrological processes and their seasonal characters, but also the geochemical impacts of such processes on streamflow chemical composition. Using existing instrumentation and hydrogeochemical observations from the last 9+ years (2009 through 2016 and an initial part of 2017), we employed a multi-tracer approach along with principal component analysis to identify water sources and their seasonal character. We used our results to inform hydrological process understanding (flow paths, residence times, and water sources) for our study site. Our results indicate that soil water is the largest contributor to streamflow, which is ephemeral in nature. Although a 3-dimensional mixing space involving precipitation, soil water, interflow, and deep groundwater end-members could explain most of the streamflow chemistry, geochemical complexity was observed to grow with catchment storage. In terms of processes and their seasonal character, we found soil water and interflow were the primary end-member contributors to streamflow in all seasons. Deep groundwater only contributes to streamflow at high catchment storage conditions, but it provides major ions such as Na, Mg, and Ca that are lacking in other water types. In this way, our results indicate that any future efforts aimed at explaining concentration-discharge behavior of our field site should consider at least three-dimensional mixing space or 4 end-members.
NASA Astrophysics Data System (ADS)
Siade, A. J.; Suckow, A. O.; Morris, R.; Raiber, M.; Prommer, H.
2017-12-01
The calibration of regional groundwater flow models, including those investigating coal-seam gas (CSG) impacts in the Surat Basin, Australia, are not typically constrained using environmental tracers, although the use of such data can potentially provide significant reductions in predictive uncertainties. These additional sources of information can also improve the conceptualisation of flow systems and the quantification of groundwater fluxes. In this study, new multi-tracer data (14C, 39Ar, 81Kr, and 36Cl) were collected for the eastern recharge areas of the basin and within the deeper Hutton and Precipice Sandstone formations to complement existing environmental tracer data. These data were used to better understand the recharge mechanisms, recharge rates and the hydraulic properties associated with deep aquifer systems in the Surat Basin. Together with newly acquired pressure data documenting the response to the large-scale reinjection of highly treated CSG co-produced water, the environmental tracer data helped to improve the conceptualisation of the aquifer system, forming the basis for a more robust quantification of the long-term impacts of CSG-related activities. An existing regional scale MODFLOW-USG groundwater flow model of the area was used as the basis for our analysis of existing and new observation data. A variety of surrogate modelling approaches were used to develop simplified models that focussed on the flow and transport behaviour of the deep aquifer systems. These surrogate models were able to represent sub-system behaviour in terms of flow, multi-environmental tracer transport and the observed large-scale hydrogeochemical patterns. The incorporation of the environmental tracer data into the modelling framework provide an improved understanding of the flow regimes of the deeper aquifer systems as well as valuable information on how to reduce uncertainties in hydraulic properties where there is little or no historical observations of hydraulic heads.
Air exchange rates and migration of VOCs in basements and residences
Du, Liuliu; Batterman, Stuart; Godwin, Christopher; Rowe, Zachary; Chin, Jo-Yu
2015-01-01
Basements can influence indoor air quality by affecting air exchange rates (AERs) and by the presence of emission sources of volatile organic compounds (VOCs) and other pollutants. We characterized VOC levels, AERs and interzonal flows between basements and occupied spaces in 74 residences in Detroit, Michigan. Flows were measured using a steady-state multi-tracer system, and 7-day VOC measurements were collected using passive samplers in both living areas and basements. A walkthrough survey/inspection was conducted in each residence. AERs in residences and basements averaged 0.51 and 1.52 h−1, respectively, and had strong and opposite seasonal trends, e.g., AERs were highest in residences during the summer, and highest in basements during the winter. Air flows from basements to occupied spaces also varied seasonally. VOC concentration distributions were right-skewed, e.g., 90th percentile benzene, toluene, naphthalene and limonene concentrations were 4.0, 19.1, 20.3 and 51.0 μg m−3, respectively; maximum concentrations were 54, 888, 1117 and 134 μg m−3. Identified VOC sources in basements included solvents, household cleaners, air fresheners, smoking, and gasoline-powered equipment. The number and type of potential VOC sources found in basements are significant and problematic, and may warrant advisories regarding the storage and use of potentially strong VOCs sources in basements. PMID:25601281
Cresson, Pierre; Bouchoucha, Marc; Morat, Fabien; Miralles, Francoise; Chavanon, Fabienne; Loizeau, Veronique; Cossa, Daniel
2015-11-01
Chemical contamination levels and stable isotope ratios provide integrated information about contaminant exposure, trophic position and also biological and environmental influences on marine organisms. By combining these approaches with otolith shape analyses, the aim of the present study was to document the spatial variability of Hg and PCB contamination of the European hake (Merluccius merluccius) in the French Mediterranean, hypothesizing that local contaminant sources, environmental conditions and biological specificities lead to site-specific contamination patterns. High Hg concentrations discriminated Corsica (average: 1.36 ± 0.80 μg g(-1) dm) from the Gulf of Lions (average values<0.5 μg g(-1) dm), where Rhône River input caused high PCB burdens. CB 153 average concentrations ranged between 4.00 ± 0.64 and 18.39 ± 12.38 ng g(-1) dm in the Gulf of Lions, whatever the sex of the individuals, whereas the highest values in Corsica were 6.75 ± 4.22 ng g(-1) dm. Otolith shape discriminated juveniles and adults, due to their different habitats. The use of combined ecotracers was revealed as a powerful tool to discriminate between fish populations at large and small spatial scale, and to enable understanding of the environmental and biological influences on contamination patterns. Copyright © 2015 Elsevier B.V. All rights reserved.
Abdollahi, E.; Kohram, H.; Shahir, M. H.; Nemati, M. H.
2015-01-01
Published data on the effects of ruminal bolus on the number of ovulatory follicles in ewes does not exist. The present study determined the effects of a ruminal bolus on trace element status, follicular dynamics and reproductive performance in ewes. Eighty Afshari cycling ewes were synchronized during breeding season using CIDR for 14 days and assigned to 4 groups (n=20); group 1 received a single Ferrobloc bolus four weeks prior to CIDR insertion following 400 IU eCG on CIDR removal, group 2 received two boluses four weeks prior to CIDR insertion following 400 IU eCG on CIDR removal, group 3 received only 400 IU eCG on CIDR removal and group 4 (control) received no bolus and no eCG. Transrectal ultrasonography was done to monitor the ovarian follicles on the day of CIDR removal and a day later. Results showed that boluses increased the status of copper, selenium and iodine on mating day and days 90 to 100 of gestation. Ruminal bolus did not significantly increase the number of different classes of ovarian follicles in ewes fed a diet meeting all trace mineral requirements. All ewes eventually became pregnant with 1 or 2 boluses but the multiple births rate (80%) was higher (P<0.05) after 2 boluses compared to the other groups. PMID:27175153
Log corrections to entropy of three dimensional black holes with soft hair
NASA Astrophysics Data System (ADS)
Grumiller, Daniel; Perez, Alfredo; Tempo, David; Troncoso, Ricardo
2017-08-01
We calculate log corrections to the entropy of three-dimensional black holes with "soft hairy" boundary conditions. Their thermodynamics possesses some special features that preclude a naive direct evaluation of these corrections, so we follow two different approaches. The first one exploits that the BTZ black hole belongs to the spectrum of Brown-Henneaux as well as soft hairy boundary conditions, so that the respective log corrections are related through a suitable change of the thermodynamic ensemble. In the second approach the analogue of modular invariance is considered for dual theories with anisotropic scaling of Lifshitz type with dynamical exponent z at the boundary. On the gravity side such scalings arise for KdV-type boundary conditions, which provide a specific 1-parameter family of multi-trace deformations of the usual AdS3/CFT2 setup, with Brown-Henneaux corresponding to z = 1 and soft hairy boundary conditions to the limiting case z → 0+. Both approaches agree in the case of BTZ black holes for any non-negative z. Finally, for soft hairy boundary conditions we show that not only the leading term, but also the log corrections to the entropy of black flowers endowed with affine û (1) soft hair charges exclusively depend on the zero modes and hence coincide with the ones for BTZ black holes.
Protocol for quantitative tracing of surface water with synthetic DNA
NASA Astrophysics Data System (ADS)
Foppen, J. W.; Bogaard, T. A.
2012-04-01
Based on experiments we carried out in 2010 with various synthetic single stranded DNA markers with a size of 80 nucleotides (ssDNA; Foppen et al., 2011), we concluded that ssDNA can be used to carry out spatially distributed multi-tracer experiments in the environment. Main advantages are in principle unlimited amount of tracers, environmental friendly and tracer recovery at very high dilution rates (detection limit is very low). However, when ssDNA was injected in headwater streams, we found that at selected downstream locations, the total mass recovery was less than 100%. The exact reason for low mass recovery was unknown. In order to start identifying the cause of the loss of mass in these surface waters, and to increase our knowledge of the behaviour of synthetic ssDNA in the environment, we examined the effect of laboratory and field protocols working with artificial DNA by performing numerous batch experiments. Then, we carried out several field tests in different headwater streams in the Netherlands and in Luxembourg. The laboratory experiments consisted of a batch of water in a vessel with in the order of 10^10 ssDNA molecules injected into the batch. The total duration of each experiment was 10 hour, and, at regular time intervals, 100 µl samples were collected in a 1.5 ml Eppendorf vial for qPCR analyses. The waters we used ranged from milliQ water to river water with an Electrical Conductivity of around 400 μS/cm. The batch experiments were performed in different vessel types: polyethylene bottles, polypropylene copolymer bottles , and glass bottles. In addition, two filter types were tested: 1 µm pore size glass fibre filters and 0.2 µm pore size cellulose acetate filters. Lastly, stream bed sediment was added to the batch experiments to quantify interaction of the DNA with sediment. For each field experiment around 10^15 ssDNA molecules were injected, and water samples were collected 100 - 600 m downstream of the point of injection. Additionally, the field tests were performed with salt and deuterium as tracer. To study possible decay by sunlight and/or microbial activity for synthetic DNA, immediately in the field and for the duration of the entire experiment, we carried out batch experiments. All samples were stored in a 1.5 ml Eppendorf vial in a cool-box in dry ice (-80°C). Quantitative PCR on a Mini Opticon (Bio Rad, Hercules, CA, USA) was carried out to determine DNA concentrations in the samples. Results showed the importance of a strict protocol for working with ssDNA as a tracer for quantitative tracing, since ssDNA interacts with surface areas of glass and plastic, depending on water quality and ionic strength. Interaction with the sediment and decay due to sunlight and/or microbial activity was negligible in most cases. The ssDNA protocol was then tested in natural streams. Promising results were obtained using ssDNA as quantitative tracer. The breakthrough curves using ssDNA were similar to the ones of salt or deuterium. We will present the revised protocol to use ssDNA for multi-tracing experiments in natural streams and discuss the opportunities and limitations.
NASA Astrophysics Data System (ADS)
Katz, B. G.; Bohlke, J.; Hornsby, D.
2001-05-01
Nitrate is readily transported from agricultural activities at the surface to the Upper Floridan aquifer in northern Florida due to karst features mantled by highly permeable sands and a high recharge rate (50 cm/yr). In Suwannee and Lafayette Counties, nitrate contamination of groundwater is widespread due to the 10-30 kg/ha nitrogen (N) applied annually for the past few decades as synthetic fertilizers (the dominant source of N). Water samples were collected from 12 springs during baseflow conditions (1997-99) and monthly from 14 wells (1998-99). Springwaters were analyzed for various chemical (N species, dissolved gases, CFCs) and isotopic tracers (15N, 3H/3He, 18O, D, 13C). Water from wells was analyzed monthly for N species, and during low-flow and high-flow conditions for 15N, 18O, D, and 13C. As a result of oxic conditions in the aquifer, nitrate was the dominant N species in water samples. Large monthly fluctuations of groundwater nitrate concentrations were observed at most wells. Relatively high nitrate concentrations in groundwater from 7 wells likely resulted from seasonal agricultural practices including fertilizer applications and manure spreading on cropland. Relatively low nitrate concentrations in groundwater from two wells during high-flow conditions were related to mixing with river water. Groundwater samples had N-isotope values (3.8-11.7 per mil) that indicated varying mixtures of inorganic and organic N sources, which corresponded in part to varying proportions of synthetic fertilizers and manure applied to fields. In springwaters from Suwannee County, nitrate trends and N-isotope data (2.7-6.2 per mil) were consistent with a peak in fertilizer N input in the late 1970's and a relatively high overall ratio of artificial fertilizer/manure. In contrast, springwater nitrate trends and N-isotope data (4.5-9.1 per mil) in Lafayette County were consistent with a more monotonic increase in fertilizer N input and relatively low overall ratio of artificial fertilizer/manure. Dampened nitrate trends in springwaters in both counties, relative to trends in estimated N inputs, likely were related to ages of groundwater discharging from springs that are on the order of decades (10-30 years), based on 3H/3He and CFC age-dating techniques.
An online-coupled NWP/ACT model with conserved Lagrangian levels
NASA Astrophysics Data System (ADS)
Sørensen, B.; Kaas, E.; Lauritzen, P. H.
2012-04-01
Numerical weather and climate modelling is under constant development. Semi-implicit semi-Lagrangian (SISL) models have proven to be numerically efficient in both short-range weather forecasts and climate models, due to the ability to use long time steps. Chemical/aerosol feedback mechanism are becoming more and more relevant in NWP as well as climate models, since the biogenic and anthropogenic emissions can have a direct effect on the dynamics and radiative properties of the atmosphere. To include chemical feedback mechanisms in the NWP models, on-line coupling is crucial. In 3D semi-Lagrangian schemes with quasi-Lagrangian vertical coordinates the Lagrangian levels are remapped to Eulerian model levels each time step. This remapping introduces an undesirable tendency to smooth sharp gradients and creates unphysical numerical diffusion in the vertical distribution. A semi-Lagrangian advection method is introduced, it combines an inherently mass conserving 2D semi-Lagrangian scheme, with a SISL scheme employing both hybrid vertical coordinates and a fully Lagrangian vertical coordinate. This minimizes the vertical diffusion and thus potentially improves the simulation of the vertical profiles of moisture, clouds, and chemical constituents. Since the Lagrangian levels suffer from traditional Lagrangian limitations caused by the convergence and divergence of the flow, remappings to the Eulerian model levels are generally still required - but this need only be applied after a number of time steps - unless dynamic remapping methods are used. For this several different remapping methods has been implemented. The combined scheme is mass conserving, consistent, and multi-tracer efficient.
Prouty, Nancy G.; Mienis, Furu; Campbell, P.; Roark, E. Brendan; Davies, Andrew; Robertson, Craig M.; Duineveld, Gerard; Ross, Steve W.; Rhodes, M.; Demopoulos, Amanda W.J.
2017-01-01
Submarine canyons are often hotspots of biomass and productivity in the deep sea. However, the majority of deep-sea canyons remain poorly sampled. Using a multi-tracer approach, results from a detailed geochemical investigation from a year-long sediment trap deployment reveals details concerning the source, transport, and fate of particulate matter to the depositional zone (1318 m) of Baltimore Canyon on the US Mid-Atlantic Bight (MAB). Both organic biomarker composition (sterol and n-alkanes) and bulk characteristics (δ13C, Δ14C, Chl-a) suggest that on an annual basis particulate matter from marine and terrestrially-derived organic matter are equally important. However, elevated Chlorophyll-a and sterol concentrations during the spring sampling period highlight the seasonal influx of relatively fresh phytodetritus. In addition, the contemporaneous increase in the particle reactive elements cadmium (Cd) and molybdenum (Mo) in the spring suggest increased scavenging, aggregation, and sinking of biomass during seasonal blooms in response to enhanced surface production within the nutricline. While internal waves within the canyon resuspend sediment between 200 and 600 m, creating a nepheloid layer rich in lithogenic material, near-bed sediment remobilization in the canyon depositional zone is minimal. Instead, vertical transport and lateral transport across the continental margin are the dominant processes driving seasonal input of particulate matter. In turn, seasonal variability in deposited particulate organic matter may be linked to benthic faunal composition and ecosystem scale carbon cycling.
Regulation of CO2 Air Sea Fluxes by Sediments in the North Sea
NASA Astrophysics Data System (ADS)
Burt, William; Thomas, Helmuth; Hagens, Mathilde; Brenner, Heiko; Pätsch, Johannes; Clargo, Nicola; Salt, Lesley
2016-04-01
A multi-tracer approach is applied to assess the impact of boundary fluxes (e.g. benthic input from sediments or lateral inputs from the coastline) on the acid-base buffering capacity, and overall biogeochemistry, of the North Sea. Analyses of both basin-wide observations in the North Sea and transects through tidal basins at the North-Frisian coastline, reveal that surface distributions of the δ13C signature of dissolved inorganic carbon (DIC) are predominantly controlled by a balance between biological production and respiration. In particular, variability in metabolic DIC throughout stations in the well-mixed southern North Sea indicates the presence of an external carbon source, which is traced to the European continental coastline using naturally-occurring radium isotopes (224Ra and 228Ra). 228Ra is also shown to be a highly effective tracer of North Sea total alkalinity (AT) compared to the more conventional use of salinity. Coastal inputs of metabolic DIC and AT are calculated on a basin-wide scale, and ratios of these inputs suggest denitrification as a primary metabolic pathway for their formation. The AT input paralleling the metabolic DIC release prevents a significant decline in pH as compared to aerobic (i.e. unbuffered) release of metabolic DIC. Finally, long-term pH trends mimic those of riverine nitrate loading, highlighting the importance of coastal AT production via denitrification in regulating pH in the southern North Sea.
NASA Astrophysics Data System (ADS)
Wurstner White, S.; Brandenberger, J. M.; Kulongoski, J. T.; Aalseth, C.; Williams, R. M.; Mace, E. K.; Humble, P.; Seifert, A.; Cloutier, J. M.
2015-12-01
Argon-39 has a half-life of 269 years, making it an ideal tracer for groundwater dating in the age range of 50-1000 years. In September 2014, two production wells within the San Joaquin Valley Aquifer System, located in Fresno, CA were sampled and analyzed for a suite of inorganic and organic contaminants and isotopic constituents. The radiotracers 3H (< 50 years) and 14C (> 1000 years) are routinely measured as part of the U. S. Geological Survey (USGS) National Water Quality Assessment (NAWQA) Enhanced Trends Network project. Adding 39Ar to the suite of tracers provides age data in the intermediate range to refine the groundwater age distribution of mixed waters and establishes groundwater residence times and flow rates. Characterizing the groundwater recharge and flow rate is of particular interest at these wells for determining the sources and movement of contaminants in groundwater, particularly nitrate, DBCP, and perchlorate. The sampled wells were pumped and purged. The sample collection for the 39Ar measurements required extracting the dissolved gases from 3000-5000 L of groundwater using a membrane degasification system with a maximum flow rate of 50 gpm (11.4 m^3/hr). The membranes are plastic hollow fibers that are hydrophobic. The gas was collected in duplicate large aluminum coated plastic sample bags. The gas was purified and then counted via direct beta counting using ultra-low background proportional counters loaded with a mixture of geologic Ar and methane to enhance the sensitivity for Ar measurements. The activity of 39Ar is 1.01 Bq/kg Ar, corresponding to an abundance of 0.808 ppq. The estimated absolute ages of the samples from the two groundwater wells were 23.3 and 27.0 percent of modern Ar. The comparison of the groundwater residence times determined using the suite of radiotracers (3H, 39Ar, and 14C) highlighted the value of knowing the intermediate age of groundwater when determining contaminant fate and transport pathways.
NASA Astrophysics Data System (ADS)
Saffer, D. M.; Kopf, A.
2015-12-01
At many subduction zones, pore water geochemical anomalies at seafloor seeps and in shallow boreholes indicate upward fluid flow and chemical transport from depths of several km. Identifying the source regions and flow pathways of these fluids is a key step toward quantifying volatile fluxes through forearcs, and in understanding their potential connection to loci of excess pore pressure along the plate boundary. Here, we focus on observations of pore water freshening (reported in terms of [Cl]), elevated [B], and light δ11B. Pore water freshening is generally thought to result from clay dehydration, whereas the B and δ11B signatures are interpreted to reflect desorption of isotopically light B from pelitic sediments with increasing temperature. We develop a model to track the coupled effects of B desorption, smectite dehydration, and progressive consolidation within the underthrusting sediment section. Our model incorporates established kinetic models of clay dehydration, and experimental data that define the temperature-dependent distribution coefficient (Kd) and fractionation of B in marine sediments. A generic sensitivity analysis demonstrates that the relative timing of heating and consolidation is a dominant control on pore water composition. For cold slabs, freshening is maximized because dehydration releases bound water into low porosity sediment, whereas B concentrations and isotopic signatures are modest because desorption is only partially complete. For warmer slabs, [B] and [Cl] signals are smaller, because heating and desorption occur shallower and into larger porosities, but the predicted δ11B signal is larger. The former scenario is typical of non-accretionary margins where the insulating sediment layer on the subducting plate is commonly <1 km thick. This result provides a quantitative explanation for the global observation that [Cl] depletion and [B] enrichment signals are generally strongest at non-accretionary margins. Application of our multi-tracer approach to the Costa Rica, N. Japan, N. Barbados, and Mediterranean Ridge subduction zones illustrates that clay dehydration and B desorption are viable mechanisms for the generation of observed geochemical signatures, including pore water freshening of over 50%, [B] up to 10x seawater values, and δ11B as low as 17‰.
NASA Astrophysics Data System (ADS)
Chabaux, François; Prunier, Jonathan; Pierret, Marie-Claire; Stille, Peter
2013-04-01
It is proposed in this study to highlight the interest of multi-tracer geochemical approaches combining measurement of major and trace element concentrations along with U and Sr isotopic ratios to constrain the characterization of the present-day weathering processes controlling the chemical composition of waters and soils in natural ecosystems. This is important if we want to predict and to model correctly the response of ecosystems to recent environmental changes. The approach is applied to the small granitic Strengbah Catchment, located in the Vosges Mountain (France), used and equipped as a hydro-geochemical observatory since 1986 (Observatoire Hydro-Géochimique de l'Environnement; http://ohge.u-strasbg.fr). This study includes the analysis of major and trace element concentrations and (U-Sr) isotope ratios in soil solutions collected within two soil profiles located on two experimental plots of this watershed, along with the analysis of soil samples and vegetation samples from these two plots. The depth variation of elemental concentrations of soil solutions confirms the important influence of the vegetation cycling on the budget of Ca, K, Rb and Sr, whereas Mg and Si budget in soil solutions are quasi exclusively controlled by weathering processes. Variation of Sr, and U isotopic ratios with depth also demonstrates that the sources and biogeochemical processes controlling the Sr budget of soil solutions is different in the uppermost soil horizons and in the deeper ones, and clearly influence by the vegetation cycling. From the obtained data, it can be therefore proposed a scheme where in addition to the external flux associated to the decomposition of organic matter and throughfall, occurs a double lithogenic flux: a surface flux which can be associated to dissolution of secondary minerals contained in fine silt fractions and a deeper one, controlled by water-rock interactions which can mobilize elements from primary minerals like plagioclases or orthose. These results shows also that the Strengbach watershed is in a transient state of weathering with an important loss of nutriments such as Ca in soils solutions since 15years, associated with an increase of a lithogenic flux indicating a recent modification of weathering/dissolution reactions involved in the soil horizons. The origin of the weathering modification could be the consequence of the acid rains on weathering granitic bedrock or a consequence of forest exploitation incompatible with the nutriment reserve of soils with recent plantations of conifer, which impoverish soils.
A method to investigate inter-aquifer leakage using hydraulics and multiple environmental tracers
NASA Astrophysics Data System (ADS)
Priestley, Stacey; Love, Andrew; Wohling, Daniel; Post, Vincent; Shand, Paul; Kipfer, Rolf; Tyroller, Lina
2016-04-01
Informed aquifer management decisions regarding sustainable yields or potential exploitation require an understanding of the groundwater system (Alley et al. 2002, Cherry and Parker 2004). Recently, the increase in coal seam gas (CSG) or shale gas production has highlighted the need for a better understanding of inter-aquifer leakage and contaminant migration. In most groundwater systems, the quantity or location of inter-aquifer leakage is unknown. Not taking into account leakage rates in the analysis of large scale flow systems can also lead to significant errors in the estimates of groundwater flow rates in aquifers (Love et al. 1993, Toth 2009). There is an urgent need for robust methods to investigate inter-aquifer leakage at a regional scale. This study builds on previous groundwater flow and inter-aquifer leakage studies to provide a methodology to investigate inter-aquifer leakage in a regional sedimentary basin using hydraulics and a multi-tracer approach. The methodology incorporates geological, hydrogeological and hydrochemical information in the basin to determine the likelihood and location of inter-aquifer leakage. Of particular benefit is the analysis of hydraulic heads and environmental tracers at nested piezometers, or where these are unavailable bore couplets comprising bores above and below the aquitard of interest within a localised geographical area. The proposed methodology has been successful in investigating inter-aquifer leakage in the Arckaringa Basin, South Australia. The suite of environmental tracers and isotopes used to analyse inter-aquifer leakage included the stable isotopes of water, radiocarbon, chloride-36, 87Sr/86Sr and helium isotopes. There is evidence for inter-aquifer leakage in the centre of the basin ~40 km along the regional flow path. This inter-aquifer leakage has been identified by a slight draw-down in the upper aquifer during pumping in the lower aquifer, overlap in Sr isotopes, δ2H, δ18O and chloride concentrations as well as hydrochemical evidence of mixing with shallower groundwater with shorter residence times. References Alley W. M. Healy R. W. Labaugh J. W. Reilly T. E. 2002. Hydrology - Flow and storage in groundwater systems. Science 296: 1985-1990. Cherry J. A. Parker, B. L. 2004. Role of Aquitards in the Protection of Aquifers from Contamination: A "State of Science" Report. Denver, USA. AWWA Research Foundation. Love A. J. Herczeg A. L. Armstrong D. Stadter F. Mazor E. 1993. Groundwater-Flow Regime within the Gambier Embayment of the Otway Basin, Australia - Evidence from Hydraulics and Hydrochemistry. Journal of Hydrology 143: 297-338. Tóth J. 2009. Gravitational Systems of Groundwater Flow: Theory, Evaluation, Utilization. Cambridge University Press.
77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...
Hagedorn, Benjamin; Kerfoot, Henry B; Verwiel, Mark; Matlock, Bruce
2016-07-01
In this study, a multi-tracer approach was applied to a complex, methane-impacted site in Southern California to (1) distinguish between natural gas and landfill gas (LFG)-derived methane impacts at site perimeter gas probes, (2) estimate the relative age of the LFG at these probes, and (3) document natural attenuation trends during a 3-year monitoring period. Relationships between methane and ethane values suggest that at the majority of probes, methane is from LFG and not from natural gas and that the relative contribution of LFG methane at these probes has increased over the monitoring period. To evaluate whether LFG is attenuating in the subsurface, the relative age of LFG was estimated by comparing readily degraded VOCs that are major constituents in LFG (toluene in this case) with those resistant to degradation (Freons). Time-series data trends are consistent with several probes being impacted by fresh LFG from recent releases that occurred after the update of the local LFG collection and control system (LFGCCS). Data further indicate some probes to be only affected by legacy LFG from a past release that occurred prior to the LFGCCS update and that, because of a lack of oxygen in the subsurface, had not been fully degraded. The outlined attenuation evaluation methodology is potentially applicable to other sites or even groundwater contaminants; however, the assessment is limited by the degree of homogeneity of the LFG source composition and non-LFG-derived toluene inputs to the analyzed samples. Published by Elsevier Ltd.
Extremal Correlators in the Ads/cft Correspondence
NASA Astrophysics Data System (ADS)
D'Hoker, Eric; Freedman, Daniel Z.; Mathur, Samir D.; Matusis, Alec; Rastelli, Leonardo
The non-renormalization of the 3-point functions
Chemical and Isotopic Tracers of Groundwater Sustainability: an Overview of New Science Directions
NASA Astrophysics Data System (ADS)
Bullen, T.
2002-12-01
Groundwater sustainability is an emerging concept that is rapidly gaining attention from both scientists and water resource managers, particularly with regard to contamination and degradation of water quality in strategic aquifers. The sustainability of a groundwater resource is a complex function of its susceptibility to factors such as intrusion of poor-quality water from diverse sources, lack of sufficient recharge and reorganization of groundwater flowpaths in response to excessive abstraction. In theory the critical limit occurs when degradation becomes irreversible, such that remediative efforts may be fruitless on a reasonable human time scale. Chemical and isotopic tracers are proving to be especially useful tools for assessment of groundwater sustainability issues such as characterization of recharge, identification of potential sources, pathways and impacts of contaminants and prediction of how hydrology will change in response to excessive abstraction. A variety of relatively cost-efficient tracers are now available with which to assess the susceptibility of groundwater reserves to contamination from both natural and anthropogenic sources, and may provide valuable monitoring and regulatory tools for water resource managers. In this overview, the results of several ongoing groundwater studies by the U.S. Geological Survey will be discussed from the perspective of implications for new science directions for groundwater sustainability research that can benefit water policy development. A fundamental concept is that chemical and isotopic tracers used individually often provide ambiguous information, and are most effective when used in a rigorous "multi-tracer" context that considers the complex linkages between the hydrology, geology and biology of groundwater systems.
Wang, Shiqin; Tang, Changyuan; Song, Xianfang; Yuan, Ruiqiang; Wang, Qinxue; Zhang, Yinghua
2013-07-01
In semi-arid regions, most human activities occur in alluvial fan areas; however, NO3(-) pollution has greatly threatened the shallow groundwater quality. In this paper, δ(15)N-NO3(-) and multi-tracers were used to identify the origin and fate of NO3(-) in groundwater of the Baiyangdian lake watershed, North China Plain. The investigation was conducted in two typical regions: one is the agricultural area located in the upstream of the watershed and another is the region influenced by urban wastewater in the downstream of the watershed. Results indicate that the high NO3(-) concentrations of the upstream shallow groundwater were sourced from fertilizer and manure or sewage leakage, whilst the mixture and denitrification caused the decrease in the NO3(-) concentration along the flow path of the groundwater. In the downstream, industrial and domestic effluent has a great impact on groundwater quality. The contaminated rivers contributed from 45% to 76% of the total recharge to the groundwater within a distance of 40 m from the river. The mixture fraction of the wastewater declined with the increasing distance away from the river. However, groundwater with NO3(-) concentrations larger than 20 mg l(-1) was only distributed in areas near to the polluted river or the sewage irrigation area. It is revealed that the frontier and depression regions of an alluvial fan in a lake watershed with abundant organics, silt and clay sediments have suitable conditions for denitrification in the downstream.
NASA Astrophysics Data System (ADS)
de Vries, Diemer; Hörchens, Lars; Grond, Peter
2007-12-01
The state of the art of wave field synthesis (WFS) systems is that they can reproduce sound sources and secondary (mirror image) sources with natural spaciousness in a horizontal plane, and thus perform satisfactory 2D auralization of an enclosed space, based on multitrace impulse response data measured or simulated along a 2D microphone array. However, waves propagating with a nonzero elevation angle are also reproduced in the horizontal plane, which is neither physically nor perceptually correct. In most listening environments to be auralized, the floor is highly absorptive since it is covered with upholstered seats, occupied during performances by a well-dressed audience. A first-order ceiling reflection, reaching the floor directly or via a wall, will be severely damped and will not play a significant role in the room response anymore. This means that a spatially correct WFS reproduction of first-order ceiling reflections, by means of a loudspeaker array at the ceiling of the auralization reproduction room, is necessary and probably sufficient to create the desired 3D spatial perception. To determine the driving signals for the loudspeakers in the ceiling array, it is necessary to identify the relevant ceiling reflection(s) in the multichannel impulse response data and separate those events from the data set. Two methods are examined to identify, separate, and reproduce the relevant reflections: application of the Radon transform, and decomposition of the data into cylindrical harmonics. Application to synthesized and measured data shows that both methods in principle are able to identify, separate, and reproduce the relevant events.
Holography as a highly efficient renormalization group flow. I. Rephrasing gravity
NASA Astrophysics Data System (ADS)
Behr, Nicolas; Kuperstein, Stanislav; Mukhopadhyay, Ayan
2016-07-01
We investigate how the holographic correspondence can be reformulated as a generalization of Wilsonian renormalization group (RG) flow in a strongly interacting large-N quantum field theory. We first define a highly efficient RG flow as one in which the Ward identities related to local conservation of energy, momentum and charges preserve the same form at each scale. To achieve this, it is necessary to redefine the background metric and external sources at each scale as functionals of the effective single-trace operators. These redefinitions also absorb the contributions of the multitrace operators to these effective Ward identities. Thus, the background metric and external sources become effectively dynamical, reproducing the dual classical gravity equations in one higher dimension. Here, we focus on reconstructing the pure gravity sector as a highly efficient RG flow of the energy-momentum tensor operator, leaving the explicit constructive field theory approach for generating such RG flows to the second part of the work. We show that special symmetries of the highly efficient RG flows carry information through which we can decode the gauge fixing of bulk diffeomorphisms in the corresponding gravity equations. We also show that the highly efficient RG flow which reproduces a given classical gravity theory in a given gauge is unique provided the endpoint can be transformed to a nonrelativistic fixed point with a finite number of parameters under a universal rescaling. The results obtained here are used in the second part of this work, where we do an explicit field-theoretic construction of the RG flow and obtain the dual classical gravity theory.
NASA Astrophysics Data System (ADS)
Pérez Quezadas, Juan; Heilweil, Victor M.; Cortés Silva, Alejandra; Araguas, Luis; Salas Ortega, María del Rocío
2016-12-01
Geochemistry and environmental tracers were used to understand groundwater resources, recharge processes, and potential sources of contamination in the Rio Actopan Basin, Veracruz State, Mexico. Total dissolved solids are lower in wells and springs located in the basin uplands compared with those closer to the coast, likely associated with rock/water interaction. Geochemical results also indicate some saltwater intrusion near the coast and increased nitrate near urban centers. Stable isotopes show that precipitation is the source of recharge to the groundwater system. Interestingly, some high-elevation springs are more isotopically enriched than average annual precipitation at higher elevations, indicating preferential recharge during the drier but cooler winter months when evapotranspiration is reduced. In contrast, groundwater below 1,200 m elevation is more isotopically depleted than average precipitation, indicating recharge occurring at much higher elevation than the sampling site. Relatively cool recharge temperatures, derived from noble gas measurements at four sites (11-20 °C), also suggest higher elevation recharge. Environmental tracers indicate that groundwater residence time in the basin ranges from 12,000 years to modern. While this large range shows varying groundwater flowpaths and travel times, ages using different tracer methods (14C, 3H/3He, CFCs) were generally consistent. Comparing multiple tracers such as CFC-12 with CFC-113 indicates piston-flow to some discharge points, yet binary mixing of young and older groundwater at other points. In summary, groundwater within the Rio Actopan Basin watershed is relatively young (Holocene) and the majority of recharge occurs in the basin uplands and moves towards the coast.
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
ERIC Educational Resources Information Center
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-01-01
This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…
NASA Astrophysics Data System (ADS)
Riml, Joakim; Wörman, Anders; Kunkel, Uwe; Radke, Michael
2013-04-01
Detection of pharmaceutical residues in streaming waters is common in urbanized areas. Although the occurrence and source of these micropollutants is known, their behavior in these aquatic ecosystems is still only partly understood. Specifically, quantitative information of biogeochemical processes in stream-specific environments where predominant reactions occur is often missing. In an attempt to address this knowledge gap, we performed simultaneous tracer tests in Säva Brook, Sweden, with bezafibrate, clofibric acid, diclofenac, ibuprofen, metoprolol and naproxen, as well as with the more inert solutes uranine and Rhodamine WT. The breakthrough curves at five successive sampling stations along a 16 km long stream reach were evaluated using a coupled physical-biogeochemical model framework containing surface water transport together with a representation of transient storage in slow/immobile zones of the stream. The multi-tracer experiment opens for decoupling of hydrological and biogeochemical contribution to the fate, and by linking impact and sensitivity analyses to relative significance of model parameters the most important processes for each contaminant were elucidated. Specifically for Säva Brook, the proposed methodology revealed that the pharmaceutical-contaminated stream water remained in the storage zones for times corresponding to 5-25% of the flow time of the stream. Furthermore, the results indicate a great variability in terms of predominant biogeochemical processes between the different contaminants. Rapid reactions occurring in the transient storage zone attenuated both ibuprofen and clofibric acid, and we conclude that a major degradation pathway for these contaminants was biodegradation in the hyporheic zone. In contrast, bezafibrate, metoprolol, and naproxen were mainly affected by sorption both in the storage zone and the main channel, while diclofenac displayed negligible effects of biogeochemical reactions.
NASA Astrophysics Data System (ADS)
Czarnecki, Sezin; Colak Esetlili, Bihter; Esetlili, Tolga; Tepecik, Mahmut; Anac, Dilek; Düring, Rolf-Alexander
2014-05-01
The study area Güzelhisar Basin is 6 km far from the city Aliaga, Aegean Region in Turkey which represents a rather industrialized area having five large iron and steel factories, but also areas of agriculture. Steel industry in Aliaga is causing metal pollution. Around Güzelhisar Basin and nearby, the dominant crop fields are cotton, maize, vegetables, olive trees and vineyards. Güzelhisar stream and dam water is used for irrigation of the agricultural land. Due to contamination from metal industry in Aliaga, organic farming is not allowed in this region. Industrial activities in the region present a threat on sustainable agriculture. The region is a multi-impacted area in terms of several pollutant sources affecting soil and water quality. The overall objective of the project is to trace back plant nutrients (N, P, K, Ca, Mg, Na, Fe, Mn, Zn, Cu, and B), hazardous substances (i. e. persistent organic pollutants), radionuclides (40K, 232Th, 226Ra/238U), and metal contents (As, Cd, Cr, Co, Cu, Hg, Mn, Ni, Pb, and Zn) by examining the soils, agricultural crops and natural plants from Güzelhisar Basin and water and sediments from Güzelhisar stream and dam. Spatial distribution of pollution will be evaluated by regionalization methods. For this, an advanced analytical methodology will be applied which provides an understanding of sources and occurrence of the respective substances of concern. An innovative multi-tracer approach comprising organic and inorganic marker substances, will identify and quantitatively assess sources and their impact on water pollution and the pollutant pathways in this agricultural crop production system.
NASA Astrophysics Data System (ADS)
Black, Noel F.; McJames, Scott; Rust, Thomas C.; Kadrmas, Dan J.
2008-01-01
We are developing methods for imaging multiple PET tracers in a single scan with staggered injections, where imaging measures for each tracer are separated and recovered using differences in tracer kinetics and radioactive decay. In this work, signal separation performance for rapid dual-tracer 62Cu-PTSM (blood flow) + 62Cu-ATSM (hypoxia) tumor imaging was evaluated in a large animal model. Four dogs with pre-existing tumors received a series of dynamic PET scans with 62Cu-PTSM and 62Cu-ATSM, permitting evaluation of a rapid dual-tracer protocol designed by previous simulation work. Several imaging measures were computed from the dual-tracer data and compared with those from separate, single-tracer imaging. Static imaging measures (e.g. SUV) for each tracer were accurately recovered from dual-tracer data. The wash-in (k1) and wash-out (k2) rate parameters for both tracers were likewise well recovered (r = 0.87-0.99), but k3 was not accurately recovered for PTSM (r = 0.19) and moderately well recovered for ATSM (r = 0.70). Some degree of bias was noted, however, which may potentially be overcome through further refinement of the signal separation algorithms. This work demonstrates that complementary information regarding tumor blood flow and hypoxia can be acquired by a single dual-tracer PET scan, and also that the signal separation procedure works effectively for real physiologic data with realistic levels of kinetic model mismatch. Rapid multi-tracer PET has the potential to improve tumor assessment for image-guide therapy and monitoring, and further investigation with these and other tracers is warranted.
Carbon balance of South Asia constrained by passenger aircraft CO2 measurements
NASA Astrophysics Data System (ADS)
Patra, P. K.; Niwa, Y.; Schuck, T. J.; Brenninkmeijer, C. A.; Machida, T.; Matsueda, H.; Sawa, Y.
2011-12-01
Quantifying the fluxes of carbon dioxide (CO2) between the atmosphere and terrestrial ecosystems in all their diversity, across the continents, is important and urgent for implementing effective mitigating policies. Whereas much is known for Europe and North America for instance, in comparison, South Asia, with 1.6 billion inhabitants and considerable CO2 fluxes, remained terra incognita in this respect. The sole measurement site at Cape Rama does not constrain CO2 fluxes during the summer monsoon season. We use regional measurements of atmospheric CO2 aboard a Lufthansa passenger aircraft between Frankfurt (Germany) and Chennai (India) at cruise altitude, in addition to the existing network sites for 2008, to estimate monthly fluxes for 64-regions using Bayesian inversion and ACTM transport model simulations. The applicability of the model's transport parameterization is confirmed using multi-tracer (SF6, CH4, N2O) simulations for the CARIBIC datasets. The annual carbon flux obtained by including the aircraft data is twice as large as the fluxes simulated by a terrestrial ecosystem model that was applied to prescribe the fluxes used in the inversions. It is shown that South Asia sequestered carbon at a rate of 0.37±0.20 Pg C yr-1 for the years 2007 and 2008, primarily during the summer monsoon season when the water limitation for this tropical ecosystem is relaxed. The seasonality and the strength of the calculated monthly fluxes are successfully validated using independent measurements of vertical CO2 profiles over Delhi and spatial variations at cruising altitude by the CONTRAIL program over Asia aboard Japan Airlines passenger aircraft (Patra et al., 2011). Major challenges remain the verification of the inverse model flux seasonality and annual totals by bottom-up estimations using field measurements and terrestrial ecosystem models.
Sources of nitrate contamination and age of water in large karstic springs of Florida
Katz, B.G.
2004-01-01
In response to concerns about the steady increase in nitrate concentrations over the past several decades in many of Florida's first magnitude spring waters (discharge ???2.8 m3/s), multiple isotopic and other chemical tracers were analyzed in water samples from 12 large springs to assess sources and timescales of nitrate contamination. Nitrate-N concentrations in spring waters ranged from 0.50 to 4.2 mg/L, and ??15N values of nitrate in spring waters ranged from 2.6 to 7.9 per mil. Most ??15N values were below 6 per mil indicating that inorganic fertilizers were the dominant source of nitrogen in these waters. Apparent ages of groundwater discharging from springs ranged from 5 to about 35 years, based on multi-tracer analyses (CFC-12, CFC-113, SF6, 3H/3He) and a piston flow assumption; however, apparent tracer ages generally were not concordant. The most reliable spring-water ages appear to be based on tritium and 3He data, because concentrations of CFCs and SF6 in several spring waters were much higher than would be expected from equilibration with modern atmospheric concentrations. Data for all tracers were most consistent with output curves for exponential and binary mixing models that represent mixtures of water in the Upper Floridan aquifer recharged since the early 1960s. Given that groundwater transit times are on the order of decades and are related to the prolonged input of nitrogen from multiple sources to the aquifer, nitrate could persist in groundwater that flows toward springs for several decades due to slow transport of solutes through the aquifer matrix.
Black, Noel F.; McJames, Scott; Rust, Thomas C.; Kadrmas, Dan J.
2013-01-01
We are developing methods for imaging multiple PET tracers in a single scan with staggered injections, where imaging measures for each tracer are separated and recovered using differences in tracer kinetics and radioactive decay. In this work, signal-separation performance for rapid dual-tracer 62Cu-PTSM (blood flow) + 62Cu-ATSM (hypoxia) tumor imaging was evaluated in a large animal model. Four dogs with pre-existing tumors received a series of dynamic PET scans with 62Cu-PTSM and 62Cu-ATSM, permitting evaluation of a rapid dual-tracer protocol designed by previous simulation work. Several imaging measures were computed from the dual-tracer data and compared with those from separate, single-tracer imaging. Static imaging measures (e.g. SUV) for each tracer were accurately recovered from dual-tracer data. The wash-in (k1) and wash-out (k2) rate parameters for both tracers were likewise well recovered (r = 0.87 – 0.99), but k3 was not accurately recovered for PTSM (r = 0.19) and moderately well recovered for ATSM (r = 0.70). Some degree of bias was noted, however, which may potentially be overcome through further refinement of the signal-separation algorithms. This work demonstrates that complementary information regarding tumor blood flow and hypoxia can be acquired by a single dual-tracer PET scan, and also that the signal-separation procedure works effectively for real physiologic data with realistic levels of kinetic model-mismatch. Rapid multi-tracer PET has the potential to improve tumor assessment for image-guide therapy and monitoring, and further investigation with these and other tracers is warranted. PMID:18182698
Marini, Juan C; Lanpher, Brendan C; Scaglia, Fernando; O'Brien, William E; Sun, Qin; Garlick, Peter J; Jahoor, Farook
2011-01-01
Background: Phenylbutyrate is a drug used in patients with urea cycle disorder to elicit alternative pathways for nitrogen disposal. However, phenylbutyrate administration decreases plasma branched-chain amino acid (BCAA) concentrations, and previous research suggests that phenylbutyrate administration may increase leucine oxidation, which would indicate increased protein degradation and net protein loss. Objective: We investigated the effects of phenylbutyrate administration on whole-body protein metabolism, glutamine, leucine, and urea kinetics in healthy and ornithine transcarbamylase–deficient (OTCD) subjects and the possible benefits of BCAA supplementation during phenylbutyrate therapy. Design: Seven healthy control and 7 partial-OTCD subjects received either phenylbutyrate or no treatment in a crossover design. In addition, the partial-OTCD and 3 null-OTCD subjects received phenylbutyrate and phenylbutyrate plus BCAA supplementation. A multitracer protocol was used to determine the whole-body fluxes of urea and amino acids of interest. Results: Phenylbutyrate administration reduced ureagenesis by ≈15% without affecting the fluxes of leucine, tyrosine, phenylalanine, or glutamine and the oxidation of leucine or phenylalanine. The transfer of 15N from glutamine to urea was reduced by 35%. However, a reduction in plasma concentrations of BCAAs due to phenylbutyrate treatment was observed. BCAA supplementation did not alter the respective baseline fluxes. Conclusions: Prolonged phenylbutyrate administration reduced ureagenesis and the transfer of 15N from glutamine to urea without parallel reductions in glutamine flux and concentration. There were no changes in total-body protein breakdown and amino acid catabolism, which suggests that phenylbutyrate can be used to dispose of nitrogen effectively without adverse effects on body protein economy. PMID:21490144
Jackson, C. Rhett; Du, Enhao; Klaus, Julian; ...
2016-08-12
Interactions among hydraulic conductivity distributions, subsurface topography, and lateral flow are poorly understood. We applied 407 mm of water and a suite of tracers over 51 h to a 12 by 16.5 m forested hillslope segment to determine interflow thresholds, preferential pathway pore velocities, large-scale conductivities, the time series of event water fractions, and the fate of dissolved nutrients. The 12% hillslope featured loamy sand A and E horizons overlying a sandy clay loam Bt at 1.25 m average depth. Interflow measured from two drains within an interception trench commenced after 131 and 208 mm of irrigation. Cumulative interflow equaledmore » 49% of applied water. Conservative tracer differences between the collection drains indicated differences in flow paths and storages within the plot. Event water fractions rose steadily throughout irrigation, peaking at 50% sixteen h after irrigation ceased. Data implied that tightly held water exchanged with event water throughout the experiment and a substantial portion of preevent water was released from the argillic layer. Surface-applied dye tracers bypassed the matrix, with peak concentrations measured shortly after flow commencement, indicating preferential network conductivities of 864–2240 mm/h, yet no macropore flow was observed. Near steady-state flow conditions indicated average conductivities of 460 mm/h and 2.5 mm/h for topsoils and the Bt horizon, respectively. Low ammonium and phosphorus concentrations in the interflow suggested rapid uptake or sorption, while higher nitrate concentrations suggested more conservative transport. Lastly, these results reveal how hydraulic conductivity variation and subsurface topographic complexity explain otherwise paradoxical solute and flow behaviors.« less
Le Croizier, Gaël; Schaal, Gauthier; Gallon, Régis; Fall, Massal; Le Grand, Fabienne; Munaron, Jean-Marie; Rouget, Marie-Laure; Machu, Eric; Le Loc'h, François; Laë, Raymond; De Morais, Luis Tito
2016-12-15
The link between trophic ecology and metal accumulation in marine fish species was investigated through a multi-tracers approach combining fatty acid (FA) and stable isotope (SI) analyses on fish from two contrasted sites on the coast of Senegal, one subjected to anthropogenic metal effluents and another one less impacted. The concentrations of thirteen trace metal elements (As, Cd, Co, Cr, Cu, Fe, Li, Mn, Ni, Pb, Sn, U, and Zn) were measured in fish liver. Individuals from each site were classified into three distinct groups according to their liver FA and muscle SI compositions. Trace element concentrations were tested between groups revealing that bioaccumulation of several metals was clearly dependent on the trophic guild of fish. Furthermore, correlations between individual trophic markers and trace metals gave new insights into the determination of their origin. Fatty acids revealed relationships between the dietary regimes and metal accumulation that were not detected with stable isotopes, possibly due to the trace metal elements analysed in this study. In the region exposed to metallic inputs, the consumption of benthic preys was the main pathway for metal transfer to the fish community while in the unaffected one, pelagic preys represented the main source of metals. Within pelagic sources, metallic transfer to fish depended on phytoplankton taxa on which the food web was based, suggesting that microphytoplankton (i.e., diatoms and dinoflagellates) were a more important source of exposition than nano- and picoplankton. This study confirmed the influence of diet in the metal accumulation of marine fish communities, and proved that FAs are very useful and complementary tools to SIs to link metal accumulation in fish with their trophic ecology. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Herrington, A. R.; Lauritzen, P. H.; Reed, K. A.
2017-12-01
The spectral element dynamical core of the Community Atmosphere Model (CAM) has recently been coupled to an approximately isotropic, finite-volume grid per implementation of the conservative semi-Lagrangian multi-tracer transport scheme (CAM-SE-CSLAM; Lauritzen et al. 2017). In this framework, the semi-Lagrangian transport of tracers are computed on the finite-volume grid, while the adiabatic dynamics are solved using the spectral element grid. The physical parameterizations are evaluated on the finite-volume grid, as opposed to the unevenly spaced Gauss-Lobatto-Legendre nodes of the spectral element grid. Computing the physics on the finite-volume grid reduces numerical artifacts such as grid imprinting, possibly because the forcing terms are no longer computed at element boundaries where the resolved dynamics are least smooth. The separation of the physics grid and the dynamics grid allows for a unique opportunity to understand the resolution sensitivity in CAM-SE-CSLAM. The observed large sensitivity of CAM to horizontal resolution is a poorly understood impediment to improved simulations of regional climate using global, variable resolution grids. Here, a series of idealized moist simulations are presented in which the finite-volume grid resolution is varied relative to the spectral element grid resolution in CAM-SE-CSLAM. The simulations are carried out at multiple spectral element grid resolutions, in part to provide a companion set of simulations, in which the spectral element grid resolution is varied relative to the finite-volume grid resolution, but more generally to understand if the sensitivity to the finite-volume grid resolution is consistent across a wider spectrum of resolved scales. Results are interpreted in the context of prior ideas regarding resolution sensitivity of global atmospheric models.
Fractography of ceramic and metal failures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-01-01
STP 827 is organized into the two broad areas of ceramics and metals. The ceramics section covers fracture analysis techniques, surface analysis techniques, and applied fractography. The metals section covers failure analysis techniques, and latest approaches to fractography, and applied fractography.
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke
2013-07-01
This paper summarizes the recent developments of the rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, portable gas chromatograph, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references to establish the selective, precise and quantitative rapid detection methods in food security analysis.
Mulware, Stephen Juma
2015-01-01
The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques including classical physical and chemical analyzing techniques each with relative level of accuracy. However, with the development of spatially sensitive submicron beams, the nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed. The details for sample preparation, the detection, and data collection and analysis are discussed. Finally, an application of the techniques to analysis of corn roots for elemental distribution and concentration is presented.
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
Determining the Number of Factors in P-Technique Factor Analysis
ERIC Educational Resources Information Center
Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael
2017-01-01
Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…
Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour
2018-01-12
Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responder or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
Sidaway, Ben; Euloth, Tracey; Caron, Heather; Piskura, Matthew; Clancy, Jessica; Aide, Alyson
2012-07-01
The purpose of this study was to compare the reliability of three previously used techniques for the measurement of ankle dorsiflexion ROM, open-chained goniometry, closed-chained goniometry, and inclinometry, to a novel trigonometric technique. Twenty-one physiotherapy students used four techniques (open-chained goniometry, closed-chained goniometry, inclinometry, and trigonometry) to assess dorsiflexion range of motion in 24 healthy volunteers. All student raters underwent training to establish competence in the four techniques. Raters then measured dorsiflexion with a randomly assigned measuring technique four times over two sessions, one week apart. Data were analyzed using a technique by session analysis of variance, technique measurement variability being the primary index of reliability. Comparisons were also made between the measurements derived from the four techniques and those obtained from a computerized video analysis system. Analysis of the rater measurement variability around the technique means revealed significant differences between techniques with the least variation being found in the trigonometric technique. Significant differences were also found between the technique means but no differences between sessions were evident. The trigonometric technique produced mean ROMs closest in value to those derived from computer analysis. Application of the trigonometric technique resulted in the least variability in measurement across raters and consequently should be considered for use when changes in dorsiflexion ROM need to be reliably assessed. Copyright © 2012 Elsevier B.V. All rights reserved.
Advances in the analysis and design of constant-torque springs
NASA Technical Reports Server (NTRS)
McGuire, John R.; Yura, Joseph A.
1996-01-01
In order to improve the design procedure of constant-torque springs used in aerospace applications, several new analysis techniques have been developed. These techniques make it possible to accurately construct a torque-rotation curve for any general constant-torque spring configuration. These new techniques allow for friction in the system to be included in the analysis, an area of analysis that has heretofore been unexplored. The new analysis techniques also include solutions for the deflected shape of the spring as well as solutions for drum and roller support reaction forces. A design procedure incorporating these new capabilities is presented.
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
NASA Astrophysics Data System (ADS)
Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.
2002-03-01
Survival analysis algorithm is often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of aircraft crashed. Another survival analysis tool is the competing risks where we have more than one cause of failure acting simultaneously. Lunn-McNeil analyzed the competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplify of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique is involving fitting models separately from each type of failure, treating other failure types as censored. To compare the two techniques, (the modified Lunn-McNeil and Kalbfleisch-Prentice) a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of models, using Root Mean Square Error (RMSE), the power tests, and the Schoenfeld residual analysis. The power tests in this study were likelihood ratio test, Rao-score test, and Wald statistics. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on power tests measurement.
Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M
2007-02-19
A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
Digital techniques for ULF wave polarization analysis
NASA Technical Reports Server (NTRS)
Arthur, C. W.
1979-01-01
Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences in the three techniques lie in the manner in which determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1985-01-01
Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on the optimal control frequency domain techniques. These techniques stem from an optimal control approach of a Neal-Smith like analysis on aircraft attitude dynamics extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in depth analysis of the effect of the experimental variables, such as prefilter, is conducted to gain further insight into the flared land task for this class of vehicle dynamics.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare event), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry. © 2017 Society for Risk Analysis.
Factor Analysis and Counseling Research
ERIC Educational Resources Information Center
Weiss, David J.
1970-01-01
Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…
Elimination of Perchlorate Oxidizers from Pyrotechnic Flare Compositions
2007-03-09
in candelas ( cd ), where the candela is defined as, 1 cd = 1 lumen /steradian-1. DSC A thermal analysis technique known as Differential...Shorter Wavelength Infrared band routinely monitored in decoy flare performance tests. TGA A thermal analysis technique known as Thermogravimetric ...Scanning Calorimetry DTA A thermal analysis technique known as Differential Thermal Analysis GAP Glycidyl Azide Polymer used as a curable binder in some
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2014 CFR
2014-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2013 CFR
2013-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...
Advanced Undergraduate Experiments in Thermoanalytical Chemistry.
ERIC Educational Resources Information Center
Hill, J. O.; Magee, R. J.
1988-01-01
Describes several experiments using the techniques of thermal analysis and thermometric titrimetry. Defines thermal analysis and several recent branches of the technique. Notes most of the experiments use simple equipment and standard laboratory techniques. (MVL)
Development of analysis techniques for remote sensing of vegetation resources
NASA Technical Reports Server (NTRS)
Draeger, W. C.
1972-01-01
Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.
Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.
ERIC Educational Resources Information Center
Lynch, James; And Others
1996-01-01
Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure is established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The proceduredeveloped is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Towards generating ECSS-compliant fault tree analysis results via ConcertoFLA
NASA Astrophysics Data System (ADS)
Gallina, B.; Haider, Z.; Carlsson, A.
2018-05-01
Attitude Control Systems (ACSs) maintain the orientation of the satellite in three-dimensional space. ACSs need to be engineered in compliance with ECSS standards and need to ensure a certain degree of dependability. Thus, dependability analysis is conducted at various levels and by using ECSS-compliant techniques. Fault Tree Analysis (FTA) is one of these techniques. FTA is being automated within various Model Driven Engineering (MDE)-based methodologies. The tool-supported CHESS-methodology is one of them. This methodology incorporates ConcertoFLA, a dependability analysis technique enabling failure behavior analysis and thus FTA-results generation. ConcertoFLA, however, similarly to other techniques, still belongs to the academic research niche. To promote this technique within the space industry, we apply it on an ACS and discuss about its multi-faceted potentialities in the context of ECSS-compliant engineering.
Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.
Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao
2018-01-01
In this study, cases with suspected urethral condylomata acuminata were examined by dermoscopy, in order to explore an effective method for clinical. To study the application of dermoscopy image analysis technique in clinical diagnosis of urethral condylomata acuminata. A total of 220 suspected urethral condylomata acuminata were clinically diagnosed first with the naked eyes, and then by using dermoscopy image analysis technique. Afterwards, a comparative analysis was made for the two diagnostic methods. Among the 220 suspected urethral condylomata acuminata, there was a higher positive rate by dermoscopy examination than visual observation. Dermoscopy examination technique is still restricted by its inapplicability in deep urethral orifice and skin wrinkles, and concordance between different clinicians may also vary. Dermoscopy image analysis technique features a high sensitivity, quick and accurate diagnosis and is non-invasive, and we recommend its use.
NASA Technical Reports Server (NTRS)
Lindstrom, David J.; Lindstrom, Richard M.
1989-01-01
Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory
NASA Astrophysics Data System (ADS)
Steels, R. S., Jr.; Babelay, E. F., Jr.
1980-07-01
Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high performance composite flywheels are presented. Real time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre and post test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.
Approaches to answering critical CER questions.
Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y
2015-01-01
While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
Rasch Analysis for Instrument Development: Why, When, and How?
ERIC Educational Resources Information Center
Boone, William J.
2016-01-01
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to…
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error that occur during chemical analysis utilizing the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; and in this review, we comment of some of the new direct techniques that are mass-spectrometry based. The study presents information about the measurement techniques using mass spectrometry, which allow direct sample analysis, without sample preparation or limiting some pre-concentration steps. MALDI - MS, PTR - MS, SIFT - MS, DESI - MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. The trends in development of direct analysis using the aforementioned techniques are also presented.
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and casual relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of the complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
Towards Effective Clustering Techniques for the Analysis of Electric Power Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh
2013-11-30
Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques onmore » two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.« less
Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.
Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E
2017-01-01
Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]. Copyright 2017, SLACK Incorporated.
2015-01-01
for IC fault detection . This section provides background information on inversion methods. Conventional inversion techniques and their shortcomings are...physical techniques, electron beam imaging/analysis, ion beam techniques, scanning probe techniques. Electrical tests are used to detect faults in 13 an...hand, there is also the second harmonic technique through which duty cycle degradation faults are detected by collecting the magnitude and the phase of
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
Biostatistics Series Module 10: Brief Overview of Multivariate Methods.
Hazra, Avijit; Gogtay, Nithya
2017-01-01
Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with wider availability, and increasing sophistication of statistical software and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins while automated software tools started replacing manual processing to improve the reliability and throughout of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440
Quarternary Pollen Analysis in Secondary School Ecology
ERIC Educational Resources Information Center
Slater, F. M.
1972-01-01
Describes techniques for studying historic changes in climate by analysis of pollen preserved in peat bogs. Illustrates the methodology and data analysis techniques by reference to results from English research. (AL)
Residual stresses of thin, short rectangular plates
NASA Technical Reports Server (NTRS)
Andonian, A. T.; Danyluk, S.
1985-01-01
The analysis of the residual stresses in thin, short rectangular plates is presented. The analysis is used in conjunction with a shadow moire interferometry technique by which residual stresses are obtained over a large spatial area from a strain measurement. The technique and analysis are applied to a residual stress measurement of polycrystalline silicon sheet grown by the edge-defined film growth technique.
Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media
Chen, Jing; Fang, Yanjun
2007-01-01
A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis by use of flow injection technique in the last 10 years. A brief discussion of both the chemiluminescence and flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the used chemiluminescence system.
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing: Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SXISX-AISX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard reconfigurable Field Programmable Gate Array Program, Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2011 CFR
2011-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2012 CFR
2012-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 215.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
NASA Technical Reports Server (NTRS)
Landmann, A. E.; Tillema, H. F.; Marshall, S. E.
1989-01-01
The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation including finite element analysis, statistical energy analysis (SEA), and a power flow method using element of SEA (computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used to compare with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to remain cautious and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to complexity of the aircraft structure and low modal densities.
Review and classification of variability analysis techniques with clinical applications
2011-01-01
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
Flow analysis techniques for phosphorus: an overview.
Estela, José Manuel; Cerdà, Víctor
2005-04-15
A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus together with several aspects regarding the analysis and terminology used in the determination of this element are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely; segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA) is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the detection instrumental technique used with the aim to facilitate their study and obtain an overall scope. Finally, the analytical characteristics of numerous flow-methods reported in the literature are provided in the form of a table and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms are tabulated.
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
Hybrid soft computing systems for electromyographic signals analysis: a review.
Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates
2014-02-03
Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
Hybrid soft computing systems for electromyographic signals analysis: a review
2014-01-01
Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least lift-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Peters, Jeanne M.
1987-01-01
An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamc signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
Performance analysis of clustering techniques over microarray data: A case study
NASA Astrophysics Data System (ADS)
Dash, Rasmita; Misra, Bijan Bihari
2018-03-01
Handling big data is one of the major issues in the field of statistical data analysis. In such investigation cluster analysis plays a vital role to deal with the large scale data. There are many clustering techniques with different cluster analysis approach. But which approach suits a particular dataset is difficult to predict. To deal with this problem a grading approach is introduced over many clustering techniques to identify a stable technique. But the grading approach depends on the characteristic of dataset as well as on the validity indices. So a two stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques like hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of grading approach that a cluster technique is significant is also established by Nemenyi post-hoc hypothetical test.
Analysis technique for controlling system wavefront error with active/adaptive optics
NASA Astrophysics Data System (ADS)
Genberg, Victor L.; Michels, Gregory J.
2017-08-01
The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint, that uses higher-order statistics, instead of the classical decorrelation a weaker constraint, that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
The combined use of order tracking techniques for enhanced Fourier analysis of order components
NASA Astrophysics Data System (ADS)
Wang, K. S.; Heyns, P. S.
2011-04-01
Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
Cognitive task analysis: Techniques applied to airborne weapons training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terranova, M.; Seamster, T.L.; Snyder, C.E.
1989-01-01
This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented alongmore » with the results. 6 refs., 2 figs., 4 tabs.« less
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C
2004-09-30
Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques, which are used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare advantages and limitations of each technique in respect to different application areas, analysis time, protein sizing and quantitation performance.
Technologies for Clinical Diagnosis Using Expired Human Breath Analysis
Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji
2015-01-01
This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze some exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath, and its related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142
Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques
Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.
2013-01-01
Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.
1976-10-01
A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique...includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of
Eigensystem analysis of classical relaxation techniques with applications to multigrid analysis
NASA Technical Reports Server (NTRS)
Lomax, Harvard; Maksymiuk, Catherine
1987-01-01
Classical relaxation techniques are related to numerical methods for solution of ordinary differential equations. Eigensystems for Point-Jacobi, Gauss-Seidel, and SOR methods are presented. Solution techniques such as eigenvector annihilation, eigensystem mixing, and multigrid methods are examined with regard to the eigenstructure.
Characterization of emission microscopy and liquid crystal thermography in IC fault localization
NASA Astrophysics Data System (ADS)
Lau, C. K.; Sim, K. S.
2013-05-01
This paper characterizes two fault localization techniques - Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT) by using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. Many people misunderstand that both techniques are the same and both are detecting hot spot in chip failing with short or leakage. As a result, analysts tend to use only LCT since this technique involves very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization always leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. Therefore, this research was established to characterize and compare both the techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method was also proposed as an alternative technique i.e. the backside LCT technique. The research observed that both techniques have successfully detected the defect locations resulted from the leakage failures. LCT wass observed more sensitive than EMMI in the frontside analysis approach. On the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect location and EMMI was more sensitive in detecting non ESD defect location. Backside LCT was proven to work as effectively as the frontside LCT and was ready to serve as an alternative technique to the backside EMMI. The research confirmed that LCT detects heat generation and EMMI detects photon emission (recombination radiation). The analysis results also suggested that both techniques complementing each other in the IC fault localization. It is necessary for a failure analyst to use both techniques when one of the techniques produces no result.
Analysis of "The Wonderful Desert." Technical Report No. 170.
ERIC Educational Resources Information Center
Green, G. M.; And Others
This report presents a text analysis of "The Wonderful Desert," a brief selection from the "Reader's Digest Skill Builder" series. (The techniques used in arriving at the analysis are presented in a Reading Center Technical Report, Number 168, "Problems and Techniques of Text Analysis.") Tables are given for a…
Ali, S. J.; Kraus, R. G.; Fratanduono, D. E.; ...
2017-05-18
Here, we developed an iterative forward analysis (IFA) technique with the ability to use hydrocode simulations as a fitting function for analysis of dynamic compression experiments. The IFA method optimizes over parameterized quantities in the hydrocode simulations, breaking the degeneracy of contributions to the measured material response. Velocity profiles from synthetic data generated using a hydrocode simulation are analyzed as a first-order validation of the technique. We also analyze multiple magnetically driven ramp compression experiments on copper and compare with more conventional techniques. Excellent agreement is obtained in both cases.
NASA Astrophysics Data System (ADS)
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-09-01
This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integratedmore » analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.« less
NASA Technical Reports Server (NTRS)
Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.
2013-01-01
Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.
Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
A methodological comparison of customer service analysis techniques
James Absher; Alan Graefe; Robert Burns
2003-01-01
Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
Phased-mission system analysis using Boolean algebraic methods
NASA Technical Reports Server (NTRS)
Somani, Arun K.; Trivedi, Kishor S.
1993-01-01
Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids state space explosion that commonly plague Markov chain-based analysis. A phase algebra to account for the effects of variable configurations and success criteria from phase to phase was developed. Our technique yields exact (as opposed to approximate) results. The use of our technique was demonstrated by means of an example and present numerical results to show the effects of mission phases on the system reliability.
Henze, Marcus; Mohammed, Ashour; Mier, Walter; Rudat, Volker; Dietz, Andreas; Nollert, Jörg; Eisenhut, Michael; Haberkorn, Uwe
2002-03-01
While fluorine-18 2-fluoro-2-deoxy-D-glucose (FDG) positron emission tomography (PET) is helpful in the pretherapeutic evaluation of head and neck cancer, it is only available in selected centres. Therefore, single-photon emission tomography (SPET) tracers would be desirable if they were to demonstrate tumour uptake reliably. This multitracer study was performed to evaluate the pretherapeutic uptake of the SPET tracers iodine-123 alpha-methyl-L-tyrosine (IMT) and technetium-99m hexakis-2-methoxyisobutylisonitrile (99mTc-MIBI) in primary carcinomas of the hypopharynx and larynx and to compare the results with those of FDG PET. We examined 22 fasted patients (20 male, 2 female, mean age 60.5+/-10.2 years) with histologically confirmed carcinoma of the hypopharynx (n=9) or larynx (n=13), within 1 week before therapy. In 20 patients a cervical PET scan was acquired after intravenous injection of 232+/-43 MBq 18F-FDG. Data analysis was semiquantitative, being based on standardised uptake values (SUVs) obtained at 60-90 min after injection. After injection of 570+/-44 MBq 99mTc-MIBI, cervical SPET scans (high-resolution collimator, 64x64 matrix, 64 steps, 40 s each) were obtained in 19 patients, 15 and 60 min after tracer injection. Finally, 15 min after injection of 327+/-93 MBq 123I-IMT (medium-energy collimator, 64x64 matrix, 64 steps, 40 s each) SPET scans were acquired in 15 patients. All images were analysed visually and by calculating the tumour to nuchal muscle ratio. Eighteen of 20 (90%) carcinomas showed an increased glucose metabolism, with a mean SUV of 8.7 and a mean carcinoma to muscle ratio of 7.3. The IMT uptake was increased in 13 of 15 (87%) patients, who had a mean carcinoma to muscle ratio of 2.9. Only 13 of 19 (68%) carcinomas revealed pathological MIBI uptake, with a mean tumour to muscle ratio of 2.2 and no significant difference between early and late MIBI SPET images (P=0.23). In conclusion, in the diagnosis of primary carcinomas of the hypopharynx and larynx, IMT SPET achieved a detection rate comparable to that of FDG PET. IMT SPET was clearly superior to MIBI SPET in this population. A further evaluation of the specificity of IMT in a larger number of patients appears justified.
Bilek, Maciej; Namieśnik, Jacek
2016-01-01
For a long time, chromatographic techniques and techniques related to them have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on the review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was a so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, the chromatographic techniques were mostly a subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools have contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of colunm chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography took place in the end of 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products as well as in pharmacokinetics studies. The first decade of 21" century was the time of new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Phamacia". The number of published works using various chromatography techniques was assessed based on the content description of individual issues of the journal "Acta Poloniae Pharmaceutica".
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
Application of a sensitivity analysis technique to high-order digital flight control systems
NASA Technical Reports Server (NTRS)
Paduano, James D.; Downing, David R.
1987-01-01
A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
NASA Astrophysics Data System (ADS)
Shao, Xupeng
2017-04-01
Glutenite bodies are widely developed in northern Minfeng zone of Dongying Sag. Their litho-electric relationship is not clear. In addition, as the conventional sequence stratigraphic research method drawbacks of involving too many subjective human factors, it has limited deepening of the regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have advantages of dividing sequence stratigraphy quantitatively comparing with the conventional methods. Under the basis of the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequence of the upper Es4 in northern Minfeng zone of Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both of which divide sequence stratigraphy quantitatively in the frequency domain; wavelet transform technique has high resolutions. It is suitable for areas with wells. The seismic time-frequency analysis technique has wide applicability, but a low resolution. Both of the techniques should be combined; the upper Es4 in northern Minfeng zone of Dongying Sag is a complete set of third-order sequence, which can be further subdivided into 5 fourth-order sequences that has the depositional characteristics of fine-upward sequence in granularity. Key words: Dongying sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique ,the upper Es4, sequence stratigraphy
Finite element modeling of truss structures with frequency-dependent material damping
NASA Technical Reports Server (NTRS)
Lesieutre, George A.
1991-01-01
A physically motivated modelling technique for structural dynamic analysis that accommodates frequency dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (AFT) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The AFT method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.
Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.
2004-01-01
A range of analytical techniques and protocols that might be applied b in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules m Mars analogues. These simulants 6om terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatograph mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth since each technique provides a unique analysis capability for Certain classes of organic molecules.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.
There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determinemore » whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.« less
Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques
2018-04-30
Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques Subject: Monthly Progress Report Period of...Resources: N/A TOTAL: $18,687 2 TECHNICAL STATUS REPORT Abstract The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decompo...sition (KMD) techniques. The work in the program’s first month consisted of improvements to data processing code, inclusion of additional arctic sea ice
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. The traditional spectral analysis is not appropriate for non-stationary vibration signal and for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor.
Holographic Interferometry and Image Analysis for Aerodynamic Testing
1980-09-01
tunnels, (2) development of automated image analysis techniques for reducing quantitative flow-field data from holographic interferograms, and (3...investigation and development of software for the application of digital image analysis to other photographic techniques used in wind tunnel testing.
Iterative categorization (IC): a systematic technique for analysing qualitative data
2016-01-01
Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155
Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero
2000-01-01
This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered, one based on the Modal Cost technique, one based on Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered; one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using Genetic Algorithm (GA) search technique combined with a criteria based on Hankel Singular Values (HSV's). For selecting shaker locations, four techniques were also considered; one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search technique combined with a criteria based on HSV's. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor & shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered including, Guyan, IRS, Modal, and Hybrid. Based on a pre-test cross-orthogonality checks using various reduction techniques, a Hybrid TAM reduction technique was selected and was used for all three vehicle fuel level configurations.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
ERIC Educational Resources Information Center
Foley, John P., Jr.
A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate, and unified approach to analyzing combustor burner liners, hollow air cooled turbine blades, and air cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.
Gorzsás, András; Sundberg, Björn
2014-01-01
Fourier transform infrared (FT-IR) spectroscopy is a fast, sensitive, inexpensive, and nondestructive technique for chemical profiling of plant materials. In this chapter we discuss the instrumental setup, the basic principles of analysis, and the possibilities for and limitations of obtaining qualitative and semiquantitative information by FT-IR spectroscopy. We provide detailed protocols for four fully customizable techniques: (1) Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS): a sensitive and high-throughput technique for powders; (2) attenuated total reflectance (ATR) spectroscopy: a technique that requires no sample preparation and can be used for solid samples as well as for cell cultures; (3) microspectroscopy using a single element (SE) detector: a technique used for analyzing sections at low spatial resolution; and (4) microspectroscopy using a focal plane array (FPA) detector: a technique for rapid chemical profiling of plant sections at cellular resolution. Sample preparation, measurement, and data analysis steps are listed for each of the techniques to help the user collect the best quality spectra and prepare them for subsequent multivariate analysis.
A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.
Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian
2018-01-19
This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage problem caused by the high pressure in the headspace vial during the headspace sampling process has a great impact to the measurement precision in the conventional headspace analysis (i.e., single sealing technique). The results (using ethanol solution as the model sample) show that the present technique is effective to minimize such a problem. The double sealing technique has an excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for the ethanol quantification. The detection precision of the present method was 10-20 times higher than that in earlier HS-GC work that use conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Srivastava, Anjali
The determination of accurate material composition of a kidney stone is crucial for understanding the formation of the kidney stone as well as for preventive therapeutic strategies. Radiations probing instrumental activation analysis techniques are excellent tools for identification of involved materials present in the kidney stone. The X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different materials detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of the kideny stones. Presently, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The use of open source program Python Multi-Channel Analyzer was utilized to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results with analytical calculation, several sets of kidney stones were analyzed using XRF and NAA technique. The elements which were identified from XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, Zr. And, by using Neutron Activation Analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, Zn. This thesis presents a new approach for exact detection of accurate material composition of kidney stone materials using XRF and NAA instrumental activation analysis techniques.
Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan
2018-04-23
Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and being a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, which are highly dependent on the comprehensive analysis of chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible, and may meet the needs, for the analysis of medicinal plants. Our aim is to provide an overview on the latest developments in MS and its hyphenated technique and their applications for the comprehensive analysis of medicinal plants. Application of various MS and its hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional chromatography, multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral database, have been reviewed and compared in this work. Recent advancs in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, the state-of-the-art MS and its hyphenated techniques have played, and will continue to play a great role in being the major platform for their further research in order to obtain insight into both their empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.
Techniques for the analysis of data from coded-mask X-ray telescopes
NASA Technical Reports Server (NTRS)
Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.
1987-01-01
Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
A histogram-based technique for rapid vector extraction from PIV photographs
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.
1991-01-01
A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.
Three Techniques for Task Analysis: Examples from the Nuclear Utilities.
ERIC Educational Resources Information Center
Carlisle, Kenneth E.
1984-01-01
Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
Predicting Effective Course Conduction Strategy Using Datamining Techniques
ERIC Educational Resources Information Center
Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.
2017-01-01
Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the analysis' results, it is recommended that suggestions be provided to decision making authorities. The data mining techniques can be used in educational domain to improve the outcome of the educational sectors. The authors carried out this research…
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique
Tipper, J.C.
1979-01-01
Clustering techniques are used properly to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60, and determined to be competitive computationally with most other methods. ?? 1979.
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples in the extraction isolation and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Sample preparation techniques for the determination of trace residues and contaminants in foods.
Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M
2007-06-15
The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.
Use of different spectroscopic techniques in the analysis of Roman age wall paintings.
Agnoli, Francesca; Calliari, Irene; Mazzocchin, Gian-Antonio
2007-01-01
In this paper the analysis of samples of Roman age wall paintings coming from: Pordenone, Vicenza and Verona is carried out by using three different techniques: energy dispersive x-rays spectroscopy (EDS), x-rays fluorescence (XRF) and proton induced x-rays emission (PIXE). The features of the three spectroscopic techniques in the analysis of samples of archaeological interest are discussed. The studied pigments were: cinnabar, yellow ochre, green earth, Egyptian blue and carbon black.
Current status of the real-time processing of complex radar signatures
NASA Astrophysics Data System (ADS)
Clay, E.
The real-time processing technique developed by ONERA to characterize radar signatures at the Brahms station is described. This technique is used for the real-time analysis of the RCS of airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys. Using this technique, it is also possible to optimize the experimental parameters, i.e., the analysis band, the microwave-network gain, and the electromagnetic window of the analysis.
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
Evidential Reasoning in Expert Systems for Image Analysis.
1985-02-01
techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis , particularly in the...2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis , and (4) to identify...approaches to three important tasks for expert systems in the domain of image analysis . This segment concluded with an assessment of the strengths
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
Earth rotation, station coordinates and orbit determination from satellite laser ranging
NASA Astrophysics Data System (ADS)
Murata, Masaaki
The Project MERIT, a special program of international colaboration to Monitor Earth Rotation and Intercompare the Techniques of observation and analysis, has come to an end with great success. Its major objective was to evaluate the ultimate potential of space techniques such as VLBI and satellite laser ranging, in contrast with the other conventional techniques, in the determination of rotational dynamics of the earth. The National Aerospace Laboratory (NAL) has officially participated in the project as an associate analysis center for satellite laser technique for the period of the MERIT Main Campaign (September 1983-October 1984). In this paper, the NAL analysis center results are presented.
NASA Astrophysics Data System (ADS)
Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya
2018-07-01
In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, a HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated how the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.
Techniques for Analysis of DSN 64-meter Antenna Azimuth Bearing Film Height Records
NASA Technical Reports Server (NTRS)
Stevens, R.; Quach, C. T.
1983-01-01
The DSN 64-m antennas use oil pad azimuth thrust bearings. Instrumentation on the bearing pads measures the height of the oil film between the pad and the bearing runner. Techniques to analyze the film height record are developed and discussed. The analysis techniques present the unwieldy data in a compact form for assessment of bearing condition. The techniques are illustrated by analysis of a small sample of film height records from each of the three 64-m antennas. The results show the general condition of the bearings of DSS 43 and DSS 63 as good to excellent, and a DSS 14 as marginal.
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-01
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839
Some failure modes and analysis techniques for terrestrial solar cell modules
NASA Technical Reports Server (NTRS)
Shumka, A.; Stern, K. H.
1978-01-01
Analysis data are presented on failed/defective silicon solar cell modules of various types and produced by different manufacturers. The failure mode (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, steady-state illumination test station for module performance illumination, laser scanning techniques, and the SEM.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Functional modeling techniques or object-oriented graphical representations, which are more useful to someone trying to understand the general design or high level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function oriented software development, while taking advantage of the descriptive power available in object oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.
Sisco, Edward; Dake, Jeffrey; Bridge, Candice
2013-10-10
Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A review of intelligent systems for heart sound signal analysis.
Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S
2017-10-01
Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnostic of heart diseases. The objective of this paper is to review the recent published preprocessing, feature extraction and classification techniques and their state of the art of phonocardiogram (PCG) signal analysis. Published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that the CAD systems for PCG signal analysis are still an open problem. Related studies are compared to their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.
A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
2016-07-18
This paper describes a framework of incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
K-Fold Crossvalidation in Canonical Analysis.
ERIC Educational Resources Information Center
Liang, Kun-Hsia; And Others
1995-01-01
A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
Automated analysis and classification of melanocytic tumor on skin whole slide images.
Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal
2018-06-01
This paper presents a computer-aided technique for automated analysis and classification of melanocytic tumor on skin whole slide biopsy images. The proposed technique consists of four main modules. First, skin epidermis and dermis regions are segmented by a multi-resolution framework. Next, epidermis analysis is performed, where a set of epidermis features reflecting nuclear morphologies and spatial distributions is computed. In parallel with epidermis analysis, dermis analysis is also performed, where dermal cell nuclei are segmented and a set of textural and cytological features are computed. Finally, the skin melanocytic image is classified into different categories such as melanoma, nevus or normal tissue by using a multi-class support vector machine (mSVM) with extracted epidermis and dermis features. Experimental results on 66 skin whole slide images indicate that the proposed technique achieves more than 95% classification accuracy, which suggests that the technique has the potential to be used for assisting pathologists on skin biopsy image analysis and classification. Copyright © 2018 Elsevier Ltd. All rights reserved.
Bonetti, Jennifer; Quarino, Lawrence
2014-05-01
This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
Development of a sensitivity analysis technique for multiloop flight control systems
NASA Technical Reports Server (NTRS)
Vaillard, A. H.; Paduano, J.; Downing, D. R.
1985-01-01
This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system, with variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, the relative stability criteria based on the concept of singular values were explored.
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, De- CAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Yang, Jyh-Bin
2017-10-01
The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence. More information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results have received much attention recently. As Building Information Modeling (BIM) technique has been quickly developed, using BIM and 4D simulation techniques have been proposed and implemented. Obvious benefits have been achieved especially in identifying and solving construction consequence problems in advance of construction. This study preforms an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by BIM technique. Research results could be a fundamental of developing new approaches for resolving schedule delay disputes.
NASA Astrophysics Data System (ADS)
Tolstikov, Vladimir V.
Analysis of the metabolome with coverage of all of the possibly detectable components in the sample, rather than analysis of each individual metabolite at a given time, can be accomplished by metabolic analysis. Targeted and/or nontargeted approaches are applied as needed for particular experiments. Monitoring hundreds or more metabolites at a given time requires high-throughput and high-end techniques that enable screening for relative changes in, rather than absolute concentrations of, compounds within a wide dynamic range. Most of the analytical techniques useful for these purposes use GC or HPLC/UPLC separation modules coupled to a fast and accurate mass spectrometer. GC separations require chemical modification (derivatization) before analysis, and work efficiently for the small molecules. HPLC separations are better suited for the analysis of labile and nonvolatile polar and nonpolar compounds in their native form. Direct infusion and NMR-based techniques are mostly used for fingerprinting and snap phenotyping, where applicable. Discovery and validation of metabolic biomarkers are exciting and promising opportunities offered by metabolic analysis applied to biological and biomedical experiments. We have demonstrated that GC-TOF-MS, HPLC/UPLC-RP-MS and HILIC-LC-MS techniques used for metabolic analysis offer sufficient metabolome mapping providing researchers with confident data for subsequent multivariate analysis and data mining.
Integrative sparse principal component analysis of gene expression data.
Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge
2017-12-01
In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
Error analysis of multi-needle Langmuir probe measurement technique.
Barjatya, Aroh; Merritt, William
2018-04-01
Multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.
Error analysis of multi-needle Langmuir probe measurement technique
NASA Astrophysics Data System (ADS)
Barjatya, Aroh; Merritt, William
2018-04-01
Multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.
Recent Electrochemical and Optical Sensors in Flow-Based Analysis
Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn
2006-01-01
Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.
[Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].
Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia
2008-07-01
Analysis of traditional Chinese herbal medicine is of great importance to its quality control Conventional analysis methods can not meet the requirement of rapid and on-line analysis because of complex process more experiences or needed. In recent years, near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeit and discrimination of geographical origins of herbal medicines and so on, due to its advantages of simple pretreatment, high efficiency, convenience to use solid diffuse reflection spectroscopy and fiber. In the present paper, the principles and methods of near-infrared spectroscopy technique are introduced concisely. Especially, the applications of this technique in quantitative analysis and qualitative analysis of traditional Chinese herbal medicine are reviewed.
Safety analysis in test facility design
NASA Astrophysics Data System (ADS)
Valk, A.; Jonker, R. J.
1990-09-01
The application of safety analysis techniques as developed in, for example nuclear and petrochemical industry, can be very beneficial in coping with the increasing complexity of modern test facility installations and their operations. To illustrate the various techniques available and their phasing in a project, an overview of the most commonly used techniques is presented. Two case studies are described: the hazard and operability study techniques and safety zoning in relation to the possible presence of asphyxiating atmospheres.
NASA Technical Reports Server (NTRS)
Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.
1981-01-01
A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.
Electrolytic preconcentration in instrumental analysis.
Sioda, R E; Batley, G E; Lund, W; Wang, J; Leach, S C
1986-05-01
The use of electrolytic deposition as a separation and preconcentration step in trace metal analysis is reviewed. Both the principles and applications of the technique are dealt with in some detail. Electrolytic preconcentration can be combined with a variety of instrumental techniques. Special attention is given to stripping voltammetry, potentiometric stripping analysis, different combinations with atomic-absorption spectrometry, and the use of flow-through porous electrodes. It is pointed out that the electrolytic preconcentration technique deserves more extensive use as well as fundamental investigation.
Dompierre, Kathryn A; Barbour, S Lee
2016-06-01
Soft tailings pose substantial challenges for mine reclamation due to their high void ratios and low shear strengths, particularly for conventional terrestrial reclamation practices. Oil sands mine operators have proposed the development of end pit lakes to contain the soft tailings, called fluid fine tailings (FFT), generated when bitumen is removed from oil sands ore. End pit lakes would be constructed within mined-out pits with FFT placed below the lake water. However, the feasibility of isolating the underlying FFT has yet to be fully evaluated. Chemical constituents of interest may move from the FFT into the lake water via two key processes: (1) advective-diffusive mass transport with upward pore water flow caused by settling of the FFT; and (2) mixing created by wind events or unstable density profiles through the lake water and upper portion of the FFT. In 2013 and 2014, temperature and stable isotopes of water profiles were measured through the FFT and lake water in the first end pit lake developed by Syncrude Canada Ltd. Numerical modelling was undertaken to simulate these profiles to identify the key mechanisms controlling conservative mass transport in the FFT. Shallow mixing of the upper 1.1 m of FFT with lake water was required to explain the observed temperature and isotopic profiles. Following mixing, the re-establishment of both the temperature and isotope profiles required an upward advective flux of approximately 1.5 m/year, consistent with average FFT settling rates measured at the study site. These findings provide important insight on the ability to sequester soft tailings in an end pit lake, and offer a foundation for future research on the development of end pit lakes as an oil sands reclamation strategy. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Miller, A. J.; Allison, M. A.; Bianchi, T. S.; Marcantonio, F.
2012-12-01
Sediment cores collected from Simpson Lagoon on the inner Beaufort Sea shelf adjacent to the Colville River delta, AK are being utilized to develop new, high-resolution (sub-decadal scale) archives of the 0-3,000 year Arctic paleoclimate record necessary to assess natural and anthropogenic climate variability. An imperative first step for developing a new paleoclimate archive is to establish methodologies for constraining the age-depth relationship. Naturally occurring and bomb-produced radioisotopes have been utilized in sediments to constrain downcore variability of accumulation rates on 100-103 y timescales, but this methodology is complicated by low activities of many of these tracers at high latitudes. The present study utilizes the combination of a (1) multi-tracer approach and a (2) tailored measurement strategy to overcome this limitation. 210Pb and 137Cs analyses were conducted on the fine (<32μm) sediment fraction to maximize measurable activity and to minimize radioisotope activity variability resulting from changes in grain size: 137Cs geochronologies proved more reliable in this setting and revealed mm/y sediment accumulation in the lagoon. To corroborate the 137Cs results, 239,240Pu activities were analyzed for selected sites using ICP-MS which has ultra-low detection limits, and yielded accumulation rates that matched the Cs geochronology. Age model development for the remainder of the core lengths (>~100 y in age) were completed using radiocarbon dating of benthic foraminifera tests, which proved the only datable in situ carbon available in this sediment archive. These dates have been used to constrain the ages of acoustic reflectors in CHIRP subbottom seismic records collected from the lagoon. Using this age control, spatial patterns of lagoonal sediment accumulation over the last ~3 ky were derived from the CHIRP data. Two depocenters are identified and validate combining age-dated coring with high-resolution seismic profiling to identify areas of the highest temporal resolution for Arctic paleoclimate research in coastal sediments.
Santoni, S; Huneau, F; Garel, E; Aquilina, L; Vergnaud-Ayraud, V; Labasque, T; Celle-Jeanton, H
2016-12-15
This study aims at identifying the water-rock interactions and mixing rates within a complex granite-carbonate coastal aquifer under high touristic pressure. Investigations have been carried out within the coastal aquifer of Bonifacio (southern Corsica, France) mainly composed of continental granitic weathering products and marine calcarenite sediments filling a granitic depression. A multi-tracer approach combining physico-chemical parameters, major ions, selected trace elements, stable isotopes of the water molecule and 87 Sr/ 86 Sr ratios measurements is undertaken for 20 groundwater samples during the low water period in November 2014. 5 rock samples of the sedimentary deposits and surrounding granites are also analysed. First, the water-rock interactions processes governing the groundwater mineralization are described in order to fix the hydrogeochemical background. Secondly, the flow conditions are refined through the quantification of inter aquifer levels mixing, and thirdly, the kinetics of water-rock interaction based on groundwater residence time from a previous study using CFCs and SF 6 are quantified for the two main flow lines. A regional contrast in the groundwater recharge altitude allowed the oxygene-18 to be useful combined with the 87 Sr/ 86 Sr ratios to differentiate the groundwater origins and to compute the mixing rates, revealing the real extension of the watershed and the availability of the resource. The results also highlight a very good correlation between the groundwater residence time and the spatial evolution of 87 Sr/ 86 Sr ratios, allowing water-rock interaction kinetics to be defined empirically for the two main flow lines through the calcarenites. These results demonstrate the efficiency of strontium isotopes as tracers of water-rock interaction kinetics and by extension their relevance as a proxy of groundwater residence time, fundamental parameter documenting the long term sustainability of the hydrosystem. Copyright © 2016 Elsevier B.V. All rights reserved.
Toxic elements and bio-metals in Cantharellus mushrooms from Poland and China.
Falandysz, Jerzy; Chudzińska, Maria; Barałkiewicz, Danuta; Drewnowska, Małgorzata; Hanć, Anetta
2017-04-01
Data on multi-trace element composition and content relationships have been obtained for Cantharellus cibarius, C. tubaeformis, and C. minor mushrooms from Poland and China by inductive coupled plasma-dynamic reaction cell-mass spectroscopy. There is no previous data published on As, Li, V, Tl, and U in chanterelles from Poland and on Ba, Co, Cr, Ni, Rb, and Sr in chanterelles from China. The results implied a role of the soil background geochemistry at the collection site with the occurrence of Ag, As, Ba, Cr, Cs, Li, Mn, Pb, Rb, Sr, U, and V in the fruiting bodies. Both geogenic Cd and anthropogenic Cd can contribute in load of this element in chanterelles from the Świetokrzyskie Mts. region in Poland, while geogenic source can be highly dominant in the background areas of Yunnan. An essentiality of Cu and Zn and effort by mushroom to maintain their physiological regulation could be reflected by data for Cantharellus mushrooms from both regions of the world, but its geogenic source (and possibly anthropogenic) can matter also in the region of the Świetokrzyskie Mountains in Poland. The elements Co, Ni, and Tl were at the same order of magnitude in contents in C. cibarius in Poland and Yunnan, China. C. tubaeformis differed from C. cibarius by a lower content of correlated Co, Ni, and Zn. Soil which is polymetallic and highly weathered in Yunnan can be suggested as a natural geogenic source of greater concentrations of As, Ba, Cr, Li, Pb, Sr, U, and V in the chanterelles there while lower of Mn and Rb, when related to chanterelles in Poland. A difference in Cs content between the sites can be attributed as an effect of the 137 Cs release from the Chernobyl accident, in which Poland was much more affected than Yunnan, where deposition was negligible.
Changes of pituitary gland volume in Kennedy disease.
Pieper, C C; Teismann, I K; Konrad, C; Heindel, W L; Schiffbauer, H
2013-12-01
Kennedy disease is a rare X-linked neurodegenerative disorder caused by a CAG repeat expansion in the first exon of the androgen-receptor gene. Apart from neurologic signs, this mutation can cause a partial androgen insensitivity syndrome with typical alterations of gonadotropic hormones produced by the pituitary gland. The aim of the present study was therefore to evaluate the impact of Kennedy disease on pituitary gland volume under the hypothesis that endocrinologic changes caused by partial androgen insensitivity may lead to morphologic changes (ie, hypertrophy) of the pituitary gland. Pituitary gland volume was measured in sagittal sections of 3D T1-weighted 3T-MR imaging data of 8 patients with genetically proven Kennedy disease and compared with 16 healthy age-matched control subjects by use of Multitracer by a blinded, experienced radiologist. The results were analyzed by a univariant ANOVA with total brain volume as a covariant. Furthermore, correlation and linear regression analyses were performed for pituitary volume, patient age, disease duration, and CAG repeat expansion length. Intraobserver reliability was evaluated by means of the Pearson correlation coefficient. Pituitary volume was significantly larger in patients with Kennedy disease (636 [±90] mm(3)) than in healthy control subjects (534 [±91] mm(3)) (P = .041). There was no significant difference in total brain volume (P = .379). Control subjects showed a significant decrease in volume with age (r = -0.712, P = .002), whereas there was a trend to increasing gland volume in patients with Kennedy disease (r = 0.443, P = .272). Gland volume correlated with CAG repeat expansion length in patients (r = 0.630, P = .047). The correlation coefficient for intraobserver reliability was 0.94 (P < .001). Patients with Kennedy disease showed a significantly higher pituitary volume that correlated with the CAG repeat expansion length. This could reflect hypertrophy as the result of elevated gonadotropic hormone secretion caused by the androgen receptor mutation with partial androgen insensitivity.
Differentiated spring behavior under changing hydrological conditions in an alpine karst aquifer
NASA Astrophysics Data System (ADS)
Filippini, Maria; Squarzoni, Gabriela; De Waele, Jo; Fiorucci, Adriano; Vigna, Bartolomeo; Grillo, Barbara; Riva, Alberto; Rossetti, Stefano; Zini, Luca; Casagrande, Giacomo; Stumpp, Christine; Gargini, Alessandro
2018-01-01
Limestone massifs with a high density of dolines form important karst aquifers in most of the Alps, often with groundwater circulating through deep karst conduits and water coming out of closely spaced springs with flow rates of over some cubic meters per second. Although several hydrogeological studies and tracing experiments were carried out in many of these carbonate mountains in the past, the hydrogeology of most of these karst aquifers is still poorly known. Geological, hydrodynamic and hydrochemical investigations have been carried out in one of the most representative of these areas (Cansiglio-Monte Cavallo, NE Italy) since spring 2015, in order to enhance the knowledge on this important type of aquifer system. Additionally, a cave-to-spring multitracer test was carried out in late spring 2016 by using three different fluorescent tracers. This hydrogeological study allowed: 1) gathering new detailed information on the geological and tectonic structure of such alpine karst plateau; 2) defining discharge rates of the three main springs (Gorgazzo, Santissima, and Molinetto) by constructing rating curves; 3) understanding the discharging behavior of the system with respect to different recharge conditions; 4) better defining the recharge areas of the three springs. The three nearby springs (the spring front stretches over 5 km), that drain the investigated karst aquifer system, show different behaviors with respect to changing discharge conditions, demonstrating this aquifer to be divided in partially independent drainage systems under low-flow conditions, when their chemistry is clearly differentiated. Under high-flow conditions, waters discharging at all springs show more similar geochemical characteristics. The combination of geochemistry, hydrodynamic monitoring and dye tracing tests has shown that the three springs have different recharge areas. The study points out that even closely spaced karst springs, that apparently drain the same karst mountain, can have different behaviors, and thus distinctive reactions toward polluting events, a characteristic to be taken into account for their management.
A microhistological technique for analysis of food habits of mycophagous rodents.
Patrick W. McIntire; Andrew B. Carey
1989-01-01
We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...
NASA Technical Reports Server (NTRS)
Zamora, M. A.
1977-01-01
Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: consumables analysis support techniques to crew training simulator for advanced spacecraft programs, and the applicability of the above techniques to the crew training simulator for the space shuttle program in particular.
USDA-ARS?s Scientific Manuscript database
Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1975-01-01
Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.
Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G
2015-01-01
Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method-combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values had a range between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using colorcoded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique were of significant influence.
NASA Astrophysics Data System (ADS)
Becker, T.; Clark, J. F.
2012-12-01
Coupled with the unpredictability of a changing climate, the projected growth in human population over the next century requires new and innovative ways to augment already-depleted water supplies. An increasingly popular and promising development is managed aquifer recharge (MAR), a cost-effective method of intentionally storing potable water in groundwater aquifers at engineered sites worldwide. Reclaimed (or recycled) water, defined as cleaned and treated wastewater, will account for a larger portion of MAR water in future years. A crucial component for managing groundwater recharged with reclaimed water is its subsurface travel time. The California Department of Public Health (CDPH), with the most recent draft of regulations issued on November 21, 2011, requires the application of groundwater tracers to demonstrate subsurface residence time. Residence time increases the quality of reclaimed water via soil-aquifer treatment (SAT), which includes mechanisms such as sorption, biological degradation, and microbial inactivation to remove potential contaminants or pathogens. This study addresses the need for an appropriate tracer to determine groundwater residence times near MAR facilities. Standard shallow groundwater dating techniques, such as T/3He and chlorofluorocarbon (CFC) methods, cannot be used because their uncertainties are typically ± 2 years, longer than the target CDPH retention time of ~6 months. These methods also cannot map preferential flow paths. Sulfur hexafluoride (SF6), a nonreactive synthetic gas, is well-established as a deliberate tracer for determining subsurface travel time; however, SF6 is a very strong greenhouse gas and the California Air Resources Board (CARB) is regulating its emission. Other tracers, such as noble gas isotopes, that have successfully determined subsurface retention times are impractical due to their high cost. A multi-tracer experiment at the San Gabriel Spreading Grounds test basin (Montebello Forebay, Los Angeles County, CA, USA) has been in progress since September 6, 2011, following injection of boric acid enriched in boron-10 (10B) and bromide (Br-) tracers. Tracer concentrations are collected at 9 monitoring wells that have pre-experiment estimated travel times between 0.5 to 180 days. Results indicate that 10B-enriched boric acid is an effective deliberate tracer at MAR sites; however, the ion's movement is slightly retarded relative to bromide by the substrate. 10B/Br- travel time ratios range from 1 to 1.4. In addition to the two deliberate geochemical tracers, heat is being evaluated as a possible intrinsic tracer at MAR sites. At the time of the experiment (late summer), reclaimed water was significantly warmer (~20°F) than the native groundwater as it entered the system. Time series are developed from loggers outfitted at each monitoring well, with measurements recorded hourly accurate to one thousandth of a degree. Results are similar to 10B & Br- travel times and validate the potential of heat as an intrinsic tracer.
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal method Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at system level? And, is there any room for improvement.) After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development process interact with each other at different phases through development life cycle (2) Reviews are emphasized in both system and software development. (Figl.3). For some reviews (e.g. SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques. In other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Lo, Shih-Jie; Yao, Da-Jeng
2015-07-23
This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidic (e.g., little sample amount required, high-throughput performance) make this tool suitable to answer and to solve biological questions of interest about a single cell. This review aims to introduce microfluidic related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced; the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in analysis of a single cell.
Lo, Shih-Jie; Yao, Da-Jeng
2015-01-01
This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidic (e.g., little sample amount required, high-throughput performance) make this tool suitable to answer and to solve biological questions of interest about a single cell. This review aims to introduce microfluidic related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced; the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in analysis of a single cell. PMID:26213918
Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.
Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko
2017-11-01
A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within granulations also the real applicability of two particle size analysis techniques, sieve analysis (SA) and fast imaging technique (Flashsizer, FS), was tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates for mean granule sizes. Overall, SA was a time-consuming but accurate technique that provided reliable information for the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution, but nevertheless yielded acceptable estimates for mean particle size. In general, FS was two to three orders of magnitude faster than SA.
Transgender Phonosurgery: A Systematic Review and Meta-analysis.
Song, Tara Elena; Jiang, Nancy
2017-05-01
Objectives Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.
Analysis of thin plates with holes by using exact geometrical representation within XFEM.
Perumal, Logah; Tso, C P; Leng, Lim Thong
2016-05-01
This paper presents analysis of thin plates with holes within the context of XFEM. New integration techniques are developed for exact geometrical representation of the holes. Numerical and exact integration techniques are presented, with some limitations for the exact integration technique. Simulation results show that the proposed techniques help to reduce the solution error, due to the exact geometrical representation of the holes and utilization of appropriate quadrature rules. Discussion on minimum order of integration order needed to achieve good accuracy and convergence for the techniques presented in this work is also included.
A combination of selected mapping and clipping to increase energy efficiency of OFDM systems
Lee, Byung Moo; Rim, You Seung
2017-01-01
We propose an energy efficient combination design for OFDM systems based on selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. The combination of two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it can take advantages of both techniques. For the combination, we choose the clipping and SLM techniques, since the former technique is quite simple and effective, and the latter technique does not cause any signal distortion. We provide the structure and the systematic operating method, and show the various analyzes to derive the EE gain based on the combined technique. Our analysis show that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to only using SLM technique. PMID:29023591
GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration
Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng
2015-01-01
The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315
Non-destructive evaluation techniques, high temperature ceramic component parts for gas turbines
NASA Technical Reports Server (NTRS)
Reiter, H.; Hirsekorn, S.; Lottermoser, J.; Goebbels, K.
1984-01-01
This report concerns studies conducted on various tests undertaken on material without destroying the material. Tests included: microradiographic techniques, vibration analysis, high-frequency ultrasonic tests with the addition of evaluation of defects and structure through analysis of ultrasonic scattering data, microwave tests and analysis of sound emission.
Image Analysis, Microscopic, and Spectrochemical Study of the PVC Dry Blending Process,
The dry blending process used in the production of electrical grade pvc formulations has been studies using a combination of image analysis , microscopic...by image analysis techniques. Optical and scanning electron microscopy were used to assess morphological differences. Spectrochemical techniques were used to indicate chemical changes.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laskin, Julia; Lanekoff, Ingela
2015-11-13
Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of moleculesmore » in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.« less
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine chosen for its history of cracking was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components are verified. Three dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce a preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce a preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation.
A strategy for selecting data mining techniques in metabolomics.
Banimustafa, Ahmed Hmaidan; Hardy, Nigel W
2012-01-01
There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.
Spectral Analysis and Experimental Modeling of Ice Accretion Roughness
NASA Technical Reports Server (NTRS)
Orr, D. J.; Breuer, K. S.; Torres, B. E.; Hansman, R. J., Jr.
1996-01-01
A self-consistent scheme for relating wind tunnel ice accretion roughness to the resulting enhancement of heat transfer is described. First, a spectral technique of quantitative analysis of early ice roughness images is reviewed. The image processing scheme uses a spectral estimation technique (SET) which extracts physically descriptive parameters by comparing scan lines from the experimentally-obtained accretion images to a prescribed test function. Analysis using this technique for both streamwise and spanwise directions of data from the NASA Lewis Icing Research Tunnel (IRT) are presented. An experimental technique is then presented for constructing physical roughness models suitable for wind tunnel testing that match the SET parameters extracted from the IRT images. The icing castings and modeled roughness are tested for enhancement of boundary layer heat transfer using infrared techniques in a "dry" wind tunnel.
Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan
2018-02-05
Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, involvement load hypothesis and technique feature analysis have been proposed which attempt to bring some concepts like noticing, motivation, and generation into focus. In the current study, 90 high proficiency EFL students were assigned into three vocabulary tasks of sentence making, composition, and reading comprehension in order to examine the power of involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. It was unraveled that involvement load hypothesis cannot be a good predictor, and technique feature analysis was a good predictor in pretest to posttest score change and not in during-task activity. The implications of the results will be discussed in the light of preparing vocabulary tasks.
Directed Incremental Symbolic Execution
NASA Technical Reports Server (NTRS)
Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz
2011-01-01
The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
NASA Astrophysics Data System (ADS)
Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław
Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC), or gas chromatography (GC). Applications of CE concern the determination of high-molecular compounds, like polyphenols, including flavonoids, pigments, vitamins, food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, the method developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminations including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analysis of the composition of food and food products are conducted, is of great importance. CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to the numerous advantages of the CE technique it is successfully used in routine food analysis.
Application of phyto-indication and radiocesium indicative methods for microrelief mapping
NASA Astrophysics Data System (ADS)
Panidi, E.; Trofimetz, L.; Sokolova, J.
2016-04-01
Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
ERIC Educational Resources Information Center
Moriarty, Dick; Zarebski, John
This paper delineates the exact methodology developed by the Sports Institute for Research/Change Agent Research (SIR/CAR) for applying a systems analysis technique to a voluntary mutual benefit organization, such as a school or amateur athletic group. The functions of the technique are to compare avowed and actual behavior, to utilize group…
Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane
2008-02-01
MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the present peptides in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. They set the significance level at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical Implications. Professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Large space antennas: A systems analysis case history
NASA Technical Reports Server (NTRS)
Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)
1987-01-01
The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by the solid content change in the samples has a great impact to the measurement accuracy in the conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective to minimize such a problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and which could greatly broaden the applications of the full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
EPA Method 245.2: Mercury (Automated Cold Vapor Technique)
Method 245.2 describes procedures for preparation and analysis of drinking water samples for analysis of mercury using acid digestion and cold vapor atomic absorption. Samples are prepared using an acid digestion technique.
Light stable isotope analysis of meteorites by ion microprobe
NASA Technical Reports Server (NTRS)
Mcsween, Harry Y., Jr.
1994-01-01
The main goal was to develop the necessary secondary ion mass spectrometer (SIMS) techniques to use a Cameca ims-4f ion microprobe to measure light stable isotope ratios (H, C, O and S) in situ and in non-conducting mineral phases. The intended application of these techniques was the analysis of meteorite samples, although the techniques that have been developed are equally applicable to the investigation of terrestrial samples. The first year established techniques for the analysis of O isotope ratios (delta O-18 and delta O-17) in conducting mineral phases and the measurement of S isotope ratios (delta S-34) in a variety of sulphide phases. In addition, a technique was developed to measure delta S-34 values in sulphates, which are insulators. Other research undertaken in the first year resulted in SIMS techniques for the measurement of wide variety of trace elements in carbonate minerals, with the aim of understanding the nature of alteration fluids in carbonaceous chondrites. In the second year we developed techniques for analyzing O isotope ratios in nonconducting mineral phases. These methods are potentially applicable to the measurement of other light stable isotopes such as H, C and S in insulators. Also, we have further explored the analytical techniques used for the analysis of S isotopes in sulphides by analyzing troilite in a number of L and H ordinary chondrites. This was done to see if there was any systematic differences with petrological type.
NASA Astrophysics Data System (ADS)
Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho
2013-10-01
Cost-effective and time-efficient analytical techniques are required to screen large food lots in accordance to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), direct epifluorescent filter technique/the aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used considering their potential applications for the screening analysis of irradiated foods.
NASA Astrophysics Data System (ADS)
Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.
1993-06-01
Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface. An obvious method was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparitive method. For INAA the rest of the sample was then irradiated with thermal neutrons and element concentrations determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.
Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.
Kemp, Pavlina S; VanderVeen, Deborah K
2016-01-01
The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
Automated thermal mapping techniques using chromatic image analysis
NASA Technical Reports Server (NTRS)
Buck, Gregory M.
1989-01-01
Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
Using Machine Learning Techniques in the Analysis of Oceanographic Data
NASA Astrophysics Data System (ADS)
Falcinelli, K. E.; Abuomar, S.
2017-12-01
Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms , test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms , and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational
Developing techniques for cause-responsibility analysis of occupational accidents.
Jabbari, Mousa; Ghorbani, Roghayeh
2016-11-01
The aim of this study was to specify the causes of occupational accidents, determine social responsibility and the role of groups involved in work-related accidents. This study develops occupational accidents causes tree, occupational accidents responsibility tree, and occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques, and for testing them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study involves two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for determination of responsible groups and responsibilities rate. From the results, we find that the management group of construction projects has 74.65% responsibility of work-related accidents. The developed techniques are purposeful for occupational accidents investigation/analysis, especially for the determination of detailed list of tasks, responsibilities, and their rates. Therefore, it is useful for preventing work-related accidents by focusing on the responsible group's duties. Copyright © 2016 Elsevier Ltd. All rights reserved.
A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.
Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir
2017-06-01
This work was destined for 2D crystal growth studies of L-ascorbic acid using the composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed to composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background. The crystal boundaries were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of composite images were in good agreement (correlation coefficient "r" = 0.99) with the lengths measured by standard microscopy. On the contrary, the lengths measured by laser diffraction were poorly correlated with both techniques. Therefore, the composite image analysis can replace the standard microscopy technique for the crystal growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and on the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combination of endoscopic and laparoscopic instruments/optics. Results of our hierarchical task analysis yielded an identification of three different hybrid methods to perform cholecystectomy with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811
ADP of multispectral scanner data for land use mapping
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1971-01-01
The advantages and disadvantages of various remote sensing instrumentation and analysis techniques are reviewed. The use of multispectral scanner data and the automatic data processing techniques are considered. A computer-aided analysis system for remote sensor data is described with emphasis on the image display, statistics processor, wavelength band selection, classification processor, and results display. Advanced techniques in using spectral and temporal data are also considered.
Solid State Audio/Speech Processor Analysis.
1980-03-01
techniques. The techniques were demonstrated to be worthwhile in an efficient realtime AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design , construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD
Estimating Mass of Inflatable Aerodynamic Decelerators Using Dimensionless Parameters
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2011-01-01
This paper describes a technique for estimating mass for inflatable aerodynamic decelerators. The technique uses dimensional analysis to identify a set of dimensionless parameters for inflation pressure, mass of inflation gas, and mass of flexible material. The dimensionless parameters enable scaling of an inflatable concept with geometry parameters (e.g., diameter), environmental conditions (e.g., dynamic pressure), inflation gas properties (e.g., molecular mass), and mass growth allowance. This technique is applicable for attached (e.g., tension cone, hypercone, and stacked toroid) and trailing inflatable aerodynamic decelerators. The technique uses simple engineering approximations that were developed by NASA in the 1960s and 1970s, as well as some recent important developments. The NASA Mars Entry and Descent Landing System Analysis (EDL-SA) project used this technique to estimate the masses of the inflatable concepts that were used in the analysis. The EDL-SA results compared well with two independent sets of high-fidelity finite element analyses.
S-192 analysis: Conventional and special data processing techniques. [Michigan
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.
1975-01-01
The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy was analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved object, processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.
Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques which were determined could meet or be made to meet the requirements. Areas of refinement or changes were recommended for improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of the new technique, the volatile metal chelate analysis. Rivaling the neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
Rasch Analysis for Instrument Development: Why, When, and How?
Boone, William J.
2016-01-01
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to construct “Wright maps” to explain the meaning of a test score or survey score and develop alternative forms of tests and surveys. Rasch techniques provide a mechanism by which the quality of life sciences–related tests and surveys can be optimized and the techniques can be used to provide a context (e.g., what topics a student has mastered) when explaining test and survey results. PMID:27856555
Analysis of Synthetic Polymers.
ERIC Educational Resources Information Center
Smith, Charles G.; And Others
1989-01-01
Reviews techniques for the characterization and analysis of synthetic polymers, copolymers, and blends. Includes techniques for structure determination, separation, and quantitation of additives and residual monomers; determination of molecular weight; and the study of thermal properties including degradation mechanisms. (MVL)
Fabrication Techniques and Principles for Flat Plate Antennas
DOT National Transportation Integrated Search
1973-09-01
The report documents the fabrication techniques and principles selected to produce one and ten million flat plate antennas per year. An engineering analysis of the reliability, electrical integrity, and repeatability is made, and a cost analysis summ...
NASA Astrophysics Data System (ADS)
Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.
2013-07-01
The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. It is here reported a SR μ-CT image analysis approach that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as connectivity of both AM and SCPL scaffolds.
Moini, Mehdi
2018-05-01
In the past few years, there has been a significant effort by the forensic science community to develop new scientific techniques for the analysis of forensic evidence. Forensic chemists have been spearheaded to develop information-rich confirmatory technologies and techniques and apply them to a broad array of forensic challenges. The purpose of these confirmatory techniques is to provide alternatives to presumptive techniques that rely on data such as color changes, pattern matching, or retention time alone, which are prone to more false positives. To this end, the application of separation techniques in conjunction with mass spectrometry has played an important role in the analysis of forensic evidence. Moreover, in the past few years the role of liquid separation techniques, such as liquid chromatography and capillary electrophoresis in conjunction with mass spectrometry, has gained significant tractions and have been applied to a wide range of chemicals, from small molecules such as drugs and explosives, to large molecules such as proteins. For example, proteomics and peptidomics have been used for identification of humans, organs, and bodily fluids. A wide range of HPLC techniques including reversed phase, hydrophilic interaction, mixed-mode, supercritical fluid, multidimensional chromatography, and nanoLC, as well as several modes of capillary electrophoresis mass spectrometry, including capillary zone electrophoresis, partial filling, full filling, and micellar electrokenetic chromatography have been applied to the analysis drugs, explosives, and questioned documents. In this article, we review recent (2015-2017) applications of liquid separation in conjunction with mass spectrometry to the analysis of forensic evidence. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John
2013-05-01
Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complimenting methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users where shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.
2016-01-01
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users where shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Matthew W.
2013-01-01
This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include themore » inherently weak Raman cross section and susceptibility to fluorescence interference.« less
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software system. Improvements of structural analysis software including more efficient use of existing hardware and improved structural modeling techniques are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
Determination of dynamic fracture toughness using a new experimental technique
NASA Astrophysics Data System (ADS)
Cady, Carl M.; Liu, Cheng; Lovato, Manuel L.
2015-09-01
In other studies dynamic fracture toughness has been measured using Charpy impact and modified Hopkinson Bar techniques. In this paper results will be shown for the measurement of fracture toughness using a new test geometry. The crack propagation velocities range from ˜0.15 mm/s to 2.5 m/s. Digital image correlation (DIC) will be the technique used to measure both the strain and the crack growth rates. The boundary of the crack is determined using the correlation coefficient generated during image analysis and with interframe timing the crack growth rate and crack opening can be determined. A comparison of static and dynamic loading experiments will be made for brittle polymeric materials. The analysis technique presented by Sammis et al. [1] is a semi-empirical solution, however, additional Linear Elastic Fracture Mechanics analysis of the strain fields generated as part of the DIC analysis allow for the more commonly used method resembling the crack tip opening displacement (CTOD) experiment. It should be noted that this technique was developed because limited amounts of material were available and crack growth rates were to fast for a standard CTOD method.
Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko
2018-06-01
Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique of extraction of maxillary third molar is introduced in this study-Joedds technique, which is compared with the conventional technique. One hundred people were included in the study, the people were divided into two groups by means of simple random sampling. In one group conventional technique of maxillary third molar extraction was used and on second Joedds technique was used. Statistical analysis was carried out with student's t test. Analysis of 100 patients based on parameters showed that the novel joedds technique had minimal trauma to surrounding tissues, less tuberosity and root fractures and the time taken for extraction was <2 min while compared to other group of patients. This novel technique has proved to be better than conventional third molar extraction technique, with minimal complications. If Proper selection of cases and right technique are used.
Matrix Perturbation Techniques in Structural Dynamics
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1973-01-01
Matrix perturbation are developed techniques which can be used in the dynamical analysis of structures where the range of numerical values in the matrices extreme or where the nature of the damping matrix requires that complex valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering and other fields concerned with the dynamical analysis of large complex structures or systems of second order differential equations. A number of simple examples are included to illustrate the techniques.
Data Unfolding with Wiener-SVD Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, W.; Li, X.; Qian, X.
Here, data unfolding is a common analysis technique used in HEP data analysis. Inspired by the deconvolution technique in the digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal to noise ratios in the effective frequency domain given expectations of signal and noise and is free from regularization parameter. Through a couple examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.
Data Unfolding with Wiener-SVD Method
Tang, W.; Li, X.; Qian, X.; ...
2017-10-04
Here, data unfolding is a common analysis technique used in HEP data analysis. Inspired by the deconvolution technique in the digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal to noise ratios in the effective frequency domain given expectations of signal and noise and is free from regularization parameter. Through a couple examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Laboratory Spectrometer for Wear Metal Analysis of Engine Lubricants.
1986-04-01
analysis, the acid digestion technique for sample pretreatment is the best approach available to date because of its relatively large sample size (1000...microliters or more). However, this technique has two major shortcomings limiting its application: (1) it requires the use of hydrofluoric acid (a...accuracy. Sample preparation including filtration or acid digestion may increase analysis times by 20 minutes or more. b. Repeatability In the analysis
NASA Technical Reports Server (NTRS)
Viezee, W.; Russell, P. B.; Hake, R. D., Jr.
1974-01-01
The matching method of lidar data analysis is explained, and the results from two flights studying the stratospheric aerosol using lidar techniques are summarized and interpreted. Support is lent to the matching method of lidar data analysis by the results, but it is not yet apparent that the analysis technique leads to acceptable results on all nights in all seasons.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining I% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the I to 2 microns range and MDL as low as 0.01 wt. %. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the I% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of the limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
Digital Dental X-ray Database for Caries Screening
NASA Astrophysics Data System (ADS)
Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila
2016-06-01
Standard database is the essential requirement to compare the performance of image analysis techniques. Hence the main issue in dental image analysis is the lack of available image database which is provided in this paper. Periapical dental X-ray images which are suitable for any analysis and approved by many dental experts are collected. This type of dental radiograph imaging is common and inexpensive, which is normally used for dental disease diagnosis and abnormalities detection. Database contains 120 various Periapical X-ray images from top to bottom jaw. Dental digital database is constructed to provide the source for researchers to use and compare the image analysis techniques and improve or manipulate the performance of each technique.
Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse
NASA Astrophysics Data System (ADS)
Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa
1994-01-01
Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence of the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly, in the case of amphetamines where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.
LIBS: a potential tool for industrial/agricultural waste water analysis
NASA Astrophysics Data System (ADS)
Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.
2016-04-01
Laser Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages and has the ability to detect any element in real time. This technique holds a potential for environmental monitoring and various such analysis has been done in soil, glass, paint, water, plastic etc confirms the robustness of this technique for such applications. Compared to the currently available water quality monitoring methods and techniques, LIBS has several advantages, viz. no need for sample preparation, fast and easy operation, and chemical free during the process. In LIBS, powerful pulsed laser generates plasma which is then analyzed to get quantitative and qualitative details of the elements present in the sample. Another main advantage of LIBS technique is that it can perform in standoff mode for real time analysis. Water samples from industries and agricultural strata tend to have a lot of pollutants making it harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When high intensity laser is made incident on the sample, a plasma is generated which gives a multielemental emission spectra. LIBS analysis has shown outstanding success for solids samples. For liquid samples, the analysis is challenging as the liquid sample has the chances of splashing due to the high energy of laser and thus making it difficult to generate plasma. This project also deals with determining the most efficient method for testing of water sample for qualitative as well as quantitative analysis using LIBS.
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
NASA Technical Reports Server (NTRS)
Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.
1982-01-01
Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.
A technique for conducting point pattern analysis of cluster plot stem-maps
C.W. Woodall; J.M. Graham
2004-01-01
Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley`s K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
NASA Technical Reports Server (NTRS)
1974-01-01
The use of optical data processing (ODP) techniques for motion analysis in two-dimensional imagery was studied. The basic feasibility of this approach was demonstrated, but inconsistent performance of the photoplastic used for recording spatial filters prevented totally automatic operation. Promising solutions to the problems encountered are discussed, and it is concluded that ODP techniques could be quite useful for motion analysis.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
Nondestructive evaluation of turbine blades vibrating in resonant modes
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Ahmadshahi, Mansour A.
1991-12-01
The paper presents the analysis of the strain distribution of turbine blades. The holographic moire technique is used in conjunction with computer analysis of the fringes. The application of computer fringe analysis technique reduces the number of holograms to be recorded to two. Stroboscopic illumination is used to record the patterns. Strains and stresses are computed.
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
Multiscale Analysis of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C. A.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with an increased amount of higher complexity data than previous missions. We have improved our view of the Sun yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte scaled images or a sequence of byte scaled difference images. The determination of features and structures in the images are done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occured in the past decade. Many of these methods are possibly suited for solar image analysis. Multiscale/Multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales. So these techniques could be used to quantitify the imaging processing done by the observers eyes and brains. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-06-04
Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage of generating fluorescence by probe hydrolysis and strand displacement over current real-time PCR methods. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification with comparable sensitivity, reproducibility, and repeatability with other real-time PCR methods. The results from GMO quantification, gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acids analysis, and the developed AUDP real-time PCR technique will offer an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.
Wear Debris Analysis of Grease Lubricated Ball Bearings.
1982-04-12
Ferrography method was performed by the Naval Air Engineering Center (NAVAIRENGCEN), Lakehurst, New Jersey. A total of three sets of two 6309 deep-groove ball... Ferrography technique. The analysis of the grease-retained wear debris necessitated the development of a technique to reduce the grease samples to a...condition where they were compatible with the Ferrography technique. A major achievement was the successful application of dissolving the grease
Determination of authenticity of brand perfume using electronic nose prototypes
NASA Astrophysics Data System (ADS)
Gebicki, Jacek; Szulczynski, Bartosz; Kaminski, Marian
2015-12-01
The paper presents the practical application of an electronic nose technique for fast and efficient discrimination between authentic and fake perfume samples. Two self-built electronic nose prototypes equipped with a set of semiconductor sensors were employed for that purpose. Additionally 10 volunteers took part in the sensory analysis. The following perfumes and their fake counterparts were analysed: Dior—Fahrenheit, Eisenberg—J’ose, YSL—La nuit de L’homme, 7 Loewe and Spice Bomb. The investigations were carried out using the headspace of the aqueous solutions. Data analysis utilized multidimensional techniques: principle component analysis (PCA), linear discrimination analysis (LDA), k-nearest neighbour (k-NN). The results obtained confirmed the legitimacy of the electronic nose technique as an alternative to the sensory analysis as far as the determination of authenticity of perfume is concerned.
The Utility of Template Analysis in Qualitative Psychology Research.
Brooks, Joanna; McCluskey, Serena; Turley, Emma; King, Nigel
2015-04-03
Thematic analysis is widely used in qualitative psychology research, and in this article, we present a particular style of thematic analysis known as Template Analysis. We outline the technique and consider its epistemological position, then describe three case studies of research projects which employed Template Analysis to illustrate the diverse ways it can be used. Our first case study illustrates how the technique was employed in data analysis undertaken by a team of researchers in a large-scale qualitative research project. Our second example demonstrates how a qualitative study that set out to build on mainstream theory made use of the a priori themes (themes determined in advance of coding) permitted in Template Analysis. Our final case study shows how Template Analysis can be used from an interpretative phenomenological stance. We highlight the distinctive features of this style of thematic analysis, discuss the kind of research where it may be particularly appropriate, and consider possible limitations of the technique. We conclude that Template Analysis is a flexible form of thematic analysis with real utility in qualitative psychology research.
Wang, Chuji; Sahay, Peeyush
2009-01-01
Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503
DOT National Transportation Integrated Search
1980-01-01
This project was undertaken for the Virginia Department of Transportation Safety to assess the feasibility of implementing the Data Analysis and Reporting Techniques (DART) computer software system in Virginia. Following a review of available literat...
Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto
2013-01-01
Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793
Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Plutonium-238 and plutonium-239 in water and air filters Method Selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • Rapid methods for acid or fusion digestion • Rapid Radiochemical Method for Plutonium-238 and Plutonium 239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana M F; Silva, Rosângela; Souza, Sheila Mendonça de; Araujo, Adauto
2013-04-01
Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of the three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae) respectively. After sampling, extrapolating real world population numbers from trap capture data are possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
Uranium Detection - Technique Validation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.
As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ ‘Accelerator Technology’ we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrixes. The FY15 work scope incorporated technical development that would improve accuracy, specificity, linearity & range, precision & ruggedness, and comparative analysis. Significant progress was achieved throughout FY 15 addressing these technical challenges, as is summarized in this report. In addition, comparative analysis of unknown samples usingmore » the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of this presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectra in 1 M H 2SO 4, at λ max = 419.5 nm.« less
Recent advances in capillary electrophoretic migration techniques for pharmaceutical analysis.
Deeb, Sami El; Wätzig, Hermann; El-Hady, Deia Abd; Albishri, Hassan M; de Griend, Cari Sänger-van; Scriba, Gerhard K E
2014-01-01
Since the introduction about 30 years ago, CE techniques have gained a significant impact in pharmaceutical analysis. The present review covers recent advances and applications of CE for the analysis of pharmaceuticals. Both small molecules and biomolecules such as proteins are considered. The applications range from the determination of drug-related substances to the analysis of counterions and the determination of physicochemical parameters. Furthermore, general considerations of CE methods in pharmaceutical analysis are described. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Economou, Anastasios
2018-01-01
This work reviews the field of screen-printed electrodes (SPEs) modified with “green” metals for electrochemical stripping analysis of toxic elements. Electrochemical stripping analysis has been established as a useful trace analysis technique offering many advantages compared to competing optical techniques. Although mercury has been the preferred electrode material for stripping analysis, the toxicity of mercury and the associated legal requirements in its use and disposal have prompted research towards the development of “green” metals as alternative electrode materials. When combined with the screen-printing technology, such environment-friendly metals can lead to disposable sensors for trace metal analysis with excellent operational characteristics. This review focuses on SPEs modified with Au, Bi, Sb, and Sn for stripping analysis of toxic elements. Different modification approaches (electroplating, bulk modification, use of metal precursors, microengineering techniques) are considered and representative applications are described. A developing related field, namely biosensing based on stripping analysis of metallic nanoprobe labels, is also briefly mentioned. PMID:29596391
Economou, Anastasios
2018-03-29
This work reviews the field of screen-printed electrodes (SPEs) modified with "green" metals for electrochemical stripping analysis of toxic elements. Electrochemical stripping analysis has been established as a useful trace analysis technique offering many advantages compared to competing optical techniques. Although mercury has been the preferred electrode material for stripping analysis, the toxicity of mercury and the associated legal requirements in its use and disposal have prompted research towards the development of "green" metals as alternative electrode materials. When combined with the screen-printing technology, such environment-friendly metals can lead to disposable sensors for trace metal analysis with excellent operational characteristics. This review focuses on SPEs modified with Au, Bi, Sb, and Sn for stripping analysis of toxic elements. Different modification approaches (electroplating, bulk modification, use of metal precursors, microengineering techniques) are considered and representative applications are described. A developing related field, namely biosensing based on stripping analysis of metallic nanoprobe labels, is also briefly mentioned.
On the Power of Abstract Interpretation
NASA Technical Reports Server (NTRS)
Reddy, Uday S.; Kamin, Samuel N.
1991-01-01
Increasingly sophisticated applications of static analysis place increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information my mean that the time or space complexity of the generated code would be altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Selur et. al. who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis that of Sekar et. al. and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Selur et. al.. Hence, our proofs are much simpler. They should be useful for further future improvements.
A novel pulse height analysis technique for nuclear spectroscopic and imaging systems
NASA Astrophysics Data System (ADS)
Tseng, H. H.; Wang, C. Y.; Chou, H. P.
2005-08-01
The proposed pulse height analysis technique is based on the constant and linear relationship between pulse width and pulse height generated from front-end electronics of nuclear spectroscopic and imaging systems. The present technique has successfully implemented into the sump water radiation monitoring system in a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of Radon daughters brought down by rain. The technique is also used for a nuclear medical imaging system. The system uses a position sensitive photomultiplier tube coupled with a scintillator. The proposed techniques has greatly simplified the electronic design and made the system a feasible one for potable applications.
A simple white noise analysis of neuronal light responses.
Chichilnisky, E J
2001-05-01
A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
NASA Technical Reports Server (NTRS)
Behbehani, K.
1980-01-01
A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquest Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed and the effects of model degradation are studied.
A study of data analysis techniques for the multi-needle Langmuir probe
NASA Astrophysics Data System (ADS)
Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.
2018-06-01
In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to be better than measurements from incoherent scatter radar and in situ instruments, m-NLPs can be longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics are emphasized as an important role in quality control and reliability. Consequently, Trend Analysis Techniques recommended a variety of statistical methodologies that could be applied to time series data. The major goal of the working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard, some different techniques, and to notice patterns of data. Techniques for trend estimation used are: regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation is necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
NASA Astrophysics Data System (ADS)
Kajiya, E. A. M.; Campos, P. H. O. V.; Rizzutto, M. A.; Appoloni, C. R.; Lopes, F.
2014-02-01
This paper presents systematic studies and analysis that contributed to the identification of the forgery of a work by the artist Emiliano Augusto Cavalcanti de Albuquerque e Melo, known as Di Cavalcanti. The use of several areas of expertise such as brush stroke analysis ("pinacologia"), applied physics, and art history resulted in an accurate diagnosis for ascertaining the authenticity of the work entitled "Violeiro" (1950). For this work we used non-destructive methods such as techniques of infrared, ultraviolet, visible and tangential light imaging combined with chemical analysis of the pigments by portable X-Ray Fluorescence (XRF) and graphic gesture analysis. Each applied method of analysis produced specific information that made possible the identification of materials and techniques employed and we concluded that this work is not consistent with patterns characteristic of the artist Di Cavalcanti.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaBelle, S.J.; Smith, A.E.; Seymour, D.A.
1977-02-01
The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions, cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact onmore » regional desired emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well-suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to change in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method by state or regional reviewing agencies.« less
Regional environmental analysis and management: New techniques for current problems
NASA Technical Reports Server (NTRS)
Honea, R. B.; Paludan, C. T. N.
1974-01-01
Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.
Another Look at the Power of Meta-Analysis in the Solomon Four-Group Design.
ERIC Educational Resources Information Center
Sawilowsky, Shlomo S.; Markman, Barry S.
This paper demonstrates that a meta-analysis technique applied to the Solomon Four-Group Design (SFGD) can fail to find significance even though an earlier "weaker" test may have found significance. The meta-analysis technique was promoted by Braver and Braver as the most powerful single test for analyzing data from an SFGD. They…
ERIC Educational Resources Information Center
Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi
2006-01-01
This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…
Multidimensional chromatography in food analysis.
Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose
2009-10-23
In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2008-01-01
Qualitative researchers in school psychology have a multitude of analyses available for data. The purpose of this article is to present several of the most common methods for analyzing qualitative data. Specifically, the authors describe the following 18 qualitative analysis techniques: method of constant comparison analysis, keywords-in-context,…
ERIC Educational Resources Information Center
Mosier, Nancy R.
Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…
ERIC Educational Resources Information Center
Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.
Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…
The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
2010-05-01
This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classi?er is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classi?cation accuracy as the number of dimensions is increased.
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
investigation of the measure- ment of frequency band average loss factors of structural components for use in the statistical energy analysis method of...stiffness. Matrix methods Key Words: Finite element technique. Statistical energy analysis . Experimental techniques. Framed structures, Com- puter...programs In order to further understand the practical application of the statistical energy analysis , a two section plate-like frame structure is
Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach
ERIC Educational Resources Information Center
Lending, Diane; May, Jeffrey
2013-01-01
Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…
Maione, Camila; Barbosa, Rommel Melgaço
2018-01-24
Rice is one of the most important staple foods around the world. Authentication of rice is one of the most addressed concerns in the present literature, which includes recognition of its geographical origin and variety, certification of organic rice and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality, yield and others. This paper brings a review of the recent research projects on discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and others are promising sources of information regarding geographical origin, variety and other aspects of rice, being widely used combined with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other data classification techniques such as support vector machines, artificial neural networks and others are also frequently present in some studies and show high performance for discrimination of rice.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1995-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
Automatic differentiation evaluated as a tool for rotorcraft design and optimization
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.; Young, Katherine C.
1995-01-01
This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables, the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation, this method produces derivatives to machine accuracy at a cost that is comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact within machine accuracy and do not depend on the selection of step-size, as are the derivatives obtained with finite-differencing techniques.
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft s power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn s internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... overfilling and spill, or overpressurization and venting through relief valves or rupture disks; and (v... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as...
New Results in Software Model Checking and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2010-01-01
This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.
Analysis of Gold Ores by Fire Assay
ERIC Educational Resources Information Center
Blyth, Kristy M.; Phillips, David N.; van Bronswijk, Wilhelm
2004-01-01
Students of an Applied Chemistry degree course carried out a fire-assay exercise. The analysis showed that the technique was a worthwhile quantitative analytical technique and covered interesting theory including acid-base and redox chemistry and other concepts such as inquarting and cupelling.
Separation and Analysis of Citral Isomers.
ERIC Educational Resources Information Center
Sacks, Jeff; And Others
1983-01-01
Provides background information, procedures, and results of an experiments designed to introduce undergraduates to the technique of steam distillation as a means of isolating thermally sensitive compounds. Chromatographic techniques (HPLC) and mass spectrometric analysis are used in the experiment which requires three laboratory periods. (JN)
A CHARTING TECHNIQUE FOR THE ANALYSIS OF BUSINESS SYSTEMS,
This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)
Fault detection in digital and analog circuits using an i(DD) temporal analysis technique
NASA Technical Reports Server (NTRS)
Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark
1993-01-01
An i(sub DD) temporal analysis technique which is used to detect defects (faults) and fabrication variations in both digital and analog IC's by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents is presented. A simple bias voltage is required for all the inputs, to excite the defects. Data from hardware tests supporting this technique are presented.
1976-09-01
The purpose of this research effort was to determine the financial management educational needs of USAF graduate logistics positions. Goal analysis...was used to identify financial management techniques and task analysis was used to develop a method to identify the use of financial management techniques...positions. The survey identified financial management techniques in five areas: cost accounting, capital budgeting, working capital, financial forecasting, and programming. (Author)
A Sensitivity Analysis of Circular Error Probable Approximation Techniques
1992-03-01
SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES THESIS Presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two ret(st accuratec techniiques require numerical integration and can take several hours to run ov a personal comlputer [2:1-2,4-6]. Some
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
Application of pattern recognition techniques to crime analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.
1976-08-15
The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)
Presentation-Oriented Visualization Techniques.
Kosara, Robert
2016-01-01
Data visualization research focuses on data exploration and analysis, yet the vast majority of visualizations people see were created for a different purpose: presentation. Whether we are talking about charts showing data to help make a presenter's point, data visuals created to accompany a news story, or the ubiquitous infographics, many more people consume charts than make them. Traditional visualization techniques treat presentation as an afterthought, but are there techniques uniquely suited to data presentation but not necessarily ideal for exploration and analysis? This article focuses on presentation-oriented techniques, considering their usefulness for presentation first and any other purposes as secondary.
Marcos-Garcés, V; Harvat, M; Molina Aguilar, P; Ferrández Izquierdo, A; Ruiz-Saurí, A
2017-08-01
Measurement of collagen bundle orientation in histopathological samples is a widely used and useful technique in many research and clinical scenarios. Fourier analysis is the preferred method for performing this measurement, but the most appropriate staining and microscopy technique remains unclear. Some authors advocate the use of Haematoxylin-Eosin (H&E) and confocal microscopy, but there are no studies comparing this technique with other classical collagen stainings. In our study, 46 human skin samples were collected, processed for histological analysis and stained with Masson's trichrome, Picrosirius red and H&E. Five microphotographs of the reticular dermis were taken with a 200× magnification with light microscopy, polarized microscopy and confocal microscopy, respectively. Two independent observers measured collagen bundle orientation with semiautomated Fourier analysis with the Image-Pro Plus 7.0 software and three independent observers performed a semiquantitative evaluation of the same parameter. The average orientation for each case was calculated with the values of the five pictures. We analyzed the interrater reliability, the consistency between Fourier analysis and average semiquantitative evaluation and the consistency between measurements in Masson's trichrome, Picrosirius red and H&E-confocal. Statistical analysis for reliability and agreement was performed with the SPSS 22.0 software and consisted of intraclass correlation coefficient (ICC), Bland-Altman plots and limits of agreement and coefficient of variation. Interrater reliability was almost perfect (ICC > 0.8) with all three histological and microscopy techniques and always superior in Fourier analysis than in average semiquantitative evaluation. Measurements were consistent between Fourier analysis by one observer and average semiquantitative evaluation by three observers, with an almost perfect agreement with Masson's trichrome and Picrosirius red techniques (ICC > 0.8) and a strong agreement with H&E-confocal (0.7 < ICC < 0.8). Comparison of measurements between the three techniques for the same observer showed an almost perfect agreement (ICC > 0.8), better with Fourier analysis than with semiquantitative evaluation (single and average). These results in nonpathological skin samples were also confirmed in a preliminary analysis in eight scleroderma skin samples. Our results show that Masson's trichrome and Picrosirius red are consistent with H&E-confocal for measuring collagen bundle orientation in histological samples and could thus be used indistinctly for this purpose. Fourier analysis is superior to average semiquantitative evaluation and should keep being used as the preferred method. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Determining Kinetic Parameters for Isothermal Crystallization of Glasses
NASA Technical Reports Server (NTRS)
Ray, C. S.; Zhang, T.; Reis, S. T.; Brow, R. K.
2006-01-01
Non-isothermal crystallization techniques are frequently used to determine the kinetic parameters for crystallization in glasses. These techniques are experimentally simple and quick compared to the isothermal techniques. However, the analytical models used for non-isothermal data analysis, originally developed for describing isothermal transformation kinetics, are fundamentally flawed. The present paper describes a technique for determining the kinetic parameters for isothermal crystallization in glasses, which eliminates most of the common problems that generally make the studies of isothermal crystallization laborious and time consuming. In this technique, the volume fraction of glass that is crystallized as a function of time during an isothermal hold was determined using differential thermal analysis (DTA). The crystallization parameters for the lithium-disilicate (Li2O.2SiO2) model glass were first determined and compared to the same parameters determined by other techniques to establish the accuracy and usefulness of the present technique. This technique was then used to describe the crystallization kinetics of a complex Ca-Sr-Zn-silicate glass developed for sealing solid oxide fuel cells.
Evaluation of methods for rapid determination of freezing point of aviation fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1982-01-01
Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
Enamel paint techniques in archaeology and their identification using XRF and micro-XRF
NASA Astrophysics Data System (ADS)
Hložek, M.; Trojek, T.; Komoróczy, B.; Prokeš, R.
2017-08-01
This investigation focuses in detail on the analysis of discoveries in South Moravia - important sites from the Roman period in Pasohlávky and Mušov. Using X-ray fluorescence analysis and micro-analysis we help identify the techniques of enamel paint and give a thorough chemical analysis in details which would not be possible to determine by means of macroscopic examination. We thus address the influence of elemental composition on the final colour of the enamel paint and describe the less known technique of combining enamel with millefiori. The material analyses of the metal artefacts decorated with enamel paint significantly contribute to our knowledge of the technology being used during the Roman period.
Medvedev, Nickolay S; Shaverina, Anastasiya V; Tsygankova, Alphiya R; Saprykin, Anatoly I
2016-08-01
The paper presents a combined technique of germanium dioxide analysis by inductively coupled plasma atomic emission spectrometry (ICP-AES) with preconcentration of trace elements by distilling off matrix and electrothermal (ETV) introduction of the trace elements concentrate into the ICP. Evaluation of metrological characteristics of the developed technique of high-purity germanium dioxide analysis was performed. The limits of detection (LODs) for 25 trace elements ranged from 0.05 to 20ng/g. The accuracy of proposed technique is confirmed by "added-found" («or spiking») experiment and comparing the results of ETV-ICP-AES and ICP-AES analysis of high purity germanium dioxide samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Factor weighting in DRASTIC modeling.
Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F
2015-02-01
Evaluation of aquifer vulnerability comprehends the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to insecurity in the selection of an appropriate technique. This paper reports the comparison of 5 weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because they are 20 points wide. Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional expertise to be set up satisfactorily. Following a general criterion that weights must be proportional to the range of the ratings, Correspondence Analysis may be recommended as the best adjustment technique. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Hall, David G.; Heidelberg, Laurence; Konno, Kevin
1993-01-01
The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.
NASA Technical Reports Server (NTRS)
Hall, David G.; Heidelberg, Laurence; Konno, Kevin
1993-01-01
The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.
Application of a novel new multispectral nanoparticle tracking technique
NASA Astrophysics Data System (ADS)
McElfresh, Cameron; Harrington, Tyler; Vecchio, Kenneth S.
2018-06-01
Fast, reliable, and accurate particle size analysis techniques must meet the demands of evolving industrial and academic research in areas of functionalized nanoparticle synthesis, advanced materials development, and other nanoscale enabled technologies. In this study a new multispectral particle tracking analysis (m-PTA) technique enabled by the ViewSizer™ 3000 (MANTA Instruments, USA) was evaluated using solutions of monomodal and multimodal gold and polystyrene latex nanoparticles, as well as a spark eroded polydisperse 316L stainless steel nanopowder, and large (non-Brownian) borosilicate particles. It was found that m-PTA performed comparably to the DLS in evaluation of monomodal particle size distributions. When measuring bimodal, trimodal and polydisperse solutions, the m-PTA technique overwhelmingly outperformed traditional dynamic light scattering (DLS) in both peak detection and relative particle concentration analysis. It was also observed that the m-PTA technique is less susceptible to large particle overexpression errors. The ViewSizer™ 3000 was also found to be successful in accurately evaluating sizes and concentrations of monomodal and bimodal sinking borosilicate particles.
NASA Astrophysics Data System (ADS)
Sinha, Mangalika; Modi, Mohammed H.
2017-10-01
In-depth compositional analysis of 240 Å thick aluminium oxide thin film has been carried out using soft x-ray reflectivity (SXR) and x-ray photoelectron spectroscopy technique (XPS). The compositional details of the film is estimated by modelling the optical index profile obtained from the SXR measurements over 60-200 Å wavelength region. The SXR measurements are carried out at Indus-1 reflectivity beamline. The method suggests that the principal film region is comprised of Al2O3 and AlOx (x = 1.6) phases whereas the interface region comprised of SiO2 and AlOx (x = 1.6) mixture. The soft x-ray reflectivity technique combined with XPS measurements explains the compositional details of principal layer. Since the interface region cannot be analyzed with the XPS technique in a non-destructive manner in such a case the SXR technique is a powerful tool for nondestructive compositional analysis of interface region.
NASA Astrophysics Data System (ADS)
Avitabile, Peter; O'Callahan, John
2009-01-01
Generally, response analysis of systems containing discrete nonlinear connection elements such as typical mounting connections require the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response analysis solution. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components and their individual behavior may essentially be linear when compared to the total assembled system. However, the joining of these linear subsystems using highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis since much effort has usually been expended in fine tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques given as: equivalent reduced model technique (ERMT);modal modification response technique (MMRT); andcomponent element method (CEM); are presented in this paper and are compared to traditional methods.
Analysis of intracranial pressure: past, present, and future.
Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D
2013-12-01
The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although they largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and explore the potential of using advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique and we propose that nonlinear analysis could be a reliable approach to describe ICP signals over time, with the fractal dimension as a potential predictive clinically meaningful biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate to the clinical utility of a tool capable of simplifying multiple variables obtained from various sensors.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.
ERIC Educational Resources Information Center
Anderson, Margaret A.
1986-01-01
Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)
Behavior Analysis: Methodological Foundations.
ERIC Educational Resources Information Center
Owen, James L.
Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline…
Classroom Dialogue and Science Achievement.
ERIC Educational Resources Information Center
Clarke, John A.
This study reports the application to classroom dialogue of the Thematic and Structural Analysis (TSA) Technique which has been used previously in the analysis of text materials. The TSA Technique identifies themes (word clusters) and their structural relationship throughout sequentially organized material. Dialogues from four Year 8 science…
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among others techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we proposed a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG-welding using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20msec. running in a conventional PC.
A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin McCarthy; Milos Manic
Data Fusion requires the ability to combine or “fuse” date from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presentsmore » an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.« less
Schnabel, Thomas; Musso, Maurizio; Tondi, Gianluca
2014-01-01
Vibrational spectroscopy is one of the most powerful tools in polymer science. Three main techniques--Fourier transform infrared spectroscopy (FT-IR), FT-Raman spectroscopy, and FT near-infrared (NIR) spectroscopy--can also be applied to wood science. Here, these three techniques were used to investigate the chemical modification occurring in wood after impregnation with tannin-hexamine preservatives. These spectroscopic techniques have the capacity to detect the externally added tannin. FT-IR has very strong sensitivity to the aromatic peak at around 1610 cm(-1) in the tannin-treated samples, whereas FT-Raman reflects the peak at around 1600 cm(-1) for the externally added tannin. This high efficacy in distinguishing chemical features was demonstrated in univariate analysis and confirmed via cluster analysis. Conversely, the results of the NIR measurements show noticeable sensitivity for small differences. For this technique, multivariate analysis is required and with this chemometric tool, it is also possible to predict the concentration of tannin on the surface.
NASA Astrophysics Data System (ADS)
Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.
2016-01-01
In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
[The progress in speciation analysis of trace elements by atomic spectrometry].
Wang, Zeng-Huan; Wang, Xu-Nuo; Ke, Chang-Liang; Lin, Qin
2013-12-01
The main purpose of the present work is to review the different non-chromatographic methods for the speciation analysis of trace elements in geological, environmental, biological and medical areas. In this paper, the sample processing methods in speciation analysis were summarized, and the main strategies for non-chromatographic technique were evaluated. The basic principles of the liquid extractions proposed in the published literatures recently and their advantages and disadvantages were discussed, such as conventional solvent extraction, cloud point extraction, single droplet microextraction, and dispersive liquid-liquid microextraction. Solid phase extraction, as a non-chromatographic technique for speciation analysis, can be used in batch or in flow detection, and especially suitable for the online connection to atomic spectrometric detector. The developments and applications of sorbent materials filled in the columns of solid phase extraction were reviewed. The sorbents include chelating resins, nanometer materials, molecular and ion imprinted materials, and bio-sorbents. Other techniques, e. g. hydride generation technique and coprecipitation, were also reviewed together with their main applications.
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, Joseph M.
1988-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or finite element analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid frequencies between the optimum frequency regimes for SEA and FEA. Power flow analysis has in general been used on 1-D beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to 2-D plate-like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA results at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or Finite Element Analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid- frequencies between the optimum frequency regimes for FEA and SEA. Power flow analysis has in general been used on one-dimensional beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to two-dimensional plate like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1972-01-01
Author identified significant preliminary results from the Ouachita portion of the Texoma frame of data indicate many potentials in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography to particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining particularly meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained to be particularly significant and meaningful. The analysis of the Texoma frame has also been significant from the standpoint of demonstrating a fast turn around analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.
Reduction and analysis of data collected during the electromagnetic tornado experiment
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1976-01-01
Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks as biological and biomedical discovery, biotechnology and drug development. However, as it happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, from the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine
Through assay analysis into an excess of 1 M H 2SO 4 at fixed temperature a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8 – 13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well characterized uranyl(VI) solution samples to within 0.3% ofmore » the DG results. For unknown uranium solutions in which sample purity is less well defined agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solution, 1 M HNO 3, HCl and Na 2CO 3, has also been shown to be viable. As a result, of the four aqueous media the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.« less
The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions
Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine; ...
2017-07-18
Through assay analysis into an excess of 1 M H 2SO 4 at fixed temperature a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8 – 13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well characterized uranyl(VI) solution samples to within 0.3% ofmore » the DG results. For unknown uranium solutions in which sample purity is less well defined agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solution, 1 M HNO 3, HCl and Na 2CO 3, has also been shown to be viable. As a result, of the four aqueous media the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.« less
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-01-01
Background Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage of generating fluorescence by probe hydrolysis and strand displacement over current real-time PCR methods. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification with comparable sensitivity, reproducibility, and repeatability with other real-time PCR methods. Conclusion The results from GMO quantification, gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acids analysis, and the developed AUDP real-time PCR technique will offer an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar, K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software-dependability, and fault diagnosis. The discussion involves several important issues studies in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
Umeta, Ricardo S G; Avanzi, Osmar
2011-07-01
Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome were selected. Identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed to the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and less perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and posterolateral approach. Both techniques were comparable in final outcome, but the former presented better rates of fusion and the latter the less complications. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Carden, J. L.; Browner, R.
1982-01-01
The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because it is difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate performance of replicated distributed database with both shared and exclusive locks.
Price Analysis on Commercial Item Purchases within the Department of the Navy
2015-02-05
to paying more than a fair and reasonable price for products and services can be attributed to the shortage of qualified contract personnel, the...was fair and reasonable. Sometimes, due to exigent situations, supplies or services are purchased even though an adequate price or cost 4 6 7 0...Supplies- Price Analysis Techniques used ................................................ 25 Figure 10: Services - Price Analysis Techniques used
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field samples focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equal technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990ês application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neutral networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinates of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM and the technique promises to be a useful tool for assigning ecological function to microbial populations.
Numerical analysis of thermal drilling technique on titanium sheet metal
NASA Astrophysics Data System (ADS)
Kumar, R.; Hynes, N. Rajesh Jesudoss
2018-05-01
Thermal drilling is a technique used in drilling of sheet metal for various applications. It involves rotating conical tool with high speed in order to drill the sheet metal and formed a hole with bush below the surface of sheet metal. This article investigates the finite element analysis of thermal drilling on Ti6Al4Valloy sheet metal. This analysis was carried out by means of DEFORM-3D simulation software to simulate the performance characteristics of thermal drilling technique. Due to the contribution of high temperature deformation in this technique, the output performances which are difficult to measure by the experimental approach, can be successfully achieved by finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution and temperature of the workpiece.
Practical issues of hyperspectral imaging analysis of solid dosage forms.
Amigo, José Manuel
2010-09-01
Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis
NASA Technical Reports Server (NTRS)
Lindstrom, D. G.; Normand, E.; Wilcox, A. D.
1972-01-01
In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.
Determination of fiber volume in graphite/epoxy materials using computer image analysis
NASA Technical Reports Server (NTRS)
Viens, Michael J.
1990-01-01
The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
Structural study of Cu(II) complexes with benzo[b]furancarboxylic acids
NASA Astrophysics Data System (ADS)
Kalinowska, Diana; Klepka, Marcin T.; Wolska, Anna; Drzewiecka-Antonik, Aleksandra; Ostrowska, Kinga; Struga, Marta
2017-11-01
Four Cu(II) complexes with 2- and 3-benzo[b]furancarboxylic acids have been synthesized and characterized using combination of two spectroscopic techniques. These techniques were: (i) FTIR and (ii) XAFS. FTIR analysis confirmed that complexes were formed and gave insight into identification of possible coordinating groups to the metallic center. XANES analysis indicated that the oxidation state of Cu is +2. EXAFS analysis allowed to identify that the first coordination sphere is formed by 4-5 oxygen atoms with the Cu-O distances around 2 Å. Combining these techniques it was possible to structurally describe novel Cu(II) complexes with benzo[b]furancarboxylic acids.
Cullen, Jared; Lobo, Charlene J; Ford, Michael J; Toth, Milos
2015-09-30
Electron-beam-induced deposition (EBID) is a direct-write chemical vapor deposition technique in which an electron beam is used for precursor dissociation. Here we show that Arrhenius analysis of the deposition rates of nanostructures grown by EBID can be used to deduce the diffusion energies and corresponding preexponential factors of EBID precursor molecules. We explain the limitations of this approach, define growth conditions needed to minimize errors, and explain why the errors increase systematically as EBID parameters diverge from ideal growth conditions. Under suitable deposition conditions, EBID can be used as a localized technique for analysis of adsorption barriers and prefactors.
NASA Astrophysics Data System (ADS)
Grassi, N.
2005-06-01
In the framework of the extensive study on the wood painting "Madonna dei fusi" attributed to Leonardo da Vinci, Ion Beam Analysis (IBA) techniques were used at the Florence accelerator laboratory to get information about the elemental composition of the paint layers. After a brief description of the basic principle and the general features of IBA techniques, we will illustrate in detail how the analysis allowed us to characterise the pigments of original and restored areas and the substrate composition, and to obtain information about the stratigraphy of the painting, also providing an estimate of the paint layer thickness.
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-o errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-o Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.
Summers, A E
2000-01-01
ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
NASA Astrophysics Data System (ADS)
Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.
2012-06-01
Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).
Edge compression techniques for visualization of dense directed graphs.
Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher
2013-12-01
We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules'-or groups of nodes-such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition which permits internal structure in modules and allows them to be nested; and Power Graph Analysis which further allows edges to cross module boundaries. These techniques all have the same goal--to compress the set of edges that need to be rendered to fully convey connectivity--but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothetical trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by--and discuss in particular--the application to software dependency analysis.
DART-MS: A New Analytical Technique for Forensic Paint Analysis.
Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice
2018-06-05
Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry ( py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation affords a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded from this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool using both published articles and explanations of components of the technique. We describe what meta-analysis is, what heterogeneity is, and how it affects meta-analysis, effect size, the modeling techniques of meta-analysis, and strengths and weaknesses of meta-analysis. Common components like forest plot interpretation, software that may be used, special cases for meta-analysis, such as subgroup analysis, individual patient data, and meta-regression, and a discussion of criticisms, are included.
Comparison of point intercept and image analysis for monitoring rangeland transects
USDA-ARS?s Scientific Manuscript database
Amidst increasing workloads and static or declining budgets, both public and private land management agencies face the need to adapt resource-monitoring techniques or risk falling behind on resource monitoring volume and quality with old techniques. Image analysis of nadir plot images, acquired with...
DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES
A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (puc 19 plasmid from E. coli). DN...
Secondary Analysis of Qualitative Data.
ERIC Educational Resources Information Center
Turner, Paul D.
The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…
Code of Federal Regulations, 2012 CFR
2012-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
Code of Federal Regulations, 2010 CFR
2010-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
Code of Federal Regulations, 2011 CFR
2011-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
Developing Scenarios: Linking Environmental Scanning and Strategic Planning.
ERIC Educational Resources Information Center
Whiteley, Meredith A.; And Others
1990-01-01
The multiple scenario analysis technique for organizational planning used by multinational corporations is adaptable for colleges and universities. Arizona State University launched a futures-based planning project using the Delphi technique and cross-impact analysis to produce three alternative scenarios (stable, turbulent, and chaotic) to expand…
Fourier Spectroscopy: A Simple Analysis Technique
ERIC Educational Resources Information Center
Oelfke, William C.
1975-01-01
Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
NASA Astrophysics Data System (ADS)
Sait, Abdulrahman S.
This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on gear tooth profile of spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understating of the effects of damage on the IAS by analyzing the digital signals output of rotary incremental optical encoder. A comprehensive literature review of state of the knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox system is presented. Progress and new developments over the past 30 years in failure detection techniques of rotating machinery including engines, bearings and gearboxes are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domains using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders to gear fault detection techniques is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
Identification of groundwater nitrate sources in pre-alpine catchments: a multi-tracer approach
NASA Astrophysics Data System (ADS)
Stoewer, Myriam; Stumpp, Christine
2014-05-01
Porous aquifers in pre-alpine areas are often used as drinking water resources due to their good water quality status and water yield. Maintaining these resources requires knowledge about possible sources of pollutants and a sustainable management practice in groundwater catchment areas. Of particular interest in agricultural areas, like in pre-alpine regions, is limiting nitrate input as main groundwater pollutant. Therefore, the objective of the presented study is i) to identify main nitrate sources in a pre-alpine groundwater catchment with current low nitrate concentration using stable isotopes of nitrate (d18O and d15N) and ii) to investigate seasonal dynamics of nitrogen compounds. The groundwater catchment areas of four porous aquifers are located in Southern Germany. Most of the land use is organic grassland farming as well as forestry and residential area. Thus, potential sources of nitrate mainly are mineral fertilizer, manure/slurry, leaking sewage system and atmospheric deposition of nitrogen compounds. Monthly freshwater samples (precipitation, river water and groundwater) are analysed for stable isotope of water (d2H, d18O), the concentration of major anions and cations, electrical conductivity, water temperature, pH and oxygen. In addition, isotopic analysis of d18O-NO3- and d15N-NO3- for selected samples is carried out using the denitrifier method. In general, all groundwater samples were oxic (10.0±2.6mg/L) and nitrate concentrations were low (0.2 - 14.6mg/L). The observed nitrate isotope values in the observation area compared to values from local precipitation, sewage, manure and mineral fertilizer as well as to data from literature shows that the nitrate in freshwater samples is of microbial origin. Nitrate derived from ammonium in fertilizers and precipitation as well as from soil nitrogen. It is suggested that a major potential threat to the groundwater quality is ammonia and ammonium at a constant level mainly from agriculture activities as well as continuously release of nitrogen stored in agricultural soils due to mineralization processes. In all groundwater and river water samples a seasonal variation of nitrate sources and concentration is absent but nitrate in precipitation shows a clear seasonal variation with peaks in spring and fall according to agricultural activity. This points to dilution effects of high nitrate inputs due to the large groundwater volume and mean residence time and highlights the function of soil as initial sink for nitrogen compounds delivered by fertilizer. Even though nitrate contamination was low in the study area, the results emphasize the importance of reducing additional nitrate sources in pre-alpine oxic aquifers. This will maintain the good water quality status of the aquifers and enable its use for drinking water supply.
Kernel analysis in TeV gamma-ray selection
NASA Astrophysics Data System (ADS)
Moriarty, P.; Samuelson, F. W.
2000-06-01
We discuss the use of kernel analysis as a technique for selecting gamma-ray candidates in Atmospheric Cherenkov astronomy. The method is applied to observations of the Crab Nebula and Markarian 501 recorded with the Whipple 10 m Atmospheric Cherenkov imaging system, and the results are compared with the standard Supercuts analysis. Since kernel analysis is computationally intensive, we examine approaches to reducing the computational load. Extension of the technique to estimate the energy of the gamma-ray primary is considered. .
Qualitative computer aided evaluation of dental impressions in vivo.
Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H
2006-01-01
Clinical investigations dealing with the precision of different impression techniques are rare. Objective of the present study was to develop and evaluate a procedure for the qualitative analysis of the three-dimensional impression precision based on an established in-vitro procedure. The zero hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase and two-step-techniques) and on clinical variables. Digital surface data of patient's teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data-sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data-points within the 99.5%-range. Based on the color-coded representation areas with maximum deviations were determined (preparation margin and the mantle and occlusal surface). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated measures factors (p < 0.05). For the positive 99.5% deviations no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameter on the deviations between impression techniques can be determined reliably using the 99.5 percentile of the deviations. An analysis regarding the areas with maximum deviations showed high clinical relevance. The preparation margin was pointed out as the weak spot of impression taking.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of todays engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
Group decision-making techniques for natural resource management applications
Coughlan, Beth A.K.; Armour, Carl L.
1992-01-01
This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to being to fill this gap. We present a general analysis of the pitfalls of group problem solving, and suggestions for improved interactions followed by the specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study and is applicable to natural resource management issues.
The edge of chaos: A nonlinear view of psychoanalytic technique.
Galatzer-Levy, Robert M
2016-04-01
The field of nonlinear dynamics (or chaos theory) provides ways to expand concepts of psychoanalytic process that have implications for the technique of psychoanalysis. This paper describes how concepts of "the edge of chaos," emergence, attractors, and coupled oscillators can help shape analytic technique resulting in an approach to doing analysis which is at the same time freer and more firmly based in an enlarged understanding of the ways in which psychoanalysis works than some current recommendation about technique. Illustrations from a lengthy analysis of an analysand with obsessive-compulsive disorder show this approach in action. Copyright © 2016 Institute of Psychoanalysis.
Automated quantification of the synchrogram by recurrence plot analysis.
Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart
2012-04-01
Recently, the concept of phase synchronization of two weakly coupled oscillators has raised a great research interest and has been applied to characterize synchronization phenomenon in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by a synchrogram analysis, a graphical tool investigating the relationship between instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of phase's distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2016-02-01
Multiresolution analysis techniques including continuous wavelet transform, empirical mode decomposition, and variational mode decomposition are tested in the context of interest rate next-day variation prediction. In particular, multiresolution analysis techniques are used to decompose interest rate actual variation and feedforward neural network for training and prediction. Particle swarm optimization technique is adopted to optimize its initial weights. For comparison purpose, autoregressive moving average model, random walk process and the naive model are used as main reference models. In order to show the feasibility of the presented hybrid models that combine multiresolution analysis techniques and feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates; including Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-Month, 6-Month and 1-Year treasury bills, and effective federal fund rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations as they provide good forecasting performance.
Novel near-infrared sampling apparatus for single kernel analysis of oil content in maize.
Janni, James; Weinstock, B André; Hagen, Lisa; Wright, Steve
2008-04-01
A method of rapid, nondestructive chemical and physical analysis of individual maize (Zea mays L.) kernels is needed for the development of high value food, feed, and fuel traits. Near-infrared (NIR) spectroscopy offers a robust nondestructive method of trait determination. However, traditional NIR bulk sampling techniques cannot be applied successfully to individual kernels. Obtaining optimized single kernel NIR spectra for applied chemometric predictive analysis requires a novel sampling technique that can account for the heterogeneous forms, morphologies, and opacities exhibited in individual maize kernels. In this study such a novel technique is described and compared to less effective means of single kernel NIR analysis. Results of the application of a partial least squares (PLS) derived model for predictive determination of percent oil content per individual kernel are shown.
Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K
2008-06-01
The applicability of a new mass spectrometric technique, DART (direct analysis in real time) has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The confirmation of the structures of the identified compounds was made through their accurate molecular formula determinations. This is the first report of the application of DART technique for the characterization of compounds that are expressed in the hairy root cultures of Rauvolfia serpentina. Moreover, this also constitutes the first report of expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.
Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa
2014-11-06
Comprehensive analysis of the post-translational modifications (PTMs) on proteins at proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategy, the identification of protein PTMs have made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduced the recent progresses of PTM enrichment methods for the analysis of several major PTMs including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarized the challenges for PTM enrichment. Finally, we introduced the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...
2015-09-10
We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for themore » analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.« less