Sample records for sensitive analytical tool

  1. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problem of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen that is sensitive to delays in centrifugation and/or analysis. The results of analyses of samples sent in by satellite medical practices were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aim of the comparison was (a) to identify those medical practices whose mean/median sample values deviate significantly from those of the hospital laboratory control, indicating possible problems in the pre-analytical phase, and (b) to help these practices rectify those problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) was developed to address the above-mentioned problems. It was tested on serum potassium, which is known to be sensitive to delays and/or irregularities in sample treatment. The PAS tool has been shown to be one way of improving the quality of analyses by identifying the sources of problems within the pre-analytical phase so that they can be rectified. The PAS tool also has educational value and can be adopted for use in other decentralized laboratories.
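
The practice-comparison step described above can be sketched as a simple screening rule. This is an illustrative sketch only: the real PAS tool is an Excel workbook, and the control values, practice means, and z-score cutoff below are invented.

```python
# Hypothetical sketch of the screening idea behind the PAS tool (the real
# tool is a Microsoft Excel workbook): flag satellite practices whose mean
# serum potassium deviates markedly from the on-site control distribution.
# The control values, practice means, and z cutoff are all invented.
CONTROL_MEAN, CONTROL_SD = 4.2, 0.3          # mmol/L, on-site laboratory

def flag_practices(practice_means, z_cut=2.0):
    flagged = []
    for name, mean_k in practice_means.items():
        z = (mean_k - CONTROL_MEAN) / CONTROL_SD
        if abs(z) > z_cut:                   # e.g. centrifugation delay inflates potassium
            flagged.append(name)
    return flagged

practices = {"practice_A": 4.3, "practice_B": 5.1, "practice_C": 4.0}
suspect = flag_practices(practices)          # only practice_B (z = 3.0) is flagged
```

A flagged practice would then be contacted to check its sample handling, which is the educational use the abstract mentions.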

  2. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  3. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection, and storage of EBC), analytical (sensitivity of applied methods, standardization), and post-analytical (interpretation of results) phases. EBC analysis is still used mainly as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical, and post-analytical requirements are met, EBC biomarkers and biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  4. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex optimization required to make the test sensitive, rapid, and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads, and sample pads), biological reagents (e.g., antibodies, blocking reagents, and buffers), and the design of the delivery geometry. In this paper, we review conventional optimization and then focus on analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity, and manufacturing robustness. PMID:28555034

  5. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous, and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis, and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analysts will adopt this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).

  6. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
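
The benefit of analytic derivatives over finite differencing can be sketched in a few lines. This is an illustrative stand-in, not pyCycle or OpenMDAO: the `thrust` function below is an invented smooth function of a pressure-ratio-like input, not a real engine relation.

```python
import math

# Illustrative sketch (not pyCycle/OpenMDAO): comparing an analytic derivative
# with a finite-difference approximation. The "thrust" function is an invented
# smooth stand-in for a cycle-model output.
def thrust(pr):
    return pr * math.exp(-0.1 * pr)          # hypothetical output vs. pressure ratio

def thrust_grad_analytic(pr):
    # exact derivative via the product and chain rules
    return math.exp(-0.1 * pr) * (1.0 - 0.1 * pr)

def thrust_grad_fd(pr, h=1e-6):
    # forward finite difference: truncation error O(h), round-off error O(eps/h),
    # so accuracy depends on choosing h well -- the weakness analytic
    # derivatives avoid
    return (thrust(pr + h) - thrust(pr)) / h

exact = thrust_grad_analytic(5.0)
approx = thrust_grad_fd(5.0)
error = abs(approx - exact)                  # small, but step-size dependent
```

A gradient-based optimizer fed `thrust_grad_analytic` gets consistent gradients at every iterate, whereas the finite-difference version trades off truncation against round-off error through `h`.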

  7. Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors

    PubMed Central

    Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng

    2010-01-01

    Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give reliable and fast results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensors and bioassays. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronic approaches to increasing the sensitivity of microcantilever (MCL) sensors. PMID:20975987

  8. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as beverages and infant milk products, demands tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of stability, due to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect. Using this model significantly improved data normalization with an internal standard. Our results thereby allow a quantification approach that takes the inevitable, analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.

  9. Ultra-sensitive chemical and biological analysis via specialty fibers with built-in microstructured optofluidic channels.

    PubMed

    Zhang, Nan; Li, Kaiwei; Cui, Ying; Wu, Zhifang; Shum, Perry Ping; Auguste, Jean-Louis; Dinh, Xuan Quyen; Humbert, Georges; Wei, Lei

    2018-02-13

    All-in-fiber optofluidics is an analytical tool that provides enhanced sensing performance with a simplified analysis-system design. Currently, its advance is limited either by complicated liquid-manipulation and light-injection configurations or by the low sensitivity resulting from inadequate light-matter interaction. In this work, we design and fabricate a side-channel photonic crystal fiber (SC-PCF) and exploit its versatile sensing capabilities in in-line optofluidic configurations. The built-in microfluidic channel of the SC-PCF enables strong light-matter interaction and easy lateral access for liquid samples in these analytical systems. In addition, the sensing performance of the SC-PCF is demonstrated with methylene blue for absorptive molecular detection and with human cardiac troponin T protein, using a Sagnac interferometry configuration, for ultra-sensitive and specific biomolecular detection. Owing to its flexibility and compactness, high sensitivity to analyte variation, and efficient liquid manipulation/replacement, the demonstrated SC-PCF offers a generic solution that can be adapted to various fiber-waveguide sensors to detect a wide range of analytes in real time, in applications from environmental monitoring to biological diagnosis.

  10. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
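
The chain-rule mechanism behind forward-mode AD can be shown with dual numbers: each value carries its derivative, and overloaded arithmetic propagates both at once. This is a minimal textbook sketch, not one of the AD tools assessed in the paper.

```python
# Minimal forward-mode AD sketch using dual numbers: each quantity carries
# (value, derivative), and arithmetic applies the chain rule automatically.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1     # f'(x) = 6x + 2

x = Dual(2.0, 1.0)                   # seed the input derivative with 1
y = f(x)                             # y.val = f(2) = 17, y.dot = f'(2) = 14
```

The program that computes `f` is never symbolically differentiated; the derivative emerges from the overloaded operations, which is exactly the systematic chain-rule application the abstract describes. Reverse mode propagates sensitivities in the opposite direction and is preferable when there are many inputs and few outputs.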

  11. High-frequency phase shift measurement greatly enhances the sensitivity of QCM immunosensors.

    PubMed

    March, Carmen; García, José V; Sánchez, Ángel; Arnau, Antonio; Jiménez, Yolanda; García, Pablo; Manclús, Juan J; Montoya, Ángel

    2015-03-15

    Despite being widely used for in-liquid biosensing applications, improving the sensitivity of conventional (5-20 MHz) quartz crystal microbalance (QCM) sensors remains an unsolved, challenging task. With the help of a new electronic characterization approach based on phase-change measurements at a constant fixed frequency, a highly sensitive and versatile high fundamental frequency (HFF) QCM immunosensor has been developed and tested for use in pesticide (carbaryl and thiabendazole) analysis. The analytical performance of several immunosensors was compared in competitive immunoassays with the carbaryl insecticide as the model analyte. The highest sensitivity was exhibited by the 100 MHz HFF-QCM carbaryl immunosensor. Compared with results reported for 9 MHz QCM, the analytical parameters showed an improvement of one order of magnitude in sensitivity (estimated as the I50 value) and two orders of magnitude in the limit of detection (LOD): 30 µg L-1 vs 0.66 µg L-1 for the I50 value, and 11 µg L-1 vs 0.14 µg L-1 for the LOD, for 9 and 100 MHz, respectively. For the fungicide thiabendazole, the I50 value was roughly the same as that previously reported for SPR under the same biochemical conditions, whereas the LOD improved by a factor of 2. The analytical performance achieved by high-frequency QCM immunosensors surpasses that of conventional QCM and SPR, closely approaching the most sensitive ELISAs. The developed 100 MHz QCM immunosensor strongly improves sensitivity in biosensing and can therefore be considered a very promising new analytical tool for in-liquid applications where highly sensitive detection is required.

  12. Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice

    NASA Astrophysics Data System (ADS)

    Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.

    2013-10-01

    Various nanoparticle (NP) properties, such as shape and surface charge, have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the quantity of distributed NPs. High-performance liquid chromatography (HPLC) is a more powerful tool for quantifying NP biodistribution than conventional analytical methods such as an in vivo imaging system (IVIS). This is due in part to the better curve linearity offered by HPLC. Furthermore, HPLC enables us to fully analyze the NPs present in each gram of organ tissue without compromising the signals or suffering the depth-related loss of sensitivity seen in IVIS measurements. In addition, we found that changing physiological conditions improved the distribution of large NPs (200-500 nm) in brain tissue. These results reveal the importance of selecting analytical tools and the physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics, and diagnostics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03954d

  13. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayden, D. W.

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means of measuring sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the work of others and couple the work of three of them into an analytical tool, runnable on a PC, to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created by friction at impact must be conducted into the charge or the target faster than chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it is assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of a billet dropped on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytical tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact; the time this temperature is maintained (the contact time) will be obtained from the work of Hatler et al.; and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spots).
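
The numerical core described above, conduction competing with Arrhenius self-heating over a bounded contact time, can be sketched with an explicit 1-D finite-difference scheme. Every constant below (diffusivity, rate prefactor, activation temperature) is an invented placeholder, not HMX or binder data, and this is not the OSTI project's actual tool.

```python
import math

# Illustrative sketch: explicit 1-D finite-difference solution of the heat
# equation with an Arrhenius self-heating source (a Frank-Kamenetskii-style
# calculation). All material constants are invented placeholders.
def peak_interior_temp(T_surface, t_contact, n=50, alpha=1e-7, L=1e-3,
                       A=1e8, E_over_R=1.2e4, T0=300.0):
    dx = L / n
    dt = 0.4 * dx * dx / alpha        # respects explicit stability: dt <= dx^2 / (2 alpha)
    T = [T0] * (n + 1)
    T[0] = T_surface                  # frictional heating pins the impact face
    t = 0.0
    while t < t_contact:
        Tn = T[:]
        for i in range(1, n):
            conduction = alpha * (T[i - 1] - 2.0 * T[i] + T[i + 1]) / (dx * dx)
            self_heating = A * math.exp(-E_over_R / T[i])   # Arrhenius kinetics
            Tn[i] = T[i] + dt * (conduction + self_heating)
        Tn[0], Tn[n] = T_surface, T0  # fixed-temperature boundaries
        T = Tn
        t += dt
    return max(T[1:n])                # hottest interior point at end of contact
```

A detonation criterion would compare this interior temperature against a critical runaway threshold within the contact time; with the placeholder kinetics above the source term is negligible and the profile stays conduction-dominated.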

  14. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining, or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of all molecular responses and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in the cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high-spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in the nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. The technique has the potential to distinguish changes arising from exogenous versus endogenous oxidative processes. Future development will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation, and fetal development.

  15. The biochemical properties of antibodies and their fragments

    USDA-ARS?s Scientific Manuscript database

    Immunoglobulins (Ig), or antibodies, are powerful molecular recognition tools that can be used to identify minute quantities of a given target analyte. Their antigen-binding properties define both the sensitivity and the selectivity of an immunoassay. Understanding the biochemical properties of this c...

  16. Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.

    PubMed

    Monteiro, Tiago; Almeida, Maria Gabriela

    2018-05-14

    Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with consumers or nature. Biosensor technology is one of the most active R&D domains of the analytical sciences focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their typically quick response, high selectivity and sensitivity, low cost, and portable dimensions. This review aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.

  17. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for the fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated, through a quantitative method, to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay for improving the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355

  18. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for the fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated, through a quantitative method, to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay for improving the LOD of generic target analytes without the need for costly hardware modifications.
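
The arithmetic behind the standard-addition idea in TAD can be sketched numerically. All numbers below are invented for illustration (an idealized linear detector, not MALDI data): a spike of known concentration lifts the total signal above the noise floor, and the endogenous concentration is recovered by difference.

```python
# Numeric sketch of the standard-addition idea behind TAD; all values are
# invented. The detector is idealized as linear with a hard noise floor.
def observed_signal(conc, k=2.0, noise_floor=1.0):
    # signals at or below the noise floor are indistinguishable from zero
    s = k * conc
    return s if s > noise_floor else 0.0

endogenous = 0.3     # below the LOD on its own: 2.0 * 0.3 = 0.6 < 1.0
spike = 0.6          # exogenous standard, roughly equal to the original LOD

direct = observed_signal(endogenous)             # undetected
spiked = observed_signal(endogenous + spike)     # detected
reference = observed_signal(spike)               # spike alone, for comparison

# endogenous concentration inferred by difference (k = 2.0 assumed known)
inferred = (spiked - reference) / 2.0
```

The spike concentration matters: too small and the total stays under the noise floor, too large and the endogenous contribution is lost in the spike's signal, which mirrors the paper's finding that the optimum spike is near the original LOD.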

  19. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method using partial least squares regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently combined in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. To minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. The selection of the spectral regions proved to have an important effect on the number of factors. To simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. Recovery for the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
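
The core idea, resolving two analytes whose spectra overlap by treating the measured spectrum as a linear mixture, can be shown with a simpler relative of PLS: classical least squares on invented pure-component spectra. The paper itself uses PLS on real UV spectra; the numbers below are illustrative only.

```python
# Sketch of resolving two overlapped analytes by classical least squares
# (the paper uses PLS; these 4-point "spectra" are invented).
# Measured absorbance at each wavelength is a linear mix: m = c1*s1 + c2*s2.
s1 = [0.9, 0.7, 0.4, 0.1]        # pure-component spectrum, analyte 1
s2 = [0.2, 0.5, 0.8, 0.6]        # pure-component spectrum, analyte 2 (overlapped)
c1_true, c2_true = 0.30, 0.45
mix = [c1_true * a + c2_true * b for a, b in zip(s1, s2)]

# Solve the 2x2 normal equations (S^T S) c = S^T m by Cramer's rule.
a11 = sum(a * a for a in s1)
a12 = sum(a * b for a, b in zip(s1, s2))
a22 = sum(b * b for b in s2)
b1 = sum(a * m for a, m in zip(s1, mix))
b2 = sum(b * m for b, m in zip(s2, mix))
det = a11 * a22 - a12 * a12
c1 = (b1 * a22 - b2 * a12) / det
c2 = (a11 * b2 - a12 * b1) / det
```

With noise-free data the recovered concentrations are exact; PLS earns its keep when spectra are noisy, pure-component spectra are unknown, and latent factors must be chosen, which is what the abstract's factor-selection discussion is about.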

  20. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. Our objective was to propose linear regression metamodeling as a tool to increase the transparency of decision-analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate the approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships to compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable, since it used all parameter values. Linear regression metamodeling is a simple yet powerful tool that can assist modelers in communicating model characteristics and sensitivity analyses.
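
The regression-on-standardized-inputs step can be sketched with a toy one-parameter PSA. The simulation model below is invented (a linear "net benefit" of one parameter), not the paper's cancer cure model; it shows why the intercept recovers the base-case outcome and the coefficient the parameter's influence.

```python
import random

# Toy sketch of linear-regression metamodeling of a PSA (invented model):
# regress a simulated outcome on a standardized input parameter.
random.seed(0)
p = [random.uniform(0.6, 0.9) for _ in range(1000)]   # PSA draws of one parameter
outcome = [100.0 * x + 5.0 for x in p]                # hypothetical net-benefit model

mean_p = sum(p) / len(p)
sd_p = (sum((x - mean_p) ** 2 for x in p) / len(p)) ** 0.5
z = [(x - mean_p) / sd_p for x in p]                  # standardized input (mean 0, sd 1)

# OLS on a centered predictor: the slope is cov(z, y) / var(z), and the
# intercept is simply the mean outcome, i.e. the base-case estimate.
coef = sum(zi * yi for zi, yi in zip(z, outcome)) / sum(zi * zi for zi in z)
intercept = sum(outcome) / len(outcome)
```

Because the inputs are standardized, coefficients for different parameters are directly comparable, which is what lets the metamodel rank parameter importance across a multi-parameter PSA.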

  1. Can we use high precision metal isotope analysis to improve our understanding of cancer?

    PubMed

    Larner, Fiona

    2016-01-01

    High precision natural isotope analyses are widely used in the geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutrition- and disease-related research. In recent months, a number of groups have shown the potential of this technique to provide new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data that are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development, both in sample throughput and in detection sensitivity for these elements, is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background studies of healthy populations must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested against currently employed methods, and important questions, such as the influence of cancer heterogeneity and disease stage on isotopic signatures, be addressed.

  2. A Sensitive and Effective Proteomic Approach to Identify She-Donkey’s and Goat’s Milk Adulterations by MALDI-TOF MS Fingerprinting

    PubMed Central

    Di Girolamo, Francesco; Masotti, Andrea; Salvatori, Guglielmo; Scapaticci, Margherita; Muraca, Maurizio; Putignani, Lorenza

    2014-01-01

    She-donkey’s milk (DM) and goat’s milk (GM) are commonly used in newborn and infant feeding because they are less allergenic than other milk types. It is, therefore, mandatory to avoid adulteration and contamination with other milk allergens by developing fast and efficient analytical methods to assess the authenticity of these precious nutrients. In this experimental work, a sensitive and robust matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) profiling workflow was designed to assess the genuineness of DM and GM. This workflow allows the identification of DM and GM adulteration at levels of 0.5%, thus representing a sensitive tool for milk adulteration analysis compared with other laborious and time-consuming analytical procedures. PMID:25110863

  3. In Vitro and In Vivo SERS Biosensing for Disease Diagnosis.

    PubMed

    Moore, T Joshua; Moody, Amber S; Payne, Taylor D; Sarabia, Grace M; Daniel, Alyssa R; Sharma, Bhavya

    2018-05-11

    For many disease states, positive outcomes are directly linked to early diagnosis, when therapeutic intervention would be most effective. Recently, trends in disease diagnosis have focused on the development of label-free sensing techniques that are sensitive to the low analyte concentrations found in the physiological environment. Surface-enhanced Raman spectroscopy (SERS) is a powerful vibrational spectroscopic technique that allows label-free, highly sensitive, and selective detection of analytes through the amplification of localized electric fields on the surface of a plasmonic material when excited with monochromatic light. This results in enhancement of the Raman scattering signal, which allows the detection of low-concentration analytes and gives rise to the use of SERS as a diagnostic tool for disease. Here, we present a review of recent developments in the field of in vivo and in vitro SERS biosensing for a range of disease states, including neurological disease, diabetes, cardiovascular disease, cancer, and viral disease.

  4. Analytical electron microscopy in the study of biological systems.

    PubMed

    Johnson, D E

    1986-01-01

    The AEM is a powerful tool in biological research, capable of providing information simply not available by other means. The use of a field emission STEM for this application can lead to a significant improvement in spatial resolution in most cases now allowed by the quality of the specimen preparation but perhaps ultimately limited by the effects of radiation damage. Increased elemental sensitivity is at least possible in selected cases with electron energy-loss spectrometry, but fundamental aspects of ELS will probably confine its role to that of a limited complement to EDS. The considerable margin for improvement in sensitivity of the basic analytical technique means that the search for technological improvement will continue. Fortunately, however, current technology can also continue to answer important biological questions.

  5. Rapid and sensitive detection of synthetic cannabinoids AMB-FUBINACA and α-PVP using surface enhanced Raman scattering (SERS)

    NASA Astrophysics Data System (ADS)

    Islam, Syed K.; Cheng, Yin Pak; Birke, Ronald L.; Green, Omar; Kubic, Thomas; Lombardi, John R.

    2018-04-01

    The application of surface enhanced Raman scattering (SERS) has been reported as a fast and sensitive analytical method for the trace detection of the two most commonly known synthetic cannabinoids, AMB-FUBINACA and alpha-pyrrolidinovalerophenone (α-PVP). AMB-FUBINACA and α-PVP are two of the most dangerous synthetic cannabinoids and have been reported to cause numerous deaths in the United States. While instruments such as GC-MS and LC-MS have traditionally been recognized as analytical tools for the detection of these synthetic drugs, SERS has recently been gaining ground in their analysis due to its sensitivity in trace analysis and its effectiveness as a rapid method of detection. The present study shows a limit of detection as low as picomolar concentrations for AMB-FUBINACA, while for α-PVP the limit of detection is in the nanomolar range using SERS.

  6. Increasing productivity for the analysis of trace contaminants in food by gas chromatography-mass spectrometry using automated liner exchange, backflushing and heart-cutting.

    PubMed

    David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat

    2013-10-25

    Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preservation of the performance of the analytical system and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal or better compared to conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Micro-electromechanical sensors in the analytical field.

    PubMed

    Zougagh, Mohammed; Ríos, Angel

    2009-07-01

    Micro- and nano-electromechanical systems (MEMS and NEMS) for use as sensors represent one of the most exciting new fields in analytical chemistry today. These systems are advantageous over currently available non-miniaturized sensors, such as quartz crystal microbalances, thickness shear mode resonators, and flexural plate wave oscillators, because of their high sensitivity, low cost and easy integration into automated systems. In this article, we present and discuss the evolution in the use of MEMS and NEMS, which are basically cantilever-type sensors, as good analytical tools for a wide variety of applications. We discuss the analytical features and the practical potential of micro(nano)-cantilever sensors, which combine the synergetic advantages of selectivity, provided by their functionalization, and the high sensitivity, which is attributed largely to the extremely small size of the sensing element. An insight is given into the different types of functionalization and detection strategies and a critical discussion is presented on the existing state of the art concerning the applications reported for mechanical microsensors. New developments and the possibilities for routine work in the near future are also covered.

  8. The Personal Selling Ethics Scale: Revisions and Expansions for Teaching Sales Ethics

    ERIC Educational Resources Information Center

    Donoho, Casey; Heinze, Timothy

    2011-01-01

    The field of sales draws a large number of marketing graduates. Sales curricula used within today's marketing programs should include rigorous discussions of sales ethics. The Personal Selling Ethics Scale (PSE) provides an analytical tool for assessing and discussing students' ethical sales sensitivities. However, since the scale fails to address…

  9. Fabrication of a novel transparent SERS substrate comprised of Ag-nanoparticle arrays and its application in rapid detection of ractopamine on meat

    USDA-ARS?s Scientific Manuscript database

    Surface-enhanced Raman spectroscopy (SERS) is an emerging analytical tool that boasts the feature of high detection sensitivity and molecular fingerprint specificity attracting increased attention and showing promise in applications including detecting residues of veterinary drugs. In practice, spec...

  10. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE PAGES

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...

    2017-09-23

    Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. As this is also the parameter with the greatest discrepancy between the tools, accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.
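    The reported sensitivity to the vessels' limiting significant wave height can be illustrated with a minimal Monte Carlo accessibility sketch. The Weibull sea-state parameters below are invented for illustration and are not taken from the reference wind farm:

```python
import random

def accessibility(hs_limit_m, n_hours=100_000, seed=1):
    """Fraction of hours in which turbine access is possible, given a
    vessel's limiting significant wave height (m). Sea states are drawn
    from a hypothetical Weibull distribution (scale 1.5 m, shape 1.8)."""
    rng = random.Random(seed)
    ok = sum(rng.weibullvariate(1.5, 1.8) <= hs_limit_m for _ in range(n_hours))
    return ok / n_hours

# A modest change in the access limit shifts accessibility substantially,
# which is why total O&M cost rankings are sensitive to this parameter.
for hs in (1.0, 1.5, 2.0):
    print(f"Hs limit {hs:.1f} m -> accessibility {accessibility(hs):.2f}")
```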

  11. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain

    Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. As this is also the parameter with the greatest discrepancy between the tools, accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.

  12. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes combined with other analytical tools may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low-doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  13. Sherlock Holmes counts the atoms

    NASA Astrophysics Data System (ADS)

    Tuniz, C.; Zoppi, U.; Hotchkis, M. A. C.

    2004-01-01

    Modern forensic science has to deal not only with homicides and other traditional crimes but also with more global threats such as smuggling of nuclear materials, clandestine production of weapons of mass destruction, stockpiling of illicit drugs by state-controlled groups and war crimes. Forensic applications have always benefited from the use of advanced analytical tools that can characterise materials found at crime scenes. In this paper we will discuss the use of accelerator mass spectrometry as an ultra sensitive tool for the crime labs of the third millennium.

  14. Electrostatic Interactions between OmpG Nanopore and Analyte Protein Surface Can Distinguish between Glycosylated Isoforms.

    PubMed

    Fahie, Monifa A; Chen, Min

    2015-08-13

    The flexible loops decorating the entrance of OmpG nanopore move dynamically during ionic current recording. The gating caused by these flexible loops changes when a target protein is bound. The gating is characterized by parameters including frequency, duration, and open-pore current, and these features combine to reveal the identity of a specific analyte protein. Here, we show that OmpG nanopore equipped with a biotin ligand can distinguish glycosylated and deglycosylated isoforms of avidin by their differences in surface charge. Our studies demonstrate that the direct interaction between the nanopore and analyte surface, induced by the electrostatic attraction between the two molecules, is essential for protein isoform detection. Our technique is remarkably sensitive to the analyte surface, which may provide a useful tool for glycoprotein profiling.

  15. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  16. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  17. Analytical and Experimental Studies of Leak Location and Environment Characterization for the International Space Station

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael; Abel, Joshua; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin; hide

    2014-01-01

    The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb-mass/yr. to about 1 lb-mass/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
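    For the effusive-flow case mentioned above, the classical cosine-law point-source model gives a closed-form flux estimate. The sketch below uses that textbook model; the NH3 leak-rate conversion (~1 lbm/yr ≈ 5 × 10¹⁷ molecules/s) is an order-of-magnitude assumption, not a value from the paper:

```python
import math

def effusive_flux(n_dot, r_m, theta_rad):
    """Molecular flux (molecules/m^2/s) at distance r and off-axis angle
    theta from an effusive point source emitting n_dot molecules/s into a
    half-space with a cosine angular distribution (free-molecular flow)."""
    return n_dot * math.cos(theta_rad) / (math.pi * r_m ** 2)

# On-axis flux 10 m from a ~1 lbm/yr NH3 leak (approx. 5e17 molecules/s)
on_axis = effusive_flux(5e17, 10.0, 0.0)
# The cosine law halves the flux at 60 degrees off-axis, which is the
# kind of directionality the instrument study characterizes
off_axis = effusive_flux(5e17, 10.0, math.pi / 3)
```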

  18. Analytical and Experimental Studies of Leak Location and Environment Characterization for the International Space Station

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael S.; Abel, Joshua C.; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin; hide

    2014-01-01

    The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lbm/yr to about 1 lbm/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.

  19. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  20. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    PubMed

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
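    The screening statistics above (sensitivity, specificity, positive predictive value) come from a standard confusion-matrix calculation. In the sketch below the counts are invented to reproduce figures of roughly the reported magnitude; they are not the study's data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard test-performance metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
    }

# Hypothetical counts: a screen that catches most disease (high sensitivity)
# but flags many healthy subjects (poor specificity, hence low PPV)
m = screening_metrics(tp=180, fp=800, fn=20, tn=600)
# m["sensitivity"] = 0.90, m["specificity"] ≈ 0.43, m["ppv"] ≈ 0.18
```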

  1. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools for recombinant drug analysis and discuss their evolution towards miniaturized systems, foreseeing a possible unique recombinant drug-on-a-chip device. The analysis of recombinant protein drugs and/or pro-drugs requires sensitive and reproducible analytical techniques for quality control to ensure the safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low cost could become a major trend in recombinant drug and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative for testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.
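    The design loop described here, analytical sensitivity derivatives feeding a gradient-based minimizer under stability constraints, can be caricatured on a toy problem. The objective, constraint, and penalty treatment below are illustrative stand-ins, not the paper's aeroservoelastic model:

```python
def minimize_weight(x0=(0.0, 0.0), lr=0.005, steps=5000, mu=50.0):
    """Gradient-based minimization of a toy 'structural weight'
    w = x1^2 + x2^2 subject to a 'stability margin' constraint
    x1 + x2 >= 1, enforced by a one-sided quadratic penalty.
    All derivatives are analytical, mirroring the approach above."""
    x1, x2 = x0
    for _ in range(steps):
        g = x1 + x2 - 1.0             # constraint value (feasible if g >= 0)
        gp = 2.0 * mu * min(g, 0.0)   # d/dg of the penalty mu*min(g, 0)^2
        x1 -= lr * (2.0 * x1 + gp)    # dg/dx1 = 1, so the penalty adds gp
        x2 -= lr * (2.0 * x2 + gp)
    return x1, x2

# Converges near the constrained optimum (0.5, 0.5)
x1, x2 = minimize_weight()
```

A penalty formulation keeps the sketch self-contained; a production design loop would more likely use a constrained optimizer fed with the same analytical gradients.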

  3. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  4. High-throughput and sensitive analysis of 3-monochloropropane-1,2-diol fatty acid esters in edible oils by supercritical fluid chromatography/tandem mass spectrometry.

    PubMed

    Hori, Katsuhito; Matsubara, Atsuki; Uchikata, Takato; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2012-08-10

    We have established a high-throughput and sensitive analytical method based on supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry (QqQ MS) for 3-monochloropropane-1,2-diol (3-MCPD) fatty acid esters in edible oils. All analytes were successfully separated within 9 min without sample purification. The system was precise and sensitive, with a limit of detection less than 0.063 mg/kg. The recovery rate of 3-MCPD fatty acid esters spiked into oil samples was in the range of 62.68-115.23%. Furthermore, several edible oils were tested for analyzing 3-MCPD fatty acid ester profiles. This is the first report on the analysis of 3-MCPD fatty acid esters by SFC/QqQ MS. The developed method will be a powerful tool for investigating 3-MCPD fatty acid esters in edible oils. Copyright © 2012 Elsevier B.V. All rights reserved.
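    The recovery range quoted above is a standard spike-recovery figure; for reference, a minimal sketch of the calculation (the concentrations are hypothetical, not the paper's data):

```python
def recovery_pct(measured_mg_kg, spiked_mg_kg):
    """Spike recovery (%): measured analyte relative to the known
    amount spiked into the sample, as used in method validation."""
    return 100.0 * measured_mg_kg / spiked_mg_kg

# Hypothetical example: 0.80 mg/kg recovered from a 1.00 mg/kg spike
r = recovery_pct(0.80, 1.00)
```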

  5. From near-infrared and Raman to surface-enhanced Raman spectroscopy: progress, limitations and perspectives in bioanalysis.

    PubMed

    Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric

    2016-05-01

    Over recent decades, growing environmental concern has driven the expansion of green analytical chemistry tools. Vibrational spectroscopy, belonging to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and no sample preparation. In this context, near-infrared, Raman and especially surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The two former techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS extended the performance of vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates were also developed enabling biomarker measurements, paving the way for SERS immunoassays. Therefore, in this paper, the strengths and weaknesses of these techniques will be highlighted with a focus on recent progress.

  6. Analytical Performance of Four Polymerase Chain Reaction (PCR) and Real Time PCR (qPCR) Assays for the Detection of Six Leishmania Species DNA in Colombia

    PubMed Central

    León, Cielo M.; Muñoz, Marina; Hernández, Carolina; Ayala, Martha S.; Flórez, Carolina; Teherán, Aníbal; Cubides, Juan R.; Ramírez, Juan D.

    2017-01-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real time PCR (qPCR) including a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but does not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10¹ and 1 × 10⁻¹ equivalent parasites/mL, respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America. PMID:29046670
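    The LoD figures above follow the usual operational definition: the lowest concentration detected in at least a fixed fraction (commonly 95%) of replicates. A sketch under that assumption; the dilution series is hypothetical, not the study's data:

```python
def lod_from_dilutions(hits, threshold=0.95):
    """Operational limit of detection: the lowest concentration whose
    detection rate across replicates meets the threshold.
    `hits` maps concentration (parasites/mL) -> (detected, replicates)."""
    qualifying = [c for c, (det, n) in hits.items() if det / n >= threshold]
    return min(qualifying) if qualifying else None

# Hypothetical serial dilution for one marker
series = {1e3: (20, 20), 1e2: (20, 20), 1e1: (19, 20), 1e0: (12, 20)}
lod = lod_from_dilutions(series)  # -> 10.0 (19/20 = 95% detected)
```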

  7. Analytical Performance of Four Polymerase Chain Reaction (PCR) and Real Time PCR (qPCR) Assays for the Detection of Six Leishmania Species DNA in Colombia.

    PubMed

    León, Cielo M; Muñoz, Marina; Hernández, Carolina; Ayala, Martha S; Flórez, Carolina; Teherán, Aníbal; Cubides, Juan R; Ramírez, Juan D

    2017-01-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real time PCR (qPCR) including a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but does not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10¹ and 1 × 10⁻¹ equivalent parasites/mL, respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America.

  8. Kennedy Space Center (KSC) Launch Complex 39 (LC-39) Gaseous Hydrogen (GH2) Vent Arm Behavior Prediction Model Review Technical Assessment Report

    NASA Technical Reports Server (NTRS)

    Wilson, Timmy R.; Beech, Geoffrey; Johnston, Ian

    2009-01-01

    The NESC Assessment Team reviewed a computer simulation of the LC-39 External Tank (ET) GH2 Vent Umbilical system developed by United Space Alliance (USA) for the Space Shuttle Program (SSP) and designated KSC Analytical Tool ID 451 (KSC AT-451). The team verified that the vent arm kinematics were correctly modeled, but noted that there were relevant system sensitivities. Also, the structural stiffness used in the math model varied somewhat from the analytic calculations. Results of the NESC assessment were communicated to the model developers.

  9. Detection of low-level PTFE contamination: An application of solid-state NMR to structure elucidation in the pharmaceutical industry.

    PubMed

    Pham, Tran N; Day, Caroline J; Edwards, Andrew J; Wood, Helen R; Lynch, Ian R; Watson, Simon A; Bretonnet, Anne-Sophie Z; Vogt, Frederick G

    2011-01-25

    We report a novel use of solid-state ¹⁹F nuclear magnetic resonance to detect and quantify polytetrafluoroethylene contamination from laboratory equipment, which due to low quantity (up to 1% w/w) and insolubility remained undetected by standard analytical techniques. Solid-state ¹⁹F NMR is shown to be highly sensitive to such fluoropolymers (detection limit 0.02% w/w), and is demonstrated as a useful analytical tool for structure elucidation of unknown solid materials. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Chemically attached gold nanoparticle-carbon nanotube hybrids for highly sensitive SERS substrate

    NASA Astrophysics Data System (ADS)

    Beqa, Lule; Singh, Anant Kumar; Fan, Zheng; Senapati, Dulal; Ray, Paresh Chandra

    2011-08-01

    Surface-enhanced Raman spectroscopy (SERS) has been shown to be one of the most powerful analytical tools, with high sensitivity. In this manuscript, we report the chemical design of a SERS substrate based on gold nanoparticles of different shapes decorated with carbon nanotubes, with an enhancement factor of 7.5 × 10¹⁰. The shape-dependent results show that popcorn-shaped gold nanoparticle-decorated SWCNT is the best choice of SERS substrate due to the 'lightning rod effect' arising from several sharp edges and corners. Our results provide a good approach to developing highly sensitive SERS substrates and can help improve the fundamental understanding of SERS phenomena.
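An enhancement factor like the 7.5 × 10¹⁰ quoted above is conventionally computed by normalizing the SERS and reference Raman intensities by the number of probed molecules, EF = (I_SERS/N_SERS)/(I_ref/N_ref). A minimal sketch, with invented intensities and molecule counts chosen purely to illustrate the arithmetic:

```python
# Hedged sketch: conventional SERS enhancement-factor calculation.
def sers_enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """EF = (I_SERS / N_SERS) / (I_ref / N_ref).

    i_sers, i_ref: integrated band intensities (SERS vs. normal Raman)
    n_sers, n_ref: number of molecules contributing to each signal
    """
    return (i_sers / n_sers) / (i_ref / n_ref)

# Illustrative numbers only (not measured values from the paper):
ef = sers_enhancement_factor(i_sers=7.5e4, n_sers=1e6, i_ref=1.0, n_ref=1e12)
print(f"{ef:.1e}")  # -> 7.5e+10
```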

  11. A multiplex PCR assay for the rapid and sensitive detection of methicillin-resistant Staphylococcus aureus and simultaneous discrimination of Staphylococcus aureus from coagulase-negative staphylococci.

    PubMed

    Xu, Benjin; Liu, Ling; Liu, Li; Li, Xinping; Li, Xiaofang; Wang, Xin

    2012-11-01

    Methicillin-resistant Staphylococcus aureus (MRSA) is a global health concern that has been detected in food and food-production animals. Conventional testing for detection of MRSA takes 3 to 5 d to yield complete information on the organism and its antibiotic sensitivity pattern, so a rapid method is needed to diagnose and treat MRSA infections. The present study focused on the development of a multiplex PCR assay for the rapid and sensitive detection of MRSA. The assay simultaneously detected 4 genes, namely, 16S rRNA of the Staphylococcus genus, femA of S. aureus, mecA that encodes methicillin resistance, and one internal control. It was rapid and yielded results within 4 h. The analytical sensitivity and specificity of the multiplex PCR assay were evaluated by comparing it with the conventional method. The analytical sensitivity of the multiplex PCR assay at the DNA level was 10 ng DNA. The analytical specificity was evaluated with 10 reference staphylococci strains and was 100%. The diagnostic evaluation of MRSA was carried out using 360 foodborne staphylococci isolates, and showed 99.1% specificity, 96.4% sensitivity, 97.5% positive predictive value, and 97.3% negative predictive value compared to the conventional method. The inclusion of an internal control in the multiplex PCR assay is important to exclude false-negative cases. This test can be used as an effective diagnostic and surveillance tool to investigate the spread and emergence of MRSA. © 2012 Institute of Food Technologists®
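The diagnostic figures quoted above (sensitivity, specificity, PPV, NPV) all derive from a 2×2 confusion matrix against the reference method. A minimal sketch of that arithmetic, with made-up counts rather than the study's 360-isolate data:

```python
# Hedged sketch: diagnostic performance metrics from a 2x2 table
# (counts below are hypothetical, not the study's).
def diagnostic_metrics(tp, fp, fn, tn):
    """tp/fp/fn/tn: true/false positives and negatives vs. reference method."""
    return {
        "sensitivity": tp / (tp + fn),   # detected among true positives
        "specificity": tn / (tn + fp),   # cleared among true negatives
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = diagnostic_metrics(tp=90, fp=5, fn=10, tn=95)
print(m["sensitivity"], m["specificity"])  # -> 0.9 0.95
```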

  12. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently, widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  13. Multidimensional NMR approaches towards highly resolved, sensitive and high-throughput quantitative metabolomics.

    PubMed

    Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick

    2017-02-01

    Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Precision of Sensitivity in the Design Optimization of Indeterminate Structures

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Hopkins, Dale A.

    2006-01-01

    Design sensitivity is central to most optimization methods. The analytical sensitivity expression for an indeterminate structural design optimization problem can be factored into a simple determinate term and a complicated indeterminate component. Sensitivity can be approximated by retaining only the determinate term and setting the indeterminate factor to zero. The optimum solution is reached with the approximate sensitivity. The central processing unit (CPU) time to solution is substantially reduced. The benefit that accrues from using the approximate sensitivity is quantified by solving a set of problems in a controlled environment. Each problem is solved twice: first using the closed-form sensitivity expression, then using the approximation. The problem solutions use the CometBoards testbed as the optimization tool with the integrated force method as the analyzer. The modification that may be required to use the stiffness method as the analysis tool in optimization is discussed. The design optimization problem of an indeterminate structure contains many dependent constraints because of the implicit relationship between stresses, as well as the relationship between the stresses and displacements. The design optimization process can become problematic because the implicit relationship reduces the rank of the sensitivity matrix. The proposed approximation restores the full rank and enhances the robustness of the design optimization method.

  15. A miniaturized optoelectronic system for rapid quantitative label-free detection of harmful species in food

    NASA Astrophysics Data System (ADS)

    Raptis, Ioannis; Misiakos, Konstantinos; Makarona, Eleni; Salapatas, Alexandros; Petrou, Panagiota; Kakabakos, Sotirios; Botsialas, Athanasios; Jobst, Gerhard; Haasnoot, Willem; Fernandez-Alba, Amadeo; Lees, Michelle; Valamontes, Evangelos

    2016-03-01

    Optical biosensors have emerged in the past decade as the most promising candidates for portable, highly sensitive bioanalytical systems that can be employed for in-situ measurements. In this work, a miniaturized optoelectronic system for rapid, quantitative, label-free detection of harmful species in food is presented. The proposed system has four distinctive features that can render it a powerful tool for the next generation of Point-of-Need applications: it accommodates the light sources and ten interferometric biosensors on a single silicon chip with a footprint of less than 40 mm²; each sensor can be individually functionalized for a specific target analyte; the encapsulation can be performed at the wafer scale; and it exploits a new operation principle, Broad-band Mach-Zehnder Interferometry, to improve its analytical capabilities. Multi-analyte evaluation schemes for the simultaneous detection of harmful contaminants, such as mycotoxins, allergens and pesticides, proved that the proposed system is capable of detecting these substances within a short time at concentrations below the limits imposed by regulatory authorities, rendering it a novel tool for near-future food safety applications.

  16. Capture-based next-generation sequencing reveals multiple actionable mutations in cancer patients failed in traditional testing.

    PubMed

    Xie, Jing; Lu, Xiongxiong; Wu, Xue; Lin, Xiaoyi; Zhang, Chao; Huang, Xiaofang; Chang, Zhili; Wang, Xinjing; Wen, Chenlei; Tang, Xiaomei; Shi, Minmin; Zhan, Qian; Chen, Hao; Deng, Xiaxing; Peng, Chenghong; Li, Hongwei; Fang, Yuan; Shao, Yang; Shen, Baiyong

    2016-05-01

    Targeted therapies, including monoclonal antibodies and small-molecule inhibitors, have dramatically changed the treatment of cancer over the past 10 years. Their therapeutic advantages are greater tumor specificity and fewer side effects. For precisely tailoring available targeted therapies to each individual or subset of cancer patients, next-generation sequencing (NGS) has been utilized as a promising diagnostic tool with the advantages of accuracy, sensitivity, and high throughput. We developed and validated an NGS-based cancer genomic diagnosis targeting 115 prognosis- and therapeutics-relevant genes on multiple specimens, including blood, tumor tissue, and body fluid, from 10 patients with different cancer types. The sequencing data were then analyzed by clinically applicable analytical pipelines developed in house. We have assessed the analytical sensitivity, specificity, and accuracy of the NGS-based molecular diagnosis. Our analytical pipelines were capable of detecting base substitutions, indels, and gene copy number variations (CNVs). For instance, several actionable mutations of EGFR, PIK3CA, TP53, and KRAS were detected, indicating drug susceptibility and resistance in the cases of lung cancer. Our study has shown that NGS-based molecular diagnosis is more sensitive and comprehensive in detecting genomic alterations in cancer, and supports direct clinical use for guiding targeted therapy.

  17. Analytical and experimental studies of leak location and environment characterization for the international space station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woronowicz, Michael; Blackmon, Rebecca; Brown, Martin

    2014-12-09

    The International Space Station program is developing a robotically operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer (RGA) for partial pressure measurements and a full-range pressure gauge for total pressure measurements. The primary application is to demonstrate the ability to detect NH₃ coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full-range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lbm/yr to about 1 lbm/day. These data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument, taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.

  18. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotron) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations of biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performance. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  19. Carbon Nanomaterial Based Biosensors for Non-Invasive Detection of Cancer and Disease Biomarkers for Clinical Diagnosis

    PubMed Central

    Tung, Thanh Tran

    2017-01-01

    The early diagnosis of diseases, e.g., Parkinson’s and Alzheimer’s disease, diabetes, and various types of cancer, and monitoring the response of patients to therapy play a critical role in clinical treatment; therefore, there is intensive research into the determination of many clinical analytes. In order to achieve point-of-care sensing in clinical practice, sensitive, selective, cost-effective, simple, reliable, and rapid analytical methods are required. Biosensors have become essential tools in biomarker sensing, in which electrode material and architecture play critical roles in achieving sensitive and stable detection. Carbon nanomaterials in the form of particles/dots, tubes/wires, and sheets have recently become indispensable elements of biosensor platforms due to their excellent mechanical, electronic, and optical properties. This review summarizes developments in this lucrative field by presenting major biosensor types and the variability of sensor platforms in biomedical applications. PMID:28825646

  20. MetMatch: A Semi-Automated Software Tool for the Comparison and Alignment of LC-HRMS Data from Different Metabolomics Experiments

    PubMed Central

    Koch, Stefan; Bueschl, Christoph; Doppler, Maria; Simader, Alexandra; Meng-Reiterer, Jacqueline; Lemmens, Marc; Schuhmacher, Rainer

    2016-01-01

    Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language, it features an intuitive user interface and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., processed LC-HRMS chromatograms with mass-to-charge ratio (m/z) values and retention times) that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley. PMID:27827849

  1. MetMatch: A Semi-Automated Software Tool for the Comparison and Alignment of LC-HRMS Data from Different Metabolomics Experiments.

    PubMed

    Koch, Stefan; Bueschl, Christoph; Doppler, Maria; Simader, Alexandra; Meng-Reiterer, Jacqueline; Lemmens, Marc; Schuhmacher, Rainer

    2016-11-02

    Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language, it features an intuitive user interface and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., processed LC-HRMS chromatograms with mass-to-charge ratio (m/z) values and retention times) that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley.
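The core matching step such a tool performs can be sketched as pairing features by m/z within a ppm tolerance and by shift-corrected retention time within a window. This is a minimal illustration of the principle, not MetMatch's actual algorithm; all tolerances and feature values below are invented:

```python
# Hedged sketch: pairing LC-HRMS features across two runs by m/z (ppm)
# and shift-corrected retention time. Illustrative only.
def match_features(reference, query, ppm_tol=10.0, rt_tol=0.25, rt_shift=0.0):
    """Each feature is a (mz, rt) tuple, rt in minutes.

    A query feature matches a reference feature when the m/z deviation is
    within `ppm_tol` parts per million and the shift-corrected RT difference
    is within `rt_tol` minutes. Returns (ref_index, query_index) pairs.
    """
    pairs = []
    for qi, (qmz, qrt) in enumerate(query):
        for ri, (rmz, rrt) in enumerate(reference):
            ppm = abs(qmz - rmz) / rmz * 1e6
            if ppm <= ppm_tol and abs((qrt - rt_shift) - rrt) <= rt_tol:
                pairs.append((ri, qi))
                break  # take the first acceptable reference feature
    return pairs

ref = [(180.0634, 5.10), (203.0526, 7.42)]
qry = [(180.0641, 5.32), (250.1100, 9.00)]  # second feature has no partner
print(match_features(ref, qry, ppm_tol=10, rt_tol=0.1, rt_shift=0.2))  # -> [(0, 0)]
```

A production tool would additionally estimate `rt_shift` from the data (e.g., by regression on confident matches) and resolve ambiguous many-to-one candidates.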

  2. European multicenter analytical evaluation of the Abbott ARCHITECT STAT high sensitive troponin I immunoassay.

    PubMed

    Krintus, Magdalena; Kozinski, Marek; Boudry, Pascal; Capell, Nuria Estañ; Köller, Ursula; Lackner, Karl; Lefèvre, Guillaume; Lennartz, Lieselotte; Lotz, Johannes; Herranz, Antonio Mora; Nybo, Mads; Plebani, Mario; Sandberg, Maria B; Schratzberger, Wolfgang; Shih, Jessie; Skadberg, Øyvind; Chargui, Ahmed Taoufik; Zaninotto, Martina; Sypniewska, Grazyna

    2014-11-01

    International recommendations highlight the superior value of cardiac troponins (cTns) for early diagnosis of myocardial infarction along with analytical requirements of improved precision and detectability. In this multicenter study, we investigated the analytical performance of a new high sensitive cardiac troponin I (hs-cTnI) assay and its 99th percentile upper reference limit (URL). Laboratories from nine European countries evaluated the ARCHITECT STAT high sensitive troponin I (hs-TnI) immunoassay on the ARCHITECT i2000SR/i1000SR immunoanalyzers. Imprecision, limit of blank (LoB), limit of detection (LoD), limit of quantitation (LoQ), linearity of dilution, interferences, sample type, method comparisons, and 99th percentile URLs were evaluated in this study. Total imprecision of 3.3%-8.9%, 2.0%-3.5% and 1.5%-5.2% was determined for the low, medium and high controls, respectively. The lowest cTnI concentration corresponding to a total CV of 10% was 5.6 ng/L. Common interferences, sample dilution and carryover did not affect the hs-cTnI results. Slight, but statistically significant, differences with sample type were found. Concordance between the investigated hs-cTnI assay and the contemporary cTnI assay at the 99th percentile cut-off was found to be 95%. TnI was detectable in 75% and 57% of the apparently healthy population using the lower (1.1 ng/L) and upper (1.9 ng/L) limit of the LoD range provided by the ARCHITECT STAT hs-TnI package insert, respectively. The 99th percentile values were gender dependent. The new ARCHITECT STAT hs-TnI assay with improved analytical features meets the criteria for a high-sensitivity Tn test and will be a valuable diagnostic tool.
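The "lowest concentration corresponding to a total CV of 10%" quoted above is conventionally read off a precision profile, interpolating between measured (concentration, CV) points. A minimal sketch of that interpolation on a log-concentration scale; the profile values below are invented, not the study's:

```python
# Hedged sketch: functional sensitivity (concentration at a target CV)
# from a precision profile, interpolated on log10(concentration).
import math

def conc_at_cv(profile, target_cv=10.0):
    """profile: (concentration, CV%) pairs, concentration ascending,
    CV falling as concentration rises. Returns the interpolated
    concentration where CV equals `target_cv`."""
    for (c1, cv1), (c2, cv2) in zip(profile, profile[1:]):
        if cv1 >= target_cv >= cv2:
            f = (cv1 - target_cv) / (cv1 - cv2)
            return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
    raise ValueError("target CV outside profile range")

# Invented precision profile (ng/L, CV%):
profile = [(2.0, 18.0), (4.0, 12.0), (8.0, 8.0), (16.0, 5.0)]
print(round(conc_at_cv(profile), 2))  # -> 5.66
```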

  3. IUS solid rocket motor contamination prediction methods

    NASA Technical Reports Server (NTRS)

    Mullen, C. R.; Kearnes, J. H.

    1980-01-01

    A series of computer codes were developed to predict solid rocket motor produced contamination to spacecraft sensitive surfaces. Subscale and flight test data have confirmed some of the analytical results. Application of the analysis tools to a typical spacecraft has provided early identification of potential spacecraft contamination problems and provided insight into their solution; e.g., flight plan modifications, plume or outgassing shields and/or contamination covers.

  4. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in the identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage, in combination with different chemometric tools. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, vegetables, etc., for qualitative and quantitative analysis with different chemometric approaches.

  5. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
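One of the simplest sensitivity analyses the abstract advocates is leave-one-out: recompute the pooled estimate with each study removed and see whether any single study drives the result. A minimal sketch for a fixed-effect (inverse-variance) model, with hypothetical effect sizes and variances:

```python
# Hedged sketch: leave-one-out sensitivity analysis for a fixed-effect
# meta-analysis. Effect sizes and variances below are hypothetical.
def fe_estimate(effects, variances):
    """Inverse-variance weighted fixed-effect pooled estimate."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

def leave_one_out(effects, variances):
    """Pooled estimate recomputed with each study dropped in turn."""
    out = []
    for i in range(len(effects)):
        es = effects[:i] + effects[i + 1:]
        vs = variances[:i] + variances[i + 1:]
        out.append(fe_estimate(es, vs))
    return out

effects = [0.30, 0.25, 0.90, 0.28]     # study 3 is a potential outlier
variances = [0.02, 0.03, 0.02, 0.025]
print(fe_estimate(effects, variances))
print(leave_one_out(effects, variances))  # dropping study 3 shifts the pool most
```

Ecological and evolutionary data usually also need random-effects and phylogenetic covariance structures; the same leave-one-out loop applies around any pooled estimator.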

  6. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology along with being truly surface sensitive (that is less than 10 atomic layers) are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  7. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

    A regional screening tool-which is useful in cases where few site-specific parameters are available for complex vadose zone models-assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., retardation and attenuation factors), allowing the estimated leaching fraction of volatile chemicals to be updated based on recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as the recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and non-leacher, determined under local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant; and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) consistently with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
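The retardation and attenuation factors mentioned above are classic Tier 1 screening indices; extending them to volatile compounds adds an air-phase partitioning term governed by the Henry's law constant. A sketch of that form, under the assumption that the revised tool follows the standard index structure (all parameter values below are illustrative, not the study's):

```python
# Hedged sketch: retardation (RF) and attenuation (AF) screening indices,
# with an air-phase term for volatile compounds. Values are illustrative.
import math

def retardation_factor(bulk_density, foc, koc, theta, air_content=0.0, kh=0.0):
    """RF = 1 + (rho_b * foc * Koc)/theta + (a * KH)/theta

    bulk_density: g/cm^3; foc: organic carbon fraction; koc: L/kg
    theta: volumetric water content; air_content: air-filled porosity
    kh: dimensionless Henry's law constant (air-phase term for VOCs)
    """
    return 1.0 + (bulk_density * foc * koc) / theta + (air_content * kh) / theta

def attenuation_factor(depth, recharge, theta, rf, half_life):
    """AF = exp(-0.693 * d * theta * RF / (q * t_half)); fraction leached.

    depth in m, recharge q in m/d, half_life in d.
    """
    return math.exp(-0.693 * depth * theta * rf / (recharge * half_life))

rf = retardation_factor(bulk_density=1.4, foc=0.02, koc=100.0, theta=0.3)
af = attenuation_factor(depth=1.0, recharge=0.0018, theta=0.3, rf=rf, half_life=60.0)
print(rf, af)  # AF near 0 means almost nothing reaches groundwater
```

A chemical ranks as a likely leacher when its AF exceeds that of the known-leacher reference under the same recharge and soil inputs.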

  8. Nucleic acid-based electrochemical nanobiosensors.

    PubMed

    Abi, Alireza; Mohammadpour, Zahra; Zuo, Xiaolei; Safavi, Afsaneh

    2018-04-15

    The detection of biomarkers using sensitive and selective analytical devices is critically important for the early stage diagnosis and treatment of diseases. The synergy between the high specificity of nucleic acid recognition units and the great sensitivity of electrochemical signal transductions has already shown promise for the development of efficient biosensing platforms. Yet nucleic-acid based electrochemical biosensors often rely on target amplification strategies (e.g., polymerase chain reactions) to detect analytes at clinically relevant concentration ranges. The complexity and time-consuming nature of these amplification methods impede moving nucleic acid-based electrochemical biosensors from laboratory-based to point-of-care test settings. Fortunately, advancements in nanotechnology have provided growing evidence that the recruitment of nanoscaled materials and structures can enhance the biosensing performance (particularly in terms of sensitivity and response time) to the level suitable for use in point-of-care diagnostic tools. This Review highlights the significant progress in the field of nucleic acid-based electrochemical nanobiosensing with the focus on the works published during the last five years. Copyright © 2017. Published by Elsevier B.V.

  9. Polymers imprinted with PAH mixtures--comparing fluorescence and QCM sensors.

    PubMed

    Lieberzeit, Peter A; Halikias, Konstantin; Afzal, Adeel; Dickert, Franz L

    2008-12-01

    Molecular imprinting with binary mixtures of different polycyclic aromatic hydrocarbons (PAH) is a tool for design of chemically highly sensitive layers for detection of these analytes. Sensor responses increase by one order of magnitude compared with layers imprinted with one type of template. Detection limits, e.g. for pyrene, reach down to 30 ng L(-1) in water, as could be observed with a naphthalene and pyrene-imprinted polyurethane. Comparing sensor characteristics obtained by QCM and fluorescence reveals different saturation behaviours indicating that, first, single PAH molecules occupy the interaction centres followed by gradual excimer incorporation at higher concentrations finally leading to substantial quenching, when all accessible cavities are occupied. The plateau in the mass-sensitive measurements suggests that up to 80% of the cavities generated in the MIP are re-occupied. Displacement measurements between chrysene and pyrene revealed that for imprinted layers with very high pyrene sensitivities the signals of both PAH are additive, whereas in materials with lower pyrene uptake the two analytes replace each other in the interaction sites of the polymer.

  10. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  11. Media as a teaching tool in psychiatric nursing education.

    PubMed

    Wall, Barbra Mann; Rossen, Eileen K

    2004-01-01

    The authors describe a course in psychiatric nursing where media in the form of literature, film, and music were used as teaching strategies. The purpose was to enhance students' sensitivity to the personal experiences of psychiatric patients while also broadening students' understanding of mental illness and the institutions developed to treat it. Students' critical reading, thinking, and analytic skills were cultivated, along with introspection and self-reflection.

  12. Electrochemical detection for microscale analytical systems: a review.

    PubMed

    Wang, Joseph

    2002-02-11

    As the field of chip-based microscale systems continues its rapid growth, there is an urgent need to develop compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low power requirements and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.

  13. Analytic Closed-Form Solution of a Mixed Layer Model for Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Akyurek, Bengu Ozge

    Stratocumulus clouds play an important role in climate cooling and are hard to predict using global climate and weather forecast models. Thus, previous studies in the literature use observations and numerical simulation tools, such as large-eddy simulation (LES), to solve the governing equations for the evolution of stratocumulus clouds. In contrast to the previous works, this work provides an analytic closed-form solution to the cloud thickness evolution of stratocumulus clouds in a mixed-layer model framework. With a focus on application over coastal lands, the diurnal cycle of cloud thickness and whether or not clouds dissipate are of particular interest. An analytic solution enables the sensitivity analysis of implicitly interdependent variables and extrema analysis of cloud variables that are hard to achieve using numerical solutions. In this work, the sensitivity of inversion height, cloud-base height, and cloud thickness with respect to initial and boundary conditions, such as Bowen ratio, subsidence, surface temperature, and initial inversion height, are studied. A critical initial cloud thickness value that can be dissipated pre- and post-sunrise is provided. Furthermore, an extrema analysis is provided to obtain the minima and maxima of the inversion height and cloud thickness within 24 h. The proposed solution is validated against LES results under the same initial and boundary conditions. Then, the proposed analytic framework is extended to incorporate multiple vertical columns that are coupled by advection through wind flow. This enables a bridge between the micro-scale and the mesoscale relations. The effect of advection on cloud evolution is studied and a sensitivity analysis is provided.

  14. A simple and sensitive method for the determination of fibric acids in the liver by liquid chromatography.

    PubMed

    Karahashi, Minako; Fukuhara, Hiroto; Hoshina, Miki; Sakamoto, Takeshi; Yamazaki, Tohru; Mitsumoto, Atsushi; Kawashima, Yoichi; Kudo, Naomi

    2014-01-01

    Fibrates are used in biochemical and pharmacological studies as bioactive tools. Nevertheless, most studies have lacked information concerning the concentrations of fibric acids working inside tissues because a simple and sensitive method is not available for their quantitation. This study aimed to develop a simple and sensitive bioanalytical method for the quantitation of clofibric, bezafibric and fenofibric acids in very small tissue samples. Fibric acids were extracted into n-hexane-ethyl acetate from tissue homogenates (10 mg of liver, kidney or muscle) or serum (100 µL) and were derivatized with 4-bromomethyl-6,7-dimethoxycoumarin, followed by HPLC with fluorescence detection. These compounds were separated isocratically on a reversed phase with acetonitrile-water. Standard analytical curves were linear over the concentration range of 0.2-20 nmol/10 mg of liver. Precision and accuracy were within acceptable limits. Recovery from liver homogenates ranged from 93.03 to 112.29%. This method enabled the quantitation of fibric acids in 10 mg of liver from rats treated with clofibric acid, bezafibric acid or fenofibrate. From these analytical data, it became clear that there was no large difference in the ratio of acyl-CoA oxidase 1 (Acox1) mRNA level to fibric acid content in the liver among the three fibric acids, suggesting that these three fibric acids have similar potency to increase expression of the Acox1 gene, which is a target of peroxisome proliferator-activated receptor α. Thus, the proposed method is a simple, sensitive and reliable tool for the quantitation of fibric acids working in vivo inside livers.
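
    The quantitation step described above rests on an ordinary linear calibration curve (here 0.2-20 nmol per 10 mg liver). A minimal sketch of the idea, with hypothetical peak areas and a hypothetical 10 nmol spike for the recovery estimate; the fit and back-calculation are generic, not the authors' software:

```python
import statistics

def fit_line(x, y):
    """Ordinary least-squares slope/intercept for a calibration curve."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical fluorescence peak areas for standards (nmol per 10 mg liver).
conc = [0.2, 0.5, 2.0, 5.0, 10.0, 20.0]
area = [4.1, 10.2, 40.5, 101.0, 201.8, 404.0]   # roughly linear
slope, intercept = fit_line(conc, area)

# Back-calculate an unknown sample and a spiked-recovery estimate.
unknown = (150.0 - intercept) / slope                     # nmol/10 mg
recovery_pct = 100.0 * ((205.0 - intercept) / slope) / 10.0  # 10 nmol spike
```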

  15. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A.; Heijungs, R.

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
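
    The analytical approach mentioned above is commonly a first-order (Taylor) propagation, in which the output variance is gᵀΣg for output gradient g and input covariance Σ; zeroing the off-diagonal covariances is exactly "ignoring correlation". A hedged sketch with invented numbers, showing how ignoring a positive correlation can overestimate the output variance when the gradient entries have opposite signs:

```python
import numpy as np

def propagated_variance(grad, cov):
    """First-order Taylor propagation: Var(y) ~ g^T Sigma g."""
    g = np.asarray(grad, dtype=float)
    return float(g @ np.asarray(cov, dtype=float) @ g)

# Hypothetical output y = f(x1, x2) with gradient g at the mean point.
g = [2.0, -1.0]

# Full covariance with a positive correlation between the inputs.
sd = np.array([0.5, 0.4])
rho = 0.6
cov_full = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                     [rho * sd[0] * sd[1], sd[1]**2]])

# Same variances, correlation ignored (off-diagonals zeroed).
cov_diag = np.diag(sd**2)

var_full = propagated_variance(g, cov_full)     # 0.68
var_ignored = propagated_variance(g, cov_diag)  # 1.16 -> overestimate here
```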

  16. Fast computation of derivative based sensitivities of PSHA models via algorithmic differentiation

    NASA Astrophysics Data System (ADS)

    Leövey, Hernan; Molkenthin, Christian; Scherbaum, Frank; Griewank, Andreas; Kuehn, Nicolas; Stafford, Peter

    2015-04-01

    Probabilistic seismic hazard analysis (PSHA) is the preferred tool for estimation of potential ground-shaking hazard due to future earthquakes at a site of interest. A modern PSHA represents a complex framework which combines different models with possibly many inputs. Sensitivity analysis is a valuable tool for quantifying changes of a model output as inputs are perturbed, identifying critical input parameters and obtaining insight into the model behavior. Differential sensitivity analysis relies on calculating first-order partial derivatives of the model output with respect to its inputs. Moreover, derivative-based global sensitivity measures (Sobol' & Kucherenko '09) can be used in practice to detect non-essential inputs of the models, thus restricting the focus of attention to a possibly much smaller set of inputs. Nevertheless, obtaining first-order partial derivatives of complex models with traditional approaches can be very challenging, and the computational cost usually increases linearly with the number of inputs appearing in the models. In this study we show how algorithmic differentiation (AD) tools can be used in a complex framework such as PSHA to successfully estimate derivative-based sensitivities, as is the case in various other domains such as meteorology or aerodynamics, without significant increase in the computational complexity required for the original computations. First we demonstrate the feasibility of the AD methodology by comparing AD-derived sensitivities to analytically derived sensitivities for a basic case of PSHA using a simple ground-motion prediction equation. In a second step, we derive sensitivities via AD for a more complex PSHA study using a ground-motion attenuation relation based on a stochastic method to simulate strong motion. The presented approach is general enough to accommodate more advanced PSHA studies of higher complexity.
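
    Forward-mode AD, one of the techniques such tools implement, can be sketched with dual numbers: each value carries its derivative, and arithmetic propagates both at machine precision, with no finite-difference step size. The toy ground-motion relation below (ln SA = a + b·M − c·ln R, with invented coefficients) is purely illustrative, not a published GMPE:

```python
import math

class Dual:
    """Minimal forward-mode AD number: value v plus derivative d."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(float(o))
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.v * o.v, self.d * o.v + self.v * o.d)
    __rmul__ = __mul__

def dlog(x):
    """Natural log with derivative propagation: (ln v)' = d / v."""
    return Dual(math.log(x.v), x.d / x.v)

def toy_gmpe(m, r):
    """Hypothetical toy relation: ln(SA) = a + b*M - c*ln(R)."""
    a, b, c = -1.0, 0.9, 1.2
    return a + b * m + (-c) * dlog(r)

# Seed the derivative of the input of interest with 1 to read off
# d(lnSA)/d(input) exactly, alongside the ordinary function value.
d_dm = toy_gmpe(Dual(6.0, 1.0), Dual(20.0, 0.0)).d   # = b = 0.9
d_dr = toy_gmpe(Dual(6.0, 0.0), Dual(20.0, 1.0)).d   # = -c/R = -0.06
```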

  17. Investigating the causes of low detectability of pesticides in fruits and vegetables analysed by high-performance liquid chromatography - Time-of-flight.

    PubMed

    Muehlwald, S; Buchner, N; Kroh, L W

    2018-03-23

    Because of the high number of possible pesticide residues and their chemical complexity, it is necessary to develop methods which cover a broad range of pesticides. In this work, a qualitative multi-screening method for pesticides was developed by use of HPLC-ESI-Q-TOF. 110 pesticides were chosen for the creation of a personal compound database and library (PCDL). The MassHunter Qualitative Analysis software from Agilent Technologies was used to identify the analytes. The software parameter settings were optimised to produce a low number of false positive as well as false negative results. The method was validated for 78 selected pesticides. However, the validation criteria were not fulfilled for 45 analytes. Due to this result, investigations were started to elucidate reasons for the low detectability. It could be demonstrated that the three main causes of the signal suppression were the co-eluting matrix (matrix effect), the low sensitivity of the analyte in standard solution and the fragmentation of the analyte in the ion source (in-source collision-induced dissociation). In this paper different examples are discussed showing that the impact of these three causes is different for each analyte. For example, it is possible that an analyte with low signal intensity and an intense fragmentation in the ion source is detectable in a difficult matrix, whereas an analyte with a high sensitivity and a low fragmentation is not detectable in a simple matrix. Additionally, it could be shown that in-source fragments are a helpful tool for an unambiguous identification. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    USGS Publications Warehouse

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  19. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  20. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments-lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  1. What to do with thyroid nodules showing benign cytology and BRAF(V600E) mutation? A study based on clinical and radiologic features using a highly sensitive analytic method.

    PubMed

    Kim, Soo-Yeon; Kim, Eun-Kyung; Kwak, Jin Young; Moon, Hee Jung; Yoon, Jung Hyun

    2015-02-01

    BRAF(V600E) mutation analysis has been used as a complementary diagnostic tool to ultrasonography-guided fine-needle aspiration (US-FNA) in the diagnosis of thyroid nodules, with reported specificity of up to 100%. When highly sensitive analytic methods are used, however, false-positive results of BRAF(V600E) mutation analysis have been reported. In this study, we investigated the clinical features, US features, and outcomes of patients with thyroid nodules with benign cytology but positive BRAF(V600E) mutation on highly sensitive analytic methods from US-FNA. This study included 22 nodules in 22 patients (3 men, 19 women; mean age, 53 years) with benign cytology but positive BRAF(V600E) mutation from US-FNA. US features were categorized according to internal components, echogenicity, margin, calcifications, and shape. Suspicious US features included marked hypoechogenicity, noncircumscribed margins, micro or mixed calcifications, and nonparallel shape. Nodules were considered to have US features either concordant or discordant with benign cytology. Medical records and imaging studies were reviewed for final cytopathology results and outcomes during follow-up. Among the 22 nodules, 17 were reviewed. Fifteen of the 17 nodules were malignant, and 2 were benign. The benign nodules were confirmed as adenomatous hyperplasia with underlying lymphocytic thyroiditis and a fibrotic nodule with dense calcification. Thirteen of the 15 malignant nodules had 2 or more suspicious US features, and all 15 nodules were considered to have cytology discordant with their suspicious US features. Five nodules had been followed with US or US-FNA without resection and did not show changes in size or US features on follow-up US examinations. BRAF(V600E) mutation analysis is a highly sensitive diagnostic tool in the diagnosis of papillary thyroid carcinomas. In the management of thyroid nodules with benign cytology but positive BRAF(V600E) mutation, thyroidectomy should be considered for nodules which have 2 or more suspicious US features and are considered discordant on image-cytology correlation. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Tributyltin--critical pollutant in whole water samples--development of traceable measurement methods for monitoring under the European Water Framework Directive (WFD) 2000/60/EC.

    PubMed

    Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert

    2015-07-01

    Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input into the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.

  3. The Role of Transport Phenomena in Whispering Gallery Mode Optical Biosensor Performance

    NASA Astrophysics Data System (ADS)

    Gamba, Jason

    Whispering gallery mode (WGM) optical resonator sensors have emerged as promising tools for label-free detection of biomolecules in solution. These devices have even demonstrated single-molecule limits of detection in complex biological fluids. This extraordinary sensitivity makes them ideal for low-concentration analytical and diagnostic measurements, but a great deal of work must be done toward understanding and optimizing their performance before they are capable of reliable quantitative measurements. The present work explores the physical processes behind this extreme sensitivity and how to best take advantage of them for practical applications of this technology. I begin by examining the nature of the interaction between the intense electromagnetic fields that build up in the optical biosensor and the biomolecules that bind to its surface. This work addresses the need for a coherent and thorough physical model that can be used to predict sensor behavior for a range of experimental parameters. While this knowledge will prove critical for the development of this technology, it has also shone a light on nonlinear thermo-optical and optical phenomena that these devices are uniquely suited to probing. The surprisingly rapid transient response of toroidal WGM biosensors despite sub-femtomolar analyte concentrations is also addressed. The development of asymmetric boundary layers around these devices under flow is revealed to enhance the capture rate of proteins from solution compared to the spherical sensors used previously. These lessons will guide the design of flow systems to minimize measurement time and consumption of precious sample, a key factor in any medically relevant assay. Finally, experimental results suggesting that WGM biosensors could be used to improve the quantitative detection of small-molecule biomarkers in exhaled breath condensate demonstrate how their exceptional sensitivity and transient response can enable the use of this noninvasive method to probe respiratory distress. WGM biosensors are unlike any other analytical tool, and the work presented here focuses on answering engineering questions surrounding their performance and potential.

  4. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for the analysis and quantitation of selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in a wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (an immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to conventional chemical-analytical methodology, such as gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC-LC/tandem mass spectrometry (MS/MS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrices. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, and EE2 is, in decreasing order: LC-MS/MS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MS/MS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, is required in South Africa. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentrations of the steroid estrogens appears to be appropriate for their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, and raw and treated wastewater, the use of bioassays with trigger values is a useful screening option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods like ELISA, and standardization of the method for measuring E2 equivalents (EEQs) used for biological (e.g., estrogenic endocrine) activity, are some areas for future EDC research.

  5. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
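
    The traffic-light idea above can be sketched as a trivial mapping from qualitative impact levels to colours; the step names and levels below are hypothetical illustrations, not GAPI's actual field definitions:

```python
def gapi_color(impact):
    """Map a qualitative impact level to a GAPI-style traffic-light colour."""
    return {"low": "green", "medium": "yellow", "high": "red"}[impact]

# Hypothetical per-step assessment of one analytical procedure.
steps = {
    "collection": "low",
    "preservation": "medium",
    "transport": "low",
    "preparation": "high",
    "determination": "medium",
}
colors = {step: gapi_color(level) for step, level in steps.items()}
```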

  6. Non-enzymatic browning in citrus juice: chemical markers, their detection and ways to improve product quality.

    PubMed

    Bharate, Sonali S; Bharate, Sandip B

    2014-10-01

    Citrus juices are widely consumed due to their nutritional benefits and variety of pharmacological properties. Non-enzymatic browning (NEB) is one of the most important chemical reactions responsible for quality and color changes during the heating or prolonged storage of citrus products. The present review covers various aspects of NEB in citrus juice, viz. the chemistry of NEB, identifiable markers of NEB, analytical methods to identify NEB markers, and ways to improve the quality of citrus juice. 2,5-Dimethyl-4-hydroxy-3(2H)-furanone (DMHF) is one of the most promising markers formed during the browning process, with a number of analytical methods reported for its analysis; it can therefore be used as an indicator of the NEB process. Amongst the analytical methods reported, RP-HPLC is the most sensitive and accurate, and can be used as an analytical tool. NEB can be prevented by removal of amino acids/proteins (via ion exchange treatment) or by targeting NEB reactions (e.g. blockage of furfural/HMF by a sulphiting agent).

  7. Electrochemical Quartz Crystal Nanobalance (EQCN) Based Biosensor for Sensitive Detection of Antibiotic Residues in Milk.

    PubMed

    Bhand, Sunil; Mishra, Geetesh K

    2017-01-01

    An electrochemical quartz crystal nanobalance (EQCN), which provides real-time analysis of dynamic surface events, is a valuable tool for analyzing biomolecular interactions. EQCN biosensors are based on mass-sensitive measurements that can detect small mass changes caused by chemical binding to small piezoelectric crystals. Among the various biosensors, the piezoelectric biosensor is considered one of the most sensitive analytical techniques, capable of detecting antigens at picogram levels. EQCN is an effective monitoring technique for regulation of the antibiotics below the maximum residual limit (MRL). The analysis of antibiotic residues requires high sensitivity, rapidity, reliability and cost effectiveness. For analytical purposes the general approach is to take advantage of the piezoelectric effect by immobilizing a biosensing layer on top of the piezoelectric crystal. The sensing layer usually comprises a biological material such as an antibody, enzymes, or aptamers having high specificity and selectivity for the target molecule to be detected. The biosensing layer is usually functionalized using surface chemistry modifications. When these bio-functionalized quartz crystals are exposed to a particular substance of interest (e.g., a substrate, inhibitor, antigen or protein), binding interaction occurs. This causes a frequency or mass change that can be used to determine the amount of material interacted or bound. EQCN biosensors can easily be automated by using a flow injection analysis (FIA) setup coupled through automated pumps and injection valves. Such FIA-EQCN biosensors have great potential for the detection of different analytes such as antibiotic residues in various matrices such as water, waste water, and milk.
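
    The mass-to-frequency relationship underlying QCM/EQCN sensing is the Sauerbrey equation, Δf = −2 f₀² Δm / (A √(ρ_q μ_q)). A small sketch using standard AT-cut quartz constants; the 5 MHz crystal and 1 cm² active area are illustrative choices, not parameters from this chapter:

```python
import math

# Standard AT-cut quartz constants for the Sauerbrey relation.
RHO_Q = 2.648      # quartz density, g/cm^3
MU_Q = 2.947e11    # quartz shear modulus, g/(cm * s^2)

def freq_shift_hz(delta_m_g, f0_hz, area_cm2):
    """Frequency shift (Hz) for a rigid mass load delta_m_g (grams)."""
    cf = 2.0 * f0_hz**2 / (area_cm2 * math.sqrt(RHO_Q * MU_Q))
    return -cf * delta_m_g

# Example: 1 ng bound to a 5 MHz crystal with 1 cm^2 active area
# gives a shift of roughly -0.057 Hz, i.e. sub-nanogram resolution
# needs sub-0.1 Hz frequency stability.
df = freq_shift_hz(1e-9, 5e6, 1.0)
```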

  8. Thin silica shell coated Ag assembled nanostructures for expanding generality of SERS analytes

    PubMed Central

    Kang, Yoo-Lee; Lee, Minwoo; Kang, Homan; Kim, Jaehi; Pham, Xuan-Hung; Kim, Tae Han; Hahm, Eunil; Lee, Yoon-Sik; Jeong, Dae Hong

    2017-01-01

    Surface-enhanced Raman scattering (SERS) provides a unique non-destructive spectroscopic fingerprint for chemical detection. However, intrinsic differences in the affinity of analyte molecules to metal surfaces hinder SERS as a universal quantitative detection tool for various analyte molecules simultaneously. This must be overcome while keeping analyte molecules in close proximity to the metal surface. Moreover, assembled metal nanoparticle (NP) structures may allow more sensitive and reliable detection of chemicals than single-NP structures. For this purpose, here we introduce thin silica-coated and assembled Ag NPs (SiO2@Ag@SiO2 NPs) for simultaneous and quantitative detection of chemicals that have different intrinsic affinities to silver metal. These SiO2@Ag@SiO2 NPs could detect each SERS peak of aniline or 4-aminothiophenol (4-ATP) from the mixture, with limits of detection (LOD) of 93 ppm and 54 ppb, respectively. The E-field distribution as a function of interparticle distance was simulated using discrete dipole approximation (DDA) calculations to gain insight into the enhanced scattering of these thin silica-coated Ag NP assemblies. These NPs were successfully applied to detect aniline in river water and tap water. The results suggest that SiO2@Ag@SiO2 NP-based SERS detection systems can be used as a simple and universal detection tool for environmental pollutants and food safety. PMID:28570633
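
    One common way to express limits of detection such as those quoted above is the 3σ criterion: three times the standard deviation of blank replicates divided by the calibration slope. A minimal sketch with invented blank readings and slope (the abstract does not state which LOD convention was used):

```python
import statistics

def lod_3sigma(blank_signals, slope):
    """IUPAC-style detection limit: 3 * s_blank / calibration slope."""
    s_blank = statistics.stdev(blank_signals)
    return 3.0 * s_blank / slope

# Hypothetical blank replicates (counts) and calibration slope (counts/ppb).
blanks = [102.0, 98.0, 101.0, 99.0, 100.0]
slope = 12.5
lod = lod_3sigma(blanks, slope)   # detection limit in ppb
```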

  9. Employing Solid Phase Microextraction as Extraction Tool for Pesticide Residues in Traditional Medicinal Plants

    PubMed Central

    Gondo, Thamani T.; Mmualefe, Lesego C.; Okatch, Harriet

    2016-01-01

    HS-SPME was optimised using a blank plant sample for the analysis of organochlorine pesticides (OCPs) of varying polarities in selected medicinal plants obtained from the northern part of Botswana, where OCPs such as DDT and endosulfan have historically been applied to control disease-carrying vectors (mosquitoes and tsetse flies). The optimised SPME parameters were used to isolate analytes from root samples of five medicinal plants obtained from Maun and Kasane, Botswana. Final analyte determination was performed by gas chromatography with electron capture detection (GC-ECD), and analyte identity was confirmed by gas chromatography with electron ionisation mass spectrometry (GC-MS). Dieldrin was the only pesticide detected and confirmed by MS, in the Terminalia sericea sample obtained from Kasane. The method was validated: analyte recoveries ranged from 69.58 ± 7.20 to 113 ± 15.44%, with RSDs ranging from 1.19 to 17.97%. The method showed good linearity (R² > 0.9900) in the range of 2 to 100 ng g⁻¹ and proved sensitive, with low limits of detection (LODs) ranging from 0.48 ± 0.16 to 1.50 ± 0.50 ng g⁻¹. It can be concluded that SPME was successfully utilized as a sampling and extraction tool for pesticides of diverse polarities in root samples of medicinal plants. PMID:27725893
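    The validation figures of merit reported above (spike recoveries, calibration-based detection limits) follow standard formulas; a minimal sketch with invented numbers, purely for illustration and not values from the study:

```python
def recovery_percent(measured, spiked):
    """Spike recovery expressed as a percentage of the nominal spiked amount."""
    return 100.0 * measured / spiked

def lod_from_calibration(sd_blank, slope):
    """ICH-style limit of detection: 3.3 x (SD of blank response) / calibration slope."""
    return 3.3 * sd_blank / slope

# Invented numbers for illustration only (not values from the study):
print(round(recovery_percent(3.48, 5.0), 1))      # -> 69.6  (% recovery)
print(round(lod_from_calibration(0.15, 0.5), 3))  # -> 0.99  (ng/g)
```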

  10. Challenging loop-mediated isothermal amplification (LAMP) technique for molecular detection of Toxoplasma gondii.

    PubMed

    Fallahi, Shirzad; Mazar, Zahra Arab; Ghasemian, Mehrdad; Haghighi, Ali

    2015-05-01

    To compare the analytical sensitivity and specificity of a newly described DNA amplification technique, LAMP, and a nested PCR assay targeting the RE and B1 genes for the detection of Toxoplasma gondii (T. gondii) DNA. The analytical sensitivity of LAMP and nested PCR was determined against 10-fold serial dilutions of T. gondii DNA ranging from 1 ng to 0.01 fg. DNA samples from other parasites and human chromosomal DNA were used to determine the specificity of the molecular assays. After testing LAMP and nested PCR in duplicate, the detection limits of the RE-LAMP, B1-LAMP, RE-nested PCR and B1-nested PCR assays were 1 fg, 100 fg, 1 pg and 10 pg of T. gondii DNA, respectively. All the LAMP assays and nested PCRs were 100% specific. The RE-LAMP assay showed the highest sensitivity for the detection of T. gondii DNA. These results demonstrate that the LAMP technique has greater sensitivity for detection of T. gondii, and that primers based on the RE are more suitable than those based on the B1 gene. Nevertheless, the B1-LAMP assay also has potential as a diagnostic tool for detection of T. gondii. Copyright © 2015 Hainan Medical College. Production and hosting by Elsevier B.V. All rights reserved.

  11. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses past and recent advances in biosensors for the detection of organophosphorus pesticides (OPs), whose use has been extensive over recent decades. Apart from their agricultural benefits, OPs impose adverse toxicological effects on animal and human populations. Conventional approaches to pesticide detection, such as chromatographic techniques, are associated with several limitations. Biosensor technology is attractive for its detection sensitivity, selectivity, performance, simplicity, on-site operation, and ease of fabrication and incorporation of nanomaterials. This review also summarizes the specifications of most OP biosensors reported to date, organized by transducer system, and highlights the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, together with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate on the achievements in sensing systems concerning innovative nanomaterials and analytical techniques, with emphasis on OPs.

  12. Analytical improvements of hybrid LC-MS/MS techniques for the efficient evaluation of emerging contaminants in river waters: a case study of the Henares River (Madrid, Spain).

    PubMed

    Pérez-Parada, Andrés; Gómez-Ramos, María del Mar; Martínez Bueno, María Jesús; Uclés, Samanta; Uclés, Ana; Fernández-Alba, Amadeo R

    2012-02-01

    The instrumental capabilities and software tools of modern hybrid mass spectrometry (MS) instruments, including high-resolution mass spectrometry (HRMS), quadrupole time-of-flight (QTOF), and quadrupole linear ion trap (QLIT) systems, were experimentally investigated for the study of emerging contaminants in Henares River water samples. The automated screening and confirmatory capabilities of QTOF operating in full-scan MS and tandem MS (MS/MS) modes were explored with real samples. The influence of sensitivity and resolving power on mass accuracy was studied for the correct assignment of the amoxicillin transformation product 5(R) amoxicillin-diketopiperazine-2',5' as an example of a nontarget compound. In addition, quantitative and qualitative strategies based on direct injection analysis and off-line solid-phase extraction sample treatment were compared on two different QLIT instruments for a selected group of emerging contaminants operating in selected reaction monitoring (SRM) and information-dependent acquisition (IDA) modes. Software-aided screening usually needs a further confirmatory step. The resolving power and MS/MS capability of QTOF proved able to confirm or reject most findings in river water, although sensitivity-related limitations were common. The superior sensitivity of modern QLIT-MS/MS allowed direct injection analysis for proper quantitative study of a variety of contaminants, while simultaneously reducing the matrix effect and increasing the reliability of the results. Confirmation of ethylamphetamine, which lacks a second SRM transition, was accomplished using the IDA feature. Hybrid MS instruments combining high resolution and high sensitivity help to enlarge the scope of targeted analytes in river waters. However, in the tested instruments there remains a margin for improvement, principally in sensitivity and in data treatment software devoted to reliable confirmation and improved automated data processing.

  13. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures, with various depth sensitivities, through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Electron microscopy and spectroscopy, which complement the detection length scales of synchrotron X-ray tools, are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided on how synchrotron techniques can impact the development of next-generation battery chemistries.

  14. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures, with various depth sensitivities, through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Electron microscopy and spectroscopy, which complement the detection length scales of synchrotron X-ray tools, are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided on how synchrotron techniques can impact the development of next-generation battery chemistries.

  15. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1991-01-01

    Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.

  16. Consequences of Base Time for Redundant Signals Experiments

    PubMed Central

    Townsend, James T.; Honey, Christopher

    2007-01-01

    We report analytical and computational investigations into the effects of base time on the diagnosticity of two popular theoretical tools in the redundant signals literature: (1) the race model inequality and (2) the capacity coefficient. We show analytically, and without distributional assumptions, that the presence of base time decreases the sensitivity of both of these measures to model violations. We further use simulations to investigate the statistical power of model selection tools based on the race model inequality, both with and without base time. Base time decreases statistical power and biases the race model test toward conservatism. The magnitude of this biasing effect increases as we increase the proportion of total reaction time variance contributed by base time. We marshal empirical evidence to suggest that the proportion of reaction time variance contributed by base time is relatively small, and that the effects of base time on the diagnosticity of our model-selection tools are therefore likely to be minor. However, uncertainty remains concerning the magnitude and even the definition of base time. Experimentalists should continue to be alert to situations in which base time may contribute a large proportion of the total reaction time variance. PMID:18670591
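    Miller's race model inequality referenced above bounds the redundant-signals distribution by the sum of the single-signal distributions, F_AB(t) ≤ F_A(t) + F_B(t). A minimal sketch of an empirical check, using toy reaction-time data that are not from the study:

```python
def ecdf(sample, t):
    """Empirical CDF of a reaction-time sample evaluated at time t."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violations(rt_redundant, rt_a, rt_b, times):
    """Time points where Miller's race model inequality
    F_AB(t) <= F_A(t) + F_B(t) is violated by the empirical CDFs."""
    return [t for t in times
            if ecdf(rt_redundant, t) > ecdf(rt_a, t) + ecdf(rt_b, t)]

# Toy reaction times in ms (invented): redundant-target responses are
# faster than the race bound from either single-target condition.
rt_ab = [250, 260, 270, 280, 300]
rt_a = [320, 340, 360, 380, 400]
rt_b = [330, 350, 370, 390, 410]
print(race_model_violations(rt_ab, rt_a, rt_b, range(240, 340, 10)))
```

    In practice base time adds a common offset to all three conditions, which (as the abstract argues) smears the empirical CDFs and makes such violations harder to detect.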

  17. Monitoring the Productivity of Coastal Systems Using PH ...

    EPA Pesticide Factsheets

    The impact of nutrient inputs on the eutrophication of coastal ecosystems has been one of the great themes of coastal ecology. Countless studies have been devoted to quantifying how human sources of nutrients, in particular nitrogen (N), affect coastal water bodies. These studies, which often measure in situ concentrations of nutrients, chlorophyll, and dissolved oxygen, are often spatially and/or temporally intensive and expensive. We provide evidence from experimental mesocosms, coupled with data from the water column of a well-mixed estuary, that pH can be a quick, inexpensive, and integrative measure of net ecosystem metabolism. In some cases, this approach is a more sensitive tracer of production than direct measurements of chlorophyll and carbon-14. Taken together, our data suggest that pH is a sensitive, but often overlooked, tool for monitoring estuarine production. This presentation will explore the potential utility of pH as an indicator of ecosystem productivity. Our data suggest that pH is a sensitive and potentially integrative measure of net ecosystem production. It should not be overlooked that measuring pH is quick, easy, and inexpensive, further increasing its value as an analytical tool.

  18. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
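    The kind of validation mentioned above, checking analytic derivatives against finite-difference approximations, can be illustrated generically. The function below is a simple stand-in, not the CEA thermodynamics or OpenMDAO code:

```python
import math

def f(x):
    """Toy smooth response standing in for a thermodynamic quantity."""
    return math.sin(x) * math.exp(-0.1 * x)

def df_analytic(x):
    # d/dx [sin(x) * exp(-0.1 x)] = (cos(x) - 0.1 * sin(x)) * exp(-0.1 x)
    return (math.cos(x) - 0.1 * math.sin(x)) * math.exp(-0.1 * x)

def df_central(x, h=1e-6):
    """Central finite difference: O(h^2) truncation error plus round-off."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.3
err = abs(df_analytic(x) - df_central(x))
print(err < 1e-8)  # -> True: analytic and numeric derivatives agree closely
```

    The speed advantage reported for analytic derivatives comes from avoiding the repeated function evaluations (two per input, here) that finite differencing requires.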

  19. Immunochemistry for high-throughput screening of human exhaled breath condensate (EBC) media: implementation of automated Quanterix SIMOA instrumentation.

    PubMed

    Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C

    2015-12-11

    Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low concentrations. Here we report on emerging technology implementing fully automated ELISA capable of molecular-level detection, and describe its application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance on inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC, the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r² > 0.99). Sensitivities varied by analyte but were robust, from ~0.006 (IL-6) to ~0.01 (TNF-α) pg ml⁻¹. All analytes demonstrated response suppression when diluted with deionized water, so assay buffer diluent was found to be the better choice. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently available kits are limited to single-plex analyses, so sample volumes require adjustment. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly, and data are automatically analyzed and reported in spreadsheet format. The internal five-parameter logistic (5PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg ml⁻¹). The implementation of the automated Quanterix platform was successfully demonstrated using EBC, which poses the greatest challenge to ELISA due to limited sample volumes and low protein levels.
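    The five-parameter logistic calibration mentioned at the end has the standard form y = d + (a − d)/(1 + (x/c)^b)^g. A sketch of the forward curve and its analytic inverse for back-calculating concentrations; the parameter values are hypothetical, not fitted to any data from this record:

```python
def five_pl(x, a, d, c, b, g):
    """5PL calibration curve: response at concentration x.
    a = response at zero dose, d = response at infinite dose,
    c = mid-range concentration, b = slope, g = asymmetry."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

def five_pl_inverse(y, a, d, c, b, g):
    """Back-calculate concentration from a measured response."""
    return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)

# Hypothetical parameters for an ascending curve (illustration only):
p = dict(a=0.02, d=2.5, c=10.0, b=1.2, g=1.0)
y = five_pl(3.0, **p)
print(round(five_pl_inverse(y, **p), 6))  # -> 3.0
```

    Near the lower asymptote the inverse becomes numerically ill-conditioned (y − d → a − d), which is one practical reason to splice in a linear fit at the lowest analyte levels, as the abstract recommends.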

  20. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  1. Liquid chromatography-mass spectrometry in metabolomics research: mass analyzers in ultra high pressure liquid chromatography coupling.

    PubMed

    Forcisi, Sara; Moritz, Franco; Kanawati, Basem; Tziotis, Dimitrios; Lehmann, Rainer; Schmitt-Kopplin, Philippe

    2013-05-31

    The present review gives an introduction to the concept of metabolomics and provides an overview of the analytical tools applied in non-targeted metabolomics, with a focus on liquid chromatography (LC). LC is a powerful analytical tool for the study of complex sample matrices. A further development, Ultra-High Pressure Liquid Chromatography (UHPLC), is optimized to provide the largest known liquid chromatographic resolution and peak capacity. Accordingly, UHPLC plays an important role in the separation, and consequent identification, of metabolites in complex molecular mixtures such as bio-fluids. The most sensitive detectors for these purposes are mass spectrometers. Almost any mass analyzer can be optimized to identify and quantify small pre-defined sets of targets; however, the number of analytes in metabolomics is far greater, and optimized protocols for quantification of large target sets may be rendered inapplicable. Results of small-target-set analyses on different sample matrices are easily comparable with each other. In non-targeted metabolomics, by contrast, there is almost no analytical method applicable to all matrices, owing to limitations of mass analyzers and chromatographic tools. The specifications of the most important interfaces and mass analyzers are discussed. We additionally provide an exemplary application to demonstrate the level of complexity that remains intractable to date. The potential of coupling a high-field Fourier Transform Ion Cyclotron Resonance Mass Spectrometer (ICR-FT/MS), the mass analyzer with the largest known mass resolving power, to UHPLC is illustrated with one human pre-treated plasma sample. This experimental example illustrates one way of overcoming the need for faster scanning rates when coupling with UHPLC. The experiment enabled the extraction of thousands of features (analytical signals).
A small subset of this compositional space could be mapped into a mass difference network whose topology shows specificity toward putative metabolite classes and retention time. Copyright © 2013 Elsevier B.V. All rights reserved.
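    A mass difference network, as referenced in the closing sentence, connects detected features whose exact-mass differences match known chemical transformations. A simplified sketch: the transformation deltas are standard monoisotopic values, but the feature list and tolerance are invented for illustration:

```python
# Monoisotopic mass differences (Da) of a few common transformations;
# the delta values are standard, the feature masses below are invented.
TRANSFORMS = {
    "CH2": 14.015650,   # homologue step
    "O": 15.994915,     # oxidation
    "H2O": 18.010565,   # (de)hydration / condensation
}

def mass_difference_network(masses, transforms=TRANSFORMS, tol=0.001):
    """Edges (m_low, m_high, name) between features whose exact-mass
    difference matches a transformation within +/- tol Da."""
    edges = []
    for i, m1 in enumerate(masses):
        for m2 in masses[i + 1:]:
            lo, hi = sorted((m1, m2))
            for name, delta in transforms.items():
                if abs((hi - lo) - delta) <= tol:
                    edges.append((lo, hi, name))
    return edges

features = [180.06339, 194.07904, 196.05831, 162.05283]  # toy m/z values
edges = mass_difference_network(features)
print(len(edges))  # -> 3 (one CH2, one O, one H2O edge)
```

    At ICR-FT/MS mass accuracies the tolerance can be tightened by orders of magnitude, which is what makes such networks informative over thousands of features.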

  2. Imaging MALDI MS of Dosed Brain Tissues Utilizing an Alternative Analyte Pre-extraction Approach

    NASA Astrophysics Data System (ADS)

    Quiason, Cristine M.; Shahidi-Latham, Sheerin K.

    2015-06-01

    Matrix-assisted laser desorption ionization (MALDI) imaging mass spectrometry has been adopted in the pharmaceutical industry as a useful tool to detect xenobiotic distribution within tissues. A unique sample preparation approach for MALDI imaging has been described here for the extraction and detection of cobimetinib and clozapine, which were previously undetectable in mouse and rat brain using a single matrix application step. Employing a combination of a buffer wash and a cyclohexane pre-extraction step prior to standard matrix application, the xenobiotics were successfully extracted and detected with an 8 to 20-fold gain in sensitivity. This alternative approach for sample preparation could serve as an advantageous option when encountering difficult to detect analytes.

  3. Dancing with data: an example of acquiring theoretical sensitivity in a grounded theory study.

    PubMed

    Hoare, Karen J; Mills, Jane; Francis, Karen

    2012-06-01

    Glaser suggested that the conceptual route from data collection to a grounded theory is a set of double-back steps: the route forward inevitably results in the analyst stepping back. Sidestepping, by leading participants down lines of inquiry and following data threads with other participants, is also characteristic of acquiring theoretical sensitivity, a key concept in grounded theory. Other ways of acquiring theoretical sensitivity include reading the literature, open coding, category building, and reflecting in memos, followed by doubling back on data collection once further lines of inquiry are opened up. This paper describes how we 'danced with data' in pursuit of heightened theoretical sensitivity in a grounded theory study of information use by nurses working in general practice in New Zealand, providing an example of how analytical tools are employed to theoretically sample emerging concepts. © 2012 Blackwell Publishing Asia Pty Ltd.

  4. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and substantial computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance. PMID:26110404
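    The RGB-channel analysis described above reduces to averaging each color channel over a region of interest and tracking the channel most affected by the indigo product. A minimal sketch; the pixel values and the choice of channel are illustrative assumptions, not the paper's calibration:

```python
def mean_channels(pixels):
    """Mean (R, G, B) over a region of interest.
    pixels: iterable of (r, g, b) tuples with 0-255 values."""
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(t / n for t in totals)

# Toy region of interest: as indigo blue accumulates it absorbs in the
# orange/red region, so mean red intensity falls with enzyme activity.
roi = [(120, 130, 200), (118, 128, 205), (122, 131, 198)]
r_mean, g_mean, b_mean = mean_channels(roi)
print(round(r_mean, 1), round(b_mean, 1))  # -> 120.0 201.0
```

    A calibration curve would then relate the chosen channel's mean intensity to BChE activity measured by the reference assay.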

  5. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and substantial computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.

  6. [Metabonomics-a useful tool for individualized cancer therapy].

    PubMed

    Chai, Yanlan; Wang, Juan; Liu, Zi

    2013-11-01

    Metabonomics has developed rapidly in the post-genome era and has become a hot topic among the omics disciplines. The core idea of metabonomics is to measure the relatively low-molecular-weight metabolites in organisms or cells by analytical methods such as nuclear magnetic resonance, chromatography and mass spectrometry; to transform the metabolic-pattern data into useful information using chemometric tools and pattern-recognition software; and thereby to reveal the essence of the body's life activities. With its advantages of high throughput, high sensitivity and high accuracy, metabolomics shows great potential and value in individualized cancer treatment. This paper introduces the concept, contents and methods of metabonomics and reviews its application in individualized cancer therapy.

  7. Structural considerations for fabrication and mounting of the AXAF HRMA optics

    NASA Technical Reports Server (NTRS)

    Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael

    1990-01-01

    A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.

  8. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may remain in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques can measure residual cellular DNA: radioactive dot-blot (a type of hybridization), threshold analysis, and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, through quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared by downstream production processes, regulatory agencies require the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can support a scientific and practical approach to decision making and can be used to assess and manage method performance risk. This paper discusses, under the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the greatest impact on method reliability (the critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis, and that quantitative polymerase chain reaction is the preferable analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  9. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    PubMed

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat for public health worldwide. Currently available molecular assays, mainly based on PCR, have a limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology that takes place at a constant temperature and that is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use not only for clinical diagnosis but also in food safety testing and environmental monitoring are also discussed. Graphical Abstract: Expanding the analytical toolbox for the detection of DNA sequences specific to pathogens using isothermal helicase-dependent amplification (HDA).

  10. Design and optimization of a nanoprobe comprising amphiphilic chitosan colloids and Au-nanorods: Sensitive detection of human serum albumin in simulated urine

    NASA Astrophysics Data System (ADS)

    Jean, Ren-Der; Larsson, Mikael; Cheng, Wei-Da; Hsu, Yu-Yuan; Bow, Jong-Shing; Liu, Dean-Mo

    2016-12-01

    Metallic nanoparticles have been utilized as analytical tools to detect a wide range of organic analytes. In most reports, gold (Au)-based nanosensors have been modified with ligands to introduce selectivity towards a specific target molecule. However, in a recent study a new concept was presented where bare Au-nanorods on self-assembled carboxymethyl-hexanoyl chitosan (CHC) nanocarriers achieved sensitive and selective detection of human serum albumin (HSA) after manipulation of the solution pH. Here this concept was further advanced through optimization of the ratio between Au-nanorods and CHC nanocarriers to create a nanotechnology-based sensor (termed CHC-AuNR nanoprobe) with an outstanding lower detection limit (LDL) for HSA. The CHC-AuNR nanoprobe was evaluated in simulated urine solution and a LDL as low as 1.5 pM was achieved at an estimated AuNR/CHC ratio of 2. Elemental mapping and protein adsorption kinetics over three orders of magnitude in HSA concentration confirmed accumulation of HSA on the nanorods and revealed the adsorption to be completed within 15 min for all investigated concentrations. The results suggest that the CHC-AuNR nanoprobe has potential to be utilized for cost-effective detection of analytes in complex liquids.
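
The paper's 1.5 pM lower detection limit was established experimentally, but the generic convention for estimating a detection limit from a calibration is the 3-sigma criterion, LOD = 3·σ(blank)/slope. The sketch below illustrates that convention only; the blank replicates and slope are invented numbers, not data from the study.

```python
# Sketch: generic 3-sigma detection-limit estimate, LOD = 3 * sigma_blank / slope.
# All numbers are hypothetical placeholders.
import statistics

blank_signals = [0.0102, 0.0098, 0.0105, 0.0097, 0.0101]  # blank replicates (a.u.)
slope = 0.0040  # calibration sensitivity, signal units per pM (assumed)

sigma_blank = statistics.stdev(blank_signals)
lod_pM = 3 * sigma_blank / slope
print(f"LOD = {lod_pM:.2f} pM")  # LOD = 0.24 pM
```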

  11. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  12. Advances in microscale separations towards nanoproteomics applications

    DOE PAGES

    Yi, Lian; Piehowski, Paul D.; Shi, Tujin; ...

    2017-07-21

    Microscale separation (e.g., liquid chromatography or capillary electrophoresis) coupled with mass spectrometry (MS) has become the primary tool for advanced proteomics, an indispensable technology for gaining understanding of complex biological processes. In recent decades significant advances have been achieved in MS-based proteomics. However, current proteomics platforms still face an analytical challenge in overall sensitivity towards nanoproteomics applications for starting materials of less than 1 μg total proteins (e.g., cellular heterogeneity in tissue pathologies). We review recent advances in microscale separation techniques and integrated sample processing strategies that improve the overall sensitivity and proteome coverage of the proteomics workflow, and their contributions towards nanoproteomics applications.

  14. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python and R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
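
The notebook-to-APP practice described above rests on keeping the analysis as a parameterized function that an interactive front end (for example, ipywidgets controls in Jupyter, an assumption here since the abstract does not name a toolkit) can re-run as the user changes inputs. A minimal sketch, with made-up cohort data:

```python
# Sketch of the notebook-to-APP pattern: the analytics step is a plain
# function; a UI widget would call it again whenever a parameter changes.
# The cohort records below are invented placeholders.
from statistics import mean

cohort = [
    {"age": 34, "glucose": 5.1},
    {"age": 61, "glucose": 7.8},
    {"age": 47, "glucose": 6.2},
    {"age": 70, "glucose": 8.4},
]

def glucose_summary(records, min_age=0):
    """Analytics step a slider bound to `min_age` would re-execute."""
    selected = [r["glucose"] for r in records if r["age"] >= min_age]
    return {"n": len(selected), "mean_glucose": round(mean(selected), 2)}

print(glucose_summary(cohort, min_age=50))  # {'n': 2, 'mean_glucose': 8.1}
```

Because the pipeline is an ordinary function, the same notebook can be shared and re-executed by other researchers without modification.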

  15. Development of an 19F NMR method for the analysis of fluorinated acids in environmental water samples.

    PubMed

    Ellis, D A; Martin, J W; Muir, D C; Mabury, S A

    2000-02-15

    This investigation was carried out to evaluate 19F NMR as an analytical tool for the measurement of trifluoroacetic acid (TFA) and other fluorinated acids in the aquatic environment. A method based upon strong anion exchange (SAX) chromatography was also optimized for the concentration of the fluoro acids prior to NMR analysis. Extraction of the analyte from the SAX column was carried out directly in the NMR solvent in the presence of the strong organic base, DBU. The method allowed the analysis of the acid without any prior cleanup steps being involved. Optimal NMR sensitivity based upon T1 relaxation times was investigated for seven fluorinated compounds in four different NMR solvents. The use of the relaxation agent chromium acetylacetonate, Cr(acac)3, within these solvent systems was also evaluated. Results show that the optimal NMR solvent differs for each fluorinated analyte. Cr(acac)3 was shown to have pronounced effects on the limits of detection of the analyte. Generally, the optimal sensitivity condition appears to be methanol-d4/2 M DBU in the presence of 4 mg/mL of Cr(acac)3. The method was validated through spike and recovery for five fluoro acids from environmentally relevant waters. Results are presented for the analysis of TFA in Toronto rainwater, which ranged from < 16 to 850 ng/L. The NMR results were confirmed by GC-MS selected-ion monitoring of the fluoroanilide derivative.
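
The spike-and-recovery validation mentioned above reduces to a simple calculation: recovery (%) = (measured in spiked sample − measured in unspiked sample) / amount spiked × 100. A sketch with hypothetical TFA concentrations (the paper's actual recovery data are not reproduced here):

```python
# Sketch of the spike-and-recovery check used in method validation.
# Concentrations are hypothetical, in ng/L of TFA.
def percent_recovery(measured_spiked, measured_unspiked, spiked_amount):
    return (measured_spiked - measured_unspiked) / spiked_amount * 100.0

unspiked = 120.0       # TFA found in the raw water sample
spiked_amount = 500.0  # TFA added to the same sample
spiked = 595.0         # TFA measured after spiking

print(f"{percent_recovery(spiked, unspiked, spiked_amount):.1f}%")  # 95.0%
```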

  16. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, in view of the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, protein binding, and drastic interferences from complex matrices in real samples such as blood plasma and serum.

  17. Metabolomics: beyond biomarkers and towards mechanisms

    PubMed Central

    Johnson, Caroline H.; Ivanisevic, Julijana; Siuzdak, Gary

    2017-01-01

    Metabolomics, which is the profiling of metabolites in biofluids, cells and tissues, is routinely applied as a tool for biomarker discovery. Owing to innovative developments in informatics and analytical technologies, and the integration of orthogonal biological approaches, it is now possible to expand metabolomic analyses to understand the systems-level effects of metabolites. Moreover, because of the inherent sensitivity of metabolomics, subtle alterations in biological pathways can be detected to provide insight into the mechanisms that underlie various physiological conditions and aberrant processes, including diseases. PMID:26979502

  18. DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.

    PubMed

    Ferapontova, Elena E

    2018-06-12

    Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.

  19. Neutral Mass Spectrometry of Mega-Dalton Particles with Single-Particle Resolution using a Nano-Electromechanical System

    NASA Astrophysics Data System (ADS)

    Kelber, Scott; Hanay, Mehmet; Naik, Akshay; Chi, Derrick; Hentz, Sebastien; Bullard, Caryn; Collinet, Eric; Duraffourg, Laurent; Roukes, Michael

    2012-02-01

    Nanoelectromechanical systems (NEMS) enable mass sensing with unprecedented sensitivity and mass dynamic range. Previous works have relied on statistical analysis of multiple landing events to assemble mass spectra. Here we demonstrate the utility of using multiple modes of the NEMS device in determining the mass of individual molecules landing on the NEMS. Analyte particles in vapor form are produced using matrix-assisted laser desorption/ionization. Resonant frequencies of the first two modes of a single NEMS device, placed in close proximity to the analyte source, are tracked using parallel phase locked loops. Each analyte molecule landing on the NEMS generates a distinct frequency shift in the two modes. These time-correlated frequency jumps are used to evaluate the mass of each analyte particle landing on the NEMS and thus generate mass spectra. We present the latest experimental results using this scheme and also demonstrate the utility for mass spectrometry of large biomolecules. This NEMS mass spectrometry system offers a new tool for structural biology and pathology for the analysis of large proteins, protein complexes, and viruses.

  20. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  1. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when determining benzo[a]pyrene in sediment samples. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
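
The Hasse diagram technique orders procedures by dominance: procedure A is above B only if A scores at least as well on every criterion and strictly better on at least one; incomparable procedures sit side by side, and the maximal (non-dominated) elements are the candidates to choose from. A minimal sketch, with three invented criteria and invented scores rather than the paper's 11 variables:

```python
# Sketch of the partial-order comparison behind the Hasse diagram technique.
# Scores are illustrative placeholders (higher = better on each criterion).
procedures = {
    "GC-MS":    (9, 4, 6),   # (sensitivity, greenness, speed) -- hypothetical
    "HPLC-FLD": (7, 5, 7),
    "LC-MS/MS": (9, 5, 7),
    "TLC":      (3, 8, 5),
}

def dominates(a, b):
    """True if a is >= b on every criterion and differs from b."""
    return all(x >= y for x, y in zip(a, b)) and a != b

maximal = [
    name for name, score in procedures.items()
    if not any(dominates(other, score) for other in procedures.values())
]
print(sorted(maximal))  # ['LC-MS/MS', 'TLC']
```

The two incomparable maximal elements mirror the paper's finding: the "metrologically best" and the "greenest" procedures need not coincide.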

  2. Development of loop-mediated isothermal amplification methods for detecting Taylorella equigenitalis and Taylorella asinigenitalis

    PubMed Central

    KINOSHITA, Yuta; NIWA, Hidekazu; KATAYAMA, Yoshinari; HARIU, Kazuhisa

    2015-01-01

    ABSTRACT Taylorella equigenitalis is a causative bacterium of contagious equine metritis (CEM), and Taylorella asinigenitalis is a species belonging to the genus Taylorella. The authors developed two loop-mediated isothermal amplification (LAMP) methods, Te-LAMP and Ta-LAMP, for detecting T. equigenitalis and T. asinigenitalis, respectively. Using experimentally spiked samples, Te-LAMP was as sensitive as a published semi-nested PCR method, and Ta-LAMP was more sensitive than conventional PCR. Multiplex LAMP worked well without nonspecific reactions, and the analytical sensitivities of multiplex LAMP in the spiked samples were almost equivalent to those of Te-LAMP and Ta-LAMP. Therefore, the LAMP methods are considered useful tools to detect T. equigenitalis and/or T. asinigenitalis, and preventive measures can be rapidly implemented if the occurrence of CEM is confirmed by the LAMP methods. PMID:25829868

  3. Sensitive and comprehensive analysis of O-glycosylation in biotherapeutics: a case study of novel erythropoiesis stimulating protein.

    PubMed

    Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo

    2017-09-01

    Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on the N-glycans due to the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and O-acetyl-modification site were obtained from tandem MS. This method may be applied to QC and batch analysis of not only rhEPOs but also other biotherapeutics bearing multiple O-glycosylations.

  4. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of the text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
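
The parallelization approach described is, at its core, a partition-count-merge pattern: split the corpus, process each partition independently, then combine the partial results. The sketch below illustrates that pattern only; the real engine distributes partitions across cluster nodes, whereas here a thread pool stands in, and the documents are made-up snippets.

```python
# Sketch of the data-parallel pattern behind a scalable text engine:
# partition the corpus, count terms per partition, merge partial counts.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

documents = [
    "protein folding dynamics",
    "protein structure prediction",
    "visual analytics of text",
    "text mining of protein corpora",
]

def count_terms(doc):
    """Per-partition work unit: term frequencies for one document."""
    return Counter(doc.split())

with ThreadPoolExecutor(max_workers=4) as pool:
    totals = Counter()
    for partial in pool.map(count_terms, documents):
        totals.update(partial)  # merge step

print(totals["protein"], totals["text"])  # 3 2
```

Because the per-partition work units are independent, throughput scales roughly with the number of workers, which is what yields the near-linear scaling reported.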

  5. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  6. Cross reactive arrays of three-way junction sensors for steroid determination

    NASA Technical Reports Server (NTRS)

    Stojanovic, Milan N. (Inventor); Nikic, Dragan B. (Inventor); Landry, Donald (Inventor)

    2008-01-01

    This invention provides analyte sensitive oligonucleotide compositions for detecting and analyzing analytes in solution, including complex solutions using cross reactive arrays of analyte sensitive oligonucleotide compositions.

  7. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    PubMed

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC and method applicability was checked in several samples of meat products. A simple procedure, with low temperature partitioning solid-liquid extraction, was developed. The nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg⁻¹, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  9. Solvent signal suppression for high-resolution MAS-DNP

    NASA Astrophysics Data System (ADS)

    Lee, Daniel; Chaudhari, Sachin R.; De Paëpe, Gaël

    2017-05-01

    Dynamic nuclear polarization (DNP) has become a powerful tool to substantially increase the sensitivity of high-field magic angle spinning (MAS) solid-state NMR experiments. The addition of dissolved hyperpolarizing agents usually results in the presence of solvent signals that can overlap and obscure those of interest from the analyte. Here, two methods are proposed to suppress DNP solvent signals: a Forced Echo Dephasing experiment (FEDex) and TRAnsfer of Populations in DOuble Resonance Echo Dephasing (TRAPDORED) NMR. These methods reintroduce a heteronuclear dipolar interaction that is specific to the solvent, thereby forcing a dephasing of recoupled solvent spins and leaving acquired NMR spectra free of associated resonance overlap with the analyte. The potency of these methods is demonstrated on sample types common to MAS-DNP experiments, namely a frozen solution (of L-proline) and a powdered solid (progesterone), both containing deuterated glycerol as a DNP solvent. The proposed methods are efficient, simple to implement, compatible with other NMR experiments, and extendable beyond the spectral editing of DNP solvents alone. The sensitivity gains from MAS-DNP in conjunction with FEDex or TRAPDORED then permit rapid and uninterrupted sample analysis.

  10. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  11. Schistosoma real-time PCR as diagnostic tool for international travellers and migrants.

    PubMed

    Cnops, Lieselotte; Tannich, Egbert; Polman, Katja; Clerinx, Jan; Van Esbroeck, Marjan

    2012-10-01

    To evaluate the use of a genus-specific PCR that combines high sensitivity with the detection of different Schistosoma species for diagnosis in international travellers and migrants in comparison to standard microscopy. The genus-specific real-time PCR was developed to target the 28S ribosomal RNA gene of the major human Schistosoma species. It was validated for analytical specificity and reproducibility and demonstrated an analytical sensitivity of 0.2 eggs per gram of faeces. Its diagnostic performance was further evaluated on 152 faecal, 32 urine and 38 serum samples from patients presenting at the outpatient clinic of the Institute of Tropical Medicine in Antwerp (Belgium). We detected Schistosoma DNA in 76 faecal (50.0%) and five urine (15.6%) samples of which, respectively, nine and one were not detected by standard microscopy. Only two of the 38 serum samples of patients with confirmed schistosomiasis were positive with the presently developed PCR. Sequence analysis on positive faecal samples allowed identification of the Schistosoma species complex. The real-time PCR is highly sensitive and may offer added value in diagnosing imported schistosomiasis. The genus-specific PCR can detect all schistosome species that are infectious to humans and performs very well with faeces and urine, but not in serum. © 2012 Blackwell Publishing Ltd.

  12. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  13. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons were developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite difference approach; and (5) a graphics package.

  14. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    NASA Astrophysics Data System (ADS)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. In order to achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for determination of these drugs in a pharmaceutical dosage form.
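
For a two-component mixture obeying the Lambert-Beer law, absorbances at two wavelengths give two linear equations, A(λ) = ε_BNZ(λ)·c_BNZ + ε_ITZ(λ)·c_ITZ (unit path length), which can be solved for the two concentrations. The sketch below uses the paper's wavelengths (259 and 321 nm) but invented molar absorptivities, not the measured calibration values:

```python
# Sketch of two-component Lambert-Beer mixture analysis solved as a
# 2x2 linear system. Absorptivities are hypothetical, in L/(mmol*cm).
eps = {259: (12.0, 2.0), 321: (1.5, 9.0)}  # eps[wavelength] = (eps_BNZ, eps_ITZ)

def solve_mixture(a259, a321):
    """Return (c_BNZ, c_ITZ) from absorbances at 259 and 321 nm (Cramer's rule)."""
    (e1b, e1i), (e2b, e2i) = eps[259], eps[321]
    det = e1b * e2i - e1i * e2b
    c_bnz = (a259 * e2i - e1i * a321) / det
    c_itz = (e1b * a321 - a259 * e2b) / det
    return c_bnz, c_itz

# Round trip: absorbances built from known concentrations are recovered.
c_bnz_true, c_itz_true = 0.30, 0.10  # mmol/L
a259 = eps[259][0] * c_bnz_true + eps[259][1] * c_itz_true
a321 = eps[321][0] * c_bnz_true + eps[321][1] * c_itz_true
print(solve_mixture(a259, a321))
```

The method works only if the determinant is well away from zero, i.e. the two spectra differ enough at the chosen wavelengths, which is why 259 and 321 nm (where each drug dominates) were selected.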

  16. Direct determination of trace phthalate esters in alcoholic spirits by spray-inlet microwave plasma torch ionization tandem mass spectrometry.

    PubMed

    Miao, Meng; Zhao, Gaosheng; Xu, Li; Dong, Junguo; Cheng, Ping

    2018-03-01

    A direct analytical method based on spray-inlet microwave plasma torch tandem mass spectrometry was applied to simultaneously determine 4 phthalate esters (PAEs), namely, benzyl butyl phthalate, diethyl phthalate, dipentyl phthalate, and dodecyl phthalate, with extremely high sensitivity in spirits without sample treatment. Among the 4 brands of spirit products, 3 kinds of PAE compounds were directly determined at very low concentrations from 1.30 to 114 ng·g⁻¹. Compared with other online and off-line methods, the spray-inlet microwave plasma torch tandem mass spectrometry technique is extremely simple, rapid, sensitive, and highly efficient, providing an ideal screening tool for PAEs in spirits. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) and LWCC(TM) are trademarks of World Precision Instruments, Inc.

  18. New Optical Sensing Materials for Application in Marine Research

    NASA Astrophysics Data System (ADS)

    Borisov, S.; Klimant, I.

    2012-04-01

    Optical chemosensors are versatile analytical tools which find application in numerous fields of science and technology. They have proved to be a promising alternative to electrochemical methods and are applied increasingly often in marine research. However, not all state-of-the-art optical chemosensors are suitable for these demanding applications, since they do not fully fulfil the requirements of high luminescence brightness and high chemical and photochemical stability, or their spectral properties are not adequate. Therefore, the development of new advanced sensing materials remains of utmost importance. Here we present a set of novel optical sensing materials recently developed in the Institute of Analytical Chemistry and Food Chemistry which are optimized for marine applications. In particular, we present new NIR indicators and sensors for oxygen and pH which feature high brightness and low levels of autofluorescence. The oxygen sensors rely on highly photostable metal complexes of benzoporphyrins and azabenzoporphyrins and enable several important applications such as simultaneous monitoring of oxygen and chlorophyll or ultra-fast oxygen monitoring (eddy correlation). We have also developed ultra-sensitive oxygen optodes which enable monitoring in the nM range and are primarily designed for the investigation of oxygen minimum zones. The dynamic range of our new NIR pH indicators based on aza-BODIPY dyes is optimized for the marine environment. A highly sensitive NIR luminescent phosphor (chromium(III)-doped yttrium aluminium borate) can be used for non-invasive temperature measurements. Notably, the oxygen, pH, and temperature sensors are fully compatible with commercially available fiber-optic readers (Firesting from PyroScience). An optical CO2 sensor for marine applications employs novel diketopyrrolopyrrole indicators and enables ratiometric imaging using a CCD camera. Oxygen, pH, and temperature sensors suitable for lifetime and ratiometric imaging of analyte distributions have also been realized. To enable versatility of applications, we have also obtained a range of nano- and microparticles suitable for intra- and extracellular imaging of the above analytes. Bright ratiometric two-photon-excitable probes were also developed. Magnetic microparticles are demonstrated to be very promising tools for imaging of oxygen, temperature, and other parameters in biofilms, corals, etc., since they combine the sensing function with the possibility of external manipulation.

  19. Optical sensors and multisensor arrays containing thin film electroluminescent devices

    DOEpatents

    Aylott, Jonathan W.; Chen-Esterlit, Zoe; Friedl, Jon H.; Kopelman, Raoul; Savvateev, Vadim N.; Shinar, Joseph

    2001-12-18

    Optical sensor, probe, and array devices for detecting chemical, biological, and physical analytes. The devices include an analyte-sensitive layer optically coupled to a thin-film electroluminescent layer which activates the analyte-sensitive layer to provide an optical response. The optical response varies depending upon the presence of an analyte and is detected by a photodetector and analyzed to determine the properties of the analyte.

  20. Phase-0/microdosing studies using PET, AMS, and LC-MS/MS: a range of study methodologies and conduct considerations. Accelerating development of novel pharmaceuticals through safe testing in humans - a practical guide.

    PubMed

    Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T

    2017-05-01

    Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use, and implied safety, of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometry (AMS), positron emission tomography (PET), and liquid chromatography tandem mass spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation from sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include the study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of the data obtained.

  1. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  2. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    PubMed

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  3. Landfill Site Selection by AHP Based Multi-criteria Decision Making Tool: A Case Study in Kolkata, India

    NASA Astrophysics Data System (ADS)

    Majumdar, Ankush; Hazra, Tumpa; Dutta, Amit

    2017-09-01

    This work presents a Multi-criteria Decision Making (MCDM) tool to select a landfill site from three candidate sites proposed for the Kolkata Municipal Corporation (KMC) area that complies with accessibility, receptor, environment, public acceptability, geological, and economic criteria. The Analytical Hierarchy Process has been used to solve the MCDM problem. The suitability of the three sites proposed by KMC (Natagachi, Gangajoara, and Kharamba) as landfills has been checked using a Landfill Site Sensitivity Index (LSSI) as well as an Economic Viability Index (EVI). Land area availability for disposing of the huge quantity of Municipal Solid Waste over the design period has also been checked. Analysis of the studied sites shows that they are moderately suitable for landfill facility construction, as both LSSI and EVI scores lie between 300 and 750. The proposed approach represents an effective MCDM tool for siting sanitary landfills in the growing metropolitan cities of developing countries like India.
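
    The Analytical Hierarchy Process derives criterion weights from pairwise comparison judgments. A minimal sketch using the standard row geometric-mean approximation of the priority vector, with a hypothetical 3x3 Saaty-scale judgment matrix (not the study's actual criteria or values):

```python
import math

# Hypothetical pairwise comparison matrix for three criteria
# (illustrative Saaty-scale judgments only).
M = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
]

def ahp_weights(matrix):
    """Approximate the AHP priority vector by normalized row
    geometric means (a standard approximation to the principal
    eigenvector of the comparison matrix)."""
    n = len(matrix)
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

weights = ahp_weights(M)  # one weight per criterion, summing to 1
```

    A full AHP implementation would also compute the consistency ratio of the judgment matrix before accepting the weights.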

  4. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion

    PubMed Central

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M.

    2017-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft® Excel® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet. PMID:28163564

  5. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion.

    PubMed

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M

    2016-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft® Excel® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics, and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which the rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet.
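
    The tool's 2-D piecewise model is more elaborate, but the core attenuation mechanism — steady-state diffusion with first-order aerobic biodegradation — can be illustrated with the classic 1-D semi-infinite solution c(z)/c0 = exp(-z·sqrt(k/D)); the parameter values below are placeholders, not from the paper:

```python
import math

def reactive_attenuation(z_m, d_eff, k_bio):
    """Concentration ratio c(z)/c0 for steady-state diffusion with
    first-order decay in a homogeneous semi-infinite layer:
    D c'' = k c with c bounded as z -> infinity gives
    c(z)/c0 = exp(-z * sqrt(k/D))."""
    return math.exp(-z_m * math.sqrt(k_bio / d_eff))

# Placeholder values: effective diffusivity 5e-6 m^2/s,
# biodegradation rate 1e-4 1/s, 0.5 m of aerobic soil.
ratio = reactive_attenuation(0.5, 5e-6, 1e-4)
```

    The interplay of sqrt(k/D) with layer thickness is what makes biodegradation so effective at attenuating petroleum vapors over even modest aerobic zones.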

  6. Interactions between self-assembled monolayers and an organophosphonate: A detailed study using surface acoustic wave-based mass analysis, polarization modulation-FTIR spectroscopy, and ellipsometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crooks, R.M.; Yang, H.C.; McEllistrem, L.J.

    Self-assembled monolayers (SAMs) having surfaces terminated in the following functional groups: -CH3, -OH, -COOH, and (COO-)2Cu2+ (MUA-Cu2+) have been prepared and examined as potential chemically sensitive interfaces. Mass measurements made using surface acoustic wave (SAW) devices indicate that these surfaces display different degrees of selectivity and sensitivity to a range of analytes. The response of the MUA-Cu2+ SAM to the nerve-agent simulant diisopropyl methylphosphonate (DIMP) is particularly intriguing. Exposure of this surface to 50%-of-saturation DIMP yields a surface concentration equivalent to about 20 DIMP monolayers. Such a high surface concentration in equilibrium with a much lower-than-saturation vapor pressure has not previously been observed. Newly developed analytical tools have made it possible to measure the infrared spectrum of the chemically receptive surface during analyte dosing. Coupled with in-situ SAW/ellipsometry measurements, which permit simultaneous measurement of mass and thickness with nanogram and Angstrom resolution, respectively, it has been possible to develop a model for the surface chemistry leading to the unusual behavior of this system. The results indicate that DIMP interacts strongly with a surface-confined Cu2+ adduct that nucleates growth of semi-ordered crystallites having substantially lower vapor pressure than the liquid.

  7. Sensitive glow discharge ion source for aerosol and gas analysis

    DOEpatents

    Reilly, Peter T. A. [Knoxville, TN

    2007-08-14

    A high-sensitivity glow discharge ion source system for analyzing particles includes an aerodynamic lens having a plurality of constrictions for receiving an aerosol including at least one analyte particle in a carrier gas and focusing the analyte particles into a collimated particle beam. A separator separates the carrier gas from the analyte particle beam, wherein the analyte particle beam or vapors derived from it are selectively transmitted out of the separator. A glow discharge ionization source includes a discharge chamber having an entrance orifice for receiving the analyte particle beam or analyte vapors, and a target electrode and discharge electrode therein. An electric field applied between the target electrode and discharge electrode generates an analyte ion stream from the analyte vapors, which is directed out of the discharge chamber through an exit orifice, such as to a mass spectrometer. High analyte sensitivity is obtained by pumping the discharge chamber exclusively through the exit orifice and the entrance orifice.

  8. A fit-for-purpose approach to analytical sensitivity applied to a cardiac troponin assay: time to escape the 'highly-sensitive' trap.

    PubMed

    Ungerer, Jacobus P J; Pretorius, Carel J

    2014-04-01

    Highly-sensitive cardiac troponin (cTn) assays are being introduced into the market. In this study we argue that the classification of cTn assays into sensitive and highly-sensitive is flawed and recommend a more appropriate way to characterize analytical sensitivity of cTn assays. The raw data of 2252 cardiac troponin I (cTnI) tests done in duplicate with a 'sensitive' assay was extracted and used to calculate the cTnI levels in all, including those below the 'limit of detection' (LoD) that were censored. Duplicate results were used to determine analytical imprecision. We show that cTnI can be quantified in all samples including those with levels below the LoD and that the actual margins of error decrease as concentrations approach zero. The dichotomous classification of cTn assays into sensitive and highly-sensitive is theoretically flawed and characterizing analytical sensitivity as a continuous variable based on imprecision at 0 and the 99th percentile cut-off would be more appropriate.
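
    Analytical imprecision from duplicate measurements, as used in the study above, is conventionally estimated with the Dahlberg formula sd = sqrt(Σ(x₁-x₂)²/2n). A minimal sketch with made-up duplicate cTnI values (not the study's data):

```python
import math

def duplicate_sd(pairs):
    """Analytical SD estimated from duplicate measurements via the
    Dahlberg formula: sd = sqrt(sum((x1 - x2)^2) / (2 * n))."""
    n = len(pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

# Made-up duplicate cTnI results (ng/L), including values near a
# conventional 'limit of detection'.
duplicates = [(3.1, 3.4), (10.2, 9.8), (25.0, 24.1), (2.0, 2.3)]
sd = duplicate_sd(duplicates)
mean_level = sum(a + b for a, b in duplicates) / (2 * len(duplicates))
cv_percent = 100 * sd / mean_level
```

    Characterizing imprecision as a continuous function of concentration, as the authors recommend, would repeat this estimate within concentration bands rather than reporting a single dichotomous "sensitive/highly-sensitive" label.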

  9. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axes, multi-factor stacked bar charts, and interactive semi-automated filtering for input and output data, together with automatic sensitivity analysis. This guides users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis, and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. In the first example, our Visual Analytics approach enabled the analyst to observe how element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. In the second example, our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions, including low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
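
    One simple form of the automatic sensitivity analysis that guides users toward significant input-output relationships is correlation screening across model runs. A crude sketch (a stand-in, not the tool's actual method), assuming each parameter was sampled over many simulations:

```python
import math

def rank_inputs_by_sensitivity(param_samples, output):
    """Rank input parameters by |Pearson correlation| with an output
    variable across model runs. `param_samples` maps parameter
    name -> list of values, one per simulation run."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)
    return sorted(param_samples,
                  key=lambda k: -abs(corr(param_samples[k], output)))

# Toy example: the output tracks parameter 'a' exactly, 'b' only loosely.
runs = {"a": [1.0, 2.0, 3.0, 4.0], "b": [1.0, 1.0, 2.0, 2.0]}
concentration = [2.0, 4.0, 6.0, 8.0]
ranking = rank_inputs_by_sensitivity(runs, concentration)
```

    Such a ranking can pre-filter which parameter axes are worth showing in the interactive charts; nonlinear relationships would need rank-based or variance-based measures instead.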

  10. Fluorometric biosniffer (biochemical gas sensor) for breath acetone as a volatile indicator of lipid metabolism

    NASA Astrophysics Data System (ADS)

    Mitsubayashi, Kohji; Chien, Po-Jen; Ye, Ming; Suzuki, Takuma; Toma, Koji; Arakawa, Takahiro

    2016-11-01

    A fluorometric acetone biosniffer (biochemical gas sensor) for assessment of lipid metabolism, utilizing the reverse reaction of secondary alcohol dehydrogenase, was constructed and evaluated. The biosniffer showed high sensitivity and selectivity for continuous monitoring of gaseous acetone. Measurements of breath acetone concentration during fasting and aerobic exercise were also investigated. The acetone biosniffer provides a novel analytical tool for noninvasive evaluation of human lipid metabolism and is also expected to be used for clinical and physiological applications such as monitoring the progression of diabetes.

  11. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The ease of quantification and the absence of any need for prior knowledge about the compounds present in a sample make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and their NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low-molecular-weight metabolites.

  12. Development of a Portable Sensitive Equipment Decontamination System. Volume 2: Activated Carbon Fiber Wipe

    DTIC Science & Technology

    2010-05-01

    Thermogravimetric Analysis (TGA) was employed to measure absorption of HD and GD into the nylon fabric. TGA is an analytical tool useful in … Report sections include: Gas Chromatography Analysis; Mass Removed by Wiper at Room Temperature; Chemical Agent Mass Removed by Wiper at Elevated and Reduced Temperature Tests; Vapor Analysis of Spent Wipe; Wiping Efficacy and Complex Geometries; Spray and Wipe Tests.

  13. Optimization of on-line hydrogen stable isotope ratio measurements of halogen- and sulfur-bearing organic compounds using elemental analyzer–chromium/high-temperature conversion isotope ratio mass spectrometry (EA-Cr/HTC-IRMS)

    USGS Publications Warehouse

    Gehre, Matthias; Renpenning, Julian; Geilmann, Heike; Qi, Haiping; Coplen, Tyler B.; Kümmel, Steffen; Ivdra, Natalija; Brand, Willi A.; Schimmelmann, Arndt

    2017-01-01

    Conclusions: The optimized EA-Cr/HTC reactor design can be implemented in existing analytical equipment using commercially available material and is universally applicable for both heteroelement-bearing and heteroelement-free organic-compound classes. The sensitivity and simplicity of the on-line EA-Cr/HTC-IRMS technique provide a much needed tool for routine hydrogen-isotope source tracing of organic contaminants in the environment. Copyright © 2016 John Wiley & Sons, Ltd.

  14. [Forensic toxicology, a growing scientific discipline].

    PubMed

    Augsburger, Marc; Staub, Christian

    2008-07-02

    Forensic toxicology has to bring evidence of substances that could have been involved, directly or indirectly, in the cause of death, or that could influence the behaviour of a person. The increase in the consumption of illegal and legal drugs in modern societies during recent decades gave a boost to forensic toxicology. Moreover, improvements in analytical technology have provided tools with high degrees of sensitivity and specificity for the screening and quantification of a large number of substances in various biological specimens, even at the very low concentrations resulting from a single dose of medication.

  15. Evaluation of plasma proteomic data for Alzheimer disease state classification and for the prediction of progression from mild cognitive impairment to Alzheimer disease.

    PubMed

    Llano, Daniel A; Devanarayan, Viswanath; Simon, Adam J

    2013-01-01

    Previous studies that have examined the potential for plasma markers to serve as biomarkers for Alzheimer disease (AD) have studied single analytes, focused on the amyloid-β and τ isoforms, and failed to yield conclusive results. In this study, we performed a multivariate analysis of 146 plasma analytes (the Human DiscoveryMAP v 1.0 from Rules-Based Medicine) in 527 subjects with AD, mild cognitive impairment (MCI), or cognitively normal elderly subjects from the Alzheimer's Disease Neuroimaging Initiative database. We identified 4 different proteomic signatures, each using 5 to 14 analytes, that differentiate AD from control patients with sensitivity and specificity ranging from 74% to 85%. Five analytes were common to all 4 signatures: apolipoprotein A-II, apolipoprotein E, serum glutamic oxaloacetic transaminase, α-1-microglobulin, and brain natriuretic peptide. None of the signatures adequately predicted progression from MCI to AD over a 12- and 24-month period. A new panel of analytes, optimized to predict MCI to AD conversion, was able to provide 55% to 60% predictive accuracy. These data suggest that a simple panel of plasma analytes may provide an adjunctive tool to differentiate AD from controls and may provide mechanistic insights into the etiology of AD, but cannot adequately predict MCI to AD conversion.
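
    The reported 74% to 85% figures are the standard diagnostic sensitivity and specificity of a binary classifier. As a reminder of the definitions, a minimal sketch with toy labels (not the study's data):

```python
def sensitivity_specificity(y_true, y_pred):
    """Diagnostic sensitivity = TP/(TP+FN) and specificity =
    TN/(TN+FP) for binary labels (1 = AD, 0 = control)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels: 6 subjects, 3 AD and 3 controls.
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0],
                                     [1, 1, 0, 0, 0, 1])
```

    For a multi-analyte panel, `y_pred` would come from thresholding a score built over the 5 to 14 selected analytes.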

  16. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. There is a need by the public and private sectors for vastly improved data management and visualization tools. This is the WetDATA opportunity - to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs.
    ROADMAP
    * Portal (www.wetdata.org) to provide stakeholders with tools/resources to understand related water risks.
    * The initial activities will provide education, awareness and tools to stakeholders to support the implementation of the Colorado State Water Plan.
    * Leverage the Western States Water Council Water Data Exchange database.
    * Development of visualization, predictive analytics and AI tools to engage with stakeholders and provide actionable data and information.
    TOOLS
    * Education: Provide information on water issues and risks at the local, state, national and global scale.
    * Visualizations: Development of data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan.
    * Predictive Analytics: Accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city/urban planning.

  17. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  18. Muver, a computational framework for accurately calling accumulated mutations.

    PubMed

    Burkholder, Adam B; Lujan, Scott A; Lavender, Christopher A; Grimm, Sara A; Kunkel, Thomas A; Fargo, David C

    2018-05-09

    Identification of mutations from next-generation sequencing data typically requires a balance between sensitivity and accuracy. This is particularly true of DNA insertions and deletions (indels), which can impart significant phenotypic consequences on cells but are harder to call than substitution mutations from whole-genome mutation accumulation experiments. To overcome these difficulties, we present muver, a computational framework that integrates established bioinformatics tools with novel analytical methods to generate mutation calls with the extremely low false positive rates and high sensitivity required for accurate mutation rate determination and comparison. Muver uses statistical comparison of ancestral and descendant allelic frequencies to identify variant loci and assigns genotypes with models that include per-sample assessments of sequencing errors by mutation type and repeat context. Muver identifies maximally parsimonious mutation pathways that connect these genotypes, differentiating potential allelic conversion events and delineating ambiguities in mutation location, type, and size. Benchmarking with a human gold-standard father-son pair demonstrates muver's sensitivity and low false positive rates. In DNA mismatch repair (MMR)-deficient Saccharomyces cerevisiae, muver detects multi-base deletions in homopolymers longer than the replicative polymerase footprint at rates greater than predicted for sequential single-base deletions, implying a novel multi-repeat-unit slippage mechanism. Benchmarking results demonstrate the high accuracy and sensitivity achieved with muver, particularly for indels, relative to available tools. Applied to an MMR-deficient Saccharomyces cerevisiae system, muver mutation calls facilitate mechanistic insights into DNA replication fidelity.
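
    Muver's core idea of statistically comparing ancestral and descendant allelic frequencies can be illustrated with a simple two-proportion z-test. This is a stand-in for the framework's actual model, which additionally accounts for per-sample sequencing error by mutation type and repeat context:

```python
import math

def two_proportion_z(alt1, n1, alt2, n2):
    """Two-proportion z-statistic comparing alternate-allele read
    fractions at a locus between an ancestral sample (alt1 of n1
    reads) and a descendant sample (alt2 of n2 reads)."""
    p1, p2 = alt1 / n1, alt2 / n2
    pooled = (alt1 + alt2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# A heterozygous-looking shift: 1% alternate reads -> 48%.
z = two_proportion_z(1, 100, 48, 100)
```

    Loci with |z| above a chosen threshold would be flagged as candidate variants and passed to genotype assignment.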

  19. A density functional theory study of the correlation between analyte basicity, ZnPc adsorption strength, and sensor response.

    PubMed

    Tran, N L; Bohrer, F I; Trogler, W C; Kummel, A C

    2009-05-28

    Density functional theory (DFT) simulations were used to determine the binding strength of 12 electron-donating analytes to the zinc metal center of a zinc phthalocyanine molecule (ZnPc monomer). The analyte binding strengths were compared to the analytes' enthalpies of complex formation with boron trifluoride (BF3), which is a direct measure of their electron-donating ability or Lewis basicity. With the exception of the most basic analyte investigated, the ZnPc binding energies were found to correlate linearly with analyte basicities. Based on natural population analysis calculations, analyte complexation to the Zn metal of the ZnPc monomer resulted in limited charge transfer from the analyte to the ZnPc molecule, which increased with analyte-ZnPc binding energy. The experimental analyte sensitivities from chemiresistor ZnPc sensor data were proportional to an exponential of the binding energies from DFT calculations, consistent with sensitivity being proportional to analyte coverage and binding strength. The good correlation observed suggests that DFT is a reliable method for the prediction of chemiresistor metallophthalocyanine binding strengths and response sensitivities.
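
    An exponential relationship between sensor sensitivity and binding energy is conventionally verified by fitting ln(sensitivity) against binding energy. A sketch with synthetic data standing in for the paper's values:

```python
import math

def fit_log_linear(energies, sensitivities):
    """Ordinary least-squares fit of ln(S) = a + b * E, i.e. an
    exponential sensitivity-vs-binding-energy relationship."""
    ys = [math.log(s) for s in sensitivities]
    n = len(energies)
    mx, my = sum(energies) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(energies, ys))
         / sum((x - mx) ** 2 for x in energies))
    return my - b * mx, b

# Synthetic data: S = exp(2 + 5E); energies are hypothetical (eV).
E = [0.2, 0.4, 0.6, 0.8]
S = [math.exp(2 + 5 * e) for e in E]
a, b = fit_log_linear(E, S)
```

    On real sensor data the residuals of this fit would show which analytes (such as the most basic one noted above) deviate from the trend.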

  20. Two-Dimensional Model for Reactive-Sorption Columns of Cylindrical Geometry: Analytical Solutions and Moment Analysis.

    PubMed

    Khan, Farman U; Qamar, Shamsul

    2017-05-01

    A set of analytical solutions is presented for a model describing the transport of a solute in a fixed-bed reactor of cylindrical geometry subjected to first-type (Dirichlet) and third-type (Danckwerts) inlet boundary conditions. A linear sorption kinetic process and first-order decay are considered. Cylindrical geometry allows the use of large columns to investigate dispersion, adsorption/desorption and reaction kinetic mechanisms. The finite Hankel and Laplace transform techniques are adopted to solve the model equations. For further analysis, statistical temporal moments are derived from the Laplace-transformed solutions. The developed analytical solutions are compared with the numerical solutions of a high-resolution finite-volume scheme. Different case studies are presented and discussed for a series of numerical values corresponding to a wide range of mass transfer and reaction kinetics. Good agreement was observed between the analytical and numerical concentration profiles and moments. The developed solutions are efficient tools for analyzing numerical algorithms, performing sensitivity analysis and simultaneously determining the longitudinal and transverse dispersion coefficients from a laboratory-scale radial column experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
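
    The statistical temporal moments used above can also be computed numerically from a measured or simulated breakthrough curve. A sketch by trapezoidal integration, with an arbitrary test profile (not one of the paper's case studies):

```python
def temporal_moments(times, conc):
    """Zeroth absolute moment m0 and first normalized moment mu1 (mean
    residence time) of a column outlet concentration profile, by
    trapezoidal integration. Illustrative numerical counterpart to the
    Laplace-domain moment expressions."""
    def trapz(y):
        return sum((y[i] + y[i + 1]) * (times[i + 1] - times[i]) / 2.0
                   for i in range(len(times) - 1))
    m0 = trapz(conc)
    mu1 = trapz([t * c for t, c in zip(times, conc)]) / m0
    return m0, mu1
```

    For a symmetric pulse C(t) = t(10 - t) on 0 <= t <= 10, the mean residence time evaluates to 5, as symmetry requires.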

  1. Helios: Understanding Solar Evolution Through Text Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randazzese, Lucien

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or "breakthroughs," in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  2. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Sensitivity of charge transport measurements to local inhomogeneities

    NASA Astrophysics Data System (ADS)

    Koon, Daniel; Wang, Fei; Hjorth Petersen, Dirch; Hansen, Ole

    2012-02-01

    We derive analytic expressions for the sensitivity of resistive and Hall measurements to local variations in a specimen's material properties in the combined linear limit of both small magnetic fields and small perturbations, presenting exact, algebraic expressions both for four-point probe measurements on an infinite plane and for symmetric, circular van der Pauw discs. We then generalize the results to obtain corrections to the sensitivities both for finite magnetic fields and for finite perturbations. Calculated functions match published results and computer simulations; they provide an intuitive, visual explanation for the experimental misassignment of carrier type in n-type ZnO and agree with published experimental results for holes in a uniform material. These results simplify calculation and plotting of the sensitivities on an NxN grid from a problem of order N^5 to one of order N^3 in the arbitrary case, and of order N^2 in the handful of cases that can be solved exactly, putting a powerful tool for inhomogeneity analysis in the hands of the researcher: calculation of the sensitivities requires little more than the solution of Laplace's equation on the specimen geometry.
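
    As the abstract notes, the core computation is the solution of Laplace's equation on the specimen geometry. A minimal Jacobi-relaxation sketch on a small square grid with hypothetical Dirichlet boundary values (a stand-in for the specimen's electrode potentials, not the authors' method):

```python
def solve_laplace(n, boundary, iters=20000):
    """Jacobi relaxation for Laplace's equation on an n x n grid with
    Dirichlet values supplied by boundary(i, j); interior starts at zero.
    Minimal sketch of the potential solve underlying sensitivity maps."""
    u = [[boundary(i, j) if i in (0, n - 1) or j in (0, n - 1) else 0.0
          for j in range(n)] for i in range(n)]
    for _ in range(iters):
        v = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # each interior point relaxes to the mean of its neighbors
                v[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                  + u[i][j - 1] + u[i][j + 1])
        u = v
    return u
```

    With a boundary potential varying linearly across the grid, the converged interior reproduces the same linear potential, as the exact harmonic solution requires.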

  4. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  5. Polysialylated N-Glycans Identified in Human Serum Through Combined Developments in Sample Preparation, Separations and Electrospray ionization-mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kronewitter, Scott R.; Marginean, Ioan; Cox, Jonathan T.

    The N-glycan diversity of human serum glycoproteins, i.e. the human blood serum N-glycome, is complex due to the range of glycan structures potentially synthesizable by human glycosylation enzymes. The reported glycome, however, is limited by methods of sample preparation, available analytical platforms, e.g., those based upon electrospray ionization-mass spectrometry (ESI-MS), and software tools for data analysis. In this report, several improvements have been implemented in sample preparation and analysis to extend ESI-MS glycan characterization and to provide an improved view of glycan diversity. Sample preparation improvements, including acidified, microwave-accelerated PNGase F N-glycan release and sodium borohydride reduction, were optimized to improve quantitative yields and conserve the number of glycoforms detected. Two-stage desalting (during solid phase extraction and on the analytical column) increased the sensitivity by reducing analyte signal division between multiple reducing-end forms or cation adducts. On-line separations were improved by using extended-length graphitized carbon columns and adding TFA as an acid modifier to a formic acid/reversed phase gradient, which provides additional resolving power and significantly improved desorption of both large and heavily sialylated glycans. To improve MS sensitivity and provide gentler ionization conditions at the source-MS interface, subambient pressure ionization with nanoelectrospray (SPIN) has been utilized. When these method improvements are combined with the recently described Glycomics Quintavariate Informed Quantification (GlyQ-IQ) [1], they significantly extend glycan detection sensitivity and provide expanded glycan coverage.
    We demonstrate application of these advances in the context of the human serum glycome, for which our initial observations include detection of a new class of heavily sialylated N-glycans, among them polysialylated N-glycans.

  6. Development and analytical validation of a radioimmunoassay for the measurement of feline pancreatic lipase immunoreactivity in serum

    PubMed Central

    2004-01-01

    Abstract Pancreatitis is recognized as an important cause of morbidity and mortality in cats, but diagnosis remains difficult in many cases. As a first step toward a better diagnostic tool for feline pancreatitis, the objective of this project was to develop and analytically validate a radioimmunoassay for the measurement of feline pancreatic lipase immunoreactivity (fPLI). Feline pancreatic lipase (fPL) was purified from pancreatic tissue and antiserum against fPL was raised in rabbits. Tracer was produced by iodination of fPL using the chloramine T method. A radioimmunoassay was established and analytically validated by determination of sensitivity, dilutional parallelism, spiking recovery, intra-assay variability, and interassay variability. A control range for fPLI in cat serum was established from 30 healthy cats using the central 95th percentile. The sensitivity of the assay was 1.2 μg/L. Observed-to-expected ratios for serial dilutions ranged from 98.8% to 164.3% for 3 different serum samples. Observed-to-expected ratios for spiking recovery ranged from 76.9% to 147.6% for 3 different serum samples. Coefficients of variation for intra- and interassay variability for 4 different serum samples were 10.1%, 4.5%, 2.2%, and 3.9% and 24.4%, 15.8%, 16.6%, and 21.3%, respectively. A reference range for fPLI was established as 1.2 to 3.8 μg/L. We conclude that the assay described is sensitive, accurate, and precise, with limited linearity at the lower end and limited reproducibility at the lower and higher ends of the working range. Further studies to evaluate the clinical usefulness of this assay are needed and in progress. PMID:15581227
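
    The intra- and inter-assay variability figures quoted above are coefficients of variation over replicate measurements; a minimal sketch with invented replicate values:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate assay measurements,
    as used for intra- and inter-assay variability. Values are invented
    for illustration."""
    return 100.0 * stdev(replicates) / mean(replicates)
```

    Replicates of 9, 10 and 11 μg/L give a CV of 10%; identical replicates give 0%.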

  7. Preliminary Validation of Direct Detection of Foot-And-Mouth Disease Virus within Clinical Samples Using Reverse Transcription Loop-Mediated Isothermal Amplification Coupled with a Simple Lateral Flow Device for Detection

    PubMed Central

    Waters, Ryan A.; Fowler, Veronica L.; Armson, Bryony; Nelson, Noel; Gloster, John; Paton, David J.; King, Donald P.

    2014-01-01

    Rapid, field-based diagnostic assays are desirable tools for the control of foot-and-mouth disease (FMD). Current approaches involve either (1) detection of FMD virus (FMDV) with immunochromatographic antigen lateral flow devices (LFD), which have relatively low analytical sensitivity, or (2) portable RT-qPCR, which has high analytical sensitivity but is expensive. Loop-mediated isothermal amplification (LAMP) may provide a platform upon which to develop field-based assays without these drawbacks. The objective of this study was to modify an FMDV-specific reverse transcription-LAMP (RT-LAMP) assay to enable detection of dual-labelled LAMP products with an LFD, and to evaluate simple sample processing protocols without nucleic acid extraction. The limit of detection of this assay was demonstrated to be equivalent to that of a laboratory-based real-time RT-qPCR assay and to have a 10,000-fold higher analytical sensitivity than the FMDV-specific antigen LFD currently used in the field. Importantly, this study demonstrated that FMDV RNA could be detected from epithelial suspensions without the need for prior RNA extraction, utilising a rudimentary heat source for amplification. Once optimised, this RT-LAMP-LFD protocol was able to detect multiple serotypes from field epithelial samples, in addition to detecting FMDV in the air surrounding infected cattle, pigs and sheep, including pre-clinical detection. This study describes the development and evaluation of an assay format which may be used as a future basis for rapid and low-cost detection of FMDV. In addition, it provides "proof of concept" for the future use of LAMP assays to tackle other challenging diagnostic scenarios encompassing veterinary and human health. PMID:25165973

  8. Analytical model for advective-dispersive transport involving flexible boundary inputs, initial distributions and zero-order productions

    NASA Astrophysics Data System (ADS)

    Chen, Jui-Sheng; Li, Loretta Y.; Lai, Keng-Hsin; Liang, Ching-Ping

    2017-11-01

    A novel solution method is presented which leads to an analytical model for advective-dispersive transport in a semi-infinite domain involving a wide spectrum of boundary inputs, initial distributions, and zero-order productions. The method applies the Laplace transform in combination with the generalized integral transform technique (GITT) to obtain the generalized analytical solution. Based on this generalized analytical expression, we derive a comprehensive set of special-case solutions for some time-dependent boundary distributions and zero-order productions, described by the Dirac delta, constant, Heaviside, exponentially-decaying, or periodically sinusoidal functions, as well as some position-dependent initial conditions and zero-order productions specified by the Dirac delta, constant, Heaviside, or exponentially-decaying functions. The developed solutions are tested against an analytical solution from the literature. The excellent agreement between the analytical solutions confirms that the new model can serve as an effective tool for investigating transport behaviors under different scenarios. Several examples of applications are given to explore transport behaviors which are rarely noted in the literature. The results show that the concentration waves resulting from the periodically sinusoidal input are sensitive to the dispersion coefficient. The implication of this new finding is that a tracer test with a periodic input may provide additional information for identifying the dispersion coefficient. Moreover, the solution strategy presented in this study can be extended to derive analytical models for handling more complicated problems of solute transport in multi-dimensional media subjected to sequential decay chain reactions, for which analytical solutions are not currently available.
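
    For the constant (Dirichlet) boundary input, the semi-infinite advection-dispersion solution reduces to the classic Ogata-Banks expression. A sketch of that special case follows; it is only one member of the family of solutions the paper derives, and the parameter values used below are hypothetical:

```python
from math import erfc, exp, sqrt

def ogata_banks(x, t, v, D, c0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation with
    constant inlet concentration c0, pore velocity v and dispersion
    coefficient D:
      C/c0 = 0.5*[erfc((x - v t)/(2 sqrt(D t)))
                  + exp(v x / D) * erfc((x + v t)/(2 sqrt(D t)))]"""
    a = 2.0 * sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / a)
                       + exp(v * x / D) * erfc((x + v * t) / a))
```

    At late times the concentration approaches the inlet value c0, while far ahead of the advancing front it remains near zero, which matches the expected breakthrough behavior.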

  9. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors.

    PubMed

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-02-24

    Molecularly imprinted polymers (MIPs) as artificial antibodies have received considerable scientific attention in the past years in the field of (bio)sensors since they have unique features that distinguish them from natural antibodies such as robustness, multiple binding sites, low cost, facile preparation and high stability under extreme operation conditions (higher pH and temperature values, etc.). On the other hand, the Quartz Crystal Microbalance (QCM) is an analytical tool based on the measurement of small mass changes on the sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability and reproducibility. QCM devices are highly suitable for converting the recognition process achieved using MIP-based memories into a sensor signal. Therefore, the combination of a QCM and MIPs as synthetic receptors enhances the sensitivity through MIP process-based multiplexed binding sites using size, 3D-shape and chemical function having molecular memories of the prepared sensor system toward the target compound to be detected. This review aims to highlight and summarize the recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology.
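
    The QCM's mass-to-frequency transduction described above is usually modeled by the Sauerbrey equation. A sketch using the standard nominal quartz constants; a real MIP-coated sensor would be calibrated rather than relying on these values:

```python
from math import sqrt

RHO_Q = 2.648      # quartz density, g/cm^3 (standard nominal value)
MU_Q = 2.947e11    # quartz shear modulus, g/(cm*s^2) (standard nominal value)

def sauerbrey_df(delta_m_ug, area_cm2, f0_hz):
    """Sauerbrey frequency shift (Hz) for a rigid mass load on a QCM:
      df = -2 f0^2 dm / (A * sqrt(rho_q * mu_q))
    delta_m_ug in micrograms, area in cm^2, fundamental frequency in Hz."""
    dm_g = delta_m_ug * 1e-6
    return -2.0 * f0_hz ** 2 * dm_g / (area_cm2 * sqrt(RHO_Q * MU_Q))
```

    For a 5 MHz crystal, 1 μg bound over 1 cm² lowers the resonance frequency by roughly 56.6 Hz, which is why QCM readily resolves the small mass changes from analyte binding to an imprinted layer.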

  10. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in such studies: the assignment of a factor to a source in factor-analytical models (source identification) and the evaluation of model performance. Source identification is based on the similarity between a given factor and source chemical profiles from public databases. Model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical-profile and time-series similarity. In this study, a sensitivity analysis of the model performance criteria is carried out using the results of a synthetic dataset for which "a priori" references are available. The consensus-modulated standard deviation p_unc proves the best choice for model performance evaluation when a conservative approach is adopted.
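
    The source-identification step rests on the similarity between a factor profile and candidate source chemical profiles. One plausible metric is the Pearson distance sketched below; the tool's actual indicators and thresholds may differ, and the profiles here are invented:

```python
def pearson_distance(factor, source):
    """1 - Pearson correlation between a factor's chemical profile and a
    candidate source profile (0 = perfectly similar, 2 = anti-correlated).
    Illustrative similarity measure, not necessarily DeltaSA's."""
    n = len(factor)
    mf, ms = sum(factor) / n, sum(source) / n
    cov = sum((f - mf) * (s - ms) for f, s in zip(factor, source))
    vf = sum((f - mf) ** 2 for f in factor) ** 0.5
    vs = sum((s - ms) ** 2 for s in source) ** 0.5
    return 1.0 - cov / (vf * vs)
```

    A factor whose species fractions are proportional to a database profile scores a distance near 0, supporting assignment of that factor to the source.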

  11. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. 
Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  12. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. 
IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  13. Mid-frequency Band Dynamics of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.; Adams, Douglas S.

    2004-01-01

    High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.

  14. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    NASA Astrophysics Data System (ADS)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing, that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. 
    The impact of such changes is verified through calculations which are compared to a "direct" measurement found by adjustment of the original ENDF format file.
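
    Propagating a nuclear-data covariance matrix to a k-eff uncertainty via sensitivity coefficients typically follows the first-order "sandwich" rule, var(k) = S^T C S. A minimal sketch with a hypothetical two-group example (not NDaST's implementation):

```python
def keff_uncertainty(S, C):
    """Standard deviation of k-eff from sensitivity vector S (per unit
    relative cross-section change) and covariance matrix C, via the
    sandwich rule var(k) = S^T C S. Group structure and values are
    hypothetical."""
    n = len(S)
    CS = [sum(C[i][j] * S[j] for j in range(n)) for i in range(n)]
    var = sum(S[i] * CS[i] for i in range(n))
    return var ** 0.5
```

    With a sensitivity of 1 in the first group only and a 4% variance on that group's data, the propagated k-eff uncertainty is 20% of the relative data uncertainty, i.e. 0.2 in these units.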

  15. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
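
    Disproportionality signal scores of the kind the prototype generates are commonly computed as a proportional reporting ratio (PRR) over a drug-event 2x2 contingency table. A sketch with invented counts; the tool's exact statistic is not specified in the abstract:

```python
from math import log, sqrt, exp

def prr(a, b, c, d):
    """Proportional reporting ratio with an approximate 95% CI for a
    drug-event 2x2 table: a = reports of drug with event, b = drug with
    other events, c = other drugs with event, d = other drugs with other
    events. Standard pharmacovigilance disproportionality score."""
    ratio = (a / (a + b)) / (c / (c + d))
    se = sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = exp(log(ratio) - 1.96 * se)
    hi = exp(log(ratio) + 1.96 * se)
    return ratio, lo, hi
```

    A PRR well above 1 with a lower confidence bound above 1 is the usual trigger for flagging a candidate drug-adverse event pair for manual review.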

  16. Microwave-assisted chemical pre-treatment of waste sorghum leaves: Process optimization and development of an intelligent model for determination of volatile compound fractions.

    PubMed

    Rorke, Daneal C S; Suinyuy, Terence N; Gueguim Kana, E B

    2017-01-01

    This study reports the profiling of volatile compounds generated during microwave-assisted chemical pre-treatment of sorghum leaves. Compounds including acetic acid (0-186.26 ng/g SL), furfural (0-240.80 ng/g SL), 5-hydroxymethylfurfural (HMF) (0-19.20 ng/g SL) and phenol (0-7.76 ng/g SL) were detected. The reducing sugar production was optimized. An intelligent model based on Artificial Neural Networks (ANNs) was developed and validated to predict a profile of 21 volatile compounds under novel pre-treatment conditions. This model gave R²-values of up to 0.93. Knowledge extraction revealed that furfural and phenol exhibited high sensitivity to acid and alkali concentration and S:L ratio, while phenol showed high sensitivity to microwave duration and intensity. Furthermore, furfural production depended mainly on acid concentration and fit a dosage-response relationship model with a 2.5% HCl threshold. Significant non-linearities were observed between pre-treatment conditions and the profile of various compounds. This tool reduces analytical costs through virtual analytical instrumentation, improving process economics. Copyright © 2016 Elsevier Ltd. All rights reserved.
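
    The R²-values quoted for the ANN model are coefficients of determination between observed and predicted compound fractions; a minimal sketch of the computation with made-up values:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination R^2 = 1 - SS_res/SS_tot, the metric
    used to judge predicted volatile-compound fractions. Inputs here are
    invented, not the study's data."""
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

    Perfect predictions give R² = 1; an R² of 0.93 means the model explains 93% of the variance in the measured fractions.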

  17. Amperometric Enzyme-Based Biosensors for Application in Food and Beverage Industry

    NASA Astrophysics Data System (ADS)

    Csöregi, Elisabeth; Gáspár, Szilveszter; Niculescu, Mihaela; Mattiasson, Bo; Schuhmann, Wolfgang

    Continuous, sensitive, selective, and reliable monitoring of a large variety of different compounds in various food and beverage samples is of increasing importance to assure high quality and to trace any possible source of contamination of food and beverages. Most of the presently used classical analytical methods require expensive instrumentation, long analysis times and well-trained staff. Amperometric enzyme-based biosensors, on the other hand, have emerged in the last decade from basic science to useful tools with very promising application possibilities in the food and beverage industry. Amperometric biosensors are in general highly selective, sensitive, relatively cheap, and easy to integrate into continuous analysis systems. A successful application of such sensors for industrial purposes, however, requires a sensor design which satisfies the specific needs of monitoring the targeted analyte in the particular application. Since each individual application needs different operational conditions and sensor characteristics, it is obvious that biosensors have to be tailored for the particular case. The characteristics of the biosensors depend on the biorecognition element used (enzyme), the nature of the signal transducer (electrode material) and the communication between these two elements (electron-transfer pathway).

  18. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  19. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
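
    The procedure described above, comparing monthly patient-result medians against allowable bias derived from biological variation, can be sketched as follows. The bias specification uses Fraser's desirable limit, 0.25·sqrt(CVI² + CVG²); the potassium data, target value, and CV figures are hypothetical placeholders, not the study's:

```python
import statistics

def allowable_bias(cv_i, cv_g):
    """Desirable specification for analytical bias from biological
    variation (Fraser): 0.25 * sqrt(CVI^2 + CVG^2), in percent."""
    return 0.25 * (cv_i ** 2 + cv_g ** 2) ** 0.5

def median_check(monthly_results, target, cv_i, cv_g):
    """Flag whether a month's patient-result median stays within the
    allowable bias around the long-term target."""
    med = statistics.median(monthly_results)
    bias_pct = 100.0 * (med - target) / target
    return med, abs(bias_pct) <= allowable_bias(cv_i, cv_g)

# Hypothetical potassium results (mmol/L) for one month vs. a 4.2 target;
# CVI/CVG values are illustrative biological-variation figures.
med, stable = median_check([4.0, 4.1, 4.2, 4.3, 4.2, 4.1],
                           target=4.2, cv_i=4.6, cv_g=5.6)
```

    A month whose median drifts outside the limit would be the trigger for investigating the analytical (or pre-analytical) process.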

  20. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    USGS Publications Warehouse

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction-dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in two new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater-surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux-LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., <1 m) or deep (e.g., up to 100 m) profiles. The solution is not transient, and thus, it should be cautiously applied where diel signals propagate or in deeper zones where multi-decadal surface signals have disturbed subsurface thermal regimes.
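
    The layered solution applied here builds on the classic homogeneous case. As a hedged illustration of the underlying physics only (this is the Bredehoeft and Papadopulos (1965) steady conduction-advection profile, not the Shan and Bodvarsson layered solution), the shape of a temperature-depth profile under vertical Darcy flux can be sketched as follows; thermal-property values are placeholders:

```python
import math

def temp_profile(z, L, T0, TL, q, rho_c_w=4.18e6, k_t=1.4):
    """Steady conduction-advection temperature at depth z (m) in a
    homogeneous layer of thickness L with boundary temperatures T0 (top)
    and TL (bottom). q is the vertical Darcy flux (m/s, positive
    downward); rho_c_w is the volumetric heat capacity of water
    (J m^-3 K^-1) and k_t the bulk thermal conductivity (W m^-1 K^-1)."""
    pe = q * rho_c_w * L / k_t      # thermal Peclet number
    if abs(pe) < 1e-12:             # pure conduction: linear profile
        return T0 + (TL - T0) * z / L
    return T0 + (TL - T0) * (math.expm1(pe * z / L) / math.expm1(pe))

# Downward flow drags cold surface water down, so the mid-depth
# temperature falls below the linear, conduction-only profile.
t_conduction = temp_profile(0.5, 1.0, 10.0, 14.0, 0.0)
t_downward = temp_profile(0.5, 1.0, 10.0, 14.0, 1e-6)
```

    Inverting this curvature for the flux q is the essence of the inverse problem the Flux-LM spreadsheet solves, with the added complication of layer-by-layer thermal properties.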

  1. QSPR studies on the photoinduced-fluorescence behaviour of pharmaceuticals and pesticides.

    PubMed

    López-Malo, D; Bueso-Bordils, J I; Duart, M J; Alemán-López, P A; Martín-Algarra, R V; Antón-Fos, G M; Lahuerta-Zamora, L; Martínez-Calatayud, J

    2017-07-01

    Fluorimetric analysis is still a growing line of research for the determination of a wide range of organic compounds, including pharmaceuticals and pesticides, which makes necessary the development of new strategies aimed at improving the performance of fluorescence determinations as well as the sensitivity and, especially, the selectivity of newly developed analytical methods. This paper presents applications of a useful and growing tool for fostering and improving research in the analytical field. Experimental screening, molecular connectivity and discriminant analysis are applied to organic compounds to predict their fluorescent behaviour after photodegradation by UV irradiation in a continuous flow manifold (multicommutation flow assembly). The screening was based on online fluorimetric measurement and comprised pre-selected compounds with different molecular structures (pharmaceuticals and some pesticides with known 'native' fluorescent behaviour) to study changes in their fluorescent behaviour after UV irradiation. Theoretical predictions agree with the results of the experimental screening and could be used to develop selective analytical methods, as well as helping to reduce the need for expensive, time-consuming trial-and-error screening procedures.

  2. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormone analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Analytical expression for position sensitivity of linear response beam position monitor having inter-electrode cross talk

    NASA Astrophysics Data System (ADS)

    Kumar, Mukesh; Ojha, A.; Garg, A. D.; Puntambekar, T. A.; Senecha, V. K.

    2017-02-01

    According to the quasi-electrostatic model of a linear response capacitive beam position monitor (BPM), the position sensitivity of the device depends only on its aperture and is independent of processing frequency and load impedance. In practice, however, due to inter-electrode capacitive coupling (cross talk), the actual position sensitivity of the device decreases with increasing frequency and load impedance. We have taken the inter-electrode capacitance into account to derive and propose a new analytical expression for the position sensitivity as a function of frequency and load impedance. The sensitivity of a linear response shoe-box type BPM has been obtained through simulation using CST Studio Suite to verify and confirm the validity of the new analytical equation. Good agreement between the simulation results and the new analytical expression suggests that this method can be exploited for the proper design of BPMs.
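
    The abstract does not reproduce the new expression, but the mechanism it describes can be illustrated with a toy lumped model: a BPM reads beam position from the difference-over-sum of opposing electrode signals, and leaking a fraction k of each electrode's signal into the other dilutes that reading. All names and numbers below are hypothetical, not the paper's derivation:

```python
def position_estimate(v_r, v_l):
    """Difference-over-sum position signal of a two-electrode BPM."""
    return (v_r - v_l) / (v_r + v_l)

def with_crosstalk(v_r, v_l, k):
    """Mix a fraction k of each electrode's signal into the other one,
    a crude lumped stand-in for inter-electrode capacitive coupling."""
    return v_r + k * v_l, v_l + k * v_r

# Ideal linear response: electrodes see 1 +/- s*x (illustrative units).
s, x = 0.1, 2.0
v_r, v_l = 1 + s * x, 1 - s * x
ideal = position_estimate(v_r, v_l)
coupled = position_estimate(*with_crosstalk(v_r, v_l, k=0.2))
# In this model the sensitivity is scaled by (1 - k)/(1 + k).
```

    Because the coupling k grows with frequency and load impedance, this toy model reproduces the qualitative trend the paper quantifies rigorously.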

  4. Multifunctionalized Reduced Graphene Oxide Biosensors for Simultaneous Monitoring of Structural Changes in Amyloid-β 40.

    PubMed

    Jeong, Dahye; Kim, Jinsik; Chae, Myung-Sic; Lee, Wonseok; Yang, Seung-Hoon; Kim, YoungSoo; Kim, Seung Min; Lee, Jin San; Lee, Jeong Hoon; Choi, Jungkyu; Yoon, Dae Sung; Hwang, Kyo Seon

    2018-05-28

    Determination of the conformation (monomer, oligomer, or fibril) of amyloid peptide aggregates in the human brain is essential for the diagnosis and treatment of Alzheimer's disease (AD). Accordingly, systematic investigation of amyloid conformation using analytical tools is essential for precisely quantifying the relative amounts of the three conformations of amyloid peptide. Here, we developed a reduced graphene oxide (rGO) based multiplexing biosensor that could be used to monitor the relative amounts of the three conformations of various amyloid-β 40 (Aβ40) fluids. The electrical rGO biosensor was composed of a multichannel sensor array capable of individual detection of monomers, oligomers, and fibrils in a single amyloid fluid sample. From the performance test of each sensor, we showed that this method had good analytical sensitivity (1 pg/mL) and a fairly wide dynamic range (1 pg/mL to 10 ng/mL) for each conformation of Aβ40. To verify whether the rGO biosensor could be used to evaluate the relative amounts of the three conformations, various amyloid solutions (monomeric Aβ40, aggregated Aβ40, and disaggregated Aβ40 solutions) were employed. Notably, different trends in the relative amounts of the three conformations were observed in each amyloid solution, indicating that this information could serve as an important parameter in the clinical setting. Accordingly, our analytical tool could precisely detect the relative amounts of the three conformations of Aβ40 and may have potential applications as a diagnostic system for AD.

  5. Pure-rotational spectrometry: a vintage analytical method applied to modern breath analysis.

    PubMed

    Hrubesh, Lawrence W; Droege, Michael W

    2013-09-01

    Pure-rotational spectrometry (PRS) is an established method, typically used to study structures and properties of polar gas-phase molecules, including isotopic and isomeric varieties. PRS has also been used as an analytical tool where it is particularly well suited for detecting or monitoring low-molecular-weight species that are found in exhaled breath. PRS is principally notable for its ultra-high spectral resolution which leads to exceptional specificity to identify molecular compounds in complex mixtures. Recent developments using carbon aerogel for pre-concentrating polar molecules from air samples have extended the sensitivity of PRS into the part-per-billion range. In this paper we describe the principles of PRS and show how it may be configured in several different modes for breath analysis. We discuss the pre-concentration concept and demonstrate its use with the PRS analyzer for alcohols and ammonia sampled directly from the breath.

  6. Biosensors-on-chip: a topical review

    NASA Astrophysics Data System (ADS)

    Chen, Sensen; Shamsi, Mohtashim H.

    2017-08-01

    This review will examine the integration of two fields that are currently at the forefront of science: biosensors and microfluidics. As a lab-on-a-chip (LOC) technology, microfluidics has been enriched by the integration of various detection tools for analyte detection and quantitation. The application of such microfluidic platforms has greatly increased in the area of biosensors geared towards point-of-care diagnostics. Together, the merger of microfluidics and biosensors has generated miniaturized devices for sample processing and sensitive detection with quantitation. We believe that microfluidic biosensors (biosensors-on-chip) are essential for developing robust and cost-effective point-of-care diagnostics. This review is relevant to a variety of disciplines, such as medical science, clinical diagnostics, LOC technologies including MEMS/NEMS, and analytical science. Specifically, this review will appeal to scientists working in the two overlapping fields of biosensors and microfluidics, and will also help new scientists find their direction in developing point-of-care devices.

  7. Ethane-Bridged Bisporphyrin Conformational Changes As an Effective Analytical Tool for Nonenzymatic Detection of Urea in the Physiological Range.

    PubMed

    Buccolieri, Alessandro; Hasan, Mohammed; Bettini, Simona; Bonfrate, Valentina; Salvatore, Luca; Santino, Angelo; Borovkov, Victor; Giancane, Gabriele

    2018-06-05

    Conformational switching induced in ethane-bridged bisporphyrins was used as a sensitive transduction method for revealing the presence of urea dissolved in water via a nonenzymatic approach. Bisporphyrins were deposited on solid quartz slides by means of the spin-coating method. Molecular conformations of Zn and Ni monometalated bisporphyrins were influenced by water-solvated urea molecules, and their fluorescence emission was modulated by the urea concentration. Absorption, fluorescence and Raman spectroscopies allowed the identification of the supramolecular processes responsible for the host-guest interaction between the active layers and urea molecules. A high selectivity of the sensing mechanism was highlighted upon testing the spectroscopic responses of bisporphyrin films to citrulline and glutamine used as interfering agents. Additionally, potential applicability was demonstrated by quantifying the urea concentration in real physiological samples, proposing this new approach as a valuable alternative analytical procedure to the traditionally used enzymatic methods.

  8. Neutron Activation Analysis of the Rare Earth Elements (REE) - With Emphasis on Geological Materials

    NASA Astrophysics Data System (ADS)

    Stosch, Heinz-Günter

    2016-08-01

    Neutron activation analysis (NAA) has been the analytical method of choice for rare earth element (REE) analysis from the early 1960s through the 1980s. At that time, irradiation facilities were widely available and fairly easily accessible. The development of high-resolution gamma-ray detectors in the mid-1960s eliminated, for many applications, the need for chemical separation of the REE from the matrix material, making NAA a reliable and effective analytical tool. While not as precise as isotope dilution mass spectrometry, NAA was competitive by being sensitive for the analysis of about half of the rare earths (La, Ce, Nd, Sm, Eu, Tb, Yb, Lu). The development of inductively coupled plasma mass spectrometry since the 1980s, together with the decommissioning of research reactors and the lack of installation of new ones in Europe and North America, has led to the rapid decline of NAA.

  9. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  10. Flexible, Low-Cost Sensor Based on Electrolyte Gated Carbon Nanotube Field Effect Transistor for Organo-Phosphate Detection

    PubMed Central

    Bhatt, Vijay Deep; Joshi, Saumya; Becherer, Markus; Lugli, Paolo

    2017-01-01

    A flexible enzymatic acetylcholinesterase biosensor based on an electrolyte-gated carbon nanotube field effect transistor is demonstrated. The enzyme immobilization is done on a planar gold gate electrode using 3-mercaptopropionic acid as the linker molecule. The sensor showed good sensing capability for the neurotransmitter acetylcholine, with a sensitivity of 5.7 μA/decade, and demonstrated excellent specificity when tested against interfering analytes present in the body. As the flexible sensor is expected to suffer mechanical deformations, its endurance was measured by putting it under extensive mechanical stress. The enzymatic activity was inhibited by more than 70% when the phosphate-buffered saline (PBS) buffer was spiked with 5 mg/mL malathion (an organophosphate) solution. The biosensor was successfully challenged with tap water and strawberry juice, demonstrating its usefulness as an analytical tool for organophosphate detection. PMID:28524071
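
    A sensitivity quoted in μA/decade implies a log-linear calibration: each tenfold increase in analyte concentration adds a fixed current step. A minimal sketch of such a calibration and its inverse follows; the reference current and concentration are hypothetical, and only the 5.7 μA/decade slope comes from the abstract:

```python
import math

SENSITIVITY_UA_PER_DECADE = 5.7  # slope reported in the abstract
I_REF = 1.0    # hypothetical current (uA) at the reference concentration
C_REF = 1e-6   # hypothetical reference acetylcholine concentration (M)

def current_from_conc(c):
    """Log-linear calibration: each decade of concentration above C_REF
    adds SENSITIVITY_UA_PER_DECADE microamps to the sensor current."""
    return I_REF + SENSITIVITY_UA_PER_DECADE * math.log10(c / C_REF)

def conc_from_current(i):
    """Invert the calibration to recover concentration from current."""
    return C_REF * 10 ** ((i - I_REF) / SENSITIVITY_UA_PER_DECADE)

i = current_from_conc(1e-5)  # one decade above C_REF
```

    Enzyme inhibition by an organophosphate suppresses the measured current, which is how the sensor reads out malathion exposure.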

  11. Ion sampling and transport in Inductively Coupled Plasma Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Farnsworth, Paul B.; Spencer, Ross L.

    2017-08-01

    Quantitative accuracy and high sensitivity in inductively coupled plasma mass spectrometry (ICP-MS) depend on consistent and efficient extraction and transport of analyte ions from an inductively coupled plasma to a mass analyzer, where they are sorted and detected. In this review we examine the fundamental physical processes that control ion sampling and transport in ICP-MS and compare the results of theory and computerized models with experimental efforts to characterize the flow of ions through plasma mass spectrometers' vacuum interfaces. We trace the flow of ions from their generation in the plasma, into the sampling cone, through the supersonic expansion in the first vacuum stage, through the skimmer, and into the ion optics that deliver the ions to the mass analyzer. At each stage we consider idealized behavior and departures from ideal behavior that affect the performance of ICP-MS as an analytical tool.

  12. Applications of fiber-optics-based nanosensors to drug discovery.

    PubMed

    Vo-Dinh, Tuan; Scaffidi, Jonathan; Gregas, Molly; Zhang, Yan; Seewaldt, Victoria

    2009-08-01

    Fiber-optic nanosensors are fabricated by heating and pulling optical fibers to yield sub-micron diameter tips and have been used for in vitro analysis of individual living mammalian cells. Immobilization of bioreceptors (e.g., antibodies, peptides, DNA) selective to targeting analyte molecules of interest provides molecular specificity. Excitation light can be launched into the fiber, and the resulting evanescent field at the tip of the nanofiber can be used to excite target molecules bound to the bioreceptor molecules. The fluorescence or surface-enhanced Raman scattering produced by the analyte molecules is detected using an ultra-sensitive photodetector. This article provides an overview of the development and application of fiber-optic nanosensors for drug discovery. The nanosensors provide minimally invasive tools to probe subcellular compartments inside single living cells for health effect studies (e.g., detection of benzopyrene adducts) and medical applications (e.g., monitoring of apoptosis in cells treated with anticancer drugs).

  14. Forensic collection of trace chemicals from diverse surfaces with strippable coatings.

    PubMed

    Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A

    2013-11-07

    Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m⁻²) from a diverse array of common surfaces - painted metal, engineering plastics, painted wallboard and concrete - using strippable coatings. The collection efficiency of the strippable coatings is compared to that of gauze pads and far exceeds it. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings, whereas gauze averaged 10%.

  15. Advances in Molecular Rotational Spectroscopy for Applied Science

    NASA Astrophysics Data System (ADS)

    Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.

    2017-06-01

    Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is perhaps most familiar as a technique for air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases with parts-per-trillion detection limits in solid materials, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.

  16. Online platform for applying space-time scan statistics for prospectively detecting emerging hot spots of dengue fever.

    PubMed

    Chen, Chien-Chou; Teng, Yung-Chu; Lin, Bo-Cheng; Fan, I-Chun; Chan, Ta-Chien

    2016-11-25

    Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space-time scan statistics by SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and p value <0.05 was identified as a dengue-epidemic area. Assuming an ongoing transmission might continuously spread for two consecutive weeks, we estimated the sensitivity and specificity for detecting outbreaks by comparing the scan-based classification (dengue-epidemic vs. dengue-free village) with the true cumulative case numbers from the TCDC's surveillance statistics. Among the 1648 villages in Tainan and Kaohsiung, the overall sensitivity for detecting outbreaks increases as case numbers grow in a total of 92 weekly simulations. The specificity for detecting outbreaks behaves inversely, compared to the sensitivity. On average, the mean sensitivity and specificity of 2-week hot spot detection were 0.615 and 0.891 respectively (p value <0.001) for the covariate adjustment model, as the maximum spatial and temporal windows were specified as 50% of the total population at risk and 28 days. Dengue-epidemic villages were visualized and explored in an interactive map. We designed an online analytical tool for front-line public health workers to prospectively detect ongoing dengue fever transmission on a weekly basis at the village level by using the routine surveillance data.
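
    The core of the discrete Poisson scan statistic applied by SaTScan can be sketched as follows: for a candidate space-time window, compare observed to expected cases via a relative risk and a log-likelihood ratio, with significance then assessed by Monte Carlo replication. The counts below are hypothetical, not from the Taiwan data:

```python
import math

def poisson_scan_stats(c, e, C):
    """Relative risk and log-likelihood ratio for a candidate cluster
    under Kulldorff's discrete Poisson scan statistic: c observed and
    e expected cases inside the window, out of C total cases."""
    rr = (c / e) / ((C - c) / (C - e))
    if c <= e:                      # no excess inside the window
        return rr, 0.0
    llr = c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))
    return rr, llr

# Hypothetical village window: 30 observed vs. 12 expected of 500 cases.
rr, llr = poisson_scan_stats(30, 12, 500)
# rr > 1 marks the window as a candidate hot spot; its p value comes
# from re-ranking the LLR against Monte Carlo replicates.
```

    The weekly pipeline described in the abstract repeats this evaluation over all village-centred windows and flags those with relative risk >1 and p value <0.05.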

  17. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  18. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  19. Development and evaluation of a novel LAMP assay for the diagnosis of Cutaneous and Visceral Leishmaniasis.

    PubMed

    Adams, Emily Rebecca; Schoone, Gerard; Versteeg, Inge; Gomez, Maria Adelaida; Diro, Ermias; Mori, Yasuyoshi; Perlee, Desiree; Downing, Tim; Saravia, Nancy; Assaye, Ashenafi; Hailu, Asrat; Albertini, Audrey; Ndung'u, Joseph Mathu; Schallig, Henk

    2018-04-25

    A novel pan-Leishmania LAMP assay was developed for diagnosis of cutaneous and visceral leishmaniasis (CL & VL) which can be used in near-patient settings. Primers were designed on the 18S rDNA and on the conserved region of the minicircle kDNA, selected on the basis of high copy number. LAMP assays were evaluated for CL in a prospective cohort trial of 105 patients in South-West Colombia. Lesion swab samples from CL suspects were collected and tested using LAMP and compared to a composite reference of microscopy and/or culture to calculate diagnostic accuracy. LAMP assays were tested on 50 VL-suspected patients from Ethiopia, including whole blood, peripheral blood mononuclear cells, and buffy coat. Diagnostic accuracy was calculated against a reference standard of microscopy of splenic or bone marrow aspirates. To determine analytical specificity, 100 clinical samples and isolates with fever-causing pathogens, including malaria, arboviruses and bacterial infections, were tested. The LAMP assay had a sensitivity of 95% (95% CI: 87.2%-98.5%) and a specificity of 86% (95% CI: 67.3%-95.9%) for the diagnosis of CL. In VL suspects the sensitivity was 92% (95% CI: 74.9%-99.1%) and the specificity 100% (95% CI: 85.8%-100%) in whole blood. For CL, LAMP is a sensitive tool for diagnosis and requires less equipment, time and expertise than alternative CL diagnostics. For VL, LAMP is sensitive using a minimally invasive sample as compared to the gold standard. The analytical specificity was 100%. Copyright © 2018 Adams et al.
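
    Diagnostic sensitivities like those above are proportions with confidence intervals. One common way to compute such an interval is the Wilson score method, sketched here with hypothetical counts; the abstract does not state the trial's exact counts or CI method, so this is illustrative only:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion such as
    diagnostic sensitivity (true positives / all diseased)."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Hypothetical counts: 57 of 60 diseased samples test positive (95%).
lo, hi = wilson_ci(57, 60)
```

    Unlike the naive normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly for proportions near 100%, which is why it is often preferred for diagnostic accuracy reporting.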

  20. An Innovative Field-Applicable Molecular Test to Diagnose Cutaneous Leishmania Viannia spp. Infections

    PubMed Central

    Saldarriaga, Omar A.; Castellanos-Gonzalez, Alejandro; Porrozzi, Renato; Baldeviano, Gerald C.; Lescano, Andrés G.; de Los Santos, Maxy B.; Fernandez, Olga L.; Saravia, Nancy G.; Costa, Erika; Melby, Peter C.; Travi, Bruno L.

    2016-01-01

    Cutaneous and mucosal leishmaniasis is widely distributed in Central and South America. Leishmania of the Viannia subgenus are the most frequent species infecting humans. L. (V.) braziliensis and L. (V.) panamensis are also responsible for metastatic mucosal leishmaniasis. Conventional or real-time PCR is a more sensitive diagnostic test than microscopy, but the cost and requirement for infrastructure and trained personnel make it impractical in most endemic regions. Primary health systems need a sensitive and specific point-of-care (POC) diagnostic tool. We developed a novel POC molecular diagnostic test for cutaneous leishmaniasis caused by Leishmania (Viannia) spp. Parasite DNA was amplified using isothermal Recombinase Polymerase Amplification (RPA) with primers and probes that targeted the kinetoplast DNA. The amplification product was detected by naked eye with a lateral flow (LF) immunochromatographic strip. The RPA-LF had an analytical sensitivity equivalent to 0.1 parasites per reaction. The test amplified the principal L. Viannia species from multiple countries: L. (V.) braziliensis (n = 33), L. (V.) guyanensis (n = 17), L. (V.) panamensis (n = 9). The less common L. (V.) lainsoni, L. (V.) shawi, and L. (V.) naiffi were also amplified. No amplification was observed in parasites of the L. (Leishmania) subgenus. In a small number of clinical samples (n = 13) we found 100% agreement between PCR and RPA-LF. The high analytical sensitivity and clinical validation indicate the test could improve the efficiency of diagnosis, especially in chronic lesions with submicroscopic parasite burdens. Field implementation of the RPA-LF test could contribute to management and control of cutaneous and mucosal leishmaniasis. PMID:27115155

  1. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
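
    The approach described above, a Monte Carlo loop that bootstraps the data and recomputes the model output in each replicate, can be sketched as follows. Net monetary benefit is used here to summarize cost-effectiveness (it avoids dividing by a possibly zero effect difference); all patient-level data, regimen labels, and the willingness-to-pay value are simulated placeholders, not the trial's:

```python
import random
import statistics

random.seed(7)

# Hypothetical patient-level inputs for two H. pylori regimens: cost
# (USD) and eradication success (1/0). Real data would come from trials.
cost_a = [random.gauss(220, 40) for _ in range(120)]
cure_a = [1 if random.random() < 0.85 else 0 for _ in range(120)]
cost_b = [random.gauss(150, 35) for _ in range(120)]
cure_b = [1 if random.random() < 0.75 else 0 for _ in range(120)]

def resample(xs):
    """One nonparametric bootstrap replicate: n draws with replacement."""
    return [random.choice(xs) for _ in xs]

def net_benefit(wtp=1000.0):
    """Incremental net monetary benefit of regimen A over B for one
    bootstrap replicate, at willingness-to-pay wtp per eradication."""
    d_effect = (statistics.mean(resample(cure_a))
                - statistics.mean(resample(cure_b)))
    d_cost = (statistics.mean(resample(cost_a))
              - statistics.mean(resample(cost_b)))
    return wtp * d_effect - d_cost

# The Monte Carlo loop turns a point estimate into a distribution: the
# fraction of replicates with positive net benefit estimates the
# probability that A is cost-effective at this willingness-to-pay.
nmb = [net_benefit() for _ in range(1000)]
p_cost_effective = sum(b > 0 for b in nmb) / len(nmb)
```

    Using the bootstrap here is exactly what lets the analysis avoid assuming theoretical distributions for the model inputs, as the abstract emphasizes.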

  2. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  3. A new approach for downscaling of electromembrane extraction as a lab on-a-chip device followed by sensitive Red-Green-Blue detection.

    PubMed

    Baharfar, Mahroo; Yamini, Yadollah; Seidi, Shahram; Arain, Muhammad Balal

    2018-05-30

    A new design of electromembrane extraction (EME) as a lab-on-a-chip device was proposed for the extraction and determination of phenazopyridine as the model analyte. The extraction procedure coupled EME with a packed sorbent. The analyte was extracted under the applied electrical field across a membrane sheet impregnated with nitrophenyl octyl ether (NPOE) into an acceptor phase, followed by absorption of the analyte on a strong cation exchanger sorbent. The designed chip contained separate spiral channels for the donor and acceptor phases, featuring embedded platinum electrodes to enhance extraction efficiency. The selected donor and acceptor phases were 0 mM HCl and 100 mM HCl, respectively. The on-chip electromembrane extraction was carried out at a voltage of 70 V for 50 min. The analysis was carried out in two modes: a simple Red-Green-Blue (RGB) image analysis and a conventional HPLC-UV system. After absorption of the analyte on the solid phase, its color changed, and a digital picture of the sorbent was taken for the RGB analysis. The parameters affecting the performance of the chip device, comprising the EME and solid-phase microextraction steps, were identified and optimized. The accumulation of the analyte on the solid phase showed excellent sensitivity, and a limit of detection (LOD) lower than 1.0 μg L-1 was achieved by image analysis using a smartphone. The device also offered acceptable intra- and inter-assay RSDs (<10%). The calibration curves were linear within the ranges of 10-1000 μg L-1 and 30-1000 μg L-1 (r2 > 0.9969) for HPLC-UV and RGB analysis, respectively. To investigate the applicability of the method in complicated matrices, urine samples of patients being treated with phenazopyridine were analyzed.
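    The RGB detection step reduces to averaging the color channels over the sorbent region of a photo and inverting a linear calibration curve. The sketch below shows the idea with stdlib Python only; the pixel values, calibration concentrations and function names are hypothetical, not data from the paper.

    ```python
    def mean_rgb(pixels):
        """Average the R, G, B channels over a list of (r, g, b) pixels
        taken from the sorbent region of interest."""
        n = len(pixels)
        return tuple(sum(p[c] for p in pixels) / n for c in range(3))

    def fit_line(x, y):
        """Ordinary least-squares slope and intercept for a 1-D calibration."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        return slope, my - slope * mx

    # Hypothetical calibration: mean red-channel intensity vs. spiked
    # concentration (ug/L); darker sorbent color at higher concentration.
    conc = [10.0, 100.0, 500.0, 1000.0]
    red = [210.0, 195.0, 130.0, 50.0]
    slope, intercept = fit_line(conc, red)

    def concentration_from_red(r):
        """Invert the calibration line to estimate analyte concentration."""
        return (r - intercept) / slope
    ```

    In practice the channel most sensitive to the analyte's color change would be chosen, and the region of interest cropped consistently between calibration and sample photos.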

  4. Recent developments in VSD imaging of small neuronal networks

    PubMed Central

    Hill, Evan S.; Bruno, Angela M.

    2014-01-01

    Voltage-sensitive dye (VSD) imaging is a powerful technique that can provide, in single experiments, a large-scale view of network activity unobtainable with traditional sharp electrode recording methods. Here we review recent work using VSDs to study small networks and highlight several results from this approach. Topics covered include circuit mapping, network multifunctionality, the network basis of decision making, and the presence of variably participating neurons in networks. Analytical tools being developed and applied to large-scale VSD imaging data sets are discussed, and the future prospects for this exciting field are considered. PMID:25225295

  5. Development and validation of the first liquid chromatography-tandem mass spectrometry assay for simultaneous quantification of multiple antiretrovirals in meconium.

    PubMed

    Himes, Sarah K; Scheidweiler, Karl B; Tassiopoulos, Katherine; Kacanek, Deborah; Hazra, Rohan; Rich, Kenneth; Huestis, Marilyn A

    2013-02-05

    A novel method for the simultaneous quantification of 16 antiretroviral (ARV) drugs and 4 metabolites in meconium was developed and validated. Quantification of 6 nucleoside/nucleotide reverse transcriptase inhibitors, 2 non-nucleoside reverse transcriptase inhibitors, 7 protease inhibitors, and 1 integrase inhibitor was achieved in 0.25 g of meconium. Specimen preparation included methanol homogenization and solid-phase extraction. Separate positive and negative polarity multiple reaction monitoring mode injections were required to achieve sufficient sensitivity. Linearity ranged from 10 to 75 ng/g up to 2500 ng/g for most analytes and 100-500 ng/g up to 25,000 ng/g for some; all correlation coefficients were ≥0.99. Extraction efficiencies from meconium were 32.8-119.5% with analytical recovery of 80.3-108.3% and total imprecision of 2.2-11.0% for all quantitative analytes. Two analytes with analytical recovery (70.0-138.5%) falling outside the 80-120% criteria range were considered semiquantitative. Matrix effects ranged from -98.3% to 47.0% for analytes and from -98.0% to 67.2% for internal standards. Analytes were stable (>75%) at room temperature for 24 h, 4 °C for 3 days, -20 °C for 3 freeze-thaw cycles over 3 days, and on the autosampler. Method applicability was demonstrated by analyzing meconium from HIV-uninfected infants born to HIV-positive mothers on ARV therapy. This method can be used as a tool to investigate the potential effects of in utero ARV exposure on childhood health and neurodevelopmental outcomes.

  6. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  7. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  8. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  9. Nested PCR Assay for Eight Pathogens: A Rapid Tool for Diagnosis of Bacterial Meningitis.

    PubMed

    Bhagchandani, Sharda P; Kubade, Sushant; Nikhare, Priyanka P; Manke, Sonali; Chandak, Nitin H; Kabra, Dinesh; Baheti, Neeraj N; Agrawal, Vijay S; Sarda, Pankaj; Mahajan, Parikshit; Ganjre, Ashish; Purohit, Hemant J; Singh, Lokendra; Taori, Girdhar M; Daginawala, Hatim F; Kashyap, Rajpal S

    2016-02-01

    Bacterial meningitis is a dreadful infectious disease with high mortality and morbidity if it remains undiagnosed. Traditional diagnostic methods for bacterial meningitis pose a challenge in accurate identification of the pathogen, making prognosis difficult. The present study therefore aimed to design and evaluate a specific and sensitive nested 16S rDNA genus-based polymerase chain reaction (PCR) assay using clinical cerebrospinal fluid (CSF) for rapid diagnosis of eight pathogens causing the disease. The work was dedicated to the development of an in-house genus-specific 16S rDNA nested PCR covering pathogens of the eight genera responsible for bacterial meningitis, using newly designed as well as literature-based primers for each genus. A total of 150 CSF samples from suspected meningitis patients admitted to the Central India Institute of Medical Sciences (CIIMS), India, during the period from August 2011 to May 2014 were used to evaluate the clinical sensitivity and specificity of the optimized PCR assays. The analytical sensitivity and specificity of the newly designed genus-specific 16S rDNA PCR were found to be ≥92%. With such high sensitivity and specificity, the in-house nested PCR gave 100% sensitivity in clinically confirmed positive cases and 100% specificity in clinically confirmed negative cases, indicating its applicability in clinical diagnosis. The in-house nested PCR system can therefore identify the pathogen causing bacterial meningitis and be useful in selecting a specific treatment line to minimize morbidity. Results are obtained within 24 h, and the high sensitivity makes this nested PCR assay a rapid and accurate diagnostic tool compared to traditional culture-based methods.

  10. Fast analysis of radionuclide decay chain migration

    NASA Astrophysics Data System (ADS)

    Chen, J. S.; Liang, C. P.; Liu, C. W.; Li, L.

    2014-12-01

    A novel tool for rapidly predicting the long-term plume behavior of a radionuclide decay chain of arbitrary length is presented in this study. The tool is based on generalized analytical solutions, derived in compact form, for a set of two-dimensional advection-dispersion equations coupled with sequential first-order decay reactions in a groundwater system. The performance of the developed tool was evaluated against a numerical model using a Laplace transform finite difference scheme; the results indicate that the developed model is robust and accurate. The model was then used to rapidly characterize the transport behavior of a four-member radionuclide decay chain. Results show that the plume extents and concentration levels of any target radionuclide are very sensitive to the longitudinal and transverse dispersion, the decay rate constants, and the retardation factors. The developed model is a useful tool for rapidly assessing the ecological and environmental impact of accidental radionuclide releases such as the Fukushima nuclear disaster, where multiple radionuclides leaked from the reactor and subsequently contaminated the local groundwater and ocean seawater in the vicinity of the nuclear plant.
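    The paper's solutions are for the full two-dimensional advection-dispersion system, which is too long to reproduce here. As a transport-free illustration of the sequential first-order decay those solutions couple, the classic Bateman solution for the last member of a chain can be coded directly (a sketch assuming distinct decay constants and all initial activity in the parent; the function name is ours).

    ```python
    import math

    def bateman(n0, lambdas, t):
        """Amount of the last member of a sequential first-order decay
        chain at time t, starting from n0 atoms of the first member.

        Classic Bateman solution; requires pairwise-distinct decay
        constants (repeated constants make the denominators vanish).
        """
        n = len(lambdas)
        coef = 1.0
        for lam in lambdas[:-1]:
            coef *= lam  # product of upstream decay constants
        total = 0.0
        for i in range(n):
            denom = 1.0
            for j in range(n):
                if j != i:
                    denom *= (lambdas[j] - lambdas[i])
            total += math.exp(-lambdas[i] * t) / denom
        return n0 * coef * total
    ```

    For a one-member "chain" this reduces to simple exponential decay, and for two members it reproduces the familiar parent-daughter ingrowth formula, which makes it a convenient check on any transport code's reaction term.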

  11. Analysis of Cysteine Redox Post-Translational Modifications in Cell Biology and Drug Pharmacology.

    PubMed

    Wani, Revati; Murray, Brion W

    2017-01-01

    Reversible cysteine oxidation is an emerging class of protein post-translational modification (PTM) that regulates catalytic activity, modulates conformation, impacts protein-protein interactions, and affects subcellular trafficking of numerous proteins. Redox PTMs encompass a broad array of cysteine oxidation reactions with different half-lives, topographies, and reactivities, such as S-glutathionylation and sulfoxidation. Recent studies from our group underscore the lesser-known effect of redox protein modifications on drug binding. To date, biological studies to understand mechanistic and functional aspects of redox regulation are technically challenging. A prominent issue is the lack of tools for labeling proteins oxidized to select chemotype/oxidant species in cells. Predictive computational tools and curated databases of oxidized proteins are facilitating structural and functional insights into regulation of the network of oxidized proteins, or redox proteome. In this chapter, we discuss analytical platforms for studying protein oxidation, suggest computational tools currently available in the field to determine redox-sensitive proteins, and begin to illuminate roles of cysteine redox PTMs in drug pharmacology.

  12. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  13. Pulsed quantum cascade laser-based cavity ring-down spectroscopy for ammonia detection in breath.

    PubMed

    Manne, Jagadeeshwari; Sukhorukov, Oleksandr; Jäger, Wolfgang; Tulip, John

    2006-12-20

    Breath analysis can be a valuable, noninvasive tool for the clinical diagnosis of a number of pathological conditions. The detection of ammonia in exhaled breath is of particular interest for it has been linked to kidney malfunction and peptic ulcers. Pulsed cavity ringdown spectroscopy in the mid-IR region has developed into a sensitive analytical technique for trace gas analysis. A gas analyzer based on a pulsed mid-IR quantum cascade laser operating near 970 cm(-1) has been developed for the detection of ammonia levels in breath. We report a sensitivity of approximately 50 parts per billion with a 20 s time resolution for ammonia detection in breath with this system. The challenges and possible solutions for the quantification of ammonia in human breath by the described technique are discussed.

  14. Determination of transformation products of unsymmetrical dimethylhydrazine in water using vacuum-assisted headspace solid-phase microextraction.

    PubMed

    Orazbayeva, Dina; Kenessov, Bulat; Psillakis, Elefteria; Nassyrova, Dayana; Bektassov, Marat

    2018-06-22

    A new, sensitive and simple method based on vacuum-assisted headspace solid-phase microextraction (Vac-HSSPME) followed by gas chromatography-mass spectrometry (GC-MS) is proposed for the quantification of transformation products of the rocket fuel unsymmetrical dimethylhydrazine (UDMH) in water samples. The target transformation products were: pyrazine, 1-methyl-1H-pyrazole, N-nitrosodimethylamine, N,N-dimethylformamide, 1-methyl-1H-1,2,4-triazole, 1-methylimidazole and 1H-pyrazole. For these analytes and within shorter sampling times, Vac-HSSPME yielded detection limits (0.5-100 ng L-1) 3-10 times lower than those reported for regular HSSPME. Vac-HSSPME sampling for 30 min at 50 °C yielded the best combination of analyte responses and their standard deviations (<15%). 1-Formyl-2,2-dimethylhydrazine and formamide were discarded because of the poor precision and accuracy obtained when using Vac-HSSPME. The recoveries for the remaining analytes ranged between 80 and 119%. The modified Mininert valve and Thermogreen septum could be used for automated extraction, as they ensured stable analyte signals even after long waiting times (>24 h). Finally, multiple Vac-HSSPME proved to be an efficient tool for controlling the matrix effect and quantifying UDMH transformation products. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Recent Advances in Bacteria Identification by Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry Using Nanomaterials as Affinity Probes

    PubMed Central

    Chiu, Tai-Chia

    2014-01-01

    Identifying trace amounts of bacteria rapidly, accurately, selectively, and with high sensitivity is important to ensuring the safety of food and diagnosing infectious bacterial diseases. Microbial diseases constitute the major cause of death in many developing and developed countries of the world. The early detection of pathogenic bacteria is crucial in preventing, treating, and containing the spread of infections, and there is an urgent requirement for sensitive, specific, and accurate diagnostic tests. Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is an extremely selective and sensitive analytical tool that can be used to characterize different species of pathogenic bacteria. Various functionalized or unmodified nanomaterials can be used as affinity probes to capture and concentrate microorganisms. Recent developments in bacterial detection using nanomaterials-assisted MALDI-MS approaches are highlighted in this article. A comprehensive table listing MALDI-MS approaches for identifying pathogenic bacteria, categorized by the nanomaterials used, is provided. PMID:24786089

  16. Rapid and visual detection of Leptospira in urine by LigB-LAMP assay with pre-addition of dye.

    PubMed

    Ali, Syed Atif; Kaur, Gurpreet; Boby, Nongthombam; Sabarinath, T; Solanki, Khushal; Pal, Dheeraj; Chaudhuri, Pallab

    2017-12-01

    Leptospirosis is considered the most widespread zoonotic disease, caused by pathogenic species of Leptospira. The present study reports a novel set of primers targeting the LigB gene for visual detection of pathogenic Leptospira in urine samples through loop-mediated isothermal amplification (LAMP). The results were recorded using hydroxy naphthol blue (HNB), SYBR Green I and calcein. The analytical sensitivity of LAMP was as few as 10 leptospiral organisms in spiked urine samples from cattle and dogs. The LigB gene-based LAMP, termed LigB-LAMP, was found to be 10 times more sensitive than conventional PCR. The diagnostic specificity of LAMP was 100% when compared to SYBR Green qPCR for detection of Leptospira in urine samples. Though qPCR was found to be more sensitive, the rapidity and simplicity of setting up the LAMP test, followed by visual detection of Leptospira infection in clinical samples, make LigB-LAMP an alternative and favourable diagnostic tool in resource-poor settings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Recent advances in bacteria identification by matrix-assisted laser desorption/ionization mass spectrometry using nanomaterials as affinity probes.

    PubMed

    Chiu, Tai-Chia

    2014-04-28

    Identifying trace amounts of bacteria rapidly, accurately, selectively, and with high sensitivity is important to ensuring the safety of food and diagnosing infectious bacterial diseases. Microbial diseases constitute the major cause of death in many developing and developed countries of the world. The early detection of pathogenic bacteria is crucial in preventing, treating, and containing the spread of infections, and there is an urgent requirement for sensitive, specific, and accurate diagnostic tests. Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is an extremely selective and sensitive analytical tool that can be used to characterize different species of pathogenic bacteria. Various functionalized or unmodified nanomaterials can be used as affinity probes to capture and concentrate microorganisms. Recent developments in bacterial detection using nanomaterials-assisted MALDI-MS approaches are highlighted in this article. A comprehensive table listing MALDI-MS approaches for identifying pathogenic bacteria, categorized by the nanomaterials used, is provided.

  18. Advances in ultrasensitive mass spectrometry of organic molecules.

    PubMed

    Kandiah, Mathivathani; Urban, Pawel L

    2013-06-21

    Ultrasensitive mass spectrometric analysis of organic molecules is important for various branches of chemistry, and other fields including physics, earth and environmental sciences, archaeology, biomedicine, and materials science. It finds applications--as an enabling tool--in systems biology, biological imaging, clinical analysis, and forensics. Although there are a number of technical obstacles associated with the analysis of samples by mass spectrometry at ultratrace level (for example analyte losses during sample preparation, insufficient sensitivity, ion suppression), several noteworthy developments have been made over the years. They include: sensitive ion sources, loss-free interfaces, ion optics components, efficient mass analyzers and detectors, as well as "smart" sample preparation strategies. Some of the mass spectrometric methods published to date achieve sensitivity several orders of magnitude higher than that of alternative approaches. Femto- and attomole level limits of detection are nowadays common, while zepto- and yoctomole level limits of detection have also been reported. We envision that ultrasensitive mass spectrometric assays will soon contribute to new discoveries in bioscience and other areas.

  19. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    PubMed

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. The authors analyzed the reliability of the individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For the analytic, holistic, and combined scales, 12, 12, and 11 stations, respectively, yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given this increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.
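    The study estimated reliability with generalizability theory (Phi), which requires variance-component estimation. A simpler classical-test-theory stand-in conveys the same two ideas: internal-consistency reliability across stations (Cronbach's alpha) and the Spearman-Brown prophecy for how reliability changes with the number of stations. This is an illustrative sketch with hypothetical scores, not the authors' G-theory analysis.

    ```python
    def cronbach_alpha(scores):
        """Cronbach's alpha for a candidates-by-stations score matrix
        (list of per-candidate rows). A classical-test-theory proxy for
        the Phi coefficient used in the abstract."""
        k = len(scores[0])
        cols = list(zip(*scores))

        def var(v):
            m = sum(v) / len(v)
            return sum((x - m) ** 2 for x in v) / len(v)

        totals = [sum(row) for row in scores]
        return (k / (k - 1)) * (1 - sum(var(c) for c in cols) / var(totals))

    def spearman_brown(rel, factor):
        """Projected reliability when the number of stations is multiplied
        by `factor` (Spearman-Brown prophecy formula)."""
        return factor * rel / (1 + (factor - 1) * rel)
    ```

    For example, if a 6-station OSCE yields a reliability of 0.6, doubling to 12 stations projects to spearman_brown(0.6, 2) = 0.75, mirroring how the abstract reports the station counts needed to reach Phi = 0.8.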

  20. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing data and collaboratively sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  1. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  2. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  3. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...). Section 102-80.120.

  4. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...). Section 102-80.120.

  5. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  6. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
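    The two atomic operators named in the abstract, selection and aggregation, can be sketched on a minimal attributed graph. The class and function names below are ours, chosen only to illustrate the algebraic idea, not the paper's actual framework.

    ```python
    class Graph:
        """Minimal attributed graph: nodes maps node-id -> attribute dict,
        edges is a set of (u, v) pairs."""

        def __init__(self, nodes, edges):
            self.nodes = dict(nodes)
            self.edges = set(edges)

    def select(g, predicate):
        """Selection operator: keep the nodes whose attributes satisfy
        the predicate, together with the induced edges."""
        kept = {i: a for i, a in g.nodes.items() if predicate(a)}
        return Graph(kept, {(u, v) for u, v in g.edges
                            if u in kept and v in kept})

    def aggregate(g, key):
        """Aggregation operator: merge all nodes sharing an attribute value
        into one super-node, collapsing edges between groups."""
        group = {i: a[key] for i, a in g.nodes.items()}
        nodes = {v: {key: v,
                     "size": sum(1 for x in group.values() if x == v)}
                 for v in set(group.values())}
        edges = {(group[u], group[v]) for u, v in g.edges
                 if group[u] != group[v]}
        return Graph(nodes, edges)
    ```

    Because both operators return a Graph, they compose: aggregate(select(g, p), key) is itself a valid expression, which is the closure property an algebra of this kind relies on for scalable, documentable analyses.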

  7. A highly specific competitive direct enzyme immunoassay for sterigmatocystin as a tool for rapid immunochemotaxonomic differentiation of mycotoxigenic Aspergillus species.

    PubMed

    Wegner, S; Bauer, J I; Dietrich, R; Märtlbauer, E; Usleber, E; Gottschalk, C; Gross, M

    2017-02-01

    A simplified method to produce specific polyclonal rabbit antibodies against sterigmatocystin (STC) was established, using a STC-glycolic acid-ether derivative (STC-GE) conjugated to keyhole limpet haemocyanin (immunogen). The competitive direct enzyme immunoassay (EIA) established for STC had a detection limit (20% binding inhibition) of 130 pg ml-1. The test was highly specific for STC, with minor cross-reactivity with O-methylsterigmatocystin (OMSTC, 0.87%) and negligible reactivity with aflatoxins (<0.02%). The STC-EIA was used in combination with a previously developed specific EIA for aflatoxins (<0.1% cross-reactivity with STC and OMSTC), to study the STC/aflatoxin production profiles of reference strains of Aspergillus species. This immunochemotaxonomic procedure was found to be a convenient tool to identify STC- or aflatoxin-producing strains. The carcinogenic mycotoxin sterigmatocystin (STC) is produced by several Aspergillus species, either alone or together with aflatoxins. Here, we report a very simple and straightforward procedure to obtain highly sensitive and specific anti-STC antibodies, and their use in the first ever real STC-specific competitive direct enzyme immunoassay (EIA). In combination with a previous EIA for aflatoxins, this study for the first time demonstrates the potential of a STC/aflatoxin EIA pair for what is branded as 'immunochemotaxonomic' identification of mycotoxigenic Aspergillus species. This new analytical tool enhances analytical possibilities for differential analysis of STC and aflatoxins. © 2016 The Society for Applied Microbiology.

  8. Semi-automated De-identification of German Content Sensitive Reports for Big Data Analytics.

    PubMed

    Seuss, Hannes; Dankerl, Peter; Ihle, Matthias; Grandjean, Andrea; Hammon, Rebecca; Kaestle, Nicola; Fasching, Peter A; Maier, Christian; Christoph, Jan; Sedlmayr, Martin; Uder, Michael; Cavallaro, Alexander; Hammon, Matthias

    2017-07-01

    Purpose  Projects involving collaborations between different institutions require data security via selective de-identification of words or phrases. A semi-automated de-identification tool was developed and evaluated on different types of medical reports natively and after adapting the algorithm to the text structure. Materials and Methods  A semi-automated de-identification tool was developed and evaluated for its sensitivity and specificity in detecting sensitive content in written reports. Data from 4671 pathology reports (4105 + 566 in two different formats), 2804 medical reports, 1008 operation reports, and 6223 radiology reports of 1167 patients suffering from breast cancer were de-identified. The content was itemized into four categories: direct identifiers (name, address), indirect identifiers (date of birth/operation, medical ID, etc.), medical terms, and filler words. The software was tested natively (without training) in order to establish a baseline. The reports were manually edited and the model re-trained for the next test set. After manually editing 25, 50, 100, 250, 500 and if applicable 1000 reports of each type re-training was applied. Results  In the native test, 61.3 % of direct and 80.8 % of the indirect identifiers were detected. The performance (P) increased to 91.4 % (P25), 96.7 % (P50), 99.5 % (P100), 99.6 % (P250), 99.7 % (P500) and 100 % (P1000) for direct identifiers and to 93.2 % (P25), 97.9 % (P50), 97.2 % (P100), 98.9 % (P250), 99.0 % (P500) and 99.3 % (P1000) for indirect identifiers. Without training, 5.3 % of medical terms were falsely flagged as critical data. The performance increased, after training, to 4.0 % (P25), 3.6 % (P50), 4.0 % (P100), 3.7 % (P250), 4.3 % (P500), and 3.1 % (P1000). Roughly 0.1 % of filler words were falsely flagged. Conclusion  Training of the developed de-identification tool continuously improved its performance. 
Training with roughly 100 edited reports enables reliable detection and labeling of sensitive data in different types of medical reports. Key Points:   · Collaborations between different institutions require de-identification of patients' data. · Software-based de-identification of content-sensitive reports grows in importance as a result of 'Big data'. · A de-identification software was developed and tested natively and after training. · The proposed de-identification software worked quite reliably, following training with roughly 100 edited reports. · A final check of the texts by an authorized person remains necessary. Citation Format · Seuss H, Dankerl P, Ihle M et al. Semi-automated De-identification of German Content Sensitive Reports for Big Data Analytics. Fortschr Röntgenstr 2017; 189: 661 - 671. © Georg Thieme Verlag KG Stuttgart · New York.
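    The core flag-and-redact idea behind such a tool can be illustrated with a purely rule-based sketch. Unlike the trainable, semi-automated system described in the abstract, the patterns below are fixed, hypothetical examples for German-style report text (names, dates, patient IDs), meant only to show the mechanism.

    ```python
    import re

    # Illustrative patterns for direct/indirect identifiers in German reports.
    # A real system would learn or curate these per report type.
    PATTERNS = {
        "date": re.compile(r"\b\d{1,2}\.\d{1,2}\.\d{4}\b"),      # e.g. 03.07.2015
        "patient_id": re.compile(r"\bID[- ]?\d{6,}\b"),          # e.g. ID-1234567
        "name": re.compile(r"\b(?:Herr|Frau)\s+[A-ZÄÖÜ][a-zäöüß]+\b"),
    }

    def deidentify(text, token="[REDACTED]"):
        """Replace matches of each identifier pattern with a placeholder
        and return the redacted text plus the flagged (label, match) pairs
        for manual review."""
        flagged = []
        for label, pat in PATTERNS.items():
            for m in pat.finditer(text):
                flagged.append((label, m.group()))
            text = pat.sub(token, text)
        return text, flagged
    ```

    Returning the flagged spans alongside the redacted text mirrors the semi-automated workflow: an authorized reviewer can inspect exactly what was removed, which the abstract's conclusion notes remains necessary.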

  9. Detection of nerve gases using surface-enhanced Raman scattering substrates with high droplet adhesion

    NASA Astrophysics Data System (ADS)

    Hakonen, Aron; Rindzevicius, Tomas; Schmidt, Michael Stenbæk; Andersson, Per Ola; Juhlin, Lars; Svedendahl, Mikael; Boisen, Anja; Käll, Mikael

    2016-01-01

    Threats from chemical warfare agents, commonly known as nerve gases, constitute a serious security issue of increasing global concern because of surging terrorist activity worldwide. However, nerve gases are difficult to detect using current analytical tools and outside dedicated laboratories. Here we demonstrate that surface-enhanced Raman scattering (SERS) can be used for sensitive detection of femtomol quantities of two nerve gases, VX and Tabun, using a handheld Raman device and SERS substrates consisting of flexible gold-covered Si nanopillars. The substrate surface exhibits high droplet adhesion and nanopillar clustering due to elasto-capillary forces, resulting in enrichment of target molecules in plasmonic hot-spots with high Raman enhancement. The results may pave the way for strategic life-saving SERS detection of chemical warfare agents in the field. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06524k

  10. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process, from the collection of samples all the way to their analysis.

  11. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
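
The core idea, computing a joint result without any party revealing its underlying data, can be illustrated with additive secret sharing, a standard building block of secure multi-party computation. This is a toy sketch of the principle only, not the production protocol used in the pilots:

```python
import random

# Additive secret sharing: three parties learn the sum of their private
# inputs, but no single share reveals anything about any input.
P = 2**61 - 1  # public prime modulus

def share(value, n_parties):
    """Split `value` into n additive shares mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

inputs = [120, 85, 97]                     # each party's private value
all_shares = [share(v, 3) for v in inputs]
# Each party locally sums the shares it received, one from every owner.
partial = [sum(col) % P for col in zip(*all_shares)]
print(reconstruct(partial))  # → 302
```

Real MPC frameworks add multiplication protocols, malicious-security checks, and communication layers on top of this primitive.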

  12. Headspace-SPME-GC/MS as a simple cleanup tool for sensitive 2,6-diisopropylphenol analysis from lipid emulsions and adaptable to other matrices.

    PubMed

    Pickl, Karin E; Adamek, Viktor; Gorges, Roland; Sinner, Frank M

    2011-07-15

    Due to increased regulatory requirements, the interaction of active pharmaceutical ingredients with various surfaces and solutions during production and storage is gaining interest in the pharmaceutical research field, in particular with respect to the development of new formulations and new packaging materials and the evaluation of cleaning processes. Experimental adsorption/absorption studies as well as the study of cleaning processes require sophisticated analytical methods with high sensitivity for the drug of interest. In the case of 2,6-diisopropylphenol - a small lipophilic drug which is typically formulated as a lipid emulsion for intravenous injection - a highly sensitive method in the μg/l concentration range suitable for application to a variety of different sample matrices, including lipid emulsions, is needed. We hereby present a headspace solid-phase microextraction (HS-SPME) approach as a simple cleanup procedure for sensitive 2,6-diisopropylphenol quantification from diverse matrices, choosing a lipid emulsion as the most challenging matrix with regard to complexity. By combining the simple and straightforward HS-SPME sample pretreatment with an optimized GC-MS quantification method, a robust and sensitive method for 2,6-diisopropylphenol was developed. This method shows excellent sensitivity in the low μg/l concentration range (5-200 μg/l), good accuracy (94.8-98.8%) and precision (intra-day precision 0.1-9.2%, inter-day precision 2.0-7.7%). The method can be easily adapted to other, less complex, matrices such as water or swab extracts. Hence, the presented method holds the potential to serve as a single, simple analytical procedure for 2,6-diisopropylphenol analysis in the various types of samples required in, e.g., adsorption/absorption studies, which typically deal with a variety of different surfaces (steel, plastic, glass, etc.) and solutions/matrices, including lipid emulsions. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Taxometric and Factor Analytic Models of Anxiety Sensitivity among Youth: Exploring the Latent Structure of Anxiety Psychopathology Vulnerability

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Stewart, Sherry; Comeau, Nancy

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), a well-established affect-sensitivity individual difference factor, among youth by employing taxometric and factor analytic approaches in an integrative manner. Taxometric analyses indicated that AS, as indexed by the Child Anxiety Sensitivity…

  14. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring quality.

  15. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis is a regime that particularly needs such analytical tools; it is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.

  16. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods: isarithm, variogram, and triangular prism, along with the spatial autocorrelation measurement methods Moran's I and Geary's C, all of which have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Like the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
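
As an illustration of one of the measures named above, Moran's I can be computed from the standard textbook formula. This is a minimal sketch on a toy grid with rook adjacency, not the ICAMS implementation:

```python
# Moran's I spatial autocorrelation: I = (n / W) * sum_ij w_ij (x_i - m)(x_j - m)
#                                        / sum_i (x_i - m)^2
def morans_i(grid, weight):
    """grid: 2D list of values; weight(i1,j1,i2,j2) -> spatial weight."""
    cells = [(i, j) for i in range(len(grid)) for j in range(len(grid[0]))]
    n = len(cells)
    mean = sum(grid[i][j] for i, j in cells) / n
    num = w_sum = 0.0
    for (i1, j1) in cells:
        for (i2, j2) in cells:
            w = weight(i1, j1, i2, j2)
            num += w * (grid[i1][j1] - mean) * (grid[i2][j2] - mean)
            w_sum += w
    denom = sum((grid[i][j] - mean) ** 2 for i, j in cells)
    return (n / w_sum) * (num / denom)

def rook(i1, j1, i2, j2):
    """Weight 1 for horizontally/vertically adjacent cells, else 0."""
    return 1.0 if abs(i1 - i2) + abs(j1 - j2) == 1 else 0.0

checker = [[0, 1], [1, 0]]  # perfectly dispersed -> strongly negative I
print(round(morans_i(checker, rook), 3))  # → -1.0
```

Values near +1 indicate clustered (smooth) surfaces, values near 0 randomness, and negative values dispersion, which is why such measures complement fractal dimension for characterizing landscape complexity.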

  17. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    PubMed

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow one to visualize the skin, its morphology, the drug carriers, drugs, their transport across the skin and possible interactions, as well as effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Nano-biosensors to detect beta-amyloid for Alzheimer's disease management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Tiwari, Sneham; Vashist, Arti; Nair, Madhavan

    2016-06-15

    Beta-amyloid (β-A) peptides are potential biomarkers for monitoring Alzheimer's disease (AD) for diagnostic purposes. An increased β-A level is neurotoxic and induces oxidative stress in the brain, resulting in neurodegeneration and causing dementia. As of now, no sensitive and inexpensive method is available for β-A detection under physiological and pathological conditions. Available methods such as neuroimaging, enzyme-linked immunosorbent assay (ELISA), and polymerase chain reaction (PCR) detect β-A, but they have not yet been extended to point-of-care (POC) settings due to sophisticated equipment, the need for high expertise, complicated operation, and the challenge of achieving low detection limits. Recently, a β-A antibody-based electrochemical immuno-sensing approach has been explored to detect β-A at pM levels within 30-40 min, compared to the 6-8 h of an ELISA test. The introduction of nano-enabled electrochemical sensing technology could enable rapid detection of β-A at POC and may facilitate fast personalized health care delivery. This review explores recent advancements in nano-enabled electrochemical β-A sensing technologies towards POC application in AD management. Such platforms can serve as analytical tools for AD management programs, providing the bioinformatics needed to optimize therapeutics for the diagnosis and management of neurodegenerative diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Sensitivity, reliability and the effects of diurnal variation on a test battery of field usable upper limb fatigue measures.

    PubMed

    Yung, Marcus; Wells, Richard P

    2017-07-01

    Fatigue has been linked to deficits in production quality and productivity and, if of long duration, to work-related musculoskeletal disorders. It may thus be a useful risk indicator and design and evaluation tool. However, there is limited information on the test-retest reliability, the sensitivity and the effects of diurnal fluctuation of field-usable fatigue measures. This study reports on an evaluation of 11 measurement tools and their 14 parameters. Eight measures were found to have test-retest ICC values greater than 0.8. Four measures were particularly responsive during an intermittent fatiguing condition. However, two responsive measures demonstrated rhythmic behaviour, with significant time effects from 08:00 to mid-afternoon and early evening. Action tremor, muscle mechanomyography and perceived fatigue were found to be most reliable and most responsive; but additional analytical considerations might be required when interpreting daylong responses of MMG and action tremor. Practitioner Summary: This paper presents findings from test-retest and daylong reliability and responsiveness evaluations of 11 fatigue measures. This paper suggests that action tremor, muscle mechanomyography and perceived fatigue were most reliable and most responsive. However, mechanomyography and action tremor may be susceptible to diurnal changes.

  20. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate the capability of pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary phase defects required signal processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. This post-processing approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases, including silica, cellulose, aluminum oxide, polyamide and octadecylsilane, coated with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used for the assessment of other planar separation tools, such as paper-based analytical devices or micro total analysis systems consisting of organic and inorganic layers. Copyright © 2014 Elsevier B.V. All rights reserved.
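
The k-means segmentation step can be illustrated in one dimension; the paper applies it in 3D to thermal-wave data, so the signal, k=2 setup, and initialization below are illustrative assumptions only:

```python
# Toy 1-D k-means: separate a mostly uniform thermal signal from a
# spike that would correspond to a layer defect.
def kmeans_1d(values, k, iters=50):
    # simple deterministic init: spread starting centers over the sorted data
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest center
            idx = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[idx].append(v)
        # move each center to its cluster mean (keep it if cluster is empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

signal = [1.0, 1.1, 0.9, 1.0, 5.2, 5.0, 5.1, 1.05]  # spike = defect region
print([round(c, 2) for c in kmeans_1d(signal, 2)])
```

The real pipeline would run this kind of clustering on wavelet-filtered, correlation-processed image voxels rather than a raw 1-D trace.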

  1. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors

    PubMed Central

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-01-01

    Molecularly imprinted polymers (MIPs) as artificial antibodies have received considerable scientific attention in the past years in the field of (bio)sensors since they have unique features that distinguish them from natural antibodies, such as robustness, multiple binding sites, low cost, facile preparation and high stability under extreme operating conditions (higher pH and temperature values, etc.). On the other hand, the Quartz Crystal Microbalance (QCM) is an analytical tool based on the measurement of small mass changes on the sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability and reproducibility. QCM devices are highly suitable for converting the recognition process achieved using MIP-based memories into a sensor signal. Therefore, combining a QCM with MIPs as synthetic receptors enhances sensitivity: the imprinting process creates multiplexed binding sites whose size, 3D shape and chemical functionality form a molecular memory of the sensor system for the target compound to be detected. This review aims to highlight and summarize the recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology. PMID:28245588
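
The mass-to-frequency relationship underlying QCM sensing is conventionally described by the Sauerbrey equation, Δf = −2·f0²·Δm / (A·√(ρq·μq)). A hedged numerical sketch using the standard quartz constants (the 5 MHz / 10 ng scenario is illustrative, not taken from the review):

```python
import math

# Sauerbrey relation: a rigid mass load lowers the resonance frequency
# of a quartz crystal in proportion to the added mass per unit area.
RHO_Q = 2.648      # quartz density, g/cm^3
MU_Q = 2.947e11    # quartz shear modulus, g/(cm*s^2)

def sauerbrey_shift(f0_hz, delta_m_g, area_cm2):
    """Frequency shift (Hz) for mass delta_m_g on an electrode of area_cm2."""
    return -2.0 * f0_hz**2 * delta_m_g / (area_cm2 * math.sqrt(RHO_Q * MU_Q))

# 10 ng adsorbed on a 1 cm^2 electrode of a 5 MHz crystal:
print(round(sauerbrey_shift(5e6, 10e-9, 1.0), 2))  # → -0.57 Hz
```

Because the shift scales with f0², higher-frequency crystals are more mass-sensitive, which is one reason QCMs can transduce the small mass changes from MIP binding events.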

  2. Gas-Permeable Membrane-Based Conductivity Probe Capable of In Situ Real-Time Monitoring of Ammonia in Aquatic Environments.

    PubMed

    Li, Tianling; Panther, Jared; Qiu, Yuan; Liu, Chang; Huang, Jianyin; Wu, Yonghong; Wong, Po Keung; An, Taicheng; Zhang, Shanqing; Zhao, Huijun

    2017-11-21

    Aquatic ammonia has toxic effects on aquatic life. This work reports a gas-permeable membrane-based conductivity probe (GPMCP) developed for real-time monitoring of ammonia in aquatic environments. The GPMCP innovatively combines a gas-permeable membrane with a boric acid receiving phase to selectively extract ammonia from samples and form ammonium at the inner membrane interface. The rate of the receiving phase conductivity increase is directly proportional to the instantaneous ammonia concentration in the sample, which can be rapidly and sensitively determined by the embedded conductivity detector. A precalibration strategy was developed to eliminate the need for an ongoing calibration. The analytical principle and GPMCP performance were systematically validated. The laboratory results showed that ammonia concentrations ranging from 2 to 50 000 μg L-1 can be detected. The field deployment results demonstrated the GPMCP's ability to obtain high-resolution continuous ammonia concentration profiles and the absolute average ammonia concentration over a prolonged deployment period. By inputting the temperature and pH data, the ammonium concentration can be simultaneously derived from the corresponding ammonia concentration. The GPMCP embeds a sophisticated analytical principle with the inherent advantages of high selectivity, sensitivity, and accuracy, and it can be used as an effective tool for long-term, large-scale, aquatic-environment assessments.
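
The precalibration idea, a conductivity-increase rate directly proportional to ammonia concentration, reduces to fitting a single proportionality constant. The standards and rates below are synthetic assumptions for illustration, not the authors' data:

```python
# One-parameter calibration for a proportional response y = k*x:
# fit the slope once, then convert any measured rate to a concentration.
def fit_slope_through_origin(x, y):
    """Least-squares slope for y = k*x (line constrained through origin)."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

conc = [10.0, 50.0, 100.0, 500.0]   # NH3 standards, ug/L (synthetic)
rate = [0.21, 1.02, 1.98, 10.05]    # conductivity rise, uS/cm per min (synthetic)
k = fit_slope_through_origin(conc, rate)

sample_rate = 4.0                   # rate measured in the field
print(round(sample_rate / k, 1))    # estimated concentration, ug/L
```

This is why a one-time precalibration can replace ongoing recalibration: as long as the membrane response stays linear, only the slope k is needed in the field.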

  3. A sensitive electrochemical immunosensor based on poly(2-aminobenzylamine) film modified screen-printed carbon electrode for label-free detection of human immunoglobulin G.

    PubMed

    Putnin, Thitirat; Jumpathong, Watthanachai; Laocharoensuk, Rawiwan; Jakmunee, Jaroon; Ounnunkad, Kontad

    2018-08-01

    This work focuses on fabricating a poly(2-aminobenzylamine)-modified screen-printed carbon electrode as an electrochemical immunosensor for the label-free detection of human immunoglobulin G. To selectively detect immunoglobulin G, the anti-immunoglobulin G antibody with high affinity to immunoglobulin G was covalently linked to the amine group of the poly(2-aminobenzylamine) film-deposited screen-printed carbon electrode. The selectivity for immunoglobulin G was subsequently assured by being challenged with redox-active interferences, and adventitious adsorption did not significantly interfere with the analyte signal. To obviate the use of a costly secondary antibody, the [Fe(CN)6]4-/3- redox probe was instead applied to measure the amount of human immunoglobulin G through the immunocomplex formation, which is quantitatively related to the level of the differential pulse voltammetric current. The resulting immunosensor exhibited good sensitivity, with a detection limit of 0.15 ng mL-1, a limit of quantitation of 0.50 ng mL-1 and a linear range from 1.0 to 50 ng mL-1. Given those striking analytical performances and the affordability arising from using a cheap screen-printed carbon electrode with label-free detection, the immunosensor serves as a promising model for the next-step development of a diagnostic tool.

  4. Label-Free Bioanalyte Detection from Nanometer to Micrometer Dimensions-Molecular Imprinting and QCMs †.

    PubMed

    Mujahid, Adnan; Mustafa, Ghulam; Dickert, Franz L

    2018-06-01

    Modern diagnostic tools and immunoassay protocols call for direct analyte recognition based on the analyte's intrinsic behavior, without using any labeling indicator. This not only improves detection reliability, but also reduces the sample preparation time and complexity involved in the labeling step. Label-free biosensor devices are capable of monitoring analyte physiochemical properties such as binding sensitivity and selectivity, affinity constants and other dynamics of molecular recognition. The interface of a typical biosensor can range from natural antibodies to synthetic receptors, for example molecularly imprinted polymers (MIPs). The foremost advantages of using MIPs are their high binding selectivity, comparable to natural antibodies, straightforward synthesis in a short time, high thermal/chemical stability and compatibility with different transducers. Quartz crystal microbalance (QCM) resonators are leading acoustic devices that are extensively used for mass-sensitive measurements. Highlights of QCM devices include low-cost fabrication, room-temperature operation and, most importantly, the ability to monitor extremely small mass shifts, making them potentially universal transducers. The combination of MIPs with QCMs has turned out to be a prominent sensing system for label-free recognition of diverse bioanalytes. In this article, we survey the potential applications of MIP-QCM sensors, focusing exclusively on the label-free recognition of bacteria and virus species as representative micro- and nanosized bioanalytes.

  5. Evaluation of a gas chromatography method for azelaic acid determination in selected biological samples

    PubMed Central

    Garelnabi, Mahdi; Litvinov, Dmitry; Parthasarathy, Sampath

    2010-01-01

    Background: Azelaic acid (AzA) is the best-known dicarboxylic acid to have pharmaceutical benefits and clinical applications and also to be associated with the pathophysiology of some diseases. Materials and Methods: We extracted and methylesterified AzA and determined its concentration in human plasma obtained from healthy individuals and also in mice fed an AzA-containing diet for three months. Results: AzA was detected by gas chromatography (GC) and confirmed by liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS). Our results have shown that AzA can be determined efficiently in selected biological samples by the GC method, with a 1 nM limit of detection (LoD); the limit of quantification (LoQ) was established at 50 nM. Analytical sensitivity, as assayed in hexane, was 0.050 nM. The method demonstrated 8-10% CV batch repeatability across the sample types and 13-18.9% CV for the within-lab precision analysis. The method has shown that AzA can be efficiently recovered from various sample preparations, including liver tissue homogenate (95%) and human plasma (97%). Conclusions: Because of its simplicity and lower limit of quantification, the present method provides a useful tool for determining AzA in various biological sample preparations. PMID:22558586

  6. Evaluation of a gas chromatography method for azelaic acid determination in selected biological samples.

    PubMed

    Garelnabi, Mahdi; Litvinov, Dmitry; Parthasarathy, Sampath

    2010-09-01

    Azelaic acid (AzA) is the best-known dicarboxylic acid to have pharmaceutical benefits and clinical applications and also to be associated with the pathophysiology of some diseases. We extracted and methylesterified AzA and determined its concentration in human plasma obtained from healthy individuals and also in mice fed an AzA-containing diet for three months. AzA was detected by gas chromatography (GC) and confirmed by liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS). Our results have shown that AzA can be determined efficiently in selected biological samples by the GC method, with a 1 nM limit of detection (LoD); the limit of quantification (LoQ) was established at 50 nM. Analytical sensitivity, as assayed in hexane, was 0.050 nM. The method demonstrated 8-10% CV batch repeatability across the sample types and 13-18.9% CV for the within-lab precision analysis. The method has shown that AzA can be efficiently recovered from various sample preparations, including liver tissue homogenate (95%) and human plasma (97%). Because of its simplicity and lower limit of quantification, the present method provides a useful tool for determining AzA in various biological sample preparations.

  7. Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.

    PubMed

    Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline

    2017-01-01

    Amongst all omics sciences, there is no doubt that metabolomics has undergone the most important growth in the last decade. Advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present at low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, issues such as ion suppression may hinder the quantification/identification of metabolites with lower concentrations or of metabolite classes that do not ionise as well as others. The best choice is coupling separation techniques, such as gas or liquid chromatography, to mass spectrometry, in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially in untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.

  8. Accuracy of Brief Screening Tools for Identifying Postpartum Depression Among Adolescent Mothers

    PubMed Central

    Venkatesh, Kartik K.; Zlotnick, Caron; Triche, Elizabeth W.; Ware, Crystal

    2014-01-01

    OBJECTIVE: To evaluate the accuracy of the Edinburgh Postnatal Depression Scale (EPDS) and 3 subscales for identifying postpartum depression among primiparous adolescent mothers. METHODS: Mothers enrolled in a randomized controlled trial to prevent postpartum depression completed a psychiatric diagnostic interview and the 10-item EPDS at 6 weeks, 3 months, and 6 months postpartum. Three subscales of the EPDS were assessed as brief screening tools: 3-item anxiety subscale (EPDS-3), 7-item depressive symptoms subscale (EPDS-7), and 2-item subscale (EPDS-2) that resemble the Patient Health Questionnaire-2. Receiver operating characteristic curves and the areas under the curves for each tool were compared to assess accuracy. The sensitivities and specificities of each screening tool were calculated in comparison with diagnostic criteria for a major depressive disorder. Repeated-measures longitudinal analytical techniques were used. RESULTS: A total of 106 women contributed 289 postpartum visits; 18% of the women met criteria for incident postpartum depression by psychiatric diagnostic interview. When used as continuous measures, the full EPDS, EPDS-7, and EPDS-2 performed equally well (area under the curve >0.9). Optimal cutoff scores for a positive depression screen for the EPDS and EPDS-7 were lower (≥9 and ≥7, respectively) than currently recommended cutoff scores (≥10). At optimal cutoff scores, the EPDS and EPDS-7 both had sensitivities of 90% and specificities of >85%. CONCLUSIONS: The EPDS, EPDS-7, and EPDS-2 are highly accurate at identifying postpartum depression among adolescent mothers. In primary care pediatric settings, the EPDS and its shorter subscales have potential for use as effective depression screening tools. PMID:24344102
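
The cutoff evaluation described above reduces to computing sensitivity and specificity against the diagnostic reference at each candidate score. A sketch with synthetic scores and labels, not the study data:

```python
# Evaluate a screening cutoff against a diagnostic reference:
# sensitivity = TP/(TP+FN), specificity = TN/(TN+FP).
def screen_performance(scores, has_disorder, cutoff):
    tp = sum(s >= cutoff and d for s, d in zip(scores, has_disorder))
    fn = sum(s < cutoff and d for s, d in zip(scores, has_disorder))
    tn = sum(s < cutoff and not d for s, d in zip(scores, has_disorder))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, has_disorder))
    return tp / (tp + fn), tn / (tn + fp)

scores = [12, 9, 4, 11, 3, 8, 15, 2]                      # synthetic EPDS-like scores
truth = [True, True, False, False, False, True, True, False]  # diagnostic interview
sens, spec = screen_performance(scores, truth, cutoff=9)
print(sens, spec)  # → 0.75 0.75
```

Sweeping the cutoff and plotting sensitivity against 1 − specificity yields the ROC curve whose area under the curve the study uses to compare the full EPDS with its subscales.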

  9. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  10. TCSPC based approaches for multiparameter detection in living cells

    NASA Astrophysics Data System (ADS)

    Jahn, Karolina; Buschmann, Volker; Koberling, Felix; Hille, Carsten

    2014-03-01

    In living cells a manifold of processes take place simultaneously, which implies a precise regulation of intracellular ion homeostasis. In order to understand their spatio-temporal patterns comprehensively, the development of multiplexing concepts is essential. Due to the multidimensional characteristics of fluorescence dyes (absorption and emission spectra, decay time, anisotropy), highly sensitive and non-invasive fluorescence microscopy is a versatile tool for realising multiplexing concepts. A prerequisite is analyte-specific fluorescence dyes with low cross-sensitivity to other dyes and analytes, respectively. Here, two approaches for multiparameter detection in living cells are presented. Insect salivary glands are well-characterised secretory tissues which were used as model systems to evaluate multiplexing concepts. Salivary glands secrete a KCl-rich or NaCl-rich fluid upon stimulation, which is mainly regulated by intracellular Ca2+ as a second messenger. Thus, pairwise detection of intracellular Na+, Cl- and Ca2+ with the fluorescent dyes ANG2, MQAE and ACR was tested. The dyes were excited simultaneously (2-photon excitation) and their corresponding fluorescence decay times were recorded within two spectral ranges using time-correlated single-photon counting (TCSPC). A second approach presented here is based on a new TCSPC platform covering decay-time detection from picoseconds to milliseconds. Thereby, nanosecond-decaying cellular fluorescence and microsecond-decaying phosphorescence of ruthenium complexes, which is quenched by oxygen, were recorded simultaneously. In both cases, changes in luminescence decay times can be linked to changes in analyte concentrations. As a consequence of simultaneous excitation and detection, it is possible to gain deeper insight into spatio-temporal patterns in living tissues.

  11. Hyphenated and comprehensive liquid chromatography × gas chromatography-mass spectrometry for the identification of Mycobacterium tuberculosis.

    PubMed

    Mourão, Marta P B; Denekamp, Ilse; Kuijper, Sjoukje; Kolk, Arend H J; Janssen, Hans-Gerd

    2016-03-25

Tuberculosis is one of the world's major emerging public health problems, particularly in developing countries. Chromatography-based methods have been used to tackle this epidemic by focusing on biomarker detection. Unfortunately, interferences from lipids in the sputum matrix, particularly cholesterol, adversely affect the identification and detection of the marker compounds. The present contribution describes the serial combination of normal-phase liquid chromatography (NPLC) with thermally assisted hydrolysis and methylation followed by gas chromatography-mass spectrometry (THM-GC-MS) to overcome the difficulties of biomarker evaluation. The in-series combination consists of an LC analysis where fractions are collected and then transferred to the THM-GC-MS system. This was done either with comprehensive coupling, transferring all the fractions, or with hyphenated interfacing, i.e. off-line multi heart-cutting, transferring only selected fractions. Owing to the high sensitivity and selectivity of LC as a sample pre-treatment method, and to the high specificity of the MS as a detector, this analytical approach, NPLC × THM-GC-MS, is extremely sensitive. The results obtained indicate that this analytical set-up is able to detect down to 1 × 10³ mycobacteria/mL of Mycobacterium tuberculosis strain 124 spiked in blank sputum samples. It is a powerful analytical tool and also has great potential for full automation. If further studies demonstrate its usefulness when applied blindly to real sputum specimens, this technique could compete with current smear microscopy in the early diagnosis of tuberculosis. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for. With a human in the loop, visual analytics can also bring domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data.
In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
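
The facet-view interaction described above can be illustrated with a small sketch. Everything here (records, field names, functions) is a hypothetical toy, not T.Rex's actual API: each record is a set of categorical fields, and selecting a value in one facet re-filters the counts shown in the others.

```python
from collections import Counter

def facet_counts(records, field):
    """Count the distinct values of one categorical field (one 'facet')."""
    return Counter(r[field] for r in records)

def select(records, field, value):
    """Drill down: keep only the records matching a chosen facet value."""
    return [r for r in records if r[field] == value]

# Toy shipment records with two categorical facets.
shipments = [
    {"origin": "Country A", "material": "steel"},
    {"origin": "Country A", "material": "electronics"},
    {"origin": "Country B", "material": "steel"},
]
by_origin = facet_counts(shipments, "origin")         # Country A: 2, Country B: 1
steel_only = select(shipments, "material", "steel")   # drill into one facet value
by_origin_steel = facet_counts(steel_only, "origin")  # Country A: 1, Country B: 1
```

The point of the sketch is the interaction loop: every selection in one facet immediately recomputes the distributions in all other facets, which is what lets categorical relationships surface during exploration.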

  13. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...

  14. Commercial lateral flow devices for rapid detection of peanut (Arachis hypogaea) and hazelnut (Corylus avellana) cross-contamination in the industrial production of cookies.

    PubMed

    Röder, Martin; Vieths, Stefan; Holzhauser, Thomas

    2009-09-01

Lateral flow devices (LFDs) are qualitative immunochromatographic tests for the rapid and specific detection of target analytes. We investigated commercially available LFDs for their ability to detect potentially allergenic peanut and hazelnut in raw cookie dough and chocolate, two important food matrices in the industrial production of cookies. Three commercial LFDs each for the detection of hazelnut and of peanut were performed according to the manufacturers' instructions. All LFDs had comparably satisfactory specificity, which was investigated with a variety of characteristic foods and food ingredients used in the production of cookies. In concordance with hazelnut-specific enzyme-linked immunosorbent assays (ELISAs), walnut was the most cross-reactive food for the hazelnut-specific LFDs. The sensitivity was verified in raw cookie doughs and chocolates that were spiked with peanut or hazelnut between 1 and 25 mg/kg. Two hazelnut-specific LFDs detected hazelnut at a level of 3.5 mg/kg in both matrices, whereas the third LFD detected hazelnut at a level of 3.9 mg/kg in dough and 12.5 mg/kg in chocolate. Two peanut-specific LFDs detected peanut at a level of 1 mg/kg in both matrices. The third LFD detected peanut at a level of 14.2 mg/kg in chocolate and 4 mg/kg in dough. In conclusion, specific and sensitive LFDs were identified for both hazelnut and peanut, with a level of sensitivity comparable to commercial ELISAs for the investigated matrices. Such sensitive, specific, and rapid tests are useful analytical tools for allergen screening and sanitation in the industrial manufacture of foods.

  15. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  16. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
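
The reliability idea above (stability of cluster membership across scans with varied scaling parameters) reduces to a simple per-county score; the following sketch is illustrative, using made-up county identifiers rather than SaTScan's own inputs or outputs:

```python
def reliability_scores(runs, counties):
    """Fraction of scan runs in which each county falls inside a
    statistically significant cluster. runs is a list of sets, one set
    of clustered county ids per scan-parameter choice."""
    return {c: sum(c in run for run in runs) / len(runs) for c in counties}

# Four hypothetical scan runs with different scaling parameters.
runs = [{"FIPS_1", "FIPS_2"}, {"FIPS_1"}, {"FIPS_1", "FIPS_3"}, {"FIPS_1"}]
scores = reliability_scores(runs, ["FIPS_1", "FIPS_2", "FIPS_3", "FIPS_4"])
# FIPS_1 is flagged in all four runs (score 1.0); FIPS_4 in none (0.0).
```

Counties with scores near 1 belong to clusters that are stable across analysis scales, which is the property the reliability visualization is designed to surface.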

  17. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.

  18. Magneto-photonic crystal optical sensors with sensitive covers

    NASA Astrophysics Data System (ADS)

    Dissanayake, Neluka; Levy, Miguel; Chakravarty, A.; Heiden, P. A.; Chen, N.; Fratello, V. J.

    2011-08-01

We report on a magneto-photonic crystal on-chip optical sensor for specific analyte detection with polypyrrole and gold nanoparticles as modified photonic crystal waveguide cover layers. The reaction of the active sensor material with various analytes modifies the electronic structure of the sensor layer, causing changes in its refractive index and a strong transduction signal. Magneto-photonic-crystal-enhanced polarization rotation, which is sensitive to the nature of the cover layer, detects the index modification upon analyte adsorption. A high degree of selectivity and sensitivity is observed for aqueous ammonia and methanol with polypyrrole covers and for thiolated compounds with gold-nanoparticle covers.

  19. Fiber optic evanescent wave biosensor

    NASA Astrophysics Data System (ADS)

    Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1991-09-01

    The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).
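
The small penetration depth that confines excitation to the fiber surface follows from the standard evanescent-field expression for total internal reflection; the quick numeric check below uses illustrative values, not figures from the paper:

```python
import math

def penetration_depth(wavelength_nm, n_fiber, n_medium, theta_deg):
    """1/e penetration depth of the evanescent field at a totally
    internally reflecting interface:
        d = lambda / (4*pi * sqrt(n1^2 * sin^2(theta) - n2^2)),
    valid only above the critical angle."""
    s = (n_fiber * math.sin(math.radians(theta_deg))) ** 2 - n_medium ** 2
    if s <= 0:
        raise ValueError("angle below the critical angle; no evanescent decay")
    return wavelength_nm / (4 * math.pi * math.sqrt(s))

# Glass fiber (n1 = 1.5) in water (n2 = 1.33), 500 nm light at 75 degrees:
d = penetration_depth(500, 1.5, 1.33, 75)   # a few tens of nanometres
```

A depth on the order of tens of nanometres is what restricts luminescence generation to labels bound at, or very near, the fiber surface.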

  20. Aerothermodynamic shape optimization of hypersonic blunt bodies

    NASA Astrophysics Data System (ADS)

    Eyi, Sinan; Yumuşak, Mine

    2015-07-01

    The aim of this study is to develop a reliable and efficient design tool that can be used in hypersonic flows. The flow analysis is based on the axisymmetric Euler/Navier-Stokes and finite-rate chemical reaction equations. The equations are coupled simultaneously and solved implicitly using Newton's method. The Jacobian matrix is evaluated analytically. A gradient-based numerical optimization is used. The adjoint method is utilized for sensitivity calculations. The objective of the design is to generate a hypersonic blunt geometry that produces the minimum drag with low aerodynamic heating. Bezier curves are used for geometry parameterization. The performances of the design optimization method are demonstrated for different hypersonic flow conditions.
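
The Bezier parameterization mentioned above makes the control points the design variables; a minimal Bernstein-basis evaluation is sketched below (the control polygon is an illustrative blunt-nose-like profile, not the study's geometry):

```python
from math import comb

def bezier_point(ctrl, t):
    """Evaluate an n-th degree Bezier curve at parameter t in [0, 1]
    using the Bernstein basis B_{i,n}(t) = C(n,i) * t^i * (1-t)^(n-i)."""
    n = len(ctrl) - 1
    x = y = 0.0
    for i, (px, py) in enumerate(ctrl):
        b = comb(n, i) * t**i * (1 - t) ** (n - i)
        x += b * px
        y += b * py
    return x, y

# Hypothetical axisymmetric nose profile (axial, radial) control points;
# an optimizer would perturb these to reduce drag and heating.
ctrl = [(0.0, 0.0), (0.0, 0.4), (0.5, 0.6), (1.0, 0.6)]
nose = bezier_point(ctrl, 0.0)   # curve starts at the first control point
base = bezier_point(ctrl, 1.0)   # and ends at the last one
```

Because the curve interpolates its endpoints and varies smoothly with the interior control points, a handful of design variables suffices to describe smooth blunt-body shapes for the gradient-based optimizer.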

  1. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  2. Analytical modeling of intumescent coating thermal protection system in a JP-5 fuel fire environment

    NASA Technical Reports Server (NTRS)

    Clark, K. J.; Shimizu, A. B.; Suchsland, K. E.; Moyer, C. B.

    1974-01-01

    The thermochemical response of Coating 313 when exposed to a fuel fire environment was studied to provide a tool for predicting the reaction time. The existing Aerotherm Charring Material Thermal Response and Ablation (CMA) computer program was modified to treat swelling materials. The modified code is now designated Aerotherm Transient Response of Intumescing Materials (TRIM) code. In addition, thermophysical property data for Coating 313 were analyzed and reduced for use in the TRIM code. An input data sensitivity study was performed, and performance tests of Coating 313/steel substrate models were carried out. The end product is a reliable computational model, the TRIM code, which was thoroughly validated for Coating 313. The tasks reported include: generation of input data, development of swell model and implementation in TRIM code, sensitivity study, acquisition of experimental data, comparisons of predictions with data, and predictions with intermediate insulation.

  3. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
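
One of the standardization schemes alluded to above, spiking a stable-isotope-labeled internal standard, reduces to a simple peak-area ratio; this is a minimal sketch with made-up numbers, assuming equal response factors for the labeled and unlabeled peptide:

```python
def quantify_by_internal_standard(analyte_area, standard_area, standard_conc):
    """Scale the analyte/standard peak-area ratio by the known spiked
    concentration of the stable-isotope-labeled standard. Returns the
    analyte concentration in the same units as standard_conc."""
    return (analyte_area / standard_area) * standard_conc

# Example: the analyte transition integrates to 2.4e5 counts and the
# labeled standard (spiked at 50 fmol) to 1.2e5 counts -> 100 fmol.
conc = quantify_by_internal_standard(2.4e5, 1.2e5, 50.0)
```

Because analyte and labeled standard co-elute and fragment identically, the ratio cancels most run-to-run variation in ionization and instrument response, which is what gives isotope-dilution MRM its precision.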

  4. Using gas chromatography with ion mobility spectrometry to resolve explosive compounds in the presence of interferents.

    PubMed

    Cook, Greg W; LaPuma, Peter T; Hook, Gary L; Eckenrode, Brian A

    2010-11-01

Ion mobility spectrometry (IMS) is a valued field detection technology because of its speed and high sensitivity, but IMS cannot easily resolve analytes of interest within mixtures. Coupling gas chromatography (GC) to IMS adds a separation capability to resolve complex matrices. A GC-IONSCAN® operated in IMS and GC/IMS modes was evaluated with combinations of five explosives and four interferents. In 100 explosive/interferent combinations, IMS yielded 21 false positives while GC/IMS substantially reduced the occurrence of false positives to one. In addition, the results indicate that through redesign or modification of the preconcentrator there would be significant advantages to using GC/IMS, such as enhancement of the linear dynamic range (LDR) in some situations. By balancing sensitivity with LDR, GC/IMS could prove to be a very advantageous tool when addressing real-world complex mixture situations.

  5. uleSIMS characterization of silver reference surfaces

    NASA Astrophysics Data System (ADS)

    Palitsin, V. V.; Dowsett, M. G.; Mata, B. Guzmán de la; Oloff, I. W.; Gibbons, R.

    2006-07-01

Ultra low energy SIMS (uleSIMS) is a high-sensitivity analytical technique that is normally used for ultra-shallow profiling at a depth resolution of up to 1 nm. This work describes the use of uleSIMS as both a spectroscopic and depth-profiling tool for the characterization of the early stages of corrosion formed on reference surfaces of silver. These samples are being developed to help with the characterization of tarnished surfaces in a cultural heritage context, and uleSIMS enables the tarnishing to be studied from its very earliest stages due to its high sensitivity (ppm-ppb) and surface specificity. We show that uleSIMS can be used effectively to study the surface chemistry and aid the development of the reference surfaces themselves. In particular, handling contaminants, surface dust, and residues from polishing are relatively easy to identify, allowing them to be separated from the parts of the mass spectrum specific to the early stages of corrosion.

  6. Parahydrogen-enhanced zero-field nuclear magnetic resonance

    NASA Astrophysics Data System (ADS)

    Theis, T.; Ganssle, P.; Kervern, G.; Knappe, S.; Kitching, J.; Ledbetter, M. P.; Budker, D.; Pines, A.

    2011-07-01

    Nuclear magnetic resonance, conventionally detected in magnetic fields of several tesla, is a powerful analytical tool for the determination of molecular identity, structure and function. With the advent of prepolarization methods and detection schemes using atomic magnetometers or superconducting quantum interference devices, interest in NMR in fields comparable to the Earth's magnetic field and below (down to zero field) has been revived. Despite the use of superconducting quantum interference devices or atomic magnetometers, low-field NMR typically suffers from low sensitivity compared with conventional high-field NMR. Here we demonstrate direct detection of zero-field NMR signals generated through parahydrogen-induced polarization, enabling high-resolution NMR without the use of any magnets. The sensitivity is sufficient to observe spectra exhibiting 13C-1H scalar nuclear spin-spin couplings (known as J couplings) in compounds with 13C in natural abundance, without the need for signal averaging. The resulting spectra show distinct features that aid chemical fingerprinting.

  7. Highly sensitive electrochemical biosensor for bisphenol A detection based on a diazonium-functionalized boron-doped diamond electrode modified with a multi-walled carbon nanotube-tyrosinase hybrid film.

    PubMed

    Zehani, Nedjla; Fortgang, Philippe; Saddek Lachgar, Mohamed; Baraket, Abdoullatif; Arab, Madjid; Dzyadevych, Sergei V; Kherrat, Rochdi; Jaffrezic-Renault, Nicole

    2015-12-15

A highly sensitive electrochemical biosensor for the detection of bisphenol A (BPA) in water has been developed by immobilizing tyrosinase onto a diazonium-functionalized boron-doped diamond (BDD) electrode modified with multi-walled carbon nanotubes (MWCNTs). The fabricated biosensor exhibits excellent electroactivity towards o-quinone, the product of the tyrosinase-catalyzed oxidation of BPA. The developed BPA biosensor displays a large linear range from 0.01 nM to 100 nM, with a detection limit (LOD) of 10 pM. The feasibility of the proposed biosensor has been demonstrated on BPA-spiked river water samples. It could therefore be a promising and reliable analytical tool for on-site monitoring of BPA in waste water. Copyright © 2015 Elsevier B.V. All rights reserved.
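
A detection limit like the one quoted above is typically derived from a calibration line; the generic sketch below uses synthetic data (not the paper's measurements) and the common k·σ_blank/slope rule:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def limit_of_detection(blank_sd, slope, k=3.3):
    """LOD estimated as k times the blank standard deviation over the
    calibration slope (k = 3.3 is a common convention)."""
    return k * blank_sd / slope

# Synthetic calibration: sensor current (nA) vs BPA concentration (nM).
conc = [0.0, 1.0, 2.0, 4.0, 8.0]
signal = [0.1, 2.1, 4.1, 8.1, 16.1]        # slope 2 nA/nM, intercept 0.1 nA
slope, intercept = fit_line(conc, signal)
lod = limit_of_detection(blank_sd=0.02, slope=slope)
```

The steeper the calibration slope and the quieter the blank, the lower the achievable LOD, which is why electrode modifications that boost electroactivity translate directly into picomolar detection limits.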

  8. SAM Methods Query

    EPA Pesticide Factsheets

Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  9. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  10. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates that different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of each particular science group? What types of successful technology infusion have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic-center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  11. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  12. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  13. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

Petroleum and the organic matter from which petroleum is derived are composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main means of acquiring these geochemical data is analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for assessing likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing the network methodological approach for modelling the adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
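
The network approach to finding the most critical adversary path can be sketched as a shortest-path search; everything below (the toy facility graph, per-segment detection probabilities, and function names) is an illustrative assumption, not the authors' tool:

```python
import heapq
import math

def most_critical_path(edges, start, target):
    """Find the adversary path with the lowest cumulative detection
    probability by running Dijkstra on edge weights -log(1 - p_detect):
    minimizing the weight sum maximizes the non-detection product."""
    graph = {}
    for (u, v), p in edges.items():
        graph.setdefault(u, []).append((v, -math.log(1.0 - p)))
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            break
        if d > dist.get(u, math.inf):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    path = [target]                       # walk predecessors back to start
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    return path, 1.0 - math.exp(-dist[target])

# Toy facility: two routes to the vault with per-segment detection odds.
edges = {("outside", "fence"): 0.5, ("fence", "vault"): 0.5,
         ("outside", "gate"): 0.9, ("gate", "vault"): 0.1}
path, p_detect = most_critical_path(edges, "outside", "vault")
# The fence route is the most critical: only 0.75 cumulative detection
# probability, versus 0.91 via the well-monitored gate.
```

The returned detection probability along the weakest route is one simple performance measure of the kind the abstract describes; a fuller model would also weigh delay times against response-force arrival.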

  15. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data-organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  16. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  17. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  18. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  19. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery

  20. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…

  1. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  2. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    PubMed

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

    Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ∼1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to the LFA analytical performance through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  3. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  4. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to assess how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  5. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.
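The quasi-analytical approach the report credits can be illustrated on a scalar toy problem: a residual R(u, a) = 0 implicitly defines the state u as a function of a design variable a, and differentiating the residual gives du/da without re-solving the flowfield, which is then checked against a finite-difference estimate. The cubic residual below is a made-up stand-in for the transonic small perturbation equation, not the paper's formulation.

```python
# Toy illustration of the quasi-analytical sensitivity idea: a scalar "flow"
# residual R(u, a) = 0 defines the state u implicitly in a design variable a.
# Differentiating the residual,
#   dR/du * du/da + dR/da = 0   =>   du/da = -(dR/da) / (dR/du),
# which is the quasi-analytical derivative; a one-sided finite difference of
# the converged solution gives the comparison value.

def residual(u, a):
    return u**3 + u - a          # hypothetical governing equation

def solve(a, u0=1.0, tol=1e-12):
    u = u0
    for _ in range(50):          # Newton iteration on R(u, a) = 0
        r = residual(u, a)
        u -= r / (3 * u**2 + 1)  # dR/du = 3u^2 + 1
        if abs(r) < tol:
            break
    return u

def sens_quasi_analytical(a):
    u = solve(a)
    return -(-1.0) / (3 * u**2 + 1)   # dR/da = -1

def sens_finite_difference(a, h=1e-6):
    return (solve(a + h) - solve(a)) / h

a = 2.0
qa, fd = sens_quasi_analytical(a), sens_finite_difference(a)
```

The quasi-analytical derivative reuses the already-factored residual Jacobian, which is why the report finds it usually cheaper than finite differencing, where every design variable costs an extra flow solution.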

  6. A Bayesian network meta-analysis for binary outcome: how to do it.

    PubMed

    Greco, Teresa; Landoni, Giovanni; Biondi-Zoccai, Giuseppe; D'Ascenzo, Fabrizio; Zangrillo, Alberto

    2016-10-01

    This study presents an overview of conceptual and practical issues of a network meta-analysis (NMA), particularly focusing on its application to randomised controlled trials with a binary outcome of interest. We start from general considerations on NMA to specifically appraise how to collect study data, structure the analytical network and specify the requirements for different models and parameter interpretations, with the ultimate goal of providing physicians and clinician-investigators a practical tool to understand pros and cons of NMA. Specifically, we outline the key steps, from the literature search to sensitivity analysis, necessary to perform a valid NMA of binomial data, exploiting Markov Chain Monte Carlo approaches. We also apply this analytical approach to a case study on the beneficial effects of volatile agents compared to total intravenous anaesthetics for surgery to further clarify the statistical details of the models, diagnostics and computations. Finally, datasets and models for the freeware WinBUGS package are presented for the anaesthetic agent example. © The Author(s) 2013.
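The Markov Chain Monte Carlo machinery the tutorial exploits can be sketched for the simplest building block of an NMA: a single two-arm trial with a binary outcome. A full NMA sums many such likelihood contributions across the evidence network, which WinBUGS automates; the arm counts, priors, and proposal step sizes below are illustrative assumptions.

```python
import math, random

# Minimal Metropolis sketch of the Bayesian machinery behind one NMA node:
# a two-arm comparison with binomial outcomes, with normal priors on the
# baseline log-odds mu and the log odds ratio d.

random.seed(1)
r_c, n_c = 30, 100   # hypothetical control arm: events / patients
r_t, n_t = 18, 100   # hypothetical treatment arm

def log_post(mu, d):
    def loglik(r, n, logit):
        p = 1.0 / (1.0 + math.exp(-logit))
        return r * math.log(p) + (n - r) * math.log(1.0 - p)
    prior = -(mu**2 + d**2) / (2 * 10.0**2)    # vague N(0, 10^2) priors
    return loglik(r_c, n_c, mu) + loglik(r_t, n_t, mu + d) + prior

mu, d, samples = 0.0, 0.0, []
lp = log_post(mu, d)
for i in range(20000):
    mu_p = mu + random.gauss(0, 0.2)           # random-walk proposals
    d_p = d + random.gauss(0, 0.2)
    lp_p = log_post(mu_p, d_p)
    if math.log(random.random()) < lp_p - lp:  # Metropolis accept/reject
        mu, d, lp = mu_p, d_p, lp_p
    if i >= 5000:                              # discard burn-in
        samples.append(d)

or_post = math.exp(sum(samples) / len(samples))  # posterior mean odds ratio
```

In practice one would monitor convergence diagnostics and run multiple chains, exactly the steps the tutorial walks through for the volatile-agent case study.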

  7. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using microparticles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly are emerging. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537

  8. Fabrication of a sensing module using micromachined biosensors.

    PubMed

    Suzuki, H; Arakawa, H; Karube, I

    2001-12-01

    Micromachining is a powerful tool in constructing micro biosensors and micro systems which incorporate them. A sensing module for blood components was fabricated using the technology. The analytes include glucose, urea, uric acid, creatine, and creatinine. Transducers used to construct the corresponding sensors were a Severinghaus-type carbon dioxide electrode for the urea sensor and a Clark-type oxygen electrode for the other analytes. In these electrodes, detecting electrode patterns were formed on a glass substrate by photolithography and the micro container for the internal electrolyte solution was formed on a silicon substrate by anisotropic etching. A through-hole was formed in the sensitive area, where a silicone gas-permeable membrane was formed and an enzyme was immobilized. The sensors were characterized in terms of pH and temperature dependence and calibration curves along with detection limits. Furthermore, the sensors were incorporated in an acrylate flow cell. Simultaneous operation of these sensors was successfully conducted and distinct and stable responses were observed for respective sensors.

  9. MALDI mass spectrometry imaging, from its origins up to today: the state of the art.

    PubMed

    Francese, Simona; Dani, Francesca R; Traldi, Pietro; Mastrobuoni, Guido; Pieraccini, Giuseppe; Moneti, Gloriano

    2009-02-01

    Mass spectrometry (MS) has a number of features, namely sensitivity, high dynamic range, high resolution, and versatility, which make it a very powerful analytical tool for a wide spectrum of applications spanning all the life science fields. Among all the MS techniques, MALDI imaging mass spectrometry (MALDI MSI) is currently one of the most exciting, both for its rapid technological improvements and for its great potential in high-impact bioscience fields. Here, MALDI MSI general principles are described along with technical and instrumental details as well as application examples. Imaging MS instruments and imaging mass spectrometric techniques other than MALDI are presented along with examples of their use. As well as reporting MSI successes in several bioscience fields, an attempt is made to take stock of what has been achieved so far with this technology and to discuss the analytical and technological advances required for MSI to be applied as a routine technique in clinical diagnostics, clinical monitoring and in drug discovery.

  10. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to strongly depend on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material which can be modeled in a similar fashion or extended to other microelectromechanical systems. PMID:19865512
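The optimization loop described above can be sketched with a deliberately simplified figure of merit: minimize the minimum detectable force, i.e. noise divided by sensitivity, over a single design variable. Both model functions are toy stand-ins for the paper's dopant-profile-aware models, and a grid search stands in for the iterative optimizer.

```python
# Toy version of a piezoresistive-cantilever design optimization: pick one
# design variable (piezoresistor thickness fraction t of the cantilever) and
# minimize a minimum-detectable-force figure of merit, noise(t)/sensitivity(t).
# The models below are illustrative, not the paper's.

def sensitivity(t):
    # Toy model: sensitivity vanishes both as the resistor thins (t -> 0)
    # and as it spans the whole beam, cancelling the stress gradient (t -> 1).
    return t * (1.0 - t)

def noise(t):
    # Toy model: Johnson and 1/f noise rise as the resistor volume shrinks.
    return 1.0 / t**0.5

def min_detectable_force(t):
    return noise(t) / sensitivity(t)

# A simple grid search stands in for the iterative optimizer.
ts = [i / 1000.0 for i in range(1, 1000)]
t_opt = min(ts, key=min_detectable_force)
```

The interesting structural point survives the simplification: the optimum is an interior trade-off (here t_opt = 0.6, from d/dt[-1.5 ln t - ln(1-t)] = 0), not a boundary value, which is why constraints and bandwidth dominate the real design space.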

  11. An improved multiple flame photometric detector for gas chromatography.

    PubMed

    Clark, Adrian G; Thurbide, Kevin B

    2015-11-20

    An improved multiple flame photometric detector (mFPD) is introduced, based upon interconnecting fluidic channels within a planar stainless steel (SS) plate. Relative to the previous quartz tube mFPD prototype, the SS mFPD provides a 50% reduction in background emission levels, an orthogonal analytical flame, and easier, more sensitive operation. As a result, sulfur response in the SS mFPD spans 4 orders of magnitude, yields a minimum detectable limit near 9×10⁻¹² g S/s, and has a selectivity approaching 10⁴ over carbon. The device also exhibits exceptionally large resistance to hydrocarbon response quenching. Additionally, the SS mFPD uniquely allows analyte emission monitoring in the multiple worker flames for the first time. The findings suggest that this mode can potentially further improve upon the analytical flame response of sulfur (both linear HSO and quadratic S2) and also phosphorus. Of note, the latter is nearly 20-fold stronger in S/N in the collective worker-flame response and provides 6 orders of linearity with a detection limit of about 2.0×10⁻¹³ g P/s. Overall, the results indicate that this new SS design notably improves the analytical performance of the mFPD and can provide a versatile and beneficial monitoring tool for gas chromatography. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Pooling sheep faecal samples for the assessment of anthelmintic drug efficacy using McMaster and Mini-FLOTAC in gastrointestinal strongyle and Nematodirus infection.

    PubMed

    Kenyon, Fiona; Rinaldi, Laura; McBean, Dave; Pepe, Paola; Bosco, Antonio; Melville, Lynsey; Devin, Leigh; Mitchell, Gillian; Ianniello, Davide; Charlier, Johannes; Vercruysse, Jozef; Cringoli, Giuseppe; Levecke, Bruno

    2016-07-30

    In small ruminants, faecal egg counts (FECs) and reduction in FECs (FECR) are the most common methods for the assessment of intensity of gastrointestinal (GI) nematodes infections and anthelmintic drug efficacy, respectively. The main limitation of these methods is the time and cost to conduct FECs on a representative number of individual animals. A cost-saving alternative would be to examine pooled faecal samples, however little is known regarding whether pooling can give representative results. In the present study, we compared the FECR results obtained by both an individual and a pooled examination strategy across different pool sizes and analytical sensitivity of the FEC techniques. A survey was conducted on 5 sheep farms in Scotland, where anthelmintic resistance is known to be widespread. Lambs were treated with fenbendazole (4 groups), levamisole (3 groups), ivermectin (3 groups) or moxidectin (1 group). For each group, individual faecal samples were collected from 20 animals, at baseline (D0) and 14 days after (D14) anthelmintic administration. Faecal samples were analyzed as pools of 3-5, 6-10, and 14-20 individual samples. Both individual and pooled samples were screened for GI strongyle and Nematodirus eggs using two FEC techniques with three different levels of analytical sensitivity, including Mini-FLOTAC (analytical sensitivity of 10 eggs per gram of faeces (EPG)) and McMaster (analytical sensitivity of 15 or 50 EPG).For both Mini-FLOTAC and McMaster (analytical sensitivity of 15 EPG), there was a perfect agreement in classifying the efficacy of the anthelmintic as 'normal', 'doubtful' or 'reduced' regardless of pool size. When using the McMaster method (analytical sensitivity of 50 EPG) anthelmintic efficacy was often falsely classified as 'normal' or assessment was not possible due to zero FECs at D0, and this became more pronounced when the pool size increased. 
In conclusion, pooling ovine faecal samples holds promise as a cost-saving and efficient strategy for assessing GI nematode FECR. However, for the assessment of FECR one will need to consider the baseline FEC, the pool size, and the analytical sensitivity of the method. Copyright © 2016. Published by Elsevier B.V.
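The FECR arithmetic underlying the comparison can be sketched as follows. The 95%/90% 'normal'/'doubtful'/'reduced' cut-offs are assumed WAAVP-style thresholds, not necessarily the exact rules used in the study, and the pooled counts are invented.

```python
# Illustrative faecal egg count reduction (FECR) calculation on pooled counts:
# FECR = 100 * (1 - FEC_day14 / FEC_day0).

def fec(counts, analytical_sensitivity):
    """Mean eggs-per-gram from raw chamber counts at a given sensitivity (EPG)."""
    return analytical_sensitivity * sum(counts) / len(counts)

def fecr(fec_d0, fec_d14):
    if fec_d0 == 0:
        return None                      # zero baseline: assessment impossible
    return 100.0 * (1.0 - fec_d14 / fec_d0)

def classify(reduction):
    """Assumed WAAVP-style cut-offs, for illustration only."""
    if reduction is None:
        return "not assessable"
    if reduction >= 95.0:
        return "normal"
    if reduction >= 90.0:
        return "doubtful"
    return "reduced"

# Hypothetical pooled counts: Mini-FLOTAC (10 EPG) vs McMaster (50 EPG)
d0_mini, d14_mini = fec([42, 38, 40], 10), fec([3, 2, 1], 10)
d0_mcm, d14_mcm = fec([0, 0, 0], 50), fec([0, 0, 0], 50)

result_mini = classify(fecr(d0_mini, d14_mini))   # low EPG: baseline resolved
result_mcm = classify(fecr(d0_mcm, d14_mcm))      # coarse EPG: zero baseline
```

The second case reproduces the failure mode the study reports for the 50 EPG McMaster method: a coarse analytical sensitivity can round a low baseline count down to zero, making the reduction incalculable.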

  13. Surface Acoustic Wave Nebulisation Mass Spectrometry for the Fast and Highly Sensitive Characterisation of Synthetic Dyes in Textile Samples

    NASA Astrophysics Data System (ADS)

    Astefanei, Alina; van Bommel, Maarten; Corthals, Garry L.

    2017-10-01

    Surface acoustic wave nebulisation (SAWN) mass spectrometry (MS) is a method to generate gaseous ions compatible with direct MS of minute samples at femtomole sensitivity. To perform SAWN, acoustic waves are propagated through a LiNbO3 sampling chip, and are conducted to the liquid sample, which ultimately leads to the generation of a fine mist containing droplets of nanometre to micrometre diameter. Through fission and evaporation, the droplets undergo a phase change from liquid to gaseous analyte ions in a non-destructive manner. We have developed SAWN technology for the characterisation of organic colourants in textiles. It generates electrospray-ionisation-like ions in a non-destructive manner during ionisation, as can be observed by the unmodified chemical structure. The sample size is decreased by tenfold to 1000-fold when compared with currently used liquid chromatography-MS methods, with equal or better sensitivity. This work underscores SAWN-MS as an ideal tool for molecular analysis of art objects as it is non-destructive, is rapid, involves minimally invasive sampling and is more sensitive than current MS-based methods.

  14. Development of an acoustic wave based biosensor for vapor phase detection of small molecules

    NASA Astrophysics Data System (ADS)

    Stubbs, Desmond

    For centuries scientific ingenuity and innovation have been influenced by Mother Nature's perfect design. One of her more elusive designs is that of the sensory olfactory system, an array of highly sensitive receptors responsible for chemical vapor recognition. In the animal kingdom this ability is magnified among canines where ppt (parts per trillion) sensitivity values have been reported. Today, detection dogs are considered an essential part of the US drug and explosives detection schemes. However, growing concerns about their susceptibility to extraneous odors have inspired the development of highly sensitive analytical detection tools or biosensors known as "electronic noses". In general, biosensors are distinguished from chemical sensors in that they use an entity of biological origin (e.g. antibody, cell, enzyme) immobilized onto a surface as the chemically-sensitive film on the device. The colloquial view is that the term "biosensors" refers to devices which detect the presence of entities of biological origin, such as proteins or single-stranded DNA and that this detection must take place in a liquid. Our biosensor utilizes biomolecules, specifically IgG monoclonal antibodies, to achieve molecular recognition of relatively small molecules in the vapor phase.

  15. GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.

    PubMed

    Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M

    2008-04-01

    Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions to the type of glycans they can identify, and none of them have proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy to use graphical interface, a comprehensive and increasing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
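The core annotation step that GlycoWorkbench automates, matching theoretical fragment masses against an observed peak list within a mass tolerance, can be sketched as below; the fragment names and m/z values are hypothetical placeholders, not real glycan fragments.

```python
# Sketch of theoretical-vs-observed peak matching, the evaluation step at the
# heart of computer-assisted spectrum annotation. Names and masses are made up.

def match_peaks(theoretical, peaks, tol_ppm=50.0):
    """Return [(fragment_name, observed_mz)] for every peak within tolerance."""
    matches = []
    for name, mz_theo in theoretical.items():
        tol = mz_theo * tol_ppm / 1e6        # ppm tolerance in absolute m/z
        for mz_obs, _intensity in peaks:
            if abs(mz_obs - mz_theo) <= tol:
                matches.append((name, mz_obs))
    return matches

theoretical = {"B2": 528.192, "Y3": 730.270, "C2": 546.203}   # hypothetical
peaks = [(528.190, 1200.0), (610.115, 300.0), (730.285, 950.0)]

annotated = match_peaks(theoretical, peaks)
```

A real tool additionally generates the theoretical list itself, from a candidate structure and a catalogue of fragmentation types, which is where most of GlycoWorkbench's complexity lives.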

  16. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    PubMed Central

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  17. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    PubMed

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  18. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three-dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) addition of section design variables to the sensitivity equation in the form of multiple right-hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.

  19. Analytical Glycobiology at High Sensitivity: Current Approaches and Directions

    PubMed Central

    Novotny, Milos V.; Alley, William R.; Mann, Benjamin F.

    2013-01-01

    This review summarizes the analytical advances made during the last several years in the structural and quantitative determinations of glycoproteins in complex biological mixtures. The main analytical techniques used in the fields of glycomics and glycoproteomics involve different modes of mass spectrometry and their combinations with capillary separation methods such as microcolumn liquid chromatography and capillary electrophoresis. The needs for high-sensitivity measurements have been emphasized in the oligosaccharide profiling used in the field of biomarker discovery through MALDI mass spectrometry. High-sensitivity profiling of both glycans and glycopeptides from biological fluids and tissue extracts has been aided significantly through lectin preconcentration and the uses of affinity chromatography. PMID:22945852

  20. Fluorescence polarization immunoassays for rapid, accurate, and sensitive determination of mycotoxins

    USDA-ARS?s Scientific Manuscript database

    Analytical methods for the determination of mycotoxins in foods are commonly based on chromatographic techniques (GC, HPLC or LC-MS). Although these methods permit a sensitive and accurate determination of the analyte, they require skilled personnel and are time-consuming, expensive, and unsuitable ...

  1. PERFORMANCE OF CONVENTIONAL PCRs BASED ON PRIMERS DIRECTED TO NUCLEAR AND MITOCHONDRIAL GENES FOR THE DETECTION AND IDENTIFICATION OF Leishmania spp.

    PubMed Central

    LOPES, Estela Gallucci; GERALDO, Carlos Alberto; MARCILI, Arlei; SILVA, Ricardo Duarte; KEID, Lara Borges; OLIVEIRA, Trícia Maria Ferreira da Silva; SOARES, Rodrigo Martins

    2016-01-01

    In visceral leishmaniasis, detection of the agent is of paramount importance to identify reservoirs of infection. Here, we evaluated the diagnostic attributes of PCRs based on primers directed to cytochrome B (cytB), cytochrome oxidase subunit II (coxII), cytochrome C (cytC), and the minicircle kDNA. Although the PCRs directed to cytB, coxII, and cytC were able to detect different species of Leishmania, and the nucleotide sequences of their amplicons allowed the unequivocal differentiation of species, the analytical and diagnostic sensitivity of these PCRs was much lower than that of the kDNA-PCR. Among the 73 seropositive animals, the asymptomatic dogs had spleen and bone marrow samples collected and tested; only two animals were positive by the PCRs based on cytB, coxII, and cytC, whereas 18 were positive by the kDNA-PCR. Considering the kDNA-PCR results, six dogs had positive spleen and bone marrow samples, eight dogs had positive bone marrow but negative spleen samples, and in four dogs the reverse occurred. We conclude that PCRs based on cytB, coxII, and cytC can be useful tools to identify Leishmania species when used in combination with automated sequencing. The discordance between the kDNA-PCR results in bone marrow and spleen samples may indicate that conventional PCR lacks sensitivity for the detection of infected dogs. Thus, primers based on the kDNA should be preferred for the screening of infected dogs. PMID:27253743

  2. [Detection of rubella virus RNA in clinical material by real time polymerase chain reaction method].

    PubMed

    Domonova, É A; Shipulina, O Iu; Kuevda, D A; Larichev, V F; Safonova, A P; Burchik, M A; Butenko, A M; Shipulin, G A

    2012-01-01

    The aim was to develop a reagent kit for the detection of rubella virus RNA in clinical material by real-time PCR (PCR-RT). During development, analytical specificity and sensitivity were determined using DNA and RNA from 33 different microorganisms, including 4 rubella strains. The analytical sensitivity of the virological and molecular-biological methods was compared using the rubella virus strains Wistar RA 27/3, M-33, "Orlov", and Judith. The diagnostic informativity of rubella virus RNA isolation from various clinical materials by the PCR-RT method was evaluated against the determination of virus-specific serum antibodies by enzyme immunoassay. A reagent kit for the detection of rubella virus RNA in clinical material by PCR-RT was developed. Analytical specificity was 100%; analytical sensitivity was 400 virus RNA copies per ml. The analytical sensitivity of the developed technique exceeds that of the Vero E6 cell culture infection method by 1 lg and 3 lg for the rubella virus strains Wistar RA 27/3 and "Orlov", respectively, and is equivalent for the M-33 and Judith strains. Diagnostic specificity was 100%. Diagnostic sensitivity for samples obtained within 5 days of rash onset was 20.9% for peripheral blood sera, 92.5% for saliva, 70.1% for nasopharyngeal swabs, and 97% for saliva and nasopharyngeal swabs combined. Positive and negative predictive values were shown to depend on the type of clinical material tested. Application of the reagent kit will make rubella diagnostics more effective at early stages of the infectious process and support timely, high-quality differential diagnosis of exanthematous diseases and anti-epidemic measures.

  3. Fluorescence-based assay as a new screening tool for toxic chemicals

    PubMed Central

    Moczko, Ewa; Mirkes, Evgeny M.; Cáceres, César; Gorban, Alexander N.; Piletsky, Sergey

    2016-01-01

    Our study involves the development of a fluorescent cell-based diagnostic assay as a new approach to high-throughput screening. This highly sensitive optical assay operates similarly to e-noses and e-tongues, which combine semi-specific sensors and multivariate data analysis for monitoring biochemical processes. The optical assay consists of a mixture of environment-sensitive fluorescent dyes and human skin cells that generate fluorescence spectral patterns distinctive for particular physico-chemical and physiological conditions. Using chemometric techniques, the optical signal is processed to provide qualitative information about the analytical characteristics of the samples. This integrated approach has been successfully applied (with a sensitivity of 93% and a specificity of 97%) in assessing whether particular chemical agents are irritating to human skin. It has several advantages over traditional biochemical or biological assays and can change the way high-throughput screening and the understanding of cell activity are approached. It can also provide a reliable and reproducible method for assessing the risk of exposing people to harmful substances, identifying active compounds in toxicity screening, and assessing the safety of drugs, cosmetics, or their specific ingredients. PMID:27653274

  4. Fluorescence-based assay as a new screening tool for toxic chemicals.

    PubMed

    Moczko, Ewa; Mirkes, Evgeny M; Cáceres, César; Gorban, Alexander N; Piletsky, Sergey

    2016-09-22

    Our study involves the development of a fluorescent cell-based diagnostic assay as a new approach to high-throughput screening. This highly sensitive optical assay operates similarly to e-noses and e-tongues, which combine semi-specific sensors and multivariate data analysis for monitoring biochemical processes. The optical assay consists of a mixture of environment-sensitive fluorescent dyes and human skin cells that generate fluorescence spectral patterns distinctive for particular physico-chemical and physiological conditions. Using chemometric techniques, the optical signal is processed to provide qualitative information about the analytical characteristics of the samples. This integrated approach has been successfully applied (with a sensitivity of 93% and a specificity of 97%) in assessing whether particular chemical agents are irritating to human skin. It has several advantages over traditional biochemical or biological assays and can change the way high-throughput screening and the understanding of cell activity are approached. It can also provide a reliable and reproducible method for assessing the risk of exposing people to harmful substances, identifying active compounds in toxicity screening, and assessing the safety of drugs, cosmetics, or their specific ingredients.

  5. Fluorescence-based assay as a new screening tool for toxic chemicals

    NASA Astrophysics Data System (ADS)

    Moczko, Ewa; Mirkes, Evgeny M.; Cáceres, César; Gorban, Alexander N.; Piletsky, Sergey

    2016-09-01

    Our study involves the development of a fluorescent cell-based diagnostic assay as a new approach to high-throughput screening. This highly sensitive optical assay operates similarly to e-noses and e-tongues, which combine semi-specific sensors and multivariate data analysis for monitoring biochemical processes. The optical assay consists of a mixture of environment-sensitive fluorescent dyes and human skin cells that generate fluorescence spectral patterns distinctive for particular physico-chemical and physiological conditions. Using chemometric techniques, the optical signal is processed to provide qualitative information about the analytical characteristics of the samples. This integrated approach has been successfully applied (with a sensitivity of 93% and a specificity of 97%) in assessing whether particular chemical agents are irritating to human skin. It has several advantages over traditional biochemical or biological assays and can change the way high-throughput screening and the understanding of cell activity are approached. It can also provide a reliable and reproducible method for assessing the risk of exposing people to harmful substances, identifying active compounds in toxicity screening, and assessing the safety of drugs, cosmetics, or their specific ingredients.

  6. Development, fabrication, and modeling of highly sensitive conjugated polymer based piezoresistive sensors in electronic skin applications

    NASA Astrophysics Data System (ADS)

    Khalili, Nazanin; Naguib, Hani E.; Kwon, Roy H.

    2016-04-01

    Human intervention can be replaced through the development of tools that utilize sensing devices, with a wide range of applications including humanoid robots and remote or minimally invasive surgery. Similar to the five human senses, sensors interface with their surroundings to stimulate a suitable response or action. The sense of touch, which arises in human skin, is among the most challenging senses to emulate due to its ultra-high sensitivity. This has brought forth novel challenges in the field of biomimetic robotics. In this work, using a multiphase reaction, a polypyrrole (PPy) based hydrogel is developed as a resistive-type pressure sensor with an intrinsically elastic microstructure built from three-dimensional hollow spheres. Furthermore, a semi-analytical constriction resistance model is developed that accounts for the real contact area between the PPy hydrogel sensor and the electrode, along with the dependence of the contact resistance change on the applied load. The model is then solved using a Monte Carlo technique and the sensitivity of the sensor is obtained. The experimental results showed the good tracking ability of the proposed model.
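
    The Monte Carlo step can be illustrated with a toy version of a constriction-resistance calculation. The sketch below uses Holm's classical formula R = ρ/(2a) for a single circular contact spot and averages over randomly sampled spot radii; the resistivity and radius range are invented, not taken from the paper:

```python
# Hypothetical Monte Carlo estimate of the expected constriction
# resistance when the contact spot radius varies with surface roughness.
import random

rho = 1.0e-3      # resistivity, ohm*m (assumed value)
random.seed(0)    # reproducible sampling

def sample_resistance():
    # contact spot radius drawn uniformly from an assumed range, in metres
    a = random.uniform(1e-6, 5e-6)
    return rho / (2.0 * a)     # Holm's constriction resistance

n = 100_000
mean_R = sum(sample_resistance() for _ in range(n)) / n
print(mean_R)   # expected resistance, ohms
```

    The paper's model additionally couples the contact area to the applied load; that dependence would replace the fixed uniform distribution here.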

  7. Model-based POD study of manual ultrasound inspection and sensitivity analysis using metamodel

    NASA Astrophysics Data System (ADS)

    Ribay, Guillemette; Artusi, Xavier; Jenson, Frédéric; Reece, Christopher; Lhuillier, Pierre-Emile

    2016-02-01

    The reliability of NDE can be quantified using the Probability of Detection (POD) approach. Previous studies have shown the potential of the model-assisted POD (MAPOD) approach to replace expensive experimental determination of POD curves. In this paper, we use the CIVA software to determine POD curves for a manual ultrasonic inspection of a heavy component for which a full experimental POD campaign was not available. The influential parameters were determined by expert analysis. The semi-analytical models used in CIVA for wave propagation and beam-defect interaction have been validated over the range of variation of the influential parameters by comparison with finite element modelling (Athena). The POD curves are computed for both "hit/miss" and "â versus a" analyses. The validity of the Berens hypotheses is evaluated with statistical tools. A sensitivity study is performed to measure the relative influence of parameters on the variance of the defect response amplitude, using Sobol sensitivity indices. A metamodel is also built to reduce computing cost and enhance the precision of the estimated indices.
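
    A Berens-style "â versus a" POD computation can be sketched in a few lines: fit log signal amplitude against log defect size, estimate the residual scatter, and evaluate POD as the probability that the signal exceeds a decision threshold. The data points and threshold below are synthetic, for illustration only:

```python
# Berens "â versus a" POD sketch: ln(ahat) = b0 + b1*ln(a) + Gaussian noise.
import math

a    = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # defect sizes, mm (synthetic)
ahat = [0.6, 1.1, 1.7, 2.1, 2.4, 3.2]   # response amplitudes (synthetic)
x = [math.log(v) for v in a]
y = [math.log(v) for v in ahat]

n = len(x)
xb, yb = sum(x) / n, sum(y) / n
b1 = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y)) \
     / sum((xi - xb) ** 2 for xi in x)
b0 = yb - b1 * xb
# residual standard deviation (n-2 degrees of freedom)
sigma = math.sqrt(sum((yi - (b0 + b1 * xi)) ** 2
                      for xi, yi in zip(x, y)) / (n - 2))

def pod(size, threshold=1.0):
    """POD(a) = Phi((b0 + b1*ln a - ln a_th) / sigma)."""
    z = (b0 + b1 * math.log(size) - math.log(threshold)) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(pod(2.0))   # probability of detecting a 2.0 mm defect
```

    Checking the Berens hypotheses, as the abstract describes, means verifying that this linear fit and the normality and homoscedasticity of the residuals actually hold for the simulated responses.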

  8. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Disc (TED) technology is presented. TED offers a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is demonstrated, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.

  9. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectroscopy was used as an analytical tool for quantifying lignin concentrations in aqueous media. A significant correlation was found between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making it a time-saving analytical method that could serve as a Process Analytical Tool (PAT) in biorefineries using steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
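
    The quantification step described above is a linear calibration (Beer-Lambert behaviour): fit absorbance against known lignin concentrations, then invert the line for unknowns. A minimal sketch with invented calibration points:

```python
# Linear absorbance-vs-concentration calibration and its inversion.
conc = [10.0, 20.0, 40.0, 80.0]    # lignin standards, mg/L (synthetic)
absb = [0.12, 0.25, 0.49, 0.99]    # absorbance at one wavelength (synthetic)

n = len(conc)
cb, ab = sum(conc) / n, sum(absb) / n
slope = sum((c - cb) * (a - ab) for c, a in zip(conc, absb)) \
        / sum((c - cb) ** 2 for c in conc)
intercept = ab - slope * cb

def concentration(absorbance):
    """Invert the calibration line to quantify an unknown sample."""
    return (absorbance - intercept) / slope

print(concentration(0.50))   # ~40 mg/L for this synthetic line
```

    In the paper, separate calibrations at each of the listed wavelengths (212-287 nm) would be established per biomass type.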

  10. The interplay between pH sensitivity and label-free protein detection in immunologically modified nano-scaled field-effect transistor.

    PubMed

    Shalev, Gil; Rosenwaks, Yossi; Levy, Ilan

    2012-01-15

    We present experimental results establishing a correlation between the pH sensitivity of immunologically modified nano-scaled field-effect transistors (NS-ImmunoFETs) and their capacity for label-free detection. The NS-ImmunoFETs are fabricated from silicon-on-insulator (SOI) wafers and are fully depleted, with a thickness of ~20 nm. The data show that higher sensitivity to pH entails enhanced sensitivity in analyte detection. This suggests that the picture of analyte detection as a pure electrostatic perturbation induced by the antibody-analyte interaction is oversimplified. Existing models of the field-effect sensing mechanism assume that the analyte molecules do not interact directly with the surface but rather sit 'deep' in the solution, away from the dielectric surface. Recent studies provide contradicting evidence, demonstrating that antibodies lie flat on the surface. These observations led us to propose that the proteins covering the gate area interact intimately with active sites on the surface, forming a network of interacting sites. Since sensitivity to pH is directly correlated with the number of amphoteric sites, we observe a direct correlation between sensitivity to pH and analyte detection. The highest and lowest threshold voltage shifts for label-free, specific detection of 6.5 nM IgG were 40 mV and 2.3 mV for NS-ImmunoFETs with pH sensitivities of 35 mV/decade and 15 mV/decade, respectively. Finally, physical modeling of the NS-ImmunoFET is presented and the charge of a single IgG protein at pH 6 is calculated. The obtained value is consistent with the charge of IgG protein cited in the literature. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison with results obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, integrating aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to the design of the wing of a high-speed aircraft. The results show significant improvements in aerodynamic and structural performance relative to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.

  12. Tracking pyrethroid toxicity in surface water samples: Exposure dynamics and toxicity identification tools for laboratory tests with Hyalella azteca (Amphipoda).

    PubMed

    Deanovic, Linda A; Stillway, Marie; Hammock, Bruce G; Fong, Stephanie; Werner, Inge

    2018-02-01

    Pyrethroid insecticides are commonly used in pest control and are present at toxic concentrations in surface waters of agricultural and urban areas worldwide. Monitoring is challenging as a result of their high hydrophobicity and low toxicity thresholds, which often fall below analytical method detection limits (MDLs). Standard daphnid bioassays used in surface water monitoring are not sensitive enough to protect more susceptible invertebrate species such as the amphipod Hyalella azteca, and chemical loss during toxicity testing is of concern. In the present study, we quantified toxicity loss during storage and testing, using both natural and synthetic water, and present a tool to enhance toxic signal strength for improved sensitivity of H. azteca toxicity tests. The average half-life during storage in low-density polyethylene (LDPE) cubitainers (Fisher Scientific) at 4 °C of 5 pyrethroids (permethrin, bifenthrin, lambda-cyhalothrin, cyfluthrin, and esfenvalerate) and one organophosphate (chlorpyrifos; used as reference) was 1.4 d, and piperonyl butoxide (PBO) proved an effective tool to potentiate toxicity. We conclude that toxicity tests on ambient water samples containing these hydrophobic insecticides are likely to underestimate toxicity present in the field, and mimic short-pulse rather than continuous exposures. Where these chemicals are of concern, the addition of PBO during testing can yield valuable information on their presence or absence. Environ Toxicol Chem 2018;37:462-472. © 2017 SETAC.
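
    The practical impact of a 1.4 d storage half-life is easy to quantify if first-order loss is assumed, as the half-life framing implies. A back-of-envelope sketch:

```python
# Fraction of toxicant remaining after storage, first-order decay assumed.
def fraction_remaining(days, half_life=1.4):
    """C(t)/C0 = 0.5 ** (t / t_half)."""
    return 0.5 ** (days / half_life)

print(fraction_remaining(1.4))   # 0.5 after one half-life
print(fraction_remaining(7.0))   # ~3% left after a week of storage
```

    This is why the authors conclude that tests on stored ambient samples underestimate field toxicity: most of the chemical can be gone before the test even starts.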

  13. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    PubMed

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large gap of information on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problems arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows establishing the presence of chemicals in the environment, estimating their concentration levels, identifying sources and determining their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demand the application of wide-scope methods; the low concentrations at which these contaminants are present require high detection sensitivity; and high demands on confirmation and structural information must be met for the characterisation of unknowns. New developments in analytical instrumentation have been applied to address these difficulties. No less important has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. The use of sophisticated software tools has enabled successful screening analyses, determining several hundred analytes, and has assisted in the timely structural elucidation of unknown compounds.

  14. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of the data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  15. ELISA for detection of variant rabbit haemorrhagic disease virus RHDV2 antigen in liver extracts.

    PubMed

    Dalton, K P; Podadera, A; Granda, V; Nicieza, I; Del Llano, D; González, R; de Los Toyos, J R; García Ocaña, M; Vázquez, F; Martín Alonso, J M; Prieto, J M; Parra, F; Casais, R

    2018-01-01

    The emergence and rapid spread of the rabbit haemorrhagic disease virus variant RHDV2 require new diagnostic tools to ensure that efficient control measures are adopted. In the present study, a specific sandwich enzyme-linked immunosorbent assay (ELISA) for the detection of RHDV2 antigens in rabbit liver homogenates was developed, based on the use of an RHDV2-specific monoclonal antibody (Mab), 2D9, for antigen capture and an anti-RHDV2 goat polyclonal antibody (Pab) for detection. This ELISA successfully detected RHDV2 and recombinant RHDV2 virions with high sensitivity (100%) and specificity (97.22%). No cross-reactions were detected with RHDV G1 viruses, while low cross-reactivity was detected with one of the RHDVa samples analyzed. The ELISA afforded good repeatability and had high analytical sensitivity, detecting a 1:163,640 dilution (6.10 ng/mL) of purified RHDV-N11 VLPs, corresponding to approximately 3.4×10^8 particles/mL. Reliable discrimination between closely related viruses is crucial to understanding the epidemiology and interaction of co-existing pathogens. In the work described here we design and validate an ELISA for laboratory-based, specific, sensitive and reliable detection of RHDVb/RHDV2. This ELISA is a valuable, specific virological tool for monitoring virus circulation, which will permit better control of this disease. Copyright © 2017 Elsevier B.V. All rights reserved.
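
    Diagnostic figures like the 100% sensitivity and 97.22% specificity above come straight from a confusion matrix. The sketch below shows the standard calculation; the counts are hypothetical, chosen only to reproduce ratios of that form:

```python
# Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# e.g. 35 true positives with no false negatives,
# and 35 true negatives with 1 false positive (hypothetical counts)
print(sensitivity(35, 0))                   # 1.0  -> 100%
print(round(specificity(35, 1) * 100, 2))   # 97.22
```
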

  16. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  17. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  18. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  19. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  20. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  1. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  2. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  3. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  4. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze the aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine the disease associations of ARSs/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data for ARSs/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data for ARSs/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring disease associations of ARSs/AIMPs, identifying disease-associated ARS/AIMP interactors and reconstructing ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding the potential roles of ARSs/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  5. An Evaluation of an Analytical Simulation of an Airplane with Tailplane Icing by Comparison to Flight Data

    NASA Technical Reports Server (NTRS)

    Hiltner, Dale W.

    2000-01-01

    This report presents the assessment of an analytical tool developed as part of the NASA/FAA Tailplane Icing Program. The analytical tool is a specialized simulation program called TAILSIM, which was developed to model the effects of tailplane icing on the flight dynamics of the Twin Otter Icing Research Aircraft. This report compares the responses of the TAILSIM program directly to flight test data. The comparisons should be useful to potential users of TAILSIM. The comparisons show that the TAILSIM program qualitatively duplicates the flight test aircraft response during maneuvers with ice on the tailplane. TAILSIM is shown to be quantitatively "in the ballpark" in predicting when Ice Contaminated Tailplane Stall will occur during pushover and thrust transition maneuvers. As such, TAILSIM proved its usefulness to the flight test program by providing a general indication of the aircraft configurations and flight conditions of concern. The aircraft dynamics are shown to be modeled correctly by the equations of motion used in TAILSIM. However, the general accuracy of the TAILSIM responses is less than desired, primarily due to inaccuracies in the aircraft database. The high sensitivity of the TAILSIM responses to small changes in the load factor command input is also shown to be a factor in the accuracy of the responses. A pilot model is shown to allow TAILSIM to produce more accurate responses and contributes significantly to the usefulness of the program. Suggestions for improving the accuracy of the TAILSIM responses are to further refine the database representation of the aircraft aerodynamics and tailplane flowfield and to explore a more realistic definition of the pilot model.

  6. Simultaneous Voltammetric Determination of Acetaminophen and Isoniazid (Hepatotoxicity-Related Drugs) Utilizing Bismuth Oxide Nanorod Modified Screen-Printed Electrochemical Sensing Platforms.

    PubMed

    Mahmoud, Bahaa G; Khairy, Mohamed; Rashwan, Farouk A; Banks, Craig E

    2017-02-07

    In response to recent outbreaks of drug-induced hepatotoxicity, a new analytical tool for the continuous determination of these drugs in human fluids is required. Electrochemical analytical methods offer an effective, rapid, and simple tool for on-site determination of various organic and inorganic species. However, the design of a sensitive, selective, stable, and reproducible sensor is still a major challenge. In the present manuscript, a facile, one-pot hydrothermal synthesis of bismuth oxide (Bi2O2.33) nanostructures (nanorods) was developed. These BiO nanorods were cast onto disposable graphite screen-printed electrodes (BiO-SPEs), allowing the ultrasensitive determination of acetaminophen (APAP) in the presence of its common interferent isoniazid (INH), both of which are found in drug samples. BiO-SPEs exhibited strong electrocatalytic activity toward the simultaneous sensing of APAP and INH, with an enhanced analytical signal (voltammetric peak) over that achievable at unmodified (bare) SPEs. The electroanalytical sensing of APAP and INH is possible with accessible linear ranges from 0.5 to 1250 μM and 5 to 1760 μM and limits of detection (3σ) of 30 nM and 1.85 μM, respectively. The stability, reproducibility, and repeatability of the BiO-SPEs were also investigated. The BiO-SPEs were evaluated toward the sensing of APAP and INH in human serum, urine, saliva, and tablet samples. The results presented in this paper demonstrate that BiO-SPE sensing platforms are a potential candidate for the accurate determination of APAP and INH within human fluids and pharmaceutical formulations.
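The detection limits quoted above follow the common 3σ convention: LOD = 3 × (standard deviation of blank replicates) / (calibration slope). A minimal sketch of that calculation, with blank readings and a slope invented for illustration (not the paper's raw data):

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """3-sigma limit of detection: 3 * sd(blank) / calibration-curve slope."""
    return 3.0 * statistics.stdev(blank_signals) / slope

# Hypothetical blank replicates (current in nA) and slope (nA per uM analyte)
blanks = [0.21, 0.19, 0.22, 0.20, 0.18, 0.21]
slope_nA_per_uM = 1.5  # assumed for illustration
lod_uM = limit_of_detection(blanks, slope_nA_per_uM)
print(f"LOD ~ {lod_uM * 1000:.0f} nM")
```

With quieter blanks or a steeper calibration slope, the same formula pushes the LOD into the low-nanomolar range reported for APAP above.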

  7. Membrane-based lateral flow immunochromatographic strip with nanoparticles as reporters for detection: A review.

    PubMed

    Huang, Xiaolin; Aguilar, Zoraida P; Xu, Hengyi; Lai, Weihua; Xiong, Yonghua

    2016-01-15

    Membrane-based lateral flow immunochromatographic strip (LFICS) is widely used in various fields because of its simplicity, rapidity (detection within 10 min), and low cost. However, early designs of membrane-based LFICS for preliminary screening only provide qualitative ("yes/no" signal) or semi-quantitative results without quantitative information. These designs often suffer from low signal intensity and poor sensitivity and are only capable of single-analyte detection, not simultaneous detection of multiple analytes. The performance of existing techniques used for detection using LFICS has been considerably improved by incorporating different kinds of nanoparticles (NPs) as reporters. NPs can serve as alternative labels and improve the analytical sensitivity or limit of detection of LFICS because of their unique properties, such as optical absorption, fluorescence spectra, and magnetic properties. The controlled manipulation of NPs allows simultaneous or multiplexed detection using membrane-based LFICS. In this review, we discuss how colored (e.g., colloidal gold, carbon, and colloidal selenium NPs), luminescent (e.g., quantum dots, up-converting phosphor NPs, and dye-doped NPs), and magnetic NPs are integrated into membrane-based LFICS for the detection of target analytes. Gold NPs are also featured because of their wide applications. Different types and unique properties of NPs are briefly explained. This review focuses on examples of NP-based LFICS to illustrate novel concepts in various devices with potential applications as screening tools. This review also highlights the superiority of NP-based approaches over existing conventional strategies for clinical analysis, food safety, and environmental monitoring. The paper concludes with a short section on future research trends regarding NP-based LFICS. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Long-term detection of methyltestosterone (ab-) use by a yeast transactivation system.

    PubMed

    Wolf, Sylvi; Diel, Patrick; Parr, Maria Kristina; Rataj, Felicitas; Schänzer, Willhelm; Vollmer, Günter; Zierau, Oliver

    2011-04-01

    The routinely used analytical method for detecting the abuse of anabolic steroids only allows the detection of molecules with known analytical properties. In our supplementary approach to structure-independent detection, substances are identified by their biological activity. In the present study, urines excreted after oral methyltestosterone (MT) administration were analyzed by a yeast androgen screen (YAS). The aim was to trace the excretion of MT or its metabolites in human urine samples and to compare the results with those from the established analytical method. MT and its two major metabolites were tested as pure compounds in the YAS. In a second step, the ability of the YAS to detect MT and its metabolites in urine samples was analyzed. For this purpose, a human volunteer ingested a single dose of 5 mg methyltestosterone. Urine samples were collected at different time intervals (0-307 h) and were analyzed in the YAS and in parallel by GC/MS. Whereas the YAS was able to trace MT in urine samples for at least 14 days, the detection limits of the GC/MS method allowed follow-up only until day six. In conclusion, our results demonstrate that the yeast reporter gene system can detect the activity of anabolic steroids like methyltestosterone with high sensitivity even in urine. Furthermore, the YAS was able to detect MT abuse for a longer period of time than classical GC/MS. The system evidently responds to long-lasting, as-yet-unidentified metabolites. Therefore, the YAS can be a powerful (pre-)screening tool, with the potential to be used to identify persistent or late-screening metabolites of anabolic steroids, which could in turn enhance the sensitivity of GC/MS detection techniques.

  9. Self-Catalyzing Chemiluminescence of Luminol-Diazonium Ion and Its Application for Catalyst-Free Hydrogen Peroxide Detection and Rat Arthritis Imaging.

    PubMed

    Zhao, Chunxin; Cui, Hongbo; Duan, Jing; Zhang, Shenghai; Lv, Jiagen

    2018-02-06

    We report the unique self-catalyzing chemiluminescence (CL) of the luminol-diazonium ion (N2+-luminol) and its analytical potential. Visual CL emission was initially observed when N2+-luminol was subjected to alkaline aqueous H2O2 without the aid of any catalysts. Further experimental investigations found peroxidase-like activity of N2+-luminol in the cleavage of H2O2 into the OH• radical. Together with other experimental evidence, the CL mechanism is suggested as the activation of N2+-luminol and its dediazotization product 3-hydroxyl luminol by the OH• radical into the corresponding intermediate radicals, followed by further oxidation to excited-state 3-N2+-phthalic acid and 3-hydroxyphthalic acid, which finally produce 415 nm CL. The self-catalyzing CL of N2+-luminol provides an opportunity to achieve attractive catalyst-free CL detection of H2O2. Experiments demonstrated 10^-8 M level detection sensitivity to H2O2, as well as to glucose or uric acid if presubjected to glucose oxidase or uricase. With the exemplified determination of serum glucose and uric acid, N2+-luminol shows its analytical potential for other analytes linked to the production or consumption of H2O2. Under physiological conditions, N2+-luminol exhibits highly selective and sensitive CL toward 1O2 among the common reactive oxygen species. This capacity supports the significant application of N2+-luminol for detecting 1O2 in live animals. By imaging arthritis in LEW rats, N2+-luminol CL is demonstrated as a potential tool for mapping inflammation-relevant biological events in a live body.

  10. Determination of aerodynamic sensitivity coefficients based on the three-dimensional full potential equation

    NASA Technical Reports Server (NTRS)

    Elbanna, Hesham M.; Carlson, Leland A.

    1992-01-01

    The quasi-analytical approach is applied to the three-dimensional full potential equation to compute wing aerodynamic sensitivity coefficients in the transonic regime. Symbolic manipulation is used to reduce the effort associated with obtaining the sensitivity equations, and the large sensitivity system is solved using 'state of the art' routines. Results are compared to those obtained by the direct finite difference approach and both methods are evaluated to determine their computational accuracy and efficiency. The quasi-analytical approach is shown to be accurate and efficient for large aerodynamic systems.
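The direct finite-difference baseline against which the quasi-analytical approach is compared can be illustrated with a toy model: an analytic derivative of a closed-form aerodynamic expression checked against a central difference. The lift model below (a linear lift slope with a Prandtl-Glauert compressibility correction) is a stand-in chosen only to make the sketch self-contained; it is not the full potential solver:

```python
import math

def lift_coefficient(alpha_rad, mach):
    # Toy surrogate for an aerodynamic output: linear lift slope with a
    # Prandtl-Glauert compressibility correction (NOT the full potential code).
    return 2.0 * math.pi * alpha_rad / math.sqrt(1.0 - mach**2)

def analytic_sensitivity(alpha_rad, mach):
    # Quasi-analytical route: differentiate the model once, evaluate exactly.
    return 2.0 * math.pi / math.sqrt(1.0 - mach**2)

def finite_difference_sensitivity(alpha_rad, mach, h=1e-6):
    # Direct route: central difference; accuracy depends on the step size h.
    return (lift_coefficient(alpha_rad + h, mach)
            - lift_coefficient(alpha_rad - h, mach)) / (2.0 * h)

print(analytic_sensitivity(0.05, 0.6))
print(finite_difference_sensitivity(0.05, 0.6))
```

For a full CFD system the trade-off is the one the abstract describes: the analytic route requires deriving and solving the sensitivity equations, while the finite-difference route requires repeated (expensive) flow solutions and careful step-size selection.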

  11. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  12. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  13. Multi-parameter flow cytometry as a process analytical technology (PAT) approach for the assessment of bacterial ghost production.

    PubMed

    Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph

    2016-01-01

    Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), that constitute potential products in the pharmaceutical field. BGs have reduced light scattering properties when compared with intact cells. In combination with viability information obtained from staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid) trimethine oxonol (DiBAC4(3)), the presented method allows differentiation between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) could be shown. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-L batch processes for the production of Escherichia coli Nissle 1917 BGs.
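The three-population discrimination described above can be caricatured as two sequential gates: light scatter separates ghosts from intact cells, and DiBAC4(3) fluorescence separates dead from viable cells. A toy sketch, where the gate values are arbitrary placeholders rather than the study's calibrated settings:

```python
def classify_event(scatter, dibac_fluorescence, scatter_gate=0.5, dibac_gate=0.3):
    """Toy FCM gating: bacterial ghosts scatter less light than intact cells;
    DiBAC4(3) enters depolarized (dead) cells and makes them fluorescent.
    Threshold values are illustrative placeholders."""
    if scatter < scatter_gate:
        return "ghost"    # reduced light scattering -> empty envelope
    if dibac_fluorescence > dibac_gate:
        return "dead"     # intact but depolarized -> stained by DiBAC4(3)
    return "viable"       # intact, membrane potential maintained

events = [(0.2, 0.05), (0.8, 0.6), (0.9, 0.1)]
print([classify_event(s, f) for s, f in events])  # ['ghost', 'dead', 'viable']
```

In the real method a second dye (RH414) first excludes non-cellular background, so only membrane-bearing events reach this gating step.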

  14. Application of Raman microscopy to biodegradable double-walled microspheres.

    PubMed

    Widjaja, Effendi; Lee, Wei Li; Loo, Say Chye Joachim

    2010-02-15

    Raman mapping measurements were performed on the cross section of a ternary-phase biodegradable double-walled microsphere (DWMS) of poly(D,L-lactide-co-glycolide) (50:50) (PLGA), poly(L-lactide) (PLLA), and poly(epsilon-caprolactone) (PCL), which was fabricated by a one-step solvent evaporation method. The collected Raman spectra were subjected to a band-target entropy minimization (BTEM) algorithm in order to reconstruct the pure component spectra of the species observed in this sample. Seven pure component spectral estimates were recovered, and their spatial distributions within the DWMS were determined. The first three spectral estimates were identified as PLLA, PLGA 50:50, and PCL, the main components of the DWMS. The last four spectral estimates were identified as semicrystalline polyglycolic acid (PGA), dichloromethane (DCM), copper-phthalocyanine blue, and calcite, minor components of the DWMS. PGA was the decomposition product of PLGA. DCM was the solvent used in DWMS fabrication. Copper-phthalocyanine blue and calcite were unexpected contaminants. The current result shows that combined Raman microscopy and BTEM analysis can provide a sensitive characterization tool for DWMS, as it gives more specific information on the chemical species present as well as on their spatial distributions. This novel analytical method for microsphere characterization can serve as a complementary tool to other more established analytical techniques, such as scanning electron microscopy and optical microscopy.

  15. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances; size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: may CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analysis and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes in environmental conditions during sampling and sample preparation. This creates a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and available applications, via the visible uncertainties and available modeling approaches, with potential future benefits from CPE protocols.

  16. The development of "fab-chips" as low-cost, sensitive surface-enhanced Raman spectroscopy (SERS) substrates for analytical applications.

    PubMed

    Robinson, Ashley M; Zhao, Lili; Shah Alam, Marwa Y; Bhandari, Paridhi; Harroun, Scott G; Dendukuri, Dhananjaya; Blackburn, Jonathan; Brosseau, Christa L

    2015-02-07

    The demand for methods and technologies capable of rapid, inexpensive and continuous monitoring of health status or exposure to environmental pollutants persists. In this work, the development of novel surface-enhanced Raman spectroscopy (SERS) substrates from metal-coated silk fabric, known as zari, presents the potential for SERS substrates to be incorporated into clothing and other textiles for the routine monitoring of important analytes, such as disease biomarkers or environmental pollutants. Characterization of the zari fabric was completed using scanning electron microscopy, energy dispersive X-ray analysis and Raman spectroscopy. Silver nanoparticles (AgNPs) were prepared, characterized by transmission electron microscopy and UV-vis spectroscopy, and used to treat fabric samples by incubation, drop-coating and in situ synthesis. The quality of the treated fabric was evaluated by collecting the SERS signal of 4,4'-bipyridine on these substrates. When AgNPs were drop-coated on the fabric, sensitive and reproducible substrates were obtained. Adenine was selected as a second probe molecule, because it dominates the SERS signal of DNA, which is an important class of disease biomarker, particularly for pathogens such as Plasmodium spp. and Mycobacterium tuberculosis. Excellent signal enhancement could be achieved on these affordable substrates, suggesting that the developed fabric chips have the potential for expanding the use of SERS as a diagnostic and environmental monitoring tool for application in wearable sensor technologies.

  17. Canine olfaction as an alternative to analytical instruments for ...

    EPA Pesticide Factsheets

    Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection, from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination all the way to essentially random results. This disparity is not likely to be a detection issue; dogs have been shown to have extremely sensitive noses, as proven by their use for tracking, bomb detection, and search and rescue. However, in contrast to analytical instruments, dogs are subject to boredom, fatigue, hunger and external distractions. These challenges are of particular importance in a clinical environment where task repetition is prized, but not as entertaining for a dog as chasing odours outdoors. The question addressed here is how to exploit the intrinsic sensitivity and simplicity of having a dog simply sniff out disease, in the face of variability in behavior and response. There is no argument that living cells emanate a variety of gas- and liquid-phase compounds as waste from normal metabolism, and that these compounds become measurable from various biological media including skin, blood, urine, breath, feces, etc. [1, 2] The overarching term for this phenomenon from the perspective of systems biology analysis is "cellular respiration", which has become an important topic for the interpretation and documentation of the human exposome, the chemical counterpart to the genome.

  18. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  19. A Bayesian Ensemble Approach for Epidemiological Projections

    PubMed Central

    Lindström, Tom; Tildesley, Michael; Webb, Colleen

    2015-01-01

    Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different predictions of outcomes. In other fields of research, ensemble modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single-model ensembles based on different parameterizations of the Warwick model run for the 2001 United Kingdom foot and mouth disease outbreak and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the ensemble prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for ensembles with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for ensemble modeling of disease outbreaks. PMID:25927892
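The core of a Bayesian ensemble of this kind can be sketched in a few lines: each model projection is weighted by how well it explains the observed data, times any prior belief in that model, and the ensemble prediction is the weight-averaged projection. The Gaussian likelihood and the numbers below are simplifying assumptions for illustration, not the Warwick-model setup:

```python
import math

def posterior_weights(projections, observed, sigma=1.0, priors=None):
    """Weight each projection by a Gaussian likelihood of the observation,
    times an optional prior belief per model; normalize to sum to one."""
    n = len(projections)
    priors = priors if priors is not None else [1.0 / n] * n
    unnorm = [prior * math.exp(-0.5 * ((proj - observed) / sigma) ** 2)
              for proj, prior in zip(projections, priors)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

projections = [10.0, 12.0, 20.0]   # e.g. outbreak sizes predicted by three models
weights = posterior_weights(projections, observed=11.0, sigma=2.0)
ensemble = sum(w * p for w, p in zip(weights, projections))
print(weights, ensemble)  # the outlying projection (20.0) gets negligible weight
```

The abstract's point about prior sensitivity shows up directly here: with large discrepancy among projections, the `priors` argument (or, in the full method, the hyperprior in the hierarchical extension) can dominate the resulting weights.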

  20. A novel assay for detecting canine parvovirus using a quartz crystal microbalance biosensor.

    PubMed

    Kim, Yong Kwan; Lim, Seong-In; Choi, Sarah; Cho, In-Soo; Park, Eun-Hye; An, Dong-Jun

    2015-07-01

    Rapid and accurate diagnosis is crucial to reduce both the shedding and clinical signs of canine parvovirus (CPV). The quartz crystal microbalance (QCM) is a new tool for measuring frequency changes associated with antigen-antibody interactions. In this study, the QCM biosensor and ProLinker™ B were used to rapidly diagnose CPV infection. ProLinker™ B enables antibodies to be attached to a gold-coated quartz surface in a regular pattern and in the correct orientation for antigen binding. Receiver operating characteristic (ROC) curves were used to set a cut-off value using reference CPVs (two groups: one CPV-positive and one CPV-negative). The ROC curves overlapped, and the point of intersection was used as the cut-off value. A QCM biosensor with a cut-off value of -205 Hz showed 95.4% (104/109) sensitivity and 98.0% (149/152) specificity when used to test 261 field fecal samples, compared to PCR. In conclusion, the QCM biosensor described herein is eminently suitable for the rapid diagnosis of CPV infection with high sensitivity and specificity. Therefore, it is a promising analytical tool that will be useful for clinical diagnosis, which requires rapid and reliable analyses. Copyright © 2015 Elsevier B.V. All rights reserved.
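The headline figures follow directly from the confusion counts reported against PCR (104 of 109 PCR-positives detected, 149 of 152 PCR-negatives correctly rejected):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the abstract: 104/109 positives and 149/152 negatives vs. PCR
sens, spec = sensitivity_specificity(tp=104, fn=5, tn=149, fp=3)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
# prints "sensitivity 95.4%, specificity 98.0%"
```

The -205 Hz cut-off is what fixes these counts; moving it along the ROC curve trades one rate against the other.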

  1. High Sensitivity Refractive Index Sensor Based on Dual-Core Photonic Crystal Fiber with Hexagonal Lattice.

    PubMed

    Wang, Haiyang; Yan, Xin; Li, Shuguang; An, Guowen; Zhang, Xuenan

    2016-10-08

    A refractive index sensor based on dual-core photonic crystal fiber (PCF) with hexagonal lattice is proposed. The effects of the geometrical parameters of the PCF on the performance of the sensor are investigated by using the finite element method (FEM). The two fiber cores are separated by two air holes filled with the analyte, whose refractive index is in the range of 1.33-1.41. Numerical simulation results show that the highest sensitivity can be up to 22,983 nm/RIU (refractive index unit) when the analyte refractive index is 1.41. The lowest sensitivity reaches 21,679 nm/RIU when the analyte refractive index is 1.33. The proposed sensor has significant advantages in the field of biomolecule detection, as it provides a wide range of detection with high sensitivity.
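The nm/RIU figures are spectral sensitivities: resonance-wavelength shift per unit change in analyte refractive index, S = Δλ/Δn. A one-line sketch with invented wavelengths (the abstract reports the resulting sensitivities, not these inputs):

```python
def spectral_sensitivity_nm_per_RIU(lambda1_nm, lambda2_nm, n1, n2):
    """S = (shift in resonance wavelength, nm) / (change in refractive index)."""
    return (lambda2_nm - lambda1_nm) / (n2 - n1)

# Hypothetical: a 200 nm resonance shift across the 1.33-1.41 index range
print(spectral_sensitivity_nm_per_RIU(1400.0, 1600.0, 1.33, 1.41))  # ~2500 nm/RIU
```

At 22,983 nm/RIU, an index change of only 0.001 RIU shifts the resonance by roughly 23 nm, which is what makes the wavelength-interrogated readout so sensitive.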

  2. High Sensitivity Refractive Index Sensor Based on Dual-Core Photonic Crystal Fiber with Hexagonal Lattice

    PubMed Central

    Wang, Haiyang; Yan, Xin; Li, Shuguang; An, Guowen; Zhang, Xuenan

    2016-01-01

    A refractive index sensor based on dual-core photonic crystal fiber (PCF) with hexagonal lattice is proposed. The effects of the geometrical parameters of the PCF on the performance of the sensor are investigated by using the finite element method (FEM). The two fiber cores are separated by two air holes filled with the analyte, whose refractive index is in the range of 1.33–1.41. Numerical simulation results show that the highest sensitivity can be up to 22,983 nm/RIU (refractive index unit) when the analyte refractive index is 1.41. The lowest sensitivity reaches 21,679 nm/RIU when the analyte refractive index is 1.33. The proposed sensor has significant advantages in the field of biomolecule detection, as it provides a wide range of detection with high sensitivity. PMID:27740607

  3. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  4. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to build quality into products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to a pharmaceutical or biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  5. Culturally Sensitive Interventions and Substance Use: A Meta-Analytic Review of Outcomes among Minority Youths

    ERIC Educational Resources Information Center

    Hodge, David R.; Jackson, Kelly F.; Vaughn, Michael G.

    2012-01-01

    This study assessed the effectiveness of culturally sensitive interventions (CSIs) ("N" = 10) designed to address substance use among minority youths. Study methods consisted of systematic search procedures, quality of study ratings, and meta-analytic techniques to gauge effects and evaluate publication bias. The results, across all measures and…

  6. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  7. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  8. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  9. Integrated Circuits for Rapid Sample Processing and Electrochemical Detection of Biomarkers

    NASA Astrophysics Data System (ADS)

    Besant, Justin

    The trade-off between speed and sensitivity of detection is a fundamental challenge in the design of point-of-care diagnostics. As the relevant molecules in many diseases exist natively at extremely low levels, many gold-standard diagnostic tests are designed with high sensitivity at the expense of long incubations needed to amplify the target analytes. The central aim of this thesis is to design new strategies to detect biologically relevant analytes with both high speed and sensitivity. The response time of a biosensor is limited by the ability of the target analyte to accumulate to detectable levels at the sensor surface. We overcome this limitation by designing a range of integrated devices to optimize the flux of the analyte to the sensor by increasing the effective analyte concentration, shortening the required diffusion distance, and confining the analyte in close proximity to the sensor. We couple these devices with novel ultrasensitive electrochemical transduction strategies to convert rare analytes into a detectable signal. We showcase the clinical utility of these approaches with several applications including cancer diagnosis, bacterial identification, and antibiotic susceptibility profiling. We design and optimize a device to isolate rare cancer cells from the bloodstream with near 100% efficiency and 10 000-fold specificity. We analyse pathogen-specific nucleic acids by lysing bacteria in close proximity to an electrochemical sensor and find that this approach has 10-fold higher sensitivity than standard lysis in bulk solution. We design an electronic chip to read out the antibiotic susceptibility profile with an hour-long incubation by concentrating bacteria into nanoliter chambers with integrated electrodes. Finally, we report a strategy for ultrasensitive visual readout of nucleic acids at concentrations as low as 100 fM within 10 minutes using an amplification cascade. The strategies presented could guide the development of fast, sensitive, and low-cost diagnostics for diseases not previously detectable at the point-of-care.

  10. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    PubMed

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries in maintaining their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  11. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options☆

    PubMed Central

    Reed, M.S.; Podesta, G.; Fazey, I.; Geeson, N.; Hessel, R.; Hubacek, K.; Letson, D.; Nainggolan, D.; Prell, C.; Rickenbach, M.G.; Ritsema, C.; Schwilch, G.; Stringer, L.C.; Thomas, A.D.

    2013-01-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries in maintaining their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change. PMID:25844020

  12. Analytical Protein Microarrays: Advancements Towards Clinical Applications

    PubMed Central

    Sauer, Ursula

    2017-01-01

    Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, fundamental studies in molecular and cell biology in particular have been conducted using protein microarrays, while the potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to which features have to be implemented and which improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity, poor reproducibility, inadequate analysis times, a lack of high-quality antibodies and validated reagents, a lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of this paper is to review approaches to solving these problems. PMID:28146048

  13. Shape sensitivity analysis of flutter response of a laminated wing

    NASA Technical Reports Server (NTRS)

    Bergen, Fred D.; Kapania, Rakesh K.

    1988-01-01

    A method is presented for calculating the shape sensitivity of a wing aeroelastic response with respect to changes in geometric shape. Yates' modified strip method is used in conjunction with Giles' equivalent plate analysis to predict the flutter speed, frequency, and reduced frequency of the wing. Three methods are used to calculate the sensitivity of the eigenvalue. The first method is purely a finite difference calculation of the eigenvalue derivative directly from the solution of the flutter problem corresponding to the two different values of the shape parameters. The second method uses an analytic expression for the eigenvalue sensitivities of a general complex matrix, where the derivatives of the aerodynamic, mass, and stiffness matrices are computed using a finite difference approximation. The third method also uses an analytic expression for the eigenvalue sensitivities, but the aerodynamic matrix is computed analytically. All three methods are found to be in good agreement with each other. The sensitivities of the eigenvalues were used to predict the flutter speed, frequency, and reduced frequency. These approximations were found to be in good agreement with those obtained using a complete reanalysis.
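    The three eigenvalue-sensitivity methods in the abstract above can be sketched compactly. The code below is a generic illustration, not the authors' implementation: the test matrix, step size, and the choice to track the eigenvalue of largest real part are all assumptions. It shows the finite-difference derivative (method one) and the standard analytic expression dλ/dp = yᴴ(∂A/∂p)x / (yᴴx) for a general complex matrix (the basis of methods two and three):

```python
import numpy as np

def eig_sensitivity_fd(build_A, p, h=1e-6):
    """Central-difference derivative of the tracked eigenvalue w.r.t. a parameter p."""
    lam_plus = np.linalg.eigvals(build_A(p + h))
    lam_minus = np.linalg.eigvals(build_A(p - h))
    # Track the eigenvalue with the largest real part (an illustrative choice).
    return (lam_plus[np.argmax(lam_plus.real)]
            - lam_minus[np.argmax(lam_minus.real)]) / (2 * h)

def eig_sensitivity_analytic(A, dA_dp):
    """Analytic sensitivity dlam/dp = y^H (dA/dp) x / (y^H x) of a general matrix."""
    lam, X = np.linalg.eig(A)
    lamL, Y = np.linalg.eig(A.conj().T)  # left eigenvectors of A are eigenvectors of A^H
    i = np.argmax(lam.real)
    j = np.argmin(np.abs(lamL.conj() - lam[i]))  # match left eigenvalue to the right one
    x, y = X[:, i], Y[:, j]
    return (y.conj() @ dA_dp @ x) / (y.conj() @ x)
```

For a matrix whose dominant eigenvalue equals the parameter itself, both routines return a derivative of 1, which is the kind of cross-check the abstract reports between its three methods.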

  14. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik (Inventor)

    1998-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a difference in resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration, than when contacted with a fluid comprising the chemical analyte at a second different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.
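    The array readout described in this patent abstract amounts to a vector of normalized resistance changes, one per chemically distinct sensor, which an electronic nose then classifies. A minimal sketch of that fingerprint computation (the function name and ΔR/R normalization are illustrative, not taken from the patent):

```python
def response_vector(baseline, exposed):
    """Normalized resistance changes (dR/R) across a sensor array.

    baseline: resistances before exposure to the analyte
    exposed:  resistances while exposed to the analyte
    """
    return [(r1 - r0) / r0 for r0, r1 in zip(baseline, exposed)]
```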

  15. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Severin, Erik (Inventor); Lewis, Nathan S. (Inventor)

    2001-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a difference in resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration, than when contacted with a fluid comprising the chemical analyte at a second different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  16. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik (Inventor)

    1999-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a difference in resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration, than when contacted with a fluid comprising the chemical analyte at a second different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  17. Sensor arrays for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Freund, Michael S. (Inventor)

    1996-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g. electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a difference in resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration, than when contacted with a fluid comprising the chemical analyte at a second different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  18. Improving LC-MS sensitivity through increases in chromatographic performance: comparisons of UPLC-ES/MS/MS to HPLC-ES/MS/MS.

    PubMed

    Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R

    2005-10-25

    Recent technological advances have made available reversed-phase chromatographic media with a 1.7 μm particle size, along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance by conducting a side-by-side comparison of UPLC for several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC were analyte-dependent and as large as 10-fold, and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.

  19. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive healthcare transformation, but they support it.

  20. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    ERIC Educational Resources Information Center

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  1. BEAMS Lab: Novel approaches to finding a balance between throughput and sensitivity

    NASA Astrophysics Data System (ADS)

    Liberman, Rosa G.; Skipper, Paul L.; Prakash, Chandra; Shaffer, Christopher L.; Flarakos, Jimmy; Tannenbaum, Steven R.

    2007-06-01

    Development of 14C AMS has long pursued the twin goals of maximizing both sensitivity and precision in the interest, among others, of optimizing radiocarbon dating. Application of AMS to biomedical research is less constrained with respect to sensitivity requirements, but more demanding of high throughput. This work presents some technical and conceptual developments in sample processing and analytical instrumentation designed to streamline the process of extracting quantitative data from the various types of samples encountered in analytical biochemistry.

  2. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
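    The statistical disproportionality signal scores mentioned above are conventionally computed from 2x2 drug/event contingency tables; the proportional reporting ratio (PRR) is one standard such score. The sketch below illustrates that general technique only; it is not the prototype tool's actual implementation:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 drug/event contingency table.

    a: reports (or citations) mentioning both the drug and the event
    b: reports mentioning the drug but not the event
    c: reports mentioning the event but not the drug
    d: reports mentioning neither
    """
    return (a / (a + b)) / (c / (c + d))
```

A PRR well above 1 suggests the event is reported disproportionately often with the drug, which is what flags a candidate safety signal for reviewer follow-up.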

  3. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  4. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  5. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Analytical Tools for Affordability Analysis. David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, Alexandria, VA. The excerpt covers cost-model building blocks such as unit cost as a function of learning and rate (Womer) and learning with forgetting, in which learning depreciates over time (Benkard).

  6. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    This study analyzes time-optimal maneuvering of the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI’s Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.

  7. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  8. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  9. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    This project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories into account, most (90%) of the SRA team’s effort focused on identifying and analyzing analytical capabilities.

  10. Uncertainty of relative sensitivity factors in glow discharge mass spectrometry

    NASA Astrophysics Data System (ADS)

    Meija, Juris; Methven, Brad; Sturgeon, Ralph E.

    2017-10-01

    The concept of the relative sensitivity factors required for the correction of the measured ion beam ratios in pin-cell glow discharge mass spectrometry is examined in detail. We propose a data-driven model for predicting the relative response factors, which relies on a non-linear least squares adjustment and analyte/matrix interchangeability phenomena. The model provides a self-consistent set of response factors for any analyte/matrix combination of any element that appears as either an analyte or matrix in at least one known response factor.
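    The self-consistency idea in the abstract above can be made concrete under a simplifying assumption: if RSF(i, j) factorizes as s_i / s_j for per-element factors s, then taking logarithms turns the adjustment into a linear least-squares problem over the observed analyte/matrix pairs. The factorized form below is an illustrative assumption, not necessarily the authors' published model:

```python
import numpy as np

def fit_element_factors(pairs, n_elements):
    """Fit per-element factors s_k so that RSF(i, j) ~ s_i / s_j.

    pairs: iterable of (analyte_index, matrix_index, measured_rsf).
    In log space each observation is log s_i - log s_j = log RSF(i, j);
    one factor is pinned to 1 to fix the overall scale.
    """
    rows, rhs = [], []
    for i, j, rsf in pairs:
        row = np.zeros(n_elements)
        row[i], row[j] = 1.0, -1.0
        rows.append(row)
        rhs.append(np.log(rsf))
    anchor = np.zeros(n_elements)  # pin element 0: log s_0 = 0
    anchor[0] = 1.0
    rows.append(anchor)
    rhs.append(0.0)
    log_s, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return np.exp(log_s)
```

Once fitted, the factors predict a response factor for any analyte/matrix combination, including pairs never measured directly, which is the practical payoff the abstract describes.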

  11. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  12. BMDExpress Data Viewer: A Visualization Tool to Analyze ...

    EPA Pesticide Factsheets

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
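    The pathway-selection step described above (filtering by an enrichment p-value cutoff and a minimum number of significant genes) reduces to a simple predicate over the tool's output rows. A sketch of that filter, with field names invented for illustration rather than taken from BMDExpress output files:

```python
def filter_pathways(rows, max_p=0.05, min_genes=3):
    """Keep pathways passing an enrichment p-value cutoff and a minimum
    count of significant genes (field names and defaults are illustrative)."""
    return [r for r in rows
            if r["enrichment_p"] <= max_p and r["n_significant_genes"] >= min_genes]
```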

  13. Developing Analytical Inspection Criteria for Health IT Personnel with Minimum Training in Cognitive Ergonomics: A Practical Solution to Improving EHR Usability

    PubMed Central

    Zhang, Zhen; Franklin, Amy; Walji, Muhammad; Zhang, Jiajie; Gong, Yang

    2014-01-01

    EHR usability has been identified as a major barrier to care quality optimization. One major challenge of improving EHR usability is the lack of systematic training in usability or cognitive ergonomics for EHR designers/developers in the vendor community and EHR analysts making significant configurations in healthcare organizations. A practical solution is to provide usability inspection tools that can be easily operationalized by EHR analysts. This project is aimed at developing a set of usability tools with demonstrated validity and reliability. We present a preliminary study of a metric for cognitive transparency and an exploratory experiment testing its validity in predicting the effectiveness of action-effect mapping. Despite the pilot nature of both, we found high sensitivity and specificity of the metric and higher response accuracy within a shorter time for users to determine action-effect mappings in transparent user interface controls. We plan to expand the sample size in our empirical study. PMID:25954439

  14. A General Tool for Engineering the NAD/NADP Cofactor Preference of Oxidoreductases.

    PubMed

    Cahn, Jackson K B; Werlang, Caroline A; Baumschlager, Armin; Brinkmann-Chen, Sabine; Mayo, Stephen L; Arnold, Frances H

    2017-02-17

    The ability to control enzymatic nicotinamide cofactor utilization is critical for engineering efficient metabolic pathways. However, the complex interactions that determine cofactor-binding preference render this engineering particularly challenging. Physics-based models have been insufficiently accurate and blind directed evolution methods too inefficient to be widely adopted. Building on a comprehensive survey of previous studies and our own prior engineering successes, we present a structure-guided, semirational strategy for reversing enzymatic nicotinamide cofactor specificity. This heuristic-based approach leverages the diversity and sensitivity of catalytically productive cofactor binding geometries to limit the problem to an experimentally tractable scale. We demonstrate the efficacy of this strategy by inverting the cofactor specificity of four structurally diverse NADP-dependent enzymes: glyoxylate reductase, cinnamyl alcohol dehydrogenase, xylose reductase, and iron-containing alcohol dehydrogenase. The analytical components of this approach have been fully automated and are available in the form of an easy-to-use web tool: Cofactor Specificity Reversal-Structural Analysis and Library Design (CSR-SALAD).

  15. Leveraging Social Computing for Personalized Crisis Communication using Social Media.

    PubMed

    Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli

    2016-03-24

    The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters have demonstrated, people tend to converge on social media during and following emergencies. Authorities can then use these media, together with computational methods, to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. The current review presents a conceptual framework for studying the psychological aspects of crisis and risk communication in social media through social computing. Advanced analytical tools can be integrated into the processes and objectives of crisis communication. The availability of these computational techniques can improve communication with the public through a process of Hyper-Targeted Crisis Communication. The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media can facilitate more sensitive and personalized emergency communication.

  16. Dynamic Visualization of Co-expression in Systems Genetics Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Huang, Jian; Chesler, Elissa J

    2008-01-01

    Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that when used in an integrated manner provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
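    The graph-as-universal-representation idea can be sketched in a few lines (illustrative only, not the authors' tool): Pearson correlations between gene expression profiles are thresholded into an adjacency matrix. The 0.8 cutoff and the synthetic expression matrix are arbitrary assumptions.

```python
import numpy as np

# Minimal sketch: build a co-expression graph by thresholding the absolute
# Pearson correlation matrix of a (genes x samples) expression table.
rng = np.random.default_rng(0)
expr = rng.normal(size=(5, 20))                  # 5 genes x 20 samples (synthetic)
expr[1] = expr[0] + 0.01 * rng.normal(size=20)   # force genes 0 and 1 to co-express

corr = np.corrcoef(expr)          # gene-by-gene Pearson correlation matrix
adj = np.abs(corr) > 0.8          # adjacency matrix (arbitrary 0.8 cutoff)
np.fill_diagonal(adj, False)      # drop self-loops

edges = [(i, j) for i in range(5) for j in range(i + 1, 5) if adj[i, j]]
```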

  17. Analysis of Benefits of an Energy Imbalance Market in the NWPP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samaan, Nader A.; Bayless, Rich; Symonds, Mark

    The Northwest Power Pool (NWPP) Market Assessment Committee (MC) Initiative, which was officially launched on March 19, 2012, set out to explore a range of alternatives that could help the Balancing Authorities and scheduling utilities in the NWPP area address growing operational and commercial challenges affecting the regional power system. The MC formed an Analytical Team with technical representatives from each of the member Balancing Areas in the NWPP and with staff of Pacific Northwest National Laboratory (PNNL). This Analytical Team was instructed to conduct extensive studies of intra-hour operation of the NWPP system in the year 2020 and of the NWPP region with 14,671 MW of wind penetration. The effort utilized a sub-hourly production cost model (the PLEXOS® computer model) that inputs data from the Western Electricity Coordinating Council (WECC)-wide Production Cost Model (PCM) to evaluate potential production cost savings. The Analytical Team was given two general options to evaluate: • Energy Imbalance Market (EIM): establishment of an automated, organized NWPP area market for economically supplying energy imbalance within the hour. • Enhanced Market-Operational Tools (EMT) that might augment or replace an EIM. The Analytical Team built on the WECC-wide PCM data from prior work done in the WECC and carried forward the evolution of the original WECC Transmission Expansion Planning Policy Committee (TEPPC) 2020 PC0 database. A large number of modifications and improvements were made to this case and the data were subjected to extensive review by the team members to improve the model representation of the Northwest (NW). MC meetings that were open to the public were held for interested parties to review and provide input to the study.
Results for the test, base, and sensitivity case studies performed by the MC Initiative Analytical Team indicate that there are a wide range of benefits that could be obtained from the operation of an EIM in the NWPP depending on what assumptions are made. The instructions from the MC were to determine a "minimum high confidence" range of potential benefits. The results for the Base Case indicate that the EIM benefits ranged from approximately $40 million to $70 million in annual savings from the operation of an EIM in the NWPP footprint. A number of additional relevant sensitivity cases were performed, including low and high water conditions, low and high natural gas prices, and various flex reserve requirements, resource operations, and amounts of resource capability held back during the preschedule period. Along with the results for the Base Case, the results for these studies yielded EIM benefits that clustered within the range of $70 million to $80 million per year, with potential benefits ranging from as little as $17 million to approximately $125 million per year. Because the design and operation of an EIM could enable participating Balancing Authorities (BAs) to collectively lower the quantity of resources they must carry to meet within-hour balancing needs, a sensitivity case was also performed to analyze the impact that such reductions might have on the benefits from an EIM. The results for this sensitivity case indicate that such reductions could increase the benefits from the operation of an EIM in the NWPP into the range of approximately $130 million to $160 million per year. Also, a sensitivity case for a WECC-wide EIM was performed with the results indicating that the potential benefits to the NWPP could increase into the range of $197 million to $233 million per year.
While there may be potential reliability benefits from the coordinated dispatch process underlying the operation of an EIM, reliability benefits from an EIM were out of the scope of this study. The EIM benefit analyses that were performed by the Analytical Team are provided in this report.

  18. Analysis of anabolic androgenic steroids in urine by full-capillary sample injection combined with a sweeping CE stacking method.

    PubMed

    Wang, Chun-Chi; Cheng, Shu-Fang; Cheng, Hui-Ling; Chen, Yen-Ling

    2013-02-01

    This study describes an on-line stacking CE approach by sweeping with whole capillary sample filling for analyzing five anabolic androgenic steroids in urine samples. The five anabolic steroids for detection were androstenedione, testosterone, epitestosterone, boldenone, and clostebol. Anabolic androgenic steroids are abused in sport doping because they can promote muscle growth. Therefore, a sensitive detection method is required for monitoring urine samples from athletes. In this research, a reliable stacking capillary electrophoresis method was established for analysis of anabolic steroids in urine. After liquid-liquid extraction by n-hexane, the supernatant was dried and reconstituted with 30 mM phosphate buffer (pH 5.00) and loaded into the capillary by hydrodynamic injection (10 psi, 99.9 s). The stacking and separation were simultaneously accomplished at -20 kV in phosphate buffer (30 mM, pH 5.0) containing 100 mM sodium dodecyl sulfate and 40 % methanol. During the method validation, calibration curves were linear (r≥0.990) over a range of 50-1,000 ng/mL for the five analytes. In the evaluation of precision and accuracy for this method, the absolute values of the RSD and the RE in the intra-day (n=3) and inter-day (n=5) analyses were all less than 6.6 %. The limit of detection for the five analytes was 30 ng/mL (S/N=5, sampling 99.9 s at 10 psi). Compared with simple MEKC, this stacking method possessed a 108- to 175-fold increase in sensitivity. This simple and sensitive stacking method could be used as a powerful tool for monitoring the illegal use of doping agents.
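    The calibration-linearity check reported above (r ≥ 0.990 over 50-1,000 ng/mL) can be sketched as follows; the peak areas are invented for illustration and are not the study's data.

```python
import numpy as np

# Sketch of a calibration-linearity check: fit a line to concentration vs.
# response and verify the correlation coefficient meets the r >= 0.990
# acceptance criterion quoted in the abstract. Areas are hypothetical.
conc = np.array([50, 100, 250, 500, 750, 1000], dtype=float)  # ng/mL
area = np.array([1.1, 2.0, 5.2, 10.3, 15.1, 20.4])            # invented responses

slope, intercept = np.polyfit(conc, area, 1)   # least-squares calibration line
r = np.corrcoef(conc, area)[0, 1]              # Pearson correlation coefficient
assert r >= 0.990, "calibration not acceptably linear"
```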

  19. Development of LSPR and SPR sensor for the detection of an anti-cancer drug for chemotherapy

    NASA Astrophysics Data System (ADS)

    Zhao, Sandy Shuo; Bolduc, Olivier R.; Colin, Damien Y.; Pelletier, Joelle N.; Masson, Jean-François

    2012-03-01

    The anti-cancer drug methotrexate (MTX), a strong inhibitor of human dihydrofolate reductase (hDHFR), has been studied in localized surface plasmon resonance (LSPR) and surface plasmon resonance (SPR) competitive binding assays with folic acid-stabilized gold nanoparticles (FA AuNP). The nanoparticles, 15 nm in diameter, were prepared in a single step and characterized by UV-Vis, FTIR, and Raman spectroscopy. An LSPR competitive binding assay between different concentrations of MTX and FA AuNP for hDHFR in solution was designed to quantify MTX using UV-Vis spectroscopy. The sensitivity of the assay was optimized with respect to the concentrations of both the enzyme and FA. The detection and quantification of spiked MTX was demonstrated in phosphate-buffered saline and in fetal bovine serum after solid-phase extraction treatment of the serum. In addition, this assay could also serve as a screening tool for potential inhibitors of hDHFR. In a complementary approach, MTX was measured in a competitive binding assay with FA AuNP for histidine-tagged hDHFR immobilized on an SPR-sensitive surface. In this case, the FA AuNP offer a secondary amplification of the analytical response, which is inversely proportional to the concentration of MTX. This alternative approach could contribute to the direct detection of MTX in complex biological fluids. A comparison of analytical parameters such as sensitivity, dynamic range, and limit of detection between the LSPR and SPR sensing platforms will also be presented. Both assays offer potential for tackling real biological samples for the purpose of monitoring and validating anti-cancer drug levels in human serum during chemotherapy.

  20. Quantification of crystalline cellulose in lignocellulosic biomass using sum frequency generation (SFG) vibration spectroscopy and comparison with other analytical methods.

    PubMed

    Barnette, Anna L; Lee, Christopher; Bradley, Laura C; Schreiner, Edward P; Park, Yong Bum; Shin, Heenae; Cosgrove, Daniel J; Park, Sunkyu; Kim, Seong H

    2012-07-01

    The non-centrosymmetry requirement of sum frequency generation (SFG) vibration spectroscopy allows the detection and quantification of crystalline cellulose in lignocellulose biomass without spectral interferences from hemicelluloses and lignin. This paper shows a correlation between the amount of crystalline cellulose in biomass and the SFG signal intensity. Model biomass samples were prepared by mixing commercially available cellulose, xylan, and lignin to defined concentrations. The SFG signal intensity was found to be sensitive over a wide range of crystallinity but varied non-linearly with the mass fraction of cellulose in the samples. This might be due to matrix effects such as light scattering and absorption by xylan and lignin, as well as the non-linear density dependence of the SFG process itself. Comparison with other techniques such as XRD, FT-Raman, FT-IR and NMR demonstrates that SFG can be a complementary and sensitive tool to assess crystalline cellulose in biomass. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Dynamic Response of a Planetary Gear System Using a Finite Element/Contact Mechanics Model

    NASA Technical Reports Server (NTRS)

    Parker, Robert G.; Agashe, Vinayak; Vijayakar, Sandeep M.

    2000-01-01

    The dynamic response of a helicopter planetary gear system is examined over a wide range of operating speeds and torques. The analysis tool is a unique, semianalytical finite element formulation that admits precise representation of the tooth geometry and contact forces that are crucial in gear dynamics. Importantly, no a priori specification of static transmission error excitation or mesh frequency variation is required; the dynamic contact forces are evaluated internally at each time step. The calculated response shows classical resonances when a harmonic of mesh frequency coincides with a natural frequency. However, peculiar behavior occurs where resonances expected to be excited at a given speed are absent. This absence of particular modes is explained by analytical relationships that depend on the planetary configuration and mesh frequency harmonic. The torque sensitivity of the dynamic response is examined and compared to static analyses. Rotation mode response is shown to be more sensitive to input torque than translational mode response.
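    The resonance condition described above, a harmonic of mesh frequency coinciding with a natural frequency, can be sketched numerically. The frequencies below are hypothetical illustrations, not the helicopter gearbox data.

```python
# Illustrative resonance screen: flag harmonics n * f_mesh that fall within a
# relative tolerance of a natural frequency. All numbers are hypothetical.

def resonant_harmonics(f_mesh, natural_freqs, n_max=5, tol=0.02):
    """Return (harmonic index, natural frequency) pairs within relative tol."""
    hits = []
    for n in range(1, n_max + 1):
        f = n * f_mesh
        for fn in natural_freqs:
            if abs(f - fn) <= tol * fn:
                hits.append((n, fn))
    return hits

f_mesh = 600.0                     # Hz, e.g. a 30-tooth gear at 20 rev/s
modes = [605.0, 1790.0, 2410.0]    # Hz, hypothetical natural frequencies
hits = resonant_harmonics(f_mesh, modes)
```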

  2. Biosensors for Non-Invasive Detection of Celiac Disease Biomarkers in Body Fluids.

    PubMed

    Pasinszki, Tibor; Krebsz, Melinda

    2018-06-16

    Celiac disease is a chronic gluten-initiated autoimmune disorder that predominantly damages the mucosa of the small intestine in genetically-susceptible individuals. It affects a large and increasing proportion of the world’s population. The diagnosis of this disease and monitoring the response of patients to the therapy, which is currently a life-long gluten-free diet, require the application of reliable, rapid, sensitive, selective, simple, and cost-effective analytical tools. Celiac disease biomarker detection in whole blood, serum, or plasma offers a non-invasive way to do this and is well-suited to being the first step of diagnosis. Biosensors provide a novel alternative to conventional techniques for biomarker sensing, in which electrode material and architecture play important roles in achieving sensitive, selective, and stable detection. There are many opportunities to build and modify biosensor platforms using various materials and detection methods, and the aim of the present review is to summarize developments in this field.

  3. Recovery of permittivity and depth from near-field data as a step toward infrared nanotomography.

    PubMed

    Govyadinov, Alexander A; Mastel, Stefan; Golmar, Federico; Chuvilin, Andrey; Carney, P Scott; Hillenbrand, Rainer

    2014-07-22

    The increasing complexity of composite materials structured on the nanometer scale requires highly sensitive analytical tools for nanoscale chemical identification, ideally in three dimensions. While infrared near-field microscopy provides high chemical sensitivity and nanoscopic spatial resolution in two dimensions, the quantitative extraction of material properties of three-dimensionally structured samples has not been achieved yet. Here we introduce a method to perform rapid recovery of the thickness and permittivity of simple 3D structures (such as thin films and nanostructures) from near-field measurements, and provide its first experimental demonstration. This is accomplished via a novel nonlinear invertible model of the imaging process, taking advantage of the near-field data recorded at multiple harmonics of the oscillation frequency of the near-field probe. Our work enables quantitative nanoscale-resolved optical studies of thin films, coatings, and functionalization layers, as well as the structural analysis of multiphase materials, among others. It represents a major step toward the further goal of near-field nanotomography.

  4. The Stanford-U.S. Geological Survey SHRIMP ion microprobe--a tool for micro-scale chemical and isotopic analysis

    USGS Publications Warehouse

    Bacon, Charles R.; Grove, Marty; Vazquez, Jorge A.; Coble, Matthew A.

    2012-01-01

    Answers to many questions in Earth science require chemical analysis of minute volumes of minerals, volcanic glass, or biological materials. Secondary Ion Mass Spectrometry (SIMS) is an extremely sensitive analytical method in which a 5–30 micrometer diameter "primary" beam of charged particles (ions) is focused on a region of a solid specimen to sputter secondary ions from 1–5 nanograms of the sample under high vacuum. The elemental abundances and isotopic ratios of these secondary ions are determined with a mass spectrometer. These results can be used for geochronology to determine the age of a region within a crystal thousands to billions of years old or to precisely measure trace abundances of chemical elements at concentrations as low as parts per billion. A partnership of the U.S. Geological Survey and the Stanford University School of Earth Sciences operates a large SIMS instrument, the Sensitive High-Resolution Ion Microprobe with Reverse Geometry (SHRIMP–RG) on the Stanford campus.

  5. Sensitivity and proportionality assessment of metabolites from microdose to high dose in rats using LC-MS/MS.

    PubMed

    Ni, Jinsong; Ouyang, Hui; Seto, Carmai; Sakuma, Takeo; Ellis, Robert; Rowe, Josh; Acheampong, Andrew; Welty, Devin; Szekely-Klepser, Gabriella

    2010-03-01

    The objective of this study was to evaluate the sensitivity requirement for LC-MS/MS as an analytical tool to characterize metabolites in plasma and urine at microdoses in rats and to investigate proportionality of metabolite exposure from a microdose of 1.67 µg/kg to a high dose of 5000 µg/kg for atorvastatin, ofloxacin, omeprazole and tamoxifen. Only the glucuronide metabolite of ofloxacin, the hydroxylation metabolite of omeprazole and the hydration metabolite of tamoxifen were characterized in rat plasma at microdose by LC-MS/MS. The exposure of detected metabolites of omeprazole and tamoxifen appeared to increase in a nonproportional manner with increasing doses. Exposure of ortho- and para-hydroxyatorvastatin, but not atorvastatin and lactone, increased proportionally with increasing doses. LC-MS/MS has demonstrated its usefulness for detecting and characterizing the major metabolites in plasma and urine at microdosing levels in rats. The exposure of metabolites at microdose could not simply be used to predict their exposure at higher doses.

  6. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  7. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  8. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.

  9. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity, whose spatial properties may vary from one part of space to another; therefore the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises the data exported from a generic BIM modelling tool. The tool would be useful for assisting a design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and in avoiding the unnecessary costs that are often caused by failure to identify problems during design development stages.
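    The slice representation described above can be sketched as a simple data structure: an ordered sequence of slices, each carrying spatial properties, scanned for abrupt changes. All names, properties, and numbers below are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of a multi-slice representation of a space.
from dataclasses import dataclass

@dataclass
class Slice:
    position: float     # distance along the space, in metres (hypothetical)
    properties: dict    # per-slice spatial properties, e.g. {"width": 3.2}

def abrupt_changes(slices, key, threshold):
    """Yield positions where a property jumps by more than threshold
    between consecutive slices."""
    for prev, cur in zip(slices, slices[1:]):
        if abs(cur.properties[key] - prev.properties[key]) > threshold:
            yield cur.position

space = [Slice(0.0, {"width": 3.0}), Slice(1.0, {"width": 3.1}),
         Slice(2.0, {"width": 5.0}), Slice(3.0, {"width": 5.1})]
jumps = list(abrupt_changes(space, "width", 0.5))   # positions of sudden jumps
```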

  10. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    PubMed

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
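    One classical form of the positivity-rate adjustment mentioned above is the Rogan-Gladen estimator, which corrects an observed rate for known test sensitivity and specificity; the numbers below are illustrative.

```python
# Rogan-Gladen estimator: correct an apparent (observed) positivity rate for
# imperfect test sensitivity and specificity.

def adjusted_prevalence(apparent, sensitivity, specificity):
    p = (apparent + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(p, 0.0), 1.0)   # clamp to the valid [0, 1] range

# If the true rate is 0.20, a test with sensitivity 0.90 and specificity 0.95
# observes 0.90*0.20 + 0.05*0.80 = 0.22; the estimator recovers 0.20.
p_hat = adjusted_prevalence(0.22, 0.90, 0.95)
```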

  11. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data

    PubMed Central

    Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria

    2017-01-01

    Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372

  12. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  13. A Meta-Analytic Review of the Role of Child Anxiety Sensitivity in Child Anxiety

    ERIC Educational Resources Information Center

    Noel, Valerie A.; Francis, Sarah E.

    2011-01-01

    Conflicting findings exist regarding (1) whether anxiety sensitivity (AS) is a construct distinct from anxiety in children and (2) the specific nature of the role of AS in child anxiety. This study uses meta-analytic techniques to (1) determine whether youth (ages 6-18 years) have been reported to experience AS, (2) examine whether AS…

  14. Analytical sensitivity of four commonly used hCG point of care devices.

    PubMed

    Kamer, Sandy M; Foley, Kevin F; Schmidt, Robert L; Greene, Dina N

    2015-04-01

    Point of care (POC) hCG assays are often used to rule out pregnancy, and therefore diagnostic sensitivity, especially at low concentrations of hCG, is important. There are very few studies in the literature that seek to verify the claimed analytical sensitivity of hCG POC devices. The analytical sensitivity of four commonly used hCG POC devices (Alere hCG Combo Cassette, ICON 20 hCG, OSOM hCG Combo Test, and Sure-Vue Serum/Urine hCG-STAT) was challenged using urine samples (n=50) selected based on quantitative hCG concentrations. The majority of these specimens (n=40) had an hCG concentration between 20 and 200 U/L. Each specimen/device combination was reviewed by three individuals. Statistical calculations were performed using Stata 12. The analytical sensitivity of the OSOM device was significantly inferior to that of the other POC devices. There was no significant difference in the sensitivity of the Alere, ICON 20 and Sure-Vue devices. There was no significant difference in the individual interpretation of the hCG POC results. All hCG POC devices evaluated in this study were susceptible to false negative results at low concentrations of urine hCG. Laboratorians and clinicians should be aware that there are limitations when using urine hCG POC devices to rule out early pregnancy. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  15. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  16. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most such data contain a spatial component, so the need arises for geovisual tools and methods. One can either develop a custom system, whose dissemination and usability may be problematic, or utilize a widespread, well-known platform. The aim of this paper is to demonstrate the applicability of Google Earth™ software as a tool for geovisual analytics that helps in understanding the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language (KML) files and visualised in Google Earth™ for geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics alone, we created a set of interactive visualisations to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk.
We show that Google Earth™ software is a usable tool for the geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available and intuitive, with space-time visualisation capabilities, animations, and means of communicating results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
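
The workflow above (analysis results exported to Keyhole Markup Language files for display in Google Earth™) can be sketched in a few lines using Python's standard library. This is only an illustration of the export step; the district name, coordinates, case count and weekly TimeSpan dates below are invented examples, not study data.

```python
# Minimal sketch of exporting time-stamped point records to KML for Google Earth.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def records_to_kml(records):
    """Build a KML document with one time-stamped Placemark per record."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    for name, lon, lat, cases, begin, end in records:
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = f"{name}: {cases} cases"
        # TimeSpan lets Google Earth animate the weekly incidence over time.
        span = ET.SubElement(pm, f"{{{KML_NS}}}TimeSpan")
        ET.SubElement(span, f"{{{KML_NS}}}begin").text = begin
        ET.SubElement(span, f"{{{KML_NS}}}end").text = end
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

kml_text = records_to_kml([
    ("Olomouc", 17.25, 49.59, 12, "2008-01-07", "2008-01-13"),
])
```

The resulting string can be saved as a `.kml` file and opened directly in Google Earth.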

  17. Fully 3D-Printed Preconcentrator for Selective Extraction of Trace Elements in Seawater.

    PubMed

    Su, Cheng-Kuan; Peng, Pei-Jin; Sun, Yuh-Chang

    2015-07-07

    In this study, we used a stereolithographic 3D printing technique and polyacrylate polymers to manufacture a solid phase extraction preconcentrator for the selective extraction of trace elements and the removal of unwanted salt matrices, enabling accurate and rapid analyses of trace elements in seawater samples when combined with a quadrupole-based inductively coupled plasma mass spectrometer. To maximize the extraction efficiency, we evaluated the effect of filling the extraction channel with ordered cuboids to improve liquid mixing. Upon automation of the system and optimization of the method, the device allowed highly sensitive and interference-free determination of Mn, Ni, Zn, Cu, Cd, and Pb, with detection limits comparable with those of most conventional methods. The system's analytical reliability was further confirmed through analyses of reference materials and spike analyses of real seawater samples. This study suggests that 3D printing can be a powerful tool for building multilayer fluidic manipulation devices, simplifying the construction of complex experimental components, and facilitating the operation of sophisticated analytical procedures for most sample pretreatment applications.

  18. Simultaneous determination of quinolones for veterinary use by high-performance liquid chromatography with electrochemical detection.

    PubMed

    Rodríguez Cáceres, M I; Guiberteau Cabanillas, A; Galeano Díaz, T; Martínez Cañas, M A

    2010-02-01

    A selective method based on high-performance liquid chromatography with electrochemical detection (HPLC-ECD) has been developed to enable simultaneous determination of three fluoroquinolones (FQs), namely danofloxacin (DANO), difloxacin (DIFLO) and sarafloxacin (SARA). The fluoroquinolones are separated on a Novapack C-18 column and detected in a high-sensitivity amperometric cell at a potential of +0.8 V. Solid-phase extraction was used to extract the analytes from real samples. The concentration ranges examined were 10-150 ng g(-1) for danofloxacin, 25-100 ng g(-1) for sarafloxacin and 50-315 ng g(-1) for difloxacin. The method has detection limits under 10 ng g(-1), and recoveries around 90% were obtained for the three analytes in experiments with fortified samples. This HPLC-ECD approach can be useful in the routine analysis of antibacterial residues, as it is less expensive and less complicated than more powerful tools such as hyphenated techniques.
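
Working ranges and detection limits of this kind rest on an ordinary least-squares calibration curve. A minimal sketch, with hypothetical concentrations and detector responses (not the paper's data), using the common detection-limit estimate LOD = 3·s(blank)/slope:

```python
# Linear calibration fit of the kind underlying the reported working ranges.
from statistics import mean

def linear_fit(x, y):
    """Ordinary least squares: return (slope, intercept)."""
    mx, my = mean(x), mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

conc = [10, 50, 100, 150]        # ng/g, a danofloxacin-like range
signal = [0.8, 4.0, 8.0, 12.0]   # arbitrary detector units (hypothetical)
slope, intercept = linear_fit(conc, signal)

# A common detection-limit estimate: LOD = 3 * s_blank / slope,
# with s_blank the standard deviation of blank measurements (assumed here).
s_blank = 0.2
lod = 3 * s_blank / slope        # ng/g
```

With these invented numbers the fitted slope is 0.08 and the LOD comes out at 7.5 ng/g, i.e. under the 10 ng/g level cited in the abstract.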

  19. Diagnostic accuracy of HLA-B*57:01 screening for the prediction of abacavir hypersensitivity and clinical utility of the test: a meta-analytic review.

    PubMed

    Cargnin, Sarah; Jommi, Claudio; Canonico, Pier Luigi; Genazzani, Armando A; Terrazzino, Salvatore

    2014-05-01

    To determine diagnostic accuracy of HLA-B*57:01 testing for prediction of abacavir-induced hypersensitivity and to quantify the clinical benefit of pretreatment screening through a meta-analytic review of published studies. A comprehensive search was performed up to June 2013. The methodological quality of relevant studies was assessed by the QUADAS-2 tool. The pooled diagnostic estimates were calculated using a random effect model. Despite the presence of heterogeneity in sensitivity or specificity estimates, the pooled diagnostic odds ratio to detect abacavir-induced hypersensitivity on the basis of clinical criteria was 33.07 (95% CI: 22.33-48.97, I(2): 13.9%), while diagnostic odds ratio for detection of immunologically confirmed abacavir hypersensitivity was 1141 (95% CI: 409-3181, I(2): 0%). Pooled analysis of risk ratio showed that prospective HLA-B*57:01 testing significantly reduced the incidence of abacavir-induced hypersensitivity. This meta-analysis demonstrates an excellent diagnostic accuracy of HLA-B*57:01 testing to detect immunologically confirmed abacavir hypersensitivity and corroborates existing recommendations.
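
The pooled estimates above are built from per-study diagnostic odds ratios. As a worked example of that building block, here is a single-study DOR with a Woolf-type 95% confidence interval; the 2x2 counts are hypothetical, not taken from the meta-analysis:

```python
# Diagnostic odds ratio for one 2x2 study with a log-scale (Woolf) 95% CI.
import math

def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP*TN)/(FP*FN); CI from the standard error of log(DOR)."""
    dor = (tp * tn) / (fp * fn)
    se_log = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)
    lo = math.exp(math.log(dor) - 1.96 * se_log)
    hi = math.exp(math.log(dor) + 1.96 * se_log)
    return dor, (lo, hi)

dor, ci = diagnostic_odds_ratio(tp=45, fp=5, fn=5, tn=45)
```

A random-effects meta-analysis then pools the per-study log(DOR) values weighted by their variances.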

  20. Current trends in nanobiosensor technology

    PubMed Central

    Wu, Diana; Langer, Robert S

    2014-01-01

    The development of tools and processes used to fabricate, measure, and image nanoscale objects has led to a wide range of work devoted to producing sensors that interact with extremely small numbers (or an extremely small concentration) of analyte molecules. These advances are particularly exciting in the context of biosensing, where the demands for low concentration detection and high specificity are great. Nanoscale biosensors, or nanobiosensors, provide researchers with an unprecedented level of sensitivity, often to the single molecule level. The use of biomolecule-functionalized surfaces can dramatically boost the specificity of the detection system, but can also yield reproducibility problems and increased complexity. Several nanobiosensor architectures based on mechanical devices, optical resonators, functionalized nanoparticles, nanowires, nanotubes, and nanofibers have been demonstrated in the lab. As nanobiosensor technology becomes more refined and reliable, it is likely that it will eventually make its way from the lab to the clinic, where future lab-on-a-chip devices incorporating an array of nanobiosensors could be used for rapid screening of a wide variety of analytes at low cost using small samples of patient material. PMID:21391305

  1. Vanishing tattoo multi-sensor for biomedical diagnostics

    NASA Astrophysics Data System (ADS)

    Moczko, E.; Meglinski, I.; Piletsky, S.

    2008-04-01

    Currently, precise non-invasive diagnostic systems for the real-time detection and monitoring of multiple physiological parameters and chemical analytes in the human body are urgently required by clinicians, physiologists and biomedical researchers. We have developed a novel, cost-effective smart 'vanishing tattoo' (similar to children's temporary tattoos) consisting of environment-sensitive dyes. Painlessly impregnated into the skin, the smart tattoo is capable of generating optical/fluorescence changes (absorbance, transmission, reflectance, emission and/or luminescence within the UV, VIS or NIR regions) in response to physical or chemical changes. These changes allow the identification of colour patterns, similar to bar-code scanning. Such a system allows easy, cheap and robust detection of various parameters and analytes in a small sample volume (e.g. variations in pH, temperature, ionic strength, solvent polarity, or the presence of redox species, surfactants and oxygen). These smart tattoos have possible applications in monitoring the progress of disease and transcutaneous drug delivery. The potential of this highly innovative diagnostic tool is wide and diverse, with possible impact on routine clinical diagnostics, general therapeutic management, skin care and cosmetic product testing, as well as fundamental physiological investigations.

  2. Vanishing "tattoo" multisensor for biomedical diagnostics

    NASA Astrophysics Data System (ADS)

    Moczko, E.; Meglinski, I.; Piletsky, S.

    2008-02-01

    Currently, precise non-invasive diagnostic systems for the real-time detection and monitoring of multiple physiological parameters and chemical analytes in the human body are urgently required by clinicians, physiologists and biomedical researchers. We have developed a novel, cost-effective smart 'vanishing tattoo' (similar to children's temporary tattoos) consisting of environment-sensitive dyes. Painlessly impregnated into the skin, the smart tattoo is capable of generating optical/fluorescence changes (absorbance, transmission, reflectance, emission and/or luminescence within the UV, VIS or NIR regions) in response to physical or chemical changes. These changes allow the identification of colour patterns, similar to bar-code scanning. Such a system allows easy, cheap and robust detection of various parameters and analytes in a small sample volume (e.g. variations in pH, temperature, ionic strength, solvent polarity, or the presence of redox species, surfactants and oxygen). These smart tattoos have possible applications in monitoring the progress of disease and transcutaneous drug delivery. The potential of this highly innovative diagnostic tool is wide and diverse, with possible impact on routine clinical diagnostics, general therapeutic management, skin care and cosmetic product testing, as well as fundamental physiological investigations.

  3. Rapid and sensitive insulated isothermal PCR for point-of-need feline leukaemia virus detection.

    PubMed

    Wilkes, Rebecca P; Anis, Eman; Dunbar, Dawn; Lee, Pei-Yu A; Tsai, Yun-Long; Lee, Fu-Chun; Chang, Hsiao-Fen G; Wang, Hwa-Tang T; Graham, Elizabeth M

    2018-04-01

    Objectives Feline leukaemia virus (FeLV), a gammaretrovirus, causes diseases of the feline haematopoietic system that are invariably fatal. Rapid and accurate testing at the point-of-need (PON) supports prevention of virus spread and management of clinical disease. This study evaluated the performance of an insulated isothermal PCR (iiPCR) that detects proviral DNA, and a reverse transcription (RT)-iiPCR that detects both viral RNA and proviral DNA, for FeLV detection at the PON. Methods Mycoplasma haemofelis, feline coronavirus, feline herpesvirus, feline calicivirus and feline immunodeficiency virus were used to test analytical specificity. In vitro transcribed RNA, an artificial plasmid, FeLV strain American Type Culture Collection VR-719 and a clinical FeLV isolate were used in the analytical sensitivity assays. A retrospective study including 116 clinical plasma and serum samples that had been tested with virus isolation, real-time PCR and ELISA, and a prospective study including 150 clinical plasma and serum samples, were implemented to evaluate the clinical performance of the iiPCR-based methods for FeLV detection. Results The 95% assay limits of detection were calculated to be 16 RNA and five DNA copies for the RT-iiPCR, and six DNA copies for the iiPCR. Both reactions had analytical sensitivity comparable to a reference real-time PCR (qPCR) and did not detect the five non-target feline pathogens. The clinical performance of the RT-iiPCR and iiPCR had 98.82% agreement (kappa (κ) = 0.97) and 100% agreement (κ = 1.0), respectively, with the qPCR (n = 85). The agreement between an automatic nucleic acid extraction/RT-iiPCR system and virus isolation for detecting FeLV in plasma or serum was 95.69% (κ = 0.95) and 98.67% (κ = 0.85) in a retrospective (n = 116) and a prospective (n = 150) study, respectively. Conclusions and relevance These results suggest that both the RT-iiPCR and iiPCR assays can serve as reliable tools for PON FeLV detection.
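
The agreement figures quoted (e.g. κ = 0.97 versus qPCR) are Cohen's kappa values. A minimal sketch of the calculation for a 2x2 comparison of two tests, with hypothetical counts rather than the study's data:

```python
# Cohen's kappa for agreement between two binary tests on the same samples.
def cohens_kappa(both_pos, both_neg, only_a, only_b):
    n = both_pos + both_neg + only_a + only_b
    p_observed = (both_pos + both_neg) / n
    # Chance agreement from the marginal positive rates of each test.
    a_pos = (both_pos + only_a) / n
    b_pos = (both_pos + only_b) / n
    p_chance = a_pos * b_pos + (1 - a_pos) * (1 - b_pos)
    return (p_observed - p_chance) / (1 - p_chance)

kappa = cohens_kappa(both_pos=40, both_neg=50, only_a=5, only_b=5)
```

Kappa corrects the raw percentage agreement for the agreement expected by chance, which is why 98-100% raw agreement can map to κ values below 1.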

  4. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  5. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  6. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  7. An analytical strategy to investigate Semen Strychni nephrotoxicity based on simultaneous HILIC-ESI-MS/MS detection of Semen Strychni alkaloids, tyrosine and tyramine in HEK 293t cell lysates.

    PubMed

    Gu, Liqiang; Hou, Pengyi; Zhang, Ruowen; Liu, Ziying; Bi, Kaishun; Chen, Xiaohui

    2016-10-15

    A previous metabolomics study demonstrated that tyrosine metabolism might be disrupted by treatment with Semen Strychni in a cell nephrotoxicity model. To investigate the relationship between Semen Strychni alkaloids (SAs) and endogenous tyrosine and tyramine under nephrotoxic conditions, an HILIC-ESI-MS/MS-based analytical strategy was applied in this study. Based on the established Semen Strychni nephrotoxicity cell model, strychnine and brucine were identified and screened as the main SAs by an HPLC-Q Exactive hybrid quadrupole-Orbitrap mass system. Then, a sensitive HILIC-ESI-MS/MS method was developed to simultaneously monitor strychnine, brucine, tyrosine and tyramine in cell lysate. The analytes were separated on a Shiseido CAPCELL CORE PC (150 mm × 2.1 mm, 2.7 μm) HILIC column in an acetonitrile/0.1% formic acid gradient system. All the calibration curves were linear, with regression coefficients above 0.9924. The absolute recoveries were more than 80.5% and the matrix effects were between 91.6% and 107.0%. With the developed method, the analytes were successfully determined in cell lysates. Decreased levels of tyrosine and tyramine were observed only in combination with increased levels of SAs, indicating that the disturbance of tyrosine metabolism might be induced by the accumulation of SAs in kidney cells after exposure to Semen Strychni. The HILIC-ESI-MS/MS-based analytical strategy is a useful tool for revealing the relationships between toxic herb components and endogenous metabolite profiles in toxicity investigations of herbal medicines.

  8. Surface-enhanced Raman spectroscopy for the detection of pathogenic DNA and protein in foods

    NASA Astrophysics Data System (ADS)

    Chowdhury, Mustafa H.; Atkinson, Brad; Good, Theresa; Cote, Gerard L.

    2003-07-01

    Traditional Raman spectroscopy, while extremely sensitive to structure and conformation, is an ineffective tool for the detection of bioanalytes at the sub-millimolar level. Surface-enhanced Raman spectroscopy (SERS) is a more recently developed technique that has been used with considerable success to enhance the Raman cross-section of a molecule by factors of 10(6) to 10(14). This technique can be exploited in a nanoscale biosensor for the detection of pathogenic proteins and DNA in foods by using a biorecognition molecule to bring a target analyte into close proximity to the metal surface. This is expected to produce a SERS signal of the target analyte, making it possible to discriminate easily between the target analyte and possible confounders. For the sensor to be effective, the Raman spectrum of the target analyte must be distinct from that of the biorecognition molecule, as both are in close proximity to the metal surface and thus subject to the SERS effect. In our preliminary studies we successfully used citrate-reduced silver colloidal particles to obtain unique SERS spectra of α-helical and β-sheet bovine serum albumin (BSA), which served as models of an α-helical antibody (biorecognition element) and a β-sheet target protein (pathogenic prion). In addition, unique SERS spectra of double-stranded and single-stranded DNA were obtained, where the single-stranded DNA served as the model for the biorecognition element and the double-stranded DNA served as the model for the DNA probe/target hybrid. This confirms the feasibility of the method, which opens opportunities for potentially widespread applications in the detection of food pathogens, biowarfare agents, and other bioanalytes.

  9. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics; it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extraction and visualization of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
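
The proximity-based extraction described (linking a spatial resolution to an accuracy figure mentioned nearby) can be illustrated with simple pattern matching. The regular expressions below are illustrative assumptions, not the project's actual Tika/Solr/DeepDive pipeline:

```python
# Sketch: extract a resolution and an accuracy range that co-occur in a passage.
import re

passage = ("In this study, hyperspectral images with high spatial resolution "
           "(1 m) were analyzed to detect cutleaf teasel in two areas. "
           "Classification of cutleaf teasel reached a user's accuracy "
           "of 82 to 84%.")

resolution = re.search(r"\((\d+(?:\.\d+)?)\s*m\)", passage)
accuracy = re.search(r"accuracy\s+of\s+(\d+)\s*to\s*(\d+)%", passage)

finding = None
if resolution and accuracy:
    # Pair the two facts only when both appear in the same passage.
    finding = {"resolution_m": float(resolution.group(1)),
               "accuracy_pct": (int(accuracy.group(1)), int(accuracy.group(2)))}
```

A production system would add entity normalization and confidence scoring, but the core idea of co-occurrence-based relationship extraction is the same.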

  10. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model, and it was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.

  11. Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.

    PubMed

    Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu

    2011-06-01

    Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid products in the pharmaceutical industry. To improve process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscopy (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor the antisolvent crystallization of sodium scutellarein. FBRM was used to monitor the chord count and chord length distribution of sodium scutellarein particles in the crystallizer, and PVM, as an in-line video camera, provided pictures imaging particle shape and dimension. In addition, a quantitative PLS model was established by in-line NIRS to detect the concentration of sodium scutellarein in the solvent, and good calibration statistics were obtained (r(2) = 0.976), with a residual predictive deviation value of 11.3. The discussion of the sensitivities, strengths, and weaknesses of the PAT tools may be helpful in the selection of suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understanding and monitoring the antisolvent crystallization process.
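
The residual predictive deviation (RPD) reported for the NIR calibration (11.3) is the ratio of the standard deviation of the reference values to the prediction error (RMSEP). A minimal sketch with invented concentrations, not the study's measurements:

```python
# RPD = standard deviation of reference values / root-mean-square error of prediction.
from math import sqrt
from statistics import stdev

def rpd(reference, predicted):
    residuals = [r - p for r, p in zip(reference, predicted)]
    rmsep = sqrt(sum(e * e for e in residuals) / len(residuals))
    return stdev(reference) / rmsep

value = rpd(reference=[1.0, 2.0, 3.0, 4.0, 5.0],
            predicted=[1.1, 2.1, 3.1, 4.1, 5.1])
```

Higher RPD means the model's prediction error is small relative to the natural spread of the data; values above roughly 10, like the 11.3 reported, indicate a very precise calibration.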

  12. Analytical performance of a bronchial genomic classifier.

    PubMed

    Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean

    2016-02-26

    The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow-up procedures that are invasive and often unnecessary due to the high benign rate of such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015; BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity, defined as input RNA mass; analytical specificity (i.e. potentially interfering substances), as tested on blood and genomic DNA; and assay performance, including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C, with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer-positive and cancer-negative samples mixed with either blood (up to 10% input mass) or genomic DNA (up to 10% input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to the Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on a >6-unit scale). The analytical sensitivity, analytical specificity, and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.

  13. Spectroelectrochemical Sensors: New Polymer Films for Improved Sensitivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Laura K.; Seliskar, Carl J.; Bryan, Samuel A.

    2014-10-31

    The selectivity of an optical sensor can be improved by combining optical detection with electrochemical oxidation or reduction of the target analyte to change its spectral properties. The changing signal can distinguish the analyte from species with similar spectral properties that would otherwise interfere. The analyte is detected by measuring the intensity of the electrochemically modulated signal. In one form this spectroelectrochemical sensor consists of an optically transparent electrode (OTE) coated with a film that preconcentrates the target analyte. The OTE functions as an optical waveguide for attenuated total reflectance (ATR) spectroscopy, which detects the analyte by absorption. Sensitivity relies in part on a large change in molar absorptivity between the two oxidation states used for electrochemical modulation of the optical signal. A critical part of the sensor is the ion-selective film. It should preconcentrate the analyte and exclude some interferences. At the same time the film must not interfere with the electrochemistry or the optical detection. Therefore, since the debut of the sensor concept, one major focus of our group has been developing appropriate films for different analytes. Here we report the development of a series of quaternized poly(vinylpyridine)-co-styrene (QPVP-co-S) anion exchange films for use in spectroelectrochemical sensors to enable sensitive detection of target anionic analytes in complex samples. The films were either 10% or 20% styrene and were prepared with varying degrees of quaternized pyridine groups, up to 70%. Films were characterized with respect to thickness with spectroscopic ellipsometry, degree of quaternization with FTIR, and electrochemically and spectroelectrochemically using the anions ferrocyanide and pertechnetate.

  14. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied which weather variables most affect consequence analysis of chemical leaks, from the user side of offsite consequence analysis (OCA) tools. We used the OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using dummy regression analysis in the Statistical Package for the Social Sciences (SPSS) program. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability and was more sensitive to air temperature than to wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more influential in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, when using the ALOHA tool instead of the KORA tool in rural conditions, users should take care that input errors in the weather variables, above all atmospheric stability, do not cause differences in impact distance.
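
Dummy regression of the kind run in SPSS can be sketched as follows: the categorical weather variable (here, a two-level atmospheric stability class) is dummy-coded alongside a continuous predictor (air temperature) and fitted by least squares. All numbers below are synthetic illustrations, not the study's data:

```python
# Dummy-coded least-squares regression of impact distance on weather variables.
import numpy as np

temp = np.array([10.0, 15.0, 20.0, 25.0, 10.0, 15.0, 20.0, 25.0])  # deg C
stable = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # dummy: 1 = stable atmosphere
# Synthetic, noise-free response so the coefficients are exactly recoverable:
distance = 100.0 + 2.0 * temp + 30.0 * stable

# Design matrix: intercept column, continuous predictor, dummy variable.
X = np.column_stack([np.ones_like(temp), temp, stable])
coef, *_ = np.linalg.lstsq(X, distance, rcond=None)
# coef[2] (the dummy coefficient) is the shift in impact distance attributable
# to the stability class, holding temperature fixed.
```

Comparing the magnitudes of the standardized coefficients across variables is what ranks atmospheric stability as the most sensitive input.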

  15. Application of Raman Spectroscopy for Nondestructive Evaluation of Composite Materials

    NASA Technical Reports Server (NTRS)

    Washer, Glenn A.; Brooks, Thomas M. B.; Saulsberry, Regor

    2007-01-01

    This paper will present an overview of efforts to investigate the application of Raman spectroscopy for the characterization of Kevlar materials. Raman spectroscopy is a laser-based technique that is sensitive to molecular interactions in materials such as the Kevlar, graphite and carbon used in composite materials. The overall goal of the research reported here is to evaluate Raman spectroscopy as a potential nondestructive evaluation (NDE) tool for the detection of stress rupture in Kevlar composite overwrapped pressure vessels (COPVs). Characterization of the Raman spectra of Kevlar yarn and strands will be presented and compared with analytical models from the literature. Results of testing to investigate the effects of creep and high-temperature aging on the Raman spectra will also be presented.

  16. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as Monte Carlo analysis capability, are included to enable statistical performance evaluations.

  17. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  18. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  19. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but rather support, healthcare transformation. PMID:25123736

  20. Application of next generation sequencing toward sensitive detection of enteric viruses isolated from celery samples as an example of produce.

    PubMed

    Yang, Zhihui; Mammel, Mark; Papafragkou, Efstathia; Hida, Kaoru; Elkins, Christopher A; Kulka, Michael

    2017-11-16

    Next generation sequencing (NGS) holds promise as a single application for both detection and sequence identification of foodborne viruses; however, technical challenges remain due to anticipated low quantities of virus in contaminated food. In this study, with a focus on data analysis using several bioinformatics tools, we applied NGS toward amplification-independent detection and identification of norovirus at low copy (<10^3 copies) or within multiple strains from produce. Celery samples were inoculated with human norovirus (stool suspension) either as a single norovirus strain, a mixture of strains (GII.4 and GII.6), or a mixture of different species (hepatitis A virus and norovirus). Viral RNA isolation and recovery was confirmed by RT-qPCR, and optimized for library generation and sequencing without amplification using the Illumina MiSeq platform. Extracts containing either a single virus or a two-virus mixture were analyzed using two different analytic approaches to achieve virus detection and identification. First, an overall assessment of viral genome coverage for samples varying in copy number (1.1×10^3 to 1.7×10^7) and genomic content (single or multiple strains in various ratios) was completed by reference-guided mapping. Not unexpectedly, this targeted approach to identification was successful in correctly mapping reads, thus identifying each virus contained in the inocula even at low copy (estimated at 12 copies). For the second (metagenomic) approach, samples were treated as "unknowns" for data analyses using (i) a sequence-based alignment with a local database, (ii) an "in-house" k-mer tool, (iii) a commercially available metagenomics bioinformatic analysis platform, CosmosID, and (iv) the open-source program Kraken. Of the four metagenomics tools applied in this study, only the local database alignment and in-house k-mer tool were successful in detecting norovirus (as well as HAV) at low copy (down to <10^3 copies) and within a mixture of virus strains or species. The results of this investigation provide support for continued investigation into the development and integration of these analytical tools for identification and detection of foodborne viruses. Published by Elsevier B.V.
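The "in-house" k-mer tool is not described in detail in the abstract; a minimal sketch of the general k-mer matching idea it names, with hypothetical reference sequences and thresholds, might look like:

```python
def kmers(seq, k=8):
    # All overlapping substrings of length k in the sequence.
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(references, k=8):
    # Map each reference name to its k-mer set.
    return {name: kmers(seq, k) for name, seq in references.items()}

def classify(read, index, k=8, min_hits=3):
    # Assign the read to the reference sharing the most k-mers,
    # or None if no reference reaches the hit threshold.
    read_kmers = kmers(read, k)
    best, hits = None, 0
    for name, ref_kmers in index.items():
        shared = len(read_kmers & ref_kmers)
        if shared > hits:
            best, hits = name, shared
    return best if hits >= min_hits else None

# Hypothetical reference genomes (real tools index full genomes).
refs = {
    "virusA": "ACGTACGTTAGCGGATCCAAGTTCGATCGGA",
    "virusB": "TTGACCGGTTAACCGGTATATGCGCGCATAT",
}
index = build_index(refs)
hit = classify("TAGCGGATCCAAGTTCGATC", index)   # substring of virusA
```

Exact set intersection over short k-mers is the simplest form of the approach; production classifiers like Kraken use much larger k and compressed indexes, but the detection logic is the same in spirit.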

  1. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  2. Eliciting expert opinion for economic models: an applied example.

    PubMed

    Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward

    2007-01-01

    Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
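For Bernoulli-process parameters like those in the model described, elicited opinions are commonly encoded as Beta distributions and then sampled in probabilistic sensitivity analysis. The sketch below is illustrative only: the mode/equivalent-sample-size conversion and all numbers are assumptions, not the authors' instrument.

```python
import random

def beta_from_mode_and_certainty(mode, n_equiv):
    # Convert an expert's most-likely value (the mode) and an "equivalent
    # sample size" expressing confidence into Beta(a, b) parameters,
    # using mode = (a - 1) / (a + b - 2).
    a = mode * (n_equiv - 2) + 1
    b = (1 - mode) * (n_equiv - 2) + 1
    return a, b

a, b = beta_from_mode_and_certainty(mode=0.7, n_equiv=20)

# Probabilistic sensitivity analysis: draw the parameter many times.
rng = random.Random(42)
draws = [rng.betavariate(a, b) for _ in range(5000)]
mean = sum(draws) / len(draws)
```

Feeding such draws through the economic model, rather than a single point estimate, is what turns elicited uncertainty into a distribution over model outputs.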

  3. Sensor arrays for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Freund, Michael S. (Inventor); Lewis, Nathan S. (Inventor)

    2000-01-01

    A sensor array for detecting an analyte in a fluid, comprising at least first and second chemically sensitive resistors electrically connected to an electrical measuring apparatus, wherein each of the chemically sensitive resistors comprises a mixture of a nonconductive material and a conductive material. Each resistor provides an electrical path through the mixture of nonconductive and conductive materials. The resistors also exhibit a different resistance when contacted with a fluid comprising an analyte at a first concentration than when contacted with the analyte at a second, different concentration. A broad range of analytes can be detected using the sensors of the present invention. Examples of such analytes include, but are not limited to, alkanes, alkenes, alkynes, dienes, alicyclic hydrocarbons, arenes, alcohols, ethers, ketones, aldehydes, carbonyls, carbanions, polynuclear aromatics, organic derivatives, biomolecules, sugars, isoprenes, isoprenoids and fatty acids. Moreover, applications for the sensors of the present invention include, but are not limited to, environmental toxicology, remediation, biomedicine, material quality control, food monitoring and agricultural monitoring.

  4. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery® embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  5. Polymeric assemblies for sensitive colorimetric assays

    DOEpatents

    Charych, Deborah

    2000-01-01

    The presently claimed invention relates to polymeric assemblies which visibly change color in the presence of an analyte. In particular, it relates to liposomes comprising a plurality of lipid monomers, each of which comprises a polymerizable group, a hydrophilic head group and a hydrophobic tail group, together with one or more ligands. Overall carbon chain length and the positioning of the polymerizable group on the monomer influence the sensitivity of the color change to analyte concentration.

  6. Semiquantitative Multiplexed Tandem PCR for Detection and Differentiation of Four Theileria orientalis Genotypes in Cattle

    PubMed Central

    Perera, Piyumali K.; Gasser, Robin B.; Firestone, Simon M.; Smith, Lee; Roeber, Florian

    2014-01-01

    Oriental theileriosis is an emerging, tick-borne disease of bovines in the Asia-Pacific region and is caused by one or more genotypes of the Theileria orientalis complex. This study aimed to establish and validate a multiplexed tandem PCR (MT-PCR) assay using three distinct markers (major piroplasm surface protein, 23-kDa piroplasm membrane protein, and the first internal transcribed spacer of nuclear DNA), for the simultaneous detection and semiquantification of four genotypes (Buffeli, Chitose, Ikeda, and type 5) of the T. orientalis complex. Analytical specificity, analytical sensitivity, and repeatability of the established MT-PCR assay were assessed in a series of experiments. Subsequently, the assay was evaluated using 200 genomic DNA samples collected from cattle from farms on which oriental theileriosis outbreaks had occurred, and 110 samples from a region where no outbreaks had been reported. The results showed the MT-PCR assay specifically and reproducibly detected the expected genotypes (i.e., genotypes Buffeli, Chitose, Ikeda, and type 5) of the T. orientalis complex, reliably differentiated them, and was able to detect as little as 1 fg of genomic DNA from each genotype. The diagnostic specificity and sensitivity of the MT-PCR were estimated at 94.0% and 98.8%, respectively. The MT-PCR assay established here is a practical and effective diagnostic tool for the four main genotypes of T. orientalis complex in Australia and should assist studies of the epidemiology and pathophysiology of oriental theileriosis in the Asia-Pacific region. PMID:25339402
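The diagnostic sensitivity and specificity figures reported above are straightforward confusion-matrix ratios. The sketch below uses hypothetical 2x2 counts chosen only to reproduce the reported 98.8%/94.0% values; the paper's actual counts are not given here.

```python
def diagnostic_performance(tp, fn, tn, fp):
    # Diagnostic sensitivity: fraction of truly infected samples detected.
    # Diagnostic specificity: fraction of uninfected samples testing negative.
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts (not from the paper) yielding ~98.8% and 94.0%.
sens, spec = diagnostic_performance(tp=166, fn=2, tn=141, fp=9)
```

Validation studies of assays like this MT-PCR estimate these two proportions against a reference standard, typically with confidence intervals on each.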

  7. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).

  8. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585

  9. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, Corona/lens-based-collection provides a new path to achieve a high sensitivity. An active-matrix-based analyte collection approach referred to as "airborne analyte memory chip/recorder" is demonstrated, which takes and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. 
We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to study mineralogical and geological samples, we also anticipate that researchers from other research areas such as Material and Environmental Sciences may benefit from this review.

  11. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd...inventory level of 1K Polysiloxane in support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing...Chapter I. C. CONCLUSIONS As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane

  12. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14...by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology. Any opinions, findings and

  13. Visualization and Analytics Software Tools for Peregrine System

    Science.gov Websites

    R is a language and environment for statistical computing and graphics; go to the R web site for more information. FastX provides remote visualization for OpenGL-based applications; for more information, please go to the FastX page. ParaView, an open…

  14. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for control... the ... sensors . We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry

  15. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Study Team Working Paper 3: Research Methods Discussion for the Study Team Methods229 Generating Empirical Materials In grounded theory ... research I have conducted using these methods . UNCLASSIFIED Analytical Tools for the Application of Operational Culture: A Case Study in the...Survey and a Case Study ,‖ Kjeller, Norway: FFI Glaser, B. G. & Strauss, A. L. (1967). ―The discovery of grounded theory

  16. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  17. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  18. A spline-based approach for computing spatial impulse responses.

    PubMed

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.
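The paper's central representation, a response expressed as a sum of polynomial functions, is indeed straightforward to implement. The sketch below is a generic piecewise-polynomial evaluator; the segment windows and coefficients are made-up placeholders, not DELFI's derived terms.

```python
def horner(coeffs, t):
    # Evaluate a polynomial (coefficients ordered highest power first) at t.
    acc = 0.0
    for c in coeffs:
        acc = acc * t + c
    return acc

def response(poly_terms, t):
    # Sum the polynomial segments active at time t; each segment is
    # ((t_start, t_end), coefficients) and is evaluated in local time.
    total = 0.0
    for (t0, t1), coeffs in poly_terms:
        if t0 <= t < t1:
            total += horner(coeffs, t - t0)
    return total

# Hypothetical two-segment response: a linear ramp overlapping a parabola.
terms = [((0.0, 1.0), [2.0, 0.0]),         # 2*(t - 0)
         ((0.5, 2.0), [-1.0, 0.0, 1.0])]   # -(t - 0.5)**2 + 1
val = response(terms, 0.75)
```

Because each term is a plain polynomial, evaluation reduces to Horner's rule, which is part of why such a formulation is computationally cheap compared with numerically integrating over the aperture.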

  19. Law enforcement tools available at the Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofstetter, K.J.

    A number of nuclear technologies developed and applied at the Savannah River Site in support of nuclear weapons material production and environmental remediation can be applied to problems in law enforcement. Techniques and equipment for high-sensitivity analyses of samples are available to identify and quantify trace elements and establish origins and histories of forensic evidence removed from crime scenes. While some of these capabilities are available at local crime laboratories, state-of-the-art equipment and breakthroughs in analytical techniques are continually being developed at DOE laboratories. Extensive experience with the handling of radioactive samples at the DOE labs minimizes the chances of cross-contamination of evidence received from law enforcement. In addition to high-sensitivity analyses, many of the field techniques developed for use in a nuclear facility can assist law enforcement personnel in detecting illicit materials and operations, in retrieving pertinent evidence and in surveying crime scenes. Some of these tools include chemical sniffers, hand-held detectors, thermal imaging, etc. In addition, mobile laboratories can be deployed to a crime scene to provide field screening of potential evidence. A variety of portable sensors can be deployed on vehicle, aerial, surface or submersible platforms to assist in the location of pertinent evidence or illicit operations. Several specific nuclear technologies available to law enforcement and their potential uses are discussed.

  20. Trace element study in scallop shells by laser ablation ICP-MS: the example of Ba/Ca ratios

    NASA Astrophysics Data System (ADS)

    Lorrain, A.; Pécheyran, C.; Paulet, Y.-M.; Chauvaud, L.; Amouroux, D.; Krupp, E.; Donard, O.

    2003-04-01

    As scallop shells grow incrementally at a rate of one line per day, environmental changes can be evidenced on a daily basis. As an example for trace element incorporation studies, barium is a geochemical tracer that can be directly related to oceanic primary productivity. Hence, monitoring Ba/Ca variations in a scallop shell should give information about the phytoplanktonic events encountered day by day during its life. The very high spatial resolution (typically 40-200 µm) and the high elemental sensitivity required can only be achieved by laser ablation coupled to inductively coupled plasma mass spectrometry. This study demonstrates that laser ablation ICP-MS is a relevant tool for high-resolution measurement of trace element distributions in a calcite matrix. Single-line rastering and calcium normalisation were found to be the best analytical conditions in terms of reproducibility and sensitivity. Knowledge of the daily periodicity of P. maximus growth rings, combined with LA-ICP-MS microanalysis, allows the acquisition of time-dated profiles with high spatial and thus temporal resolution. This resolution makes P. maximus a potential tool for environmental reconstruction and especially for accurate calibration of proxies. However, the relations between Ba/Ca peaks and phytoplanktonic events differed among animals, and some inter-annual discrepancies complicate the interpretation.
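Because each growth line corresponds to one day, an increment-indexed Ba/Ca series can be turned into a calendar-dated profile by counting back from the sampling date. A minimal sketch, with invented ratios and an invented harvest date:

```python
from datetime import date, timedelta

def date_profile(ba_ca_per_increment, harvest_day):
    # Each growth increment represents one day; the outermost (last)
    # increment corresponds to the harvest date, so count backwards.
    n = len(ba_ca_per_increment)
    return [(harvest_day - timedelta(days=n - 1 - i), ratio)
            for i, ratio in enumerate(ba_ca_per_increment)]

# Hypothetical LA-ICP-MS Ba/Ca ratios, innermost to outermost increment.
ratios = [1.2, 1.3, 4.8, 5.1, 1.4]
profile = date_profile(ratios, date(2003, 4, 30))
peak_day = max(profile, key=lambda p: p[1])[0]   # date of the Ba/Ca peak
```

Dating the Ba/Ca peak this way is what allows it to be compared against independently observed phytoplankton bloom dates.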

  1. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than did development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  2. Analytical advantages of copolymeric microspheres for fluorimetric sensing - tuneable sensitivity sensors and titration agents.

    PubMed

    Stelmach, Emilia; Maksymiuk, Krzysztof; Michalska, Agata

    2017-01-15

    The analytical benefits of copolymeric microspheres containing different numbers of carboxylic acid mers have been studied using acrylate copolymers as an example. These structures can be used as reagents in heterogeneous pH titration, benefiting from the different numbers of reactive groups, i.e. different titrant concentrations, within the series of copolymers. Thus, by introducing the same amount of different microspheres from the series into the sample, different amounts of titrant are introduced. Copolymeric microspheres can also be used as optical sensors; in this respect the increasing number of reactive groups in the series is useful for improving the analytical performance of microprobes, i.e. the sensitivity of determination and/or the response range. The increase in ion-permeability of the spheres with increasing number of reactive mers is advantageous. It is shown that for pH-sensitive microspheres containing a higher number of carboxyl groups, a higher sensitivity for alkaline pH samples is observed for an indicator present in the beads. The significant increase of optical responses is related to enhanced ion transport within the microspheres. For the zinc and potassium ion model sensors tested, it was shown that by choice of pH conditions and type of microspheres from the series, the optical responses can be tuned: to enhance sensitivity to analyte concentration change, and to change the response pattern from sigmoidal (higher sensitivity, narrow range) to linear (broader response range). For classical optode systems (e.g. microspheres containing an optical transducer (a pH-sensitive dye) and an optically silent ionophore (receptor)), copolymeric microspheres containing carboxylic acid mers in their structure allow application of the sensor in the alkaline pH range, which is usually inaccessible for the applied optical transducer. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Analytical strategy for the determination of various arsenic species in landfill leachate containing high concentrations of chlorine and organic carbon by HPLC-ICPMS

    NASA Astrophysics Data System (ADS)

    Bae, J.; An, J.; Kim, J.; Jung, H.; Kim, K.; Yoon, C.; Yoon, H.

    2012-12-01

    As a variety of wastes containing arsenic are disposed of in landfills, such facilities can play a prominent role in disseminating arsenic sources to the environment. Since it is widely recognized that arsenic toxicity is highly dependent on its species, accurate determination of the various arsenic species should be considered one of the essential goals in properly accounting for the potential health risk of arsenic to humans and the environment. Inductively coupled plasma mass spectrometry coupled to high performance liquid chromatography (HPLC-ICPMS) is acknowledged as one of the most important tools for trace analysis of metal speciation because of its superior separation capability and detectability. However, the complexity of matrices can cause severe interferences in the analysis results, a problem often encountered with HPLC-ICPMS systems. A high concentration of organic carbon in a sample solution causes carbon build-up on the skimmer and sampling cones, which reduces analytical sensitivity and requires a high level of maintenance for cleaning. In addition, argon from the plasma and chlorine from the sample matrix may combine to form 40Ar35Cl, which has the same nominal mass-to-charge (m/z) ratio as arsenic. In this respect, an analytical strategy for the determination of various arsenic species (e.g., inorganic arsenite and arsenate, monomethylarsonic acid, dimethylarsinic acid, dimethyldithioarsinic acid, and arsenobetaine) in landfill leachate containing high concentrations of chlorine and organic carbon was developed in the present study. A solid phase extraction disk (i.e., a C18 disk), which does not significantly adsorb any of the target arsenic species, was used to remove organic carbon from sample solutions. In addition, helium (He) gas was injected into the collision reaction cell of the ICPMS to dissociate 40Ar35Cl into individual 40Ar and 35Cl. Although He gas also decreased the arsenic intensity by attenuating the 75As signal, the signal-to-noise ratio increased significantly after injecting He gas. We demonstrate that the analytical strategy achieved improved sensitivity for the determination of various arsenic species in landfill leachate, one of the more complex matrices.
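The 40Ar35Cl interference described above is isobaric only at nominal mass; a quick check with tabulated isotope masses (values quoted from standard isotope tables, treat them as approximate) shows why a collision cell, rather than mass resolution, is the practical fix on a quadrupole instrument:

```python
# Approximate atomic masses (u), quoted from standard isotope tables.
M_AR40 = 39.96238
M_CL35 = 34.96885
M_AS75 = 74.92160

m_arcl = M_AR40 + M_CL35   # ~74.931 u, nominal m/z 75, same as 75As
delta = m_arcl - M_AS75    # ~0.01 u exact-mass difference

# Mass resolution m/dm required to separate the two species.
resolution_needed = M_AS75 / delta
# On the order of 7000-8000, far beyond a quadrupole's unit resolution,
# hence the He collision cell approach used in the study.
```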

  4. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including various techniques, detection systems and automation tools that are available for effective separation, enhanced selectivity and sensitivity for the quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article, with relevant case studies, include: (a) simultaneous assay for the parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetic, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations: simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of the parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in the analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analyses is provided, covering separation conditions, validation aspects and applicable conclusions. A limited discussion is provided on the need for developing bioanalytical procedures to support speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effects, chiral aspects etc. are provided for consideration during method development.

  5. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    …reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability…

  6. An analytic formula for H-infinity norm sensitivity with applications to control system design

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Lim, Kyong B.

    1992-01-01

    An analytic formula for the sensitivity of singular value peak variation with respect to parameter variation is derived. As a corollary, the derivative of the H-infinity norm of a stable transfer function with respect to a parameter is presented. It depends on certain first and second derivatives of the transfer function with respect to frequency and the parameter. For cases when the transfer function has a linear system realization whose matrices depend on the parameter, analytic formulas for these derivatives are derived, and an efficient algorithm for calculating them is discussed. Examples are given which provide numerical verification of the H-infinity norm sensitivity formula and which demonstrate its utility in designing control systems satisfying H-infinity norm constraints. In the appendix, derivative formulas for singular values are paraphrased.
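For a simple (non-repeated) peak singular value, the corollary described above reduces to a standard first-order perturbation result, sketched here in our own notation (u_0, v_0 denote the unit left/right singular vectors at the peak frequency ω_0; this is a reconstruction of the well-known formula, not the paper's exact statement):

```latex
% Peak of \bar\sigma(G(j\omega;p)) attained at \omega_0 (simple singular
% value), with unit left/right singular vectors u_0 and v_0:
\frac{\partial}{\partial p}\,\bigl\|G(\,\cdot\,;p)\bigr\|_{\infty}
  = \operatorname{Re}\!\left(
      u_0^{H}\,\frac{\partial G}{\partial p}(j\omega_0;p)\,v_0
    \right)
```

The second derivatives mentioned in the abstract enter because the peak frequency ω_0 itself shifts with the parameter, which a first-order analysis of the peak location must account for.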

  7. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling.

    PubMed

    Plagnol, Vincent; Woodhouse, Samuel; Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg; Forshew, Tim

    2018-01-01

    Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present at VAFs of 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications.

  8. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling

    PubMed Central

    Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg

    2018-01-01

    Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present at VAFs of 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications. PMID:29543828

  9. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    This study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. One type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid phase microextraction- and liquid phase microextraction-based procedures high, while liquid-liquid extraction-, solid phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with the more widely accepted green analytical chemistry tools NEMI and Eco-Scale. As PROMETHEE involves more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
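The PROMETHEE II mechanics behind such a ranking can be sketched in a few lines (a minimal illustration using the simple "usual" step preference function and made-up alternatives, criteria and weights; the study's actual 25 procedures, 9 criteria and expert-derived weights are not reproduced here):

```python
def promethee_ii(matrix, weights, maximize):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference
    function: full preference for any positive difference."""
    n, m = len(matrix), len(weights)

    def pi(a, b):
        # Weighted preference of alternative a over alternative b.
        total = 0.0
        for k in range(m):
            d = matrix[a][k] - matrix[b][k]
            if not maximize[k]:   # criterion to be minimized
                d = -d
            total += weights[k] * (1.0 if d > 0 else 0.0)
        return total

    flows = []
    for a in range(n):
        plus = sum(pi(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pi(b, a) for b in range(n) if b != a) / (n - 1)
        flows.append(plus - minus)  # net flow; rank from high to low
    return flows

# Three hypothetical procedures scored on two maximized criteria with
# equal weights: the first dominates, the third is dominated.
net = promethee_ii([[3, 5], [2, 4], [1, 1]], [0.5, 0.5], [True, True])
```

Ranking alternatives by descending net flow yields the kind of complete ordering reported in the study; real applications replace the step function with linear or Gaussian preference functions and criterion-specific thresholds.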

  10. Surface-assisted laser desorption ionization mass spectrometry techniques for application in forensics.

    PubMed

    Guinan, Taryn; Kirkbride, Paul; Pigou, Paul E; Ronci, Maurizio; Kobus, Hilton; Voelcker, Nicolas H

    2015-01-01

    Matrix-assisted laser desorption ionization (MALDI) mass spectrometry (MS) is an excellent analytical technique for the rapid and sensitive analysis of macromolecules (>700 Da), such as peptides, proteins, nucleic acids, and synthetic polymers. However, the detection of smaller organic molecules with masses below 700 Da using MALDI-MS is challenging due to the appearance of matrix adducts and matrix fragment peaks in the same spectral range. Recently, nanostructured substrates have been developed that facilitate matrix-free laser desorption ionization (LDI), contributing to an emerging analytical paradigm referred to as surface-assisted laser desorption ionization (SALDI) MS. Since SALDI enables the detection of small organic molecules, it is rapidly growing in popularity, including in the field of forensics. At the same time, SALDI also holds significant potential as a high throughput analytical tool in roadside, workplace and athlete drug testing. In this review, we discuss recent advances in SALDI techniques such as desorption ionization on porous silicon (DIOS), nano-initiator mass spectrometry (NIMS) and nano-assisted laser desorption ionization (NALDI™) and compare their strengths and weaknesses with particular focus on forensic applications. These include the detection of illicit drug molecules and their metabolites in biological matrices and small molecule detection from forensic samples including banknotes and fingerprints. Finally, the review highlights recent advances in mass spectrometry imaging (MSI) using SALDI techniques. © 2014 Wiley Periodicals, Inc.

  11. Imprinting Technology in Electrochemical Biomimetic Sensors

    PubMed Central

    Frasco, Manuela F.; Truta, Liliana A. A. N. A.; Sales, M. Goreti F.; Moreira, Felismina T. C.

    2017-01-01

    Biosensors are a promising tool offering the possibility of low-cost and fast analytical screening in point-of-care diagnostics and for on-site detection in the field. Most biosensors in routine use ensure their selectivity/specificity by including natural receptors as biorecognition elements. These materials are, however, too expensive and hard to obtain for every biochemical molecule of interest in environmental and clinical practice. Molecularly imprinted polymers have emerged over time as an alternative to natural antibodies in biosensors. In theory, these materials are stable and robust, presenting a much higher capacity to resist harsher conditions of pH, temperature, pressure or organic solvents. In addition, these synthetic materials are much cheaper than their natural counterparts while offering equivalent affinity and sensitivity in the molecular recognition of the target analyte. Imprinting technology and biosensors have met quite recently, relying mostly on electrochemical detection and enabling a direct reading of different analytes, while promoting significant advances in various fields of use. Thus, this review encompasses such developments and provides a general overview of building promising biomimetic materials as biorecognition elements in electrochemical sensors. It includes different molecular imprinting strategies such as the choice of polymer material, imprinting methodology and assembly on the transduction platform. Their interface with the most recent nanostructured supports acting as standard conductive materials within electrochemical biomimetic sensors is pointed out. PMID:28272314

  12. In vitro bioassays for detecting dioxin-like activity--application potentials and limits of detection, a review.

    PubMed

    Eichbaum, Kathrin; Brinkmann, Markus; Buchinger, Sebastian; Reifferscheid, Georg; Hecker, Markus; Giesy, John P; Engwall, Magnus; van Bavel, Bert; Hollert, Henner

    2014-07-15

    The use of in vitro assays as a screening tool to characterize contamination of a variety of environmental matrices has become an increasingly popular and powerful approach in the field of environmental toxicology. While bioassays cannot entirely substitute for analytical methods such as gas chromatography-mass spectrometry (GC-MS), the continuing improvement of cell lines and the standardization of bioassay procedures enhance their utility as bioanalytical pre-screening tests prior to more targeted chemical analytical investigations. Dioxin-receptor-based assays provide a holistic characterization of exposure to dioxin-like compounds (DLCs) by integrating their overall toxic potential, including the potentials of unknown DLCs not detectable via e.g. GC-MS. Hence, they provide important additional information with respect to environmental risk assessment of DLCs. This review summarizes different in vitro bioassay applications for the detection of DLCs and considers the comparability of bioassay-derived and chemical-analytically derived toxicity equivalents (TEQs) across different approaches and various matrices, ranging from complex samples such as sediments to single reference compounds and compound mixtures. A summary of bioassay-derived limits of detection (LODs) showed a number of current bioassays to be as sensitive as chemical methodologies, but also revealed that most of the bioanalytical studies conducted to date did not report their LODs, which represents a limitation with regard to low-potency samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Surface modified capillary electrophoresis combined with in solution isoelectric focusing and MALDI-TOF/TOF MS: a gel-free multidimensional electrophoresis approach for proteomic profiling--exemplified on human follicular fluid.

    PubMed

    Hanrieder, Jörg; Zuberovic, Aida; Bergquist, Jonas

    2009-04-24

    The development of miniaturized analytical tools continues to be of great interest for facing the challenges in proteomic analysis of complex biological samples such as human body fluids. In the light of these challenges, special emphasis is put on the speed and simplicity of newly designed technological approaches, as well as on the need for cost efficiency and low sample consumption. In this study, we present an alternative multidimensional bottom-up approach to proteomic profiling for fast, efficient and sensitive protein analysis in complex biological matrices. The presented setup was based on sample pre-fractionation using microscale in-solution isoelectric focusing (IEF) followed by tryptic digestion and subsequent capillary electrophoresis (CE) coupled off-line to matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometry (MALDI-TOF MS/MS). For high-performance CE separation, PolyE-323 modified capillaries were applied to minimize analyte-wall interactions. The potential of the analytical setup was demonstrated on human follicular fluid (hFF), a typical complex human body fluid with clinical implications. The results show confident identification of 73 unique proteins (identified at the 95% significance level), mostly acute phase proteins but also proteins well known to be extensively involved in follicular development.

  14. Rapid quantification of pharmaceuticals and pesticides in passive samplers using ultra high performance liquid chromatography coupled to high resolution mass spectrometry.

    PubMed

    Wille, Klaas; Claessens, Michiel; Rappé, Karen; Monteyne, Els; Janssen, Colin R; De Brabander, Hubert F; Vanhaecke, Lynn

    2011-12-23

    The presence of both pharmaceuticals and pesticides in the aquatic environment has become a well-known environmental issue during the last decade. However, an increasing demand still exists for sensitive and reliable monitoring tools for these rather polar contaminants in the marine environment. In recent years, the great potential of passive samplers and equilibrium-based sampling techniques for evaluating the fate of these contaminants has been shown in the literature. Therefore, we developed a new analytical method for the quantification of a large number of pharmaceuticals and pesticides in passive sampling devices. The analytical procedure consisted of extraction using 1:1 methanol/acetonitrile followed by detection with ultra-high performance liquid chromatography coupled to high resolution, high mass accuracy Orbitrap mass spectrometry. Validation of the analytical method resulted in limits of quantification and recoveries ranging between 0.2 and 20 ng per sampler sheet and between 87.9 and 105.2%, respectively. Determination of the sampler-water partition coefficients of all compounds demonstrated that several pharmaceuticals and most pesticides exhibit a high affinity for the polydimethylsiloxane passive samplers. Finally, the developed analytical methods were used to measure the time-weighted average (TWA) concentrations of the targeted pollutants in passive samplers deployed at eight stations in the Belgian coastal zone. Propranolol, carbamazepine and seven pesticides were found to be very abundant in the passive samplers. The obtained long-term and large-scale TWA concentrations will contribute to assessing the environmental and human health risks of these emerging pollutants. Copyright © 2011 Elsevier B.V. All rights reserved.
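The time-weighted average concentration that passive samplers report is conventionally back-calculated from the mass accumulated on the sampler during deployment; a minimal sketch (the sampling rate and the numbers below are illustrative assumptions, not values from the study):

```python
def twa_concentration(mass_accumulated_ng, sampling_rate_l_per_day, days_deployed):
    """Time-weighted average water concentration (ng/L) for a passive
    sampler operating in the linear (integrative) uptake regime:
        C_TWA = N / (Rs * t)
    where N is the accumulated analyte mass, Rs the calibrated sampling
    rate, and t the deployment time."""
    return mass_accumulated_ng / (sampling_rate_l_per_day * days_deployed)

# Illustrative values only: 100 ng accumulated over a 20-day deployment
# at an assumed sampling rate of 0.5 L/day.
c_twa = twa_concentration(100.0, 0.5, 20.0)  # ng/L
```

The sampling rate Rs is compound- and sampler-specific and must come from calibration experiments such as the partition-coefficient determinations described in the abstract.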

  15. Capacitive chemical sensor

    DOEpatents

    Manginell, Ronald P; Moorman, Matthew W; Wheeler, David R

    2014-05-27

    A microfabricated capacitive chemical sensor can be used as an autonomous chemical sensor or as an analyte-sensitive chemical preconcentrator in a larger microanalytical system. The capacitive chemical sensor detects changes in sensing film dielectric properties, such as the dielectric constant, conductivity, or dimensionality. These changes result from the interaction of a target analyte with the sensing film. This capability provides a low-power, self-heating chemical sensor suitable for remote and unattended sensing applications. The capacitive chemical sensor also enables a smart, analyte-sensitive chemical preconcentrator. After sorption of the sample by the sensing film, the film can be rapidly heated to release the sample for further analysis. Therefore, the capacitive chemical sensor can optimize the sample collection time prior to release to enable the rapid and accurate analysis of analytes by a microanalytical system.
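The detection principle, in the ideal parallel-plate approximation, is that a shift in the sensing film's dielectric constant maps directly onto a capacitance shift (the geometry and permittivity values below are our illustrative assumptions, not figures from the patent):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(eps_r, area_m2, gap_m):
    """Ideal parallel-plate capacitance: C = eps0 * eps_r * A / d."""
    return EPS0 * eps_r * area_m2 / gap_m

# A 10% rise in the film's dielectric constant upon analyte sorption
# produces a 10% rise in capacitance, independent of geometry.
c_clean = capacitance(4.0, 1e-6, 1e-6)   # film before sorption
c_loaded = capacitance(4.4, 1e-6, 1e-6)  # film after sorption
relative_change = (c_loaded - c_clean) / c_clean
```

Real sensing films also change in conductivity and thickness, as the abstract notes, so a measured impedance shift folds several film properties together.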

  16. Modeling Analyte Transport and Capture in Porous Bead Sensors

    PubMed Central

    Chou, Jie; Lennart, Alexis; Wong, Jorge; Ali, Mehnaaz F.; Floriano, Pierre N.; Christodoulides, Nicolaos; Camp, James; McDevitt, John T.

    2013-01-01

    Porous agarose microbeads, with high surface to volume ratios and high binding densities, are attracting attention as highly sensitive, affordable sensor elements for a variety of high performance bioassays. While such polymer microspheres have been extensively studied and reported on previously and are now moving into real-world clinical practice, very little work has been completed to date to model the convection, diffusion, and binding kinetics of soluble reagents captured within such fibrous networks. Here, we report the development of a three-dimensional computational model and provide the initial evidence for its agreement with experimental outcomes derived from the capture and detection of representative protein and genetic biomolecules in 290μm porous beads. We compare this model to antibody-mediated capture of C-reactive protein and bovine serum albumin, along with hybridization of oligonucleotide sequences to DNA probes. These results suggest that due to the porous interior of the agarose bead, internal analyte transport is both diffusion- and convection-based, and regardless of the nature of analyte, the bead interiors reveal an interesting trickle of convection-driven internal flow. Based on this model, the internal to external flow rate ratio is found to be in the range of 1:3100 to 1:170 for beads with agarose concentration ranging from 0.5% to 8% for the sensor ensembles here studied. Further, both model and experimental evidence suggest that binding kinetics strongly affect analyte distribution of captured reagents within the beads. These findings reveal that high association constants create a steep moving boundary in which unbound analytes are held back at the periphery of the bead sensor. Low association constants create a more shallow moving boundary in which unbound analytes diffuse further into the bead before binding. 
These models agree with experimental evidence and thus serve as a new tool set for the study of bio-agent transport processes within a new class of medical microdevices. PMID:22250703

  17. Peptidylation for the determination of low-molecular-weight compounds by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Tang, Feng; Cen, Si-Ying; He, Huan; Liu, Yi; Yuan, Bi-Feng; Feng, Yu-Qi

    2016-05-23

    Determination of low-molecular-weight compounds by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) has been a great challenge in the analytical research field. Here we developed a universal peptide-based derivatization (peptidylation) strategy for the sensitive analysis of low-molecular-weight compounds by MALDI-TOF-MS. Upon peptidylation, the molecular weights of target analytes increase, thus avoiding serious matrix ion interference in the low-molecular-weight region in MALDI-TOF-MS. Since peptides typically exhibit good signal response during MALDI-TOF-MS analysis, peptidylation endows high detection sensitivities of low-molecular-weight analytes. As a proof-of-concept, we analyzed low-molecular-weight compounds of aldehydes and thiols by the developed peptidylation strategy. Our results showed that aldehydes and thiols can be readily determined upon peptidylation, thus realizing the sensitive and efficient determination of low-molecular-weight compounds by MALDI-TOF-MS. Moreover, target analytes also can be unambiguously detected in biological samples using the peptidylation strategy. The established peptidylation strategy is a universal strategy and can be extended to the sensitive analysis of various low-molecular-weight compounds by MALDI-TOF-MS, which may be potentially used in areas such as metabolomics.

  18. Recent Advances in Point-of-Care Diagnostics for Cardiac Markers

    PubMed Central

    2014-01-01

    National and international cardiology guidelines have recommended a 1-hour turnaround time for reporting cardiac troponin results to emergency department personnel, measured from the time of blood collection to reporting. Use of point-of-care testing (POCT) can reduce turnaround times for cardiac markers, but current devices are not as precise or sensitive as central laboratory assays. The gap is growing as manufacturers of mainframe immunoassay instruments have released, or will release, troponin assays that are even more sensitive than those currently available. These assays have analytical sensitivity that enables detection of troponin in nearly 100% of healthy subjects, which is not possible with current POCT assays. Use of high-sensitivity troponin assays results in a lower value for the 99th percentile of a healthy population. Clinically, this enables the detection of more cases of myocardial injury. In order to compete analytically, next-generation POCT assays will need to make technological advancements, such as the use of microfluidics to better control sample delivery, nanoparticles or nanotubes to increase the surface-to-volume ratios for analytes and antibodies, and novel detection schemes such as chemiluminescence and electrochemical detectors to enhance analytical sensitivity. Multi-marker analysis using POCT is also on the horizon for tests that complement cardiac troponin. PMID:27683464
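The 99th-percentile cutoff referred to above is derived from a healthy reference population; a minimal nearest-rank sketch of that computation (the reference values are illustrative, not from any real cohort, and guideline bodies prescribe more detailed procedures):

```python
import math

def percentile_nearest_rank(values, pct):
    """Nearest-rank percentile: the smallest observed value such that
    at least pct percent of observations are <= it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[max(rank, 1) - 1]

# Illustrative healthy-population troponin values (ng/L): for 100
# evenly spread values 1..100, the 99th-percentile cutoff is 99.
cutoff = percentile_nearest_rank(list(range(1, 101)), 99)
```

A more sensitive assay yields measurable (lower) values across nearly the whole reference population, which is what drives the lower 99th-percentile cutoff the abstract describes.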

  19. Lack of correlation between reaction speed and analytical sensitivity in isothermal amplification reveals the value of digital methods for optimization: validation using digital real-time RT-LAMP

    PubMed Central

    Khorosheva, Eugenia M.; Karymov, Mikhail A.; Selck, David A.; Ismagilov, Rustem F.

    2016-01-01

    In this paper, we asked if it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both speed and efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification reactions did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our finding was supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single-template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. PMID:26358811
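The "digital efficiency" comparison above rests on standard single-molecule Poisson statistics: a minimal sketch of estimating the mean template load per partition from the fraction of positive partitions (the counts below are made up for illustration; this is the generic digital-assay calculation, not the authors' code):

```python
import math

def mean_copies_per_partition(positives, total_partitions):
    """Poisson estimate of the mean template copies per partition from
    the positive-partition fraction p: lambda = -ln(1 - p)."""
    p = positives / total_partitions
    return -math.log(1.0 - p)

# If half of the partitions amplify, the estimated load is ln(2)
# (~0.693) copies per partition, not 0.5: multiple occupancy matters.
lam = mean_copies_per_partition(500, 1000)
```

Comparing this Poisson-corrected estimate with the known input concentration gives the fraction of molecules that actually amplified, i.e. the digital efficiency the paper uses to compare reaction conditions.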

  20. Trace level detection of compounds related to the chemical weapons convention by 1H-detected 13C NMR spectroscopy executed with a sensitivity-enhanced, cryogenic probehead.

    PubMed

    Cullinan, David B; Hondrogiannis, George; Henderson, Terry J

    2008-04-15

    Two-dimensional 1H-13C HSQC (heteronuclear single quantum correlation) and fast-HMQC (heteronuclear multiple quantum correlation) pulse sequences were implemented using a sensitivity-enhanced, cryogenic probehead for detecting compounds relevant to the Chemical Weapons Convention present in complex mixtures. The resulting methods demonstrated exceptional sensitivity for detecting the analytes at trace-level concentrations. 1H-13C correlations of target analytes at ≤25 µg/mL were easily detected in a sample where the 1H solvent signal was approximately 58,000-fold more intense than the analyte 1H signals. The problem of overlapping signals typically observed in conventional 1H spectroscopy was essentially eliminated, while 1H and 13C chemical shift information could be derived quickly and simultaneously from the resulting spectra. The fast-HMQC pulse sequences generated magnitude-mode spectra suitable for detailed analysis in approximately 4.5 h and can be used in experiments to efficiently screen a large number of samples. The HSQC pulse sequences, on the other hand, required roughly twice the data acquisition time to produce suitable spectra. These spectra, however, were phase-sensitive, contained considerably more resolution in both dimensions, and proved to be superior for detecting analyte 1H-13C correlations. Furthermore, an HSQC spectrum collected with a multiplicity-edited pulse sequence provided additional structural information valuable for identifying target analytes. The HSQC pulse sequences are ideal for collecting high-quality data sets with overnight acquisitions and logically follow the use of fast-HMQC pulse sequences to rapidly screen samples for potential target analytes. Use of the pulse sequences considerably improves the performance of NMR spectroscopy as a complementary technique for the screening, identification and validation of chemical warfare agents and other small-molecule analytes present in complex mixtures and environmental samples.

  1. A functional glycoprotein competitive recognition and signal amplification strategy for carbohydrate-protein interaction profiling and cell surface carbohydrate expression evaluation

    NASA Astrophysics Data System (ADS)

    Wang, Yangzhong; Chen, Zhuhai; Liu, Yang; Li, Jinghong

    2013-07-01

    A simple and sensitive carbohydrate biosensor has been suggested as a potential tool for accurate analysis of cell surface carbohydrate expression as well as carbohydrate-based therapeutics for a variety of diseases and infections. In this work, a sensitive biosensor for carbohydrate-lectin profiling and in situ cell surface carbohydrate expression was designed by taking advantage of a functional glycoprotein of glucose oxidase acting as both a multivalent recognition unit and a signal amplification probe. Combining the gold nanoparticle catalyzed luminol electrogenerated chemiluminescence and nanocarrier for active biomolecules, the number of cell surface carbohydrate groups could be conveniently read out. The apparent dissociation constant between GOx@Au probes and Con A was detected to be 1.64 nM and was approximately 5 orders of magnitude smaller than that of mannose and Con A, which would arise from the multivalent effect between the probe and Con A. Both glycoproteins and gold nanoparticles contribute to the high affinity between carbohydrates and lectin. The as-proposed biosensor exhibits excellent analytical performance towards the cytosensing of K562 cells with a detection limit of 18 cells, and the mannose moieties on a single K562 cell were determined to be 1.8 × 10^10. The biosensor can also act as a useful tool for antibacterial drug screening and mechanism investigation. This strategy integrates the excellent biocompatibility and multivalent recognition of glycoproteins as well as the significant enzymatic catalysis and gold nanoparticle signal amplification, and avoids the cell pretreatment and labelling process.
    This would contribute to the glycomic analysis and the understanding of complex native glycan-related biological processes.
Electronic supplementary information (ESI) available: Experimental details; characterization of probes; the influence of electrolyte pH; probe concentration and glucose concentration on the electrode ECL effect. See DOI: 10.1039/c3nr01598j

  2. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    Report AFRL-RH-WP-TR-2014-0131, A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery (Distribution A; approved for public release). The surviving excerpt describes plotting data by attribute tags, e.g. a Strain attribute with three possible tags (AKR, B6, and BALB_B) and a MUP Protein attribute with two (Intact and Denatured).

  3. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  4. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of the coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions that are often difficult to interpret, analytical...

  5. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  6. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  7. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has long been applied to the study and analysis of complex biological systems. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of the model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, whether by the analytic method or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended here to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human error in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with little user intervention. Theoretical comparison with direct-coupled methods shows that the extended algorithm is efficient, accurate, and easy to use for end users without a programming background performing dynamic sensitivity analysis on complex biological systems with time-delays.
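    The direct-method sensitivity computation mentioned in this abstract can be illustrated on a toy example. The sketch below is an assumption-laden illustration, not the paper's algorithm: it Euler-integrates a one-parameter ODE together with its forward sensitivity, whose governing equation is driven by the Jacobian of the right-hand side (the paper's contribution is extending this machinery to DDEs and evaluating the Jacobian by automatic differentiation).

```python
import math

# Direct-method sensitivity on a toy ODE (illustration only; the paper
# treats DDE models). For dx/dt = f(x, k) = -k*x, the sensitivity
# s = dx/dk obeys  ds/dt = (df/dx)*s + df/dk = -k*s - x,
# i.e. the Jacobian of the right-hand side drives the sensitivity ODE.

def integrate(x0, k, t_end, n=100_000):
    """Euler-integrate the state and its parameter sensitivity together."""
    dt = t_end / n
    x, s = x0, 0.0                     # dx0/dk = 0 for a fixed initial condition
    for _ in range(n):
        x, s = x + dt * (-k * x), s + dt * (-k * s - x)
    return x, s

x, s = integrate(x0=1.0, k=0.5, t_end=2.0)
# Analytic check: x(t) = x0*exp(-k*t), so dx/dk = -t*x0*exp(-k*t)
print(abs(x - math.exp(-1.0)) < 1e-3, abs(s + 2.0 * math.exp(-1.0)) < 1e-3)
```

    Coupling the state and sensitivity updates in one loop is the essence of the "direct" method; decoupled variants solve the sensitivity system separately after each state step.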

  8. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. 
    It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io.
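    The position-array idiom described in this abstract can be sketched in a few lines. This is hypothetical illustrative code, not the Plastid API: it represents per-nucleotide coverage as a plain array and extracts a spliced, strand-aware view in feature-centric coordinates.

```python
# Hypothetical sketch (not the Plastid API): per-nucleotide coverage as a
# plain array, with a spliced, strand-aware extraction into
# feature-centric (transcript) coordinates.

def feature_counts(genome_counts, exons, strand):
    """Counts in transcript coordinates for a spliced feature.

    exons: ascending list of (start, end) half-open genomic intervals.
    strand: '+' or '-'; minus-strand features are returned 5'->3'.
    """
    out = []
    for start, end in exons:
        out.extend(genome_counts[start:end])
    return out[::-1] if strand == '-' else out

coverage = [0, 3, 1, 0, 0, 7, 2, 0, 5, 1]       # toy per-nucleotide counts
# Two-exon transcript on the minus strand: exons [1,3) and [5,8)
print(feature_counts(coverage, [(1, 3), (5, 8)], '-'))  # -> [0, 2, 7, 1, 3]
```

    The reversal for minus-strand features is exactly the "accounting for splicing and strand" that the library is said to automate.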

  9. Integrated Arrays of Ion-Sensitive Electrodes

    NASA Technical Reports Server (NTRS)

    Buehler, Martin; Kuhlman, Kimberly

    2003-01-01

    The figure depicts an example of proposed compact water-quality sensors that would contain integrated arrays of ion-sensitive electrodes (ISEs). These sensors would serve as electronic "tongues": they would be placed in contact with water and used to "taste" selected dissolved ions (that is, they would be used to measure the concentrations of the ions). The selected ions could be any or all of a variety of organic and inorganic cations and anions that could be regarded as contaminants or analytes, depending on the specific application. In addition, some of the ISEs could be made sensitive to some neutral analytes.

  10. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  11. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  12. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  13. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to obtain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear is depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
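    Of the models reviewed, Usui's wear-rate relation is commonly written in the form dW/dt = A · σt · Vs · exp(−B/T). The snippet below sketches that functional form only; the constants A and B are material-specific characteristic values that must be fitted experimentally, and the numbers used here are assumptions for illustration.

```python
import math

# Usui-type wear-rate relation: dW/dt = A * sigma_t * v_s * exp(-B / T).
# A and B are material-specific characteristic constants; the defaults
# below are invented for illustration, not fitted data from the review.

def usui_wear_rate(sigma_t, v_s, temp_k, A=1e-8, B=2500.0):
    """Wear rate from normal stress (Pa), sliding speed (m/s), temperature (K)."""
    return A * sigma_t * v_s * math.exp(-B / temp_k)

hot = usui_wear_rate(1.5e9, 2.0, 1100.0)
cool = usui_wear_rate(1.5e9, 2.0, 900.0)
print(hot > cool)  # wear accelerates with interface temperature
```

    The exponential temperature dependence is what makes cutting-zone temperature estimation so central when such wear models are applied.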

  14. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time through changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  15. Morphomic Malnutrition Score: A Standardized Screening Tool for Severe Malnutrition in Adults.

    PubMed

    Lee, Christopher; Raymond, Erica; Derstine, Brian A; Glazer, Joshua M; Goulson, Rebecca; Rajasekaran, Avinash; Cherry-Bukowiec, Jill; Su, Grace L; Wang, Stewart C

    2018-05-22

    Granular diagnostic criteria for adult malnutrition are lacking. This study uses analytic morphomics to define the Morphomic Malnutrition Score (MMS), a robust screening tool for severe malnutrition. The study population (n = 643) consisted of 2 cohorts: 1) 124 emergency department patients diagnosed with severe malnutrition by a registered dietitian (RD) and an available computed tomography (CT) scan within 2 days of RD evaluation, and 2) 519 adult kidney donor candidates to represent a healthy cohort. Body composition markers of muscle area and abdominal adiposity were measured from patient CT scans using analytic morphomic assessment, and then converted to sex- and age-adjusted percentiles using the Reference Analytic Morphomics Population (RAMP). RAMP consists of 6000 patients chosen to be representative of the general population. The combined cohort was then randomly divided into training (n = 453) and validation (n = 190) sets. MMS was derived using logistic regression. The model coefficients were transformed into a score, normalized from 0 to 10 (10 = most severe). Severely malnourished patients had lower amounts of muscle and fat than kidney donors, specifically for dorsal muscle group area at the twelfth thoracic vertebral level (P < 0.001), psoas muscle area at the fourth lumbar vertebral level (P < 0.001), and subcutaneous fat area at the third lumbar vertebral level (P < 0.001)-all parameters in MMS. MMS for severely malnourished patients was higher than kidney donors (7.7 ± 2.2 vs 3.8 ± 2.0, respectively; P-value < 0.001). An MMS > 6.1 was accurate in determining nutrition diagnosis (82.1% sensitivity; 88.3% specificity; 85.2% balanced accuracy). MMS provides an evidence-based, granular assessment to distinguish severely malnourished adults from a healthy population. © 2018 American Society for Parenteral and Enteral Nutrition.
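    The score construction described here, logistic-regression coefficients transformed into a 0-10 scale, can be sketched in a few lines. Everything in this snippet is invented for illustration: the coefficients, intercept, and percentile inputs are not the paper's fitted MMS model.

```python
import math

# Hypothetical sketch of turning a logistic model into a 0-10 screening
# score, in the spirit of the MMS construction. Coefficients, intercept,
# and inputs are all assumptions, not the paper's fitted values.

def malnutrition_score(percentiles, coefs, intercept):
    """Map morphomic percentiles through a logistic model to a 0-10 score."""
    z = intercept + sum(c * p for c, p in zip(coefs, percentiles))
    prob = 1.0 / (1.0 + math.exp(-z))   # modeled probability of severe malnutrition
    return round(10.0 * prob, 1)        # normalise so that 10 = most severe

# Illustrative negative coefficients: lower muscle/fat percentiles raise the score.
coefs, intercept = (-0.04, -0.03, -0.02), 3.0
print(malnutrition_score((10, 15, 20), coefs, intercept))  # low percentiles
print(malnutrition_score((60, 55, 70), coefs, intercept))  # healthy-range percentiles
```

    Thresholding such a score (the paper reports MMS > 6.1) then turns the continuous output into a binary screening decision.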

  16. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. 
For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the physicochemical changes that are induced upon analyte binding can be exploited to generate a detectable signal for sensing applications. As research in this area has grown, a number of creative approaches for improving the selectivity and sensitivity (i.e., detection limit) of these sensors have emerged. For applications in drug delivery systems, therapeutic release can be triggered by competitive molecular interactions or physicochemical changes in the network. Additionally, including degradable units within the network can enable sustained and responsive therapeutic release. Several exciting examples exploiting the analyte-responsive behavior of hydrogels for the treatment of cancer, diabetes, and irritable bowel syndrome are discussed in detail. We expect that creative and combinatorial approaches used in the design of analyte-responsive hydrogels will continue to yield materials with great potential in the fields of sensing and drug delivery.

  17. A hydrogeologic framework for characterizing summer streamflow sensitivity to climate warming in the Pacific Northwest, USA

    NASA Astrophysics Data System (ADS)

    Safeeq, M.; Grant, G. E.; Lewis, S. L.; Kramer, M. G.; Staab, B.

    2014-09-01

    Summer streamflows in the Pacific Northwest are largely derived from melting snow and groundwater discharge. As the climate warms, diminishing snowpack and earlier snowmelt will cause reductions in summer streamflow. Most regional-scale assessments of climate change impacts on streamflow use downscaled temperature and precipitation projections from general circulation models (GCMs) coupled with large-scale hydrologic models. Here we develop and apply an analytical hydrogeologic framework for characterizing summer streamflow sensitivity to a change in the timing and magnitude of recharge in a spatially explicit fashion. In particular, we incorporate the role of deep groundwater, which large-scale hydrologic models generally fail to capture, into streamflow sensitivity assessments. We validate our analytical streamflow sensitivities against two empirical measures of sensitivity derived using historical observations of temperature, precipitation, and streamflow from 217 watersheds. In general, empirically and analytically derived streamflow sensitivity values correspond. Although the selected watersheds cover a range of hydrologic regimes (e.g., rain-dominated, mixture of rain and snow, and snow-dominated), sensitivity validation was primarily driven by the snow-dominated watersheds, which are subjected to a wider range of change in recharge timing and magnitude as a result of increased temperature. Overall, two patterns emerge from this analysis: first, areas with high streamflow sensitivity also have higher summer streamflows as compared to low-sensitivity areas. Second, the level of sensitivity and spatial extent of highly sensitive areas diminishes over time as the summer progresses. 
Results of this analysis point to a robust, practical, and scalable approach that can help assess risk at the landscape scale, complement the downscaling approach, be applied to any climate scenario of interest, and provide a framework to assist land and water managers in adapting to an uncertain and potentially challenging future.

  18. Towards an Analytical Age-Dependent Model of Contrast Sensitivity Functions for an Ageing Society

    PubMed Central

    Joulan, Karine; Brémond, Roland

    2015-01-01

    The Contrast Sensitivity Function (CSF) describes how the visibility of a grating depends on the stimulus spatial frequency. Many published CSF data have demonstrated that contrast sensitivity declines with age. However, an age-dependent analytical model of the CSF is not available to date. In this paper, we propose such an analytical CSF model based on visual mechanisms, taking into account the age factor. To this end, we have extended an existing model from Barten (1999), taking into account the dependencies of this model's optical and physiological parameters on age. Age-dependent models of the cones and ganglion cells densities, the optical and neural MTF, and optical and neural noise are proposed, based on published data. The proposed age-dependent CSF is finally tested against available experimental data, with fair results. Such an age-dependent model may be beneficial when designing real-time age-dependent image coding and display applications. PMID:26078994
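    To illustrate what an age-dependent analytical CSF looks like in code, the sketch below uses a deliberately simplified log-parabola stand-in with an exponentially decaying peak sensitivity. This is not the extended Barten model of the paper, and every parameter value is an assumption.

```python
import math

# Simplified stand-in for an age-dependent CSF (NOT Barten's model): a
# log-parabola in spatial frequency whose peak sensitivity decays with
# age. All parameter values below are assumptions for illustration.

def csf(u, age, peak_freq=3.0, bandwidth=1.2, s_peak_20=200.0, decay=0.012):
    """Contrast sensitivity at spatial frequency u (cycles/deg) for a given age."""
    s_peak = s_peak_20 * math.exp(-decay * max(age - 20, 0))  # peak falls with age
    return s_peak * math.exp(-(math.log10(u / peak_freq)) ** 2
                             / (2 * bandwidth ** 2))

print(csf(3.0, 20) > csf(3.0, 70))   # sensitivity declines with age
print(csf(3.0, 20) > csf(30.0, 20))  # and falls away from the peak frequency
```

    A physiologically grounded model like the paper's would instead derive these dependencies from cone and ganglion cell densities, optical and neural MTFs, and noise terms.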

  19. VIBRATIONAL SPECTROSCOPIC SENSORS Fundamentals, Instrumentation and Applications

    NASA Astrophysics Data System (ADS)

    Kraft, Martin

    In textbook descriptions of chemical sensors, a chemical sensor is almost invariably described as a combination of a (dumb) transducer and a (smart) recognition layer. The reason for this is that most transducers, while (reasonably) sensitive, have limited analyte specificity. This is particularly true for non-optical, e.g. mass-sensitive or electrochemical, systems, but many optical transducers are also incapable as such of distinguishing between different substances. Consequently, to build sensors operational in multicomponent environments, such transducers must be combined with physicochemical, chemical or biochemical recognition systems providing the required analyte specificity. Although advances have been made in this field in recent years, selective layers are frequently not (yet) up to the demands set by industrial or environmental applications, in particular when operated over prolonged periods of time. Another significant obstacle is cross-sensitivity, which may interfere with analytical accuracy. Together, these limitations restrict the real-world applicability of many otherwise promising chemical sensors.

  20. Use of zero order diffraction of a grating monochromator towards convenient and sensitive detection of fluorescent analytes in multi fluorophoric systems

    NASA Astrophysics Data System (ADS)

    Panigrahi, Suraj Kumar; Mishra, Ashok Kumar

    2018-02-01

    White light excitation fluorescence (WLEF) is known to offer analytical advantages in terms of enhanced sensitivity and facile capture of the entire fluorescence spectral signature of multi-component fluorescence systems. Using the zero-order diffraction of the grating monochromator on the excitation side of a commercial spectrofluorimeter, it has been shown that WLEF spectral measurements can be carried out conveniently. For multi-fluorophoric analyte systems such as (i) drugs and vitamins spiked in a urine sample, (ii) extra virgin olive oil adulterated with olive pomace oil, and (iii) a mixture of fabric dyes, a significant enhancement of measurement sensitivity was observed. The total fluorescence spectral response could be conveniently analysed using PLS2 regression. This work brings out the ease of using a conventional fluorimeter for WLEF measurements.
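    The multivariate calibration step can be illustrated with a minimal linear unmixing example. The paper uses PLS2 regression; the plain least-squares fit below, on two synthetic "fluorophore" spectra with invented numbers, is only a stand-in for that idea.

```python
# Least-squares spectral unmixing sketch. The paper uses PLS2 regression;
# this two-component linear fit on synthetic spectra (all numbers
# invented) stands in for it to show the recovery of concentrations.

s1 = [0.0, 0.2, 0.9, 0.4, 0.1]   # emission spectrum of fluorophore 1
s2 = [0.3, 0.8, 0.2, 0.0, 0.0]   # emission spectrum of fluorophore 2
a_true, b_true = 2.0, 0.5        # "concentrations" in the mixture
mix = [a_true * x + b_true * y for x, y in zip(s1, s2)]

# Solve the 2x2 normal equations for the concentrations a, b.
s11 = sum(x * x for x in s1)
s22 = sum(y * y for y in s2)
s12 = sum(x * y for x, y in zip(s1, s2))
m1 = sum(x * m for x, m in zip(s1, mix))
m2 = sum(y * m for y, m in zip(s2, mix))
det = s11 * s22 - s12 * s12
a = (m1 * s22 - m2 * s12) / det
b = (m2 * s11 - m1 * s12) / det
print(a, b)  # recovers the true concentrations on this noise-free data
```

    PLS2 generalises this idea to many correlated wavelengths, several responses at once, and noisy, collinear spectra where plain least squares becomes unstable.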

  1. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With this method, it is straightforward to identify the relative importance of the independent and correlated contributions of the input variables in determining the model response, which allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of the analytic method for general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
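    The core idea, that input correlations contribute an identifiable share of the response variance, can be illustrated analytically for a linear model. The sketch below assumes a hypothetical model Y = a1*X1 + a2*X2 with jointly Gaussian inputs; it is not the paper's method, only the textbook variance decomposition Var(Y) = aᵀΣa that such analyses build on.

```python
import numpy as np

# Hypothetical linear model Y = a1*X1 + a2*X2 with correlated Gaussian inputs
a = np.array([2.0, -1.0])
sigma = np.array([1.0, 0.5])
rho = 0.6
cov = np.array([[sigma[0] ** 2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1] ** 2]])

# Analytic variance: Var(Y) = a^T Cov a
var_analytic = a @ cov @ a

# Split into the independent part and the correlation contribution
var_indep = np.sum(a ** 2 * sigma ** 2)
var_corr = var_analytic - var_indep  # equals 2*a1*a2*rho*sigma1*sigma2

# Monte Carlo check of the analytic result
rng = np.random.default_rng(1)
X = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
var_mc = (X @ a).var()
```

Here the negative correlation term reduces the total variance below the independent-input value, which is exactly the effect the paper's method quantifies for general models.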

  2. Black phosphorus-assisted laser desorption ionization mass spectrometry for the determination of low-molecular-weight compounds in biofluids.

    PubMed

    He, Xiao-Mei; Ding, Jun; Yu, Lei; Hussain, Dilshad; Feng, Yu-Qi

    2016-09-01

    Quantitative analysis of small molecules by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been a challenging task due to matrix-derived interferences in the low m/z region and poor reproducibility of MS signal response. In this study, we developed an approach applying black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix for the quantitative analysis of small molecules for the first time. Black phosphorus-assisted laser desorption/ionization mass spectrometry (BP/ALDI-MS) showed a clear background and exhibited superior detection sensitivity toward quaternary ammonium compounds compared to carbon-based materials. By combining a stable isotope labeling (SIL) strategy with BP/ALDI-MS (SIL-BP/ALDI-MS), a variety of analytes labeled with a quaternary ammonium group were sensitively detected. Moreover, the isotope-labeled forms of the analytes also served as internal standards, which broadened the analyte coverage of BP/ALDI-MS and improved the reproducibility of MS signals. Based on these advantages, a reliable method for the quantitative analysis of aldehydes from complex biological samples (saliva, urine, and serum) was successfully established. Good linearities were obtained for five aldehydes in the range of 0.1-20.0 μM, with correlation coefficients (R²) larger than 0.9928. The LODs were found to be 20 to 100 nM. Reproducibility of the method was confirmed with intra-day and inter-day relative standard deviations (RSDs) of less than 10.4%, and the recoveries in saliva samples ranged from 91.4 to 117.1%. Taken together, the proposed SIL-BP/ALDI-MS strategy has proved to be a reliable tool for the quantitative analysis of aldehydes from complex samples. Graphical Abstract: An approach for the determination of small molecules was developed using black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix.
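    As an illustration of the quantitation workflow described above, the sketch below fits a linear calibration curve to made-up analyte/internal-standard signal ratios and derives R² and an LOD estimate from the residual standard deviation (the common 3.3·s/slope convention). None of the numbers are from the study.

```python
import numpy as np

# Hypothetical calibration: analyte/internal-standard ratio vs concentration (uM)
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 20.0])
ratio = np.array([0.012, 0.051, 0.099, 0.510, 1.005, 2.020])  # invented responses

# Ordinary least-squares calibration line
slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept

# Coefficient of determination R^2
ss_res = np.sum((ratio - pred) ** 2)
ss_tot = np.sum((ratio - ratio.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# LOD from the residual standard deviation (3.3 * s / slope convention)
s_res = np.sqrt(ss_res / (len(conc) - 2))
lod = 3.3 * s_res / slope
```

With real data, the internal-standard ratio (here the y-axis) is what suppresses shot-to-shot signal variation, which is why the SIL internal standards improve reproducibility.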

  3. Identification of natural indigo in historical textiles by GC-MS.

    PubMed

    Degani, Laura; Riedo, Chiara; Chiantore, Oscar

    2015-02-01

    The possibility of successfully applying a common GC-MS procedure for identification in one step of all types of dyes from plants of unknown origin and from historical objects is particularly attractive due to the high separation efficiency of the capillary columns, the MS detection sensitivity and the reproducibility of results. In this work, GC-MS analysis, previously and successfully used for the characterization of anthraquinones, flavonoids and tannins from plant extracts and historical samples, has been tested on indigoid dyestuffs. An analytical procedure based on the silylating agent N,O-bis-(trimethylsilyl)trifluoroacetamide (BSTFA) with 1% trimethylchlorosilane (TMCS) was applied to pure molecules of indigotin and indirubin and to plant extracts of Indigofera tinctoria L. and Isatis tinctoria L. Preliminary tests were done to establish the chromatographic conditions and the derivatization amounts most suitable for the simultaneous detection of indigoid molecules and of the other natural compounds, such as fatty acids, carboxylic acids and sugars, contained within the plant extracts. In order to assess the capacity and the sensitivity of the analytical procedure in typical archaeometric applications, wool samples dyed in the laboratory with indigo were analysed by mimicking the sample amounts typically available with historical objects. The electron ionization (EI) spectra of the main silylated derivatives of indigoid molecules obtained in this way constitute the necessary data set for the characterization of natural extracts and historical works of art. Subsequently, the procedure was applied to historical samples for the detection of indigo and of other dyestuffs possibly contained in the samples. Additional information, useful for the restoration and preservation of works of art, could also be obtained on the nature of stains and smudges present on the sampled textile material. The GC-MS method thus proves to be an efficient and fast analytical tool for the identification of natural indigo in plants and textile artefacts, providing results complementary to those from high-performance liquid chromatography (HPLC).

  4. Species-specific diagnostic assays for Bonamia ostreae and B. exitiosa in European flat oyster Ostrea edulis: conventional, real-time and multiplex PCR.

    PubMed

    Ramilo, Andrea; Navas, J Ignacio; Villalba, Antonio; Abollo, Elvira

    2013-05-27

    Bonamia ostreae and B. exitiosa have caused mass mortalities of various oyster species around the world and co-occur in some European areas. The World Organisation for Animal Health (OIE) has included infections with both species in the list of notifiable diseases. However, official methods for species-specific diagnosis of either parasite have certain limitations. In this study, new species-specific conventional PCR (cPCR) and real-time PCR techniques were developed to diagnose each parasite species. Moreover, a multiplex PCR method was designed to detect both parasites in a single assay. The analytical sensitivity and specificity of each new method were evaluated, and the new procedures were compared with 2 OIE-recommended methods, viz. standard histology and PCR-RFLP. The new procedures showed higher sensitivity than the OIE-recommended ones for the diagnosis of both species, and the sensitivity of tests with the new primers was higher when using oyster gills and gonad tissue rather than gills alone. Because no 'gold standard' exists for these infections, the sensitivity and specificity of the new methods could not be estimated accurately; a maximum likelihood comparison of the diagnostic tests indicated the possibility of false positives with the new procedures. Nevertheless, all procedures showed negative results when used for the analysis of oysters from a Bonamia-free area.

  5. Clinical detection of deletion structural variants in whole-genome sequences

    PubMed Central

    Noll, Aaron C; Miller, Neil A; Smith, Laurie D; Yoo, Byunggil; Fiedler, Stephanie; Cooley, Linda D; Willig, Laurel K; Petrikin, Josh E; Cakici, Julie; Lesko, John; Newton, Angela; Detherage, Kali; Thiffault, Isabelle; Saunders, Carol J; Farrow, Emily G; Kingsmore, Stephen F

    2016-01-01

    Optimal management of acutely ill infants with monogenic diseases requires rapid identification of causative haplotypes. Whole-genome sequencing (WGS) has been shown to identify pathogenic nucleotide variants in such infants. Deletion structural variants (DSVs, >50 nt) are implicated in many genetic diseases, and tools have been designed to identify DSVs using short-read WGS. Optimisation and integration of these tools into a WGS pipeline could improve the diagnostic sensitivity and specificity of WGS. In addition, it may improve turnaround time when compared with current CNV assays, enhancing utility in acute settings. Here we describe DSV detection methods for use in WGS for rapid diagnosis in acutely ill infants: SKALD (Screening Konsensus and Annotation of Large Deletions) combines calls from two tools (Breakdancer and GenomeStrip) with calibrated filters and clinical interpretation rules. In four WGS runs, the average analytic precision (positive predictive value) of SKALD was 78%, and recall (sensitivity) was 27%, when compared with validated reference DSV calls. When retrospectively applied to a cohort of 36 families with acutely ill infants, SKALD identified causative DSVs in two. The first was a heterozygous deletion of exons 1–3 of MMP21 in trans with a heterozygous frame-shift deletion in two siblings with transposition of the great arteries and heterotaxy. In a newborn female with dysmorphic features, ventricular septal defect and persistent pulmonary hypertension, SKALD identified the breakpoints of a heterozygous, de novo 1p36.32p36.13 deletion. In summary, consensus DSV calling, implemented in an 8-h computational pipeline with parameterised filtering, has the potential to increase the diagnostic yield of WGS in acutely ill neonates and to discover novel disease genes. PMID:29263817
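    The precision and recall figures quoted above follow from the standard definitions. The toy example below sketches the consensus idea, keeping only deletions called by both of two hypothetical tools, and scores the result against a reference call set; the call names and counts are invented for illustration.

```python
# Toy consensus DSV evaluation against reference calls (illustrative only)
def evaluate(calls, truth):
    tp = len(calls & truth)                      # true positives
    fp = len(calls - truth)                      # false positives
    fn = len(truth - calls)                      # false negatives
    precision = tp / (tp + fp) if calls else 0.0
    recall = tp / (tp + fn) if truth else 0.0
    return precision, recall

# Consensus = keep only deletions reported by both tools (simplified SKALD idea)
tool_a = {"del1", "del2", "del3", "del5"}
tool_b = {"del2", "del3", "del4", "del5"}
truth = {"del2", "del3", "del5", "del6", "del7", "del8", "del9", "del10"}

consensus = tool_a & tool_b
p, r = evaluate(consensus, truth)
```

The intersection raises precision (each tool's private false positives are filtered out) at the cost of recall, which mirrors the high-precision, modest-recall trade-off reported for SKALD.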

  6. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  8. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  9. Sensitive ionization of non-volatile analytes using protein solutions as spray liquid in desorption electrospray ionization mass spectrometry.

    PubMed

    Zhu, Zhiqiang; Han, Jing; Zhang, Yan; Zhou, Yafei; Xu, Ning; Zhang, Bo; Gu, Haiwei; Chen, Huanwen

    2012-12-15

    Desorption electrospray ionization (DESI) is the most popular ambient ionization technique for direct analysis of complex samples without sample pretreatment. However, for many applications, especially for trace analysis, it is of interest to improve the sensitivity of DESI-mass spectrometry (MS). In traditional DESI-MS, a mixture of methanol/water/acetic acid is usually used to generate the primary ions. In this article, dilute protein solutions were electrosprayed in the DESI method to create multiply charged primary ions for the desorption ionization of trace analytes on various surfaces (e.g., filter paper, glass, Al-foil) without any sample pretreatment. The analyte ions were then detected and structurally characterized using an LTQ XL mass spectrometer. Compared with the methanol/water/acetic acid (49:49:2, v/v/v) solution, protein solutions significantly increased the signal levels of non-volatile compounds such as benzoic acid, TNT, o-toluidine, peptide and insulin in either positive or negative ion detection mode. For all the analytes tested, the limits of detection (LODs) were reduced to about half of the values obtained using traditional DESI. The results showed that the signal enhancement is highly correlated with the molecular weight of the proteins and the selected solid surfaces. The proposed DESI method is a universal strategy for rapid and sensitive detection of trace amounts of strongly bound and/or non-volatile analytes, including explosives, peptides, and proteins. The results indicate that the sensitivity of DESI can be further improved by selecting larger proteins and appropriate solid surfaces. Copyright © 2012 John Wiley & Sons, Ltd.

  10. Paired emitter-detector diode detection with dual wavelength monitoring for enhanced sensitivity to transition metals in ion chromatography with post-column reaction.

    PubMed

    O' Toole, Martina; Barron, Leon; Shepherd, Roderick; Paull, Brett; Nesterenko, Pavel; Diamond, Dermot

    2009-01-01

    The combination of post-column derivatisation and visible detection is regularly employed in ion chromatography (IC) to detect poorly absorbing species. Although this mode is often highly sensitive, one disadvantage is an increase in repeating baseline artifacts associated with out-of-sync pumping systems. The work presented here demonstrates a second-generation paired emitter-detector diode (PEDD-II) detection mode offering enhanced sensitivity to transition metals in IC by markedly reducing this problem and by improving the signal-to-noise ratio. First-generation designs demonstrated the use of a single integrated PEDD detector cell as a simple, small (15 x 5 mm), highly sensitive, low-cost photometric detector for the detection of metals in IC. The basic principle of this detection mode lies in the employment of two light emitting diodes (LEDs), one operating in normal mode as a light source and the other in reverse bias serving as a light detector. The second-generation PEDD-II design showed increased sensitivity for Mn(II)- and Co(II)-2-(pyridylazo)resorcinol (PAR) complexes as a result of two simultaneously acquiring detection cells: one analytical PEDD cell and one reference PEDD cell. The PEDD-II thus employs two wavelengths, one monitoring the analyte reaction product and the second monitoring a wavelength close to the isosbestic point. The optimum LED wavelength for the analytical cell was investigated to maximise peak response. The fabrication process for both the analytical and reference PEDD cells was validated by determining the reproducibility of detectors within a batch. The reproducibility and sensitivity of the PEDD-II detector were then investigated using signals obtained from both intra- and inter-day chromatograms.
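    The dual-wavelength correction described above can be sketched numerically: a baseline artifact common to both channels cancels when the reference (near-isosbestic) channel is subtracted from the analytical channel, leaving only the analyte peak. The signal shapes below are simulated for illustration only.

```python
import numpy as np

# Simulated dual-wavelength PEDD readout: a shared baseline artifact
# (pump pulsation) appears in both channels; the analyte band appears
# only in the analytical channel.
t = np.linspace(0, 60, 600)                      # seconds
pump_ripple = 0.05 * np.sin(2 * np.pi * t / 2)   # shared baseline artifact
peak = 0.8 * np.exp(-((t - 30) ** 2) / 8)        # analyte peak (analytical only)

analytical = peak + pump_ripple                  # analyte wavelength channel
reference = pump_ripple                          # near-isosbestic channel

corrected = analytical - reference               # ripple cancels, peak survives
```

Because the pump artifact is common-mode, the subtraction removes it exactly in this idealised model; in practice the cancellation is only as good as the match between the two cells, which is why batch reproducibility of the detectors was validated.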

  11. Verification of a SEU model for advanced 1-micron CMOS structures using heavy ions

    NASA Technical Reports Server (NTRS)

    Cable, J. S.; Carter, J. R.; Witteles, A. A.

    1986-01-01

    Modeling and test results are reported for 1 micron CMOS circuits. Analytical predictions are correlated with experimental data, and sensitivities to process and design variations are discussed. Unique features involved in predicting the SEU performance of these devices are described. The results show that the critical charge for upset exhibits a strong dependence on pulse width for very fast devices, so upset predictions must factor in the pulse shape. Acceptable SEU error rates can be achieved for a 1 micron bulk CMOS process. A thin retrograde well provides complete SEU immunity for N-channel hits at normal incidence angle. Source interconnect resistance can be an important parameter in determining upset rates, and Cf-252 testing can be a valuable tool for cost-effective SEU testing.

  12. Analytical model for orbital debris environmental management

    NASA Technical Reports Server (NTRS)

    Talent, David L.

    1990-01-01

    A differential equation, also referred to as the PIB (particle-in-a-box) model, expressing the time rate of change of the number of objects in orbit, is developed, and its applicability is illustrated. The model can be used as a tool for the assessment of LEO environment stability, and as a starting point for the development of numerical evolutionary models. Within the context of the model, evolutionary scenarios are examined, and found to be sensitive to the growth rate. It is determined that the present environment is slightly unstable to catastrophic growth, and that the number of particles on orbit will continue to increase until approximately 2250-2350 AD, with a maximum of 2,000,000. The model is expandable to the more realistic (complex) case of multiple species in a multiple-tier system.
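    A minimal numerical sketch of a particle-in-a-box model follows, assuming the generic form dN/dt = A + B·N + C·N², where A is a net deposition rate, B collects removal terms, and C represents collision-driven fragment production. The coefficients below are illustrative placeholders, not Talent's calibrated values.

```python
# Forward-Euler integration of a generic PIB-style population equation:
#   dN/dt = A + B*N + C*N**2   (coefficients are hypothetical)
def integrate(N0, A, B, C, years, dt=0.1):
    N, t = N0, 0.0
    while t < years:
        N += dt * (A + B * N + C * N * N)
        t += dt
    return N

# With a positive collision term the population keeps growing toward a
# (coefficient-dependent) equilibrium or, past a threshold, runs away.
N_50 = integrate(N0=10_000.0, A=300.0, B=-0.005, C=1e-8, years=50)
N_200 = integrate(N0=10_000.0, A=300.0, B=-0.005, C=1e-8, years=200)
```

Scenario studies of the kind described in the abstract amount to sweeping A (launch/deposition growth rate) and watching whether the trajectory stabilises or diverges.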

  13. Fabrication of chitosan-silver nanoparticle hybrid 3D porous structure as a SERS substrate for biomedical applications

    NASA Astrophysics Data System (ADS)

    Jung, Gyeong-Bok; Kim, Ji-Hye; Burm, Jin Sik; Park, Hun-Kuk

    2013-05-01

    We propose a simple, low-cost, large-area, and functional surface-enhanced Raman scattering (SERS) substrate for biomedical applications. The SERS substrate, a chitosan-silver nanoparticle (chitosan-Ag NP) hybrid 3D porous structure, was fabricated by a simple one-step method. Chitosan was used as a template for the Ag NP deposition. SERS enhancement by the chitosan-Ag NP substrate was experimentally verified using rhodamine B as an analyte. Thiolated single-stranded DNA was also measured for an atopic dermatitis genetic marker (chemokine CCL17) at a low concentration of 5 pM. We successfully designed a novel SERS substrate of silver nanoparticle-hybridized 3D porous chitosan that has the potential to become a highly sensitive and selective tool for biomedical applications.

  14. Separation and simultaneous quantitation of PGF2α and its epimer 8-iso-PGF2α using modifier-assisted differential mobility spectrometry tandem mass spectrometry.

    PubMed

    Liang, Chunsu; Sun, Hui; Meng, Xiangjun; Yin, Lei; Fawcett, J Paul; Yu, Huaidong; Liu, Ting; Gu, Jingkai

    2018-03-01

    Because many therapeutic agents are contaminated by epimeric impurities or form epimers as a result of metabolism, analytical tools capable of determining epimers are increasingly in demand. This article is a proof-of-principle report of a novel DMS-MS/MS method to separate and simultaneously quantify epimers, taking PGF2α and its 8-epimer, 8-iso-PGF2α, as an example. Good accuracy and precision were achieved in the range of 10-500 ng/mL with a run time of only 1.5 min. Isopropanol as organic modifier facilitated a good combination of sensitivity and separation. The method is the first example of the quantitation of epimers without chromatographic separation.

  15. Mono-isotope Prediction for Mass Spectra Using Bayes Network.

    PubMed

    Li, Hui; Liu, Chunmei; Rwebangira, Mugizi Robert; Burge, Legand

    2014-12-01

    Mass spectrometry is one of the most widely utilized methods to study protein functions and components. The challenge of mono-isotope pattern recognition from large-scale protein mass spectral data needs computational algorithms and tools to speed up the analysis and improve the analytic results. We utilized a naïve Bayes network as the classifier, under the assumption that the selected features are independent, to predict mono-isotope patterns from mass spectrometry data. Mono-isotopes detected from validated theoretical spectra were used as prior information in the Bayes method. Three main features extracted from the dataset were employed as independent variables in our model. Application of the proposed algorithm to a public Mo dataset demonstrates that our naïve Bayes classifier is advantageous over existing methods in both accuracy and sensitivity.
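    A naïve Bayes classifier of the kind described, assuming the features are independent given the class, can be sketched compactly. The two features and the data below are synthetic placeholders, not the paper's actual mass-spectral features.

```python
import numpy as np

# Tiny Gaussian naive Bayes: per-class means/variances per feature, plus priors.
# The independence assumption lets the log-likelihood factor over features.
class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.stats = {}
        for c in self.classes:
            Xc = X[y == c]
            self.stats[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = []
        for c in self.classes:
            mu, var, prior = self.stats[c]
            logp = -0.5 * np.sum(np.log(2 * np.pi * var)
                                 + (X - mu) ** 2 / var, axis=1)
            scores.append(logp + np.log(prior))
        return self.classes[np.argmax(scores, axis=0)]

# Synthetic two-feature "peak" data for the two classes (illustrative)
rng = np.random.default_rng(2)
X0 = rng.normal([0.2, 1.0], 0.1, size=(50, 2))   # non-mono-isotope class
X1 = rng.normal([0.8, 2.0], 0.1, size=(50, 2))   # mono-isotope class
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

clf = GaussianNB().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

In the paper's setting, the class priors would come from the mono-isotopes detected in validated theoretical spectra rather than from the training-set frequencies used here.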

  16. Protocol for Future Amino Acid Analyses of Samples Returned by the Stardust Mission

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Doty, J. H., III; Matrajt, G.; Dworkin, J. P.

    2006-01-01

    We have demonstrated that LC-ToF-MS coupled with UV fluorescence detection is a powerful tool for the detection of amino acids in meteorite extracts. Using this new analytical technique, we were able to identify the extraterrestrial amino acid AIB extracted from fifteen 20-micron-sized Murchison meteorite grains. We found that the amino acid contamination levels in Stardust aerogels were much lower than the levels observed in the Murchison meteorite. In addition, the alpha-dialkyl amino acids AIB and isovaline, which are the most abundant amino acids in Murchison, were not detected in the aerogel above blank levels. We are currently integrating LIF detection capability into our existing nanoflow LC-ToF-MS for the enhanced sensitivity required for the analysis of amino acids in Stardust samples.

  17. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  18. Semi-Analytical Models of CO2 Injection into Deep Saline Aquifers: Evaluation of the Area of Review and Leakage through Abandoned Wells

    EPA Science Inventory

    This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...

  19. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  20. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on the sigma scale by calculating sigma metrics for each parameter, and to apply the Westgard guidelines in selecting the appropriate Westgard rules and levels of internal quality control (IQC) needed to improve the performance of each target analyte. This is a retrospective study; the data required were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data comprise the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design an IQC protocol, address poor assay performance, and assess the efficiency of existing laboratory processes.
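    The sigma metric and quality goal index used in such studies follow standard formulas: sigma = (TEa% − |bias%|)/CV% and QGI = |bias%|/(1.5 × CV%). The sketch below applies them to hypothetical analytes; the TEa, bias, and CV values are invented, not the study's data.

```python
# Sigma metric and quality goal index (QGI), conventional formulas:
#   sigma = (TEa% - |bias%|) / CV%        QGI = |bias%| / (1.5 * CV%)
def sigma_metric(tea, bias, cv):
    return (tea - abs(bias)) / cv

def qgi(bias, cv):
    return abs(bias) / (1.5 * cv)

# Hypothetical analytes as (TEa%, bias%, CV%) -- not the study's figures
analytes = {
    "alkaline phosphatase": (30.0, 2.0, 4.0),
    "cholesterol":          (10.0, 8.0, 4.0),
    "urea":                 (9.0, 3.0, 5.0),
}

report = {}
for name, (tea, bias, cv) in analytes.items():
    s = sigma_metric(tea, bias, cv)
    # For underperforming assays, QGI points at the dominant problem:
    # QGI > 1.2 -> inaccuracy (bias), QGI < 0.8 -> imprecision (CV)
    problem = None if s >= 6 else ("inaccuracy" if qgi(bias, cv) > 1.2
                                   else "imprecision")
    report[name] = (round(s, 2), problem)
```

This is the same decision logic the study applies: analytes at ≥6 sigma need no action, while the QGI of the rest indicates whether bias or imprecision should be targeted first.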
