Sample records for quantitative analytical techniques

  1. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  2. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regression. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
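
    The power-model recommendation above lends itself to a short worked illustration. The sketch below, under the assumption sd = a·signal^b (the power model the paper recommends) and with entirely synthetic calibration data, estimates the variance function from replicates and compares weighted and unweighted fits.

    ```python
    # Hedged sketch: weighted calibration with a power model of variance.
    # The variance parameters (a, b), calibration levels and responses are
    # synthetic illustrations, not values from the study.
    import numpy as np

    rng = np.random.default_rng(0)

    conc = np.repeat(np.array([1.0, 5.0, 10.0, 50.0, 100.0]), 8)  # calibrants, 8 replicates each
    true_slope, true_intercept = 2.0, 0.5
    a, b = 0.05, 0.9                               # assumed power model: sd = a * signal**b
    signal_true = true_intercept + true_slope * conc
    signal = signal_true + rng.normal(scale=a * signal_true**b)

    # Estimate the variance function from replicate standard deviations.
    levels = np.unique(conc)
    means = np.array([signal[conc == c].mean() for c in levels])
    sds = np.array([signal[conc == c].std(ddof=1) for c in levels])
    b_hat, log_a_hat = np.polyfit(np.log(means), np.log(sds), 1)
    a_hat = np.exp(log_a_hat)

    # Weighted fit (polyfit weights multiply unsquared residuals, so pass 1/sd)
    # versus the unweighted fit that assumes homoskedasticity.
    wls = np.polyfit(conc, signal, 1, w=1.0 / (a_hat * signal**b_hat))
    ols = np.polyfit(conc, signal, 1)
    print(f"weighted:   y = {wls[0]:.3f}x + {wls[1]:.3f}")
    print(f"unweighted: y = {ols[0]:.3f}x + {ols[1]:.3f}")
    ```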

  3. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  4. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step), as well as the introduction of sample liquid wicking areas make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  5. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectrometry (SIMS), electron backscatter diffraction (EBSD), and atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques, i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN), that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to the characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to its capability when applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction to its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences

  6. Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization

    NASA Astrophysics Data System (ADS)

    Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija

    2017-07-01

    Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of a technique that can offer a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm, using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results indicated that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.

  7. Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Miele, David B.; Wigfield, Allan

    2014-01-01

    The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…

  8. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  9. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  11. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for the analysis and quantitation of selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to conventional chemical-analytical methodology, such as gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC/LC-tandem mass spectrometry (MS/MS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical) and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrices. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2 and EE2 is, in decreasing order: LC-MS/MS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MS/MS method was found to lack the required sensitivity for meeting environmental

  12. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype, which was developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
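
    The record above combines tokenization with multiset-based similarity; a minimal sketch of that flavor of scoring follows. The tokenizer and the choice of multiset Jaccard similarity are illustrative assumptions, not the authors' exact SCCS algorithm.

    ```python
    # Hedged sketch: score one equation-solving step by multiset token overlap.
    import re
    from collections import Counter

    def tokenize(equation: str) -> Counter:
        """Split an equation into a multiset of numbers, symbols and operators."""
        return Counter(re.findall(r"\d+|[a-zA-Z]+|[+\-*/=()]", equation.replace(" ", "")))

    def step_score(student: str, reference: str) -> float:
        """Multiset Jaccard similarity between a student step and a reference step."""
        s, r = tokenize(student), tokenize(reference)
        union = sum((s | r).values())
        return sum((s & r).values()) / union if union else 1.0

    # One step in solving 2x + 3 = 7:
    print(step_score("2x = 7 - 3", "2x = 4"))   # partial structural credit
    print(step_score("2x = 4", "2x = 4"))       # exact match -> 1.0
    ```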

  13. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to a quantitative analytical chemistry course, "Analytical Chemistry II", especially as related to essential oil analysis. The learning outcomes of this course include understanding of lectures, the skills to apply course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students not only are aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, abilities and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand material and problems in English.

  14. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  15. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  16. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  17. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. The modern equipment that makes this possible, including digital cameras, LED light sources, and computer software, is also discussed.
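
    The integration-plus-Gladstone-Dale workflow described above reduces to a short calculation. The sketch below uses a synthetic refractive-index-gradient profile (a real one would come from a calibrated schlieren or BOS measurement) and an assumed known reference density.

    ```python
    # Hedged sketch: density profile from a measured dn/dy via integration and
    # the Gladstone-Dale relation n - 1 = K * rho. Gradient profile is synthetic.
    import numpy as np

    K_AIR = 2.26e-4      # Gladstone-Dale constant for air, m^3/kg (visible light)
    RHO_REF = 1.20       # assumed known reference density at y = 0, kg/m^3

    y = np.linspace(0.0, 0.05, 200)           # wall-normal coordinate, m
    dn_dy = -1.5e-3 * np.exp(-y / 0.01)       # synthetic measured gradient, 1/m

    # Trapezoidal integration of dn/dy from the reference location gives n(y);
    # inverting the Gladstone-Dale relation then gives the density field.
    n = 1.0 + K_AIR * RHO_REF + np.concatenate(
        ([0.0], np.cumsum(0.5 * (dn_dy[1:] + dn_dy[:-1]) * np.diff(y))))
    rho = (n - 1.0) / K_AIR

    print(f"density at y=0: {rho[0]:.3f} kg/m^3, at y=5 cm: {rho[-1]:.3f} kg/m^3")
    ```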

  19. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  20. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma is associated with the disease activity of atopic dermatitis (AD) and is a promising tool for assessing the response to treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  1. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  2. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  3. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
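
    The standard-addition methodology referenced above reduces to a short calculation: spike known increments of analyte into the biological matrix, regress signal on added concentration, and read the endogenous level from the x-intercept. The numbers below are invented for illustration.

    ```python
    # Hedged sketch: endogenous concentration by standard addition.
    import numpy as np

    added = np.array([0.0, 10.0, 20.0, 40.0, 80.0])    # spiked concentration, uM
    signal = np.array([1.52, 2.48, 3.55, 5.49, 9.60])  # instrument response

    slope, intercept = np.polyfit(added, signal, 1)

    # Linear response: signal = slope * (C_endogenous + added), so the
    # endogenous level is the x-intercept magnitude, intercept / slope.
    c_endogenous = intercept / slope
    print(f"endogenous concentration ~ {c_endogenous:.1f} uM")
    ```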

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  5. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
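
    As a minimal illustration of the Monte Carlo techniques the entry surveys, the sketch below estimates a definite integral by uniform random sampling; the integrand and bounds are arbitrary.

    ```python
    # Minimal sketch: Monte Carlo estimation of an integral.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    x = rng.uniform(0.0, np.pi, n)             # sample the integration domain
    fx = np.sin(x)
    estimate = np.pi * fx.mean()               # (b - a) * mean of f(x)
    stderr = np.pi * fx.std(ddof=1) / np.sqrt(n)

    print(f"integral of sin(x) over [0, pi] ~ {estimate:.4f} +/- {stderr:.4f} (exact: 2)")
    ```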

  6. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics on raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  7. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  8. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  9. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm⁻¹); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm⁻¹)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R² = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R² = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R² = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
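
    A hedged sketch of the multivariate-reference idea above: regress laser excitation power on the intensities of internal-reference bands using PLS. The spectra below are synthetic Gaussian bands at the cited peak positions; the noise level, band widths and train/test split are illustrative assumptions.

    ```python
    # Hedged sketch: PLS regression from reference Raman bands to laser power.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    powers = rng.uniform(5.0, 65.0, 120)           # laser power, mW
    shifts = np.linspace(100, 800, 300)            # Raman shift axis, 1/cm

    def reference_spectrum(p):
        """Synthetic boson shoulder plus sapphire peaks, scaled by laser power."""
        bands = sum(np.exp(-0.5 * ((shifts - c) / 8.0) ** 2)
                    for c in (130, 380, 417, 646, 751))
        return p * bands + rng.normal(scale=0.5, size=shifts.size)

    X = np.array([reference_spectrum(p) for p in powers])
    pls = PLSRegression(n_components=3).fit(X[:100], powers[:100])

    pred = pls.predict(X[100:]).ravel()
    rmsep = np.sqrt(np.mean((pred - powers[100:]) ** 2))
    print(f"RMSEP on held-out synthetic spectra: {rmsep:.2f} mW")
    ```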

  10. A Quantitative Technique for Beginning Microscopists.

    ERIC Educational Resources Information Center

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  11. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.

  12. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratios. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from the analyte to the stable-isotope-labeled internal standard signal exist. It can also correct for the reverse situation, where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
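
    One plausible form of such a calibration, sketched below under the assumption that a fixed fraction k of the analyte signal leaks into the internal-standard channel: the measured ratio R = S_A / (S_IS + k·S_A) is then hyperbolic, not linear, in concentration. The model form, k, and all numbers are illustrative; the paper's exact function may differ.

    ```python
    # Hedged sketch: nonlinear calibration with analyte-to-IS isotopic cross talk.
    import numpy as np
    from scipy.optimize import curve_fit

    k = 0.05        # assumed fraction of analyte signal seen in the IS channel
    s_is = 100.0    # signal from the fixed amount of internal standard

    def measured_ratio(conc, response_factor):
        """Analyte/IS channel ratio, including analyte cross talk into IS."""
        s_a = response_factor * conc
        return s_a / (s_is + k * s_a)

    conc = np.array([1, 5, 10, 50, 100, 500, 1000.0])
    rng = np.random.default_rng(3)
    ratio = measured_ratio(conc, 2.0) * (1 + rng.normal(scale=0.01, size=conc.size))

    (rf_hat,), _ = curve_fit(measured_ratio, conc, ratio, p0=[1.0])

    # Invert the fitted curve to quantitate an unknown from its measured ratio.
    r_unknown = 1.8
    c_unknown = r_unknown * s_is / (rf_hat * (1 - k * r_unknown))
    print(f"fitted response factor {rf_hat:.3f}; unknown ~ {c_unknown:.1f} units")
    ```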

  13. Analytical insight into "breathing" crack-induced acoustic nonlinearity with an application to quantitative evaluation of contact cracks.

    PubMed

    Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng

    2018-08-01

    To characterize fatigue cracks, particularly in the undersized stage, in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced by the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between the CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This demonstrates the accuracy of the developed analytical model in interpreting contact crack-induced CAN and spotlights its application to the quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
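
    A hedged numerical illustration (not the paper's 2D analytical model): a breathing crack acts like a bilinear contact that transmits compressive half-cycles fully and tensile half-cycles only partially, which distorts the waveform and generates higher harmonics. The clipping model, the transmission factor, and the damage index A2/A1² (a commonly used CAN metric) are assumptions for illustration.

    ```python
    # Hedged sketch: second-harmonic generation by a bilinear "breathing" contact.
    import numpy as np

    fs, f0 = 1.0e6, 50.0e3                 # sampling rate and probing frequency, Hz
    t = np.arange(0, 2e-3, 1 / fs)
    incident = np.sin(2 * np.pi * f0 * t)

    alpha = 0.6                            # assumed transmission when the crack is open
    transmitted = np.where(incident < 0, incident, alpha * incident)

    spectrum = np.abs(np.fft.rfft(transmitted * np.hanning(t.size)))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    a1 = spectrum[np.argmin(np.abs(freqs - f0))]
    a2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]

    print(f"relative second harmonic A2/A1: {a2 / a1:.3f}")
    print(f"CAN damage index A2/A1^2: {a2 / a1 ** 2:.3e}")
    ```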

  14. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  16. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of the validation of these procedures is today widely spread in all the domains of activity where measurements are made. Nevertheless, the simple question of the acceptability or not of an analytical procedure for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the criteria of validation to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help industrial scientists in charge of drug development to apply those regulatory recommendations. While these first two guides contributed widely to the use and progress of analytical validations, they nevertheless present weaknesses regarding the conclusions of the performed statistical tests and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to review the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile, uses the notion of total error, and makes it possible to simplify the validation of an analytical procedure while controlling the risk associated with its usage. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory

  17. Calibrant-Free Analyte Quantitation via a Variable Velocity Flow Cell.

    PubMed

    Beck, Jason G; Skuratovsky, Aleksander; Granger, Michael C; Porter, Marc D

    2017-01-17

    In this paper, we describe a novel method for analyte quantitation that does not rely on calibrants, internal standards, or calibration curves but, rather, leverages the relationship between disparate and predictable surface-directed analyte flux to an array of sensing addresses and a measured resultant signal. To reduce this concept to practice, we fabricated two flow cells such that the mean linear fluid velocity, U, was varied systematically over an array of electrodes positioned along the flow axis. This resulted in a predictable variation of the address-directed flux of a redox analyte, ferrocenedimethanol (FDM). The resultant limiting currents measured at a series of these electrodes, and accurately described by a convective-diffusive transport model, provided a means to calculate an "unknown" concentration without the use of calibrants, internal standards, or a calibration curve. Furthermore, the experiment and concentration calculation only takes minutes to perform. Deviation in calculated FDM concentrations from true values was minimized to less than 0.5% when empirically derived values of U were employed.
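
    A hedged sketch of the calibrant-free principle described above: under laminar channel flow, convective-diffusive (Levich-type) theory predicts a limiting current i_lim = K·c·U^(1/3), where K collects n, F, the diffusion coefficient and the electrode/channel geometry and can be computed a priori. With K known, currents measured at several velocities over-determine the concentration. The constant K, the velocities and the noise level below are invented for illustration.

    ```python
    # Hedged sketch: concentration from velocity-dependent limiting currents,
    # with no calibrants -- only a transport model and a precomputed constant K.
    import numpy as np

    K = 3.0e-2    # assumed precomputed constant, A per (mol/L) per (m/s)**(1/3)
    U = np.array([0.002, 0.005, 0.010, 0.020, 0.040])   # mean linear velocities, m/s

    rng = np.random.default_rng(7)
    c_true = 5.0e-4                                     # mol/L, the "unknown"
    i_lim = K * c_true * U ** (1 / 3) * (1 + rng.normal(scale=0.01, size=U.size))

    # Least-squares estimate of c against the predicted flux basis K * U**(1/3).
    basis = K * U ** (1 / 3)
    c_hat = basis @ i_lim / (basis @ basis)
    print(f"estimated concentration: {c_hat:.3e} mol/L (true: {c_true:.1e})")
    ```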

  18. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
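
    As a toy illustration of the linear-programming allocation mentioned above, the sketch below splits a fixed land area between two uses under a budget constraint. All coefficients are invented; real formulations would come from the planning data.

    ```python
    # Toy sketch: land-use allocation by linear programming.
    from scipy.optimize import linprog

    # Maximize 40*x_timber + 25*x_recreation (linprog minimizes, so negate).
    c = [-40.0, -25.0]
    A_ub = [[1.0, 1.0],     # total area:  x_t + x_r <= 1000 acres
            [30.0, 10.0]]   # budget:    30*x_t + 10*x_r <= 18000 dollars
    b_ub = [1000.0, 18000.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(f"timber: {res.x[0]:.0f} acres, recreation: {res.x[1]:.0f} acres, "
          f"total benefit: {-res.fun:.0f}")
    ```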

  19. Quantitative ¹H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative ¹H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and the ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
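
    Because signal area in ¹H NMR is proportional to the number of contributing nuclei, quantitation against a calibrant follows from one working relation. The sketch below implements the standard internal-calibrant purity equation; the example numbers (integrals, masses, proton counts) are invented for illustration.

    ```python
    # Hedged sketch: the basic qHNMR working relation for purity (w/w) with an
    # internal calibrant. All input numbers are illustrative.
    def qhnmr_purity(i_a, i_cal, n_a, n_cal, m_a, m_cal, w_a, w_cal, purity_cal):
        """Analyte purity from integrals I, proton counts N, molar masses M,
        weighed masses w, and the calibrant's certified purity."""
        return (i_a / i_cal) * (n_cal / n_a) * (m_a / m_cal) * (w_cal / w_a) * purity_cal

    purity = qhnmr_purity(
        i_a=1.000, i_cal=0.953,    # normalized peak integrals
        n_a=2, n_cal=2,            # protons behind each quantified signal
        m_a=180.16, m_cal=116.07,  # molar masses, g/mol
        w_a=12.0, w_cal=7.0,       # weighed sample and calibrant, mg
        purity_cal=0.999,          # certified calibrant purity
    )
    print(f"analyte purity ~ {purity:.3f} (w/w)")
    ```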

  20. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  1. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  2. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 μg for each of 12 compounds analyzed by LC/MS and 0.3-30 μg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m³ in general for LC/MS analytes and 0.005-0.5 mg/m³ for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m³ for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m³). Total fluorine results may be used
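
    The reportable ranges quoted above follow directly from C = m/V with a 60 L air sample, as the short check below confirms.

    ```python
    # Worked check: on-tube mass over a 60 L air sample -> air concentration.
    V_m3 = 60 / 1000                       # 60 L sampled, in cubic meters
    for m_ug in (0.06, 6.0, 0.3, 30.0):    # spiking-range endpoints, micrograms
        c_mg_m3 = (m_ug / 1000) / V_m3     # mg per cubic meter
        print(f"{m_ug:>5} ug over 60 L -> {c_mg_m3:.3f} mg/m^3")
    ```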

  3. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable water and the waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the sources of water pollution.

  4. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; and the potential of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements, which will undoubtedly see great development in the immediate future.

  5. Combined use of quantitative ED-EPMA, Raman microspectrometry, and ATR-FTIR imaging techniques for the analysis of individual particles.

    PubMed

    Jung, Hae-Jin; Eom, Hyo-Jin; Kang, Hyun-Woo; Moreau, Myriam; Sobanska, Sophie; Ro, Chul-Un

    2014-08-21

    In this work, quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA) (called low-Z particle EPMA), Raman microspectrometry (RMS), and attenuated total reflectance Fourier transform infrared spectroscopic (ATR-FTIR) imaging were applied in combination for the analysis of the same individual airborne particles for the first time. After examining individual particles of micrometer size by low-Z particle EPMA, consecutive examinations by RMS and ATR-FTIR imaging of the same individual particles were then performed. The relocation of the same particles on Al or Ag foils was successfully carried out among the three standalone instruments for several standard samples and an indoor airborne particle sample, resulting in the successful acquisition of quality spectral data from the three single-particle analytical techniques. The combined application of the three techniques to several different standard particles confirmed that those techniques provided consistent and complementary chemical composition information on the same individual particles. Further, it was clearly demonstrated that the three different types of spectral and imaging data from the same individual particles in an indoor aerosol sample provided richer information on physicochemical characteristics of the particle ensemble than that obtainable by the combined use of two single-particle analytical techniques.

  6. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and the third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using a linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  7. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is a promising disinfectant that is usually used as a secondary disinfectant, whereas the selection of the proper analytical technique for monitoring it, to ensure disinfection and regulatory compliance, has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. To discern the effectiveness of this superior technique, various factors that might influence performance, such as sample temperature, high ionic strength and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision, and the influencing factors studied did not diminish its performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.

  8. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    The limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography and electro-analytical instruments are not only expensive and time consuming but also require maintenance and the replacement of damaged essential parts, which is a serious concern. Moreover, for field studies and instant detection, the installation of these instruments is not convenient at every location. Therefore, a pre-concentration technique for metal ions, especially suited to lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, inexpensive and time efficient; it is easy to carry (a 10-20 g vial) to the experimental field/site, as has been demonstrated.

  9. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  10. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    PubMed

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  11. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before the chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. This work presents information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis, as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  14. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  15. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  16. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Developing High-Frequency Quantitative Ultrasound Techniques to Characterize Three-Dimensional Engineered Tissues

    NASA Astrophysics Data System (ADS)

    Mercado, Karla Patricia E.

    Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.
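
    As a concrete illustration of the backscatter metric named above: the IBC is the backscatter coefficient averaged (or integrated) over the analysis bandwidth. The sketch below assumes a common reference-normalization scheme in which the sample's backscattered power spectrum is divided by a reference spectrum acquired with the same transducer; the arrays and the reference value are placeholders, not data from the thesis.

```python
# Minimal sketch of an integrated backscatter coefficient (IBC) estimate.
# The frequency-dependent backscatter coefficient (BSC) is estimated from
# the ratio of the sample's backscattered power spectrum to a reference
# spectrum, scaled by the reference's known BSC; the IBC is then its
# average over the analysis bandwidth. All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 64                            # spectral bins across the analysis band
W_sample = rng.random(n) + 1.0    # sample power spectrum (placeholder)
W_ref = rng.random(n) + 1.0       # reference power spectrum (placeholder)
bsc_ref = 1e-4                    # assumed known reference BSC, 1/(cm*sr)

bsc = bsc_ref * W_sample / W_ref  # frequency-dependent BSC estimate
ibc = bsc.mean()                  # band-averaged IBC
print(f"IBC ~ {ibc:.2e} 1/(cm*sr)")
```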

  18. Determination of Ca content of coral skeleton by analyte additive method using the LIBS technique

    NASA Astrophysics Data System (ADS)

    Haider, A. F. M. Y.; Khan, Z. H.

    2012-09-01

    The laser-induced breakdown spectroscopy (LIBS) technique was used to study the elemental profile of coral skeletons. Apart from calcium and carbon, which are the main elemental constituents of coral skeleton, elements such as Sr, Na, Mg, Li, Si, Cu, Ti, K, Mn, Zn, Ba, Mo, Br and Fe were detected in the coral skeletons from the Inani Beach and the Saint Martin's island of Bangladesh and in the coral from the Philippines. In addition to the qualitative analysis, a quantitative analysis of the main elemental constituent, calcium (Ca), was performed. The result shows the presence of (36.15±1.43)% by weight of Ca in the coral skeleton collected from the Inani Beach, Cox's Bazar, Bangladesh. It was determined using six calibration curves, drawn for six emission lines of Ca I (428.301 nm, 428.936 nm, 431.865 nm, 443.544 nm, 443.569 nm, and 445.589 nm), by the standard analyte additive method. An AAS measurement of the same sample of coral skeleton gave a Ca content of 39.87% by weight, which compares fairly well with the result obtained by the analyte additive method.
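
    For readers unfamiliar with the analyte (standard) additive method used here: the line intensity is regressed against the added analyte concentration, and the original content follows from the magnitude of the x-intercept of that line. A minimal sketch for a single emission line, with illustrative numbers rather than the paper's data:

```python
# Standard/analyte addition for one emission line: fit intensity vs.
# added concentration, then recover the unknown original concentration
# from C0 = intercept / slope (the x-intercept magnitude).
import numpy as np

added = np.array([0.0, 5.0, 10.0, 15.0])        # added Ca, arbitrary units
intensity = np.array([1.20, 1.38, 1.55, 1.73])  # line intensity (illustrative)

slope, intercept = np.polyfit(added, intensity, 1)
c0 = intercept / slope                          # original concentration
print(f"estimated Ca content: {c0:.2f} (same units as 'added')")
# The paper averages this over six Ca I lines / calibration curves.
```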

  19. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents the cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer; and (iii) risk assessment for breast cancer.
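
    The two-level functionality described above is a common pattern that can be illustrated generically: extract a feature panel per image, then fit a multivariate model on those features. The sketch below is a minimal stand-in using synthetic data and scikit-learn; it does not use or represent the CaPTk API.

```python
# Level 1: extract a small quantitative feature panel per image.
# Level 2: feed the feature matrix into a multivariate predictive model.
# Synthetic data; a stand-in for the two-level paradigm, not CaPTk code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def extract_features(image: np.ndarray) -> np.ndarray:
    """A tiny feature panel: intensity histogram summary statistics."""
    return np.array([image.mean(), image.std(),
                     np.percentile(image, 10), np.percentile(image, 90)])

images = rng.normal(size=(40, 64, 64))      # synthetic "scans"
labels = rng.integers(0, 2, size=40)        # synthetic clinical outcomes
X = np.vstack([extract_features(im) for im in images])

model = LogisticRegression().fit(X, labels) # multivariate predictive model
print("predicted risk for first case:", model.predict_proba(X[:1])[0, 1])
```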

  20. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all these techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    PubMed

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm the compounds and forms related to diseases with the highest possible sensitivity, quantitative performance and specificity and minimal invasiveness, within a short time, easily, and at a low cost. Technical innovation in mass spectrometry (MS) has led to techniques that meet such requirements. Besides confirming known substances, MS has other purposes and advantages that are not fully known to the public, such as serving as a tool to discover unknown phenomena and compounds. An example is clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of protein and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), and other individual basic technologies, we succeeded in discovering new disease biomarker candidates for Alzheimer's disease, cancer, etc. Further contributions of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front.

  2. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  3. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
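
    Several of the metrics listed above are straightforward to compute from a fixation sequence. The sketch below derives dwell percentages and a first-order entropy-rate measure from an instrument-to-instrument transition matrix; the fixation sequence is synthetic, and weighting the row entropies by the dwell percentages is our illustrative choice.

```python
# Dwell percentages and a first-order entropy-rate measure from a
# sequence of instrument fixations (synthetic indices 0..3).
import numpy as np

fixations = [0, 0, 1, 2, 1, 0, 3, 3, 1, 0, 2, 1]   # instrument indices
n = max(fixations) + 1

dwell_pct = np.bincount(fixations, minlength=n) / len(fixations)

T = np.zeros((n, n))                                # transition counts
for a, b in zip(fixations[:-1], fixations[1:]):
    T[a, b] += 1
T = T / T.sum(axis=1, keepdims=True)                # row-stochastic matrix

# Shannon entropy of each row (bits), weighted by occupancy.
with np.errstate(divide="ignore", invalid="ignore"):
    row_H = -np.nansum(np.where(T > 0, T * np.log2(T), 0.0), axis=1)
H_rate = float(dwell_pct @ row_H)
print(f"dwell %: {dwell_pct.round(2)}, entropy rate ~ {H_rate:.2f} bits/transition")
```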

  4. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in selecting the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence in the solid state as well. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in driving the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to give a general overview of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, evidencing their respective potential advantages and limits. The applications of each examined technique are described and discussed with pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyl-deoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three spiking concentration levels of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying the MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are
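
    The recovery and precision figures quoted above are simple derived quantities. The sketch below shows how a percent recovery and a relative standard deviation are computed from replicate spike results; the numbers are illustrative, not the paper's data.

```python
# Spike-recovery and precision calculations used in method validation:
# percent recovery of a spiked amount and the relative standard
# deviation (RSD) across replicate determinations.
import numpy as np

spiked_ug_kg = 80.0
measured = np.array([74.5, 78.2, 81.0])             # replicates, ug/kg

recovery_pct = measured.mean() / spiked_ug_kg * 100
rsd_pct = measured.std(ddof=1) / measured.mean() * 100
print(f"recovery = {recovery_pct:.0f}%, RSD = {rsd_pct:.1f}%")
```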

  6. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
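
    Steps (1) and (3) of the quantitation workflow above can be illustrated compactly: each pixel's ion intensity is mapped to a surface density by inverting an external quadratic calibration curve, and values outside the limits of quantitation are masked. The coefficients and the ion image below are synthetic, and the isomeric-interference correction (step 2) is omitted.

```python
# Per-pixel quantitation against an external quadratic calibration
# I = a*d**2 + b*d + c, followed by masking outside the limits of
# quantitation. All values are synthetic placeholders.
import numpy as np

a, b, c = 0.002, 1.5, 0.1                     # illustrative calibration fit
lloq, uloq = 0.5, 40.0                        # lower/upper quantitation limits

intensity = np.random.rand(128, 128) * 60.0   # synthetic ion image

# Invert the calibration per pixel (positive root of the quadratic).
density = (-b + np.sqrt(b**2 - 4 * a * (c - intensity))) / (2 * a)
density = np.where((density >= lloq) & (density <= uloq), density, np.nan)
print(f"quantifiable pixels: {np.isfinite(density).mean():.0%}")
```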

  7. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution, and a novel technique that allows us to obtain such data is needed for the systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high-precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as reproducibility comparable to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears to be a highly quantitative and versatile technique that can be a convenient replacement for the most conventional techniques, including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  8. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on the miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, the shape and changes in the states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial- and high-digital-resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  9. Quantitative and qualitative sensing techniques for biogenic volatile organic compounds and their oxidation products.

    PubMed

    Kim, Saewung; Guenther, Alex; Apel, Eric

    2013-07-01

    The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of interplays between BVOCs and atmospheric oxidants such as the hydroxyl radical (OH), ozone (O3) and NOx (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and the reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed, along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers on each specific analytical technique are available, readers are referred to those publications for detailed descriptions. The aim of this review is thus for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.

  10. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  11. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

    The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early works of Lattes with cosmic rays to the recent applications of the PIXE (particle-induced X-ray emission) technique to the study of air pollution problems in large cities such as São Paulo and Rio de Janeiro. Studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.

  12. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for the evaluation, control, and reporting of risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool provides the possibility of finding the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  13. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment using historical weather-station data towards dynamic simulation and forecasting of the thermal environment at various scales. This study reviews the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques are summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.

  14. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and used to calculate a ratio value equal to the ratio of the concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and of an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
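
    The core idea is easy to sketch: divide two baseline-corrected chromatograms point by point and average the ratio over the analyte's pure-elution window. The signals below are synthetic Gaussian peaks, and the specific weighting (proportional to the squared denominator signal, i.e., assuming roughly constant detector noise) is our assumption standing in for the paper's variance weighting.

```python
# Ratio-chromatogram sketch: point-by-point ratio of two baseline-
# corrected chromatograms, with a weighted average over the analyte's
# pure-elution window giving the concentration ratio between injections.
import numpy as np

t = np.linspace(0, 10, 500)
peak = lambda c, mu, s: c * np.exp(-((t - mu) ** 2) / (2 * s**2))

chrom1 = peak(1.0, 4.0, 0.2) + np.random.normal(0, 0.005, t.size)
chrom2 = peak(1.3, 4.0, 0.2) + np.random.normal(0, 0.005, t.size)  # +30%

window = (t > 3.5) & (t < 4.5)                 # pure-elution region
r = chrom2[window] / chrom1[window]            # point-by-point ratio
w = chrom1[window] ** 2                        # weight ~ local S/N (assumed)
ratio_value = np.sum(w * r) / np.sum(w)
print(f"concentration ratio ~ {ratio_value:.3f}  (expected ~1.300)")
```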

  15. A novel analytical technique suitable for the identification of plastics.

    PubMed

    Nečemer, Marijan; Kump, Peter; Sket, Primož; Plavec, Janez; Grdadolnik, Jože; Zvanut, Maja

    2013-01-01

    The enormous development and production of plastic materials in the last century has resulted in an increasing number of plastic objects. A simple and fast technique to classify different types of plastics would be useful in many activities dealing with plastic materials, such as the packaging of food and the sorting of used plastics, and, if the technique is non-destructive, in the conservation of plastic artifacts in museum collections, a relatively new field of interest since 1990. In our previous paper we introduced a non-destructive technique for the fast identification of unknown plastics based on EDXRF spectrometry [1], using as a case study some plastic artifacts archived in the Museum, in order to show the advantages of the non-destructive identification of plastic material. To validate our technique, it was necessary to compare its analyses with those of analytical techniques that are more established and so far rather widely applied to identifying the most common sorts of plastic materials.

  16. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve the quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated through the analysis of beer spectra. The derivative spectra of beer and marzipan samples are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance than the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
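
    The SPSE itself is specified in the paper; for orientation, the sketch below shows the Savitzky-Golay benchmark it is compared against, a polynomial least-squares smoothing filter whose deriv argument returns smoothed derivative spectra. The spectrum is synthetic.

```python
# Savitzky-Golay first-derivative spectrum of a noisy synthetic NIR band,
# the baseline method against which the SPSE is benchmarked.
import numpy as np
from scipy.signal import savgol_filter

wavelength = np.linspace(1100, 2500, 700)               # nm, illustrative
spectrum = np.exp(-((wavelength - 1700) / 80) ** 2)     # synthetic band
spectrum += np.random.normal(0, 0.01, wavelength.size)  # measurement noise

d1 = savgol_filter(spectrum, window_length=21, polyorder=3,
                   deriv=1, delta=wavelength[1] - wavelength[0])
print("max |dA/dlambda| =", np.abs(d1).max())
```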

  17. Speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance-Fourier transform-infrared imaging and quantitative energy-dispersive electron probe X-ray microanalysis techniques.

    PubMed

    Jung, Hae-Jin; Malek, Md Abdul; Ryu, JiYeon; Kim, BoWha; Song, Young-Chul; Kim, HyeKyeong; Ro, Chul-Un

    2010-07-15

    Our previous work demonstrated for the first time the potential of the combined use of two techniques, attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis, low-Z particle EPMA, for the characterization of individual aerosol particles. In this work, the speciation of mineral particles was performed on a single particle level for 24 mineral samples, including kaolinite, montmorillonite, vermiculite, talc, quartz, feldspar, calcite, gypsum, and apatite, by the combined use of ATR-FT-IR imaging and low-Z particle EPMA techniques. These two single particle analytical techniques provide complementary information, the ATR-FT-IR imaging on mineral types and low-Z particle EPMA on the morphology and elemental concentrations, on the same individual particles. This work demonstrates that the combined use of the two single particle analytical techniques can powerfully characterize externally heterogeneous mineral particle samples in detail and has great potential for the characterization of airborne mineral dust particles.

  18. The analytical representation of viscoelastic material properties using optimization techniques

    NASA Technical Reports Server (NTRS)

    Hill, S. A.

    1993-01-01

    This report presents a technique for modeling viscoelastic material properties with a function in the form of a Prony series. Generally, the method employed to determine the function's constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is implemented in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was used to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
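
    The idea of freeing the exponential time constants rather than fixing them can be sketched with a general-purpose nonlinear least-squares solver. This is an illustration of the approach, not the PRONY program or the VMA tool; the data and starting values are synthetic.

```python
# Fit every constant of a two-term Prony series, including the
# exponential time constants, by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def prony(params, t):
    g_inf, g1, tau1, g2, tau2 = params
    return g_inf + g1 * np.exp(-t / tau1) + g2 * np.exp(-t / tau2)

t = np.logspace(-2, 3, 60)                     # relaxation times, s
true = np.array([1.0, 4.0, 0.5, 2.0, 50.0])    # synthetic "material"
data = prony(true, t) * (1 + np.random.normal(0, 0.01, t.size))

fit = least_squares(lambda p: prony(p, t) - data,
                    x0=[1, 1, 0.1, 1, 10],
                    bounds=(1e-6, np.inf))     # keep moduli and taus positive
print("fitted constants:", fit.x.round(3))
```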

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
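
    Once the NGS procedure has estimated, for each method, the slope of the assumed linear relation between measured and true values and the standard deviation of the measurement noise, ranking by precision reduces to comparing noise-to-slope ratios. A minimal sketch with illustrative (slope, sigma) pairs:

```python
# Rank quantitative imaging methods by the noise-to-slope ratio (NSR):
# lower NSR indicates better precision. The parameter estimates here
# are illustrative stand-ins for NGS outputs.
methods = {"recon A": (0.92, 0.08),   # (slope, noise sigma)
           "recon B": (1.05, 0.15),
           "recon C": (0.85, 0.05)}

nsr = {name: sigma / abs(slope) for name, (slope, sigma) in methods.items()}
for name in sorted(nsr, key=nsr.get):
    print(f"{name}: NSR = {nsr[name]:.3f}")
```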

  20. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
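
    The NSR figure of merit itself is simple once the linear-model parameters are in hand: each method m is modeled as measuring a_m * (true value) + b_m plus zero-mean noise of standard deviation sigma_m, and NSR_m = sigma_m / a_m. A minimal illustration follows, with the NGS-estimated slopes and noise terms assumed to be already available (all values hypothetical):

        # Minimal illustration of the noise-to-slope ratio (NSR) figure of merit.
        # In the paper the slope a_m and noise sigma_m of each method are estimated
        # without ground truth; here they are assumed to be already estimated.
        # measured_m ~ a_m * true + b_m + N(0, sigma_m)
        estimates = {
            # method: (slope a_m, noise sigma_m) -- hypothetical values
            "OSEM":     (0.95, 0.08),
            "OSEM+PSF": (1.02, 0.05),
            "FBP":      (0.90, 0.12),
        }

        nsr = {m: sigma / a for m, (a, sigma) in estimates.items()}
        for method in sorted(nsr, key=nsr.get):
            print(f"{method:10s} NSR = {nsr[method]:.3f}")  # smaller = more precise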

  1. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    NASA Technical Reports Server (NTRS)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. The output of standard implementations of these visualization techniques in test facilities is often limited to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
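
    The paper's exact processing pipeline is not reproduced here, but a common way to quantify unsteady-flow frequency content from a high-speed image sequence is to track the intensity of a small interrogation window over time and estimate its power spectral density. The sketch below assumes a Welch estimate and synthetic data.

        # Hedged sketch: estimating unsteady-flow frequency content from a
        # high-speed schlieren image sequence via the Welch PSD of a
        # pixel-region intensity trace (e.g., a window over a moving shock).
        import numpy as np
        from scipy.signal import welch

        fps = 10000.0                           # assumed camera frame rate [Hz]
        frames = np.random.rand(2048, 64, 64)   # stand-in for the image stack (t, y, x)

        # mean intensity in an interrogation window placed over the unsteady feature
        trace = frames[:, 30:34, 40:44].mean(axis=(1, 2))
        trace -= trace.mean()                   # remove DC before spectral estimation

        f, psd = welch(trace, fs=fps, nperseg=512)
        print(f[np.argmax(psd)], "Hz dominant oscillation frequency")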

  2. Quantitative phase imaging method based on an analytical nonparaxial partially coherent phase optical transfer function.

    PubMed

    Bao, Yijun; Gaylord, Thomas K

    2016-11-01

    Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.

  3. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-RNA-extraction method provided quantitative PCR data similar to those of the RNA-extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled), and the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-RNA-extraction technique was tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
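
    The 13.9-fold change reported above is the kind of number produced by the comparative-Ct (2^-ddCt) calculation, which is assumed here for illustration; the Ct values below are hypothetical and chosen only to reproduce a similar fold change.

        # Hedged example: comparative-Ct (2^-ddCt) relative quantification,
        # target gene normalized to a reference gene, post-exercise relative
        # to baseline. Ct values are hypothetical.
        ct = {  # (gene, condition): mean cycle threshold
            ("PDK4", "pre"):  28.0, ("GAPDH", "pre"):  20.0,
            ("PDK4", "post"): 24.4, ("GAPDH", "post"): 20.2,
        }
        d_pre  = ct[("PDK4", "pre")]  - ct[("GAPDH", "pre")]   # dCt at baseline
        d_post = ct[("PDK4", "post")] - ct[("GAPDH", "post")]  # dCt after exercise
        fold = 2 ** -(d_post - d_pre)
        print(f"fold change = {fold:.1f}")  # ~13.9 with these illustrative Cts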

  4. Counterfeit drugs: analytical techniques for their identification.

    PubMed

    Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S

    2010-09-01

    In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step, as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish genuine products from counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques has been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.

  5. Development of Impurity Profiling Methods Using Modern Analytical Techniques.

    PubMed

    Ramachandra, Bondigalla

    2017-01-02

    This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS) and high-performance thin layer chromatography (HPTLC). In addition, the application of nuclear magnetic resonance (NMR) spectroscopy is discussed for the characterization of impurities and degradation products. The significance of the quality, efficacy and safety of drug substances/products, including the source of impurities, kinds of impurities, adverse effects caused by the presence of impurities, quality control of impurities, the necessity for the development of impurity profiling methods, identification of impurities and regulatory aspects, is discussed. Other important aspects discussed are forced degradation studies and the development of stability-indicating assay methods.

  6. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution.

  7. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution.

  8. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

    Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This facility, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  9. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

    Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  10. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and the highest possible throughput of the in vitro set-up and the analytical tool. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Chemical speciation of individual airborne particles by the combined use of quantitative energy-dispersive electron probe X-ray microanalysis and attenuated total reflection Fourier transform-infrared imaging techniques.

    PubMed

    Song, Young-Chul; Ryu, JiYeon; Malek, Md Abdul; Jung, Hae-Jin; Ro, Chul-Un

    2010-10-01

    In our previous work, it was demonstrated that the combined use of attenuated total reflectance (ATR) FT-IR imaging and quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA), named low-Z particle EPMA, had the potential for characterization of individual aerosol particles. Additionally, the speciation of individual mineral particles was performed on a single particle level by the combined use of the two techniques, demonstrating that simultaneous use of the two single particle analytical techniques is powerful for the detailed characterization of externally heterogeneous mineral particle samples and has great potential for characterization of atmospheric mineral dust aerosols. These single particle analytical techniques provide complementary information on the physicochemical characteristics of the same individual particles, such as low-Z particle EPMA on morphology and elemental concentrations and the ATR-FT-IR imaging on molecular species, crystal structures, functional groups, and physical states. In this work, this analytical methodology was applied to characterize an atmospheric aerosol sample collected in Incheon, Korea. Overall, 118 individual particles were observed to be primarily NaNO(3)-containing, Ca- and/or Mg-containing, silicate, and carbonaceous particles, although internal mixing states of the individual particles proved complicated. This work demonstrates that more detailed physicochemical properties of individual airborne particles can be obtained using this approach than when either the low-Z particle EPMA or ATR-FT-IR imaging technique is used alone.

  12. MRI technique for the snapshot imaging of quantitative velocity maps using RARE.

    PubMed

    Shiko, G; Sederman, A J; Gladden, L F

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T(2) weighted, not T(2)(∗) weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98×49 μm(2), within 20 min, and monitored over ∼13 h. The tablet was observed to experience a heterogeneous flow field and, hence a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390×390 μm(2). The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    NASA Astrophysics Data System (ADS)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2 weighted, not T2∗ weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm2, within 20 min, and monitored over ˜13 h. The tablet was observed to experience a heterogeneous flow field and, hence a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm2. The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.

  14. A Meta-Analytic Investigation of Fiedler's Contingency Model of Leadership Effectiveness.

    ERIC Educational Resources Information Center

    Strube, Michael J.; Garcia, Joseph E.

    According to Fiedler's Contingency Model of Leadership Effectiveness, group performance is a function of the leader-situation interaction. A review of past validations has found several problems associated with the model. Meta-analytic techniques were applied to the Contingency Model in order to assess the validation evidence quantitatively. The…

  15. Quantitative techniques for musculoskeletal MRI at 7 Tesla.

    PubMed

    Bangerter, Neal K; Taylor, Meredith D; Tarbox, Grayson J; Palmer, Antony J; Park, Daniel J

    2016-12-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems.

  16. Quantitative techniques for musculoskeletal MRI at 7 Tesla

    PubMed Central

    Taylor, Meredith D.; Tarbox, Grayson J.; Palmer, Antony J.; Park, Daniel J.

    2016-01-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems. PMID:28090448

  17. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  18. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
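
    As an illustration only (the paper's noise definition for PCR-MPS is more detailed), one common recipe for an analytical threshold derived from background noise is the mean noise level plus a multiple of its standard deviation:

        # Illustrative only: threshold = mean background + k standard deviations.
        # Read counts below are hypothetical noise reads at non-allelic positions.
        import statistics

        noise_reads = [3, 5, 2, 8, 4, 6, 3, 7, 5, 4]  # hypothetical background counts
        k = 3                                          # coverage factor (assumption)
        mu = statistics.mean(noise_reads)
        sd = statistics.stdev(noise_reads)
        threshold = mu + k * sd
        print(f"analytical threshold = {threshold:.1f} reads")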

  19. Effect of different analyte diffusion/adsorption protocols on SERS signals

    NASA Astrophysics Data System (ADS)

    Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju

    2018-07-01

    The effect of different analyte diffusion/adsorption protocols, an aspect often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were compared: a highly concentrated dilution (HCD) protocol, a half-half dilution (HHD) protocol and a layered adsorption (LA) protocol; the SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified by polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes causes different SERS signal enhancements. The HHD protocol produced the most uniform and reproducible samples, and the corresponding signal intensity of the analyte was the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
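
    A toy version of the transport picture invoked above, assuming (this is not the authors' model) a 1-D diffusion equation with the SERS substrate acting as a perfect sink at the bottom of the liquid layer; different dilution protocols would enter through the initial concentration profile:

        # Hedged sketch: explicit finite differences for dc/dt = D d2c/dz2,
        # substrate at z = 0 as a perfect sink, no-flux top boundary.
        import numpy as np

        D = 1e-9                  # analyte diffusivity in water [m^2/s] (typical)
        L, nz = 1e-3, 100         # 1 mm liquid layer, 100 grid cells
        dz = L / nz
        dt = 0.4 * dz**2 / D      # explicit-scheme stability limit
        c = np.ones(nz)           # uniform start (HHD-like); LA/HCD would differ
        total0 = c.sum()

        for _ in range(20000):    # ~13 minutes of simulated diffusion
            c[0] = 0.0                                  # perfect sink at substrate
            lap = np.zeros_like(c)
            lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
            lap[-1] = c[-2] - c[-1]                     # no-flux top boundary
            c += D * dt / dz**2 * lap

        print(f"fraction adsorbed: {(total0 - c.sum()) / total0:.3f}")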

  20. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  1. Hydrolysis Studies and Quantitative Determination of Aluminum Ions Using [superscript 27]Al NMR: An Undergraduate Analytical Chemistry Experiment

    ERIC Educational Resources Information Center

    Curtin, Maria A.; Ingalls, Laura R.; Campbell, Andrew; James-Pederson, Magdalena

    2008-01-01

    This article describes a novel experiment focused on metal ion hydrolysis and the equilibria related to metal ions in aqueous systems. Using [superscript 27]Al NMR, the students become familiar with NMR spectroscopy as a quantitative analytical tool for the determination of aluminum by preparing a standard calibration curve using standard aluminum…
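
    The quantitation step itself reduces to a standard external-calibration calculation; a minimal sketch with hypothetical concentrations and integrated 27Al peak areas:

        # Hedged sketch: build a calibration curve from standards, then read an
        # unknown off it. All concentrations and peak areas are hypothetical.
        import numpy as np

        conc   = np.array([5.0, 10.0, 20.0, 40.0])     # standard Al(3+) conc. [mM]
        signal = np.array([0.99, 2.02, 3.95, 8.10])    # integrated 27Al peak areas
        slope, intercept = np.polyfit(conc, signal, 1) # linear least squares

        unknown_signal = 3.1
        unknown_conc = (unknown_signal - intercept) / slope
        print(f"unknown ~ {unknown_conc:.1f} mM")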

  2. Applications of reversible covalent chemistry in analytical sample preparation.

    PubMed

    Siegel, David

    2012-12-07

    Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecularly imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low-abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.

  3. Real-time quantitative fluorescence imaging using a single snapshot optical properties technique for neurosurgical guidance

    NASA Astrophysics Data System (ADS)

    Valdes, Pablo A.; Angelo, Joseph; Gioux, Sylvain

    2015-03-01

    Fluorescence imaging has shown promise as an adjunct to improve the extent of resection in neurosurgery and oncologic surgery. Nevertheless, current fluorescence imaging techniques do not account for the heterogeneous attenuation effects of tissue optical properties. In this work, we present a novel imaging system that performs real-time quantitative fluorescence imaging using Single Snapshot Optical Properties (SSOP) imaging. We developed the technique and performed initial phantom studies to validate the quantitative capabilities of the system for intraoperative feasibility. Overall, this work introduces a novel real-time quantitative fluorescence imaging method capable of being used intraoperatively for neurosurgical guidance.

  4. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  5. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  6. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1 the average relative expanded uncertainty is 2.2%. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
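
    The Beer's-law reduction described above can be pictured as a per-wavenumber linear regression: the absorption coefficient is the slope of base-e absorbance, -ln(T), against concentration times path length. A hedged sketch with synthetic numbers standing in for the nine NIST transmittance spectra:

        # Hedged sketch: at each wavenumber, regress -ln(T) on (conc * pathlength)
        # through the origin; the slope is the absorption coefficient.
        import numpy as np

        cl = np.array([0.5, 1.0, 2.0, 4.0])   # conc * pathlength [(umol/mol) m]
        # synthetic transmittance at two wavenumbers (true slopes 0.02 and 0.05)
        T = np.vstack([np.exp(-0.02 * cl * k) for k in (1.0, 2.5)]).T

        absorbance = -np.log(T)                # base-e absorbance
        alpha = (cl @ absorbance) / (cl @ cl)  # least-squares slope through origin
        print(alpha)                           # absorption coefficients [(umol/mol)^-1 m^-1]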

  7. Reactive Tracer Techniques to Quantitatively Monitor Carbon Dioxide Storage in Geologic Formations

    NASA Astrophysics Data System (ADS)

    Matter, J. M.; Carson, C.; Stute, M.; Broecker, W. S.

    2012-12-01

    Injection of CO2 into geologic storage reservoirs induces fluid-rock reactions that may lead to the mineralization of the injected CO2. The long-term safety of geologic CO2 storage is, therefore, determined by in situ CO2-fluid-rock reactions. Currently existing monitoring and verification techniques for CO2 storage are insufficient to characterize the solubility and reactivity of the injected CO2, and to establish a mass balance of the stored CO2. Dissolved and chemically transformed CO2 thus avoid detection. We developed and are testing a new reactive tracer technique for quantitative monitoring and detection of dissolved and chemically transformed CO2 in geologic storage reservoirs. The technique involves tagging the injected carbon with radiocarbon (14C). Carbon-14 is a naturally occurring radioisotope produced by cosmic radiation and made artificially by 14N neutron capture. The ambient concentration is very low, with a 14C/12C ratio of 10(-12). The concentration of 14C in deep geologic formations and fossil fuels is at least two orders of magnitude lower. This makes 14C an ideal quantitative tracer for tagging underground injections of anthropogenic CO2. We are testing the feasibility of this tracer technique at the CarbFix pilot injection site in Iceland, where approximately 2,000 tons of CO2 dissolved in water are currently injected into a deep basalt aquifer. The injected CO2 is tagged with 14C by dynamically adding calibrated amounts of H14CO3 solution to the injection stream. The target concentration is 12 Bq/kg of injected water, which results in a 14C activity that is 5 times enriched compared to the 1850 background. In addition to 14C as a reactive tracer, trifluoromethyl sulfur pentafluoride (SF5CF3) and sulfur hexafluoride (SF6) are used as conservative tracers to monitor the transport of the injected CO2 in the subsurface. Fluid samples are collected for tracer analysis from the injection and monitoring wells on a regular basis. Results show a fast

  8. Quantitative Thermochronology

    NASA Astrophysics Data System (ADS)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach built around a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate them to Earth processes. Essential background material aids in understanding and using thermochronological data. The book provides a thorough treatise on numerical modeling of heat transport in the Earth's crust, and is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.
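
    As one example of the kind of analytic solution the book develops (parameter values here are illustrative only), the steady-state 1-D geotherm in a crust of thickness L exhuming at rate v follows T(z) = Ts + (TL - Ts)(1 - exp(-vz/k))/(1 - exp(-vL/k)), where k is thermal diffusivity:

        # Hedged example: steady-state 1-D conduction-advection geotherm under
        # erosion/exhumation. Values are illustrative, not from the book.
        import numpy as np

        Ts, TL = 0.0, 1300.0   # surface and base temperature [C]
        L      = 35e3          # crustal thickness [m]
        kappa  = 25.0          # thermal diffusivity [m^2/yr] (~8e-7 m^2/s)
        v      = 0.5e-3        # exhumation rate [m/yr]

        z = np.linspace(0.0, L, 8)
        T = Ts + (TL - Ts) * (1 - np.exp(-v * z / kappa)) / (1 - np.exp(-v * L / kappa))
        for zi, Ti in zip(z, T):
            print(f"z = {zi/1e3:5.1f} km   T = {Ti:6.1f} C")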

  9. ARPEFS as an analytic technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schach von Wittenau, A.E.

    1991-04-01

    Two modifications to the ARPEFS technique are introduced. These are studied using p(2 × 2)S/Cu(001) as a model system. The first modification is the obtaining of ARPEFS χ(k) curves at temperatures as low as our equipment will permit. While adding to the difficulty of the experiment, this modification is shown to almost double the signal-to-noise ratio of normal-emission p(2 × 2)S/Cu(001) χ(k) curves. This is shown by visual comparison of the raw data and by the improved precision of the extracted structural parameters. The second change is the replacement of manual fitting of the Fourier-filtered χ(k) curves by the use of the simplex algorithm for parameter determination. Again using p(2 × 2)S/Cu(001) data, this is shown to result in better agreement between experimental χ(k) curves and curves calculated based on model structures. The improved ARPEFS is then applied to p(2 × 2)S/Ni(111) and (√3 × √3)R30°S/Ni(111). For p(2 × 2)S/Cu(001) we find a S-Cu bond length of 2.26 Å, with the S adatom 1.31 Å above the fourfold hollow site. The second Cu layer appears to be corrugated. Analysis of the p(2 × 2)S/Ni(111) data indicates that the S adatom adsorbs onto the FCC threefold hollow site 1.53 Å above the Ni surface. The S-Ni bond length is determined to be 2.13 Å, indicating an outwards shift of the first-layer Ni atoms. We are unable to assign a unique structure to (√3 × √3)R30°S/Ni(111). An analysis of the strengths and weaknesses of ARPEFS as an experimental and analytic technique is presented, along with a summary of problems still to be addressed.
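
    A hedged sketch of the second modification: structural parameters are found by a simplex (Nelder-Mead) search that minimizes the misfit between experimental and model χ(k) curves. The real analysis compares multiple-scattering calculations; a toy single-frequency model stands in for them here.

        # Hedged sketch: simplex search over structural parameters of a toy
        # chi(k) model. Not the actual ARPEFS scattering calculation.
        import numpy as np
        from scipy.optimize import minimize

        k = np.linspace(4.0, 12.0, 200)              # photoelectron wavevector [1/A]
        chi_exp = 0.8 * np.cos(2 * 2.26 * k + 0.3)   # stand-in "experimental" curve

        def model(p, k):
            amp, d, phi = p                          # amplitude, path length, phase
            return amp * np.cos(2 * d * k + phi)

        def misfit(p):
            return np.sum((model(p, k) - chi_exp) ** 2)

        fit = minimize(misfit, x0=[1.0, 2.2, 0.0], method="Nelder-Mead")
        print(fit.x)   # recovered [amplitude, bond length ~2.26 A, phase]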

  10. Quantitation by Portable Gas Chromatography: Mass Spectrometry of VOCs Associated with Vapor Intrusion

    PubMed Central

    Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.

    2010-01-01

    Development of a robust, reliable technique that permits the rapid quantitation of volatile organic chemicals is an important first step toward remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows for the rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969

  11. Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes

    USGS Publications Warehouse

    Nord, G.L.

    1982-01-01

    Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of x-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ± 1.5 mol% endmember. © 1982.
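
    The peak-ratio technique referred to above has the Cliff-Lorimer form C_A/C_B = k_AB (I_A/I_B) in the thin-foil limit; a minimal illustration with hypothetical intensities and k-factor:

        # Hedged illustration of the peak-ratio (Cliff-Lorimer) calculation.
        # k-factor and peak intensities below are hypothetical.
        i_mg, i_si = 1200.0, 5400.0   # background-subtracted peak counts
        k_mg_si = 1.1                 # k-factor from standards (assumed value)

        ratio = k_mg_si * i_mg / i_si # C_Mg / C_Si
        # convert the two-element ratio into fractions that sum to 1:
        c_si = 1.0 / (1.0 + ratio)
        c_mg = 1.0 - c_si
        print(f"Mg/(Mg+Si) = {c_mg:.3f}")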

  12. Cell bioprocessing in space - Applications of analytical cytology

    NASA Technical Reports Server (NTRS)

    Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.

    1988-01-01

    Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.

  13. Degradation of glass artifacts: application of modern surface analytical techniques.

    PubMed

    Melcher, Michael; Wiesinger, Rita; Schreiner, Manfred

    2010-06-15

    A detailed understanding of the stability of glasses toward liquid or atmospheric attack is of considerable importance for preserving numerous objects of our cultural heritage. Glasses produced in the ancient periods (Egyptian, Greek, or Roman glasses), as well as modern glass, can be classified as soda-lime-silica glasses. In contrast, potash was used as a flux in medieval Northern Europe for the production of window panes for churches and cathedrals. The particular chemical composition of these potash-lime-silica glasses (low in silica and rich in alkali and alkaline earth components), in combination with increased levels of acidifying gases (such as SO(2), CO(2), NO(x), or O(3)) and airborne particulate matter in today's urban or industrial atmospheres, has resulted in severe degradation of important cultural relics, particularly over the last century. Rapid developments in the fields of microelectronics and computer sciences, however, have contributed to the development of a variety of nondestructive, surface analytical techniques for the scientific investigation and material characterization of these unique and valuable objects. These methods include scanning electron microscopy in combination with energy- or wavelength-dispersive spectrometry (SEM/EDX or SEM/WDX), secondary ion mass spectrometry (SIMS), and atomic force microscopy (AFM). In this Account, we address glass analysis and weathering mechanisms, exploring the possibilities (and limitations) of modern analytical techniques. Corrosion by liquid substances is well investigated in the glass literature. In a tremendous number of case studies, the basic reaction between aqueous solutions and the glass surfaces was identified as an ion-exchange reaction between hydrogen-bearing species of the attacking liquid and the alkali and alkaline earth ions in the glass, causing a depletion of the latter in the outermost surface layers. Although mechanistic analogies to liquid corrosion are obvious, atmospheric

  14. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
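
    The content calculation in internal-standard qNMR follows directly from integral ratios scaled by the number of contributing fluorine atoms, molar masses, and the weighed-in IS amount. A hedged sketch follows (all numbers hypothetical, and the choice of which 19F signal is integrated is an assumption):

        # Hedged sketch of the internal-standard qNMR calculation; values are
        # hypothetical, not the paper's data.
        I_stg, I_cipro = 1.000, 0.515   # normalized 19F signal integrals
        N_stg, N_cipro = 3, 1           # F atoms per molecule in each signal (assumed)
        M_stg, M_cipro = 523.3, 331.3   # molar masses [g/mol]
        m_cipro, P_cipro = 10.0, 0.998  # IS mass [mg] and purity

        m_stg = (I_stg / I_cipro) * (N_cipro / N_stg) * (M_stg / M_cipro) \
                * m_cipro * P_cipro
        print(f"sitagliptin found: {m_stg:.2f} mg")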

  15. Analytical Glycobiology at High Sensitivity: Current Approaches and Directions

    PubMed Central

    Novotny, Milos V.; Alley, William R.; Mann, Benjamin F.

    2013-01-01

    This review summarizes the analytical advances made during the last several years in the structural and quantitative determinations of glycoproteins in complex biological mixtures. The main analytical techniques used in the fields of glycomics and glycoproteomics involve different modes of mass spectrometry and their combinations with capillary separation methods such as microcolumn liquid chromatography and capillary electrophoresis. The needs for high-sensitivity measurements have been emphasized in the oligosaccharide profiling used in the field of biomarker discovery through MALDI mass spectrometry. High-sensitivity profiling of both glycans and glycopeptides from biological fluids and tissue extracts has been aided significantly through lectin preconcentration and the uses of affinity chromatography. PMID:22945852

  16. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  17. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC

  18. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
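
    A minimal sketch of the model fitting the standard calls for: fit linear, quadratic, and exponential models to a time series and compare residuals (data below are synthetic):

        # Hedged sketch: fit the three model families named in the standard and
        # compare RMS residuals. Synthetic data only.
        import numpy as np

        t = np.arange(10, dtype=float)
        y = 5.0 * np.exp(0.3 * t) + np.random.normal(0, 1, t.size)

        lin  = np.polyfit(t, y, 1)           # y ~ a t + b
        quad = np.polyfit(t, y, 2)           # y ~ a t^2 + b t + c
        expo = np.polyfit(t, np.log(y), 1)   # log y ~ k t + log A

        for name, resid in [
            ("linear",      y - np.polyval(lin, t)),
            ("quadratic",   y - np.polyval(quad, t)),
            ("exponential", y - np.exp(np.polyval(expo, t))),
        ]:
            print(f"{name:12s} RMS residual = {np.sqrt(np.mean(resid**2)):.2f}")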

  19. Applying Mixed Methods Techniques in Strategic Planning

    ERIC Educational Resources Information Center

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  20. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
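
    The resummation idea can be sketched compactly. Assuming, for illustration, that the simplest analytic continuation of the two-term series I0 + I1 is the [0/1] Padé form I0/(1 - I1/I0) (an assumption standing in for the paper's LOA+AC, not a statement of it), a minimal Python example is:

      # Resumming a lowest-order current correction by a simple analytic
      # continuation. We assume (as one common choice) the [0/1] Pade
      # resummation of the series I0 + I1 + ..., i.e. I ~ I0 / (1 - I1/I0);
      # the numbers are purely illustrative.

      def loa(i0, i1):
          """Lowest order approximation: truncate the series after first order."""
          return i0 + i1

      def loa_ac(i0, i1):
          """[0/1] Pade analytic continuation of the same two coefficients."""
          return i0 / (1.0 - i1 / i0)

      i0, i1 = 1.00, -0.35   # ballistic current and first-order phonon correction
      print(loa(i0, i1))     # 0.65  (can fail badly when |i1/i0| is not small)
      print(loa_ac(i0, i1))  # ~0.74 (often tracks the self-consistent result better)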

  1. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  2. MO-E-12A-01: Quantitative Imaging: Techniques, Applications, and Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, E; Jeraj, R; McNitt-Gray, M

    The first symposium in the Quantitative Imaging Track focused on the introduction of quantitative imaging (QI) by illustrating the potential of QI in diagnostic and therapeutic applications in research and patient care, highlighting key challenges in implementation of such QI applications, and reviewing QI efforts of selected national and international agencies and organizations, including the FDA, NCI, NIST, and RSNA. This second QI symposium will focus more specifically on the techniques, applications, and challenges of QI. The first talk of the session will focus on modality-agnostic challenges of QI, beginning with challenges of the development and implementation of QI applications in single-center, single-vendor settings and progressing to the challenges encountered in the most general setting of multi-center, multi-vendor settings. The subsequent three talks will focus on specific QI challenges and opportunities in the modality-specific settings of CT, PET/CT, and MR. Each talk will provide information on modality-specific QI techniques, applications, and challenges, including current efforts focused on solutions to such challenges. Learning Objectives: Understand key general challenges of QI application development and implementation, regardless of modality. Understand selected QI techniques and applications in CT, PET/CT, and MR. Understand challenges, and potential solutions for such challenges, for the applications presented for each modality.

  3. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 to 2017. PMID:29850370

  4. Quantitative study of Xanthosoma violaceum leaf surfaces using RIMAPS and variogram techniques.

    PubMed

    Favret, Eduardo A; Fuentes, Néstor O; Molina, Ana M

    2006-08-01

    Two new imaging techniques (rotated image with maximum averaged power spectrum (RIMAPS) and variogram) are presented for the study and description of leaf surfaces. Xanthosoma violaceum was analyzed to illustrate the characteristics of both techniques. Both techniques produce a quantitative description of leaf surface topography. RIMAPS combines rotation of digitized images with the Fourier transform, and it is used to detect pattern orientation and characteristics of surface topography. The variogram relates the mathematical variance of a surface to the area of the sample window observed. It gives the typical scale lengths of the surface patterns. RIMAPS detects the morphological variations of the surface topography pattern between fresh and dried (herbarium) samples of the leaf. The variogram method finds the characteristic dimensions of the leaf microstructure, i.e., cell length, papillae diameter, etc., showing that there are no significant differences between dry and fresh samples. The results obtained show the robustness of RIMAPS and variogram analyses to detect, distinguish, and characterize leaf surfaces, as well as to give scale lengths. Both techniques are tools for the biologist to study variations of the leaf surface when different patterns are present. The use of RIMAPS and variogram opens a wide spectrum of possibilities by providing a systematic, quantitative description of the leaf surface topography.
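
    The variogram computation described above can be sketched in a few lines of Python. The example below uses a random image as a stand-in for a leaf micrograph; window sizes and data are illustrative only:

      import numpy as np

      # Compute the mean within-window variance of a surface image over
      # square sample windows of increasing size; characteristic scale
      # lengths appear where the variance curve changes slope.
      rng = np.random.default_rng(0)
      img = rng.normal(size=(256, 256))    # stand-in for a leaf micrograph

      def variogram(image, window_sizes):
          out = []
          for w in window_sizes:
              ny, nx = image.shape[0] // w, image.shape[1] // w
              tiles = image[:ny * w, :nx * w].reshape(ny, w, nx, w)
              out.append(tiles.var(axis=(1, 3)).mean())  # mean per-tile variance
          return out

      for w, v in zip([4, 8, 16, 32, 64], variogram(img, [4, 8, 16, 32, 64])):
          print(f"window {w:3d} px  variance {v:.3f}")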

  5. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  6. On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.

    2008-08-11

    The analytical techniques based on ion beams, or IBA techniques, give quantitative information on elemental concentrations in samples of a wide variety of natures. In this work, we focus on the PIXE technique, analyzing thick-target biological specimens (TTPIXE) using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used to perform PIXE and RBS simultaneously, in order to resolve the uncertainties produced in absolute PIXE quantification. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard was used.

  7. Analytical validation of quantitative immunohistochemical assays of tumor infiltrating lymphocyte biomarkers.

    PubMed

    Singh, U; Cui, Y; Dimaano, N; Mehta, S; Pruitt, S K; Yearley, J; Laterza, O F; Juco, J W; Dogdas, B

    2018-06-04

    Tumor infiltrating lymphocytes (TIL), especially T-cells, have both prognostic and therapeutic applications. The presence of CD8+ effector T-cells and the ratio of CD8+ cells to FOXP3+ regulatory T-cells have been used as biomarkers of disease prognosis to predict response to various immunotherapies. Blocking the interaction between inhibitory receptors on T-cells and their ligands with therapeutic antibodies including atezolizumab, nivolumab, pembrolizumab and tremelimumab increases the immune response against cancer cells and has shown significant improvement in clinical benefits and survival in several different tumor types. The improved clinical outcome is presumed to be associated with a higher tumor infiltration; therefore, it is thought that more accurate methods for measuring the amount of TIL could assist prognosis and predict treatment response. We have developed and validated quantitative immunohistochemistry (IHC) assays for CD3, CD8 and FOXP3 for immunophenotyping T-lymphocytes in tumor tissue. Various types of formalin fixed, paraffin embedded (FFPE) tumor tissues were immunolabeled with anti-CD3, anti-CD8 and anti-FOXP3 antibodies using an IHC autostainer. The tumor area of stained tissues, including the invasive margin of the tumor, was scored by a pathologist (visual scoring) and by computer-based quantitative image analysis. Two image analysis scores were obtained for the staining of each biomarker: the percent positive cells in the tumor area and positive cells/mm2 tumor area. Comparison of visual vs. image analysis scoring methods using regression analysis showed high correlation and indicated that quantitative image analysis can be used to score the number of positive cells in IHC stained slides. To demonstrate that the IHC assays produce consistent results in normal daily testing, we evaluated the specificity, sensitivity and reproducibility of the IHC assays using both visual and image analysis scoring methods. We found that CD3, CD8 and

  8. Characterization, thermal stability studies, and analytical method development of Paromomycin for formulation development.

    PubMed

    Khan, Wahid; Kumar, Neeraj

    2011-06-01

    Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s, and approved in 2006 for treatment of visceral leishmaniasis. Although isolated six decades ago, sufficient information essential for development of a pharmaceutical formulation is not available for PM. The purpose of this paper was to determine the thermal stability of PM and to develop a new analytical method for formulation development of PM. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish the thermal stability of PM after heating at 100, 110, 120, and 130 °C for 24 h. Biological activity of these heated samples was also determined by microbiological assay. Subsequently, a simple, rapid and sensitive RP-HPLC method for quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate. The developed method was applied to estimate PM quantitatively in two parenteral dosage forms. PM was successfully characterized by the various stated techniques. These techniques indicated that PM is stable when heated up to 120 °C for 24 h, but is liable to degradation when heated at 130 °C. This degradation is also observed in the microbiological assay, where PM lost ∼30% of its biological activity when heated at 130 °C for 24 h. The new analytical method was developed for PM in the concentration range of 25-200 ng/ml with intra-day and inter-day variability of <2% RSD. Characterization techniques were established and the stability of PM was determined successfully. The developed analytical method was found sensitive, accurate, and precise for quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.

  9. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with corresponding results for straight nanotubes.

  10. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, and it plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, where some of them can be perceived by humans because of their aroma. They have a great influence on the decision making of consumers when they choose to use a product or not. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Such data gathered from different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  11. [Quantitative testing of the V617F mutation in the JAK2 gene using the pyrosequencing technique].

    PubMed

    Dunaeva, E A; Mironov, K O; Dribnokhodova, T E; Subbotina, E E; Bashmakova; Ol'hovskiĭ, I A; Shipulin, G A

    2014-11-01

    The somatic mutation V617F in the JAK2 gene is a frequent cause of BCR/ABL-negative chronic myeloproliferative diseases. Quantitative testing of the relative percentage of the mutant allele can be used to establish the severity and prognosis of the disease and to guide prescription of JAK2-inhibiting drugs. The pyrosequencing technique was applied to quantitatively test the mutation. The developed technique permits detection and quantitation of the mutant fraction from 7% upward. The "gray zone" comprises samples with a mutant allele percentage from 4% to 7%. The dependence of the expected mutant-fraction percentage in an analyzed sample on the observed signal value is described by a linear equation with regression coefficients y = -0.97 and x = -1.32, with a measurement uncertainty of ±0.7. The developed technique was validated on clinical material from 192 patients with the main forms of BCR/ABL-negative myeloproliferative diseases. Sixty-four samples with mutant-fraction percentages from 13% to 91% were detected. The developed technique permits monitoring of therapy of myeloproliferative diseases and helps to optimize treatment tactics.
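
    The reporting logic with a gray zone can be illustrated with a short Python sketch. The calibration line below only loosely mirrors the quoted regression coefficients and is an assumption for illustration:

      # Report a pyrosequencing result with the cut-offs quoted above
      # (quantifiable from 7%, gray zone 4-7%). The linear correction applied
      # to the raw signal is an assumed calibration, not the paper's exact one.

      def report_mutant_fraction(observed_percent,
                                 slope=0.97, intercept=-1.32,
                                 gray_low=4.0, limit=7.0):
          corrected = slope * observed_percent + intercept  # assumed calibration
          if corrected >= limit:
              return f"mutant allele fraction {corrected:.1f} % (quantifiable)"
          if corrected >= gray_low:
              return "gray zone (4-7 %): repeat or confirm by another method"
          return "below detection limit"

      print(report_mutant_fraction(35.0))   # quantifiable
      print(report_mutant_fraction(5.5))    # gray zone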

  12. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  13. Cost and Schedule Analytical Techniques Development

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) under contract NAS 8-40431 "Cost and Schedule Analytical Techniques Development Contract" (CSATD) during Option Year 3 (December 1, 1997 through November 30, 1998). This Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical products and deliverables in the form of parametric models, databases, methodologies, studies, and analyses to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) and other user organizations. Detailed Monthly Reports were submitted to MSFC in accordance with the contract's Statement of Work, Section IV "Reporting and Documentation". These reports spelled out each month's specific work performed, deliverables submitted, major meetings conducted, and other pertinent information. Therefore, this Final Report will summarize these activities at a higher level. During this contract Option Year, SAIC expended 25,745 hours in the performance of tasks called out in the Statement of Work. This represents approximately 14 full-time EPs. Included are the Huntsville-based team, plus SAIC specialists in San Diego, Ames Research Center, Tampa, and Colorado Springs performing specific tasks for which they are uniquely qualified.

  14. Nondestructive atomic compositional analysis of BeMgZnO quaternary alloys using ion beam analytical techniques

    NASA Astrophysics Data System (ADS)

    Zolnai, Z.; Toporkov, M.; Volk, J.; Demchenko, D. O.; Okur, S.; Szabó, Z.; Özgür, Ü.; Morkoç, H.; Avrutin, V.; Kótai, E.

    2015-02-01

    The atomic composition with less than 1-2 atom% uncertainty was measured in ternary BeZnO and quaternary BeMgZnO alloys using a combination of nondestructive Rutherford backscattering spectrometry with a 1 MeV He+ analyzing ion beam and non-Rutherford elastic backscattering experiments with 2.53 MeV protons. An enhancement factor of 60 in the cross-section of Be for protons has been achieved to monitor Be atomic concentrations. Usually, the quantitative analysis of BeZnO and BeMgZnO systems is challenging due to difficulties with appropriate experimental tools for the detection of the light Be element with satisfactory accuracy. As shown, our applied ion beam technique, supported by detailed simulation of the ion stopping, backscattering, and detection processes, allows quantitative depth profiling and compositional analysis of wurtzite BeZnO/ZnO/sapphire and BeMgZnO/ZnO/sapphire layer structures with low uncertainty for both Be and Mg. In addition, the excitonic bandgaps of the layers were deduced from optical transmittance measurements. To augment the measured compositions and bandgaps of BeO and MgO co-alloyed ZnO layers, hybrid density functional bandgap calculations were performed while varying the Be and Mg contents. The theoretical vs. experimental bandgaps show a linear correlation over the entire bandgap range studied, from 3.26 eV to 4.62 eV. The analytical method employed should help facilitate bandgap engineering for potential applications, such as solar-blind UV photodetectors and heterostructures for UV emitters and intersubband devices.

  15. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  16. Progressing towards more quantitative analytical pyrolysis of soil organic matter using molecular beam mass spectroscopy of whole soils and added standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haddix, Michelle L.; Magrini-Bair, Kim; Evans, Robert J.

    Soil organic matter (SOM) is extremely complex. It is composed of hundreds of different organic substances, and it has been difficult to quantify these diverse substances from an ecosystem-functioning standpoint. Analytical pyrolysis has been used to compare chemical differences between soils, but its ability to measure the absolute amount of a specific compound in the soil is still in question. Our objective was to assess whether, by using pyrolysis-molecular beam mass spectroscopy (py-MBMS) to define the signatures of known reference compounds (adenine, indole, palmitic acid, etc.) and biological samples (chitin, fungi, cellulose, etc.) both separately and when added to whole soils, it was possible to make py-MBMS more quantitative. Reference compounds, spanning a wide variety of compound categories, and biological samples, expected to be present in SOM, were added to three soils from Colorado, Ohio, and Massachusetts that have varying total C, % clay, and clay type. Py-MBMS, a rapid analysis technique originally developed to analyze complex biomolecules, flash pyrolyzes soil organic matter to form products that are often considered characteristic of the original molecular structure. Samples were pyrolyzed at 550 °C by py-MBMS. All samples were weighed and %C and %N determined both before and after pyrolysis to evaluate mass loss, C loss, and N loss for the samples. An average relationship of r2 = 0.76 (P = 0.005) was found for the amount of cellulose added to soil at 25, 50, and 100% of soil C relative to the ion intensity of select mass/charge of the compound. There was a relationship of r2 = 0.93 (P < 0.001) for the amount of indole added to soil at 25, 50, and 100% of soil C and the ion intensity of the associated mass variables (mass/charge). Comparing spectra of pure compounds with the spectra of the compounds added to soil and isolated clay showed that interference could occur based on soil type and compound with the Massachusetts soil with high
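
    The spike-in calibration underlying the quoted r2 values can be sketched as a simple linear regression. All intensities in the following Python example are invented:

      import numpy as np

      # Regress summed marker-ion intensity against the amount of a reference
      # compound added at 0, 25, 50, and 100 % of soil C. Numbers are invented;
      # the abstract reports r2 = 0.93 for indole in this kind of design.
      added_pct = np.array([0.0, 25.0, 50.0, 100.0])
      intensity = np.array([0.8, 3.1, 5.6, 10.9])    # summed marker-ion signal

      slope, intercept = np.polyfit(added_pct, intensity, 1)
      pred = slope * added_pct + intercept
      r2 = 1.0 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)
      print(f"slope {slope:.3f}, intercept {intercept:.3f}, r2 {r2:.3f}")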

  17. An analytical technique for approximating unsteady aerodynamics in the time domain

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1980-01-01

    An analytical technique is presented for approximating unsteady aerodynamic forces in the time domain. The order of elements of a matrix Padé approximation was postulated, and the resulting polynomial coefficients were determined through a combination of least squares estimates for the numerator coefficients and a constrained gradient search for the denominator coefficients, which ensures stable approximating functions. The number of differential equations required to represent the aerodynamic forces to a given accuracy tends to be smaller than that employed in certain existing techniques where the denominator coefficients are chosen a priori. Results are shown for an aeroelastic, cantilevered, semispan wing which indicate that a good fit to the aerodynamic forces for oscillatory motion can be achieved with a matrix Padé approximation having fourth-order numerator and second-order denominator polynomials.
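
    A minimal sketch of fitting such a rational approximant follows. Instead of the paper's combination of least squares and a constrained gradient search, it uses Levy's one-shot linearization Q(s)F(s) ~ P(s), a simpler stand-in, on synthetic scalar data:

      import numpy as np

      # Fit a rational (Pade-form) approximant P(s)/Q(s) with 4th-order
      # numerator and 2nd-order denominator, as in the abstract, via Levy's
      # linearization: solve Q(s)F(s) ~ P(s) as one linear least-squares
      # problem. Synthetic data; the paper's constrained search is not shown.
      s = np.linspace(0.1, 5.0, 50)
      F = (1.0 + 0.5 * s) / (1.0 + 0.8 * s + 0.2 * s**2)  # "measured" response

      # Unknowns: a0..a4 (numerator) and b1, b2 with Q(s) = 1 + b1 s + b2 s^2.
      A = np.column_stack([s**k for k in range(5)] + [-s * F, -s**2 * F])
      coef, *_ = np.linalg.lstsq(A, F, rcond=None)
      a, b = coef[:5], coef[5:]
      print("numerator:", np.round(a, 3), " denominator: 1 +", np.round(b, 3))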

  18. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance, and they need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in long time-series data, and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.

  19. On the accuracy of analytical models of impurity segregation during directional melt crystallization and their applicability for quantitative calculations

    NASA Astrophysics Data System (ADS)

    Voloshin, A. E.; Prostomolotov, A. I.; Verezub, N. A.

    2016-11-01

    The paper deals with the analysis of the accuracy of some one-dimensional (1D) analytical models of the axial distribution of impurities in a crystal grown from a melt. The models proposed by Burton-Prim-Slichter, Ostrogorsky-Muller, and Garandet with co-authors are considered, and these models are compared to the results of a two-dimensional (2D) numerical simulation. Stationary solutions as well as solutions for the initial transient regime obtained using these models are considered. The sources of errors are analyzed, and a conclusion is made about the applicability of 1D analytical models for quantitative estimates of impurity incorporation into the crystal sample as well as for the solution of inverse problems.
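
    For reference, the Burton-Prim-Slichter model mentioned above gives the effective segregation coefficient in closed form; the following Python sketch evaluates it for illustrative parameter values:

      import numpy as np

      # Burton-Prim-Slichter (BPS) effective segregation coefficient as a
      # function of growth rate V, boundary-layer thickness delta, and
      # melt diffusivity D:
      #     k_eff = k0 / (k0 + (1 - k0) * exp(-V * delta / D))
      # Parameter values below are illustrative only.

      def k_eff_bps(k0, V, delta, D):
          return k0 / (k0 + (1.0 - k0) * np.exp(-V * delta / D))

      k0 = 0.1                      # equilibrium segregation coefficient
      for V in [1e-6, 5e-6, 2e-5]:  # growth rate, m/s
          print(f"V = {V:.0e} m/s  ->  k_eff = {k_eff_bps(k0, V, 1e-4, 1e-9):.3f}")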

  20. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Application of ion chemistry and the SIFT technique to the quantitative analysis of trace gases in air and on breath

    NASA Astrophysics Data System (ADS)

    Smith, David; Španěl, Patrik

    Our major objective in this paper is to describe a new method we have developed for the analysis of trace gases at partial pressures down to the ppb level in atmospheric air, with special emphasis on the detection and quantification of trace gases on human breath. It involves the use of our selected ion flow tube (SIFT) technique, which we previously developed and used extensively for the study of gas phase ionic reactions occurring in ionized media such as the terrestrial atmosphere and interstellar gas clouds. Before discussing this analytical technique we describe the results of our very recent SIFT and flowing afterglow (FA) studies of the reactions of the H3O+ and OH- ions, of their hydrates H3O+(H2O)1,2,3 and OH-(H2O)1,2, and of NO+ and O2+, with several hydrocarbons and oxygen-bearing organic molecules, studies that are very relevant to our trace gas analytical studies. Then follows a detailed discussion of the application of our SIFT technique to trace gas analysis, after which we present some results obtained for the analyses of laboratory air, the breath of a healthy non-smoking person, the breath of a person who regularly smokes cigarettes, the complex vapours emitted by banana and onion, and the molecules present in a butane/air flame. We show how the quantitative analysis of breath can be achieved from only a single exhalation and in real time (the time response of the instrument is only about 20 ms). We also show how the time variation of breath gases over long time periods can be followed, using the decay of ethanol on the breath after the ingestion of distilled liquor as an example, yet simultaneously following several other trace gases including acetone and isoprene which are very easily detected on the breath of all individuals because of their relatively high partial pressures (typically 100 to 1000 ppb). The breath of a smoker is richer in complex molecules, some nitrogen containing organics apparently being very evident at the 5 to 50 ppb level

  2. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
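
    The quantitation rule described above (a mass change referenced to undeuterated and maximally deuterated controls) can be written as a one-line calculation; the centroid masses in this Python sketch are invented:

      # Hydrogen exchange is reported as a mass shift relative to undeuterated
      # and fully-deuterated controls, yielding a relative uptake between 0 and 1.

      def relative_uptake(m_t, m_undeut, m_maxdeut):
          """Fractional deuterium uptake at one exchange time point."""
          return (m_t - m_undeut) / (m_maxdeut - m_undeut)

      m0, mmax = 1520.74, 1528.74   # undeuterated / maximally deuterated controls
      for t, m in [(10, 1523.1), (60, 1525.4), (600, 1527.2)]:
          print(f"t = {t:4d} s  uptake = {relative_uptake(m, m0, mmax):.2f}")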

  3. Size analysis of polyglutamine protein aggregates using fluorescence detection in an analytical ultracentrifuge.

    PubMed

    Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong

    2013-01-01

    Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence detected analytical ultracentrifugation is one approach that can offer significant insight into aggregation formation and kinetics. While this technique has traditionally been used with purified proteins, it is now possible for substantial information to be collected with studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.

  4. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  5. Quantitative impact of direct, personal feedback on hand hygiene technique.

    PubMed

    Lehotsky, Á; Szilágyi, L; Ferenci, T; Kovács, L; Pethes, R; Wéber, G; Haidegger, T

    2015-09-01

    This study investigated the effectiveness of targeting hand hygiene technique using a new training device that provides objective, personal and quantitative feedback. One hundred and thirty-six healthcare workers in three Hungarian hospitals participated in a repetitive hand hygiene technique assessment study. Ultraviolet (UV)-labelled hand rub was used at each event, and digital images of the hands were subsequently taken under UV light. Immediate objective visual feedback was given to participants, showing missed areas on their hands. The rate of inadequate hand rubbing reduced from 50% to 15% (P < 0.001). However, maintenance of this reduced rate is likely to require continuous use of the electronic equipment. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  6. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    PubMed

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights may prove useful in avoiding the need to resort to qualitative means, in the absence of weights between indicators, when integrating the results of quantitative assessment by indicator. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies.

  7. A novel approach for quantitation of nonderivatized sialic acid in protein therapeutics using hydrophilic interaction chromatographic separation and nano quantity analyte detection.

    PubMed

    Chemmalil, Letha; Suravajjala, Sreekanth; See, Kate; Jordan, Eric; Furtado, Marsha; Sun, Chong; Hosselet, Stephen

    2015-01-01

    This paper describes a novel approach for the quantitation of nonderivatized sialic acid in glycoproteins, using separation by hydrophilic interaction chromatography and detection by a Nano Quantity Analyte Detector (NQAD). The NQAD detection technique is based on measuring the change in size of a dry aerosol and converting the particle count rate into a chromatographic output signal. The NQAD is suitable for the detection of sialic acid, which lacks a sufficiently active chromophore or fluorophore. The water condensation particle counting technology allows the analyte particles to be enlarged using water vapor to provide the highest sensitivity. Derivatization-free analysis of glycoproteins using the HPLC/NQAD method with a PolyGLYCOPLEX™ amide column correlates well with an HPLC method with precolumn derivatization using 1,2-diamino-4,5-methylenedioxybenzene (DMB), as well as with the Dionex-based high-pH anion-exchange chromatography (ion chromatography) with pulsed amperometric detection (HPAEC-PAD). With the elimination of the derivatization step, the HPLC/NQAD method is more efficient than the HPLC/DMB method. The HPLC/NQAD method is also more reproducible than the HPAEC-PAD method, as the HPAEC-PAD method suffers high variability because of electrode fouling during analysis. Overall, the HPLC/NQAD method offers a broad linear dynamic range as well as excellent precision, accuracy, repeatability, reliability, and ease of use, with acceptable comparability to the commonly used HPAEC-PAD and HPLC/DMB methods. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation affords a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded from this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.
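
    The principal component step can be sketched as follows; the random matrix stands in for real DART-TOFMS spectra, and the grouping into four "vehicles" is synthetic:

      import numpy as np
      from sklearn.decomposition import PCA

      # Project spectra (rows = samples, columns = binned m/z intensities)
      # onto their first principal components and inspect clustering by
      # vehicle. The data below are synthetic stand-ins.
      rng = np.random.default_rng(1)
      spectra = rng.random((12, 500))      # 12 clear-coat spectra, 500 m/z bins
      spectra += np.repeat(np.eye(4), 3, axis=0) @ rng.random((4, 500))  # 4 "vehicles"

      scores = PCA(n_components=2).fit_transform(spectra)
      for i, (pc1, pc2) in enumerate(scores):
          print(f"sample {i:2d} (vehicle {i // 3})  PC1 = {pc1:+.2f}  PC2 = {pc2:+.2f}")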

  9. Matrix effects break the LC behavior rule for analytes in LC-MS/MS analysis of biological samples

    USDA-ARS?s Scientific Manuscript database

    High-performance liquid chromatography (HPLC) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) are generally accepted as the preferred techniques for detecting and quantitating analytes of interest in biological matrices on the basis of the rule that one chemical compound yields one LC-...

  10. Applications of nuclear analytical techniques to environmental studies

    NASA Astrophysics Data System (ADS)

    Freitas, M. C.; Pacheco, A. M. G.; Marques, A. P.; Barros, L. I. C.; Reis, M. A.

    2001-07-01

    A few examples of application of nuclear-analytical techniques to biological monitors—natives and transplants—are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal—the Setúbal peninsula, about 50 km south of Lisbon—where indigenous lichens are rare. The whole area was 10×15 km around an oil-fired power station, and a 2.5×2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50×50 km, using a 10×10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metro area (Porto). All biomonitors have been analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively.

  11. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    PubMed

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was measured from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring Hb values with a single analytical method. The VOI was 8.22, 7.1 and 9.44 mm3 for probes that collected a target of 10 mg of feces, and 3.08 mm3 for one probe that targeted 2 mg of feces. The ratio between recovered and target amounts for the devices ranged from 56% to 121%. Different changes in the measured Hb values were observed when increasing amounts of feces were added to the commercial buffers. The amounts of collected material are related to the design of the probes. Three out of four manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfillment of the ISO 15189 requirements.

  12. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829

  13. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and there is still a need to develop new techniques. Experiments were performed upon five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the color microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated using the acquired CT images. Our study shows that HUDRCT has a good (y = 0.07245 + 0.09963x, r2 = 0.898) correlation with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) value of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
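
    The ROC analysis can be sketched in a few lines of Python. The FFR cut-off of 0.80 is a commonly used threshold assumed here for illustration, and all values are invented:

      import numpy as np
      from sklearn.metrics import roc_auc_score

      # Treat FFR <= 0.80 (an assumed, commonly used cut-off) as ground-truth
      # ischemia and evaluate HUDRCT as the classifier score. Invented data.
      ffr    = np.array([0.92, 0.88, 0.75, 0.66, 0.85, 0.71, 0.60, 0.90])
      hudrct = np.array([0.26, 0.22, 0.14, 0.11, 0.24, 0.12, 0.09, 0.27])

      ischemic = (ffr <= 0.80).astype(int)
      # Higher HUDRCT tracks higher perfusion, so use its negative as the score.
      print("AUC =", roc_auc_score(ischemic, -hudrct))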

  14. Comparative study of quantitative phase imaging techniques for refractometry of optical fibers

    NASA Astrophysics Data System (ADS)

    de Dorlodot, Bertrand; Bélanger, Erik; Bérubé, Jean-Philippe; Vallée, Réal; Marquet, Pierre

    2018-02-01

    The refractive index difference profile of optical fibers is the key design parameter because it determines, among other properties, the insertion losses and propagating modes. Therefore, an accurate refractive index profiling method is of paramount importance to their development and optimization. Quantitative phase imaging (QPI) is one of the available tools to retrieve structural characteristics of optical fibers, including the refractive index difference profile. Because QPI has the advantage of being non-destructive, several different QPI methods have been developed over the last decades. Here, we present a comparative study of three different available QPI techniques, namely the transport-of-intensity equation, quadriwave lateral shearing interferometry, and digital holographic microscopy. To assess the accuracy and precision of these QPI techniques, quantitative phase images of the core of a well-characterized optical fiber were retrieved for each of them, and a robust image-processing procedure was applied in order to retrieve their refractive index difference profiles. Although the raw images from all three QPI methods suffered from different shortcomings, our robust automated image-processing pipeline successfully corrected them. After this treatment, all three QPI techniques yielded accurate, reliable, and mutually consistent refractive index difference profiles, in agreement with the accuracy and precision of the refracted near-field benchmark measurement.

  15. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  16. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (Whatman No. 1 filter paper) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient at a wavelength of 660 nm was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and optical absorption coefficients as large as 220 cm^-1 can be measured with this photoacoustic technique.
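
    The Beer-Lambert relation invoked above turns absorption-coefficient estimation into a log-linear fit; the following Python sketch uses invented transmission data:

      import numpy as np

      # Beer-Lambert: transmitted light decays as I = I0 * exp(-alpha * x),
      # so a log-linear fit of signal versus pigment concentration yields an
      # absorption coefficient per unit concentration. Numbers are invented.
      conc = np.array([0.5, 1.0, 2.0, 4.0])          # pigment concentration, a.u.
      signal = np.array([0.78, 0.61, 0.37, 0.135])   # normalized transmitted signal

      slope, intercept = np.polyfit(conc, np.log(signal), 1)
      print(f"absorption per unit concentration: {-slope:.3f} (log-intercept {intercept:.3f})")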

  17. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher
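
    The diagnostic-accuracy figures above follow from the standard definitions; the Python sketch below uses invented confusion-matrix counts chosen to reproduce the semi-quantitative technique's numbers:

      # Sensitivity = TP / (TP + FN) and specificity = TN / (TN + FP), taking
      # confirmed CR-BSI cases as ground truth. Counts are illustrative,
      # chosen to reproduce the semi-quantitative figures (72.7%, 95.7%).

      def sens_spec(tp, fn, tn, fp):
          return tp / (tp + fn), tn / (tn + fp)

      sens, spec = sens_spec(tp=16, fn=6, tn=511, fp=23)
      print(f"sensitivity {sens:.1%}  specificity {spec:.1%}")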

  18. Sample and data processing considerations for the NIST quantitative infrared database

    NASA Astrophysics Data System (ADS)

    Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William

    1999-02-01

    Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAP) identified in EPA's Clean Air Act of 1990. To support infrared-based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.

  19. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r = 0.997). By additionally staining all proteins fluorescently immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously. This allows for an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possibilities of employment that go far beyond.

  20. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis technique compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function. Therefore, the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The methods were compared to cross-check the analysis results and to overcome the limitations of each of the three methods. Analysis results showed that Ca concentrations determined by EDXRF and AAS were not significantly different (p = 0.9687), and neither were K concentrations determined by EDXRF and NAA (p = 0.6575). The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
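
    A minimal sketch of the statistical comparison described above, assuming paired element determinations from the two methods (the values below are made up, and scipy's paired t-test stands in for whatever test the authors used):

        # Paired method comparison: p-value and Pearson correlation.
        from scipy import stats

        edxrf = [102.0, 57.3, 220.1, 88.4, 145.2]   # hypothetical Ca results, EDXRF
        aas   = [101.1, 58.0, 219.5, 90.2, 144.8]   # hypothetical Ca results, AAS

        t, p = stats.ttest_rel(edxrf, aas)   # H0: no difference between methods
        r, _ = stats.pearsonr(edxrf, aas)    # agreement between methods
        print(f"p = {p:.4f}, Pearson r = {r:.4f}")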

  1. Speciation and Characterization of E-Waste, Using Analytical Techniques

    NASA Astrophysics Data System (ADS)

    López, C. Cortés; Cruz, V. E. Reyes; Rodríguez, M. A. Veloz; Ávila, J. Hernández; Badillo, J. Flores; Murcia, J. A. Cobos

    Electronic waste (e-waste) has high potential as a source of precious metals, since it can contain metals such as silver, gold, platinum, copper, zinc, nickel, tin and others. In this paper some e-waste was characterized using several analytical techniques such as Scanning Electron Microscopy (SEM), X-ray diffraction (XRD) and inductively coupled plasma (ICP), in addition to a thermodynamic study by Pourbaix diagrams of silver (Ag), gold (Au), platinum (Pt), copper (Cu), nickel (Ni), tin (Sn) and zinc (Zn), considering an average low concentration of HNO3 (10% v/v). From the characterization results it was determined that the e-waste is an ideal source for the recovery of valuable metals. Similarly, the thermodynamic studies showed that it is possible to obtain all metallic species except Pt, in a potential window of 1.45 V to 2.0 V vs SCE.

  2. Comparison of a two-dimensional adaptive-wall technique with analytical wall interference correction techniques

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.

    1992-01-01

    A two-dimensional airfoil model was tested in the adaptive-wall test section of the NASA Langley 0.3-meter Transonic Cryogenic Tunnel (TCT) and in the ventilated test section of the National Aeronautical Establishment Two-Dimensional High Reynolds Number Facility (HRNF). The primary goal of the tests was to compare different techniques (adaptive test section walls and classical, analytical corrections) to account for wall interference. Tests were conducted over a Mach number range from 0.3 to 0.8 at chord Reynolds numbers of 10 x 10(exp 6), 15 x 10(exp 6), and 20 x 10(exp 6). The angle of attack was varied from about 12 degrees up to stall. Movement of the top and bottom test section walls was used to account for the wall interference in the HRNF tests. The test results are in good agreement.

  3. Magnetoresistive biosensors for quantitative proteomics

    NASA Astrophysics Data System (ADS)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility, allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, giving them the low cost and small size needed for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  4. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods The collected responses were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results These weights can help avoid falling back on qualitative judgment, in the absence of weights between indicators, when integrating the results of indicator-by-indicator quantitative assessment. Conclusions This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
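
    A minimal sketch of the analytic hierarchy process step, assuming a single hypothetical 3x3 pairwise-comparison matrix (the paper's expert judgments are not reproduced here): the weights are the normalized principal eigenvector, and the consistency index flags incoherent judgments.

        # AHP weights from a reciprocal pairwise-comparison matrix.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])          # hypothetical judgments

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)               # principal eigenvalue index
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                              # normalized weights

        ci = (eigvals.real[k] - A.shape[0]) / (A.shape[0] - 1)  # consistency index
        print("weights:", w.round(3), "CI:", round(ci, 3))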

  5. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  6. Cells and Stripes: A novel quantitative photo-manipulation technique

    PubMed Central

    Mistrik, Martin; Vesela, Eva; Furst, Tomas; Hanzlikova, Hana; Frydrych, Ivo; Gursky, Jan; Majera, Dusana; Bartek, Jiri

    2016-01-01

    Laser micro-irradiation is a technology widely used in the DNA damage response, checkpoint signaling, chromatin remodeling and related research fields, to assess chromatin modifications and recruitment of diverse DNA damage sensors, mediators and repair proteins to sites of DNA lesions. While this approach has aided numerous discoveries related to cell biology, maintenance of genome integrity, aging and cancer, it has so far been limited by a tedious manual definition of laser-irradiated subcellular regions, with the ensuing restriction to only a small number of cells treated and analyzed in a single experiment. Here, we present an improved and versatile alternative to the micro-irradiation approach: Quantitative analysis of photo-manipulated samples using innovative settings of standard laser-scanning microscopes. Up to 200 cells are simultaneously exposed to a laser beam in a defined pattern of collinear rays. The induced striation pattern is then automatically evaluated by a simple algorithm, which provides a quantitative assessment of various laser-induced phenotypes in live or fixed cells. Overall, this new approach represents a more robust alternative to existing techniques, and provides a versatile tool for a wide range of applications in biomedicine. PMID:26777522
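
    The stripe read-out lends itself to a simple quantitative sketch: with the period of the collinear rays known, the laser-induced signal can be summarized as the ratio of mean intensity inside versus between stripes. The snippet below runs on a synthetic image and is only an illustration of the idea, not the authors' algorithm.

        # Quantify a striation pattern in a synthetic fluorescence image.
        import numpy as np

        rng = np.random.default_rng(3)
        img = rng.poisson(40, (256, 256)).astype(float)  # background signal
        period, width = 32, 8                            # stripe geometry (pixels)
        for x0 in range(0, 256, period):                 # simulate recruited stripes
            img[:, x0:x0 + width] += 60

        in_stripe = np.arange(256) % period < width      # column mask
        ratio = img[:, in_stripe].mean() / img[:, ~in_stripe].mean()
        print(f"stripe / inter-stripe intensity ratio = {ratio:.2f}")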

  7. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  8. Recent Advances in Analytical Pyrolysis to Investigate Organic Materials in Heritage Science.

    PubMed

    Degano, Ilaria; Modugno, Francesca; Bonaduce, Ilaria; Ribechini, Erika; Colombini, Maria Perla

    2018-06-18

    The molecular characterization of organic materials in samples from artworks and historical objects has traditionally entailed qualitative and quantitative analyses by HPLC and GC. Today innovative approaches based on analytical pyrolysis enable samples to be analysed without any chemical pre-treatment. Pyrolysis, often considered a mere screening technique, shows previously unexplored potential thanks to recent instrumental developments. Organic materials that are macromolecular in nature, or that undergo polymerization upon curing and ageing, can now be better investigated. Most constituents of paint layers and archaeological organic substances contain major insoluble and chemically non-hydrolysable fractions that are inaccessible to GC or HPLC. To date, molecular scientific investigations of the organic constituents of artworks and historical objects have mostly focused on the minor constituents of the sample. This review presents recent advances in the qualitative and semi-quantitative analyses of organic materials in heritage objects based on analytical pyrolysis coupled with mass spectrometry. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, Professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  10. Quantitative elemental analysis of an industrial mineral talc, using accelerator-based analytical technique

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.

    2005-10-01

    The accelerator-based technique of PIXE was employed for the determination of the elemental concentration of an industrial mineral, talc. Talc is a very versatile mineral in industry, with several applications. Because of this, there is a need to know its constituents to ensure that workers are not exposed to health risks. In addition, microscopic tests on some talc samples in Nigeria confirm that they fall within the British Pharmacopoeia (BP) standard for tablet formation. However, for these samples to become a local source of raw material for pharmaceutical-grade talc, the precise elemental compositions should be established, which is the focus of this work. The proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results, which show the concentration of different elements in the talc samples, their health implications and metabolic roles, are presented and discussed.

  11. Confocal Raman Microscopy for pH-Gradient Preconcentration and Quantitative Analyte Detection in Optically Trapped Phospholipid Vesicles.

    PubMed

    Hardcastle, Chris D; Harris, Joel M

    2015-08-04

    The ability of a vesicle membrane to preserve a pH gradient, while allowing for diffusion of neutral molecules across the phospholipid bilayer, can provide the isolation and preconcentration of ionizable compounds within the vesicle interior. In this work, confocal Raman microscopy is used to observe (in situ) the pH-gradient preconcentration of compounds into individual optically trapped vesicles that provide sub-femtoliter collectors for small-volume samples. The concentration of analyte accumulated in the vesicle interior is determined relative to a perchlorate-ion internal standard, preloaded into the vesicle along with a high-concentration buffer. As a guide to the experiments, a model for the transfer of analyte into the vesicle based on acid-base equilibria is developed to predict the concentration enrichment as a function of source-phase pH and analyte concentration. To test the concept, the accumulation of benzyldimethylamine (BDMA) was measured within individual 1 μm phospholipid vesicles having a stable initial pH that is 7 units lower than the source phase. For low analyte concentrations in the source phase (100 nM), a concentration enrichment into the vesicle interior of (5.2 ± 0.4) × 10(5) was observed, in agreement with the model predictions. Detection of BDMA from a 25 nM source-phase sample was demonstrated, a noteworthy result for an unenhanced Raman scattering measurement. The developed model accurately predicts the falloff of enrichment (and measurement sensitivity) at higher analyte concentrations, where the transfer of greater amounts of BDMA into the vesicle titrates the internal buffer and decreases the pH gradient. The predictable calibration response over 4 orders of magnitude in source-phase concentration makes it suitable for quantitative analysis of ionizable compounds from small-volume samples. The kinetics of analyte accumulation are relatively fast (∼15 min) and are consistent with the rate of transfer of a polar aromatic
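
    A minimal sketch of the acid-base equilibrium model described above, for a weak base whose neutral form equilibrates across the bilayer: the predicted enrichment is the Henderson-Hasselbalch ratio of total (neutral plus protonated) base inside versus outside the vesicle. The pKa and pH values below are illustrative, not the paper's parameters.

        # Predicted pH-gradient enrichment for a weak base.
        def enrichment(pKa, pH_source, pH_vesicle):
            total_in = 1 + 10 ** (pKa - pH_vesicle)   # protonated form dominates inside
            total_out = 1 + 10 ** (pKa - pH_source)
            return total_in / total_out

        # A 7-unit gradient with a basic analyte gives roughly 1e5-1e6 enrichment,
        # the order of magnitude reported for BDMA.
        print(f"{enrichment(pKa=9.0, pH_source=10.0, pH_vesicle=3.0):.2e}")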

  12. An analytical approach for the calculation of stress-intensity factors in transformation-toughened ceramics

    NASA Astrophysics Data System (ADS)

    Müller, W. H.

    1990-12-01

    Stress-induced transformation toughening in Zirconia-containing ceramics is described analytically by means of a quantitative model: a Griffith crack which interacts with a transformed, circular Zirconia inclusion. Due to its volume expansion, a ZrO2 particle compresses the crack flanks, whereas a particle in front of the crack opens the flanks such that the crack is attracted and finally absorbed. Erdogan's integral equation technique is applied to calculate the dislocation functions and the stress-intensity factors corresponding to these situations. In order to derive analytical expressions, the elastic constants of the inclusion and the matrix are assumed to be equal.

  13. Absolute activity quantitation from projections using an analytical approach: comparison with iterative methods in Tc-99m and I-123 brain SPECT

    NASA Astrophysics Data System (ADS)

    Fakhri, G. El; Kijewski, M. F.; Moore, S. C.

    2001-06-01

    Estimates of SPECT activity within certain deep brain structures could be useful for clinical tasks such as early prediction of Alzheimer's disease with Tc-99m or Parkinson's disease with I-123; however, such estimates are biased by poor spatial resolution and inaccurate scatter and attenuation corrections. We compared an analytical approach (AA) for more accurate quantitation with a slower iterative approach (IA). Monte Carlo simulated projections of 12 normal and 12 pathologic Tc-99m perfusion studies, as well as 12 normal and 12 pathologic I-123 neurotransmission studies, were generated using a digital brain phantom and corrected for scatter by a multispectral fitting procedure. The AA included attenuation correction by a modified Metz-Fan algorithm and activity estimation by a technique that incorporated Metz filtering to compensate for variable collimator response (VCR). The IA modeled attenuation and VCR in the projector/backprojector of an ordered subsets-expectation maximization (OSEM) algorithm. Bias and standard deviation over the 12 normal and 12 pathologic patients were calculated with respect to the reference values in the corpus callosum, caudate nucleus, and putamen. The IA and AA yielded similar quantitation results in both Tc-99m and I-123 studies in all brain structures considered, in both normal and pathologic patients. The bias with respect to the reference activity distributions was less than 7% for Tc-99m studies, but greater than 30% for I-123 studies, due to the partial volume effect in the striata. Our results were validated using I-123 physical acquisitions of an anthropomorphic brain phantom. The AA yielded quantitation accuracy comparable to that obtained with the IA, while requiring much less processing time. However, in most conditions, the IA yielded lower noise for the same bias than did the AA.

  14. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    Several techniques are used to analyze microplastics, often based on a combination of visual and spectroscopic methods. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size proved to have a marked influence on the qualitative and quantitative performance of the DSC signals: both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is required for this analytical approach.
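
    A minimal sketch of the DSC quantitation logic: the melting onset temperature identifies the polymer, and the integrated heat flow of the melting peak divided by a calibrated specific enthalpy gives its mass. The onset windows and enthalpies below are illustrative placeholders, not the paper's calibration values.

        # Identify a polymer by melting onset, then convert peak area to mass.
        REFERENCE = {          # polymer: (onset window in degC, enthalpy in J/g)
            "LDPE": ((98, 115), 60.0),
            "HDPE": ((120, 135), 140.0),
            "PP":   ((150, 165), 85.0),
            "PET":  ((240, 255), 45.0),
        }

        def identify_and_quantify(onset_c, peak_area_mj):
            for polymer, ((lo, hi), dh) in REFERENCE.items():
                if lo <= onset_c <= hi:
                    mass_ug = (peak_area_mj / 1000.0) / dh * 1e6
                    return polymer, mass_ug
            return None, 0.0

        print(identify_and_quantify(onset_c=128.0, peak_area_mj=3.5))  # ('HDPE', 25.0)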

  15. Evaluation and performance of desorption electrospray ionization using a triple quadrupole mass spectrometer for quantitation of pharmaceuticals in plasma.

    PubMed

    Kennedy, Joseph H; Wiseman, Justin M

    2010-02-01

    The present work describes the methodology and investigates the performance of desorption electrospray ionization (DESI) combined with a triple quadrupole mass spectrometer for the quantitation of small drug molecules in human plasma. Amoxepine, atenolol, carbamazepine, clozapine, prazosin, propranolol and verapamil were selected as target analytes, while terfenadine was selected as the internal standard common to each of the analytes. Protein precipitation of human plasma using acetonitrile was utilized for all samples. Limits of detection were determined for all analytes in plasma and shown to be in the range 0.2-40 ng/mL. Quantitative analysis of amoxepine, prazosin and verapamil was performed over the range 20-7400 ng/mL and shown to be linear in all cases with R(2) >0.99. In most cases, the precision (relative standard deviation) and accuracy (relative error) of each method were less than or equal to 20%. The performance of the combined techniques made it possible to analyze each sample in 15 s, illustrating DESI tandem mass spectrometry (MS/MS) as a powerful tool for the quantitation of analytes in deproteinized human plasma. Copyright 2010 John Wiley & Sons, Ltd.
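
    A minimal sketch of the internal-standard calibration underlying such a method: analyte/terfenadine response ratios from spiked plasma standards define a line, from which unknowns are read back. All numbers are hypothetical.

        # Internal-standard calibration line and back-calculation of an unknown.
        import numpy as np

        conc = np.array([20, 100, 500, 2000, 7400])       # standards, ng/mL
        ratio = np.array([0.021, 0.11, 0.52, 2.1, 7.6])   # analyte/IS peak-area ratios

        slope, intercept = np.polyfit(conc, ratio, 1)
        r2 = np.corrcoef(conc, ratio)[0, 1] ** 2

        unknown = (1.3 - intercept) / slope               # sample with ratio 1.3
        print(f"R^2 = {r2:.4f}, unknown = {unknown:.0f} ng/mL")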

  16. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  17. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  18. Quantitative assessment of prevalence of pre-analytical variables and their effect on coagulation assay. Can intervention improve patient safety?

    PubMed

    Bhushan, Ravi; Sen, Arijit

    2017-04-01

    Very few Indian studies exist on the evaluation of pre-analytical variables affecting "Prothrombin Time", the most common coagulation assay performed. The study was performed in an Indian tertiary care setting with the aim of quantitatively assessing the prevalence of pre-analytical variables and their effects on the results (patient safety) for the Prothrombin Time test, and of evaluating whether intervention corrected the results. The study first evaluated the prevalence of various pre-analytical variables detected in samples sent for Prothrombin Time testing. Wherever possible, samples with detected variables were tested and the results noted. Samples from the same patients were then re-collected and retested, ensuring that no pre-analytical variable was present, and the results were again noted to check for the difference the intervention produced. The study evaluated 9989 samples received for PT/INR over a period of 18 months. The overall prevalence of pre-analytical variables was 862 (8.63%). The proportions of the various pre-analytical variables detected were: haemolysed samples 515 (5.16%), over-filled vacutainers 62 (0.62%), under-filled vacutainers 39 (0.39%), low values 205 (2.05%), clotted samples 11 (0.11%), wrong labeling 4 (0.04%), wrong vacutainer use 2 (0.02%), chylous samples 7 (0.07%) and samples with more than one variable 17 (0.17%). The percentage of samples showing errors was compared for four of the variables, since these could be tested with and without the variable in place. The reduction in error percentage post intervention was 91.5%, 69.2%, 81.5% and 95.4% for haemolysed, over-filled, under-filled and excess-phlebotomy-pressure samples, respectively. Correcting the variables reduced the error percentage to a great extent for these four variables; hence these variables do affect "Prothrombin Time" testing and can hamper patient safety.

  19. The Role of Cations on the Performance of Lithium Ion Batteries: A Quantitative Analytical Approach.

    PubMed

    Nowak, Sascha; Winter, Martin

    2018-02-20

    Lithium ion batteries are nowadays the state-of-the-art power sources for portable electronic devices and the most promising candidate for energy storage in large-size batteries, e.g., in pure and hybrid electric vehicles. However, the degradation of the cell components minimizes both storage and operation lifetime (calendar and cycle life), which is called aging. Because of the numerous different aging effects, in either the single constituents or their interactions with each other, many reports about methodologies and techniques, both electrochemical and analytical, can be found in the literature; however, quantitative data about the degradation effects are seldom stated. One important effect is cation distribution and migration during operation. Metal dissolution from the cathode, metal migration, and the corresponding deposition of these metals on the graphitic anode are known harmful degradation effects, especially for the solid electrolyte interphase formed on the surface of the anode. Depending on the applied cell chemistry, and therefore the cathode material, different mechanisms have been reported so far. For lithium manganese oxide based cells, acidification of the electrolyte due to decomposition of the conducting salt is attributed as the main source of metal migration. Due to the subsequent loss of manganese from the cathode, the overall performance of the cell is seriously impaired. Based on these observations, this degradation mechanism was adopted for lithium nickel cobalt manganese based cells as the main cause of their capacity fading. However, with the help of a developed total X-ray fluorescence method and additional surface and electrolyte investigations, the proposed HF-based mechanism was disproven. Instead, the migration was directly associated with material defects or mechanical spalling of the particles. Furthermore, with the obtained quantitative data of the migrated transition metals on the anode and separator, the contribution on the capacity fade was

  20. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    PubMed

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians working in the health-care diagnostic systems of developing countries currently face the challenges of rising costs, increased numbers of patient visits, and limited resources. A significant trend is the use of low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiple analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performance and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performance, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. A Label-Free Porous Silicon Immunosensor for Broad Detection of Opiates in a Blind Clinical Study and Result Comparison to Commercial Analytical Chemistry Techniques

    PubMed Central

    Bonanno, Lisa M.; Kwong, Tai C.; DeLouise, Lisa A.

    2010-01-01

    In this work we evaluate for the first time the performance of a label-free porous silicon (PSi) immunosensor assay in a blind clinical study designed to screen authentic patient urine specimens for a broad range of opiates. The PSi opiate immunosensor achieved 96% concordance with liquid chromatography-mass spectrometry/tandem mass spectrometry (LC-MS/MS) results on samples that underwent standard opiate testing (n=50). In addition, successful detection of a commonly abused opiate, oxycodone, resulted in 100% qualitative agreement between the PSi opiate sensor and LC-MS/MS. In contrast, a commercial broad opiate immunoassay technique (CEDIA®) achieved 65% qualitative concordance with LC-MS/MS. Evaluation of important performance attributes including precision, accuracy, and recovery was completed on blank urine specimens spiked with test analytes. Variability of morphine detection as a model opiate target was < 9% both within-run and between-day at and above the cutoff limit of 300 ng ml−1. This study validates the analytical screening capability of label-free PSi opiate immunosensors in authentic patient samples and is the first semi-quantitative demonstration of the technology’s successful clinical use. These results motivate future development of PSi technology to reduce complexity and cost of diagnostic testing particularly in a point-of-care setting. PMID:21062030

  2. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    PubMed

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material which functions as a skin moisturizer and softener. Therefore, it is important to develop a fast and reliable quantitative analytical method. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sample handling technique can be successfully used to quantify VCO in cream cosmetic preparations. A multivariate analysis using a partial least squares (PLS) calibration model revealed a good relationship between the actual and FTIR-predicted values of VCO, with a coefficient of determination (R2) of 0.998.
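
    A minimal sketch of the PLS calibration step, with synthetic spectra standing in for real ATR absorbances (scikit-learn's PLSRegression; everything below is illustrative):

        # PLS calibration of % VCO against spectra (synthetic data).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        vco = np.linspace(0, 30, 20)                # % VCO in cream (hypothetical)
        signature = rng.normal(1.0, 0.2, 200)       # spectral response of VCO
        spectra = vco[:, None] * signature + rng.normal(0, 0.5, (20, 200))

        pls = PLSRegression(n_components=2).fit(spectra, vco)
        pred = pls.predict(spectra).ravel()
        r2 = 1 - ((vco - pred)**2).sum() / ((vco - vco.mean())**2).sum()
        print(f"R^2 = {r2:.3f}")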

  3. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  4. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored either by in vivo sampling coupled to analytical methods or by implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in the stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  5. Intracellular subsurface imaging using a hybrid shear-force feedback/scanning quantitative phase microscopy technique

    NASA Astrophysics Data System (ADS)

    Edward, Kert

    Quantitative phase microscopy (QPM) allows for the imaging of translucent or transparent biological specimens without the need for exogenous contrast agents. This technique is usually applied to the investigation of simple cells such as red blood cells, which are typically enucleated and can be considered homogeneous. However, most biological cells are nucleated and contain other interesting intracellular organelles. It has been established that the physical characteristics of certain subsurface structures, such as the shape and roughness of the nucleus, are well correlated with the onset and progress of pathological conditions such as cancer. Although the acquired quantitative phase information of biological cells contains surface information as well as coupled subsurface information, the latter has been ignored up until now. A novel scanning quantitative phase imaging system unencumbered by 2pi ambiguities is hereby presented. This system is incorporated into a shear-force feedback scheme which allows for simultaneous phase and topography determination. It is shown how subsequent image processing of these two data sets allows for the extraction of the subsurface component of the phase data and for in vivo cell refractometry studies. Both fabricated samples and biological cells, ranging from rat fibroblast cells to malaria-infected human erythrocytes, were investigated as part of this research. The results correlate well with those obtained via other microscopy techniques.

  6. Assessment of analytical techniques for predicting solid propellant exhaust plumes and plume impingement environments

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.

    1977-01-01

    An analysis of experimental nozzle, exhaust plume, and exhaust plume impingement data is presented. The data were obtained for subscale solid propellant motors with propellant Al loadings of 2, 10 and 15% exhausting to simulated altitudes of 50,000, 100,000 and 112,000 ft. Analytical predictions were made using a fully coupled two-phase method of characteristics numerical solution and a technique for defining thermal and pressure environments experienced by bodies immersed in two-phase exhaust plumes.

  7. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Quantitative Electron Probe Microanalysis: State of the Art

    NASA Technical Reports Server (NTRS)

    Carpernter, P. K.

    2005-01-01

    Quantitative electron-probe microanalysis (EPMA) has improved due to better instrument design and X-ray correction methods. Design improvement of the electron column and X-ray spectrometer has resulted in measurement precision that exceeds analytical accuracy. Wavelength-dispersive spectrometers (WDS) have layered-dispersive diffraction crystals with improved light-element sensitivity. Newer energy-dispersive spectrometers (EDS) have Si-drift detector elements, thin window designs, and digital processing electronics with X-ray throughput approaching that of WDS systems. Using these systems, digital X-ray mapping coupled with spectrum imaging is a powerful compositional mapping tool. Improvements in analytical accuracy are due to better X-ray correction algorithms, mass absorption coefficient data sets, and analysis methods for complex geometries. ZAF algorithms have been superseded by Phi(pz) algorithms that better model the depth distribution of primary X-ray production. Complex thin-film and particle geometries are treated using Phi(pz) algorithms, and results agree well with Monte Carlo simulations. For geological materials, X-ray absorption dominates the corrections and depends on the accuracy of mass absorption coefficient (MAC) data sets. However, few MACs have been experimentally measured, and the use of fitted coefficients continues due to the general success of the analytical technique. A polynomial formulation of the Bence-Albee alpha-factor technique, calibrated using Phi(pz) algorithms, is used to critically evaluate accuracy issues; accuracy is roughly 2% relative and is limited by measurement precision for ideal cases, but for many elements the analytical accuracy is unproven. The EPMA technique has improved to the point where it is frequently used instead of the petrographic microscope for reconnaissance work. Examples of stagnant research areas are: WDS detector design, characterization of calibration standards, and the need for more complete
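
    A minimal sketch of the alpha-factor idea mentioned above: measured k-ratios are converted to concentrations by iterating C_i = k_i * sum_j(alpha_ij * C_j) over normalized concentrations. The alpha factors and k-ratios below are hypothetical, and this skeleton ignores the polynomial (composition-dependent) refinement of the factors.

        # Bence-Albee-style alpha-factor iteration (hypothetical inputs).
        import numpy as np

        def alpha_factor_correction(k, alpha, iters=50):
            c = k / k.sum()                    # first guess from raw k-ratios
            for _ in range(iters):
                c = k * (alpha @ c)            # beta_i = sum_j alpha_ij * c_j
                c /= c.sum()                   # renormalize oxide fractions
            return c

        k = np.array([0.45, 0.35, 0.15])       # measured k-ratios
        alpha = np.array([[1.00, 1.10, 0.95],
                          [0.90, 1.00, 1.05],
                          [1.08, 0.97, 1.00]])
        print(alpha_factor_correction(k, alpha).round(4))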

  9. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  10. Quantitative Phase Fraction Detection in Organic Photovoltaic Materials through EELS Imaging

    DOE PAGES

    Dyck, Ondrej; Hu, Sheng; Das, Sanjib; ...

    2015-11-24

    Organic photovoltaic materials have recently seen intense interest from the research community. Improvements in device performance are occurring at an impressive rate; however, visualization of the active layer phase separation still remains a challenge. Our paper outlines the application of two electron energy-loss spectroscopic (EELS) imaging techniques that can complement and enhance current phase detection techniques. Specifically, the bulk plasmon peak position, often used to produce contrast between phases in energy filtered transmission electron microscopy (EFTEM), is quantitatively mapped across a sample cross section. One complementary spectrum image capturing the carbon and sulfur core loss edges is compared with the plasmon peak map and found to agree quite well, indicating that carbon and sulfur density differences between the two phases also allow phase discrimination. Additionally, an analytical technique for determining absolute atomic areal density is used to produce an absolute carbon and sulfur areal density map. We also show how these maps may be re-interpreted as a phase ratio map, giving quantitative information about the purity of the phases within the junction.

  11. Evaluation of available analytical techniques for monitoring the quality of space station potable water

    NASA Technical Reports Server (NTRS)

    Geer, Richard D.

    1989-01-01

    To assure the quality of potable water (PW) on the Space Station (SS), a number of chemical and physical tests must be conducted routinely. After reviewing the requirements for potable water, both direct and indirect analytical methods that could make the required tests and improvements compatible with Space Station operation are evaluated. A variety of suggestions are made to improve the analytical techniques for SS operation. The most important recommendations are: (1) the silver/silver chloride electrode (SB) method of removing I sub 2/I (-) biocide from the water, since it may interfere with analytical procedures for PW and also its end uses; (2) the orbital reactor (OR) method of carrying out chemistry and electrochemistry in microgravity by using a disk-shaped reactor on an orbital table to impart artificial G force to the contents, allowing solution mixing and separation of gases and liquids; and (3) a simple ultra-low-volume, highly sensitive electrochemical/conductivity detector for use with a capillary zone electrophoresis apparatus. It is also recommended, since several different conductivity and resistance measurements are made during the analysis of PW, that the bipolar pulse measuring circuit be used in all these applications for maximum compatibility and redundancy of equipment.

  12. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.

  13. SnapShot: Visualization to Propel Ice Hockey Analytics.

    PubMed

    Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T

    2012-12-01

    Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, yet given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.

  14. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations, in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  15. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of (14)C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of (14)C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10(-15) (14)C/(12)C ratios are obtained. Using a 15-W (14)CO2 laser, a linear calibration with samples from 10(-15) to >1.5 x 10(-12) in (14)C/(12)C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.

  16. Combined use of optical and electron microscopic techniques for the measurement of hygroscopic property, chemical composition, and morphology of individual aerosol particles.

    PubMed

    Ahn, Kang-Ho; Kim, Sun-Man; Jung, Hae-Jin; Lee, Mi-Jung; Eom, Hyo-Jin; Maskey, Shila; Ro, Chul-Un

    2010-10-01

    In this work, an analytical method for the characterization of the hygroscopic property, chemical composition, and morphology of individual aerosol particles is introduced. The method, which is based on the combined use of optical and electron microscopic techniques, is simple and easy to apply. An optical microscopic technique was used to perform the visual observation of the phase transformation and hygroscopic growth of aerosol particles on a single particle level. A quantitative energy-dispersive electron probe X-ray microanalysis, named low-Z particle EPMA, was used to perform a quantitative chemical speciation of the same individual particles after the measurement of the hygroscopic property. To validate the analytical methodology, the hygroscopic properties of artificially generated NaCl, KCl, (NH(4))(2)SO(4), and Na(2)SO(4) aerosol particles of micrometer size were investigated. The practical applicability of the analytical method for studying the hygroscopic property, chemical composition, and morphology of ambient aerosol particles is demonstrated.

  17. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

    This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method other than the DTM, such as the homotopy perturbation method.
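
    To make the method of steps plus DTM concrete, here is a minimal sketch on the constant-delay test problem y'(t) = -y(t-1) with y(t) = 1 for t <= 0 (the paper treats variable delays, which need extra work). With the step length equal to the delay, the local Taylor coefficients of the delayed term on interval n are exactly those computed on interval n-1, so the DTM recurrence is Y_n(k+1) = -Y_{n-1}(k)/(k+1).

        # Method of steps + differential transform method for y'(t) = -y(t-1).
        N_TERMS, N_STEPS = 12, 3

        prev = [1.0] + [0.0] * (N_TERMS - 1)   # history y(t)=1 as Taylor coefficients
        y0 = 1.0                               # y at the start of the current interval
        for n in range(N_STEPS):
            cur = [y0] + [0.0] * (N_TERMS - 1)
            for k in range(N_TERMS - 1):
                cur[k + 1] = -prev[k] / (k + 1)    # DTM recurrence
            y0 = sum(cur)                          # evaluate series at local s = 1
            prev = cur
            print(f"y({n + 1}) = {y0:.6f}")
        # Exact values: y(1) = 0, y(2) = -0.5, y(3) = -1/6.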

  18. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

    Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS, followed by PCA of the ESI-MS data, was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n), and PCA of the LC-ESI-MS data, were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. PCA results from the LC-ESI-MS data of Astragalus samples were in reasonable agreement with both the PCA results of the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
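
    A minimal sketch of the chemometric step: PCA of a samples-by-m/z intensity matrix, whose scores plot groups similar extracts together. The matrix below is a synthetic stand-in for the eight Astragalus ESI-MS profiles.

        # PCA of binned MS intensities (synthetic data, scikit-learn).
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        X = rng.random((8, 300))                 # 8 extracts x 300 m/z bins
        X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # autoscale columns

        scores = PCA(n_components=2).fit_transform(X)
        for i, (pc1, pc2) in enumerate(scores, start=1):
            print(f"extract {i}: PC1 = {pc1:+.2f}, PC2 = {pc2:+.2f}")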

  19. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    ERIC Educational Resources Information Center

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  20. The application of absolute quantitative (1)H NMR spectroscopy in drug discovery and development.

    PubMed

    Singh, Suruchi; Roy, Raja

    2016-07-01

    The identification of a drug candidate and its structural determination are the most important steps in the process of drug discovery, and for this, nuclear magnetic resonance (NMR) is one of the most selective analytical techniques. The present review illustrates the various perspectives of absolute quantitative (1)H NMR spectroscopy in drug discovery and development. It deals with the fundamentals of quantitative NMR (qNMR), the physicochemical properties affecting qNMR, and the latest referencing techniques used for quantification. The precise application of qNMR during various stages of drug discovery and development, namely natural product research, drug quantitation in dosage forms, drug metabolism studies, impurity profiling and solubility measurements, is elaborated. To achieve this, the authors explore the literature on NMR in drug discovery and development between 1963 and 2015, also taking into account several other reviews on the subject. qNMR experiments are used throughout drug discovery and development because qNMR is a non-destructive, versatile and robust technique with high intra- and inter-laboratory reproducibility. However, there are several limitations: qNMR of complex biological samples suffers from peak overlap and limited sensitivity at low analyte concentrations, which can be overcome by using hyphenated chromatographic techniques in addition to NMR.
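
    The core relation behind absolute qNMR is that the integrated signal area per proton is proportional to the molar amount of the species, so purity follows from a single comparison with an internal standard. A minimal sketch of that standard equation (function and variable names are ours):

        def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
            """Analyte purity from (1)H signal areas against an internal standard.

            I: integrated areas, N: protons in the integrated signal,
            M: molar masses, m: weighed masses, P_s: purity of the standard.
            """
            return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s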

  1. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    PubMed

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples, respectively, as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than the other films. However, within subjects, lesbian women showed significantly different arousal responses suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.
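
    The within-subjects modeling idea can be illustrated with an ordinary smoothing spline fitted to one participant's arousal time series instead of collapsing it to a mean (a sketch only, with simulated data; the authors' exact spline specification is not reproduced here):

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(0)
        t = np.linspace(0, 300, 600)                       # seconds of film viewing
        vpa = np.sin(t / 60) + rng.normal(0, 0.3, t.size)  # placeholder photoplethysmograph signal

        spline = UnivariateSpline(t, vpa, s=t.size * 0.3**2)  # smoothing ~ n x noise variance
        trajectory = spline(t)   # per-film arousal pattern, comparable within subjects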

  2. Quantitative X-ray dark-field and phase tomography using single directional speckle scanning technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongchang, E-mail: hongchang.wang@diamond.ac.uk; Kashyap, Yogesh; Sawhney, Kawal

    2016-03-21

    X-ray dark-field contrast tomography can provide important information about the interior of a sample that supplements conventional absorption tomography. Recently, the X-ray speckle-based technique has been proposed to provide qualitative two-dimensional dark-field imaging with a simple experimental arrangement. In this letter, we deduce a relationship between the second moment of the scattering angle distribution and the cross-correlation degradation of the speckle, and establish a quantitative basis for X-ray dark-field tomography using the single directional speckle scanning technique. In addition, the phase contrast images can be simultaneously retrieved, permitting tomographic reconstruction that yields enhanced contrast in weakly absorbing materials. Such a complementary tomography technique allows systematic investigation of complex samples containing both soft and hard materials.
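
    The qualitative ingredient can be sketched as a windowed loss of cross-correlation between reference and sample speckle images; the letter's quantitative mapping from this degradation to the second moment of the scattering-angle distribution is not reproduced here, and the window size and names are illustrative:

        import numpy as np

        def correlation_loss(ref, sam, win=16):
            """Local degradation of the speckle cross-correlation (dark-field proxy)."""
            h, w = ref.shape
            out = np.zeros((h // win, w // win))
            for i in range(h // win):
                for j in range(w // win):
                    r = ref[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
                    s = sam[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
                    out[i, j] = 1.0 - np.corrcoef(r, s)[0, 1]  # 0 where speckle is intact
            return out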

  3. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements averaging 3-fold in simple and 2-fold in complex samples. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
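
    Conceptually, TAD reduces to comparing the spiked sample against a spiked blank: signal beyond what the exogenous spike alone produces points to the endogenous analyte. A heavily simplified decision rule (the thresholding and names are ours, not the paper's statistics):

        def tad_detect(spiked_sample_signal, spiked_blank_signal, noise_sd, k=3.0):
            """Flag the endogenous analyte when the spiked sample exceeds the
            spiked blank by more than k standard deviations of the noise."""
            return (spiked_sample_signal - spiked_blank_signal) > k * noise_sd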

  4. Multi analytical technique study of human bones from an archaeological discovery.

    PubMed

    Lachowicz, J I; Palomba, S; Meloni, P; Carboni, M; Sanna, G; Floris, R; Pusceddu, V; Sarigu, M

    2017-03-01

    In 1953, during the building restoration of San Michele church (Bono, Sardinia, 16th-19th Century), a high number of disarticulated skeletons was recovered. From a group of 412 hip bones, two, affected by several pathological lesions, were analysed. The two coxal bones can be referred to the same individual, an adult man. A multi-analytical study, started with the purpose of investigating the bone pathology, was extended to characterize the mineral components of a large representative set of bones from the same ossuary, all attributed to adult men who lived in the region four to two centuries ago. A quantitative ICP-AES analysis for Ca, Fe, Mg, Mn, Na, Pb and Zn was performed, and a chemometric investigation of the results was carried out. This approach gave evidence of the effects of diagenesis, allowed some hypotheses on the incidence of the known dietary habits on bone composition, and completely differentiated the pathological bones from those of a normal population on the basis of the mineral composition. Moreover, porosity, crystallinity and FT-IR analyses were conducted on both non-pathological and pathological samples. Copyright © 2016 Elsevier GmbH. All rights reserved.

  5. Aberration measurement technique based on an analytical linear model of a through-focus aerial image.

    PubMed

    Yan, Guanyong; Wang, Xiangzhao; Li, Sikun; Yang, Jishuo; Xu, Dongbo; Erdmann, Andreas

    2014-03-10

    We propose an in situ aberration measurement technique based on an analytical linear model of through-focus aerial images. The aberrations are retrieved from aerial images of six isolated space patterns, which have the same width but different orientations. The imaging formulas of the space patterns are investigated and simplified, and then an analytical linear relationship between the aerial image intensity distributions and the Zernike coefficients is established. The linear relationship is composed of linear fitting matrices and rotation matrices, which can be calculated numerically in advance and utilized to retrieve Zernike coefficients. Numerical simulations using the lithography simulators PROLITH and Dr.LiTHO demonstrate that the proposed method can measure wavefront aberrations up to Z(37). Experiments on a real lithography tool confirm that our method can monitor lens aberration offset with an accuracy of 0.7 nm.
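
    Because the model is linear, coefficient retrieval reduces to ordinary least squares once the fitting matrices have been computed offline. A schematic sketch (the matrix A, offset I0 and all dimensions are placeholders, not the simulators' actual output):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(600, 33))             # precomputed linear fitting matrix
        I0 = rng.normal(size=600)                  # aberration-free intensity term
        z_true = rng.normal(scale=0.02, size=33)   # Zernike coefficients up to Z(37)
        I_meas = I0 + A @ z_true                   # through-focus aerial image samples

        z_hat, *_ = np.linalg.lstsq(A, I_meas - I0, rcond=None)
        print(np.max(np.abs(z_hat - z_true)))      # ~0 for noiseless synthetic data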

  6. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.

  7. Enhanced analytical sensitivity of a quantitative PCR for CMV using a modified nucleic-acid extraction procedure.

    PubMed

    Ferreira-Gonzalez, A; Yanovich, S; Langley, M R; Weymouth, L A; Wilkinson, D S; Garrett, C T

    2000-01-01

    Accurate and rapid diagnosis of CMV disease in immunocompromised individuals remains a challenge. Quantitative polymerase chain reaction (QPCR) methods for detection of CMV in peripheral blood mononuclear cells (PBMC) have improved the positive and negative predictive value of PCR for diagnosis of CMV disease. However, detection of CMV in plasma has demonstrated a lower negative predictive value than PBMC. To enhance the sensitivity of the QPCR assay for plasma specimens, plasma samples were centrifuged before nucleic-acid extraction and the extracted DNA resolubilized in reduced volume. Optimization of the nucleic-acid extraction focused on decreasing or eliminating the presence of inhibitors in the pelleted plasma. Quantitation was achieved by co-amplifying an internal quantitative standard (IS) with the same primer sequences as CMV. PCR products were detected by hybridization in a 96-well microtiter plate coated with a CMV- or IS-specific probe. The precision of the QPCR assay for samples prepared from untreated and from pelleted plasma was then assessed. The coefficients of variation for the two sample types were almost identical, and their magnitude was reduced by a factor of ten when the data were log transformed. Linearity of the QPCR assay extended over a 3.3-log range for both types of samples, but the range of linearity for pelleted plasma was 20 to 40,000 viral copies/ml (vc/ml), in contrast to 300 to 400,000 vc/ml for plasma. Thus, centrifugation of plasma before nucleic-acid extraction and resuspension of extracted CMV DNA in reduced volume enhanced the analytical sensitivity approximately tenfold over the dynamic range of the assay. Copyright 2000 Wiley-Liss, Inc.
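
    With a co-amplified internal standard sharing the CMV primer sequences, amplification efficiency largely cancels, and quantitation reduces to a signal ratio scaled by the known IS input. An illustrative helper (the names and the simple linear form are ours, not the assay's exact calibration):

        def cmv_viral_load(signal_cmv, signal_is, is_input_copies, plasma_volume_ml):
            """Viral copies per mL from the CMV/IS hybridization signal ratio."""
            return (signal_cmv / signal_is) * is_input_copies / plasma_volume_ml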

  8. Pre-analytical effects of blood sampling and handling in quantitative immunoassays for rheumatoid arthritis.

    PubMed

    Zhao, Xiaoyan; Qureshi, Ferhan; Eastman, P Scott; Manning, William C; Alexander, Claire; Robinson, William H; Hesterberg, Lyndal K

    2012-04-30

    Variability in pre-analytical blood sampling and handling can significantly impact results obtained in quantitative immunoassays. Understanding the impact of these variables is critical for accurate quantification and validation of biomarker measurements. Particularly in the design and execution of large clinical trials, even small differences in sample processing and handling can have dramatic effects on analytical reliability, results interpretation, trial management and outcome. The effects of two common blood sampling methods (serum vs. plasma) and two widely used serum handling methods (on the clot with ambient temperature shipping, "traditional", vs. centrifuged with cold chain shipping, "protocol") on protein and autoantibody concentrations were examined. Matched serum and plasma samples were collected from 32 rheumatoid arthritis (RA) patients representing a wide range of disease activity status. Additionally, a set of matched serum samples with two sample handling methods was collected. One tube was processed per the manufacturer's instructions and shipped overnight on cold packs (protocol). The matched tube, without prior centrifugation, was simultaneously shipped overnight at ambient temperatures (traditional). Upon delivery, the traditional tube was centrifuged. All samples were subsequently aliquoted and frozen prior to analysis of protein and autoantibody biomarkers. Median correlation between paired serum and plasma across all autoantibody assays was 0.99 (0.98-1.00) with a median % difference of -3.3 (-7.5 to 6.0). In contrast, observed protein biomarker concentrations were significantly affected by sample types, with a median correlation of 0.99 (0.33-1.00) and a median % difference of -10 (-55 to 23). When the two serum collection/handling methods were compared, the median correlation between paired samples for autoantibodies was 0.99 (0.91-1.00) with a median difference of 4%. In contrast, significant increases were observed in protein biomarker

  9. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  10. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressure, in series with a DuPont Nafion-based drying tube and a gas chromatograph, was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  11. Detection of genetically modified organisms in foods by DNA amplification techniques.

    PubMed

    García-Cañas, Virginia; Cifuentes, Alejandro; González, Ramón

    2004-01-01

    In this article, the different DNA amplification techniques that are being used for detecting genetically modified organisms (GMOs) in foods are examined. This study intends to provide an updated overview (covering work published up to June 2002) of the principal applications of such techniques, together with their main advantages and drawbacks for GMO detection in foods. Some relevant facts on sampling, DNA isolation, and DNA amplification methods are discussed. Moreover, these analytical protocols are discussed from a quantitative point of view, including the newest investigations on multiplex detection of GMOs in foods and validation of methods.

  12. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  13. Analytic tests and their relation to jet fuel thermal stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heneghan, S.P.; Kauffman, R.E.

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid-phase extraction on silica gel. The polar material has been identified (by mass spectrometry) as mainly phenols. Measures of the total acid number or peroxide concentration have little correlation with TS.

  14. A quantitative, comprehensive analytical model for "fast" magnetic reconnection in Hall MHD

    NASA Astrophysics Data System (ADS)

    Simakov, Andrei N.

    2008-11-01

    Magnetic reconnection in nature usually happens on fast (e.g., dissipation-independent) time scales. While such scales have been observed computationally [1], a fundamental analytical model capable of explaining them has been lacking. Here, we propose such a quantitative model for 2D Hall MHD reconnection without a guide field. The model recovers the Sweet-Parker and the electron MHD [2] results in the appropriate limits of the ion inertial length, di, and is valid everywhere in between [3]. The model predicts the dissipation region aspect ratio and the reconnection rate Ez in terms of dissipation and inertial parameters, and has been found to be in excellent agreement with non-linear simulations. It confirms a number of long-standing empirical results and resolves several controversies. In particular, we find that both open X-point and elongated dissipation regions allow "fast" reconnection and that Ez depends on di. Moreover, when applied to electron-positron plasmas, the model demonstrates that fast dispersive waves are not instrumental for "fast" reconnection [4]. [1] J. Birn et al., J. Geophys. Res. 106, 3715 (2001). [2] L. Chacón, A. N. Simakov, and A. Zocco, Phys. Rev. Lett. 99, 235001 (2007). [3] A. N. Simakov and L. Chacón, submitted to Phys. Rev. Lett. [4] L. Chacón, A. N. Simakov, V. Lukin, and A. Zocco, Phys. Rev. Lett. 101, 025003 (2008).

  15. Application of the correlation constrained multivariate curve resolution alternating least-squares method for analyte quantitation in the presence of unexpected interferences using first-order instrumental data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C; Tauler, Romà

    2010-03-01

    Correlation constrained multivariate curve resolution-alternating least-squares is shown to be a feasible method for processing first-order instrumental data and achieving analyte quantitation in the presence of unexpected interferences. For both simulated and experimental data sets, the proposed method could correctly retrieve the analyte and interference spectral profiles and perform accurate estimations of analyte concentrations in test samples. Since no information concerning the interferences was present in the calibration samples, the proposed multivariate calibration approach including the correlation constraint facilitates the achievement of the so-called second-order advantage for the analyte of interest, which is otherwise associated with more complex, information-richer higher-order instrumental data. The proposed method is tested using a simulated data set and two experimental data systems, one for the determination of ascorbic acid in powder juices using UV-visible absorption spectral data, and another for the determination of tetracycline in serum samples using fluorescence emission spectroscopy.
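
    A compact sketch of the alternating least-squares loop with the correlation constraint applied to the analyte's concentration profile (initialization, convergence checks and the authors' exact constraint sequence are omitted; all names are illustrative):

        import numpy as np

        def mcr_als_corr(D, S, c_cal, cal_rows, analyte=0, n_iter=200):
            """D: samples x wavelengths; S: initial spectral estimates
            (components x wavelengths); c_cal: known analyte concentrations
            for the calibration rows of D."""
            for _ in range(n_iter):
                C = D @ np.linalg.pinv(S)             # ALS step: concentration profiles
                C = np.clip(C, 0.0, None)             # non-negativity constraint
                a, b = np.polyfit(c_cal, C[cal_rows, analyte], 1)
                C[cal_rows, analyte] = a * c_cal + b  # correlation constraint
                S = np.linalg.pinv(C) @ D             # ALS step: spectral profiles
                S = np.clip(S, 0.0, None)
            return (C[:, analyte] - b) / a, C, S      # predicted analyte concentrations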

  16. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    PubMed Central

    Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan

    2013-01-01

    Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329

  17. Various extraction and analytical techniques for isolation and identification of secondary metabolites from Nigella sativa seeds.

    PubMed

    Liu, X; Abd El-Aty, A M; Shim, J-H

    2011-10-01

    Nigella sativa L. (black cumin), commonly known as black seed, is a member of the Ranunculaceae family. This seed is used as a natural remedy in many Middle Eastern and Far Eastern countries. Extracts prepared from N. sativa have, for centuries, been used for medical purposes. Thus far, the organic compounds in N. sativa, including alkaloids, steroids, carbohydrates, flavonoids, fatty acids, etc., have been fairly well characterized. Herein, we summarize some new extraction techniques, including microwave-assisted extraction (MAE) and supercritical fluid extraction (SFE), in addition to the classical method of hydrodistillation (HD), which have been employed for isolation, as well as the various analytical techniques used for the identification of secondary metabolites in black seed. We believe that some compounds contained in N. sativa remain to be identified, and that high-throughput screening could help to identify new compounds. A study addressing environmentally friendly techniques that have minimal or no environmental effects is currently underway in our laboratory.

  18. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports in the literature have demonstrated the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  19. Development of a BK virus real-time quantitative assay using the bioMérieux analyte-specific reagents in plasma specimens.

    PubMed

    Rennert, Hanna; Fernandes, Helen; Gilani, Zahid; Sipley, John

    2015-12-01

    Viral load testing for BK virus (BKV) has become the standard of care for diagnosing BKV infection and monitoring therapy in kidney transplant patients. However, there are currently no US Food and Drug Administration-approved assays and no standardization among available tests. This study evaluated the performance of the analyte-specific reagent (ASR) BKV primers r-gene and probe r-gene reagents (bioMérieux, Marcy l'Étoile, France), soon to become available on the US market, with respect to accuracy, linearity, precision, analytical sensitivity, specificity, and correlation with the Qiagen (Germantown, MD) BKV ASR test, using commercial material and patient plasma samples. The assay was linear from 204 to 3.92 million (2.31-6.6 log10) DNA copies/mL (coefficient of determination R(2) = 0.999). A dilution series demonstrated limits of detection and quantitation of 2.14 log10 and 2.30 log10 copies/mL (95% hit rate detection), respectively. Interrun precision was highly reproducible, with coefficients of variation ranging from 2.2% to 6.0%. A comparison of 34 matched samples showed good agreement (R(2) = 0.87) between the bioMérieux BKV laboratory test and the Qiagen BKV ASR assay results, with an average negative bias (-0.28 log10 copies/mL). The laboratory-developed test with bioMérieux BKV reagents is a reliable and sensitive assay for BKV DNA quantitation compared with the Qiagen ASR test. Copyright © by the American Society for Clinical Pathology.

  20. Quantitative Determination of Caffeine in Beverages Using a Combined SPME-GC/MS Method

    NASA Astrophysics Data System (ADS)

    Pawliszyn, Janusz; Yang, Min J.; Orton, Maureen L.

    1997-09-01

    Solid-phase microextraction (SPME) combined with gas chromatography/mass spectrometry (GC/MS) has been applied to the analysis of various caffeinated beverages. Unlike the current methods, this technique is solvent free and requires no pH adjustments. The simplicity of the SPME-GC/MS method lends itself to a good undergraduate laboratory practice. This publication describes the analytical conditions and presents the data for determination of caffeine in coffee, tea, and coke. Quantitation by isotopic dilution is also illustrated.
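
    Quantitation by isotopic dilution comes down to an area ratio against the spiked isotope-labeled standard, whose near-identical chemistry keeps the response factor close to unity. A schematic helper (variable names are ours):

        def caffeine_by_isotope_dilution(area_caffeine, area_labeled, spiked_conc, rf=1.0):
            """Caffeine concentration from the analyte/labeled-standard area ratio."""
            return rf * (area_caffeine / area_labeled) * spiked_conc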

  1. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state-analysis-based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state-based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided in the quantitative estimation of material properties.
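
    The recursive route can be realized with a standard recursive least-squares update running on streaming deflection data (a generic RLS sketch in our own notation; the article's equivalent-cantilever model itself is not reproduced):

        import numpy as np

        def rls_update(theta, P, phi, y, lam=0.995):
            """One recursive least-squares step with forgetting factor lam.

            theta: current parameter estimate; P: covariance matrix;
            phi: regressor vector; y: new measurement sample."""
            K = P @ phi / (lam + phi @ P @ phi)      # gain vector
            theta = theta + K * (y - phi @ theta)    # correct by the innovation
            P = (P - np.outer(K, phi @ P)) / lam     # covariance update
            return theta, P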

  2. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To provide a feasible solution for homing missiles requiring precise impact time and angle, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is first designed under the assumption of a stationary target, followed by a practical extension to the moving-target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other is targeted with a global sliding mode, so that the impact angle constraint as well as zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle and acceleration command can be fully expressed for prediction and offline parameter selection by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Proteomics Is Analytical Chemistry: Fitness-for-Purpose in the Application of Top-Down and Bottom-Up Analyses.

    PubMed

    Coorssen, Jens R; Yergey, Alfred L

    2015-12-03

    Molecular mechanisms underlying health and disease function at least in part based on the flexibility and fine-tuning afforded by protein isoforms and post-translational modifications. The ability to effectively and consistently resolve these protein species or proteoforms, as well as assess quantitative changes is therefore central to proteomic analyses. Here we discuss the pros and cons of currently available and developing analytical techniques from the perspective of the full spectrum of available tools and their current applications, emphasizing the concept of fitness-for-purpose in experimental design based on consideration of sample size and complexity; this necessarily also addresses analytical reproducibility and its variance. Data quality is considered the primary criterion, and we thus emphasize that the standards of Analytical Chemistry must apply throughout any proteomic analysis.

  4. Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength

    NASA Astrophysics Data System (ADS)

    Loho, T.; Dickinson, M.

    2018-04-01

    The mechanism by which ice adheres to surfaces is still not well understood, and there is currently no standard method for quantitatively measuring ice adhesion, which makes ice-surface studies difficult to compare. A novel quantitative lateral-force adhesion measurement at the micro-nano scale was created, in which micro-nano sized ice droplets (less than 3 μm in diameter and 100 nm in height) are sheared using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised, which increased data repeatability compared with bulk testing. The technique provided post-testing surface scans to confirm that the ice had been removed and that the measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by its nano-scale surface roughness, with rougher surfaces having higher ice adhesion strength.

  5. Electrical field-induced extraction and separation techniques: promising trends in analytical chemistry--a review.

    PubMed

    Yamini, Yadollah; Seidi, Shahram; Rezazadeh, Maryam

    2014-03-03

    Sample preparation is an important issue in analytical chemistry, and is often a bottleneck in chemical analysis. Thus, the major incentive for recent research has been to attain faster, simpler, less expensive, and more environmentally friendly sample preparation methods. The use of auxiliary energies, such as heat, ultrasound, and microwave, is one of the strategies that have been employed in sample preparation to reach the above purposes. Application of an electrical driving force is the current state of the art, which presents new possibilities for simplifying and shortening the sample preparation process as well as enhancing its selectivity. The electrical driving force has scarcely been utilized in comparison with other auxiliary energies. In this review, the different roles of the electrical driving force (as a powerful auxiliary energy) in various extraction techniques, including liquid-, solid-, and membrane-based methods, have been taken into consideration. References relevant to developments in separation techniques and Lab-on-a-Chip (LOC) systems are also provided. All aspects of the electrical driving force in extraction and separation methods are too specific to be treated in this contribution. However, the main aim of this review is to provide a brief knowledge of the different fields of analytical chemistry, with an emphasis on the latest efforts put into electrically assisted membrane-based sample preparation systems. The advantages and disadvantages of these approaches as well as the new achievements in these areas have been discussed, which might be helpful for further progress in the future. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Analytical techniques for measuring hydrocarbon emissions from the manufacture of fiberglass-reinforced plastics. Report for June 1995--March 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, R.S.; Kong, E.J.; Bahner, M.A.

    The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within +/-30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.

  7. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    PubMed

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L⁻¹ range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L⁻¹ in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L⁻¹ for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015

  8. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride--PVC) is used. 8 phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012

  9. Development of inspection techniques for quantitatively measuring surface contamination on SRM hardware

    NASA Technical Reports Server (NTRS)

    Law, R. D.

    1989-01-01

    A contaminant is any material or substance which is potentially undesirable or which may adversely affect any part, component, or assembly. Contamination control of SRM hardware surfaces is a serious concern, for both Thiokol and NASA, with particular concern for contaminants which may adversely affect bonding surfaces. The purpose of this study is to develop laboratory analytical techniques which will make it possible to certify the cleanliness of any designated surface, with special focus on particulates (dust, dirt, lint, etc.), oils (hydrocarbons, silicones, plasticizers, etc.), and greases (HD-2, fluorocarbon grease, etc.). The hardware surfaces of concern will include D6AC steel, aluminum alloys, anodized aluminum alloys, glass/phenolic, carbon/phenolic, NBR/asbestos-silica, and EPDM rubber.

  10. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  11. Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Johnson, Dennis C.

    1980-01-01

    Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)

  12. A quantitative evaluation of the high elbow technique in front crawl.

    PubMed

    Suito, Hiroshi; Nunome, Hiroyuki; Ikegami, Yasuo

    2017-07-01

    Coaches often instruct swimmers to keep the elbow in a high position (high elbow position) during the early phase of the underwater stroke motion (pull phase) in front crawl; however, the high elbow position has never been quantitatively evaluated. The aims of this study were (1) to quantitatively evaluate the "high elbow" position, (2) to clarify the relationship between the high elbow position and the required upper limb configuration and (3) to examine the efficacy of the high elbow position on the resultant swimming velocity. Sixteen highly skilled and 6 novice male swimmers performed 25 m front crawl with maximal effort and their 3-dimensional arm stroke motion was captured at 60 Hz. An attempt was made to develop a new index to evaluate the high elbow position (I_he: high elbow index) using 3-dimensional coordinates of the shoulder, elbow and wrist joints. I_he of skilled swimmers moderately correlated with the average shoulder internal rotation angle (r = -0.652, P < 0.01) and swimming velocity (r = -0.683, P < 0.01) during the pull phase. These results indicate that I_he is a useful index for evaluating high elbow arm stroke technique during the pull phase in front crawl.
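
    The abstract does not state the formula for I_he, so the following is only a hypothetical proxy built from the same three joint coordinates: the elbow's height above the shoulder-wrist line, which captures the intuition of a "high elbow" during the pull:

        import numpy as np

        def high_elbow_proxy(shoulder, elbow, wrist):
            """Hypothetical stand-in for I_he (not the paper's definition):
            vertical offset of the elbow from the shoulder-wrist line,
            assuming z-up 3-D joint coordinates."""
            sw = wrist - shoulder
            t = np.dot(elbow - shoulder, sw) / np.dot(sw, sw)
            nearest = shoulder + t * sw       # projection onto the line
            return elbow[2] - nearest[2]      # larger = elbow held higher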

  13. Quantitative Detection of Pharmaceuticals Using a Combination of Paper Microfluidics and Wavelength Modulated Raman Spectroscopy

    PubMed Central

    Craig, Derek; Mazilu, Michael; Dholakia, Kishan

    2015-01-01

    Raman spectroscopy has proven to be an indispensable technique for the identification of various types of analytes due to the fingerprint vibration spectrum obtained. Paper microfluidics has also emerged as a low-cost, easy-to-fabricate and portable approach for point-of-care testing. However, due to inherent background fluorescence, combining Raman spectroscopy with paper microfluidics has to date been an unmet challenge in the absence of surface-enhanced mechanisms. We describe the first use of wavelength modulated Raman spectroscopy (WMRS) for analysis on a paper microfluidics platform. This study demonstrates the ability to suppress the background fluorescence of the paper using WMRS and the subsequent implementation of this technique for pharmaceutical analysis. The results demonstrate that it is possible to discriminate between paracetamol and ibuprofen, whilst also quantitatively detecting each analyte at nanomolar concentrations. PMID:25938464
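
    The principle behind WMRS is that Raman peaks track the modulated excitation wavelength while the broad fluorescence background does not, so combining frames across the modulation cycle cancels the background. A bare-bones sketch (published implementations typically use PCA over the modulation cycle; array names are ours):

        import numpy as np

        def wmrs_background_suppress(frames):
            """frames: (n_excitation_shifts, n_pixels) spectra recorded while
            modulating the excitation wavelength. Static fluorescence cancels
            in frame-to-frame differences; shifting Raman features survive."""
            return np.diff(frames, axis=0).mean(axis=0)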

  14. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques (1H NMR, IR) and by physicochemical studies using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. These impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  15. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.

  16. A three-dimensional finite-element thermal/mechanical analytical technique for high-performance traveling wave tubes

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1991-01-01

    Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold-compression, or 'coining', fabrication technique that nevertheless shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.

  17. A three-dimensional finite-element thermal/mechanical analytical technique for high-performance traveling wave tubes

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Kurt A.; Bartos, Karen F.; Fite, E. B.; Sharp, G. R.

    1992-01-01

    Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold-compression, or 'coining', fabrication technique that nevertheless shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.

  18. Recent trends in analytical methods and separation techniques for drugs of abuse in hair.

    PubMed

    Baciu, T; Borrull, F; Aguilar, C; Calull, M

    2015-01-26

    Hair analysis of drugs of abuse has been a subject of growing interest from a clinical, social and forensic perspective for years because of the broad time detection window after intake in comparison to urine and blood analysis. Over the last few years, hair analysis has gained increasing attention and recognition for the retrospective investigation of drug abuse in a wide variety of contexts, shown by the large number of applications developed. This review aims to provide an overview of the state of the art and the latest trends used in the literature from 2005 to the present in the analysis of drugs of abuse in hair, with a special focus on separation analytical techniques and their hyphenation with mass spectrometry detection. The most recently introduced sample preparation techniques are also addressed in this paper. The main strengths and weaknesses of all of these approaches are critically discussed by means of relevant applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  20. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative, time-weighted average (TWA) measurements of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L⁻¹), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab sampling and 24-h composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
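
    TWA concentrations follow from the standard DGT equation, C = M * Δg / (D * A * t). A small helper under that equation (the unit choices are ours):

        def dgt_twa(M_ng, dg_cm, D_cm2_s, A_cm2, t_s):
            """Time-weighted average concentration (ng per cm^3, i.e. ng/mL)
            from accumulated mass M, diffusive layer thickness dg, diffusion
            coefficient D, sampler exposure area A and deployment time t."""
            return M_ng * dg_cm / (D_cm2_s * A_cm2 * t_s)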

  1. Mapping of thermal injury in biologic tissues using quantitative pathologic techniques

    NASA Astrophysics Data System (ADS)

    Thomsen, Sharon L.

    1999-05-01

    Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparison of lesion sizes and configurations for different instruments or heating sources and (3) comparison of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis and (9) wound organization and healing.

  2. Extended internal standard method for quantitative 1H NMR assisted by chromatography (EIC) for analyte overlapping impurity on 1H NMR spectra.

    PubMed

    Saito, Naoki; Kitamaki, Yuko; Otsuka, Satoko; Yamanaka, Noriko; Nishizaki, Yuzo; Sugimoto, Naoki; Imura, Hisanori; Ihara, Toshihide

    2018-07-01

    We devised a novel extended internal standard method for quantitative 1H NMR (qNMR) assisted by chromatography (EIC) that accurately quantifies 1H signal areas of analytes, even when the chemical shifts of the impurity and analyte signals overlap completely. When impurity and analyte signals overlap in the 1H NMR spectrum but can be separated in a chromatogram, the response ratio of the impurity and an internal standard (IS) can be obtained from the chromatogram. If the response ratio can be converted into the 1H signal area ratio of the impurity and the IS, the 1H signal area of the analyte can be evaluated accurately by mathematically correcting for the contribution of the impurity's 1H signal area to the overlapped signal in the 1H NMR spectrum. In this study, gas chromatography and liquid chromatography were used. We used 2-chlorophenol and 4-chlorophenol containing phenol as an impurity as examples in which impurity and analyte signals overlap, to validate and to demonstrate the EIC, respectively. Because the 1H signals of 2-chlorophenol and phenol can be separated in specific alkaline solutions, 2-chlorophenol is suitable for validating the EIC by comparing the analytical value obtained by the EIC with that obtained by qNMR alone under alkaline conditions. By the EIC, the purity of 2-chlorophenol was obtained with a relative expanded uncertainty (k = 2) of 0.24%. This purity matched that obtained under the alkaline condition. Furthermore, the EIC was also validated by evaluating the phenol content with the absolute calibration curve method by gas chromatography. Finally, we demonstrated that the EIC could evaluate the purity of 4-chlorophenol, whose 1H signal could not be separated from that of phenol under any condition, with a relative expanded uncertainty (k = 2) of 0.22%. Copyright © 2018 Elsevier B.V. All rights reserved.
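
    The correction at the heart of the EIC can be sketched in a few lines. The function below is a minimal illustration of the logic described in the abstract, not the authors' implementation; the names and the response-to-area conversion factor are hypothetical, and in practice that factor would be derived from detector response, proton counts and molar masses of the species involved.

        # Minimal sketch of the EIC correction logic described in the abstract.
        # All names and the conversion factor are hypothetical illustrations,
        # not the authors' actual implementation.

        def eic_corrected_area(overlapped_area: float,
                               is_area: float,
                               impurity_is_response_ratio: float,
                               response_to_area_factor: float) -> float:
            """Return the analyte 1H signal area after removing the impurity share.

            overlapped_area: total 1H area of the overlapping analyte + impurity signal
            is_area: 1H area of the internal standard (IS) signal
            impurity_is_response_ratio: impurity/IS response ratio from the chromatogram
            response_to_area_factor: assumed factor converting the chromatographic
                response ratio into a 1H signal area ratio
            """
            impurity_area = is_area * impurity_is_response_ratio * response_to_area_factor
            return overlapped_area - impurity_area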

  3. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
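
    Among the normalization methods discussed in this literature, probabilistic quotient normalization (PQN) is one commonly cited approach for correcting sample-to-sample dilution differences. A minimal numpy sketch, assuming a samples-by-features matrix of strictly positive intensities (an illustration of one method only; the review itself covers a broader set):

        import numpy as np

        def pqn_normalize(X: np.ndarray) -> np.ndarray:
            """Probabilistic quotient normalization of a samples x features matrix."""
            # Reference spectrum: median intensity of each feature across samples.
            reference = np.median(X, axis=0)
            # Per-sample dilution factor: median of the feature-wise quotients.
            quotients = X / reference          # assumes strictly positive intensities
            dilution = np.median(quotients, axis=1, keepdims=True)
            return X / dilution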

  4. Analytical investigation of thermal barrier coatings for advanced power generation combustion turbines

    NASA Technical Reports Server (NTRS)

    Amos, D. J.

    1977-01-01

    An analytical evaluation was conducted to determine quantitatively the improvement potential in cycle efficiency and cost of electricity made possible by the introduction of thermal barrier coatings to power generation combustion turbine systems. The thermal barrier system, a metallic bond coat and yttria stabilized zirconia outer layer applied by plasma spray techniques, acts as a heat insulator to provide substantial metal temperature reductions below that of the exposed thermal barrier surface. The study results show the thermal barrier to be a potentially attractive means for improving performance and reducing cost of electricity for the simple, recuperated, and combined cycles evaluated.

  5. The Xeno-glycomics database (XDB): a relational database of qualitative and quantitative pig glycome repertoire.

    PubMed

    Park, Hae-Min; Park, Ju-Hyeong; Kim, Yoon-Woo; Kim, Kyoung-Jin; Jeong, Hee-Jin; Jang, Kyoung-Soon; Kim, Byung-Gee; Kim, Yun-Gon

    2013-11-15

    In recent years, the improvement of mass spectrometry-based glycomics techniques (i.e. highly sensitive, quantitative and high-throughput analytical tools) has enabled us to obtain large datasets of glycans. Here we present a database named the Xeno-glycomics database (XDB) that contains cell- or tissue-specific pig glycomes analyzed with mass spectrometry-based techniques, including comprehensive pig glycan information on chemical structures, mass values, types and relative quantities. It was designed as a user-friendly web-based interface that allows users to query the database according to pig tissue/cell types or glycan masses. This database will contribute qualitative and quantitative information on glycomes characterized from various pig cells/organs in xenotransplantation and might eventually provide new targets in the era of α1,3-galactosyltransferase gene-knockout pigs. The database can be accessed on the web at http://bioinformatics.snu.ac.kr/xdb.

  6. [The Raman Spectroscopy (RS): A new tool for the analytical quality control of injectable in health settings. Comparison of RS technique versus HPLC and UV/Vis-FTIR, applied to anthracyclines as anticancer drugs].

    PubMed

    Bourget, P; Amin, A; Moriceau, A; Cassard, B; Vidal, F; Clement, R

    2012-12-01

    The study compares the performance of three analytical methods devoted to the Analytical Quality Control (AQC) of therapeutic solutions prepared in the care environment, referred to here as Therapeutic Objects (TOs). We explored the pharmacological model of two widely used anthracyclines, i.e. adriamycin and epirubicin, and compared the performance of HPLC against two vibrational spectroscopic techniques: a tandem UV/Vis-FTIR approach on the one hand and Raman Spectroscopy (RS) on the other. The three methods give good results for the key criteria of repeatability, reproducibility and accuracy. Spearman and Kendall correlation tests confirm the noninferiority of the vibrational techniques as an alternative to the reference method (HPLC). The selection of bands for characterization and quantification by RS is the result of a gradual adjustment process that accounts for matrix effects. From the perspective of an AQC associated with the release of TOs, RS displays various advantages: (a) it can decide quickly (~2 min), simultaneously and without intrusion or withdrawal, on both the nature of the packaging and of the solvent, regardless of the compound of interest, which is the founding asset of the method; (b) it can explore any kind of TO both qualitatively and quantitatively; (c) operator safety is guaranteed during production and in the laboratory; (d) the elimination of analytical releases and waste helps protect the environment; (e) no consumables are required; (f) maintenance costs are negligible; and (g) technician training requires only a small budget. These results already show that the RS technology is potentially a strong contributor to the safety of the medication cycle and to the fight against the iatrogenic effects of drugs. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  7. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background: Inflammatory mediators can serve as biomarkers for the monitoring of disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the levels of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results: The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity, down to 17 pg/ml, and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions: The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily with the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  8. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and the additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and by problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both these issues by the combined use of carbon nanotube high-aspect-ratio probes and by quantifying the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  9. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Missing Value Monitoring Enhances the Robustness in Proteomics Quantitation.

    PubMed

    Matafora, Vittoria; Corno, Andrea; Ciliberto, Andrea; Bachi, Angela

    2017-04-07

    In global proteomic analysis, it is estimated that protein abundances span from millions to fewer than 100 copies per cell. The challenge of protein quantitation by classic shotgun proteomic techniques lies in the presence of missing values in peptides belonging to low-abundance proteins, which lowers intra-run reproducibility and affects downstream statistical analysis. Here, we present a new analytical workflow, MvM (missing value monitoring), able to recover quantitation of missing values generated by shotgun analysis. In particular, we used confident data-dependent acquisition (DDA) quantitation only for proteins measured in all the runs, while we filled the missing values with data-independent acquisition analysis using the library previously generated in DDA. We analyzed cell cycle regulated proteins, as they are low-abundance proteins with highly dynamic expression levels. Indeed, we found that cell cycle related proteins are the major components of the missing-values-rich proteome. Using the MvM workflow, we doubled the number of robustly quantified cell cycle related proteins, and we reduced the number of missing values, achieving robust quantitation for proteins over ∼50 molecules per cell. MvM allows lower quantification variance among replicates for low-abundance proteins with respect to DDA analysis, which demonstrates the potential of this novel workflow for measuring low-abundance, dynamically regulated proteins.
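
    The MvM idea of keeping confident DDA values and filling the gaps from DIA can be sketched schematically. The snippet below is an illustrative reading of the abstract, not the authors' code; it assumes proteins-by-runs intensity tables with NaN for missing values and the same protein index in both tables.

        import pandas as pd

        def mvm_merge(dda: pd.DataFrame, dia: pd.DataFrame) -> pd.DataFrame:
            """dda, dia: proteins (rows) x runs (columns) intensity tables."""
            complete = dda.dropna()                   # proteins quantified in all runs
            incomplete = dda.drop(complete.index)     # proteins with missing values
            # Keep DDA values where present; fill the NaN gaps from the DIA table.
            filled = incomplete.combine_first(dia.loc[incomplete.index])
            return pd.concat([complete, filled])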

  11. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi were qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations of the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism was detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.

  12. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
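
    The Monte Carlo strategy mentioned here can be sketched generically: inject noise of known standard deviation into the data, repeat the decomposition and prediction, and take the sensitivity as the ratio of the injected noise level to the spread of the predicted concentrations. A schematic sketch under those assumptions; the `predict_concentration` callable, standing in for a full PARAFAC-plus-calibration step, is assumed rather than provided.

        import numpy as np

        def monte_carlo_sensitivity(data, predict_concentration,
                                    sigma_noise=0.01, n_trials=500, seed=0):
            """Estimate sensitivity as sigma_noise / sd(predicted concentration)."""
            rng = np.random.default_rng(seed)
            estimates = []
            for _ in range(n_trials):
                # Add iid Gaussian noise of known level to the multiway data array.
                noisy = data + rng.normal(0.0, sigma_noise, size=data.shape)
                estimates.append(predict_concentration(noisy))
            return sigma_noise / np.std(estimates, ddof=1)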

  13. Multiplexed MRM-based quantitation of candidate cancer biomarker proteins in undepleted and non-enriched human plasma.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Borchers, Christoph H

    2013-07-01

    An emerging approach for multiplexed targeted proteomics involves bottom-up LC-MRM-MS, with stable isotope-labeled internal standard peptides, to accurately quantitate panels of putative disease biomarkers in biofluids. In this paper, we used this approach to quantitate 27 candidate cancer-biomarker proteins in human plasma that had not been treated by immunoaffinity depletion or enrichment techniques. These proteins have been reported as biomarkers for a variety of human cancers, from laryngeal to ovarian, with breast cancer having the highest correlation. We implemented measures to minimize the analytical variability, improve the quantitative accuracy, and increase the feasibility and applicability of this MRM-based method. We have demonstrated excellent retention time reproducibility (median interday CV: 0.08%) and signal stability (median interday CV: 4.5% for the analytical platform and 6.1% for the bottom-up workflow) for the 27 biomarker proteins (represented by 57 interference-free peptides). The linear dynamic range for the MRM assays spanned four orders of magnitude, with 25 assays covering a 10³-10⁴ range in protein concentration. The lowest-abundance quantifiable protein in our biomarker panel was insulin-like growth factor 1 (calculated concentration: 127 ng/mL). Overall, the analytical performance of this assay demonstrates high robustness and sensitivity, and provides the necessary throughput and multiplexing capabilities required to verify and validate cancer-associated protein biomarker panels in human plasma, prior to clinical use. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
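
    The core quantitation step in stable-isotope-dilution MRM assays of this kind reduces to a peak-area ratio. A minimal sketch, assuming equal MS response for the analyte and its labeled standard (a simplification; real assays apply response factors and calibration curves):

        # Schematic stable-isotope-dilution quantitation: the analyte concentration
        # follows from the analyte/SIS peak-area ratio and the known spiked SIS
        # amount. Equal response of analyte and labeled standard is assumed here.

        def sis_quantify(area_analyte: float, area_sis: float,
                         conc_sis_spiked: float) -> float:
            """Concentration of the endogenous analyte, same units as conc_sis_spiked."""
            return (area_analyte / area_sis) * conc_sis_spiked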

  14. Advances in functional brain imaging technology and developmental neuro-psychology: their applications in the Jungian analytic domain.

    PubMed

    Petchkovsky, Leon

    2017-06-01

    Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imaging techniques, including the Quantitative Electroencephalogram (QEEG) and functional Magnetic Resonance Imaging (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain. © 2017, The Society of Analytical Psychology.

  15. Screening of synthetic PDE-5 inhibitors and their analogues as adulterants: analytical techniques and challenges.

    PubMed

    Patel, Dhavalkumar Narendrabhai; Li, Lin; Kee, Chee-Leong; Ge, Xiaowei; Low, Min-Yong; Koh, Hwee-Ling

    2014-01-01

    The popularity of phosphodiesterase type 5 (PDE-5) enzyme inhibitors for the treatment of erectile dysfunction has led to an increase in the prevalence of illicit sexual performance enhancement products. PDE-5 inhibitors, namely sildenafil, tadalafil and vardenafil, and their unapproved designer analogues are increasingly being used as adulterants in herbal products and health supplements marketed for sexual performance enhancement. To date, more than 50 unapproved analogues of prescription PDE-5 inhibitors have been reported as adulterants in the literature. To avoid detection of such adulteration by standard screening protocols, the perpetrators of such illegal products are investing time and resources to synthesize exotic analogues and devise novel means of adulteration. A comprehensive review of conventional and advanced analytical techniques to detect and characterize the adulterants is presented. The rapid identification and structural elucidation of unknown analogues as adulterants is greatly enhanced by the wide array of analytical techniques employed, including high performance liquid chromatography (HPLC), gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), nuclear magnetic resonance (NMR) spectroscopy, vibrational spectroscopy, liquid chromatography-Fourier transform ion cyclotron resonance-mass spectrometry (LC-FT-ICR-MS), liquid chromatography-hybrid triple quadrupole linear ion trap mass spectrometry with information-dependent acquisition, ultra high performance liquid chromatography-time of flight-mass spectrometry (UHPLC-TOF-MS), ion mobility spectrometry (IMS) and immunoassay methods. The many challenges in detecting and characterizing such adulterants, and the need for a concerted effort to curb adulteration in order to safeguard public safety and interest, are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double Pancake Type HTS Insert for High Field LTS/HTS NMR Magnets

    PubMed Central

    Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2010-01-01

    This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances larger than one order of magnitude of those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to design a DP-assembled HTS insert with an improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595

  17. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  18. Surrogate analyte approach for quantitation of endogenous NAD(+) in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    PubMed

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD+ in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5 N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25 μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. 13C5-NAD+ was used as the surrogate analyte for the authentic analyte, NAD+. The standard curve, ranging from 0.250 to 25.0 μg/mL in acidified human blood for 13C5-NAD+, was fitted to a 1/x² weighted linear regression model. The LC-MS/MS response between surrogate analyte and authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD+ concentration from the 13C5-NAD+ standard curve since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. Average extraction recovery of 13C5-NAD+ was 94.6% across the curve range. The matrix factor was 0.99 for both high and low QC, indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD+ in 29 male and 21 female human subjects. This assay was also used to study the circadian variation of the endogenous level of NAD+ in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
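
    The 1/x² weighted linear calibration described here can be reproduced in a few lines. The numbers below are hypothetical placeholders spanning the stated curve range, not the study's data; note that numpy.polyfit weights the unsquared residuals, so a 1/x² weighting of the squared residuals corresponds to w = 1/x.

        import numpy as np

        # Hypothetical calibration levels (ug/mL) and peak-area ratios.
        x = np.array([0.250, 0.500, 2.50, 10.0, 25.0])
        y = np.array([0.021, 0.043, 0.210, 0.830, 2.05])

        # polyfit squares w*(y - yhat), so w = 1/x gives 1/x^2 weighting.
        slope, intercept = np.polyfit(x, y, deg=1, w=1.0 / x)

        def concentration(peak_area_ratio: float) -> float:
            """Back-calculate concentration from the weighted calibration line."""
            return (peak_area_ratio - intercept) / slope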

  19. Skill Assessment of a Hybrid Technique to Estimate Quantitative Precipitation Forecasts for Galicia (NW Spain)

    NASA Astrophysics Data System (ADS)

    Lage, A.; Taboada, J. J.

    Precipitation is the most obvious of the weather elements in its effects on normal life. Numerical weather prediction (NWP) is generally used to produce quantitative precipitation forecasts (QPF) beyond the 1-3 h time frame. These models often fail to predict small-scale variations of rain because of spin-up problems and their coarse spatial and temporal resolution (Antolik, 2000). Moreover, there are some uncertainties about the behaviour of NWP models in extreme situations (de Bruijn and Brandsma, 2000). Hybrid techniques, combining the benefits of NWP and statistical approaches in a flexible way, are very useful for achieving a good QPF. In this work, a new QPF technique for Galicia (NW Spain) is presented. In this region more than 50% of days in the year are rainy, with rainfall amounts that may cause floods, with human and economic damage. The technique is composed of a NWP model (ARPS) and a statistical downscaling process based on an automated classification scheme of atmospheric circulation patterns for the Iberian Peninsula (J. Ribalaygua and R. Boren, 1995). Results show that QPF for Galicia is improved using this hybrid technique. [1] Antolik, M.S. 2000 "An Overview of the National Weather Service's centralized statistical quantitative precipitation forecasts". Journal of Hydrology, 239, pp. 306-337. [2] de Bruijn, E.I.F. and T. Brandsma "Rainfall prediction for a flooding event in Ireland caused by the remnants of Hurricane Charley". Journal of Hydrology, 239, pp. 148-161. [3] Ribalaygua, J. and Boren R. "Clasificación de patrones espaciales de precipitación diaria sobre la España Peninsular". Informes N 3 y 4 del Servicio de Análisis e Investigación del Clima. Instituto Nacional de Meteorología. Madrid. 53 pp.

  20. The 2D analytic signal for envelope detection and feature extraction on ultrasound images.

    PubMed

    Wachinger, Christian; Klein, Tassilo; Navab, Nassir

    2012-08-01

    The fundamental property of the analytic signal is the split of identity, meaning the separation of qualitative and quantitative information in the form of the local phase and the local amplitude, respectively. The structural representation given by the local phase, independent of brightness and contrast, is especially interesting for numerous image processing tasks. Recently, the extension of the analytic signal from 1D to 2D, covering also intrinsically 2D structures, was proposed. We show the advantages of this improved concept on ultrasound RF and B-mode images. Specifically, we use the 2D analytic signal for the envelope detection of RF data. This leads to advantages for the extraction of the information-bearing signal from the modulated carrier wave. We illustrate this, first, by visual assessment of the images, and second, by performing goodness-of-fit tests to a Nakagami distribution, indicating a clear improvement of statistical properties. The evaluation is performed for multiple window sizes and parameter estimation techniques. Finally, we show that the 2D analytic signal allows for an improved estimation of local features on B-mode images. Copyright © 2012 Elsevier B.V. All rights reserved.
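
    The split of identity is easy to make concrete in the classic 1D case. The sketch below computes the local amplitude (envelope) and local phase of RF scan lines via the 1D analytic signal; the paper's 2D analytic signal goes further by also capturing intrinsically 2D structure, which this simplified illustration does not.

        import numpy as np
        from scipy.signal import hilbert

        def envelope_and_phase(rf_lines: np.ndarray):
            """rf_lines: (n_lines, n_samples) real-valued RF data."""
            analytic = hilbert(rf_lines, axis=1)      # complex analytic signal
            local_amplitude = np.abs(analytic)        # quantitative part: envelope
            local_phase = np.angle(analytic)          # qualitative part: structure
            return local_amplitude, local_phase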

  1. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  2. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  3. Quantitative analysis of time-resolved microwave conductivity data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, Obadiah G.; Moore, David T.; Li, Zhen

    Flash-photolysis time-resolved microwave conductivity (fp-TRMC) is a versatile, highly sensitive technique for studying the complex photoconductivity of solution, solid, and gas-phase samples. The purpose of this paper is to provide a standard reference work for experimentalists interested in using microwave conductivity methods to study functional electronic materials, describing how to conduct and calibrate these experiments in order to obtain quantitative results. The main focus of the paper is on calculating the calibration factor, K, which is used to connect the measured change in microwave power absorption to the conductance of the sample. We describe the standard analytical formulae that have been used in the past, and compare them to numerical simulations. This comparison shows that the most widely used analytical analysis of fp-TRMC data systematically under-estimates the transient conductivity by ~60%. We suggest a more accurate semi-empirical way of calibrating these experiments. However, we emphasize that the full numerical calculation is necessary to quantify both transient and steady-state conductance for arbitrary sample properties and geometry.

  4. Quantitative analysis of time-resolved microwave conductivity data

    DOE PAGES

    Reid, Obadiah G.; Moore, David T.; Li, Zhen; ...

    2017-11-10

    Flash-photolysis time-resolved microwave conductivity (fp-TRMC) is a versatile, highly sensitive technique for studying the complex photoconductivity of solution, solid, and gas-phase samples. The purpose of this paper is to provide a standard reference work for experimentalists interested in using microwave conductivity methods to study functional electronic materials, describing how to conduct and calibrate these experiments in order to obtain quantitative results. The main focus of the paper is on calculating the calibration factor, K, which is used to connect the measured change in microwave power absorption to the conductance of the sample. We describe the standard analytical formulae that have been used in the past, and compare them to numerical simulations. This comparison shows that the most widely used analytical analysis of fp-TRMC data systematically under-estimates the transient conductivity by ~60%. We suggest a more accurate semi-empirical way of calibrating these experiments. However, we emphasize that the full numerical calculation is necessary to quantify both transient and steady-state conductance for arbitrary sample properties and geometry.

  5. Social Data Analytics Using Tensors and Sparse Techniques

    ERIC Educational Resources Information Center

    Zhang, Miao

    2014-01-01

    The development of internet and mobile technologies is driving an earthshaking social media revolution. They bring the internet world a huge amount of social media content, such as images, videos, comments, etc. Those massive media content and complicate social structures require the analytic expertise to transform those flood of information into…

  6. Heat and mass transfer in combustion - Fundamental concepts and analytical techniques

    NASA Technical Reports Server (NTRS)

    Law, C. K.

    1984-01-01

    Fundamental combustion phenomena and the associated flame structures in laminar gaseous flows are discussed on physical bases within the framework of the three nondimensional parameters of interest to heat and mass transfer in chemically-reacting flows, namely the Damkoehler number, the Lewis number, and the Arrhenius number, which is the ratio of the reaction activation energy to the characteristic thermal energy. The model problems selected for illustration are droplet combustion, boundary layer combustion, and the propagation, flammability, and stability of premixed flames. Fundamental concepts discussed include the flame structures for large activation energy reactions, the S-curve interpretation of the ignition and extinction states, reaction-induced local-similarity and non-similarity in boundary layer flows, the origin and removal of the cold boundary difficulty in modeling flame propagation, and effects of flame stretch and preferential diffusion on flame extinction and stability. Analytical techniques introduced include the Shvab-Zeldovich formulation, the local Shvab-Zeldovich formulation, the flame-sheet approximation and the associated jump formulation, and large activation energy matched asymptotic analysis. Potentially promising research areas are suggested.
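
    The three governing parameters named above have conventional definitions (standard forms; the paper's exact conventions may differ in detail):

        \[ \mathrm{Da} = \frac{\tau_{\mathrm{flow}}}{\tau_{\mathrm{chem}}}, \qquad \mathrm{Le} = \frac{\alpha}{D}, \qquad \mathrm{Ar} = \frac{E_a}{R\,T_c} \]

    i.e. the Damkoehler number compares a characteristic flow time to a chemical time, the Lewis number compares thermal diffusivity \(\alpha\) to mass diffusivity \(D\), and the Arrhenius number compares the activation energy \(E_a\) to the characteristic thermal energy \(R\,T_c\).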

  7. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.
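
    The TAD decision itself is simple enough to sketch. The function below is a schematic reading of the approach, with hypothetical names and a conventional k-sigma detection criterion; the paper's exact statistical treatment is not given in the abstract.

        # Schematic TAD logic: spike the sample with the target analyte at roughly
        # the original LOD, then compare the observed signal with the signal
        # expected from the spike alone; an excess suggests endogenous analyte
        # below the conventional detection limit. Names are hypothetical.

        def tad_detects_endogenous(signal_spiked_sample: float,
                                   signal_spike_only: float,
                                   noise_sd: float, k: float = 3.0) -> bool:
            """True if the spiked sample exceeds the spike-only signal by > k*sigma."""
            return (signal_spiked_sample - signal_spike_only) > k * noise_sd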

  8. Understanding changes over time in workers' compensation claim rates using time series analytical techniques.

    PubMed

    Moore, Ian C; Tompa, Emile

    2011-11-01

    The objective of this study is to better understand the inter-temporal variation in workers' compensation claim rates using time series analytical techniques not commonly used in the occupational health and safety literature. We focus specifically on the role of unemployment rates in explaining claim rate variations. The major components of workers' compensation claim rates are decomposed using data from a Canadian workers' compensation authority for the period 1991-2007. Several techniques are used to undertake the decomposition and assess key factors driving rates: (i) the multitaper spectral estimator, (ii) the harmonic F test, (iii) the Kalman smoother and (iv) ordinary least squares. The largest component of the periodic behaviour in workers' compensation claim rates is seasonal variation. Business cycle fluctuations in workers' compensation claim rates move inversely to unemployment rates. The analysis suggests that workers' compensation claim rates between 1991 and 2008 were driven by (in order of magnitude) a strong negative long term growth trend, periodic seasonal trends and business cycle fluctuations proxied by the Ontario unemployment rate.
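
    As an illustration of the kind of decomposition involved, the sketch below splits a monthly claim-rate series into trend, seasonal and residual components and correlates the residual with unemployment. It uses statsmodels' classical seasonal decomposition rather than the paper's specific estimators (multitaper, harmonic F test, Kalman smoother), and the series are synthetic stand-ins, not the compensation authority's data.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.seasonal import seasonal_decompose

        months = pd.date_range("1991-01", "2007-12", freq="MS")
        rng = np.random.default_rng(1)
        trend = np.linspace(3.0, 1.5, len(months))               # long-term decline
        seasonal = 0.2 * np.sin(2 * np.pi * months.month / 12)   # seasonal swing
        claims = pd.Series(trend + seasonal + rng.normal(0, 0.05, len(months)),
                           index=months)
        unemployment = pd.Series(8 + rng.normal(0, 0.5, len(months)), index=months)

        parts = seasonal_decompose(claims, model="additive", period=12)
        cyclical = parts.resid.dropna()

        # An inverse business-cycle relationship would appear as a negative correlation.
        print(cyclical.corr(unemployment.loc[cyclical.index]))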

  9. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585

  10. Gold Nanoparticles as a Direct and Rapid Sensor for Sensitive Analytical Detection of Biogenic Amines

    NASA Astrophysics Data System (ADS)

    El-Nour, K. M. A.; Salam, E. T. A.; Soliman, H. M.; Orabi, A. S.

    2017-03-01

    A new optical sensor was developed for rapid screening, with high sensitivity, for the existence of biogenic amines (BAs) in poultry meat samples. Gold nanoparticles (GNPs) with particle sizes of 11-19 nm function as a fast and sensitive biosensor for the detection of histamine resulting from bacterial decarboxylation of histidine, a spoilage marker for stored poultry meat. Upon reaction with histamine, the red color of the GNPs turns deep blue. The appearance of the blue color coincides with BA concentrations that can induce symptoms of poisoning. This biosensor enables semi-quantitative detection of the analyte in real samples by the naked eye. Quality evaluation is carried out by measuring histamine and histidine using different analytical techniques such as UV-vis, FTIR, and fluorescence spectroscopy as well as TEM. A rapid quantitative readout of samples by UV-vis and fluorescence methods with standard instrumentation was proposed, requiring far less time than chromatographic and electrophoretic methods. A sensitivity of 6.59 × 10⁻⁴ and a limit of detection (LOD) of 0.6 μM were determined for histamine as a spoilage marker, with a correlation coefficient (R²) of 0.993.
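
    For reference, the limit of detection in sensor work of this kind is conventionally estimated from the blank noise and the calibration slope (a standard convention; whether this exact formula was used here is not stated in the abstract):

        \[ \mathrm{LOD} = \frac{3.3\,\sigma}{S} \]

    where \(\sigma\) is the standard deviation of the blank (or of the regression residuals) and \(S\) is the calibration sensitivity (slope).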

  11. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single-cell or even subcellular level, which has allowed insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications, such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  13. Quantitative secondary electron imaging for work function extraction at atomic level and layer identification of graphene

    PubMed Central

    Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou

    2016-01-01

    Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with both high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique which allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. Measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907

  14. Quantitative trace analysis of complex mixtures using SABRE hyperpolarization.

    PubMed

    Eshuis, Nan; van Weerdenburg, Bram J A; Feiters, Martin C; Rutjes, Floris P J T; Wijmenga, Sybren S; Tessari, Marco

    2015-01-26

    Signal amplification by reversible exchange (SABRE) is an emerging nuclear spin hyperpolarization technique that strongly enhances NMR signals of small molecules in solution. However, such signal enhancements have never been exploited for concentration determination, as the efficiency of SABRE can strongly vary between different substrates or even between nuclear spins in the same molecule. The first application of SABRE for the quantitative analysis of a complex mixture is now reported. Despite the inherent complexity of the system under investigation, which involves thousands of competing binding equilibria, analytes at concentrations in the low micromolar range could be quantified from single-scan SABRE spectra using a standard-addition approach. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
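
    The standard-addition readout used here has a compact closed form: fitting signal versus added concentration to a line and extrapolating to zero signal gives the unknown as the magnitude of the x-intercept. A minimal sketch with hypothetical numbers (not the paper's data):

        import numpy as np

        added = np.array([0.0, 5.0, 10.0, 20.0])       # uM of analyte added
        signal = np.array([1.8, 3.1, 4.3, 6.9])        # NMR signal (arbitrary units)

        # Linear fit: signal = slope * (C0 + added), so C0 = intercept / slope.
        slope, intercept = np.polyfit(added, signal, 1)
        c_unknown = intercept / slope                  # x-intercept magnitude, in uM
        print(f"estimated analyte concentration: {c_unknown:.1f} uM")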

  15. Quantitative mass spectrometry imaging of emtricitabine in cervical tissue model using infrared matrix-assisted laser desorption electrospray ionization

    PubMed Central

    Bokhart, Mark T.; Rosen, Elias; Thompson, Corbin; Sykes, Craig; Kashuba, Angela D. M.; Muddiman, David C.

    2015-01-01

    A quantitative mass spectrometry imaging (QMSI) technique using infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) is demonstrated for the antiretroviral (ARV) drug emtricitabine in incubated human cervical tissue. Method development of the QMSI technique leads to a gain in sensitivity and removal of interferences for several ARV drugs. Analyte response was significantly improved by a detailed evaluation of several cationization agents. Increased sensitivity and removal of an isobaric interference were demonstrated with sodium chloride in the electrospray solvent. Voxel-to-voxel variability was improved for the MSI experiments by normalizing analyte abundance to a uniformly applied compound with similar characteristics to the drug of interest. Finally, emtricitabine was quantified in tissue with a calibration curve generated from the stable isotope-labeled analog of emtricitabine, followed by cross-validation using liquid chromatography tandem mass spectrometry (LC-MS/MS). The quantitative IR-MALDESI analysis proved to be reproducible, with an emtricitabine concentration of 17.2 ± 1.8 μg/g tissue. This amount corresponds to the detection of 7 fmol/voxel in the IR-MALDESI QMSI experiment. Adjacent tissue slices were analyzed using LC-MS/MS, which resulted in an emtricitabine concentration of 28.4 ± 2.8 μg/g tissue. PMID:25318460

  16. Concurrence of big data analytics and healthcare: A systematic review.

    PubMed

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as from external sources including government, laboratories, pharma companies, data aggregators, medical journals etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  17. Multi-technique quantitative analysis and socioeconomic considerations of lead, cadmium, and arsenic in children's toys and toy jewelry.

    PubMed

    Hillyer, Margot M; Finch, Lauren E; Cerel, Alisha S; Dattelbaum, Jonathan D; Leopold, Michael C

    2014-08-01

    A wide spectrum and large number of children's toys and toy jewelry items were purchased from both bargain and retail vendors and analyzed for arsenic, cadmium, and lead content using multiple analytical techniques, including flame and furnace atomic absorption spectroscopy as well as X-ray fluorescence spectroscopy. Because these metals are particularly dangerous for young children, metal concentrations in toys/toy jewelry were assessed for compliance with current Consumer Product Safety Commission (CPSC) regulations (F963-11). A conservative metric involving multiple analytical techniques was used to categorize compliance: confirmation by one technique of metal in excess of CPSC limits indicated a "suspect" item, while confirmation by two different techniques warranted a non-compliant designation. Sample matrix-based standard addition provided additional confirmation of non-compliant and suspect products. Results suggest that origin of purchase, rather than cost, is a significant factor in the risk assessment of these materials, with 57% of toys/toy jewelry items from bargain stores non-compliant or suspect compared to only 15% from retail outlets and 13% if only low-cost items from the retail stores are compared. While jewelry was found to be the most problematic product (73% of non-compliant/suspect samples), lead (45%) and arsenic (76%) were the most dominant toxins found in non-compliant/suspect samples. Using the greater Richmond area as a model, the discrepancy between bargain and retail children's products, along with growing numbers of bargain stores in low-income and urban areas, exemplifies an emerging socioeconomic public health issue. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The results obtained suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.

  19. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    NASA Astrophysics Data System (ADS)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations
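
    The two-compartment (blood/tissue) model referred to here is conventionally written as a single linear ODE (a standard form; the dissertation's exact parameterization may differ):

        \[ \frac{dC_t(t)}{dt} = K_1\,C_a(t) - k_2\,C_t(t) \]

    where \(C_a\) is the arterial input concentration, \(C_t\) the tissue concentration, \(K_1\) the uptake rate constant (approaching blood flow for a freely diffusible tracer such as (18)F-fluoromethane), and \(k_2\) the clearance constant; the tissue:blood partition coefficient is \(\lambda = K_1/k_2\).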

  20. An Overview of Learning Analytics

    ERIC Educational Resources Information Center

    Clow, Doug

    2013-01-01

    Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical…

  1. Optimisation of techniques for quantification of Botrytis cinerea in grape berries and receptacles by quantitative polymerase chain reaction

    USDA-ARS?s Scientific Manuscript database

    Quantitative PCR (qPCR) can be used to detect and monitor pathogen colonization, but early attempts to apply the technology to Botrytis cinerea infection of grape berries have identified limitations to current techniques. In this study, four DNA extraction methods, two grinding methods, two grape or...

  2. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which petroleum is derived, are composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Due to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. SFC-MS/MS as an orthogonal technique for improved screening of polar analytes in anti-doping control.

    PubMed

    Parr, Maria Kristina; Wuest, Bernhard; Naegele, Edgar; Joseph, Jan F; Wenzel, Maxi; Schmidt, Alexander H; Stanic, Mijo; de la Torre, Xavier; Botrè, Francesco

    2016-09-01

    HPLC is considered the method of choice for the separation of various classes of drugs. However, some analytes are still challenging, as HPLC shows limited resolution capabilities for highly polar analytes, which interact insufficiently with conventional reversed-phase (RP) columns. Especially in combination with mass spectrometric detection, there are limits to how far the stationary phase can be altered. Some highly polar sympathomimetic drugs and their metabolites showed almost no retention on different RP columns. Their retention remains poor even on phenylhexyl phases, which show different selectivity due to π-π interactions. Supercritical fluid chromatography (SFC), as a separation technique orthogonal to HPLC, may help to overcome these issues. Selected polar drugs and metabolites were analyzed utilizing SFC separation. All compounds showed sharp peaks and good retention, even the very polar analytes such as sulfoconjugates. Retention times and elution orders in SFC differ from both RP and HILIC separations as a result of this orthogonality. Short cycle times could be realized. As temperature and pressure strongly influence the polarity of supercritical fluids, precise regulation of temperature and backpressure is required for the stability of the retention times. As CO2 is the main constituent of the mobile phase in SFC, solvent consumption and solvent waste are considerably reduced. Graphical Abstract SFC-MS/MS vs. LC-MS/MS.

  4. Detection and quantitation of benzo(a)pyrene-DNA adducts in brain and liver tissues of Beluga whales (Delphinapterus leucas) from the St. Lawrence and Mackenzie Estuaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shugart, L.R.

    1988-01-01

    It should be noted that there are few analytical techniques available for the detection and quantitation of chemical adducts in the DNA of living organisms. The reasons for this are: the analytical technique often has to accommodate the unique chemical and/or physical properties of the individual chemical or its metabolite; the percentage of the total chemical that becomes bound to DNA is usually small, since most of the parent compound is detoxified and excreted; not all adducts that form between the genotoxic agent and DNA are stable or are involved in the development of subsequent deleterious events in the organism; and the amount of DNA available for analysis is often quite limited. 16 refs., 1 tab.

  5. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information calls for transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, combining complementary information on different metrics, ideally achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  6. Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1982-01-01

    Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…

  7. Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H

    2014-05-01

    Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and to validate the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge. © 2013.
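
    At its core, SIS-based quantitation is a ratio calculation: the endogenous ("light") peptide peak area is compared with that of the spiked stable isotope-labeled ("heavy") standard of known concentration. The sketch below shows only that arithmetic; the peptide sequences, peak areas and SIS amounts are hypothetical, not values from the paper.

      def quantify(light_area: float, heavy_area: float, sis_fmol: float) -> float:
          """Endogenous peptide amount (fmol) from the light/heavy area ratio."""
          return (light_area / heavy_area) * sis_fmol

      # hypothetical transitions: peptide -> (light area, heavy area, SIS fmol)
      transitions = {
          "LSEPAELTDAVK": (8.4e5, 4.2e5, 100.0),
          "AGLQAFFQVQECNK": (1.1e5, 2.2e5, 50.0),
      }
      for peptide, (light, heavy, sis) in transitions.items():
          print(f"{peptide}: {quantify(light, heavy, sis):.1f} fmol")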

  8. Liquid chromatography-mass spectrometry-based quantitative proteomics.

    PubMed

    Linscheid, Michael W; Ahrends, Robert; Pieper, Stefan; Kühn, Andreas

    2009-01-01

    During the last decades, molecular sciences revolutionized biomedical research and gave rise to the biotechnology industry. During the next decades, the application of the quantitative sciences--informatics, physics, chemistry, and engineering--to biomedical research will bring about the next revolution, one that will improve human healthcare and certainly create new technologies, since there is no doubt that small changes can have great effects. It is not a question of "yes" or "no," but of "how much," to make best use of the medical options we will have. In this context, the development of accurate analytical methods must be considered a cornerstone, since the understanding of biological processes will be impossible without information about the minute changes induced in cells by interactions of cell constituents with all sorts of endogenous and exogenous influences and disturbances. The first quantitative techniques to be developed allowed monitoring of relative changes only, but they clearly showed the significance of the information obtained. The recent advent of techniques claiming to quantify proteins and peptides not only relative to each other, but also in an absolute fashion, promised another quantum leap, since knowing absolute amounts will allow comparison even across unrelated species, and the definition of parameters will permit modeling of biological systems much more accurately than before. To bring these promises to life, several approaches are under development at this point in time, and this review is focused on those developments.

  9. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  10. Development of quantitative laser ionization mass spectrometry (LIMS). Final report, 1 Aug 87-1 Jan 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odom, R.W.

    1991-06-04

    The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 Å) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.

  11. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the composition of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions under which calibration curves are applicable to quantification of the composition of solid samples, and their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, certain requirements must be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract composition-related information from the full spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and for planetary exploration. Artificial neural networks (ANNs), with which non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that accuracy be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
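
    Since the abstract recommends the NRMSE as a common figure of merit, a small sketch may help fix the definition. One common convention, assumed here, normalises the RMSE by the range of the reference values; the certified and predicted compositions below are hypothetical.

      import numpy as np

      def nrmse(reference: np.ndarray, predicted: np.ndarray) -> float:
          """RMSE normalised by the range of the reference values."""
          rmse = np.sqrt(np.mean((predicted - reference) ** 2))
          return rmse / (reference.max() - reference.min())

      ref = np.array([1.2, 3.4, 5.0, 7.8, 9.1])    # certified compositions, wt%
      pred = np.array([1.0, 3.6, 4.8, 8.1, 9.0])   # LIBS estimates (hypothetical)
      print(f"NRMSE = {nrmse(ref, pred):.3f}")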

  12. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  13. Constraint-Referenced Analytics of Algebra Learning

    ERIC Educational Resources Information Center

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  14. Functional-analytical capabilities of GIS technology in the study of water use risks

    NASA Astrophysics Data System (ADS)

    Nevidimova, O. G.; Yankovich, E. P.; Yankovich, K. S.

    2015-02-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climatic and hydrological situation in Tomsk Oblast, with respect to water use risks, was carried out. Based on techniques developed by the authors, an informational and analytical database was created using the ArcGIS software platform, which combines statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.

  15. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  16. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": what would be considered "in plain view" when analyzing a hard drive? The purpose of this paper is to discuss the issues of digital forensics and the related issues as they apply to visual analytics; to identify how visual analytics techniques fit into the digital forensics analysis process and how they can improve the legal admissibility of digital data; and to identify what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  17. [Quantitative determination of glass content in monazite glass-ceramics by IR technique].

    PubMed

    He, Yong; Zhang, Bao-min

    2003-04-01

    Monazite glass-ceramics consist of both monazite and metaphosphate glass phases. The absorption bands of the two phases do not overlap each other, and the absorption intensities of the bands at 1275 and 616 cm-1 vary with the glass content. The correlation coefficient between the logarithmic absorbance ratio of the two bands and the glass content was r = 0.9975, and the regression equation was y = 48.356 + 25.93x. The absorbance ratio of the bands at 952 and 616 cm-1 also varied with different ratios of Ce2O3/La2O3 in synthetic monazites, with r = 0.9917 and a regression equation y = 0.2211 exp (0.0221x). The high correlation coefficients show that the IR technique could find new application in the quantitative analysis of glass content in phosphate glass-ceramics.
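
    As a rough illustration of how the reported calibration would be applied, the sketch below computes glass content from the two band absorbances. Reading y as glass content (%) and x as the logarithmic absorbance ratio of the 1275 and 616 cm-1 bands is our interpretation of the abstract, and the absorbance values are hypothetical.

      import math

      def glass_content(a_1275: float, a_616: float) -> float:
          """Glass content (%) from the reported band-ratio calibration."""
          x = math.log10(a_1275 / a_616)        # logarithmic absorbance ratio
          return 48.356 + 25.93 * x

      print(f"{glass_content(0.82, 0.51):.1f} % glass")   # hypothetical absorbances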

  18. Quantitative coronary angiography using image recovery techniques for background estimation in unsubtracted images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Jerry T.; Kamyar, Farzad; Molloi, Sabee

    2007-10-15

    Densitometry measurements have been performed previously using subtracted images. However, digital subtraction angiography (DSA) in coronary angiography is highly susceptible to misregistration artifacts due to the temporal separation of background and target images. Misregistration artifacts due to respiration and patient motion occur frequently, and organ motion is unavoidable. Quantitative densitometric techniques would be more clinically feasible if they could be implemented using unsubtracted images. The goal of this study is to evaluate image recovery techniques for densitometry measurements using unsubtracted images. A humanoid phantom and eight swine (25-35 kg) were used to evaluate the accuracy and precision of the following image recovery techniques: local averaging (LA), morphological filtering (MF), linear interpolation (LI), and curvature-driven diffusion image inpainting (CDD). Images of iodinated vessel phantoms placed over the heart of the humanoid phantom or swine were acquired. In addition, coronary angiograms were obtained after power injections of a nonionic iodinated contrast solution in an in vivo swine study. Background signals were estimated and removed with LA, MF, LI, and CDD. Iodine masses in the vessel phantoms were quantified and compared to known amounts. Moreover, the total iodine in left anterior descending arteries was measured and compared with DSA measurements. In the humanoid phantom study, the average root mean square errors associated with quantifying iodine mass using LA and MF were approximately 6% and 9%, respectively. The corresponding average root mean square errors associated with quantifying iodine mass using LI and CDD were both approximately 3%. In the in vivo swine study, the root mean square errors associated with quantifying iodine in the vessel phantoms with LA and MF were approximately 5% and 12%, respectively. The corresponding average root mean square errors using LI and CDD were both 3%. The standard
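
    Of the four techniques, linear interpolation (LI) is the easiest to sketch: the background under the vessel is estimated by interpolating across it from the two banks and then subtracted, leaving the iodine signal. The one-dimensional profile below is a hypothetical stand-in for one image row, not data from the study.

      import numpy as np

      profile = np.array([100, 101, 99, 140, 155, 150, 138, 100, 102, 101], float)
      vessel = slice(3, 7)                     # columns covered by the vessel

      left, right = profile[vessel.start - 1], profile[vessel.stop]
      background = profile.copy()
      background[vessel] = np.linspace(left, right,
                                       vessel.stop - vessel.start + 2)[1:-1]

      iodine_signal = profile - background     # integrate for relative iodine mass
      print(f"integrated signal = {iodine_signal.sum():.1f}")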

  1. Assessment of analytical techniques for predicting solid propellant exhaust plumes

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.

    1977-01-01

    The calculation of solid propellant exhaust plume flow fields is addressed. Two major areas covered are: (1) the applicability of empirical data currently available to define particle drag coefficients, heat transfer coefficients, mean particle size and particle size distributions, and (2) thermochemical modeling of the gaseous phase of the flow field. Comparisons of experimentally measured and analytically predicted data are made. The experimental data were obtained for subscale solid propellant motors with aluminum loadings of 2, 10 and 15%. Analytical predictions were made using a fully coupled two-phase numerical solution. Data comparisons will be presented for radial distributions at plume axial stations of 5, 12, 16 and 20 diameters.

  2. Development of quantitative screen for 1550 chemicals with GC-MS.

    PubMed

    Bergmann, Alan J; Points, Gary L; Scott, Richard P; Wilson, Glenn; Anderson, Kim A

    2018-05-01

    With hundreds of thousands of chemicals in the environment, effective monitoring requires high-throughput analytical techniques. This paper presents a quantitative screening method for 1550 chemicals based on statistical modeling of responses, with identification and integration performed using deconvolution reporting software. The method was evaluated with representative environmental samples. We tested biological extracts, low-density polyethylene, and silicone passive sampling devices spiked with known concentrations of 196 representative chemicals. A multiple linear regression (R(2) = 0.80) was developed with molecular weight, logP, polar surface area, and fractional ion abundance to predict chemical responses within a factor of 2.5. Linearity beyond the calibration had R(2) > 0.97 for three orders of magnitude. Median limits of quantitation were estimated to be 201 pg/μL (1.9× standard deviation). The number of detected chemicals and the accuracy of quantitation were similar for environmental samples and standard solutions. To our knowledge, this is the most precise method for the largest number of semi-volatile organic chemicals lacking authentic standards. Accessible instrumentation and software make this method cost effective in quantifying a large, customizable list of chemicals. When paired with silicone wristband passive samplers, this quantitative screen will be very useful for epidemiology, where binning of concentrations is common. Graphical abstract A multiple linear regression of chemical responses measured with GC-MS allowed quantitation of 1550 chemicals in samples such as silicone wristbands.
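
    The response model lends itself to a compact sketch: an ordinary least-squares fit of log responses on the four descriptors named in the abstract. The descriptor values and responses below are hypothetical and do not reproduce the paper's model.

      import numpy as np

      # columns: molecular weight, logP, polar surface area, fractional ion abundance
      X = np.array([[178.2, 4.5,  0.0, 0.45],
                    [228.3, 5.2,  0.0, 0.30],
                    [202.0, 3.1, 26.3, 0.60],
                    [252.3, 6.0,  0.0, 0.25],
                    [184.2, 2.8, 40.5, 0.55],
                    [156.2, 2.1, 20.2, 0.70],
                    [278.3, 6.9,  0.0, 0.20]])
      y = np.log10([3.1e5, 1.8e5, 4.4e5, 9.0e4, 5.2e5, 7.1e5, 5.0e4])  # responses

      A = np.column_stack([np.ones(len(X)), X])     # design matrix with intercept
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
      fold_error = 10 ** np.abs(A @ coef - y)       # per-compound fold deviation
      print("max fold error:", fold_error.max().round(2))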

  3. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  4. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges, as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  5. Quantitative analyses of bifunctional molecules.

    PubMed

    Braun, Patrick D; Wandless, Thomas J

    2004-05-11

    Small molecules can be discovered or engineered to bind tightly to biologically relevant proteins, and these molecules have proven to be powerful tools for both basic research and therapeutic applications. In many cases, detailed biophysical analyses of the intermolecular binding events are essential for improving the activity of the small molecules. These interactions can often be characterized as straightforward bimolecular binding events, and a variety of experimental and analytical techniques have been developed and refined to facilitate these analyses. Several investigators have recently synthesized heterodimeric molecules that are designed to bind simultaneously with two different proteins to form ternary complexes. These heterodimeric molecules often display compelling biological activity; however, they are difficult to characterize. The bimolecular interaction between one protein and the heterodimeric ligand (primary dissociation constant) can be determined by a number of methods. However, the interaction between that protein-ligand complex and the second protein (secondary dissociation constant) is more difficult to measure due to the noncovalent nature of the original protein-ligand complex. Consequently, these heterodimeric compounds are often characterized in terms of their activity, which is an experimentally dependent metric. We have developed a general quantitative mathematical model that can be used to measure both the primary (protein + ligand) and secondary (protein-ligand + protein) dissociation constants for heterodimeric small molecules. These values are largely independent of the experimental technique used and furthermore provide a direct measure of the thermodynamic stability of the ternary complexes that are formed. Fluorescence polarization and this model were used to characterize the heterodimeric molecule, SLFpYEEI, which binds to both FKBP12 and the Fyn SH2 domain, demonstrating that the model is useful for both predictive as well as ex

  6. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
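
    The arithmetic behind a total acid number is simple enough to sketch. Assuming the usual ASTM D664-style definition (mg KOH per g of sample, from the titrant volume at the endpoint), a minimal version looks like this; the aqueous-extraction step that distinguishes AMTAN is not modelled, and all numbers are hypothetical.

      MW_KOH = 56.1   # g/mol, so results come out in the customary mg KOH / g

      def acid_number(v_endpoint_ml: float, v_blank_ml: float,
                      koh_molarity: float, sample_mass_g: float) -> float:
          """Total acid number from an endpoint titration with KOH."""
          return (v_endpoint_ml - v_blank_ml) * koh_molarity * MW_KOH / sample_mass_g

      # hypothetical bio-oil aliquot: 0.25 g sample, 0.1 M KOH, 4.8 mL endpoint
      print(f"TAN = {acid_number(4.8, 0.1, 0.1, 0.25):.1f} mg KOH/g")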

  7. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  8. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  9. Combined quantitative and qualitative two-channel optical biopsy technique for discrimination of tumor borders

    NASA Astrophysics Data System (ADS)

    Bocher, Thomas; Beuthan, Juergen; Scheller, M.; Hopf, Juergen U. G.; Linnarz, Marietta; Naber, Rolf-Dieter; Minet, Olaf; Becker, Wolfgang; Mueller, Gerhard J.

    1995-12-01

    Conventional laser-induced fluorescence spectroscopy (LIFS) of endogenous chromophores like NADH (Nicotinamide Adenine Dinucleotide, reduced form) and PP IX (Protoporphyrin IX) provides information about the relative amounts of these metabolites in the observed cells. But for diagnostic applications, the concentrations of these chromophores have to be determined quantitatively to establish tissue-independent differentiation criteria. It is well-known that the individually and locally varying optical tissue parameters are major obstacles to the determination of the true chromophore concentrations by simple fluorescence spectroscopy. To overcome these problems, a fiber-based, two-channel technique including a rescaled NADH channel (delivering quantitative values) and a relative PP IX channel was developed. Using the accumulated information of both channels can provide good tissue state separation. Ex-vivo studies with resected and frozen (with LN2) samples of squamous cells in the histologically confirmed states normal, tumor border, inflammation and hyperplasia were performed. Each state was represented in this series by at least 7 samples. At the identical tissue spot, both the rescaled NADH fluorescence and the relative PP IX fluorescence were determined. In the first case a nitrogen laser (337 nm, 500 ps, 200 microjoule, 10 Hz) and in the latter case a diode laser (633 nm, 15 mW, cw) were used as the excitation source. In this ex-vivo study a good separation between the different tissue states was achieved. With a device constructed for clinical use, one quantitative in-vivo NADH measurement was recently performed, showing similar separation capabilities.

  10. Comparative study of inorganic elements determined in whole blood from Dmd(mdx)/J mice strain by EDXRF and NAA analytical techniques.

    PubMed

    Redígolo, M M; Sato, I M; Metairon, S; Zamboni, C B

    2016-04-01

    Several diseases can be diagnosed by observing the variation of specific element concentrations in body fluids. In this study, the concentrations of inorganic elements in blood samples of dystrophic (Dmd(mdx)/J) and C57BL/6J (control group) mice strains were determined. The results obtained from Energy Dispersive X-ray Fluorescence (EDXRF) were compared with those from the Neutron Activation Analysis (NAA) technique. Both analytical techniques proved to be appropriate and complementary, offering a new contribution to veterinary medicine as well as detailed knowledge of this pathology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-10-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots according to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red pepper. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used, considering their potential applications, for the screening analysis of irradiated foods.
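
    The PSL screening thresholds quoted above translate directly into a decision rule, sketched below; the sample photon counts are hypothetical.

      def psl_screen(photon_counts: int) -> str:
          """Classify a sample by the PSL thresholds given in the abstract."""
          if photon_counts > 5000:
              return "positive (irradiated)"
          if photon_counts < 700:
              return "negative (non-irradiated)"
          return "intermediate - confirm (e.g. by thermoluminescence)"

      for sample, counts in [("cinnamon", 26000), ("black pepper", 2300),
                             ("paprika", 410)]:
          print(f"{sample}: {psl_screen(counts)}")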

  12. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheter sterility in high-risk patients with long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. In order to ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare the results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115 cm(-1) and 208-224 nm), vancomycin (1222-1240 cm(-1) and 276-280 nm), and teicoplanin (1226-1230 cm(-1) and 278-282 nm). Gentamicin was quantified with FTIR only (1045-1169 cm(-1) and 2715-2850 cm(-1)) due to interference in the UV domain from parabens, preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R(2) = 0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method appeared tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for antibiotic quantitation in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
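
    Two of the validation figures reported, calibration linearity and intraday repeatability, reduce to short calculations, sketched below with hypothetical standards and replicate values.

      import numpy as np

      conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])           # standards, mg/mL
      resp = np.array([0.052, 0.101, 0.205, 0.398, 0.810])     # absorbance units
      slope, intercept = np.polyfit(conc, resp, 1)
      r2 = np.corrcoef(conc, resp)[0, 1] ** 2
      print(f"calibration: R(2) = {r2:.4f}")

      replicates = np.array([19.4, 20.1, 18.8, 19.9, 20.6])    # same-day assays
      rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()
      print(f"intraday RSD = {rsd:.1f}%")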

  13. Quantitative determination of BAF312, a S1P-R modulator, in human urine by LC-MS/MS: prevention and recovery of lost analyte due to container surface adsorption.

    PubMed

    Li, Wenkui; Luo, Suyi; Smith, Harold T; Tse, Francis L S

    2010-02-15

    Analyte loss due to non-specific binding, especially container surface adsorption, is not uncommon in the quantitative analysis of urine samples. In developing a sensitive LC-MS/MS method for the determination of a drug candidate, BAF312, in human urine, a simple procedure was outlined for identification, confirmation and prevention of analyte non-specific binding to a container surface and to recover the 'non-specific loss' of an analyte, if no transfer has occurred to the original urine samples. Non-specific binding or container surface adsorption can be quickly identified by using freshly spiked urine calibration standards and pre-pooled QC samples during a LC-MS/MS feasibility run. The resulting low recovery of an analyte in urine samples can be prevented through the use of additives, such as the non-ionic surfactant Tween-80, CHAPS and others, to the container prior to urine sample collection. If the urine samples have not been transferred from the bulk container, the 'non-specific binding' of an analyte to the container surface can be reversed by the addition of a specified amount of CHAPS, Tween-80 or bovine serum albumin, followed by appropriate mixing. Among the above agents, Tween-80 is the most cost-effective. beta-cyclodextrin may be suitable in stabilizing the analyte of interest in urine via pre-treating the matrix with the agent. However, post-addition of beta-cyclodextrin to untreated urine samples does not recover the 'lost' analyte due to non-specific binding or container surface adsorption. In the case of BAF312, a dynamic range of 0.0200-20.0 ng/ml in human urine was validated with an overall accuracy and precision for QC sample results ranging from -3.2 to 5.1% (bias) and 3.9 to 10.2% (CV), respectively. Pre- and post-addition of 0.5% (v/v) Tween-80 to the container provided excellent overall analyte recovery and minimal MS signal suppression when a liquid-liquid extraction in combination with an isocratic LC separation was employed. The

  14. Integrated analytical techniques with high sensitivity for studying brain translocation and potential impairment induced by intranasally instilled copper nanoparticles.

    PubMed

    Bai, Ru; Zhang, Lili; Liu, Ying; Li, Bai; Wang, Liming; Wang, Peng; Autrup, Herman; Beer, Christiane; Chen, Chunying

    2014-04-07

    The health impacts of inhalation exposure to engineered nanomaterials have attracted increasing attention. In this paper, integrated analytical techniques with high sensitivity were used to study the brain translocation and potential impairment induced by intranasally instilled copper nanoparticles (CuNPs). Mice were exposed to CuNPs at three doses (1, 10, 40 mg/kg bw). The body weight of mice decreased significantly in the 10 and 40 mg/kg groups (p<0.05) but recovered slightly over the exposure period. Inductively coupled plasma mass spectrometry (ICP-MS) analysis showed that CuNPs could enter the brain. Altered distribution of some important metal elements was observed by synchrotron radiation X-ray fluorescence (SRXRF). H&E staining and immunohistochemical analysis showed that CuNPs produced damage to nerve cells, and astrocytes might be one of the potential targets of CuNPs. The changes in neurotransmitter levels in different brain regions demonstrate that dysfunction occurred in the exposed groups. These data indicate that CuNPs can enter the brain after nasal inhalation and induce damage to the central nervous system (CNS). Integration of effective analytical techniques for systematic investigations is a promising direction to better understand the biological activities of nanomaterials. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Quantitative Confocal Microscopy Analysis as a Basis for Search and Study of Potassium Kv1.x Channel Blockers

    NASA Astrophysics Data System (ADS)

    Feofanov, Alexey V.; Kudryashova, Kseniya S.; Nekrasova, Oksana V.; Vassilevski, Alexander A.; Kuzmenkov, Alexey I.; Korolkova, Yuliya V.; Grishin, Eugene V.; Kirpichnikov, Mikhail P.

    Artificial KcsA-Kv1.x (x = 1, 3) receptors were recently designed by transferring the ligand-binding site from human Kv1.x voltage-gated potassium channels into the corresponding domain of the bacterial KcsA channel. We found that KcsA-Kv1.x receptors expressed in E. coli cells are embedded in the cell membrane and bind ligands when the cells are converted to spheroplasts. We proposed that E. coli spheroplasts with membrane-embedded KcsA-Kv1.x and the fluorescently labeled ligand agitoxin-2 (R-AgTx2) can be used as elements of an advanced analytical system for the search and study of Kv1-channel blockers. To realize this idea, special procedures were developed for the measurement and quantitative treatment of fluorescence signals obtained from the spheroplast membrane using confocal laser scanning microscopy (CLSM). The resulting "mix and read" analytical systems, supported by quantitative CLSM analysis, were demonstrated to be a reliable alternative to radioligand and electrophysiology techniques in the search for and study of selective Kv1.x channel blockers of high scientific and medical importance.

  16. Speckle noise reduction in quantitative optical metrology techniques by application of the discrete wavelet transformation

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Effective suppression of the speckle noise content in interferometric data images can help improve the accuracy and resolution of the results obtained with interferometric optical metrology techniques. In this paper, novel speckle noise reduction algorithms based on the discrete wavelet transformation are presented. The algorithms proceed by: (a) estimating the noise level contained in the interferograms of interest, (b) selecting wavelet families, (c) applying the wavelet transformation using the selected families, (d) wavelet thresholding, and (e) applying the inverse wavelet transformation, producing denoised interferograms. The algorithms are applied to the different stages of the processing procedures utilized for the generation of quantitative speckle correlation interferometry data in fiber-optic based opto-electronic holography (FOBOEH) techniques, allowing identification of optimal processing conditions. It is shown that wavelet algorithms are effective for speckle noise reduction while preserving image features that tend to fade with other algorithms.
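
    The five-step pipeline (a)-(e) can be sketched with the PyWavelets package: estimate the noise level from the finest detail coefficients, threshold, and invert. The wavelet family ("db4") and the universal threshold rule are illustrative choices under our assumptions, not necessarily the authors'; the test image stands in for an interferogram.

      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      image = np.zeros((64, 64)); image[16:48, 16:48] = 1.0
      noisy = image + rng.normal(0.0, 0.2, image.shape)  # stand-in speckle noise

      coeffs = pywt.wavedec2(noisy, "db4", level=3)      # (b), (c) forward DWT
      # (a) noise estimate from the finest diagonal detail coefficients
      sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
      thr = sigma * np.sqrt(2.0 * np.log(noisy.size))    # universal threshold

      # (d) soft-threshold every detail sub-band, keep the approximation
      denoised_coeffs = [coeffs[0]] + [
          tuple(pywt.threshold(d, thr, mode="soft") for d in level)
          for level in coeffs[1:]
      ]
      denoised = pywt.waverec2(denoised_coeffs, "db4")[:64, :64]  # (e) inverse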

  17. Quantitative imaging technique using the layer-stripping algorithm

    NASA Astrophysics Data System (ADS)

    Beilina, L.

    2017-07-01

    We present the layer-stripping algorithm for the solution of the hyperbolic coefficient inverse problem (CIP). Our numerical examples show quantitative reconstruction of small tumor-like inclusions in two dimensions.

  18. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been put into improving MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins, as well as to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification were also developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances in such techniques and their applications in the qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. [Application of analytical transmission electron microscopy techniques for detection, identification and visualization of localization of nanoparticles of titanium and cerium oxides in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern methods of analytical transmission electron microscopy for detection, identification and visualization of the localization of nanoparticles of titanium and cerium oxides in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, under dark-field scanning transmission electron microscopy, and under high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of nanoparticles in the cells, the analytical techniques of energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography can be applied to confirm that nanoparticles are localized within the sample and not in a surface contamination layer. The capabilities and application areas of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.

  20. A Review of Analytical Methods for p-Coumaric Acid in Plant-Based Products, Beverages, and Biological Matrices.

    PubMed

    Ferreira, Paula Scanavez; Victorelli, Francesca Damiani; Fonseca-Santos, Bruno; Chorilli, Marlus

    2018-05-14

    p-Coumaric acid (p-CA), also known as 4-hydroxycinnamic acid, is a phenolic acid which has been widely studied due to its beneficial effects against several diseases and its wide distribution in the plant kingdom. This phenolic compound can be found in the free form or conjugated with other molecules; therefore, its bioavailability and the pathways via which it is metabolized change according to its chemical structure. p-CA has potential pharmacological effects because it has high free radical scavenging, anti-inflammatory, antineoplastic, and antimicrobial activities, among other biological properties. It is therefore essential to choose the most appropriate and effective analytical method for qualitative and quantitative determination of p-CA in different matrices, such as plasma, urine, plant extracts, and drug delivery systems. The most frequently reported analytical method for this purpose is high-performance liquid chromatography, mostly coupled with some type of detector, such as a UV/Vis detector. However, other analytical techniques are also used to evaluate this compound. This review presents a summary of p-CA in terms of its chemical and pharmacokinetic properties, pharmacological effects, drug delivery systems, and the analytical methods described in the literature that are suitable for its quantification.

  1. Extraction of fullerenes from environmental matrices as affected by solvent characteristics and analyte concentration.

    PubMed

    Place, Benjamin J; Kleber, Markus; Field, Jennifer A

    2013-03-01

    Fullerenes possess unique chemical properties that make the isolation of these compounds from heterogeneous environmental matrices difficult. For example, previous reports indicate that toluene-based extraction techniques vary in their ability to extract C60, especially from highly carbonaceous solid matrices. Here, we examined the effects of (i) solvent type (toluene alone versus an 80:20 v/v mixture of toluene and 1-methylnaphthalene) and (ii) analyte concentration on the extraction efficiency of an isotopically labeled surrogate compound, (13)C60. The toluene/1-methylnaphthalene mixture increased fullerene extraction efficiency from carbon lampblack by a factor of five, but was not significantly different from 100% toluene when applied to wood stove soot or montmorillonite. Recovery of the (13)C60 surrogate declined with decreasing analyte concentration. The usefulness of isotopically labeled surrogate is demonstrated and the study provides a quantitative assessment regarding the dependence of fullerene extraction efficiencies on the geochemical characteristics of solid matrices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
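
    As a small illustration of how a DXA-derived BMD value is turned into the standard diagnostic metric, the sketch below computes a T-score, the number of standard deviations below the young-adult reference mean (with the usual WHO cut-off of T <= -2.5 for osteoporosis); the reference mean and SD used here are hypothetical.

      def t_score(bmd: float, young_adult_mean: float, young_adult_sd: float) -> float:
          """Standard deviations below the young-adult reference mean."""
          return (bmd - young_adult_mean) / young_adult_sd

      t = t_score(bmd=0.72, young_adult_mean=0.94, young_adult_sd=0.12)
      print(f"T-score = {t:.1f}")   # -1.8 here: osteopenic range (-2.5 < T < -1.0)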

  3. Quantitative proteomics in Giardia duodenalis-Achievements and challenges.

    PubMed

    Emery, Samantha J; Lacey, Ernest; Haynes, Paul A

    2016-08-01

    Giardia duodenalis (syn. G. lamblia and G. intestinalis) is a protozoan parasite of vertebrates and a major contributor to the global burden of diarrheal diseases and gastroenteritis. The publication of multiple genome sequences in the G. duodenalis species complex has provided important insights into parasite biology, and made post-genomic technologies, including proteomics, significantly more accessible. The aims of proteomics are to identify and quantify proteins present in a cell, and assign functions to them within the context of dynamic biological systems. In Giardia, proteomics in the post-genomic era has transitioned from reliance on gel-based systems to utilisation of a diverse array of techniques based on bottom-up LC-MS/MS technologies. Together, these have generated crucial foundations for subcellular proteomes, elucidated intra- and inter-assemblage isolate variation, and identified pathways and markers in differentiation, host-parasite interactions and drug resistance. However, in Giardia, proteomics remains an emerging field, with considerable shortcomings evident from the published research. These include a bias towards assemblage A, a lack of emphasis on quantitative analytical techniques, and limited information on post-translational protein modifications. Additionally, there are multiple areas of research for which proteomic data are not available to add value to published transcriptomic data. The challenge of amalgamating data in the systems biology paradigm necessitates the further generation of large, high-quality quantitative datasets to accurately model parasite biology. This review surveys the current proteomic research available for Giardia and evaluates its technical and quantitative approaches, while contextualising its biological insights into parasite pathology, isolate variation and eukaryotic evolution. Finally, we propose priority areas for the generation of future proteomic data to explore fundamental questions in Giardia.

  4. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  5. Measurements of the quantitative lateral analytical resolution at evaporated aluminium and silver layers with the JEOL JXA-8530F FEG-EPMA

    NASA Astrophysics Data System (ADS)

    Berger, D.; Nissen, J.

    2018-01-01

    The studies in this paper are part of systematic investigations of the lateral analytical resolution of the field emission electron microprobe JEOL JXA-8530F. The focus of interest is the quantitative lateral resolution achieved in practice. The approach is to determine the minimum thickness of a metallic layer for which an accurate quantitative element analysis in cross-section is still possible. Previous measurements were accomplished at sputtered gold (Z = 79) layers, where a lateral resolution in the range of 140 to 170 nm was achieved at suitable parameters of the microprobe. To study the Z-dependence of the lateral resolution, aluminium (Z = 13) and silver (Z = 47) layers of different thicknesses were generated by evaporation and subsequently prepared in cross-section by use of a focussed Ga-ion beam (FIB). Each layer was analysed quantitatively with different electron energies. The thinnest layer that can be resolved specifies the best lateral resolution. The measured values were compared with Monte Carlo simulations on the one hand and with predictions from formulas in the literature on the other. The measurements agree well with the simulated and calculated values, except for those at the lowest primary electron energies with an overvoltage below ˜ 2. The reason for this discrepancy is not yet clear and has to be clarified by further investigations. The results apply to any microanalyser - even with energy-dispersive X-ray spectrometry (EDS) detection - provided that the probe diameters at suitable analysing parameters, which might deviate from those of the JEOL JXA-8530F, are taken into account.

  6. Development of a comprehensive analytical platform for the detection and quantitation of food fraud using a biomarker approach. The oregano adulteration case study.

    PubMed

    Wielogorska, Ewa; Chevallier, Olivier; Black, Connor; Galvin-King, Pamela; Delêtre, Marc; Kelleher, Colin T; Haughey, Simon A; Elliott, Christopher T

    2018-01-15

    Due to the increasing number of food fraud incidents, there is a clear need for the development and implementation of analytical platforms enabling the detection and quantitation of adulteration. In this study, a set of unique biomarkers of commonly found oregano adulterants became the targets in the development of an LC-MS/MS method which underwent rigorous in-house validation. The method presented very high selectivity and specificity, excellent linearity (R² > 0.988), low decision limits and detection capabilities (<2%), acceptable accuracy (intra-assay 92-113%, inter-assay 69-138%) and precision (CV < 20%). The method was compared with an established FTIR screening assay and revealed a good correlation of qualitative and quantitative results (R² > 0.81). An assessment of 54 suspected adulterated oregano samples revealed that almost 90% of them contained at least one bulking agent, with a median level of adulteration of 50%. Such innovative methodologies need to be established as routine testing procedures to detect and ultimately deter food fraud. Copyright © 2017 Elsevier Ltd. All rights reserved.
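
    As a minimal illustration of the validation figures of merit quoted above (linearity as R², accuracy as spiked recovery, precision as CV), the following Python sketch computes them from invented calibration and replicate data; none of the numbers are the study's own.

        import numpy as np

        # Hypothetical calibration data: adulterant level (%) vs. LC-MS/MS peak area
        levels = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
        areas = np.array([1.1e4, 2.0e4, 4.3e4, 10.2e4, 20.5e4, 41.8e4])

        # Linearity: ordinary least-squares fit and coefficient of determination (R^2)
        slope, intercept = np.polyfit(levels, areas, 1)
        predicted = slope * levels + intercept
        r2 = 1 - np.sum((areas - predicted) ** 2) / np.sum((areas - areas.mean()) ** 2)

        # Accuracy: recovery of replicate spikes at a known level (here 5%)
        spiked_true = 5.0
        measured = np.array([4.8, 5.4, 4.6, 5.2, 5.6])   # back-calculated levels
        recovery = measured / spiked_true * 100           # intra-assay accuracy, %

        # Precision: coefficient of variation of the replicates
        cv = measured.std(ddof=1) / measured.mean() * 100

        print(f"R^2 = {r2:.4f}, mean recovery = {recovery.mean():.0f}%, CV = {cv:.1f}%")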

  7. Application of ASTAR(TM)/Precession Electron Diffraction Technique to Quantitatively Study Defects in Nanocrystalline Metallic Materials

    NASA Astrophysics Data System (ADS)

    Ghamarian, Iman

    Nanocrystalline metallic materials have the potential to exhibit outstanding performance, which leads to their usage in challenging applications such as coatings and biomedical implant devices. To optimize the performance of nanocrystalline metallic materials for the desired applications, it is important to have a sound understanding of the structure, processing and properties of these materials. Various efforts have been made to correlate the microstructure and properties of nanocrystalline metallic materials. Based on these research activities, it is recognized that microstructure and defects (e.g., dislocations and grain boundaries) play a key role in the behavior of these materials. Therefore, it is of great importance to establish methods to quantitatively study microstructures, defects and their interactions in nanocrystalline metallic materials. Since the mechanisms controlling the properties of nanocrystalline metallic materials operate at a very small length scale, they are fairly difficult to study. Unfortunately, most of the characterization techniques used to explore these materials do not have the spatial resolution required for their characterization. For instance, by applying complex profile-fitting algorithms to X-ray diffraction patterns, it is possible to obtain an estimate of the average grain size and the average dislocation density within a relatively large area. However, these average values are not enough for developing meticulous phenomenological models which are able to correlate the microstructure and properties of nanocrystalline metallic materials. As another example, the electron backscatter diffraction technique also cannot be used widely in the characterization of these materials due to problems such as relatively poor spatial resolution (about 90 nm) and the degradation of Kikuchi diffraction patterns in severely deformed nano-sized grain metallic materials. In this study, ASTAR(TM)/precession electron

  8. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    PubMed

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  9. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
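
    The core difficulty the paper addresses, recursion on Spark, can be illustrated with a hand-rolled semi-naive fixpoint loop computing transitive closure in PySpark. This is only a sketch of the evaluation strategy such systems compile to, not BigDatalog's API; it assumes a local Spark installation, and the toy edge data are invented.

        from pyspark.sql import SparkSession

        # Semi-naive evaluation of: tc(X,Y) <- edge(X,Y).
        #                           tc(X,Z) <- tc(X,Y), edge(Y,Z).
        spark = SparkSession.builder.appName("tc-sketch").getOrCreate()
        edges = spark.sparkContext.parallelize([(1, 2), (2, 3), (3, 4)])

        tc = edges        # facts derived so far
        delta = edges     # facts derived in the previous round
        while True:
            new = (delta.map(lambda e: (e[1], e[0]))        # key delta by its target
                        .join(edges)                        # match an outgoing edge
                        .map(lambda kv: (kv[1][0], kv[1][1]))
                        .subtract(tc)
                        .distinct())
            if new.isEmpty():
                break
            tc = tc.union(new)
            delta = new

        print(sorted(tc.collect()))   # [(1,2), (1,3), (1,4), (2,3), (2,4), (3,4)]
        spark.stop()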

  10. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296

  11. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and to collect data in high-throughput and quantitative manners. In this review, we present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics) and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  12. Quantitation of acrylamide in foods by high-resolution mass spectrometry.

    PubMed

    Troise, Antonio Dario; Fiore, Alberto; Fogliano, Vincenzo

    2014-01-08

    Acrylamide detection still represents one of the hottest topics in food chemistry. Solid phase cleanup coupled to liquid chromatography separation and tandem mass spectrometry detection along with GC-MS detection are nowadays the gold standard procedure for acrylamide quantitation thanks to high reproducibility, good recovery, and low relative standard deviation. High-resolution mass spectrometry (HRMS) is particularly suitable for the detection of low molecular weight amides, and it can provide some analytical advantages over other MS techniques. In this paper a liquid chromatography (LC) method for acrylamide determination using HRMS detection was developed and compared to LC coupled to tandem mass spectrometry. The procedure applied a simplified extraction, no cleanup steps, and a 4 min chromatography. It proved to be solid and robust with an acrylamide mass accuracy of 0.7 ppm, a limit of detection of 2.65 ppb, and a limit of quantitation of 5 ppb. The method was tested on four acrylamide-containing foods: cookies, French fries, ground coffee, and brewed coffee. Results were perfectly in line with those obtained by LC-MS/MS.
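
    For reference, the quoted mass-accuracy figure is straightforward to reproduce. The sketch below uses an approximate theoretical m/z for protonated acrylamide and an invented measured value, chosen only to show how an error on the order of 0.7 ppm arises.

        # Mass accuracy in ppm, the figure of merit quoted for the HRMS method
        # (values below are illustrative, not the paper's raw data).
        theoretical_mz = 72.04439   # [M+H]+ of acrylamide (C3H5NO), approximate
        measured_mz = 72.04444

        ppm_error = (measured_mz - theoretical_mz) / theoretical_mz * 1e6
        print(f"mass accuracy: {ppm_error:.2f} ppm")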

  13. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.

  14. Analyte species and concentration identification using differentially functionalized microcantilever arrays and artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senesac, Larry R; Datskos, Panos G; Sepaniak, Michael J

    2006-01-01

    In the present work, we have performed analyte species and concentration identification using an array of ten differentially functionalized microcantilevers coupled with a back-propagation artificial neural network pattern recognition algorithm. The array consists of ten nanostructured silicon microcantilevers functionalized by polymeric and gas chromatography phases and macrocyclic receptors as spatially dense, differentially responding sensing layers for identification and quantitation of individual analyte(s) and their binary mixtures. The array response (i.e. cantilever bending) to analyte vapor was measured by an optical readout scheme and the responses were recorded for a selection of individual analytes as well as several binary mixtures. An artificial neural network (ANN) was designed and trained to recognize not only the individual analytes and binary mixtures, but also to determine the concentration of individual components in a mixture. To the best of our knowledge, ANNs have not been applied to microcantilever array responses previously to determine concentrations of individual analytes. The trained ANN correctly identified the eleven test analyte(s) as individual components, most with probabilities greater than 97%, whereas it did not misidentify an unknown (untrained) analyte. Demonstrated unique aspects of this work include an ability to measure binary mixtures and provide both qualitative (identification) and quantitative (concentration) information with array-ANN-based sensor methodologies.
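
    A minimal sketch of the array-ANN idea, using scikit-learn's back-propagation-trained multilayer perceptron on synthetic ten-cantilever response patterns. The analyte names, fingerprints and noise level are all invented; the original work used a custom-designed network rather than this library class.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        # Hypothetical responses of a 10-cantilever array: each row is the bending
        # signal of the ten differentially functionalized cantilevers for one exposure.
        rng = np.random.default_rng(0)
        n_per_class, n_cantilevers = 30, 10
        patterns = rng.random((3, n_cantilevers))        # 3 analyte "fingerprints"
        X = np.vstack([p + 0.05 * rng.standard_normal((n_per_class, n_cantilevers))
                       for p in patterns])
        y = np.repeat(["ethanol", "toluene", "acetone"], n_per_class)  # assumed labels

        # Back-propagation-trained network, analogous in spirit to the paper's ANN
        clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        clf.fit(X, y)
        print(clf.predict(X[:2]), clf.predict_proba(X[:2]).max(axis=1))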

  15. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  16. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for the quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of hybrid photon counting (HPC) pixel detectors. This approach is based on the photon transfer curve (PTC), corresponding to the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors of the flat-fielding technique. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The FPN, quantified by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is shown to be useful for selecting the settings that give the best image quality from a commercial or R&D detector.
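
    The PTC behaviour described above can be sketched numerically: synthetic flat-field frames with an assumed 2% pixel response non-uniformity show the total noise following sigma² = S + (PRNU·S)², with shot noise dominating at low signal and FPN at high signal. The PRNU value and pixel count are illustrative assumptions.

        import numpy as np

        # Photon transfer curve sketch: noise vs. mean signal from synthetic
        # flat-field frames. Shot noise grows as sqrt(S); fixed pattern noise
        # (FPN) grows linearly with S and dominates at high flux.
        rng = np.random.default_rng(1)
        n_pix = 100_000
        prnu = 1 + 0.02 * rng.standard_normal(n_pix)   # 2% response non-uniformity

        for flux in [10, 100, 1_000, 10_000]:
            frame = rng.poisson(flux * prnu)           # shot noise on non-uniform gain
            mean, std = frame.mean(), frame.std()
            model = np.sqrt(flux + (0.02 * flux) ** 2) # sigma^2 = S + (PRNU*S)^2
            print(f"S={mean:9.1f}  sigma={std:8.2f}  model={model:8.2f}")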

  17. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.

  18. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  19. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    DOE PAGES

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J.; ...

    2015-06-18

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is amore » technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. Here, this review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. Electron tomography produces quantitative 3D reconstructions for biological and physical sciences from sets of 2D projections acquired at different tilting angles in a transmission electron microscope. Finally, state-of-the-art techniques capable of producing 3D representations such as Pt-Pd core-shell nanoparticles and IgG1 antibody molecules are reviewed.« less

  20. Identification and confirmation of chemical residues by chromatography-mass spectrometry and other techniques

    USDA-ARS?s Scientific Manuscript database

    A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...

  1. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik (Inventor)

    1998-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a different resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration than when contacted with a fluid comprising the chemical analyte at a second, different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  2. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Severin, Erik (Inventor); Lewis, Nathan S. (Inventor)

    2001-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a different resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration than when contacted with a fluid comprising the chemical analyte at a second, different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  3. Sensors for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik (Inventor)

    1999-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g., electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a different resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration than when contacted with a fluid comprising the chemical analyte at a second, different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  4. Quantitative impedimetric monitoring of cell migration under the stimulation of cytokine or anti-cancer drug in a microfluidic chip

    PubMed Central

    Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao

    2015-01-01

    Cell migration is a cellular response involved in various biological processes such as cancer metastasis, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining the cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on an impedimetric measurement technique. A damage-free wound was created microfluidically, and cell migration activity under the stimulation of a cytokine (interleukin-6) and an anti-cancer drug (doxorubicin) was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated to the cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis, but the impedimetric technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under stimulation with the cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
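
    The migration rate itself is a simple slope estimate. A sketch with hypothetical wound-width data (chosen so that the result matches the 8.5 μm/h control value quoted above), assuming cells advance from both edges of the wound:

        import numpy as np

        # Hypothetical wound-healing data: wound width (um) sampled over time.
        # The migration rate per cell front is half the total closure rate.
        t_hours = np.array([0, 2, 4, 6, 8, 10])
        width_um = np.array([500, 465, 432, 395, 362, 330])

        closure_rate = -np.polyfit(t_hours, width_um, 1)[0]   # um/h of total closure
        migration_rate = closure_rate / 2                     # per advancing front
        print(f"migration rate ~ {migration_rate:.1f} um/h")  # ~8.5 um/h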

  5. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters which characterize all thermoelectric couples is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.

  6. Recent trends in analytical procedures in forensic toxicology.

    PubMed

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques. That is why new trends appear continuously. In recent years, LC-MS has revolutionized target compound analysis and has become the trend in toxicology as well. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach, based on accurate LC-TOF-MS mass measurements and elemental-formula-based library searches, is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach. The increasingly polar and ionic molecules encountered make it a worthwhile addition to e.g. LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease of use and high throughput. Unfortunately, reports of disappointment but also of accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI will really establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.

  7. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

    1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s, and thin-layer chromatography remains important together with some spectrophotometric and other tests. However, gas and high-performance liquid chromatography, together with a variety of immunoassay techniques, are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so-called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required, and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  8. Rapid perfusion quantification using Welch-Satterthwaite approximation and analytical spectral filtering

    NASA Astrophysics Data System (ADS)

    Krishnan, Karthik; Reddy, Kasireddy V.; Ajani, Bhavya; Yalavarthy, Phaneendra K.

    2017-02-01

    CT and MR perfusion weighted imaging (PWI) enable quantification of perfusion parameters in stroke studies. These parameters are calculated from the residual impulse response function (IRF) based on a physiological model for tissue perfusion. The standard approach for estimating the IRF is deconvolution using oscillatory-limited singular value decomposition (oSVD) or Frequency Domain Deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of CT perfusion/MR PWI. In this work, three faster methods are proposed. The first is a direct (model-based) crude approximation to the final perfusion quantities (blood flow, blood volume, mean transit time and delay) using the Welch-Satterthwaite approximation for gamma-fitted concentration time curves (CTC). The second is a fast, accurate deconvolution method which we call Analytical Fourier Filtering (AFF). The third is another fast, accurate deconvolution technique using Showalter's method, which we call Analytical Showalter's Spectral Filtering (ASSF). Through systematic evaluation on phantom and clinical data, the proposed methods are shown to be computationally more than twice as fast as FDD. The two deconvolution-based methods, AFF and ASSF, are also shown to be quantitatively accurate compared to FDD and oSVD.
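
    The deconvolution step common to these methods can be sketched with a regularized frequency-domain inversion on synthetic curves. This illustrates the generic FDD idea only, not the specific AFF/ASSF filters proposed in the paper; all signals and the filter strength are invented.

        import numpy as np

        # Recover the impulse response function (IRF) from a tissue
        # concentration-time curve (CTC) and arterial input function (AIF).
        n, dt = 64, 1.0
        t = np.arange(n) * dt
        aif = t * np.exp(-t / 4)                  # gamma-variate-like AIF
        irf_true = np.exp(-t / 6)                 # exponential residue function
        ctc = np.convolve(aif, irf_true)[:n] * dt # tissue curve = AIF (*) IRF

        AIF, CTC = np.fft.fft(aif), np.fft.fft(ctc)
        eps = 0.05 * np.abs(AIF).max()            # Tikhonov-style spectral filter
        IRF = CTC * np.conj(AIF) / (np.abs(AIF) ** 2 + eps ** 2)
        irf_est = np.fft.ifft(IRF).real / dt

        print(f"peak IRF (true vs est): {irf_true.max():.2f} vs {irf_est.max():.2f}")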

  9. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  10. Noninvasive radioisotopic technique for detection of platelet deposition in mitral valve prostheses and quantitation of visceral microembolism in dogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewanjee, M.K.; Fuster, V.; Rao, S.A.

    1983-05-01

    A noninvasive technique has been developed in the dog model for imaging, with a gamma camera, the platelet deposition on Bjoerk-Shiley mitral valve prostheses early postoperatively. At 25 hours after implantation of the prosthesis and 24 hours after intravenous administration of 400 to 500 microCi of platelets labeled with indium-111, the platelet deposition in the sewing ring and perivalvular cardiac tissue can be clearly delineated in a scintiphotograph. An in vitro technique was also developed for quantitation of visceral microemboli in brain, lungs, kidneys, and other tissues. Biodistribution of the labeled platelets was quantitated, and the tissue/blood radioactivity ratio was determined in 22 dogs in four groups: unoperated normal dogs, sham-operated dogs, prosthesis-implanted dogs, and prosthesis-implanted dogs treated with dipyridamole before and aspirin and dipyridamole immediately after operation. Fifteen to 20% of total platelets were consumed as a consequence of the surgical procedure. On quantitation, we found that platelet deposition on the components of the prostheses was significantly reduced in prosthesis-implanted animals treated with dipyridamole and aspirin when compared with prosthesis-implanted, untreated dogs. All prosthesis-implanted animals considered together had a twofold to fourfold increase in tissue/blood radioactivity ratio in comparison with unoperated and sham-operated animals, an indication that the viscera work as filters and trap platelet microemboli that are presumably produced in the region of the mitral valve prostheses. In the dog model, indium-111-labeled platelets thus provide a sensitive marker for noninvasive imaging of platelet deposition on mechanical mitral valve prostheses, in vitro evaluation of platelet microembolism in viscera, in vitro quantitation of surgical consumption of platelets, and evaluation of platelet-inhibitor drugs.

  11. Comparison of three-way and four-way calibration for the real-time quantitative analysis of drug hydrolysis in complex dynamic samples by excitation-emission matrix fluorescence.

    PubMed

    Yin, Xiao-Li; Gu, Hui-Wen; Liu, Xiao-Lu; Zhang, Shan-Hui; Wu, Hai-Long

    2018-03-05

    Multiway calibration in combination with spectroscopic technique is an attractive tool for online or real-time monitoring of target analyte(s) in complex samples. However, how to choose a suitable multiway calibration method for the resolution of spectroscopic-kinetic data is a troubling problem in practical application. In this work, for the first time, three-way and four-way fluorescence-kinetic data arrays were generated during the real-time monitoring of the hydrolysis of irinotecan (CPT-11) in human plasma by excitation-emission matrix fluorescence. Alternating normalization-weighted error (ANWE) and alternating penalty trilinear decomposition (APTLD) were used as three-way calibration for the decomposition of the three-way kinetic data array, whereas alternating weighted residual constraint quadrilinear decomposition (AWRCQLD) and alternating penalty quadrilinear decomposition (APQLD) were applied as four-way calibration to the four-way kinetic data array. The quantitative results of the two kinds of calibration models were fully compared from the perspective of predicted real-time concentrations, spiked recoveries of initial concentration, and analytical figures of merit. The comparison study demonstrated that both three-way and four-way calibration models could achieve real-time quantitative analysis of the hydrolysis of CPT-11 in human plasma under certain conditions. However, it was also found that both of them possess some critical advantages and shortcomings during the process of dynamic analysis. The conclusions obtained in this paper can provide some helpful guidance for the reasonable selection of multiway calibration models to achieve the real-time quantitative analysis of target analyte(s) in complex dynamic systems. Copyright © 2017 Elsevier B.V. All rights reserved.
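
    For readers unfamiliar with trilinear decomposition, a minimal PARAFAC example on a synthetic three-way fluorescence-like array is sketched below. It assumes the open-source tensorly package; ANWE and APTLD, the algorithms compared in the paper, are related but distinct constrained variants, and all data here are invented.

        import numpy as np
        from tensorly.decomposition import parafac  # assumes the tensorly package

        # Synthetic three-way array (samples x excitation x emission) built from
        # two "fluorophores" plus noise, then decomposed by a trilinear model.
        rng = np.random.default_rng(2)
        conc = rng.random((8, 2))                 # relative concentrations
        ex = rng.random((20, 2))                  # excitation profiles
        em = rng.random((30, 2))                  # emission profiles
        X = (np.einsum('ir,jr,kr->ijk', conc, ex, em)
             + 0.01 * rng.standard_normal((8, 20, 30)))

        weights, factors = parafac(X, rank=2, n_iter_max=500)
        scores = factors[0]                       # sample-mode loadings ~ concentrations
        print(scores.shape)                       # (8, 2)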

  12. Simplex and duplex event-specific analytical methods for functional biotech maize.

    PubMed

    Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young

    2009-08-26

    Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems and in living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of biotech maize events 3272 and LY 038 on the basis of their 3' flanking regions. The qualitative primers confirmed specificity, yielding a single PCR product, and a sensitivity of 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and two event-specific 3' flanking DNA sequences of events 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. Limits of quantitation (LOQs) were 0.1% for the simplex real-time PCRs of events 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study shows that event-specific analytical methods are applicable for qualitative and quantitative analysis of biotech maize events 3272 and LY 038.
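
    A sketch of standard-curve quantitation as typically used in such event-specific real-time PCR methods: Ct values are converted to copy numbers via the plasmid calibration curve, and the GMO content is taken as the event/taxon copy-number ratio. The slopes near -3.32 correspond to near-100% amplification efficiency; all values below are invented, not the study's data.

        # Standard-curve quantitation: Ct = slope * log10(copies) + intercept
        def copies_from_ct(ct: float, slope: float, intercept: float) -> float:
            """Invert the standard curve to get a copy number from a Ct value."""
            return 10 ** ((ct - intercept) / slope)

        # Hypothetical standard curves for the event- and taxon-specific assays
        event_copies = copies_from_ct(ct=28.1, slope=-3.32, intercept=40.0)
        taxon_copies = copies_from_ct(ct=24.9, slope=-3.34, intercept=40.2)

        gmo_percent = event_copies / taxon_copies * 100
        print(f"GMO content ~ {gmo_percent:.2f}%")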

  13. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
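
    Of the roughness parameters mentioned, Rq (root-mean-square roughness) is easy to state precisely: the RMS deviation of heights from their mean. A sketch over synthetic height data, with a simple window-resizing loop in the spirit of the analysis above (the height map and window sizes are invented):

        import numpy as np

        rng = np.random.default_rng(3)
        z = rng.standard_normal((256, 256)) * 0.1     # heights in micrometres

        def rq(window: np.ndarray) -> float:
            """Root-mean-square deviation of heights from their mean."""
            dev = window - window.mean()
            return float(np.sqrt(np.mean(dev ** 2)))

        # Rq over progressively larger central windows
        for size in (32, 64, 128, 256):
            print(f"{size:3d} px window: Rq = {rq(z[:size, :size]):.4f} um")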

  14. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads colorimetric paper assays through light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it well suited to primary health care and on-site detection settings. Based on the light reflectance principle, the signal can be obtained directly and stably in a user-friendly manner. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in local tap water were successfully analyzed; compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device can provide quantitative information very conveniently and shows great potential in the broad fields of resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Frontally eluted components procedure with thin layer chromatography as a mode of sample preparation for high performance liquid chromatography quantitation of acetaminophen in biological matrix.

    PubMed

    Klimek-Turek, A; Sikora, M; Rybicki, M; Dzido, T H

    2016-03-04

    A new concept of using thin-layer chromatography for sample preparation prior to the quantitative determination of solute(s) by instrumental techniques is presented. Thin-layer chromatography (TLC) is used to completely separate acetaminophen and its internal standard from the other components (matrix) and to form a single spot/zone containing them at the solvent front position (after the final stage of thin-layer chromatogram development). The location of the analytes and internal standard in the solvent front zone allows their easy extraction followed by quantitation by HPLC. The extraction of the solute(s) and internal standard can proceed from the whole solute frontal zone or from a part of it without loss of accuracy in the quantitative analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at levels of 0.1% and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with a discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities, while inorganic impurities (metal residues) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. FAIR exempting separate T (1) measurement (FAIREST): a novel technique for online quantitative perfusion imaging and multi-contrast fMRI.

    PubMed

    Lai, S; Wang, J; Jahng, G H

    2001-01-01

    A new pulse sequence, dubbed FAIR exempting separate T(1) measurement (FAIREST), in which a slice-selective saturation recovery acquisition is added to the standard FAIR (flow-sensitive alternating inversion recovery) scheme, was developed for quantitative perfusion imaging and multi-contrast fMRI. The technique allows for clean separation between, and thus simultaneous assessment of, BOLD and perfusion effects, while quantitative cerebral blood flow (CBF) and tissue T(1) values are monitored online. Online CBF maps were obtained using the FAIREST technique, and the measured CBF values were consistent with off-line CBF maps obtained using the FAIR technique in combination with a separate sequence for T(1) measurement. Finger-tapping activation studies were carried out to demonstrate the applicability of the FAIREST technique in a typical fMRI setting for multi-contrast fMRI. The relative CBF and BOLD changes induced by finger tapping were 75.1 +/- 18.3 and 1.8 +/- 0.4%, respectively, and the relative oxygen consumption rate change was 2.5 +/- 7.7%. Pixel-by-pixel correlation of the T(1) maps with the activation images shows that the mean T(1) value of the CBF activation pixels is close to the T(1) of gray matter, while the mean T(1) value of the BOLD activation pixels is close to the T(1) range of blood and cerebrospinal fluid. Copyright 2001 John Wiley & Sons, Ltd.

  18. Elements of analytic style: Bion's clinical seminars.

    PubMed

    Ogden, Thomas H

    2007-10-01

    The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is comprised to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.

  19. Discourse-Centric Learning Analytics: Mapping the Terrain

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    There is an increasing interest in developing learning analytic techniques for the analysis, and support of, high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining its distinctive contribution and offering a definition for the field moving forwards. It is our claim that DCLA…

  20. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
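
    The principle can be illustrated with a toy additive secret-sharing scheme, a building block of many MPC protocols: each party splits its private value into random shares so that only the aggregate is ever reconstructed. This is an illustration of the idea only, not the protocol used in the pilots, and the party inputs are invented.

        import secrets

        # Additive secret sharing over a prime field: three hospitals each split
        # a private count into random shares; only the total is reconstructed.
        P = 2**61 - 1  # prime modulus

        def share(value: int, n_parties: int) -> list[int]:
            shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % P)  # shares sum to value mod P
            return shares

        inputs = [120, 75, 310]                       # each party's private value
        all_shares = [share(v, 3) for v in inputs]

        # Each party locally sums the shares it holds (one from every input)...
        partial_sums = [sum(col) % P for col in zip(*all_shares)]
        # ...and the public total is reconstructed from the partial sums alone.
        print(sum(partial_sums) % P)                  # 505; no single input revealed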

  1. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  2. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    NASA Astrophysics Data System (ADS)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process underpins developments in tissue engineering and regenerative medicine. Several in-vivo and in-vitro studies have been dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology based on complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by the X-ray scanning techniques allows us to monitor bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims to provide a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  3. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to their high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry: (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) a stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
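
    As a point of reference for the design family being generalized, a classic two-level half-fraction and its fold-over can be generated in a few lines; the generalized subset designs described above extend this idea to multiple levels and mixed factor types. The sketch below is a textbook 2^(3-1) construction, not the paper's algorithm.

        from itertools import product

        # 2^(3-1) half-fraction from a full factorial in A and B with the
        # generator C = A*B, plus its fold-over (the complementary subset).
        base = list(product([-1, 1], repeat=2))          # full 2^2 design in A, B
        half = [(a, b, a * b) for a, b in base]          # generator: C = AB
        fold_over = [(-a, -b, -c) for a, b, c in half]   # complementary half

        print("half-fraction:", half)
        print("fold-over:   ", fold_over)
        # Together the two subsets recover the full 2^3 factorial.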

  4. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full-field-of-view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence the selection and applicability of an optical technique include the sensitivity, accuracy, and precision required for a particular application. In this paper, sensitivity, accuracy, and precision characteristics of quantitative optical metrology techniques, and specifically of optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  5. Investigation of Stainless Steel Corrosion in Ultrahigh-Purity Water and Steam Systems by Surface Analytical Techniques

    NASA Astrophysics Data System (ADS)

    Dong, Xia; Iacocca, Ronald G.; Bustard, Bethany L.; Kemp, Craig A. J.

    2010-02-01

    Stainless steel pipes with different degrees of rouging and a Teflon®-coated rupture disc with severe corrosion were thoroughly investigated by combining multiple surface analytical techniques. The surface roughness and iron oxide layer thickness increase with increasing rouge severity, and the chromium oxide layer coexists with the iron oxide layer in samples with various degrees of rouging. Unlike the rouging observed for stainless steel pipes, the fast degradation of the rupture disc was caused by a crevice corrosion environment created by perforations in the protective Teflon coating. This failure analysis clearly shows the highly corrosive nature of ultrapure water used in the manufacture of pharmaceutical products, and demonstrates some of the unexpected corrosion mechanisms that can be encountered in these environments.

  6. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

    Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in better quantitative assessment of protein levels as well as in improved understanding of post-translational modifications and protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from the relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamism, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, thereby requiring the use of specially designed standardization approaches to provide absolute quantifications. The most common of such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are not obtained through the use of any kind of labeling, but from computational normalization of the raw data and adequate standards; and (v) elemental mass spectrometry-based workflows able to directly provide absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight, from the analytical chemistry perspective, into the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) determinations is provided.
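
    A minimal sketch of approach (i) above: the absolute amount of an endogenous peptide follows from the measured intensity ratio of the endogenous (light) form to a spiked, stable isotope-labeled (heavy) standard. The numbers, the spike amount, and the assumption of identical ionization efficiency for both isotopologues are illustrative, not taken from the article.

    ```python
    # Absolute quantification with a stable isotope-labeled peptide standard.
    # Assumes light and heavy isotopologues ionize identically, so the
    # intensity ratio equals the molar ratio (illustrative values only).
    def absolute_amount(light_intensity, heavy_intensity, spiked_fmol):
        """Endogenous peptide amount inferred from the light/heavy ratio."""
        return (light_intensity / heavy_intensity) * spiked_fmol

    # A 2:1 light/heavy peak-area ratio with 50 fmol of heavy standard spiked in
    print(absolute_amount(2.0e6, 1.0e6, 50.0))  # -> 100.0 fmol endogenous peptide
    ```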

  7. Quantitative Analysis of the Educational Infrastructure in Colombia Through the Use of a Georeferencing Software and Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Cala Estupiñan, Jose Luis; María González Bernal, Lina; Ponz Tienda, Jose Luis; Gutierrez Bucheli, Laura Andrea; Alejandro Arboleda, Carlos

    2017-10-01

    The distribution policies of the national budget have shown an increasing trend of investment in education infrastructure. This makes it necessary to identify the territories with the greatest number of facilities (such as schools, colleges, universities and libraries) and those lacking this type of infrastructure, in order to know where possible government intervention is required. This work is not intended to give a judgment on the qualitative state of the national infrastructure. It focuses, in terms of infrastructure, on Colombia's quantitative status in the educational sector, by identifying the territories with more facilities, such as schools, colleges, universities and public libraries. To do this, a quantitative index is created to determine whether the coverage of educational infrastructure at the departmental level is sufficient, taking into account not only the number of facilities, but also the population and the area of influence each one has. The above study is framed within a project of the University of the Andes called "Visible Infrastructure". The index is obtained through an analytic hierarchy process (AHP) and subsequently a linear equation that reflects the variables investigated. The validation of this index is performed through correlations and regressions with social, economic and cultural indicators determined by official entities. All the information on which the analysis is based is official and public. With the end of the armed conflict, it is necessary to focus the planning of public policies on closing the social gaps affecting the most vulnerable populations.

  8. Quantitative Analysis of Nail Polish Remover Using Nuclear Magnetic Resonance Spectroscopy Revisited

    ERIC Educational Resources Information Center

    Hoffmann, Markus M.; Caccamis, Joshua T.; Heitz, Mark P.; Schlecht, Kenneth D.

    2008-01-01

    Substantial modifications are presented for a previously described experiment using nuclear magnetic resonance (NMR) spectroscopy to quantitatively determine analytes in commercial nail polish remover. The revised experiment is intended for a second- or third-year laboratory course in analytical chemistry and can be conducted for larger laboratory…

  9. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies of the distribution of metallic elements in biological samples are among the most important issues in bioanalytical chemistry. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, this literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. New analytical technique for carbon dioxide absorption solvents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouryousefi, F.; Idem, R.O.

    2008-02-15

    The densities and refractive indices of two binary systems (water + MEA and water + MDEA) and three ternary systems (water + MEA + CO2, water + MDEA + CO2, and water + MEA + MDEA) used for carbon dioxide (CO2) capture were measured over the range of compositions of the aqueous alkanolamine(s) used for CO2 absorption at temperatures from 295 to 338 K. Experimental densities were modeled empirically, while the experimental refractive indices were modeled using well-established models from the known values of their pure-component densities and refractive indices. The density and Gladstone-Dale refractive index models were then used to obtain the compositions of unknown samples of the binary and ternary systems by simultaneous solution of the density and refractive index equations. The results from this technique were compared with HPLC (high-performance liquid chromatography) results, while a third independent technique (acid-base titration) was used to verify the results. The compositions obtained from the simple and easy-to-use refractive index/density technique were very comparable to those from the expensive and laborious HPLC/titration techniques, suggesting that the refractive index/density technique can replace existing methods for the analysis of fresh or nondegraded, CO2-loaded, single and mixed alkanolamine solutions.
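
    The core of the technique can be sketched for the ternary water + MEA + MDEA system: two measurements (density and refractive index) are solved simultaneously for the two unknown amine mass fractions. The pure-component constants, the volume-additive density rule, and the Gladstone-Dale mixing rule below are stand-in assumptions for illustration, not the study's fitted models.

    ```python
    # Recover amine mass fractions from measured density and refractive index
    # by solving the two mixing-rule equations simultaneously.
    import numpy as np
    from scipy.optimize import least_squares

    RHO = {"water": 997.0, "MEA": 1012.0, "MDEA": 1038.0}   # kg/m^3 (assumed)
    RI  = {"water": 1.3330, "MEA": 1.4539, "MDEA": 1.4694}  # n_D (assumed)

    def predict(w_mea, w_mdea):
        w = {"MEA": w_mea, "MDEA": w_mdea, "water": 1.0 - w_mea - w_mdea}
        rho = 1.0 / sum(w[c] / RHO[c] for c in w)            # volume additivity
        gd = sum(w[c] * (RI[c] - 1.0) / RHO[c] for c in w)   # Gladstone-Dale term
        return rho, 1.0 + gd * rho

    def composition(rho_meas, n_meas):
        def resid(x):
            rho, n = predict(*x)
            return [(rho - rho_meas) / rho_meas, (n - n_meas) / n_meas]
        # bounds keep the water fraction positive during the search
        return least_squares(resid, x0=[0.2, 0.2], bounds=([0, 0], [0.6, 0.6])).x

    rho0, n0 = predict(0.15, 0.30)     # synthetic "measured" sample
    print(composition(rho0, n0))       # ~ [0.15, 0.30]
    ```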

  11. Assessing the impact of natural policy experiments on socioeconomic inequalities in health: how to apply commonly used quantitative analytical methods?

    PubMed

    Hu, Yannan; van Lenthe, Frank J; Hoffmann, Rasmus; van Hedel, Karen; Mackenbach, Johan P

    2017-04-20

    The scientific evidence base for policies to tackle health inequalities is limited. Natural policy experiments (NPE) have drawn increasing attention as a means of evaluating the effects of policies on health. Several analytical methods can be used to evaluate the outcomes of NPEs in terms of average population health, but it is unclear whether they can also be used to assess the outcomes of NPEs in terms of health inequalities. The aim of this study therefore was to assess whether, and to demonstrate how, a number of commonly used analytical methods for the evaluation of NPEs can be applied to quantify the effect of policies on health inequalities. We identified seven quantitative analytical methods for the evaluation of NPEs: regression adjustment, propensity score matching, difference-in-differences analysis, fixed effects analysis, instrumental variable analysis, regression discontinuity and interrupted time-series. We assessed whether these methods can be used to quantify the effect of policies on the magnitude of health inequalities either by conducting a stratified analysis or by including an interaction term, and illustrated both approaches in a fictitious numerical example. All seven methods can be used to quantify the equity impact of policies on absolute and relative inequalities in health by conducting an analysis stratified by socioeconomic position, and all but one (propensity score matching) can be used to quantify equity impacts by inclusion of an interaction term between socioeconomic position and policy exposure. Methods commonly used in economics and econometrics for the evaluation of NPEs can also be applied to assess the equity impact of policies, and our illustrations provide guidance on how to do this appropriately. The low external validity of results from instrumental variable analysis and regression discontinuity makes these methods less desirable for assessing policy effects on population-level health inequalities. Increased use of these methods in future evaluations of natural policy experiments could strengthen the evidence base for policies to tackle health inequalities.
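
    As a sketch of one of the seven methods, the fictitious numerical example below (in the spirit of the article's own) estimates a difference-in-differences model with a three-way policy x period x socioeconomic-position interaction; the interaction coefficient is the equity impact. All variable names, data, and effect sizes are invented.

    ```python
    # Difference-in-differences with an interaction term for socioeconomic
    # position (SEP), so the policy's effect on the health gap is estimated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 4000
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),  # exposed region vs. control region
        "post": rng.integers(0, 2, n),     # before vs. after the policy
        "low_sep": rng.integers(0, 2, n),  # low socioeconomic position
    })
    # True model: +2 health units overall, +1.5 extra for the low-SEP group
    df["health"] = (70 - 5 * df.low_sep
                    + 2.0 * df.treated * df.post
                    + 1.5 * df.treated * df.post * df.low_sep
                    + rng.normal(0, 3, n))

    fit = smf.ols("health ~ treated * post * low_sep", data=df).fit()
    # The three-way term estimates the policy's equity impact (~1.5 here)
    print(fit.params["treated:post:low_sep"])
    ```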

  12. Comparison of Analytic Hierarchy Process, Catastrophe and Entropy techniques for evaluating groundwater prospect of hard-rock aquifer systems

    NASA Astrophysics Data System (ADS)

    Jenifer, M. Annie; Jha, Madan K.

    2017-05-01

    Groundwater is a treasured underground resource, which plays a central role in sustainable water management. However, being hidden and dynamic in nature, its sustainable development and management call for precise quantification of this precious resource at an appropriate scale. This study demonstrates the efficacy of three GIS-based multi-criteria decision analysis (MCDA) techniques, viz., Analytic Hierarchy Process (AHP), Catastrophe and Entropy, in evaluating groundwater potential through a case study in hard-rock aquifer systems. Using satellite imagery and relevant field data, eight thematic layers (rainfall, land slope, drainage density, soil, lineament density, geology, proximity to surface water bodies and elevation) of the factors having significant influence on groundwater occurrence were prepared. These thematic layers and their features were assigned suitable weights based on the conceptual frameworks of the AHP, Catastrophe and Entropy techniques, and were then integrated in the GIS environment to generate an integrated raster layer depicting the groundwater potential index of the study area. The three groundwater prospect maps thus yielded by these MCDA techniques were verified using a novel approach (the concept of 'Dynamic Groundwater Potential'). The validation results revealed that the groundwater potential predicted by the AHP technique has a markedly higher accuracy of 87%, compared to the Catastrophe (46% accuracy) and Entropy (51% accuracy) techniques. It is concluded that the AHP technique is the most reliable for the assessment of groundwater resources, followed by the Entropy method. The developed groundwater potential maps can serve as a scientific guideline for the cost-effective siting of wells and the effective planning of groundwater development at a catchment or basin scale.
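
    The AHP weighting step admits a compact sketch: criterion weights are the normalized principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. The 3 x 3 matrix below (e.g., rainfall vs. slope vs. geology) is illustrative only; the study weighted eight thematic layers.

    ```python
    # AHP: weights from the principal eigenvector of a pairwise comparison
    # matrix, plus Saaty's consistency ratio (CR < 0.1 is conventionally OK).
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)          # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                         # normalized criterion weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1) # consistency index
    ri = {3: 0.58, 4: 0.90, 8: 1.41}[n]  # Saaty's random index
    print("weights:", w.round(3), "CR:", round(ci / ri, 3))
    ```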

  13. Piezoelectric sensors based on molecular imprinted polymers for detection of low molecular mass analytes.

    PubMed

    Uludağ, Yildiz; Piletsky, Sergey A; Turner, Anthony P F; Cooper, Matthew A

    2007-11-01

    Biomimetic recognition elements employed for the detection of analytes are commonly based on proteinaceous affibodies, immunoglobulins, single-chain and single-domain antibody fragments or aptamers. The alternative supra-molecular approach using a molecularly imprinted polymer now has proven utility in numerous applications ranging from liquid chromatography to bioassays. Despite inherent advantages compared with biochemical/biological recognition (which include robustness, storage endurance and lower costs) there are few contributions that describe quantitative analytical applications of molecularly imprinted polymers for relevant small molecular mass compounds in real-world samples. There is, however, significant literature describing the use of low-power, portable piezoelectric transducers to detect analytes in environmental monitoring and other application areas. Here we review the combination of molecularly imprinted polymers as recognition elements with piezoelectric biosensors for quantitative detection of small molecules. Analytes are classified by type and sample matrix presentation and various molecularly imprinted polymer synthetic fabrication strategies are also reviewed.

  14. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term-rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term-rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  15. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders.

    PubMed

    Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J

    2014-10-01

    Quantitative phase microscopy (QPM) has recently emerged as a new powerful quantitative imaging technique well suited to noninvasively explore a transparent specimen with a nanometric axial sensitivity. In this review, we expose the recent developments of quantitative phase-digital holographic microscopy (QP-DHM). Quantitative phase-digital holographic microscopy (QP-DHM) represents an important and efficient quantitative phase method to explore cell structure and dynamics. In a second part, the most relevant QPM applications in the field of cell biology are summarized. A particular emphasis is placed on the original biological information, which can be derived from the quantitative phase signal. In a third part, recent applications obtained, with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high risk developmental trajectories for psychiatric disorders, are discussed.

  16. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  17. An overview on forensic analysis devoted to analytical chemists.

    PubMed

    Castillo-Peinado, L S; Luque de Castro, M D

    2017-05-15

    The main aim of the present article is to show analytical chemists interested in forensic analysis the world they will face should they decide to become forensic analytical chemists. With this purpose, the most outstanding aspects of forensic analysis in dealing with sampling (involving both bodily and non-bodily samples), sample preparation, and the analytical equipment used in the detection, identification and quantitation of key sample components are critically discussed. The role of the great omics in forensic analysis, and the growing role of the youngest of them (metabolomics), are also discussed. The foreseeable role of integrative omics is also outlined. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  19. Quantitation of cocaine and cocaethylene in small volumes of rat whole blood using gas chromatography-mass spectrometry.

    PubMed

    Burdick, J D; Boni, R L; Fochtman, F W

    1997-05-01

    A simple solid-phase extraction (SPE) technique combined with gas chromatography-mass spectrometry (GC/MS) operated in selected ion monitoring (SIM) mode is described for the quantitation of cocaine and cocaethylene in small samples (250 microliters) of rat whole blood. Use of (N-[2H3C])-cocaine and (N-[2H3C])-cocaethylene internal standards resulted in high sensitivity and selectivity for this analytical method. Analysis was performed using a Hewlett-Packard 5890 GC equipped with a 7673A Automatic Liquid Sampler linked to a Hewlett-Packard 5972 Mass Selective Detector. Separation of analytes was accomplished on a cross-linked methyl silicone gum capillary column (Ultra 1: 12 m x 0.2 mm i.d., 0.33 micron film thickness). Linearity was established over a wide range of concentrations (5.0-2000.0 ng/mL) with good precision. Limits of detection (LOD) were 1.0 and 2.0 ng/mL for cocaine and cocaethylene, respectively. This analytical method was designed for use in pharmacokinetic experiments studying the formation of cocaethylene following ethanol pretreatment in rats administered cocaine.
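
    A minimal sketch of the quantitation step, with invented peak areas: the analyte/deuterated-internal-standard area ratio is calibrated against known concentrations and an unknown is read off the fitted line. (Weighted regression is often preferred over such a wide range; an unweighted fit is used here for brevity.)

    ```python
    # Internal-standard calibration: area ratio (analyte / d3-labeled IS)
    # versus concentration, then inversion for an unknown sample.
    import numpy as np

    conc = np.array([5.0, 50.0, 250.0, 1000.0, 2000.0])  # ng/mL calibrators
    ratio = np.array([0.011, 0.105, 0.52, 2.04, 4.10])   # invented area ratios

    slope, intercept = np.polyfit(conc, ratio, 1)

    def quantify(area_ratio):
        """Concentration (ng/mL) from a measured analyte/IS area ratio."""
        return (area_ratio - intercept) / slope

    print(f"{quantify(1.37):.0f} ng/mL")
    ```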

  1. Discrimination between biologically relevant calcium phosphate phases by surface-analytical techniques

    NASA Astrophysics Data System (ADS)

    Kleine-Boymann, Matthias; Rohnke, Marcus; Henss, Anja; Peppler, Klaus; Sann, Joachim; Janek, Juergen

    2014-08-01

    The spatially resolved phase identification of biologically relevant calcium phosphate phases (CPPs) in bone tissue is essential for the elucidation of bone remodeling mechanisms and for the diagnosis of bone diseases. Analytical methods with high spatial resolution for the discrimination between chemically quite close phases are rare. Therefore, the applicability of state-of-the-art ToF-SIMS, XPS and EDX as chemically specific techniques was investigated. The eight CPPs hydroxyapatite (HAP), β-tricalcium phosphate (β-TCP), α-tricalcium phosphate (α-TCP), octacalcium phosphate (OCP), dicalcium phosphate dihydrate (DCPD), dicalcium phosphate (DCP), monocalcium phosphate (MCP) and amorphous calcium phosphate (ACP) were either high-purity commercial materials or synthesized in-house. The phase purity was confirmed by XRD analysis. All eight CPPs show different mass spectra, and the phases can be discriminated by applying principal component analysis to the mass spectrometric data. The Ca/P ratios of all phosphates were determined by XPS and EDX. With both methods some CPPs can be distinguished, but the obtained Ca/P ratios deviate systematically from their theoretical values. It is necessary in any case to determine a calibration curve, or the ZAF values, from appropriate standards. In XPS, the O(1s) satellite signals also correlate with the CPP composition. Angle-resolved and long-term XPS measurements of HAP clearly prove that there is no phosphate excess at the surface. Decomposition due to X-ray irradiation was not observed.

  2. Quantitative Determination of Fluorine Content in Blends of Polylactide (PLA)–Talc Using Near Infrared Spectroscopy

    PubMed Central

    Tamburini, Elena; Tagliati, Chiara; Bonato, Tiziano; Costa, Stefania; Scapoli, Chiara; Pedrini, Paola

    2016-01-01

    Near-infrared spectroscopy (NIRS) has been widely used for quantitative and/or qualitative determination of a wide range of matrices. The objective of this study was to develop a NIRS method for the quantitative determination of fluorine content in polylactide (PLA)-talc blends. A blending profile was obtained by mixing different amounts of PLA granules and talc powder. The calibration model was built by correlating wet chemical data (alkali digestion method) and NIR spectra. Using the FT (Fourier Transform)-NIR technique, a Partial Least Squares (PLS) regression model was set up over a concentration interval from 0 ppm (pure PLA) to 800 ppm (pure talc). The fluorine content prediction (R²cal = 0.9498; standard error of calibration, SEC = 34.77; standard error of cross-validation, SECV = 46.94) was then externally validated by means of a further 15 independent samples (R²ext.val = 0.8955; root mean standard error of prediction, RMSEP = 61.08). A positive relationship between an inorganic component such as fluorine and the NIR signal was demonstrated, and used to obtain quantitative analytical information from the spectra. PMID:27490548
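
    The calibration workflow can be sketched with synthetic stand-ins for the FT-NIR spectra and reference fluorine values; the component count, noise level, and resulting figures of merit below are illustrative, not the study's.

    ```python
    # PLS calibration of spectra against reference fluorine content, with
    # cross-validation, mirroring the SEC/SECV/RMSEP-style evaluation above.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import r2_score, mean_squared_error

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 60, 200
    fluorine = rng.uniform(0, 800, n_samples)   # ppm, from the reference method
    basis = rng.normal(size=n_wavelengths)      # stand-in spectral signature
    X = np.outer(fluorine, basis) + rng.normal(0, 5.0, (n_samples, n_wavelengths))

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, fluorine, cv=10).ravel()
    print("R2(CV):", round(r2_score(fluorine, y_cv), 4),
          "SECV:", round(mean_squared_error(fluorine, y_cv) ** 0.5, 2))

    pls.fit(X, fluorine)   # final model, to be validated on external samples
    ```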

  3. ICP-MS: Analytical Method for Identification and Detection of Elemental Impurities.

    PubMed

    Mittal, Mohini; Kumar, Kapil; Anghore, Durgadas; Rawal, Ravindra K

    2017-01-01

    The aim of this article is to review and discuss ICP-MS, a quantitative analytical method currently used for the quality control of pharmaceutical products. The ICP-MS technique has several applications, such as the determination of single elements, multi-element analysis of synthetic drugs, heavy metals in environmental water, and the trace-element content of selected fertilizers and dairy manures. ICP-MS is also used for the determination of toxic and essential elements in different varieties of food samples and of metal pollutants present in the environment. Pharmaceuticals may generate impurities at various stages of development, transportation and storage, which can make them unsafe to administer. Thus, it is essential that these impurities be detected and quantified. ICP-MS plays an important role in the identification and detection of elemental impurities. Copyright © Bentham Science Publishers.

  4. Determination of a quantitative parameter to evaluate swimming technique based on the maximal tethered swimming test.

    PubMed

    Soncin, Rafael; Mezêncio, Bruno; Ferreira, Jacielle Carolina; Rodrigues, Sara Andrade; Huebner, Rudolf; Serrão, Julio Cerca; Szmuchrowski, Leszek

    2017-06-01

    The aim of this study was to propose a new force parameter, associated with swimmers' technique and performance. Twelve swimmers performed five repetitions of 25 m sprint crawl and a tethered swimming test with maximal effort. The parameters calculated were: the mean swimming velocity for crawl sprint, the mean propulsive force of the tethered swimming test as well as an oscillation parameter calculated from force fluctuation. The oscillation parameter evaluates the force variation around the mean force during the tethered test as a measure of swimming technique. Two parameters showed significant correlations with swimming velocity: the mean force during the tethered swimming (r = 0.85) and the product of the mean force square root and the oscillation (r = 0.86). However, the intercept coefficient was significantly different from zero only for the mean force, suggesting that although the correlation coefficient of the parameters was similar, part of the mean velocity magnitude that was not associated with the mean force was associated with the product of the mean force square root and the oscillation. Thus, force fluctuation during tethered swimming can be used as a quantitative index of swimmers' technique.
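
    A sketch of the two force parameters follows. Because the abstract does not spell out the oscillation formula, a normalized RMS fluctuation around the mean force is assumed here purely for illustration, applied to a synthetic force-time series.

    ```python
    # Tethered-swimming parameters from a force-time series: mean propulsive
    # force, an (assumed) normalized RMS oscillation, and their composite
    # sqrt(mean force) * oscillation discussed above.
    import numpy as np

    def tethered_parameters(force):
        f_mean = force.mean()
        oscillation = np.sqrt(np.mean((force - f_mean) ** 2)) / f_mean
        composite = np.sqrt(f_mean) * oscillation
        return f_mean, oscillation, composite

    t = np.linspace(0.0, 30.0, 3000)                 # 30 s maximal tethered test
    force = 180 + 60 * np.sin(2 * np.pi * 0.75 * t)  # ~0.75 Hz stroke rhythm (N)
    print(tethered_parameters(force))
    ```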

  5. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs), and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice, with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry and modernized the range of tools applied to environmental analysis.

  6. Obstetric Neuraxial Drug Administration Errors: A Quantitative and Qualitative Analytical Review.

    PubMed

    Patel, Santosh; Loveridge, Robert

    2015-12-01

    Drug administration errors in obstetric neuraxial anesthesia can have devastating consequences. Although fully recognizing that they represent "only the tip of the iceberg," published case reports/series of these errors were reviewed in detail with the aim of estimating the frequency and the nature of these errors. We identified case reports and case series from MEDLINE and performed a quantitative analysis of the involved drugs, error setting, source of error, the observed complications, and any therapeutic interventions. We subsequently performed a qualitative analysis of the human factors involved and proposed modifications to practice. Twenty-nine cases were identified. Various drugs were given in error, but no direct effects on the course of labor, mode of delivery, or neonatal outcome were reported. Four maternal deaths from the accidental intrathecal administration of tranexamic acid were reported, all occurring after delivery of the fetus. A range of hemodynamic and neurologic signs and symptoms were noted, but the most commonly reported complication was the failure of the intended neuraxial anesthetic technique. Several human factors were present; the most common factors were drug storage issues and similar drug appearance. Four practice recommendations were identified as being likely to have prevented the errors. The reported errors exposed latent conditions within health care systems. We suggest that the implementation of the following processes may decrease the risk of these types of drug errors: (1) Careful reading of the label on any drug ampule or syringe before the drug is drawn up or injected; (2) labeling all syringes; (3) checking labels with a second person or a device (such as a barcode reader linked to a computer) before the drug is drawn up or administered; and (4) use of non-Luer lock connectors on all epidural/spinal/combined spinal-epidural devices. Further study is required to determine whether routine use of these processes will reduce drug administration errors.

  7. Assessment of cleaning and disinfection in Salmonella-contaminated poultry layer houses using qualitative and semi-quantitative culture techniques.

    PubMed

    Wales, Andrew; Breslin, Mark; Davies, Robert

    2006-09-10

    Salmonella infection of laying flocks in the UK is predominantly a problem of the persistent contamination of layer houses and associated wildlife vectors by Salmonella Enteritidis. Methods for its control and elimination include effective cleaning and disinfection of layer houses between flocks, and it is important to be able to measure the success of such decontamination. A method for the environmental detection and semi-quantitative enumeration of salmonellae was used and compared with a standard qualitative method in 12 Salmonella-contaminated caged layer houses before and after cleaning and disinfection. The quantitative technique proved to have comparable sensitivity to the standard method, and additionally provided insights into the numerical Salmonella challenge that replacement flocks would encounter. Elimination of S. Enteritidis was not achieved in any of the premises examined; in some, substantial reductions in the prevalence and numbers of salmonellae were demonstrated, whilst in others an increase in contamination was observed after cleaning and disinfection. Particular problems with feeders and wildlife vectors were highlighted. The use of a quantitative method assisted the identification of problem areas, such as those with a high initial bacterial load or those experiencing only a modest reduction in bacterial count following decontamination.

  8. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  9. Trace metal speciation in natural waters: Computational vs. analytical

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    1996-01-01

    Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation, such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various model assumptions.

  10. Sensor arrays for detecting analytes in fluids

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Freund, Michael S. (Inventor)

    1996-01-01

    Chemical sensors for detecting analytes in fluids comprise first and second conductive elements (e.g. electrical leads) electrically coupled to and separated by a chemically sensitive resistor which provides an electrical path between the conductive elements. The resistor comprises a plurality of alternating nonconductive regions (comprising a nonconductive organic polymer) and conductive regions (comprising a conductive material) transverse to the electrical path. The resistor provides a different resistance between the conductive elements when contacted with a fluid comprising a chemical analyte at a first concentration than when contacted with a fluid comprising the chemical analyte at a second, different concentration. Arrays of such sensors are constructed with at least two sensors having different chemically sensitive resistors providing dissimilar such differences in resistance. Variability in chemical sensitivity from sensor to sensor is provided by qualitatively or quantitatively varying the composition of the conductive and/or nonconductive regions. An electronic nose for detecting an analyte in a fluid may be constructed by using such arrays in conjunction with an electrical measuring device electrically connected to the conductive elements of each sensor.

  11. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling ("SBM") was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or "QCP"), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model estimates are seen to disagree with the monitored signals well before conventional monitoring thresholds are reached, providing an earlier indication of the developing instability.
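
    A minimal sketch in the spirit of a kernel-based similarity model follows: the current multivariate observation is reconstructed as a similarity-weighted blend of stored normal-state exemplars, and the residual serves as an instability indicator. The Gaussian kernel, bandwidth, and data are illustrative assumptions, not the internals of the SBM platform.

    ```python
    # Similarity-weighted reconstruction of a multivariate observation from
    # normal-state exemplars; a large residual suggests departure from the
    # learned physiologic envelope.
    import numpy as np

    def similarity_estimate(memory, x, bandwidth=1.0):
        """memory: (n_exemplars, n_signals); x: (n_signals,)."""
        d2 = ((memory - x) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        return (w / w.sum()) @ memory      # reconstruction of x

    rng = np.random.default_rng(2)
    memory = rng.normal(0, 1, (200, 4))    # exemplars from the stable period
    healthy = rng.normal(0, 1, 4)
    drifting = healthy + np.array([0.0, 0.0, 2.5, -2.0])  # e.g. tamponade drift

    for x in (healthy, drifting):
        residual = np.linalg.norm(x - similarity_estimate(memory, x))
        print(round(residual, 2))          # larger residual -> possible instability
    ```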

  12. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  13. Molecular Rotors for Universal Quantitation of Nanoscale Hydrophobic Interfaces in Microplate Format.

    PubMed

    Bisso, Paul W; Tai, Michelle; Katepalli, Hari; Bertrand, Nicolas; Blankschtein, Daniel; Langer, Robert

    2018-01-10

    Hydrophobic self-assembly pairs diverse chemical precursors and simple formulation processes to access a vast array of functional colloids. Exploration of this design space, however, is stymied by lack of broadly general, high-throughput colloid characterization tools. Here, we show that a narrow structural subset of fluorescent, zwitterionic molecular rotors, dialkylaminostilbazolium sulfonates [DASS] with intermediate-length alkyl tails, fills this major analytical void by quantitatively sensing hydrophobic interfaces in microplate format. DASS dyes supersede existing interfacial probes by avoiding off-target fluorogenic interactions and dye aggregation while preserving hydrophobic partitioning strength. To illustrate the generality of this approach, we demonstrate (i) a microplate-based technique for measuring mass concentration of small (20-200 nm), dilute (submicrogram sensitivity) drug delivery nanoparticles; (ii) elimination of particle size, surfactant chemistry, and throughput constraints on quantifying the complex surfactant/metal oxide adsorption isotherms critical for environmental remediation and enhanced oil recovery; and (iii) more reliable self-assembly onset quantitation for chemically and structurally distinct amphiphiles. These methods could streamline the development of nanotechnologies for a broad range of applications.

  14. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.

  15. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
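
    The standard addition method named above reduces to a short computation: fit the signal against the added analyte concentration and take the magnitude of the x-intercept as the original sample concentration. The peak areas below are invented for illustration.

    ```python
    # Standard addition: extrapolate the calibration line to signal = 0; the
    # |x-intercept| is the retinol concentration originally in the sample.
    import numpy as np

    added = np.array([0.0, 2.0, 4.0, 6.0])       # ug/mL retinol added
    signal = np.array([11.9, 20.1, 27.8, 36.2])  # HPLC peak areas (invented)

    slope, intercept = np.polyfit(added, signal, 1)
    c_sample = intercept / slope                 # |x-intercept| of the fit
    print(f"retinol in sample: {c_sample:.2f} ug/mL")
    ```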

  16. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients

    PubMed Central

    Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.

    2015-01-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872
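
    The quantification step common to such qPCR assays can be sketched as follows: fit Ct against log10 of a dilution series of known parasite loads, derive the amplification efficiency from the slope, and invert the line for unknown samples. The Ct values below are invented, not the study's data.

    ```python
    # qPCR standard curve: Ct = slope*log10(load) + intercept; efficiency
    # follows from the slope, and unknowns are quantified by inversion.
    import numpy as np

    loads = np.array([100.0, 10.0, 1.0, 0.1])  # parasite equivalents/mL
    ct = np.array([24.1, 27.5, 30.8, 34.3])    # measured cycle thresholds

    slope, intercept = np.polyfit(np.log10(loads), ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0  # ~1.0 means 100% efficiency
    print(f"amplification efficiency: {efficiency:.2f}")

    def quantify(ct_unknown):
        """Parasite equivalents/mL from a measured Ct."""
        return 10.0 ** ((ct_unknown - intercept) / slope)

    print(f"{quantify(29.0):.2f} parasite equivalents/mL")
    ```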

  17. Multidimensional NMR approaches towards highly resolved, sensitive and high-throughput quantitative metabolomics.

    PubMed

    Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick

    2017-02-01

    Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Quantitative detection of caffeine in human skin by confocal Raman spectroscopy--A systematic in vitro validation study.

    PubMed

    Franzen, Lutz; Anderski, Juliane; Windbergs, Maike

    2015-09-01

    For rational development and evaluation of dermal drug delivery, knowledge of the rate and extent of substance penetration into human skin is essential. However, current analytical procedures are destructive, labor-intensive, and lack a defined spatial resolution. In this context, confocal Raman microscopy bears the potential to overcome current limitations in drug depth profiling. Confocal Raman microscopy has already proved its suitability for the acquisition of qualitative penetration profiles, but a comprehensive investigation regarding its suitability for quantitative measurements inside human skin is still missing. In this work, we present a systematic validation study to deploy confocal Raman microscopy for quantitative drug depth profiling in human skin. After validating our Raman microscopic setup, we successfully established an experimental procedure that allows correlating the Raman signal of a model drug with its controlled concentration in human skin. To overcome current drawbacks in drug depth profiling, we evaluated different modes of peak correlation for quantitative Raman measurements and offer a suitable operating procedure for quantitative drug depth profiling in human skin. In conclusion, we successfully demonstrate the potential of confocal Raman microscopy for quantitative drug depth profiling in human skin as a valuable alternative to destructive state-of-the-art techniques. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Ultrasonic Nondestructive Evaluation Techniques Applied to the Quantitative Characterization of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1998-01-01

    An overall goal of this research has been to enhance our understanding of the scientific principles necessary to develop advanced ultrasonic nondestructive techniques for the quantitative characterization of advanced composite structures. To this end, we have investigated a thin woven composite (5-harness biaxial weave). We have studied the effects that variations of the physical parameters of the experimental setup can have on the ultrasonic determination of the material properties of this thin composite. In particular, we have considered variation of the nominal center frequency and the f-number of the transmitting transducer, which in turn address issues such as focusing and beam spread of ultrasonic fields. This study employed a planar, two-dimensional receiving pseudo-array that permitted investigation of the diffraction patterns of ultrasonic fields. Distortion of the ultrasonic field due to the spatial anisotropy of the thin composite prompted investigation of the phenomenon of phase cancellation at the face of a finite-aperture, piezoelectric receiver. We have performed phase-sensitive and phase-insensitive analyses to provide a measure of the amount of phase cancellation at the face of a finite-aperture, piezoelectric receiver. The pursuit of robust measurements of received energy (i.e., those not susceptible to phase cancellation at the face of a finite-aperture, piezoelectric receiver) supports the development of robust techniques to determine material properties from measured ultrasonic parameters.
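
    The phase-sensitive versus phase-insensitive comparison admits a compact numerical sketch: sum the complex element signals before detection (phase-sensitive, as one large aperture would) or sum per-element energies (phase-insensitive); the ratio quantifies the cancellation loss. The aberration statistics are an assumed stand-in for the distortion introduced by the woven composite.

    ```python
    # Phase cancellation at a finite aperture, estimated across a 2-D
    # receiving pseudo-array of unit-amplitude elements.
    import numpy as np

    rng = np.random.default_rng(3)
    n_elem = 16 * 16                        # flattened 2-D pseudo-array
    phase = rng.normal(0.0, 1.2, n_elem)    # aberration across the aperture (rad)
    signals = np.exp(1j * phase)            # per-element complex signals

    e_ps = np.abs(signals.mean()) ** 2      # phase-sensitive (coherent) energy
    e_pi = np.mean(np.abs(signals) ** 2)    # phase-insensitive energy
    print(f"phase cancellation loss: {10 * np.log10(e_ps / e_pi):.1f} dB")
    ```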

  20. Engaging Business Students in Quantitative Skills Development

    ERIC Educational Resources Information Center

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  1. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  2. Analytical Microscopy and Imaging Science | Materials Science | NREL

    Science.gov Websites

    The Analytical Microscopy and Imaging Science group in NREL's Materials Science Center offers Electron Probe Microanalysis (EPMA) for quantitative compositional analysis, a technique that relies on wavelength-dispersive spectroscopy. Dr. Mowafak Al-Jassim manages the group within the Materials Science Center.

  3. New developments of X-ray fluorescence imaging techniques in laboratory

    NASA Astrophysics Data System (ADS)

    Tsuji, Kouichi; Matsuno, Tsuyoshi; Takimoto, Yuki; Yamanashi, Masaki; Kometani, Noritsugu; Sasaki, Yuji C.; Hasegawa, Takeshi; Kato, Shuichi; Yamada, Takashi; Shoji, Takashi; Kawahara, Naoki

    2015-11-01

    X-ray fluorescence (XRF) analysis is a well-established analytical technique with a long research history. Many applications have been reported in various fields, such as the environmental, archeological, biological, and forensic sciences as well as industry. This is because XRF has the unique advantage of being a nondestructive analytical tool with good precision for quantitative analysis. Recent advances in XRF analysis have been realized by the development of new x-ray optics and x-ray detectors. Advanced x-ray focusing optics enable the production of a micro x-ray beam, leading to micro-XRF analysis and XRF imaging. A confocal micro-XRF technique has been applied for the visualization of elemental distributions inside samples. This technique was applied to liquid samples and to the monitoring of chemical reactions such as the metal corrosion of steel samples in NaCl solutions. In addition, principal component analysis was applied to reduce the background intensity in XRF spectra obtained during XRF mapping, leading to improved spatial resolution of confocal micro-XRF images. In parallel, the authors have proposed a wavelength-dispersive XRF (WD-XRF) imaging spectrometer for fast elemental imaging. A new two-dimensional x-ray detector, the Pilatus detector, was applied for WD-XRF imaging. Fast XRF imaging in 1 s or even less was demonstrated for Euro coins and industrial samples. This review introduces these recent advances in XRF imaging, especially in a laboratory setting.

  4. Analytical model and error analysis of arbitrary phasing technique for bunch length measurement

    NASA Astrophysics Data System (ADS)

    Chen, Qushan; Qin, Bin; Chen, Wei; Fan, Kuanjun; Pei, Yuanji

    2018-05-01

    An analytical model of an RF phasing method using arbitrary phase scanning for bunch length measurement is reported. We set up a statistical model instead of a linear chirp approximation to analyze the energy modulation process. It is found that, assuming a short bunch (σ_φ/2π → 0) and small relative energy spread (σ_γ/γ_r → 0), the energy spread (Y = σ_γ²) at the exit of the traveling wave linac has a parabolic relationship with the cosine value of the injection phase (X = cos φ_r|_{z=0}), i.e., Y = AX² + BX + C. Analogous to quadrupole strength scanning for emittance measurement, this phase scanning method can be used to obtain the bunch length by measuring the energy spread at different injection phases. The injection phases can be randomly chosen, which is significantly different from the commonly used zero-phasing method. Further, the systematic error of the reported method, such as the influence of the space charge effect, is analyzed. This technique will be especially useful at low energies when the beam quality is dramatically degraded and is hard to measure using the zero-phasing method.
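
    As a minimal illustration of the parabola fit at the heart of this method (all numbers below are hypothetical, not from the paper), the scan data (X, Y) can be fit by least squares to recover A, B, and C; the bunch length then follows from A once the machine constants (RF wavenumber, accelerating gradient) are known:

```python
import numpy as np

# Synthetic scan: energy spread squared vs. cosine of injection phase.
# Coefficients are illustrative placeholders, not values from the paper.
rng = np.random.default_rng(0)
X = np.cos(rng.uniform(0, 2 * np.pi, 25))     # randomly chosen injection phases
A_true, B_true, C_true = 4.0e-6, -1.0e-6, 2.0e-7
Y = A_true * X**2 + B_true * X + C_true + rng.normal(0, 1e-8, X.size)

# Least-squares parabola fit Y = A X^2 + B X + C
A, B, C = np.polyfit(X, Y, 2)
print(f"A={A:.3e}, B={B:.3e}, C={C:.3e}")
# In the reported method, the bunch length enters through A (together with
# the RF wavenumber and gradient), so sigma_z follows from A once those
# machine constants are known.
```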

  5. Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective

    NASA Astrophysics Data System (ADS)

    Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.

    2009-04-01

    Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed for designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of the objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects resulting from suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, to uncover conflicting objectives, to support the design of better alternatives, and to improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding necessary steps to be taken for rational decision-making in river management, to guidelines about the use of decision-analytical techniques for performing these steps, but also to new insights about the application of decision-analytical techniques in general. In particular, the consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches leads to favoring measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, this results in a river basin oriented management concept as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the

  6. Nuclear magnetic resonance and high-performance liquid chromatography techniques for the characterization of bioactive compounds from Humulus lupulus L. (hop).

    PubMed

    Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica

    2018-06-01

    Humulus lupulus L. (hop) represents one of the most cultivated crops, as it is a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, increasing interest in this plant in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for hop samples by means of the two aforementioned techniques highlighted that the amount of bioactive compounds was slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.

  7. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  8. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  9. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and underpinned analytics frameworks also have to consider energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods

  10. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and

  11. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  12. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
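
    A minimal sketch of the ratiometric-comparison approach described above (all intensities and the standard's copy number below are hypothetical): the unknown's copy number is the background-corrected intensity ratio multiplied by the standard's known count:

```python
import numpy as np

# Hypothetical background-corrected fluorescence intensities (a.u.)
spot_unknown = np.array([1520.0, 1480.0, 1555.0])   # fluorescently tagged protein of interest
spot_standard = np.array([760.0, 790.0, 775.0])     # standard of known copy number

COPIES_STANDARD = 120  # assumed known stoichiometry of the standard

# Ratiometric comparison: copy number scales with mean intensity ratio
ratio = spot_unknown.mean() / spot_standard.mean()
copies_unknown = ratio * COPIES_STANDARD
print(f"Estimated copy number: {copies_unknown:.0f}")
```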

  13. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and stage, it is necessary to quantify analyte levels ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, a technique is needed that has a large dynamic range and the ability to detect extremely low levels of target analytes. We report such a technique based on dielectrophoresis, capable of quantifying target analytes down to a few thousands of molecules (~zmol).

  14. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed over the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity binding to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interaction occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific binding. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. An analytical technique for predicting the characteristics of a flexible wing equipped with an active flutter-suppression system and comparison with wind-tunnel data

    NASA Technical Reports Server (NTRS)

    Abel, I.

    1979-01-01

    An analytical technique for predicting the performance of an active flutter-suppression system is presented. This technique is based on the use of an interpolating function to approximate the unsteady aerodynamics. The resulting equations are formulated in terms of linear, ordinary differential equations with constant coefficients. This technique is then applied to an aeroelastic model wing equipped with an active flutter-suppression system. Comparisons between wind-tunnel data and analysis are presented for the wing both with and without active flutter suppression. Results indicate that the wing flutter characteristics without flutter suppression can be predicted very well but that a more adequate model of wind-tunnel turbulence is required when the active flutter-suppression system is used.

  16. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  17. Microextraction techniques at the analytical laboratory: an efficient way for determining low amounts of residual insecticides in soils

    NASA Astrophysics Data System (ADS)

    Viñas, Pilar; Navarro, Tania; Campillo, Natalia; Fenoll, Jose; Garrido, Isabel; Cava, Juana; Hernandez-Cordoba, Manuel

    2017-04-01

    Microextraction techniques allow sensitive measurements of pollutants to be carried out by means of instrumentation commonly available at the analytical laboratory. This communication reports our studies focused on the determination of pyrethroid insecticides in polluted soils. These chemicals are synthetic analogues of pyrethrum widely used for pest control in agricultural and household applications. Because of their properties, pyrethroids tend to adsorb strongly to soil particles and organic matter. Although they are considered pesticides of low toxicity for humans, long-term exposure may damage the immune and neurological systems. The procedure studied here is based on dispersive liquid-liquid microextraction (DLLME), and permits the determination of fifteen pyrethroid compounds (allethrin, resmethrin, tetramethrin, bifenthrin, fenpropathrin, cyhalothrin, acrinathrin, permethrin, λ-cyfluthrin, cypermethrin, flucythrinate, fenvalerate, esfenvalerate, τ-fluvalinate, and deltamethrin) in soil samples using gas chromatography with mass spectrometry (GC-MS). The analytes were first extracted from the soil samples (4 g) by treatment with 2 mL of acetonitrile, 2 mL of water and 0.5 g of NaCl. The enriched organic phase (approximately 0.8 mL) was separated by centrifugation, and this solution was used as the dispersant in a DLLME process. The analytes did not need to be derivatized before their injection into the chromatographic system, due to their volatility and thermal stability. The identification of the different pyrethroids was carried out based on their retention times and mass spectra, considering the m/z values of the different fragments and their relative abundances. The detection limits were in the 0.2-23 ng g-1 range, depending on the analyte and the sample under analysis. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) and to the Spanish MINECO (Project

  18. Rediscovery and Revival of Analytical Refractometry for Protein Determination: Recombining Simplicity With Accuracy in the Digital Era.

    PubMed

    Anderle, Heinz; Weber, Alfred

    2016-03-01

    Among "vintage" methods of protein determination, quantitative analytical refractometry has received far less attention than well-established pharmacopoeial techniques based on protein nitrogen content, such as Dumas combustion (1831) and Kjeldahl digestion (1883). Protein determination by quantitative refractometry dates back to 1903 and has been extensively investigated and characterized in the following 30 years, but has since vanished into a few niche applications that may not require the degree of accuracy and precision essential for pharmaceutical analysis. However, because high-resolution and precision digital refractometers have replaced manual instruments, reducing time and resource consumption, the method appears particularly attractive from an economic, ergonomic, and environmental viewpoint. The sample solution can be measured without dilution or other preparation procedures than the separation of the protein-free matrix by ultrafiltration, which might even be omitted for a constant matrix and excipient composition. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  19. Quantitative Measurement of Local Infrared Absorption and Dielectric Function with Tip-Enhanced Near-Field Microscopy.

    PubMed

    Govyadinov, Alexander A; Amenabar, Iban; Huth, Florian; Carney, P Scott; Hillenbrand, Rainer

    2013-05-02

    Scattering-type scanning near-field optical microscopy (s-SNOM) and Fourier transform infrared nanospectroscopy (nano-FTIR) are emerging tools for nanoscale chemical material identification. Here, we push s-SNOM and nano-FTIR one important step further by enabling them to quantitatively measure local dielectric constants and infrared absorption. Our technique is based on an analytical model, which allows for a simple inversion of the near-field scattering problem. It yields the dielectric permittivity and absorption of samples with 2 orders of magnitude improved spatial resolution compared to far-field measurements and is applicable to a large class of samples including polymers and biological matter. We verify the capabilities by determining the local dielectric permittivity of a PMMA film from nano-FTIR measurements, which is in excellent agreement with far-field ellipsometric data. We further obtain local infrared absorption spectra with unprecedented accuracy in peak position and shape, which is the key to quantitative chemometrics on the nanometer scale.

  20. Aerodynamic measurement techniques. [laser based diagnostic techniques

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    The laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were exploited to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  1. Post-translational quantitation by SRM/MRM: applications in cardiology.

    PubMed

    Gianazza, Erica; Banfi, Cristina

    2018-06-04

    Post-translational modifications (PTMs) have an important role in the regulation of protein function, localization and interaction with other molecules. PTMs exert dynamic control of proteins in both physiological and pathological conditions. The study of disease-specific PTMs allows identifying potential biomarkers and developing effective drugs. Enrichment techniques combined with high-resolution MS/MS analysis provide attractive results on PTM characterization. Selected reaction monitoring/multiple reaction monitoring (SRM/MRM) is a powerful targeted assay for the quantitation and validation of PTMs in complex biological samples. Areas covered: The most frequent PTMs are described in terms of biological role and the analytical methods commonly used to detect them. The applications of SRM/MRM for the absolute quantitation of PTMs are reported, and a specific section is focused on PTM detection in proteins that are involved in the cardiovascular system and heart diseases. Expert commentary: PTM characterization in relation to disease pathology is still in progress, but targeted proteomics by LC-MS/MS has significantly advanced knowledge in recent years. Advances in enrichment strategies and software tools will facilitate the interpretation of high PTM complexity. Promising studies confirm the great potential of SRM/MRM to study PTMs in the cardiovascular field, and PTMomics could be very useful in a clinical perspective.

  2. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems of existing methods for determining which technologically interlinked construction and installation processes can be combined are considered under the modern construction conditions of various facilities. The necessity of identifying common parameters that characterize the nature of interaction among all technologically related construction and installation processes is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research was a quantitative evaluation of the interaction of construction and installation processes, expressed as the minimum technologically necessary volume of a preceding process that allows one to plan and organize the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key for wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  3. Sampling of illicit drugs for quantitative analysis--part II. Study of particle size and its influence on mass reduction.

    PubMed

    Bovens, M; Csesztregi, T; Franc, A; Nagy, J; Dujourdy, L

    2014-01-01

    The basic goal in sampling for the quantitative analysis of illicit drugs is to maintain the average concentration of the drug in the material from its original seized state (the primary sample) all the way through to the analytical sample, where the effect of particle size is most critical. The size of the largest particles of different authentic illicit drug materials, in their original state and after homogenisation, using manual or mechanical procedures, was measured using a microscope with a camera attachment. The comminution methods employed included pestle and mortar (manual) and various ball and knife mills (mechanical). The drugs investigated were amphetamine, heroin, cocaine and herbal cannabis. It was shown that comminution of illicit drug materials using these techniques reduces the nominal particle size from approximately 600 μm down to between 200 and 300 μm. It was demonstrated that the choice of 1 g increments for the primary samples of powdered drugs and cannabis resin, which were used in the heterogeneity part of our study (Part I) was correct for the routine quantitative analysis of illicit seized drugs. For herbal cannabis we found that the appropriate increment size was larger. Based on the results of this study we can generally state that: An analytical sample weight of between 20 and 35 mg of an illicit powdered drug, with an assumed purity of 5% or higher, would be considered appropriate and would generate an RSD_sampling in the same region as the RSD_analysis for a typical quantitative method of analysis for the most common, powdered, illicit drugs. For herbal cannabis, with an assumed purity of 1% THC (tetrahydrocannabinol) or higher, an analytical sample weight of approximately 200 mg would be appropriate. In Part III we will pull together our homogeneity studies and particle size investigations and use them to devise sampling plans and sample preparations suitable for the quantitative instrumental analysis of the most common illicit

  4. Measurement of sodium concentration in sweat samples: comparison of 5 analytical techniques.

    PubMed

    Goulet, Eric D B; Asselin, Audrey; Gosselin, Jonathan; Baker, Lindsay B

    2017-08-01

    Sweat sodium concentration (SSC) can be determined using different analytical techniques (ATs), which may have implications for athletes and scientists. This study compared the SSC measured with 5 ATs: ion chromatography (IChr), flame photometry (FP), direct (DISE) and indirect (IISE) ion-selective electrode, and ion conductivity (IC). Seventy sweat samples collected from 14 athletes were analyzed with 5 instruments: the 883 Basic IC Plus (IChr, reference instrument), AAnalyst 200 (FP), Cobas 6000 (IISE), Sweat-Chek (IC), and B-722 Laqua Twin (DISE). Instruments showed excellent relative (intraclass correlation coefficient (ICC) ≥ 0.999) and absolute (coefficient of variation (CV) ≤ 2.6%) reliability. Relative validity was also excellent between ATs (ICC ≥ 0.961). With regard to inter-AT absolute validity, compared with IChr, standard errors of the estimate were similar among ATs (2.8-3.8 mmol/L), but the CV was lowest with DISE (3.9%), intermediate with IISE (7.6%) and FP (6.9%), and highest with IC (12.3%). In conclusion, SSC varies depending on the AT used to analyze samples. Therefore, results obtained from different ATs are scarcely comparable and should not be used interchangeably. Nevertheless, taking into account the normal variability in SSC (∼±12%), the imprecision of the recommendations deriving from FP, IISE, IC, and DISE should have trivial health and physiological consequences under most exercise circumstances.

  5. Comparison of extraction techniques and modeling of accelerated solvent extraction for the authentication of natural vanilla flavors.

    PubMed

    Cicchetti, Esmeralda; Chaintreau, Alain

    2009-06-01

    Accelerated solvent extraction (ASE) of vanilla beans has been optimized using ethanol as a solvent. A theoretical model is proposed to account for this multistep extraction. This allows the determination, for the first time, of the total amount of analytes initially present in the beans and thus the calculation of recoveries using ASE or any other extraction technique. As a result, ASE and Soxhlet extractions have been determined to be efficient methods, whereas recoveries are modest for maceration techniques and depend on the solvent used. Because industrial extracts are obtained by many different procedures, including maceration in various solvents, authenticating vanilla extracts using quantitative ratios between the amounts of vanilla flavor constituents appears to be unreliable. When authentication techniques based on isotopic ratios are used, ASE is a valid sample preparation technique because it does not induce isotopic fractionation.
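
    The abstract does not give the model's equations; as a hedged sketch, one common way to describe multistep extraction is to assume each ASE cycle removes a constant fraction of the analyte remaining, so successive step yields decay geometrically and the initial total can be estimated from them (all values hypothetical, and the paper's actual model may differ):

```python
import numpy as np

# Yields measured after each successive ASE cycle (hypothetical, mg of vanillin)
step_yields = np.array([12.0, 3.1, 0.8, 0.2])

# Assume each cycle extracts a constant fraction f of what remains:
# y_k = m0 * f * (1 - f)**k, so successive yield ratios estimate (1 - f).
ratios = step_yields[1:] / step_yields[:-1]
f = 1.0 - ratios.mean()

# Total initially present follows from the first step: y_0 = m0 * f
m0 = step_yields[0] / f
recovery = step_yields.sum() / m0
print(f"extraction fraction per cycle ~ {f:.2f}")
print(f"estimated initial amount ~ {m0:.1f} mg, 4-cycle recovery ~ {recovery:.1%}")
```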

  6. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can
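
    The analytic sphere result mentioned above is easy to state and check: if a plane cuts a sphere of radius R at a uniformly distributed height, the section-circle radius r has density p(r) = r / (R·sqrt(R² − r²)), which indeed concentrates near the equator. A short Monte Carlo check (a sketch for illustration only):

```python
import numpy as np

R = 1.0
rng = np.random.default_rng(1)

# A random plane cuts a sphere of radius R at height h, uniform in [0, R)
# by symmetry; the section is a circle of radius r = sqrt(R^2 - h^2).
h = rng.uniform(0.0, R, 1_000_000)
r = np.sqrt(R**2 - h**2)

# Analytic density p(r) = r / (R * sqrt(R^2 - r^2)): probability piles up
# near r = R, i.e. cuts near the equator, as the abstract notes.
hist, edges = np.histogram(r, bins=50, range=(0, R), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
p = mid / (R * np.sqrt(R**2 - mid**2))
# Exclude the last bin, where the analytic density diverges at r = R
print(f"max |MC - analytic| over interior bins: {np.abs(hist - p)[:-1].max():.3f}")
```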

  7. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each may be modularly formulated by differing departments and be solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
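
    A minimal coordination sketch in the spirit of ATC (this is an augmented-Lagrangian consensus loop on a toy problem, not the paper's exact ATC formulation; all names and values are hypothetical): two modular subproblems share one linking variable, and a parent target plus penalty terms drive them to a consistent, system-level optimum:

```python
# Toy coordination of two modular subproblems sharing one linking variable.
# Subsystem i alone would pick x = a_i; coordination drives both responses
# and the parent target to the consistent system optimum (here, 4.0).

a = [3.0, 5.0]                 # local preferences (hypothetical)
rho = 1.0                      # penalty weight
t = 0.0                        # parent's shared target
x = [0.0, 0.0]                 # subsystem responses
u = [0.0, 0.0]                 # scaled multiplier estimates

for _ in range(100):
    # Each subsystem solves min (x - a_i)^2 + (rho/2) * (x - t + u_i)^2,
    # which has the closed-form minimizer below for this quadratic toy case
    x = [(2 * ai + rho * (t - ui)) / (2 + rho) for ai, ui in zip(a, u)]
    t = sum(xi + ui for xi, ui in zip(x, u)) / len(x)   # target update
    u = [ui + xi - t for xi, ui in zip(x, u)]           # multiplier update

print(f"consistent solution: t={t:.4f}, x={[round(v, 4) for v in x]}")
# Both responses and the target converge to 4.0, the coupled-system optimum.
```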

  8. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
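
    The dispersion ratio itself follows from the two phase maps without knowing the local thickness, since phase = 2πhΔn/λ and the thickness h cancels in the ratio; a sketch with hypothetical phase values (not the paper's data):

```python
import numpy as np

# Quantitative phase maps (radians) at the two wavelengths; tiny 2x2
# arrays of hypothetical values stand in for full images.
phi_310 = np.array([[2.10, 2.05], [2.08, 2.12]])
phi_400 = np.array([[1.50, 1.47], [1.49, 1.51]])
lam1, lam2 = 310e-9, 400e-9

# phase = 2*pi*h*dn/lambda, with h the (unknown) local thickness. Taking
# the ratio cancels h, leaving the refractive index increment ratio:
dispersion = (phi_310 * lam1) / (phi_400 * lam2)
print(dispersion.round(3))   # ~1.08-1.09, protein-like, per the abstract
```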

  9. Identification of Microorganisms by Modern Analytical Techniques.

    PubMed

    Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica

    2017-11-01

    Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.

  10. Quantitative PCR for Genetic Markers of Human Fecal Pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-

  11. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  12. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    The direct analysis in real time (DART) ion source is a powerful ionising technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capacity limits its extensive use in the field of phytochemical analysis. The aim was therefore to devise a new system that utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate and three major lignan compounds were quantitated by three different methods (UV densitometry, TLC-DART-MS and HPLC-UV) to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from the calibration curve established with standard compounds. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.
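
    The quantitation step reduces to an external calibration: ion current peak areas for standard bands define a line, and unknowns are read off it. A minimal sketch (all compound names reflect the abstract, but every numeric value is hypothetical):

```python
import numpy as np

# Hypothetical calibration: ion-current peak areas for schisandrin standards
conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])              # ug per band
area_std = np.array([1.1e5, 2.3e5, 4.4e5, 9.1e5, 1.78e6])   # a.u.

# Linear calibration curve: area = slope * concentration + intercept
slope, intercept = np.polyfit(conc_std, area_std, 1)

# Quantify an unknown band from its extracted ion current peak area
area_unknown = 6.0e5
conc_unknown = (area_unknown - intercept) / slope
print(f"schisandrin ~ {conc_unknown:.2f} ug per band")
```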

  13. Identification of Tengfu Jiangya Tablet Target Biomarkers with Quantitative Proteomic Technique

    PubMed Central

    Xu, Jingwen; Zhang, Shijun; Jiang, Haiqiang; Wang, Nan; Lin, Haiqing

    2017-01-01

    Tengfu Jiangya Tablet (TJT) is a well-accepted antihypertensive drug in China; its major active components are Uncaria total alkaloids and Semen Raphani soluble alkaloid. To further explore the mechanism of TJT's therapeutic effect on essential hypertension, a serum proteomic study was performed. Potential biomarkers were quantified in the serum of hypertensive individuals before and after taking TJT, using isobaric tags for relative and absolute quantitation (iTRAQ) coupled with two-dimensional liquid chromatography followed by electrospray ionization-tandem mass spectrometry (2D LC-MS/MS). Among 391 proteins identified with high confidence, 70 were differentially expressed (fold variation criteria >1.2 or <0.83) between the two groups (39 upregulated and 31 downregulated). Combining Gene Ontology annotation, KEGG pathway analysis, and literature retrieval, 5 proteins were chosen as key target biomarkers of the TJT therapeutic process, and the alteration profiles of these 5 proteins were verified by ELISA and Western blot. Kininogen 1 and Keratin 1 are members of the kallikrein system, while Myeloperoxidase, Serum Amyloid protein A, and Retinol binding protein 4 have been reported to be closely related to vascular endothelial injury. Our study discovered 5 target biomarkers of the compound Chinese medicine TJT and provides an initial view of its antihypertensive therapeutic mechanism from a brand-new aspect. PMID:28408942
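
    For illustration, the stated differential-expression criterion (fold change >1.2 or <0.83) is a simple filter; the protein symbols and ratios below are hypothetical placeholders, not the study's data:

```python
# Illustrative filter for the abstract's differential-expression criterion.
# Ratios are post/pre-treatment fold changes (hypothetical values).
ratios = {"KNG1": 1.42, "KRT1": 0.71, "MPO": 1.31, "SAA1": 1.18, "RBP4": 0.80}

up = {p: r for p, r in ratios.items() if r > 1.2}     # upregulated
down = {p: r for p, r in ratios.items() if r < 0.83}  # downregulated
print("upregulated:", up)
print("downregulated:", down)   # SAA1 falls between the cutoffs and is excluded
```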

  14. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  15. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting and the expectation maximization...
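
    Of the three techniques, iterative proportional fitting is compact enough to sketch: it rescales a seed table alternately by rows and columns until its margins match target totals. The exposure-table interpretation below is hypothetical:

```python
import numpy as np

def ipf(table, row_targets, col_targets, iters=100, tol=1e-9):
    """Iterative proportional fitting: rescale a seed table so its margins
    match the target row and column totals."""
    t = table.astype(float).copy()
    for _ in range(iters):
        t *= (row_targets / t.sum(axis=1))[:, None]   # match row margins
        t *= (col_targets / t.sum(axis=0))[None, :]   # match column margins
        if np.allclose(t.sum(axis=1), row_targets, atol=tol):
            break
    return t

# Seed exposure table (e.g. vehicle-miles by road type x driver age; hypothetical)
seed = np.array([[40.0, 10.0], [20.0, 30.0]])
fitted = ipf(seed, row_targets=np.array([60.0, 40.0]),
             col_targets=np.array([55.0, 45.0]))
print(fitted.round(2), fitted.sum())
```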

  16. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    Line-focusing Fresnel lenses with application potential in the 200 to 370 C range were evaluated analytically and experimentally. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  17. Quantitative PCR for genetic markers of human fecal pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...

  18. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy relevant outcomes of interest, specifically for our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.

  19. Using Analytical Techniques to Interpret Financial Statements.

    ERIC Educational Resources Information Center

    Walters, Donald L.

    1986-01-01

    Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)

  20. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors of about 10%. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
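
    A hedged sketch of the idea using scikit-learn's FastICA (assumed available; the paper's exact chemometric pipeline may differ): simulated overlapping spectra are mixed according to the Beer-Lambert law and then resolved without reference spectra, with the mixing matrix carrying the concentration information up to ICA's usual order/scale ambiguity:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Simulate two overlapping component spectra on a wavelength grid
wl = np.linspace(200, 400, 400)
s1 = np.exp(-0.5 * ((wl - 270) / 15) ** 2)       # "caffeine-like" band
s2 = np.exp(-0.5 * ((wl - 300) / 25) ** 2)       # "vitamin-like" band
S = np.vstack([s1, s2])                          # (2, n_wavelengths)

# Mixtures obeying Beer-Lambert: absorbance = concentrations @ spectra
C = np.array([[1.0, 0.2], [0.5, 0.8], [0.3, 1.5], [1.2, 1.0]])
X = C @ S + 1e-3 * np.random.default_rng(0).normal(size=(4, wl.size))

# ICA along the wavelength axis recovers the source spectra (up to order
# and scale); the mixing matrix then carries relative concentration info.
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X.T)                 # (n_wavelengths, 2)
print("recovered source spectra:", sources.shape)
print("estimated mixing matrix:\n", ica.mixing_.round(3))   # (4 mixtures, 2)
```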

  1. Reliable LC-MS quantitative glycomics using iGlycoMab stable isotope labeled glycans as internal standards.

    PubMed

    Zhou, Shiyue; Tello, Nadia; Harvey, Alex; Boyes, Barry; Orlando, Ron; Mechref, Yehia

    2016-06-01

    Glycans have numerous functions in various biological processes and participate in the progress of diseases. Reliable quantitative glycomic profiling techniques could contribute to the understanding of the biological functions of glycans, and lead to the discovery of potential glycan biomarkers for diseases. Although LC-MS is a powerful analytical tool for quantitative glycomics, the variation of ionization efficiency and MS intensity bias affect quantitation reliability. Internal standards can be utilized for glycomic quantitation by MS-based methods to reduce variability. In this study, we used a stable isotope labeled IgG2b monoclonal antibody, iGlycoMab, as an internal standard to reduce the potential for errors and variability due to sample digestion, derivatization, and fluctuation of nanoESI efficiency in the LC-MS analysis of permethylated N-glycans released from model glycoproteins, human blood serum, and a breast cancer cell line. We observed an unanticipated degradation of isotope labeled glycans, tracked a source of such degradation, and optimized a sample preparation protocol to minimize degradation of the internal standard glycans. All results indicated the effectiveness of using iGlycoMab to minimize errors originating from sample handling and instruments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  3. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  4. Directivity analysis of meander-line-coil EMATs with a wholly analytical method.

    PubMed

    Xie, Yuedong; Liu, Zenghua; Yin, Liyuan; Wu, Jiande; Deng, Peng; Yin, Wuliang

    2017-01-01

    This paper presents a simulation and experimental study of the radiation pattern of a meander-line-coil EMAT. A wholly analytical method, which couples two models, an analytical EM model and an analytical UT model, has been developed to build EMAT models and analyse the beam directivity of Rayleigh waves. For a specific sensor configuration, Lorentz forces are calculated using the analytical EM method, which is adapted from the classic Deeds and Dodd solution. The calculated Lorentz force densities are imported into an analytical ultrasonic model as driving point sources, which produce the Rayleigh waves within a layered medium. The effect of the length of the meander-line coil on the beam directivity of the Rayleigh waves is analysed quantitatively and verified experimentally. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very small amounts of organic solvents (only a few microliters) or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autonóma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  6. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
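
    One plausible reading of such a technique is classical least squares on spectra: a mixture spectrum is modeled as a linear combination of pure-component spectra and the proportions recovered by regression. The NTRS record does not name its algorithm, so the sketch below, with synthetic Gaussian "spectra", is only a generic stand-in.

    ```python
    import numpy as np

    # Classical least squares for a two-component mixture: solve
    # mixture ≈ [pure_a pure_b] @ fractions for the mixing proportions.
    # Pure-component spectra here are synthetic Gaussians, not real data.

    wavelengths = np.linspace(1000, 2000, 200)
    pure_a = np.exp(-((wavelengths - 1300) / 60.0) ** 2)   # pure component A
    pure_b = np.exp(-((wavelengths - 1700) / 80.0) ** 2)   # pure component B

    true_fractions = np.array([0.35, 0.65])
    mixture = np.column_stack([pure_a, pure_b]) @ true_fractions
    mixture += np.random.default_rng(0).normal(0, 0.005, mixture.size)  # measurement noise

    fractions, *_ = np.linalg.lstsq(np.column_stack([pure_a, pure_b]), mixture, rcond=None)
    print("estimated proportions:", fractions.round(3))
    ```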

  7. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.
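
    The quantitation step reduces to simple arithmetic: each ³¹P integral region is converted to mmol per gram of sample via the integral of an internal standard of known amount. A minimal sketch, with invented integrals and masses:

    ```python
    # Sketch of the arithmetic behind quantitative 31P NMR: each labeled OH/COOH
    # class is quantified from its integral relative to an internal standard of
    # known molar amount. All integrals and masses below are made up.

    def qnmr_mmol_per_g(region_integrals, is_integral, is_mmol, sample_mass_g):
        """region_integrals: dict of 31P region name -> integral (arbitrary units)."""
        per_unit = is_mmol / is_integral  # mmol represented by one integral unit
        return {name: integral * per_unit / sample_mass_g
                for name, integral in region_integrals.items()}

    integrals = {"aliphatic_OH": 3.2, "phenolic_OH": 7.8, "COOH": 0.9}
    print(qnmr_mmol_per_g(integrals, is_integral=1.0, is_mmol=0.01, sample_mass_g=0.025))
    ```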

  8. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    suitable to visual analytics. A history of analysis and analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. An understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently, the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users" in almost all situations, there is a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  9. Detailed Chemical Composition of Condensed Tannins via Quantitative (31)P NMR and HSQC Analyses: Acacia catechu, Schinopsis balansae, and Acacia mearnsii.

    PubMed

    Crestini, Claudia; Lange, Heiko; Bianchetti, Giulia

    2016-09-23

    The chemical composition of Acacia catechu, Schinopsis balansae, and Acacia mearnsii proanthocyanidins has been determined using a novel analytical approach that rests on the concerted use of quantitative (31)P NMR and two-dimensional heteronuclear NMR spectroscopy. This approach has offered significant detailed information regarding the structure and purity of these complex and often elusive proanthocyanidins. More specifically, rings A, B, and C of their flavan-3-ol units show well-defined and resolved absorbance regions in both the quantitative (31)P NMR and HSQC spectra. By integrating each of these regions in the (31)P NMR spectra, it is possible to identify the oxygenation patterns of the flavan-3-ol units. At the same time it is possible to acquire a fingerprint of the proanthocyanidin sample and evaluate its purity via the HSQC information. This analytical approach is suitable for both the purified natural product proanthocyanidins and their commercial analogues. Overall, this effort demonstrates the power of the concerted use of these two NMR techniques for the structural elucidation of natural products containing labile hydroxy protons and a carbon framework that can be traced out via HSQC.

  10. Quantitative prediction of solute strengthening in aluminium alloys.

    PubMed

    Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F

    2010-09-01

    Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.
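
    For orientation, a commonly used thermal-activation form in solute-strengthening models of this kind reduces the zero-temperature stress τ_y0 by a factor depending on the energy barrier ΔE_b, temperature, and strain rate. The sketch below assumes the form τ(T) = τ_y0[1 − (kT ln(ε̇₀/ε̇)/ΔE_b)^(2/3)] with placeholder parameters rather than the paper's first-principles inputs.

    ```python
    import numpy as np

    # Hedged sketch of a standard thermally activated flow-stress form used in
    # solute-strengthening models of this type. Parameter values are placeholders,
    # not the DFT-derived inputs of the paper.

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def flow_stress(tau_y0_mpa, delta_eb_ev, temp_k, strain_rate=1e-3, ref_rate=1e5):
        """Finite-temperature flow stress, (1 - x^(2/3)) activation form."""
        x = (K_B * temp_k / delta_eb_ev) * np.log(ref_rate / strain_rate)
        return tau_y0_mpa * max(0.0, 1.0 - x ** (2.0 / 3.0))

    # Hypothetical Al-Mg-like inputs evaluated at T = 78 K.
    print(f"{flow_stress(tau_y0_mpa=60.0, delta_eb_ev=1.2, temp_k=78.0):.1f} MPa")
    ```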

  11. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  12. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE PAGES

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    2016-09-16

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  13. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  14. Second harmonic generation quantitative measurements on collagen fibrils through correlation to electron microscopy

    NASA Astrophysics Data System (ADS)

    Bancelin, S.; Aimé, C.; Gusachenko, I.; Kowalczuk, L.; Latour, G.; Coradin, T.; Schanne-Klein, M.-C.

    2015-03-01

    Type I collagen is a major structural protein in mammals that shows highly structured macromolecular organizations specific to each tissue. This biopolymer is synthesized as triple helices, which self-assemble into fibrils (Ø = 10-300 nm) and further form various 3D organizations. In recent years, Second Harmonic Generation (SHG) microscopy has emerged as a powerful technique to probe in situ the fibrillar collagenous network within tissues. However, this optical technique cannot resolve most of the fibrils and is a coherent process, which has so far impeded quantitative measurements of the fibril diameter. In this study, we correlated SHG microscopy with Transmission Electron Microscopy to determine the sensitivity of SHG microscopy and to calibrate SHG signals as a function of fibril diameter in reconstructed collagen gels. To that end, we synthesized isolated fibrils with various diameters and successfully imaged the very same fibrils with both techniques, down to 30 nm diameter. We observed that SHG signals scaled as the fourth power of the fibril diameter, as expected from analytical and numerical calculations. This calibration was then applied to diabetic rat cornea, in which we successfully recovered the diameter of hyperglycemia-induced fibrils in the Descemet's membrane without having to resolve them. Finally, we derived the first hyperpolarizability of a single collagen triple helix, which validates the bottom-up approach used to calculate the non-linear response at the fibrillar scale and indicates a parallel alignment of triple helices within the fibrils. These results represent a major step towards quantitative SHG imaging of nm-sized collagen fibrils.
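
    Since the abstract states that the SHG signal scales as the fourth power of the fibril diameter, the calibration logic can be sketched as a log-log fit to correlative TEM data, after which unresolved diameters are inferred from signal alone. All data points below are synthetic.

    ```python
    import numpy as np

    # Power-law calibration of SHG signal vs. fibril diameter: fit log(signal)
    # against log(diameter) on correlative TEM data, then invert the fit to
    # infer diameters below the optical resolution limit. Synthetic data only.

    diam_nm = np.array([30, 60, 100, 150, 220, 300], dtype=float)  # TEM diameters
    signal = 2.0e-9 * diam_nm ** 4 * (1 + np.random.default_rng(1).normal(0, 0.05, 6))

    slope, log_prefactor = np.polyfit(np.log(diam_nm), np.log(signal), 1)
    print(f"fitted exponent ~ {slope:.2f} (expect ~4)")

    def diameter_from_signal(s):
        return np.exp((np.log(s) - log_prefactor) / slope)

    print("inferred diameter for signal 5e-2:",
          round(float(diameter_from_signal(5e-2)), 1), "nm")
    ```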

  15. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  16. Accelerator-based analytical technique in the evaluation of some Nigeria’s natural minerals: Fluorite, tourmaline and topaz

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, O. A.; Mazzoli, C.; Ceccato, D.; Akintunde, J. A.; De Poli, M.; Moschini, G.

    2005-10-01

    For the first time, the complementary accelerator-based analytical techniques of PIXE and electron microprobe analysis (EMPA) were employed for the characterization of some of Nigeria's natural minerals, namely fluorite, tourmaline and topaz. These minerals occur in different areas of Nigeria. They are mainly used as gemstones and for other scientific and technological applications and are therefore very important. There is a need to characterize them to assess the quality of these gemstones and to update the geochemical data on them, geared towards useful applications. PIXE analysis was carried out using the 1.8 MeV collimated proton beam from the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy. The novel results, which show many elements at different concentrations in these minerals, are presented and discussed.

  17. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe the advantages of this quantitative approach and the analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described, and class-specific glycopeptide assays are demonstrated. PMID:25522218

  18. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Principles, performance, and applications of spectral reconstitution (SR) in quantitative analysis of oils by Fourier transform infrared spectroscopy (FT-IR).

    PubMed

    García-González, Diego L; Sedman, Jacqueline; van de Voort, Frederik R

    2013-04-01

    Spectral reconstitution (SR) is a dilution technique developed to facilitate the rapid, automated, and quantitative analysis of viscous oil samples by Fourier transform infrared spectroscopy (FT-IR). This technique involves determining the dilution factor through measurement of an absorption band of a suitable spectral marker added to the diluent, and then spectrally removing the diluent from the sample and multiplying the resulting spectrum to compensate for the effect of dilution on the band intensities. The facsimile spectrum of the neat oil thus obtained can then be qualitatively or quantitatively analyzed for the parameter(s) of interest. The quantitative performance of the SR technique was examined with two transition-metal carbonyl complexes as spectral markers, chromium hexacarbonyl and methylcyclopentadienyl manganese tricarbonyl. The estimation of the volume fraction (VF) of the diluent in a model system, consisting of canola oil diluted to various extents with odorless mineral spirits, served as the basis for assessment of these markers. The relationship between the VF estimates and the true volume fraction (VF(t)) was found to be strongly dependent on the dilution ratio and also depended, to a lesser extent, on the spectral resolution. These dependences are attributable to the effect of changes in matrix polarity on the bandwidth of the ν(CO) marker bands. Excellent VF(t) estimates were obtained by making a polarity correction devised with a variance-spectrum-delineated correction equation. In the absence of such a correction, SR was shown to introduce only a minor and constant bias, provided that polarity differences among all the diluted samples analyzed were minimal. This bias can be built into the calibration of a quantitative FT-IR analytical method by subjecting appropriate calibration standards to the same SR procedure as the samples to be analyzed. The primary purpose of the SR technique is to simplify preparation of diluted samples such that

  20. [Determination of acetanilide herbicide residues in tea by gas chromatography-mass spectrometry with two different ionization techniques].

    PubMed

    Shen, Weijian; Xu, Jinzhong; Yang, Wenquan; Shen, Chongyu; Zhao, Zengyun; Ding, Tao; Wu, Bin

    2007-09-01

    An analytical method of solid phase extraction-gas chromatography-mass spectrometry with two different ionization techniques was established for the simultaneous determination of 12 acetanilide herbicide residues in tea-leaves. Herbicides were extracted from tea-leaf samples with ethyl acetate. The extract was cleaned up on an active carbon SPE column connected to a Florisil SPE column. Analytical screening was performed by gas chromatography-mass spectrometry (GC-MS) in the selected ion monitoring (SIM) mode with either electron impact ionization (EI) or negative chemical ionization (NCI). The method was reliable and stable: the recoveries of all herbicides were in the range of 50% to 110% at three spiked levels (10 microg/kg, 20 microg/kg and 40 microg/kg), and the relative standard deviations (RSDs) were no more than 10.9%. The two ionization techniques are complementary, as more ion-fragmentation information can be obtained from the EI mode and more molecular-ion information from the NCI mode. By comparison of the two techniques, the selectivity of NCI-SIM was much better than that of EI-SIM. The sensitivities of both techniques were high; the limit of quantitation (LOQ) for each herbicide was no more than 2.0 microg/kg, and the limit of detection (LOD) with the NCI-SIM technique was much lower than that of EI-SIM when analyzing herbicides with several halogen atoms in the molecule.

  1. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
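
    The regression step these kits rely on can be sketched as fitting a standard curve of SIS-normalized response versus known concentration and inverting it for unknowns; the standard concentrations and ratios below are illustrative, not values from the 192-protein panel.

    ```python
    import numpy as np

    # Standard-curve quantitation: fit (light/heavy area ratio) vs. known
    # concentration by least squares, then invert the line for sample ratios.
    # All values are invented for illustration.

    known_conc = np.array([1, 5, 10, 50, 100.0])           # ng/mL standards
    response = np.array([0.021, 0.098, 0.20, 1.01, 1.98])  # SIS-normalized ratios

    slope, intercept = np.polyfit(known_conc, response, 1)

    def quantify(sample_ratio):
        """Concentration of an unknown from its measured response ratio."""
        return (sample_ratio - intercept) / slope

    print(f"sample ratio 0.55 -> {quantify(0.55):.1f} ng/mL")
    ```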

  2. Quantitative Ultrasound-Assisted Extraction for Trace-Metal Determination: An Experiment for Analytical Chemistry

    ERIC Educational Resources Information Center

    Lavilla, Isela; Costas, Marta; Pena-Pereira, Francisco; Gil, Sandra; Bendicho, Carlos

    2011-01-01

    Ultrasound-assisted extraction (UAE) is introduced to upper-level analytical chemistry students as a simple strategy focused on sample preparation for trace-metal determination in biological tissues. Nickel extraction in seafood samples and quantification by electrothermal atomic absorption spectrometry (ETAAS) are carried out by a team of four…

  3. Simultaneous quantitative analysis of main components in linderae reflexae radix with one single marker.

    PubMed

    Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing

    2016-05-08

    The aim was to establish a quantitative analysis of multiple components by a single marker (QAMS) method for quality evaluation, and to validate its feasibility through the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes for quality evaluation by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results of the external standard method and QAMS across different HPLC systems. The results showed no significant differences between the contents of Linderae Reflexae Radix determined by the external standard method and by QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined against the single marker pinosylvin, with fingerprint spectra acquired on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
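
    The QAMS arithmetic can be sketched as follows: a relative correction factor for each analyte versus the single marker is established once with standards, after which only the marker needs external calibration in routine runs. Peak areas and concentrations below are invented for illustration.

    ```python
    # QAMS sketch: establish f_i = (A_i/C_i) / (A_s/C_s) with mixed standards,
    # then quantify analyte i in samples from the marker alone via
    # C_i = A_i * C_s / (f_i * A_s). All numbers are hypothetical.

    def correction_factor(area_i, conc_i, area_s, conc_s):
        """Response of analyte i relative to the single marker s."""
        return (area_i / conc_i) / (area_s / conc_s)

    def quantify_by_marker(area_i_sample, area_s_sample, conc_s_sample, f_i):
        """Concentration of analyte i using only the marker's calibration."""
        return area_i_sample * conc_s_sample / (f_i * area_s_sample)

    # One-time calibration with a mixed standard (pinosylvin as the marker).
    f_pinocembrin = correction_factor(area_i=5200, conc_i=40.0, area_s=6100, conc_s=50.0)

    # Routine sample: only pinosylvin is externally quantified (here 32.5 ug/mL).
    print(quantify_by_marker(area_i_sample=4300, area_s_sample=3900,
                             conc_s_sample=32.5, f_i=f_pinocembrin))
    ```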

  4. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  5. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  6. SERS-based application in food analytics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Cialla-May, Dana; Radu, Andreea; Jahn, Martin; Weber, Karina; Popp, Jürgen

    2017-02-01

    To establish detection schemes in life science applications, specific and sensitive methods allowing for fast detection times are required. Due to the interaction of molecules with the strong electromagnetic fields excited at metallic nanostructures, the molecular-fingerprint-specific Raman spectrum is increased by several orders of magnitude. This effect is described as surface-enhanced Raman spectroscopy (SERS) and has become a very powerful analytical tool in many fields of application. Within this presentation, we will introduce innovative bottom-up strategies to prepare SERS-active nanostructures coated with a lipophilic sensor layer. To do so, the food colorant Sudan III, an indirectly carcinogenic substance found in chili powder, palm oil or spice mixtures, is detected quantitatively against the background of the competitor riboflavin as well as paprika powder extracts. The SERS-based detection of azorubine (E122) in commercially available beverages of different complexity (e.g. sugar content, alcohol concentration) illustrates the strong potential of SERS as a qualitative as well as semiquantitative prescan method in food analytics. Here, good agreement is found between the concentration estimated by SERS and that obtained with the gold-standard technique HPLC, a highly laborious method. Finally, SERS is applied to detect vitamins B2 and B12 in cereals as well as to estimate the ratio of lycopene to β-carotene in tomatoes. Acknowledgement: Funding of the projects "QuantiSERS" and "Jenaer Biochip Initiative 2.0" within the framework "InnoProfile Transfer - Unternehmen Region" by the Federal Ministry of Education and Research, Germany (BMBF) is gratefully acknowledged.

  7. The role of light microscopy in aerospace analytical laboratories

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.

    1977-01-01

    Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes, with confirmatory tests generally taking less than one hour. Light microscopy has made, and will continue to make, a very significant contribution to the analytical capabilities of aerospace laboratories.

  8. FRET-based genetically-encoded sensors for quantitative monitoring of metabolites.

    PubMed

    Mohsin, Mohd; Ahmad, Altaf; Iqbal, Muhammad

    2015-10-01

    Neighboring cells in the same tissue can exist in different states of dynamic activity. After genomics, proteomics and metabolomics, fluxomics is now equally important for generating accurate quantitative information on the cellular and sub-cellular dynamics of ions and metabolites, which is critical for a functional understanding of organisms. Various spectrometry techniques are used for monitoring ions and metabolites, although their temporal and spatial resolutions are limited. The discovery of fluorescent proteins and their variants has revolutionized cell biology. Novel tools and methods are therefore needed that can be deployed in specific cells and targeted to sub-cellular compartments in order to quantify target-molecule dynamics directly, and that can measure cellular activities and protein dynamics with sub-cellular resolution. Biosensors based on fluorescence resonance energy transfer (FRET) are genetically encoded and hence can specifically target sub-cellular organelles by fusion to proteins or targeting sequences. Over the last decade, FRET-based genetically encoded sensors for molecules involved in energy production, reactive oxygen species and secondary messengers have helped to unravel key aspects of cellular physiology. This review, describing the design and principles of sensors, presents a database of sensors for different analytes/processes and illustrates examples of application in quantitative live cell imaging.

  9. Fast analytical scatter estimation using graphics processing units.

    PubMed

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.
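
    The scaled root-mean-square difference mentioned above can be sketched in a few lines; the exact normalization used in the paper is not given here, so the choice of dividing by the maximum of the Monte Carlo estimate is an assumption.

    ```python
    import numpy as np

    # Sketch of a scaled RMS difference metric for comparing an analytical
    # scatter estimate against a Monte Carlo reference. The normalization by
    # the MC maximum is an assumption, not the paper's stated definition.

    def scaled_rmsd(analytic, monte_carlo):
        analytic, monte_carlo = np.asarray(analytic), np.asarray(monte_carlo)
        return np.sqrt(np.mean((analytic - monte_carlo) ** 2)) / monte_carlo.max()

    # Stand-in 256 x 256 detector images (random fields, not simulated scatter).
    mc = np.random.default_rng(2).uniform(0.5, 1.0, (256, 256))
    est = mc + np.random.default_rng(3).normal(0, 0.01, mc.shape)
    print(f"scaled RMSD: {scaled_rmsd(est, mc):.4f}")
    ```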

  10. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with their lower sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  11. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, with their lower sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: direct image reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias versus variance performance to iterative TOF reconstruction with a matched resolution model.

  12. Can neutral analytes be concentrated by transient isotachophoresis in micellar electrokinetic chromatography and how much?

    PubMed

    Matczuk, Magdalena; Foteeva, Lidia S; Jarosz, Maciej; Galanski, Markus; Keppler, Bernhard K; Hirokawa, Takeshi; Timerbaev, Andrei R

    2014-06-06

    Transient isotachophoresis (tITP) is a versatile sample preconcentration technique that uses ITP to focus electrically charged analytes at the initial stage of CE analysis. However, according to the ruling principle of tITP, uncharged analytes are beyond its capacity when separated and detected by micellar electrokinetic chromatography (MEKC). On the other hand, when it is the charged micelles that undergo tITP focusing, one can anticipate a concentration effect resulting from the formation of a transient micellar stack at the moving sample/background electrolyte (BGE) boundary, which increasingly accumulates the analytes. This work expands the enrichment potential of tITP for MEKC by demonstrating the quantitative analysis of uncharged metal-based drugs from highly saline samples, introducing into the BGE solution anionic surfactants and buffer (terminating) co-ions of different mobility and concentration to optimize performance. Metallodrugs of assorted lipophilicity were chosen so as to explore whether their varying affinity toward micelles plays a role. In addition to altering the sample and BGE composition, the detection capability was optimized by fine-tuning operational variables such as sample volume, separation voltage and pressure. The results of the optimization trials shed light on the mechanism of micellar tITP and enable effective determination of the selected drugs in human urine, with practical limits of detection using a conventional UV detector. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Analytical capabilities and services of Lawrence Livermore Laboratory's General Chemistry Division. [Methods available at Lawrence Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutmacher, R.; Crawford, R.

    This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.

  14. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  15. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  16. Analytical Chemical Sensing in the Submillimeter/terahertz Spectral Range

    NASA Astrophysics Data System (ADS)

    Moran, Benjamin L.; Fosnight, Alyssa M.; Medvedev, Ivan R.; Neese, Christopher F.

    2012-06-01

    A highly sensitive and selective terahertz sensor used to quantitatively analyze a complex mixture of volatile organic compounds is reported. To best demonstrate the analytical capabilities of THz chemical sensors, we chose to perform quantitative analysis of a certified gas mixture using a novel prototype chemical sensor that couples a commercial preconcentration system (Entech 7100A) to a high-resolution THz spectrometer. We selected the Method TO-14A certified mixture of 39 volatile organic compounds (VOCs) diluted to 1 part per million (ppm) in nitrogen; 26 of the 39 chemicals were identified by us as suitable for THz spectroscopic detection. The Entech 7100A system is designed and marketed as an inlet system for gas chromatography-mass spectrometry (GC-MS) instruments, with a specific focus on the TO-14 and TO-15 EPA sampling methods. Its preconcentration efficiency is high for the 39 chemicals in the mixture used for this study, and our preliminary results confirm this. Here we present the results of this study, which serves as a basis for our ongoing research in environmental sensing and analysis of exhaled human breath.

  17. A calibration method for fringe reflection technique based on the analytical phase-slope description

    NASA Astrophysics Data System (ADS)

    Wu, Yuxiang; Yue, Huimin; Pan, Zhipeng; Liu, Yong

    2018-05-01

    The fringe reflection technique (FRT) has become one of the most popular methods for measuring the shape of specular surfaces in recent years. Existing system calibration methods for FRT usually comprise two parts: camera calibration and geometric calibration. In geometric calibration, calibrating the position of the liquid crystal display (LCD) screen is one of the most difficult steps, and its accuracy is affected by factors such as imaging aberration, plane-mirror flatness, and LCD pixel size accuracy. In this paper, based on the derivation of an analytical phase-slope description of FRT, we present a novel calibration method that does not require calibrating the position of the LCD screen. Moreover, the system can be arbitrarily arranged, and the imaging system can be either telecentric or non-telecentric. In our experiment measuring a 5000 mm radius spherical mirror, the proposed calibration method achieved a measurement error 2.5 times smaller than the geometric calibration method. In the wafer-surface measurement experiment, the result obtained with the proposed calibration method was closer to the interferometer result than that of the geometric calibration method.
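
    The appeal of a phase-slope description can be seen in miniature: once measured phase maps to surface slope, height follows by integration along the sampling axis. The sketch below reconstructs a synthetic parabolic profile; the phase-to-slope mapping is taken as already applied, which is a simplification of the real calibrated geometry.

    ```python
    import numpy as np

    # Toy 1-D reconstruction: given surface slopes (as FRT would provide after
    # the phase-to-slope mapping), recover the height profile by trapezoidal
    # integration. The parabolic profile here is synthetic.

    x = np.linspace(0.0, 10.0, 501)    # lateral coordinate, mm
    slope = 2 * 0.004 * (x - 5.0)      # slope of a shallow parabola, depth 0.1 mm

    # Cumulative trapezoid integration of slope -> height (h(0) fixed to 0).
    height = np.concatenate(
        [[0.0], np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))])

    print(f"reconstructed sag: {height.min():.3f} mm (expect -0.100)")
    ```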

  18. Learning Analytics as Assemblage: Criticality and Contingency in Online Education

    ERIC Educational Resources Information Center

    Scott, John; Nichols, T. Philip

    2017-01-01

    Recently, the possibilities for leveraging "big data" in research and pedagogy have given rise to the growing field of "learning analytics" in online education. While much of this work has focused on quantitative metrics, some have called for critical perspectives that interrogate such data as an interplay between technical…

  19. How Dispositional Learning Analytics Helps Understanding the Worked-Example Principle

    ERIC Educational Resources Information Center

    Tempelaar, Dirk

    2017-01-01

    This empirical study aims to demonstrate how Dispositional Learning Analytics can contribute in the investigation of the effectiveness of didactical scenarios in authentic settings, where previous research has mostly been laboratory based. Using a showcase based on learning processes of 1080 students in a blended introductory quantitative course,…

  20. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  1. Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good are the Datas?

    PubMed Central

    Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.

    2005-01-01

    Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide qualitative and quantitative food composition data necessary for high quality epidemiological and clinical research. Measurements for flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still being popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, estimations of polyphenol content as well as antioxidant activity are also reported with values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. Qualitative information regarding proanthocyanidin structure has been determined by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques at present. The lack of appropriate standards is the single most important factor that limits the aforementioned analyses. However, with ever expanding research in the arena of flavanols, proanthocyanidins, and health and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for selective flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597

  2. Validation of an analytical method for the quantitative determination of selenium in bacterial biomass by ultraviolet-visible spectrophotometry.

    PubMed

    Mörschbächer, Ana Paula; Dullius, Anja; Dullius, Carlos Henrique; Bandt, Cassiano Ricardo; Kuhn, Daniel; Brietzke, Débora Tairini; Malmann Kuffel, Fernando José; Etgeton, Henrique Pretto; Altmayer, Taciélen; Gonçalves, Tamara Engelmann; Oreste, Eliézer Quadro; Ribeiro, Anderson Schwingel; de Souza, Claucia Fernanda Volken; Hoehne, Lucélia

    2018-07-30

    The present paper describes the validation of a spectrophotometric method involving molecular absorption in the ultraviolet-visible (UV-Vis) region for selenium (Se) determination in the bacterial biomass produced by lactic acid bacteria (LAB). The method was found to be suitable for the target application and presented a linear range from 0.025 to 0.250 mg/L Se. The angular and linear coefficients of the linear equation were 1.0678 and 0.0197 mg/L Se, respectively, and the linear correlation coefficient (R²) was 0.9991. Analyte recovery exceeded 96% with a relative standard deviation (RSD) below 3%. The Se contents in LAB ranged from 0.01 to 20 mg/g. The Se contents in the bacterial biomass determined by UV-Vis were not significantly different (p > 0.05) from those determined by graphite furnace atomic absorption spectrometry. Thus, Se can be quantified in LAB biomass using this relatively simple technique. Copyright © 2018 Elsevier Ltd. All rights reserved.
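
    Assuming the reported coefficients describe a calibration line A = 1.0678·C + 0.0197 over the validated range (the record does not print the equation explicitly), concentration and spike recovery follow directly, as in this sketch:

    ```python
    # Inverting a linear calibration and computing spike recovery. The
    # interpretation of 1.0678 as slope and 0.0197 as intercept of
    # absorbance vs. concentration is an assumption; sample values are invented.

    SLOPE, INTERCEPT = 1.0678, 0.0197   # from the validated curve above
    LINEAR_RANGE = (0.025, 0.250)       # mg/L Se

    def se_concentration(absorbance):
        conc = (absorbance - INTERCEPT) / SLOPE
        if not LINEAR_RANGE[0] <= conc <= LINEAR_RANGE[1]:
            raise ValueError("outside validated linear range; adjust dilution")
        return conc

    def recovery_percent(spiked_conc, unspiked_conc, spike_added):
        return 100.0 * (spiked_conc - unspiked_conc) / spike_added

    print(f"{se_concentration(0.150):.3f} mg/L Se")
    print(f"{recovery_percent(0.210, 0.115, 0.100):.1f} % recovery")
    ```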

  3. DREAMING THE ANALYTIC SESSION: A CLINICAL ESSAY.

    PubMed

    Ogden, Thomas H

    2017-01-01

    This is a clinical paper in which the author describes analytic work in which he dreams the analytic session with three of his patients. He begins with a brief discussion of aspects of analytic theory that make up a good deal of the context for his clinical work. Central among these concepts are (1) the idea that the role of the analyst is to help the patient dream his previously "undreamt" and "interrupted" dreams; and (2) dreaming the analytic session involves engaging in the experience of dreaming the session with the patient and, at the same time, unconsciously (and at times consciously) understanding the dream. The author offers no "technique" for dreaming the analytic session. Each analyst must find his or her own way of dreaming each session with each patient. Dreaming the session is not something one works at; rather, one tries not to get in its way. © 2017 The Psychoanalytic Quarterly, Inc.

  4. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
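
    The article-relevance step of such a pipeline can be sketched as a generic text classifier; the chapter's actual relevance algorithm and features are not specified here, so the TF-IDF plus logistic-regression stand-in below (with made-up training snippets) is purely illustrative.

    ```python
    # Generic stand-in for a relevance classifier in a biosurveillance text
    # pipeline: TF-IDF features feeding a linear model. Training snippets and
    # labels are invented for illustration.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_docs = [
        "cluster of unexplained respiratory illness reported at regional hospital",
        "local council approves new library funding",
        "officials confirm avian influenza outbreak in poultry farms",
        "city marathon rescheduled due to weather",
    ]
    labels = [1, 0, 1, 0]  # 1 = relevant to biosurveillance

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(train_docs, labels)
    print(model.predict(["novel febrile illness spreading among villages"]))
    ```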

  5. Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations

    ERIC Educational Resources Information Center

    Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei

    2016-01-01

    Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…

  6. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model for ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk-informed framework that will support continuous assessment and risk management of structural health and performance.
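
    As an illustration of the fusion idea only: if each monitoring technique is treated as a binary indicator with known hit and false-alarm rates, a naive Bayesian update combines the readings into one posterior damage probability. The rates below are invented for the sketch; the study's Bayesian network is considerably richer than this.

      # Minimal sketch of Bayesian fusion of multiple ASR diagnosis techniques.
      # Assumes conditionally independent binary indications given damage state;
      # all probabilities are hypothetical, not values from the study.
      prior_damage = 0.10  # prior probability that the specimen has ASR damage

      # (P(positive | damage), P(positive | no damage)) for each technique
      techniques = {
          "thermography": (0.80, 0.15),
          "DIC":          (0.85, 0.10),
          "NIRAS":        (0.75, 0.20),
          "VAM":          (0.70, 0.25),
      }
      indications = {"thermography": True, "DIC": True, "NIRAS": False, "VAM": True}

      odds = prior_damage / (1.0 - prior_damage)
      for name, (p_pos_dam, p_pos_ok) in techniques.items():
          if indications[name]:
              odds *= p_pos_dam / p_pos_ok                  # positive reading
          else:
              odds *= (1 - p_pos_dam) / (1 - p_pos_ok)      # negative reading

      posterior = odds / (1.0 + odds)
      print(f"Posterior probability of ASR damage: {posterior:.3f}")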

  7. Fluorescence metrology used for analytics of high-quality optical materials

    NASA Astrophysics Data System (ADS)

    Engel, Axel; Haspel, Rainer; Rupertus, Volker

    2004-09-01

    Optical glasses, glass ceramics, and crystals are used for various specialized applications in telecommunication, biomedical, optical, and microlithography technology. In order to qualify and control material quality during research and production, several specialized ultra-trace analysis methods have to be applied. One focus of the activities at Schott Glas is the determination of impurities in the sub-ppb regime, because such impurity levels are required, for example, for pure materials used in microlithography. Such impurities are determined using analytical methods like LA-ICP-MS or neutron activation analysis. On the other hand, direct and non-destructive optical analysis is attractive because it additionally reflects the requirements of the optical applications. Typical examples are absorption and laser resistivity measurements of optical materials using precision spectral photometers or in-situ transmission measurements by means of lamps or UV lasers. The chemical analytical methods have the drawback of being time consuming and rather expensive, whereas the sensitivity of the absorption methods will not be sufficient for future needs (absorption coefficients well below 10^-3 cm^-1). For non-destructive qualification against current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of highest sensitivity (more than one order of magnitude better than state-of-the-art UV absorption spectroscopy) with fast measurement and evaluation cycles (several minutes, compared to the several hours necessary for chemical analysis). An overview is given of spectral characteristics measured using specified standards, which are necessary to establish the analytical system.

  8. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
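
    For concreteness, the three performance measures reduce to a few lines of arithmetic once daily baseline and estimated discharge series are in hand; the series below are invented for illustration.

      # Sketch of the three performance measures used to compare an estimated
      # ice-affected streamflow record with a baseline record (invented data).
      import numpy as np

      baseline = np.array([12.0, 11.5, 10.8, 10.2, 9.9, 9.7])  # daily discharge
      estimate = np.array([12.4, 11.1, 10.9, 9.8, 10.3, 9.5])  # estimated record

      # Measure 1: average discharge for the ice-affected period (vs baseline).
      avg_discharge_error = estimate.mean() - baseline.mean()
      # Measures 2 and 3: mean and standard deviation of the daily errors.
      daily_errors = estimate - baseline
      mean_daily_error = daily_errors.mean()
      std_daily_error = daily_errors.std(ddof=1)

      print(avg_discharge_error, mean_daily_error, std_daily_error)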

  9. An analytical study of reduced-gravity liquid reorientation using a simplified marker and cell technique

    NASA Technical Reports Server (NTRS)

    Betts, W. S., Jr.

    1972-01-01

    A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquids move from one end of a closed, partially filled, rigid container to the other end under the influence of container acceleration. The program uses the simplified marker and cell numerical technique and, using explicit finite-differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight walled boundaries. Both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases. Three of these cases simulated actual NASA LeRC drop tower test conditions while two cases simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a current need to optimize HOPI, both by reducing computer usage time and by reducing the core storage required for a given problem size.
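
    HOPI itself is not listed here. Purely to give the flavor of explicit finite-difference time stepping on a momentum equation, the toy sketch below advances a 1-D viscous Burgers equation with upwind advection; it is an analogue for illustration, not the simplified marker-and-cell scheme.

      # Toy explicit finite-difference step in the spirit of HOPI's approach:
      # 1-D viscous Burgers equation u_t + u u_x = nu u_xx, with upwind
      # advection and central diffusion. A stand-in, not the SMAC algorithm.
      import numpy as np

      nx, dx, dt, nu = 101, 0.01, 5e-5, 0.01
      u = np.sin(np.linspace(0.0, np.pi, nx))  # initial velocity profile

      for _ in range(1000):
          un = u.copy()
          # Upwind (backward) advection, valid here since un >= 0 everywhere.
          adv = un[1:-1] * (un[1:-1] - un[:-2]) / dx
          dif = nu * (un[2:] - 2.0 * un[1:-1] + un[:-2]) / dx**2
          u[1:-1] = un[1:-1] + dt * (dif - adv)   # explicit Euler update
          u[0], u[-1] = 0.0, 0.0                  # fixed-wall boundaries

      print(u.max())

    A full marker-and-cell solver adds a staggered grid, a pressure solve enforcing incompressibility, and marker particles tracking the free surface, which is where the core-storage and run-time costs noted above arise.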

  10. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates, and certain receptors can be used in the method for detection of potentially invasive tumors.

  11. The analytical validation of the Oncotype DX Recurrence Score assay

    PubMed Central

    Baehner, Frederick L

    2016-01-01

    In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score® result (scale: 0–100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time. PMID:27729940

  12. The analytical validation of the Oncotype DX Recurrence Score assay.

    PubMed

    Baehner, Frederick L

    2016-01-01

    In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score® result (scale: 0-100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time.
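
    The abstract describes reference-gene normalization followed by an algorithmic combination into a bounded 0-100 score. The sketch below shows only the generic shape of such a computation (delta-Ct normalization against reference genes, then a weighted linear combination rescaled to a bounded score); the gene names, weights, and scaling constants are hypothetical placeholders, not the proprietary Recurrence Score algorithm.

      # Generic sketch of a reference-normalized RT-PCR scoring scheme.
      # All genes, weights, and scaling constants are hypothetical; the
      # actual Recurrence Score algorithm is proprietary to Genomic Health.
      import numpy as np

      ct = {  # raw RT-PCR cycle-threshold values (invented)
          "GENE_A": 27.1, "GENE_B": 25.4, "GENE_C": 29.8,
          "REF_1": 24.0, "REF_2": 24.5, "REF_3": 23.8,
      }
      reference_genes = ["REF_1", "REF_2", "REF_3"]
      weights = {"GENE_A": 0.5, "GENE_B": -0.3, "GENE_C": 0.8}  # hypothetical

      ref_mean = np.mean([ct[g] for g in reference_genes])
      # Normalized expression: higher value means higher expression relative
      # to the reference genes (lower Ct = more transcript).
      expression = {g: ref_mean - ct[g] for g in weights}

      raw = sum(w * expression[g] for g, w in weights.items())
      score = float(np.clip(20.0 * (raw + 3.0), 0.0, 100.0))  # rescale to 0-100
      print(score)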

  13. Two Analyte Calibration From The Transient Response Of Potentiometric Sensors Employed With The SIA Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cartas, Raul; Mimendia, Aitor; Valle, Manel del

    2009-05-23

    Calibration models for multi-analyte electronic tongues have commonly been built using a set of sensors, at least one per analyte under study. Complex signals recorded with these systems are formed by the sensors' responses to the analytes of interest plus interferents, from which a multivariate response model is then developed. This work describes a data treatment method for the simultaneous quantification of two species in solution employing the signal from a single sensor. The approach takes advantage of the complex information recorded in one electrode's transient after sample insertion to build the calibration models for both analytes. The transient signal from the electrode was first processed by discrete wavelet transform, to extract useful information and reduce signal length, and then by artificial neural networks, to fit a model. Two different potentiometric sensors were used as case studies to corroborate the effectiveness of the approach.
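
    As a rough sketch of the described signal-processing chain (wavelet compression of the transient, then a neural-network calibration), the code below uses the PyWavelets and scikit-learn libraries; the wavelet, decomposition level, network size, and synthetic transients are assumptions, not the authors' settings.

      # Sketch of the two-analyte calibration chain: discrete wavelet transform
      # to compress each sensor transient, then an ANN mapping the wavelet
      # coefficients to the two analyte concentrations. Synthetic data only.
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 1.0, 256)

      def transient(c1, c2):
          # Invented two-analyte sensor response: two exponential components.
          return (c1 * (1 - np.exp(-t / 0.1)) + c2 * (1 - np.exp(-t / 0.4))
                  + rng.normal(0.0, 0.01, t.size))

      conc = rng.uniform(0.1, 1.0, size=(100, 2))      # training concentrations
      signals = np.array([transient(a, b) for a, b in conc])

      def dwt_features(sig, wavelet="db4", level=4):
          # Keep only the coarse approximation coefficients as features.
          return pywt.wavedec(sig, wavelet, level=level)[0]

      X = np.array([dwt_features(s) for s in signals])
      model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                           random_state=0)
      model.fit(X, conc)

      # Predict both concentrations from a single new transient.
      print(model.predict(dwt_features(transient(0.3, 0.7)).reshape(1, -1)))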

  14. Research in health sciences library and information science: a quantitative analysis.

    PubMed Central

    Dimitroff, A

    1992-01-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504

  15. Training in motivational interviewing in obstetrics: a quantitative analytical tool.

    PubMed

    Lindhardt, Christina L; Rubak, Sune; Mogensen, Ole; Hansen, Helle P; Lamont, Ronald F; Jørgensen, Jan S

    2014-07-01

    To examine whether a 3-day training course in motivational interviewing, which is an approach to helping people to change, could improve the communication skills of obstetric healthcare professionals in their interaction with obese pregnant women. Intervention study. The Region of Southern Denmark. Eleven obstetric healthcare professionals working with obese pregnant women underwent a 3-day course in motivational interviewing techniques and were assessed before and after training to measure the impact on their overall performance as well as the effect on specific behavioral techniques observed during interviews. With a few exceptions, the participants changed their behavior appropriately to the motivational interviewing technique. The participants made more interventions consistent with the principles of motivational interviewing (adherent versus nonadherent interventions). Furthermore, the participants asked fewer closed and more open questions after training in motivational interviewing than before. In the assessment of proficiency and competency, most of the participants scored higher after the training in motivational interviewing. Training in motivational interviewing improves healthcare professionals' proficiency and competency when communicating with obese pregnant women, albeit that the effect was not universal. © 2014 Nordic Federation of Societies of Obstetrics and Gynecology.

  16. Metrological approach to quantitative analysis of clinical samples by LA-ICP-MS: A critical review of recent studies.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta

    2018-05-15

    Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on a properly prepared sample and properly performed calibration; it is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and estimating the uncertainty. This review paper discusses aspects related to sampling, preparation, and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result, and the uncertainty of the result. This work promotes the introduction of metrological principles into chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
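
    For readers unfamiliar with the machinery behind "estimation of the uncertainty": for a result y = f(x_1, ..., x_n) with independent input quantities, the standard GUM expressions for the combined standard uncertainty and the expanded uncertainty take the form below. This is general metrology practice, not an equation reproduced from the review.

      u_c(y) \;=\; \sqrt{\sum_{i=1}^{n} \left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)},
      \qquad
      U \;=\; k\,u_c(y), \quad k \approx 2 \ \text{for a coverage probability of about 95\%}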

  17. Practical technique to quantify small, dense low-density lipoprotein cholesterol using dynamic light scattering

    NASA Astrophysics Data System (ADS)

    Trirongjitmoah, Suchin; Iinaga, Kazuya; Sakurai, Toshihiro; Chiba, Hitoshi; Sriyudthsak, Mana; Shimizu, Koichi

    2016-04-01

    Quantification of small, dense low-density lipoprotein (sdLDL) cholesterol is clinically significant. We propose a practical technique to estimate the amount of sdLDL cholesterol using dynamic light scattering (DLS). A new closed-form analytical solution was obtained to estimate the weight fraction of one species of scatterers in a DLS measurement of two species of scatterers. Using this solution, we can quantify the sdLDL cholesterol amount from the amounts of low-density lipoprotein cholesterol and high-density lipoprotein (HDL) cholesterol, which are commonly obtained through clinical tests. The accuracy of the proposed technique was confirmed experimentally using latex spheres with known size distributions. The applicability of the proposed technique was examined using samples of human blood serum. The possibility of estimating the sdLDL amount using the HDL data was demonstrated. These results suggest that quantitative estimation of sdLDL amounts using DLS is feasible for point-of-care testing in clinical practice.
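
    The paper's closed-form solution itself is not reproduced in the abstract. As a sketch of the general style of such a solution, under the simplifying assumption that the measured mean diffusion coefficient is a strictly intensity-weighted average of the two species, the intensity fraction f of species 1 follows by inverting the weighted mean, with each diffusion coefficient D_i tied to a hydrodynamic diameter d_i through the Stokes-Einstein relation:

      \bar{D} \;=\; f\,D_1 + (1-f)\,D_2
      \quad\Longrightarrow\quad
      f \;=\; \frac{\bar{D} - D_2}{D_1 - D_2},
      \qquad
      D_i \;=\; \frac{k_B T}{3\pi\,\eta\, d_i}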

  18. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature- and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
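
    The abstract does not give the mapping technique itself. One plausible sketch, offered purely as an assumption, is a square pulse that preserves the peak surface temperature rise and the time integral of the temperature rise:

      # Hedged sketch: map a time-varying surface temperature history to an
      # equivalent square heat pulse that preserves the peak temperature rise
      # and the integrated temperature-time area. This is an assumption for
      # illustration, not necessarily the mapping derived in the paper.
      import numpy as np

      t = np.linspace(0.0, 2000.0, 2001)                  # time, s
      T_init = 300.0                                      # initial temperature, K
      T_surface = T_init + 900.0 * np.exp(-((t - 600.0) / 300.0) ** 2)

      rise = T_surface - T_init
      T_peak = rise.max()
      area = np.trapz(rise, t)        # integral of temperature rise over time

      pulse_duration = area / T_peak  # square pulse with same peak and area
      print(f"Equivalent square pulse: {T_peak:.0f} K rise for "
            f"{pulse_duration:.0f} s")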

  19. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
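
    As a toy illustration of mutual-information-based inference, the sketch below scores candidate parameter vectors by the binned mutual information between their predictions and the measured activities; the data, the linear sequence-function model, and the binning are all invented for the sketch.

      # Toy sketch of mutual-information-based model fitting: choose the
      # parameter vector whose predictions share the most information with
      # the measured activities. Everything here is invented for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      seqs = rng.integers(0, 4, size=(2000, 10))             # toy "sequences"
      true_w = rng.normal(size=10)
      activity = seqs @ true_w + rng.normal(0.0, 2.0, 2000)  # noisy measurements

      def mutual_info(x, y, bins=15):
          # Plug-in MI estimate (in nats) from a 2-D histogram.
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

      # Crude search: perturb the weights and keep the candidate maximizing
      # MI(prediction, measurement) rather than a likelihood.
      candidates = [true_w + rng.normal(0.0, s, 10) for s in (0.0, 0.5, 1.0, 2.0)]
      scores = [mutual_info(seqs @ w, activity) for w in candidates]
      best = int(np.argmax(scores))
      print(best, scores[best])

    Note that rescaling any candidate's predictions leaves its MI score unchanged, a one-line view of why diffeomorphic modes are unconstrained by this kind of inference.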

  20. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
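
    In its simplest form, the relative-quantitation step reduces to a ratio of summed extracted-ion intensities for the light- and heavy-labeled forms of each glycan; a schematic sketch with invented numbers follows.

      # Schematic sketch of light/heavy relative quantitation from extracted
      # ion chromatograms (XICs). Intensity arrays are invented placeholders.
      import numpy as np

      # Summed XIC intensity across the chromatographic peak for one glycan,
      # light-tag (sample A) versus heavy-tag (sample B) labeled.
      xic_light = np.array([0.0, 1.2e5, 4.8e5, 5.1e5, 1.9e5, 0.0])
      xic_heavy = np.array([0.0, 0.9e5, 3.6e5, 3.9e5, 1.4e5, 0.0])

      ratio = xic_light.sum() / xic_heavy.sum()
      print(f"Relative abundance (A/B) for this glycan: {ratio:.2f}")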