Sample records for protocols analytical methods

  1. HSRP and HSRP Partner Analytical Methods and Protocols

    EPA Pesticide Factsheets

    HSRP has worked with various partners to develop and test analytical methods and protocols for use by laboratories charged with analyzing environmental and/or buildling material samples following contamination incident.

  2. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work a protocol to validate analytical procedures for the quantification of drug substances formulated in polymeric systems that comprise both drug entrapped into the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test) is developed. This protocol is applied to the validation two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (ability to quantify dexamethasone phosphate disodium in presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer

  4. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended to provide a case-study on how to validate the analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work has performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected along nine months. The present work also scopes the uncertainty associated with both analytical protocols. In detail, uncertainty of water sample was performed through a conventional approach. However, for the sediments matrices, the estimation of proportional/constant bias is also included due to its inhomogeneity. Results for the sediment matrix are reliable, showing a range 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely applied to assess trueness of novel analytical methods and up to now this methodology was not focused on organochlorine compounds in environmental matrices.

  5. Effect of different analyte diffusion/adsorption protocols on SERS signals

    NASA Astrophysics Data System (ADS)

    Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju

    2018-07-01

    The effect of different analyte diffusion/adsorption protocols was studied which is often overlooked in surface-enhanced Raman scattering (SERS) technique. Three protocols: highly concentrated dilution (HCD) protocol, half-half dilution (HHD) protocol and layered adsorption (LA) protocol were studied and the SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) which were modified by polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-different time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes will cause different SERS signal enhancements. HHD protocol could produce the most uniform and reproducible samples, and the corresponding signal intensity of the analyte is the strongest. This study will help to understand and promote the use of SERS technique in quantitative analysis.

  6. Molecular detection of Borrelia burgdorferi sensu lato – An analytical comparison of real-time PCR protocols from five different Scandinavian laboratories

    PubMed Central

    Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.

    2017-01-01

    Introduction Lyme borreliosis (LB) is the most common tick transmitted disease in Europe. The diagnosis of LB today is based on the patient´s medical history, clinical presentation and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of samples (reference panels; all blinded) i) cDNA extracted and transcribed from water spiked with cultured Borrelia strains, ii) cerebrospinal fluid spiked with cultured Borrelia strains, and iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene, however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template the analytical sensitivity was in general higher for the protocols using DNA as template regardless of the use of target gene. The analytical specificity for all eight protocols was high. However, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae or Borrelia japonica. PMID:28937997

  7. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  8. Protocol for Detection of Yersinia pestis in Environmental ...

    EPA Pesticide Factsheets

    Methods Report This is the first ever open-access and detailed protocol available to all government departments and agencies, and their contractors to detect Yersinia pestis, the pathogen that causes plague, from multiple environmental sample types including water. Each analytical method includes sample processing procedure for each sample type in a step-by-step manner. It includes real-time PCR, traditional microbiological culture, and the Rapid Viability PCR (RV-PCR) analytical methods. For large volume water samples it also includes an ultra-filtration-based sample concentration procedure. Because of such a non-restrictive availability of this protocol to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze large number of samples during a wide-area plague incident.

  9. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    PubMed

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers designed by only protocols recommendations and Monte Carlo (MC) based upon data derived for an 18 MV Varian 2100Clinac accelerator. High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutron has been recommended by IAEA and NCRP new protocols. The latest protocols released by the IAEA (safety report No. 47) and NCRP report No. 151 were used for the bunker designing calculations. MC method based upon data was also derived. Two bunkers using protocols and MC upon data were designed and discussed. From designed door's thickness, the door designed by the MC simulation and Wu-McGinley analytical method was closer in both BPE and lead thickness. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, total concrete thickness of 1709 mm was required. Calculating the same parameters value with the recommended analytical methods resulted in 1762 mm for the required thickness using 445 mm as recommended by TVL for the concrete. Additionally, for the secondary barrier the thickness of 752.05 mm was obtained. Our results showed MC simulation and the followed protocols recommendations in dose calculation are in good agreement in the radiation contamination dose calculation. Difference between the two analytical and MC simulation methods revealed that the application of only one method for the bunker design may lead to underestimation or overestimation in dose and shielding calculations.

  10. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations

    PubMed Central

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    Aim The aim of this study was to compare two bunkers designed by only protocols recommendations and Monte Carlo (MC) based upon data derived for an 18 MV Varian 2100Clinac accelerator. Background High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutron has been recommended by IAEA and NCRP new protocols. Materials and methods The latest protocols released by the IAEA (safety report No. 47) and NCRP report No. 151 were used for the bunker designing calculations. MC method based upon data was also derived. Two bunkers using protocols and MC upon data were designed and discussed. Results From designed door's thickness, the door designed by the MC simulation and Wu–McGinley analytical method was closer in both BPE and lead thickness. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, total concrete thickness of 1709 mm was required. Calculating the same parameters value with the recommended analytical methods resulted in 1762 mm for the required thickness using 445 mm as recommended by TVL for the concrete. Additionally, for the secondary barrier the thickness of 752.05 mm was obtained. Conclusion Our results showed MC simulation and the followed protocols recommendations in dose calculation are in good agreement in the radiation contamination dose calculation. Difference between the two analytical and MC simulation methods revealed that the application of only one method for the bunker design may lead to underestimation or overestimation in dose and shielding calculations. PMID:26900357

  11. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  12. Comparison of Gluten Extraction Protocols Assessed by LC-MS/MS Analysis.

    PubMed

    Fallahbaghery, Azadeh; Zou, Wei; Byrne, Keren; Howitt, Crispin A; Colgrave, Michelle L

    2017-04-05

    The efficiency of gluten extraction is of critical importance to the results derived from any analytical method for gluten detection and quantitation, whether it employs reagent-based technology (antibodies) or analytical instrumentation (mass spectrometry). If the target proteins are not efficiently extracted, the end result will be an under-estimation in the gluten content posing a health risk to people affected by conditions such as celiac disease (CD) and nonceliac gluten sensitivity (NCGS). Five different extraction protocols were investigated using LC-MRM-MS for their ability to efficiently and reproducibly extract gluten. The rapid and simple "IPA/DTT" protocol and related "two-step" protocol were enriched for gluten proteins, 55/86% (trypsin/chymotrypsin) and 41/68% of all protein identifications, respectively, with both methods showing high reproducibility (CV < 15%). When using multistep protocols, it was critical to examine all fractions, as coextraction of proteins occurred across fractions, with significant levels of proteins existing in unexpected fractions and not all proteins within a particular gluten class behaving the same.

  13. Accuracy verification and identification of matrix effects. The College of American Pathologists' Protocol.

    PubMed

    Eckfeldt, J H; Copeland, K R

    1993-04-01

    Proficiency testing using stabilized control materials has been used for decades as a means of monitoring and improving performance in the clinical laboratory. Often, the commonly used proficiency testing materials exhibit "matrix effects" that cause them to behave differently from fresh human specimens in certain clinical analytic systems. Because proficiency testing is the primary method in which regulatory agencies have chosen to evaluate clinical laboratory performance, the College of American Pathologists (CAP) has proposed guidelines for investigating the influence of matrix effects on their Survey results. The purpose of this investigation was to determine the feasibility, usefulness, and potential problems associated with this CAP Matrix Effect Analytical Protocol, in which fresh patient specimens and CAP proficiency specimens are analyzed simultaneously by a field method and a definitive, reference, or other comparative method. The optimal outcome would be that both the fresh human and CAP Survey specimens agree closely with the comparative method result. However, this was not always the case. Using several different analytic configurations, we were able to demonstrate matrix and calibration biases for several of the analytes investigated.

  14. Development of the Basis for an Analytical Protocol for Feeds and Products of Bio-oil Hydrotreatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oasmaa, Anja; Kuoppala, Eeva; Elliott, Douglas C.

    2012-04-02

    Methods for easily following the main changes in the composition, stability, and acidity of bio-oil in hydrotreatment are presented. The correlation to more conventional methods is provided. Depending on the final use the upgrading requirement is different. This will create challenges also for the analytical protocol. Polar pyrolysis liquids and their products can be divided into five main groups with solvent fractionation the change in which is easy to follow. This method has over ten years been successfully used for comparison of fast pyrolysis bio-oil quality, and the changes during handling, and storage, provides the basis of the analytical protocolmore » presented in this paper. The method has most recently been used also for characterisation of bio-oil hydrotreatment products. Discussion on the use of gas chromatographic and spectroscopic methods is provided. In addition, fuel oil analyses suitable for fast pyrolysis bio-oils and hydrotreatment products are discussed.« less

  15. STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...

  16. Characterizing Contamination and Assessing Exposure, Risk and Resilience

    EPA Pesticide Factsheets

    EPA supports its responders' ability to characterize site contamination by developing sampling protocols, sample preparation methods, and analytical methods for chemicals, biotoxins, microbial pathogens, and radiological agents.

  17. ELISA: Methods and Protocols

    USDA-ARS?s Scientific Manuscript database

    The antibody is central to the performance of an ELISA providing the basis of analyte selection and detection. It is the interaction of antibody with analyte under defined conditions that dictates the outcome of the ELISA and deviations in those conditions will impact assay performance. The aim of...

  18. Keeping It Simple: Can We Estimate Malting Quality Potential Using an Isothermal Mashing Protocol and Common Laboratory Instrumentation?

    USDA-ARS?s Scientific Manuscript database

    Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...

  19. Method 1200: Analytical Protocol for Non-Typhoidal Salmonella in Drinking Water and Surface Water

    EPA Pesticide Factsheets

    Method 1200 is used for identification, confirmation and quantitation of non-typhoidal Salmonella in water samples, using selective and non-selective media followed by biochemical and serological confirmation.

  20. Working towards accreditation by the International Standards Organization 15189 Standard: how to validate an in-house developed method an example of lead determination in whole blood by electrothermal atomic absorption spectrometry.

    PubMed

    Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe

    2014-09-01

    Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Besides, the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal absorption spectrometry. The fundamental parameters tested were, selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.

  1. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, T.F.; Thorne, P.G.; Myers, K.F.

    Salting-out solvent extraction (SOE) was compared with cartridge and membrane solid-phase extraction (SPE) for preconcentration of nitroaromatics, nitramines, and aminonitroaromatics prior to determination by reversed-phase high-performance liquid chromatography. The solid phases used were manufacturer-cleaned materials, Porapak RDX for the cartridge method and Empore SDB-RPS for the membrane method. Thirty-three groundwater samples from the Naval Surface Warfare Center, Crane, Indiana, were analyzed using the direct analysis protocol specified in SW846 Method 8330, and the results were compared with analyses conducted after preconcentration using SOE with acetonitrile, cartridge-based SPE, and membrane-based SPE. For high-concentration samples, analytical results from the three preconcentration techniquesmore » were compared with results from the direct analysis protocol; good recovery of all target analytes was achieved by all three pre-concentration methods. For low-concentration samples, results from the two SPE methods were correlated with results from the SOE method; very similar data was obtained by the SOE and SPE methods, even at concentrations well below 1 microgram/L.« less

  3. Standardization of Nanoparticle Characterization: Methods for Testing Properties, Stability, and Functionality of Edible Nanoparticles.

    PubMed

    McClements, Jake; McClements, David Julian

    2016-06-10

    There has been a rapid increase in the fabrication of various kinds of edible nanoparticles for oral delivery of bioactive agents, such as those constructed from proteins, carbohydrates, lipids, and/or minerals. It is currently difficult to compare the relative advantages and disadvantages of different kinds of nanoparticle-based delivery systems because researchers use different analytical instruments and protocols to characterize them. In this paper, we briefly review the various analytical methods available for characterizing the properties of edible nanoparticles, such as composition, morphology, size, charge, physical state, and stability. This information is then used to propose a number of standardized protocols for characterizing nanoparticle properties, for evaluating their stability to environmental stresses, and for predicting their biological fate. Implementation of these protocols would facilitate comparison of the performance of nanoparticles under standardized conditions, which would facilitate the rational selection of nanoparticle-based delivery systems for different applications in the food, health care, and pharmaceutical industries.

  4. Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma

    NASA Astrophysics Data System (ADS)

    Dhar, Purbarun; Sirisha Maganti, Lakshmi

    2017-08-01

    This article proposes a simplistic and realistic method where a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximated analytical expression for thermal history within the tumour is derived based on the lumped capacitance approach and considers all therapy protocols and parameters. The present method is simplistic and provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated with respect to several experimental reports on animal models such as mice/rabbit/hamster and human clinical trials. It has been observed that the model is able to accurately estimate the thermal history within the carcinoma during the hyperthermia therapy. The present approach may find implications in a-priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.

  5. A novel method for quantification of gemcitabine and its metabolites 2',2'-difluorodeoxyuridine and gemcitabine triphosphate in tumour tissue by LC-MS/MS: comparison with (19)F NMR spectroscopy.

    PubMed

    Bapiro, Tashinga E; Richards, Frances M; Goldgraben, Mae A; Olive, Kenneth P; Madhu, Basetti; Frese, Kristopher K; Cook, Natalie; Jacobetz, Michael A; Smith, Donna-Michelle; Tuveson, David A; Griffiths, John R; Jodrell, Duncan I

    2011-11-01

    To develop a sensitive analytical method to quantify gemcitabine (2',2'-difluorodeoxycytidine, dFdC) and its metabolites 2',2'-difluorodeoxyuridine (dFdU) and 2',2'-difluorodeoxycytidine-5'-triphosphate (dFdCTP) simultaneously from tumour tissue. Pancreatic ductal adenocarcinoma tumour tissue from genetically engineered mouse models of pancreatic cancer (KP ( FL/FL ) C and KP ( R172H/+) C) was collected after dosing the mice with gemcitabine. (19)F NMR spectroscopy and LC-MS/MS protocols were optimised to detect gemcitabine and its metabolites in homogenates of the tumour tissue. A (19)F NMR protocol was developed, which was capable of distinguishing the three analytes in tumour homogenates. However, it required at least 100 mg of the tissue in question and a long acquisition time per sample, making it impractical for use in large PK/PD studies or clinical trials. The LC-MS/MS protocol was developed using porous graphitic carbon to separate the analytes, enabling simultaneous detection of all three analytes from as little as 10 mg of tissue, with a sensitivity for dFdCTP of 0.2 ng/mg tissue. Multiple pieces of tissue from single tumours were analysed, showing little intra-tumour variation in the concentrations of dFdC or dFdU (both intra- and extra-cellular). Intra-tumoural variation was observed in the concentration of dFdCTP, an intra-cellular metabolite, which may reflect regions of different cellularity within a tumour. We have developed a sensitive LC-MS/MS method capable of quantifying gemcitabine, dFdU and dFdCTP in pancreatic tumour tissue. The requirement for only 10 mg of tissue enables this protocol to be used to analyse multiple areas from a single tumour and to spare tissue for additional pharmacodynamic assays.

  6. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  7. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol

    PubMed Central

    Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.

    2015-01-01

    Background: Inaccurate blood glucsoe monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program” is attached as supplementary material. Conclusion: This program is needed because currently once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642

  8. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  9. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.

  10. Development of characterization protocol for mixed liquid radioactive waste classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakaria, Norasalwa, E-mail: norasalwa@nuclearmalaysia.gov.my; Wafa, Syed Asraf; Wo, Yii Mei

    2015-04-29

    Mixed liquid organic waste generated from health-care and research activities containing tritium, carbon-14, and other radionuclides posed specific challenges in its management. Often, these wastes become legacy waste in many nuclear facilities and being considered as ‘problematic’ waste. One of the most important recommendations made by IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with estimated volume of 6000 litres are currently stored at the National Radioactive Waste Management Centre, Malaysia and some have been stored for more than 25 years. The aim of this studymore » is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclides profiling using various analytical procedures including gross alpha/ gross beta, gamma spectrometry, and LSC method. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste.« less

  11. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, mainly from green through yellow to red depicting low, medium to high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Elucidation of several neglected reactions in the GC-MS identification of sialic acids as heptafluorobutyrates calls for an urgent reassessment of previous claims.

    PubMed

    Rota, Paola; Anastasia, Luigi; Allevi, Pietro

    2015-05-07

    The current analytical protocol used for the GC-MS determination of free or 1,7-lactonized natural sialic acids (Sias), as heptafluorobutyrates, overlooks several transformations. Using authentic reference standards and by combining GC-MS and NMR analyses, flaws in the analytical protocol were pinpointed and elucidated, thus establishing the scope and limitations of the method. It was demonstrated that (a) Sias 1,7-lactones, even if present in biological samples, decompose under the acidic hydrolysis conditions used for their release; (b) Sias 1,7-lactones are unpredicted artifacts, accidentally generated from their parent acids; (c) the N-acetyl group is quantitatively exchanged with that of the derivatizing perfluorinated anhydride; (d) the partial or complete failure of the Sias esterification-step with diazomethane leads to the incorrect quantification and structure attribution of all free Sias. While these findings prompt an urgent correction and improvement of the current analytical protocol, they could be instrumental for a critical revision of many incorrect claims reported in the literature.

  13. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Determination of a risk management primer at petroleum-contaminated sites: developing new human health risk assessment strategy.

    PubMed

    Park, In-Sun; Park, Jae-Woo

    2011-01-30

    Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (e.g., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22 and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. The stepwise ultrasonication-based analytical process was established to measure TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies in TPH, aliphatic, and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL-contamination. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. Addressing the need for biomarker liquid chromatography/mass spectrometry assays: a protocol for effective method development for the bioanalysis of endogenous compounds in cerebrospinal fluid.

    PubMed

    Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M

    2013-08-30

    Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and specificity of the detection thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra high performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.

  16. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr; Papin, Arnaud; Padox, Jean-Marie

    Highlights: • Knowledge of wastes in substances will be necessary to assess HP1–HP15 hazard properties. • A new analytical protocol is proposed for this and tested by two service laboratories on 32 samples. • Sixty-three percentage of the samples have a satisfactory analytical balance between 90% and 110%. • Eighty-four percentage of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, in progress, is being normalized in France and is be proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe using largely the hazardmore » properties of its constituents, according to the the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC–MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of ‘pools’ of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved ‘mass’ during the chromatography of volatile and semi-volatile compounds. The concentration of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved ‘pools’) should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in the French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and most of liquid samples (difficulties were caused in some samples from polymers in solution and vegetable oil). The protocol is submitted to French and European normalization bodies (AFNOR and CEN) and further improvements are awaited.« less

  17. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    NASA Astrophysics Data System (ADS)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94 %, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates as those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze a large amount of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.

  18. Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP)

    EPA Pesticide Factsheets

    The Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) provides guidance for the planning, implementation and assessment phases of projects that require laboratory analysis of radionuclides.

  19. A QUICK OVERVIEW OF MASS SPECTROMETRY, NEGATIVE IONS, AND TOXAPHENE DETERMINATION (PROPOSED METHOD 8276)

    EPA Science Inventory

    A new method for toxaphene and toxaphene congener determination has been proposed by OSW as the response to an internal report from the OIG relative to toxaphene determination. In the course of this development, ORD was asked to prepare a new GC/NIMS protocol for 8081 analytes t...

  20. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    PubMed

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceutics are usually performed at intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide, subunit and glycan level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. [Validation of an in-house method for the determination of zinc in serum: Meeting the requirements of ISO 17025].

    PubMed

    Llorente Ballesteros, M T; Navarro Serrano, I; López Colón, J L

    2015-01-01

    The aim of this report is to propose a scheme for validation of an analytical technique according to ISO 17025. According to ISO 17025, the fundamental parameters tested were: selectivity, calibration model, precision, accuracy, uncertainty of measurement, and analytical interference. A protocol has been developed that has been applied successfully to quantify zinc in serum by atomic absorption spectrometry. It is demonstrated that our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  2. MFAHP: A novel method on the performance evaluation of the industrial wireless networked control system

    NASA Astrophysics Data System (ADS)

    Wu, Linqin; Xu, Sheng; Jiang, Dezhi

    2015-12-01

    Industrial wireless networked control system has been widely used, and how to evaluate the performance of the wireless network is of great significance. In this paper, considering the shortcoming of the existing performance evaluation methods, a comprehensive performance evaluation method of networks multi-indexes fuzzy analytic hierarchy process (MFAHP) combined with the fuzzy mathematics and the traditional analytic hierarchy process (AHP) is presented. The method can overcome that the performance evaluation is not comprehensive and subjective. Experiments show that the method can reflect the network performance of real condition. It has direct guiding role on protocol selection, network cabling, and node setting, and can meet the requirements of different occasions by modifying the underlying parameters.

  3. Comparison of methods for estimating density of forest songbirds from point counts

    Treesearch

    Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey

    2011-01-01

    New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...

  4. Comparison of the Liaison® Calprotectin kit with a well established point of care test (Quantum Blue - Bühlmann-Alere®) in terms of analytical performances and ability to detect relapses amongst a Crohn population in follow-up.

    PubMed

    Delefortrie, Quentin; Schatt, Patricia; Grimmelprez, Alexandre; Gohy, Patrick; Deltour, Didier; Collard, Geneviève; Vankerkhoven, Patrick

    2016-02-01

    Although colonoscopy associated with histopathological sampling remains the gold standard in the diagnostic and follow-up of inflammatory bowel disease (IBD), calprotectin is becoming an essential biomarker in gastroenterology. The aim of this work is to compare a newly developed kit (Liaison® Calprotectin - Diasorin®) and its two distinct extraction protocols (weighing and extraction device protocol) with a well established point of care test (Quantum Blue® - Bühlmann-Alere®) in terms of analytical performances and ability to detect relapses amongst a Crohn's population in follow-up. Stool specimens were collected over a six month period and were composed of control and Crohn's patients. Amongst the Crohn's population disease activity (active vs quiescent) was evaluated by gastroenterologists. A significant difference was found between all three procedures in terms of calprotectin measurements (weighing protocol=30.3μg/g (median); stool extraction device protocol=36.9μg/g (median); Quantum Blue® (median)=63; Friedman test, P value=0.05). However, a good correlation was found between both extraction methods coupled with the Liaison® analyzer and between the Quantum Blue® (weighing protocol/extraction device protocol Rs=0.844, P=0.01; Quantum Blue®/extraction device protocol Rs=0.708, P=0.01; Quantum Blue®/weighing protocol, Rs=0.808, P=0.01). Finally, optimal cut-offs (and associated negative predictive values - NPV) for detecting relapses were in accordance with above results (Quantum Blue® 183.5μg/g and NPV of 100%>extraction device protocol+Liaison® analyzer 124.5μg/g and NPV of 93.5%>weighing protocol+Liaison® analyzer 106.5μg/g and NPV of 95%). Although all three methods correlated well and had relatively good NPV in terms of detecting relapses amongst a Crohn's population in follow-up, the lack of any international standard is the origin of different optimal cut-offs between the three procedures. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. Successful attack on permutation-parity-machine-based neural cryptography.

    PubMed

    Seoane, Luís F; Ruttor, Andreas

    2012-02-01

    An algorithm is presented which implements a probabilistic attack on the key-exchange protocol based on permutation parity machines. Instead of imitating the synchronization of the communicating partners, the strategy consists of a Monte Carlo method to sample the space of possible weights during inner rounds and an analytic approach to convey the extracted information from one outer round to the next one. The results show that the protocol under attack fails to synchronize faster than an eavesdropper using this algorithm.

  6. ANALYTICAL METHODS FOR FUEL OXYGENATES

    EPA Science Inventory

    MTBE (and potentially any other oxygenate) may be present at any petroleum UST site, whether the release is new or old, virtually anywhere in the United States. Consequently, it is prudent to analyze samples for the entire suite of oxygenates as identified in this protocol (i.e....

  7. Numerical approach for unstructured quantum key distribution

    PubMed Central

    Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert

    2016-01-01

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study ‘unstructured' protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739

  8. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    PubMed

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  9. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  10. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background : Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods : Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results : The GA methodology triggered the emergence of ' common g round ' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusion s : Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ' common ground' among diverse stakeholders about health data and their implications.

  11. An in silico method to identify computer-based protocols worthy of clinical study: An insulin infusion protocol use case

    PubMed Central

    Wong, Anthony F; Pielmeier, Ulrike; Haug, Peter J; Andreassen, Steen

    2016-01-01

    Objective Develop an efficient non-clinical method for identifying promising computer-based protocols for clinical study. An in silico comparison can provide information that informs the decision to proceed to a clinical trial. The authors compared two existing computer-based insulin infusion protocols: eProtocol-insulin from Utah, USA, and Glucosafe from Denmark. Materials and Methods The authors used eProtocol-insulin to manage intensive care unit (ICU) hyperglycemia with intravenous (IV) insulin from 2004 to 2010. Recommendations accepted by the bedside clinicians directly link the subsequent blood glucose values to eProtocol-insulin recommendations and provide a unique clinical database. The authors retrospectively compared in silico 18 984 eProtocol-insulin continuous IV insulin infusion rate recommendations from 408 ICU patients with those of Glucosafe, the candidate computer-based protocol. The subsequent blood glucose measurement value (low, on target, high) was used to identify if the insulin recommendation was too high, on target, or too low. Results Glucosafe consistently provided more favorable continuous IV insulin infusion rate recommendations than eProtocol-insulin for on target (64% of comparisons), low (80% of comparisons), or high (70% of comparisons) blood glucose. Aggregated eProtocol-insulin and Glucosafe continuous IV insulin infusion rates were clinically similar though statistically significantly different (Wilcoxon signed rank test P = .01). In contrast, when stratified by low, on target, or high subsequent blood glucose measurement, insulin infusion rates from eProtocol-insulin and Glucosafe were statistically significantly different (Wilcoxon signed rank test, P < .001), and clinically different. Discussion This in silico comparison appears to be an efficient nonclinical method for identifying promising computer-based protocols. Conclusion Preclinical in silico comparison analytical framework allows rapid and inexpensive identification of computer-based protocol care strategies that justify expensive and burdensome clinical trials. PMID:26228765

  12. Development of a Standardized Approach for Assessing Potential Risks to Amphibians Exposed to Sediment and Hydric Soils

    DTIC Science & Technology

    2004-05-01

    following digestion using method 3005A. Copper concentrations were verified using atomic absorption spectroscopy/graphite furnace. Each chamber...1995. Ammonia Variation in Sediments: Spatial, Temporal and Method -Related Effects. Environ. Toxicol. Chem. 14:1499-1506. Savage, W.K., F.W...Regulator Approved Methods and Protocols for Conducting Marine and Terrestrial Risk Assessments 1.III.01.k - Improved Field Analytical Sensors

  13. Fabricating a UV-Vis and Raman Spectroscopy Immunoassay Platform.

    PubMed

    Hanson, Cynthia; Israelsen, Nathan D; Sieverts, Michael; Vargis, Elizabeth

    2016-11-10

    Immunoassays are used to detect proteins based on the presence of associated antibodies. Because of their extensive use in research and clinical settings, a large infrastructure of immunoassay instruments and materials can be found. For example, 96- and 384-well polystyrene plates are available commercially and have a standard design to accommodate ultraviolet-visible (UV-Vis) spectroscopy machines from various manufacturers. In addition, a wide variety of immunoglobulins, detection tags, and blocking agents for customized immunoassay designs such as enzyme-linked immunosorbent assays (ELISA) are available. Despite the existing infrastructure, standard ELISA kits do not meet all research needs, requiring individualized immunoassay development, which can be expensive and time-consuming. For example, ELISA kits have low multiplexing (detection of more than one analyte at a time) capabilities as they usually depend on fluorescence or colorimetric methods for detection. Colorimetric and fluorescent-based analyses have limited multiplexing capabilities due to broad spectral peaks. In contrast, Raman spectroscopy-based methods have a much greater capability for multiplexing due to narrow emission peaks. Another advantage of Raman spectroscopy is that Raman reporters experience significantly less photobleaching than fluorescent tags 1 . Despite the advantages that Raman reporters have over fluorescent and colorimetric tags, protocols to fabricate Raman-based immunoassays are limited. The purpose of this paper is to provide a protocol to prepare functionalized probes to use in conjunction with polystyrene plates for direct detection of analytes by UV-Vis analysis and Raman spectroscopy. This protocol will allow researchers to take a do-it-yourself approach for future multi-analyte detection while capitalizing on pre-established infrastructure.

  14. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  15. 77 FR 15722 - Southern California Hook and Line Survey; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ... meeting to evaluate the Southern California Shelf Rockfish Hook and Line Survey which was designed to... and Line survey design and protocols; (2) examine the analytical methods used to generate rockfish... California Hook and Line Survey; Public Meeting AGENCY: National Marine Fisheries Service (NMFS), National...

  16. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  17. Geochemical and mineralogical data for soils of the conterminous United States

    USGS Publications Warehouse

    Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Solano, Federico; Kilburn, James E.; Fey, David L.

    2013-01-01

    In 2007, the U.S. Geological Survey initiated a low-density (1 site per 1,600 square kilometers, 4,857 sites) geochemical and mineralogical survey of soils of the conterminous United States as part of the North American Soil Geochemical Landscapes Project. Sampling and analytical protocols were developed at a workshop in 2003, and pilot studies were conducted from 2004 to 2007 to test and refine these recommended protocols. The final sampling protocol for the national-scale survey included, at each site, a sample from a depth of 0 to 5 centimeters, a composite of the soil A horizon, and a deeper sample from the soil C horizon or, if the top of the C horizon was at a depth greater than 1 meter, from a depth of approximately 80–100 centimeters. The <2-millimeter fraction of each sample was analyzed for a suite of 45 major and trace elements by methods that yield the total or near-total elemental content. The major mineralogical components in the samples from the soil A and C horizons were determined by a quantitative X-ray diffraction method using Rietveld refinement. Sampling in the conterminous United States was completed in 2010, with chemical and mineralogical analyses completed in May 2013. The resulting dataset provides an estimate of the abundance and spatial distribution of chemical elements and minerals in soils of the conterminous United States and represents a baseline for soil geochemistry and mineralogy against which future changes may be recognized and quantified. This report (1) describes the sampling, sample preparation, and analytical methods used; (2) gives details of the quality control protocols used to monitor the quality of chemical and mineralogical analyses over approximately six years; and (3) makes available the soil geochemical and mineralogical data in downloadable tables.

  18. Carbon Nanotube Material Quality Assessment

    NASA Technical Reports Server (NTRS)

    Yowell, Leonard; Arepalli, Sivaram; Sosa, Edward; Niolaev, Pavel; Gorelik, Olga

    2006-01-01

    The nanomaterial activities at NASA Johnson Space Center focus on carbon nanotube production, characterization and their applications for aerospace systems. Single wall carbon nanotubes are produced by arc and laser methods. Characterization of the nanotube material is performed using the NASA JSC protocol developed by combining analytical techniques of SEM, TEM, UV-VIS-NIR absorption, Raman, and TGA. A possible addition of other techniques such as XPS, and ICP to the existing protocol will be discussed. Changes in the quality of the material collected in different regions of the arc and laser production chambers is assessed using the original JSC protocol. The observed variations indicate different growth conditions in different regions of the production chambers.

  19. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    PubMed

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO 2 ), partial oxygen pressure (pO 2 ), sodium (Na + ), potassium (K + ), chloride (Cl - ), ionized calcium ( i Ca 2+ ), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. Obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations were studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement to the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO 2 , K + , lactate and tHb; i-STAT CG8+: pO 2 , Na + , i Ca 2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. On the contrary, significant interference for benzalkonium and hemolysis on electrolyte measurements were found, for which the user is notified by an interferent specific flag. Identification of sample errors from pre-analytical sources, such as interferences and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, makes the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  20. Entanglement distillation protocols and number theory

    NASA Astrophysics Data System (ADS)

    Bombin, H.; Martin-Delgado, M. A.

    2005-09-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set ZDn associated with Bell diagonal states is a module rather than a vector space. We find that a partition of ZDn into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D . When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.

  1. Pre-analytical effects of blood sampling and handling in quantitative immunoassays for rheumatoid arthritis.

    PubMed

    Zhao, Xiaoyan; Qureshi, Ferhan; Eastman, P Scott; Manning, William C; Alexander, Claire; Robinson, William H; Hesterberg, Lyndal K

    2012-04-30

    Variability in pre-analytical blood sampling and handling can significantly impact results obtained in quantitative immunoassays. Understanding the impact of these variables is critical for accurate quantification and validation of biomarker measurements. Particularly, in the design and execution of large clinical trials, even small differences in sample processing and handling can have dramatic effects in analytical reliability, results interpretation, trial management and outcome. The effects of two common blood sampling methods (serum vs. plasma) and two widely-used serum handling methods (on the clot with ambient temperature shipping, "traditional", vs. centrifuged with cold chain shipping, "protocol") on protein and autoantibody concentrations were examined. Matched serum and plasma samples were collected from 32 rheumatoid arthritis (RA) patients representing a wide range of disease activity status. Additionally, a set of matched serum samples with two sample handling methods was collected. One tube was processed per manufacturer's instructions and shipped overnight on cold packs (protocol). The matched tube, without prior centrifugation, was simultaneously shipped overnight at ambient temperatures (traditional). Upon delivery, the traditional tube was centrifuged. All samples were subsequently aliquoted and frozen prior to analysis of protein and autoantibody biomarkers. Median correlation between paired serum and plasma across all autoantibody assays was 0.99 (0.98-1.00) with a median % difference of -3.3 (-7.5 to 6.0). In contrast, observed protein biomarker concentrations were significantly affected by sample types, with median correlation of 0.99 (0.33-1.00) and a median % difference of -10 (-55 to 23). When the two serum collection/handling methods were compared, the median correlation between paired samples for autoantibodies was 0.99 (0.91-1.00) with a median difference of 4%. In contrast, significant increases were observed in protein biomarker concentrations among certain biomarkers in samples processed with the 'traditional' method. Autoantibody quantification appears robust to both sample type (plasma vs. serum) and pre-analytical sample collection/handling methods (protocol vs. traditional). In contrast, for non-antibody protein biomarker concentrations, sample type had a significant impact; plasma samples generally exhibit decreased protein biomarker concentrations relative to serum. Similarly, sample handling significantly impacted the variability of protein biomarker concentrations. When biomarker concentrations are combined algorithmically into a single test score such as a multi-biomarker disease activity test for rheumatoid arthritis (MBDA), changes in protein biomarker concentrations may result in a bias of the score. These results illustrate the importance of characterizing pre-analytical methodology, sample type, sample processing and handling procedures for clinical testing in order to ensure test accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    PubMed Central

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859

  3. Analytical methods for the determination of personal care products in human samples: an overview.

    PubMed

    Jiménez-Díaz, I; Zafra-Gómez, A; Ballesteros, O; Navalón, A

    2014-11-01

    Personal care products (PCPs) are organic chemicals widely used in everyday human life. Nowadays, preservatives, UV-filters, antimicrobials and musk fragrances are widely used PCPs. Different studies have shown that some of these compounds can cause adverse health effects, such as genotoxicity, which could even lead to mutagenic or carcinogenic effects, or estrogenicity because of their endocrine disruption activity. Due to the absence of official monitoring protocols, there is an increasing demand of analytical methods that allow the determination of those compounds in human samples in order to obtain more information regarding their behavior and fate in the human body. The complexity of the biological matrices and the low concentration levels of these compounds make necessary the use of advanced sample treatment procedures that afford both, sample clean-up, to remove potentially interfering matrix components, as well as the concentration of analytes. In the present work, a review of the more recent analytical methods published in the scientific literature for the determination of PCPs in human fluids and tissue samples, is presented. The work focused on sample preparation and the analytical techniques employed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Evaluation of a reduced centrifugation time and higher centrifugal force on various general chemistry and immunochemistry analytes in plasma and serum.

    PubMed

    Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B

    2017-09-01

    Background Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all studies have been done using collection tubes with gel separator. Methods In this study, we compared a centrifugation time of 5 min at 3000 ×  g to a standard protocol of 10 min at 2200 ×  g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 ×  g, mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions We found that a centrifugation protocol of 5 min at 3000 ×  g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.

  5. Evaluation of two methods to determine glyphosate and AMPA in soils of Argentina

    NASA Astrophysics Data System (ADS)

    De Geronimo, Eduardo; Lorenzon, Claudio; Iwasita, Barbara; Faggioli, Valeria; Aparicio, Virginia; Costa, Jose Luis

    2017-04-01

    Argentine agricultural production is fundamentally based on a technological package combining no-tillage and the dependence of glyphosate applications to control weeds in transgenic crops (soybean, maize and cotton). Therefore, glyphosate is the most employed herbicide in the country, where 180 to 200 million liters are applied every year. Due to its widespread use, it is important to assess its impact on the environment and, therefore, reliable analytical methods are mandatory. Glyphosate molecule exhibits unique physical and chemical characteristics which difficult its quantification, especially in soils with high organic matter content, such as the central eastern Argentine soils, where strong interferences are normally observed. The objective of this work was to compare two methods for extraction and quantification of glyphosate and AMPA in samples of 8 representative soils of Argentina. The first analytical method (method 1) was based on the use of phosphate buffer as extracting solution and dichloromethane to minimize matrix organic content. In the second method (method 2), potassium hydroxide was used to extract the analytes followed by a clean-up step using solid phase extraction (SPE) to minimize strong interferences. Sensitivity, recoveries, matrix effects and robustness were evaluated. Both methodologies involved a derivatization with 9-fluorenyl-methyl-chloroformate (FMOC) in borate buffer and detection based on ultra-high-pressure liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Recoveries obtained from soil samples spiked at 0.1 and 1 mg kg-1 and were satisfactory in both methods (70% - 120%). However, there was a remarkable difference regarding the matrix effect, being the SPE clean-up step (method 2) insufficient to remove the interferences. Whereas the dilution and the clean-up with dichloromethane (method 1) were more effective minimizing the ionic suppression. Moreover, method 1 had fewer steps in the protocol of sample processing than method 2. This can be highly valuable in the routine lab work due to the reduction of potential undesired errors such as the loss of analyte or sample contamination. In addition, the substitution of SPE by another alternative involved a considerable reduction of analytical costs in method 1. We conclude that method 1 seemed to be simpler and cheaper than method 2, as well as reliable to quantify glyphosate in Argentinean soils. We hope that this experience can be useful to simplify the protocols of glyphosate quantification and contribute to the understanding of the fate of this herbicide in the environment.

  6. Analytical platform for metabolome analysis of microbial cells using methyl chloroformate derivatization followed by gas chromatography-mass spectrometry.

    PubMed

    Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G

    2010-09-01

    This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures, the first of which, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites with simultaneous stopping of cell metabolism. The second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effect compared with other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from the central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (Automated Mass Deconvolution and Identification System (AMDIS) and R).

  7. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  8. Laboratory and quality assurance protocols for the analysis of herbicides in ground water from the Management Systems Evaluation Area, Princeton, Minnesota

    USGS Publications Warehouse

    Larson, S.J.; Capel, P.D.; VanderLoop, A.G.

    1996-01-01

    Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.

  9. A communal catalogue reveals Earth's multiscale microbial diversity.

    PubMed

    Thompson, Luke R; Sanders, Jon G; McDonald, Daniel; Amir, Amnon; Ladau, Joshua; Locey, Kenneth J; Prill, Robert J; Tripathi, Anupriya; Gibbons, Sean M; Ackermann, Gail; Navas-Molina, Jose A; Janssen, Stefan; Kopylova, Evguenia; Vázquez-Baeza, Yoshiki; González, Antonio; Morton, James T; Mirarab, Siavash; Zech Xu, Zhenjiang; Jiang, Lingjing; Haroon, Mohamed F; Kanbar, Jad; Zhu, Qiyun; Jin Song, Se; Kosciolek, Tomasz; Bokulich, Nicholas A; Lefler, Joshua; Brislawn, Colin J; Humphrey, Gregory; Owens, Sarah M; Hampton-Marcell, Jarrad; Berg-Lyons, Donna; McKenzie, Valerie; Fierer, Noah; Fuhrman, Jed A; Clauset, Aaron; Stevens, Rick L; Shade, Ashley; Pollard, Katherine S; Goodwin, Kelly D; Jansson, Janet K; Gilbert, Jack A; Knight, Rob

    2017-11-23

    Our growing awareness of the microbial world's importance and diversity contrasts starkly with our limited understanding of its fundamental structure. Despite recent advances in DNA sequencing, a lack of standardized protocols and common analytical frameworks impedes comparisons among studies, hindering the development of global inferences about microbial life on Earth. Here we present a meta-analysis of microbial community samples collected by hundreds of researchers for the Earth Microbiome Project. Coordinated protocols and new analytical methods, particularly the use of exact sequences instead of clustered operational taxonomic units, enable bacterial and archaeal ribosomal RNA gene sequences to be followed across multiple studies and allow us to explore patterns of diversity at an unprecedented scale. The result is both a reference database giving global context to DNA sequence data and a framework for incorporating data from future studies, fostering increasingly complete characterization of Earth's microbial diversity.

  10. Establishment of reference intervals of clinical chemistry analytes for the adult population in Saudi Arabia: a study conducted as a part of the IFCC global study on reference values.

    PubMed

    Borai, Anwar; Ichihara, Kiyoshi; Al Masaud, Abdulaziz; Tamimi, Waleed; Bahijri, Suhad; Armbuster, David; Bawazeer, Ali; Nawajha, Mustafa; Otaibi, Nawaf; Khalil, Haitham; Kawano, Reo; Kaddam, Ibrahim; Abdelaal, Mohamed

    2016-05-01

    This study is a part of the IFCC-global study to derive reference intervals (RIs) for 28 chemistry analytes in Saudis. Healthy individuals (n=826) aged ≥18 years were recruited using the global study protocol. All specimens were measured using an Architect analyzer. RIs were derived by both parametric and non-parametric methods for comparative purpose. The need for secondary exclusion of reference values based on latent abnormal values exclusion (LAVE) method was examined. The magnitude of variation attributable to gender, ages and regions was calculated by the standard deviation ratio (SDR). Sources of variations: age, BMI, physical exercise and smoking levels were investigated by using the multiple regression analysis. SDRs for gender, age and regional differences were significant for 14, 8 and 2 analytes, respectively. BMI-related changes in test results were noted conspicuously for CRP. For some metabolic related parameters the ranges of RIs by non-parametric method were wider than by the parametric method and RIs derived using the LAVE method were significantly different than those without it. RIs were derived with and without gender partition (BMI, drugs and supplements were considered). RIs applicable to Saudis were established for the majority of chemistry analytes, whereas gender, regional and age RI partitioning was required for some analytes. The elevated upper limits of metabolic analytes reflects the existence of high prevalence of metabolic syndrome in Saudi population.

  11. Rapid and high-resolution stable isotopic measurement of biogenic accretionary carbonate using an online CO2 laser ablation system: Standardization of the analytical protocol.

    PubMed

    Sreemany, Arpita; Bera, Melinda Kumar; Sarkar, Anindya

    2017-12-30

    The elaborate sampling and analytical protocol associated with conventional dual-inlet isotope ratio mass spectrometry has long hindered high-resolution climate studies from biogenic accretionary carbonates. Laser-based on-line systems, in comparison, produce rapid data, but suffer from unresolvable matrix effects. It is, therefore, necessary to resolve these matrix effects to take advantage of the automated laser-based method. Two marine bivalve shells (one aragonite and one calcite) and one fish otolith (aragonite) were first analysed using a CO 2 laser ablation system attached to a continuous flow isotope ratio mass spectrometer under different experimental conditions (different laser power, sample untreated vs vacuum roasted). The shells and the otolith were then micro-drilled and the isotopic compositions of the powders were measured in a dual-inlet isotope ratio mass spectrometer following the conventional acid digestion method. The vacuum-roasted samples (both aragonite and calcite) produced mean isotopic ratios (with a reproducibility of ±0.2 ‰ for both δ 18 O and δ 13 C values) almost identical to the values obtained using the conventional acid digestion method. As the isotopic ratio of the acid digested samples fall within the analytical precision (±0.2 ‰) of the laser ablation system, this suggests the usefulness of the method for studying the biogenic accretionary carbonate matrix. When using laser-based continuous flow isotope ratio mass spectrometry for the high-resolution isotopic measurements of biogenic carbonates, the employment of a vacuum-roasting step will reduce the matrix effect. This method will be of immense help to geologists and sclerochronologists in exploring short-term changes in climatic parameters (e.g. seasonality) in geological times. Copyright © 2017 John Wiley & Sons, Ltd.

  12. One-sided measurement-device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Cao, Wen-Fei; Zhen, Yi-Zheng; Zheng, Yu-Lin; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai

    2018-01-01

    Measurement-device-independent quantum key distribution (MDI-QKD) protocol was proposed to remove all the detector side channel attacks, while its security relies on the trusted encoding systems. Here we propose a one-sided MDI-QKD (1SMDI-QKD) protocol, which enjoys detection loophole-free advantage, and at the same time weakens the state preparation assumption in MDI-QKD. The 1SMDI-QKD can be regarded as a modified MDI-QKD, in which Bob's encoding system is trusted, while Alice's is uncharacterized. For the practical implementation, we also provide a scheme by utilizing coherent light source with an analytical two decoy state estimation method. Simulation with realistic experimental parameters shows that the protocol has a promising performance, and thus can be applied to practical QKD applications.

  13. VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR ...

    EPA Pesticide Factsheets

    There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a system with the ability to prepare and quickly analyze a large number of contaminated samples for the traditional chemical agents, as well as numerous toxic industrial chemicals. Liquid samples (both aqueous and organic), solid samples (e.g., soil), vapor samples (e.g., air) and mixed state samples, all ranging from household items to deceased animals, may require some level of analyses. To meet this challenge, the U.S. Environmental Protection Agency (U.S. EPA) National Homeland Security Research Center, in collaboration with experts from across U.S. EPA and other Federal Agencies, initiated an effort to identify analytical methods for the chemical and biological agents that could be used to respond to a terrorist attack or a homeland security incident. U.S. EPA began development of standard analytical protocols (SAPs) for laboratory identification and measurement of target agents in case of a contamination threat. These methods will be used to help assist in the identification of existing contamination, the effectiveness of decontamination, as well as clearance for the affected population to reoccupy previously contaminated areas. One of the first SAPs developed was for the determin

  14. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

    Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly-used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentration by each analytical method along the 2007 traverse. A simple matrix by element versus method, populated with a value based on the significance classification, provides a method for ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate the several possible interpretations of the data.

  15. Current Protocols in Pharmacology

    PubMed Central

    2016-01-01

    Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029

  16. Simultaneous determination of thirteen different steroid hormones using micro UHPLC-MS/MS with on-line SPE system.

    PubMed

    Márta, Zoltán; Bobály, Balázs; Fekete, Jenő; Magda, Balázs; Imre, Tímea; Mészáros, Katalin Viola; Bálint, Mária; Szabó, Pál Tamás

    2018-02-20

    Ultratrace analysis of sample components requires excellent analytical performance in terms of limits of quantitation (LOQ). Micro UHPLC coupled to sensitive tandem mass spectrometry provides state of the art solution for such analytical problems. Using on-line SPE with column switching on a micro UHPLC-MS/MS system allowed to decrease LOQ without any complex sample preparation protocol. The presented method is capable of reaching satisfactory low LOQ values for analysis of thirteen different steroid molecules from human plasma without the most commonly used off-line SPE or compound derivatization. Steroids were determined by using two simple sample preparation methods, based on lower and higher plasma steroid concentrations. In the first method, higher analyte concentrations were directly determined after protein precipitation with methanol. The organic phase obtained from the precipitation was diluted with water and directly injected into the LC-MS system. In the second method, low steroid levels were determined by concentrating the organic phase after steroid extraction. In this case, analytes were extracted with ethyl acetate and reconstituted in 90/10 water/acetonitrile following evaporation to dryness. This step provided much lower LOQs, outperforming previously published values. The method has been validated and subsequently applied to clinical laboratory measurement. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Heparin removal by ecteola-cellulose pre-treatment enables the use of plasma samples for accurate measurement of anti-Yellow fever virus neutralizing antibodies.

    PubMed

    Campi-Azevedo, Ana Carolina; Peruhype-Magalhães, Vanessa; Coelho-Dos-Reis, Jordana Grazziela; Costa-Pereira, Christiane; Yamamura, Anna Yoshida; Lima, Sheila Maria Barbosa de; Simões, Marisol; Campos, Fernanda Magalhães Freire; de Castro Zacche Tonini, Aline; Lemos, Elenice Moreira; Brum, Ricardo Cristiano; de Noronha, Tatiana Guimarães; Freire, Marcos Silva; Maia, Maria de Lourdes Sousa; Camacho, Luiz Antônio Bastos; Rios, Maria; Chancey, Caren; Romano, Alessandro; Domingues, Carla Magda; Teixeira-Carvalho, Andréa; Martins-Filho, Olindo Assis

    2017-09-01

    Technological innovations in vaccinology have recently contributed to bring about novel insights for the vaccine-induced immune response. While the current protocols that use peripheral blood samples may provide abundant data, a range of distinct components of whole blood samples are required and the different anticoagulant systems employed may impair some properties of the biological sample and interfere with functional assays. Although the interference of heparin in functional assays for viral neutralizing antibodies such as the functional plaque-reduction neutralization test (PRNT), considered the gold-standard method to assess and monitor the protective immunity induced by the Yellow fever virus (YFV) vaccine, has been well characterized, the development of pre-analytical treatments is still required for the establishment of optimized protocols. The present study intended to optimize and evaluate the performance of pre-analytical treatment of heparin-collected blood samples with ecteola-cellulose (ECT) to provide accurate measurement of anti-YFV neutralizing antibodies, by PRNT. The study was designed in three steps, including: I. Problem statement; II. Pre-analytical steps; III. Analytical steps. Data confirmed the interference of heparin on PRNT reactivity in a dose-responsive fashion. Distinct sets of conditions for ECT pre-treatment were tested to optimize the heparin removal. The optimized protocol was pre-validated to determine the effectiveness of heparin plasma:ECT treatment to restore the PRNT titers as compared to serum samples. The validation and comparative performance was carried out by using a large range of serum vs heparin plasma:ECT 1:2 paired samples obtained from unvaccinated and 17DD-YFV primary vaccinated subjects. Altogether, the findings support the use of heparin plasma:ECT samples for accurate measurement of anti-YFV neutralizing antibodies. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. CT protocol management: simplifying the process by using a master protocol concept.

    PubMed

    Szczykutowicz, Timothy P; Bour, Robert K; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N

    2015-07-08

    This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two-tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade-offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners.

  19. A Simplified Digestion Protocol for the Analysis of Hg in Fish by Cold Vapor Atomic Absorption Spectroscopy

    ERIC Educational Resources Information Center

    Kristian, Kathleen E.; Friedbauer, Scott; Kabashi, Donika; Ferencz, Kristen M.; Barajas, Jennifer C.; O'Brien, Kelly

    2015-01-01

    Analysis of mercury in fish is an interesting problem with the potential to motivate students in chemistry laboratory courses. The recommended method for mercury analysis in fish is cold vapor atomic absorption spectroscopy (CVAAS), which requires homogeneous analyte solutions, typically prepared by acid digestion. Previously published digestion…

  20. Determination of hydrazine in drinking water: Development and multivariate optimization of a rapid and simple solid phase microextraction-gas chromatography-triple quadrupole mass spectrometry protocol.

    PubMed

    Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio

    2014-07-04

    In this work, the capabilities of solid phase microextraction were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds obtained towards five commercially available SPME coatings was evaluated, in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode which has allowed the achievement of high specificity by selecting appropriate precursor-product ion couples improving the capability in analyte identification. The multivariate approach of experimental design was crucial in order to optimize derivatization reaction, SPME process and tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L(-1), provided satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels were of 10.9%, 7.9% and 7.7% respectively. Limit of detection and quantification of 4.4 and 8.3 ng L(-1) were obtained. The reliable application of the proposed protocol to real drinking water samples confirmed its capability to be used as analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Building America House Simulation Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendron, Robert; Engebrecht, Cheryn

    2010-09-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  2. Analytic few-photon scattering in waveguide QED

    NASA Astrophysics Data System (ADS)

    Hurst, David L.; Kok, Pieter

    2018-04-01

    We develop an approach to light-matter coupling in waveguide QED based upon scattering amplitudes evaluated via Dyson series. For optical states containing more than single photons, terms in this series become increasingly complex, and we provide a diagrammatic recipe for their evaluation, which is capable of yielding analytic results. Our method fully specifies a combined emitter-optical state that permits investigation of light-matter entanglement generation protocols. We use our expressions to study two-photon scattering from a Λ -system and find that the pole structure of the transition amplitude is dramatically altered as the two ground states are tuned from degeneracy.

  3. Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.

    PubMed

    Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo

    2018-02-23

    The study aim was to evaluate and compare the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI using DxI platform, with those of Access AccuTnI+3 method, and high-sensitivity (hs) cTnI method for ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. LoB, LoD and LoQ at 20% and 10% CV values of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. Access hs-cTnI method showed analytical performance significantly better than that of Access AccuTnI+3 method and similar results to those of hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with Access hs-cTnI method showed close linear regressions with both Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values, respectively measured in EDTA and heparin plasma samples. Access hs-cTnI has analytical sensitivity parameters significantly improved compared to Access AccuTnI+3 method and is similar to those of the high-sensitivity method using ARCHITECT platform.

  4. S-curve networks and an approximate method for estimating degree distributions of complex networks

    NASA Astrophysics Data System (ADS)

    Guo, Jin-Li

    2010-12-01

    In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. According to statistics from the China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model by using S curve (logistic curve). The growing trend of IPv4 addresses in China is forecasted. There are some reference values for optimizing the distribution of IPv4 address resource and the development of IPv6. Based on the laws of IPv4 growth, that is, the bulk growth and the finitely growing limit, it proposes a finite network model with a bulk growth. The model is said to be an S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., Barabási-Albert method) is not suitable for the network. It develops an approximate method to predict the growth dynamics of the individual nodes, and uses this to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees with the simulation well, obeying an approximately power-law form. This method can overcome a shortcoming of Barabási-Albert method commonly used in current network research.

  5. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A.

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses onmore » validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.« less

  6. Multipinhole SPECT helical scan parameters and imaging volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluatedmore » by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show optimal resolution is achieved when the axial step size is the half, and the angular step size is about twice the corresponding values derived from the Nyquist theorem. The SCP results agree in general with that of RMS resolution but are less critical in assessing the effects of helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.« less

  7. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    PubMed

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of common cold and as an ingredient of dietary supplements. The active principles are diterpenes with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata . Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed phase C18 column eluted with a gradient system consisting of acetonitrile and acidified water and including an isocratic step at 30 °C was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. Relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved to be suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  8. Validation of an isotope dilution, ICP-MS method based on internal mass bias correction for the determination of trace concentrations of Hg in sediment cores.

    PubMed

    Ciceri, E; Recchia, S; Dossi, C; Yang, L; Sturgeon, R E

    2008-01-15

    The development and validation of a method for the determination of mercury in sediments using a sector field inductively coupled plasma mass spectrometer (SF-ICP-MS) for detection is described. The utilization of isotope dilution (ID) calibration is shown to solve analytical problems related to matrix composition. Mass bias is corrected using an internal mass bias correction technique, validated against the traditional standard bracketing method. The overall analytical protocol is validated against NRCC PACS-2 marine sediment CRM. The estimated limit of detection is 12ng/g. The proposed procedure was applied to the analysis of a real sediment core sampled to a depth of 160m in Lake Como, where Hg concentrations ranged from 66 to 750ng/g.

  9. Cluster Size Optimization in Sensor Networks with Decentralized Cluster-Based Protocols

    PubMed Central

    Amini, Navid; Vahdatpour, Alireza; Xu, Wenyao; Gerla, Mario; Sarrafzadeh, Majid

    2011-01-01

    Network lifetime and energy-efficiency are viewed as the dominating considerations in designing cluster-based communication protocols for wireless sensor networks. This paper analytically provides the optimal cluster size that minimizes the total energy expenditure in such networks, where all sensors communicate data through their elected cluster heads to the base station in a decentralized fashion. LEACH, LEACH-Coverage, and DBS comprise three cluster-based protocols investigated in this paper that do not require any centralized support from a certain node. The analytical outcomes are given in the form of closed-form expressions for various widely-used network configurations. Extensive simulations on different networks are used to confirm the expectations based on the analytical results. To obtain a thorough understanding of the results, cluster number variability problem is identified and inspected from the energy consumption point of view. PMID:22267882

  10. Standardization and optimization of fluorescence in situ hybridization (FISH) for HER-2 assessment in breast cancer: A single center experience.

    PubMed

    Bogdanovska-Todorovska, Magdalena; Petrushevska, Gordana; Janevska, Vesna; Spasevska, Liljana; Kostadinova-Kunovska, Slavica

    2018-05-20

    Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we presented the implementation, optimization and standardization of two FISH protocols using breast cancer samples and assessed the impact of pre-analytical and analytical factors on HER-2 testing. Formalin fixed paraffin embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits, Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret, compared to the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06%, kappa coefficient 0.5976 (p < 0.0001). The greatest discordance (82%) between IHC and FISH was observed in IHC 2+ group. A standardized FISH protocol for HER-2 assessment, with high hybridization efficiency, is necessary due to variability in tissue processing and individual tissue characteristics. Differences in the pre-analytical and analytical steps can affect the hybridization quality and efficiency. The use of DAKO pretreatment kit is time-saving and cost-effective.

  11. Quantitative analysis of the major constituents of St John's wort with HPLC-ESI-MS.

    PubMed

    Chandrasekera, Dhammitha H; Welham, Kevin J; Ashton, David; Middleton, Richard; Heinrich, Michael

    2005-12-01

    A method was developed to profile the major constituents of St John's wort extracts using high-performance liquid chromatography-electrospray mass spectrometry (HPLC-ESI-MS). The objective was to simultaneously separate, identify and quantify hyperforin, hypericin, pseudohypericin, rutin, hyperoside, isoquercetrin, quercitrin and chlorogenic acid using HPLC-MS. Quantification was performed using an external standardisation method with reference standards. The method consisted of two protocols: one for the analysis of flavonoids and glycosides and the other for the analysis of the more lipophilic hypericins and hyperforin. Both protocols used a reverse phase Luna phenyl hexyl column. The separation of the flavonoids and glycosides was achieved within 35 min and that of the hypericins and hyperforin within 9 min. The linear response range in ESI-MS was established for each compound and all had linear regression coefficient values greater than 0.97. Both protocols proved to be very specific for the constituents analysed. MS analysis showed no other signals within the analyte peaks. The method was robust and applicable to alcoholic tinctures, tablet/capsule extracts in various solvents and herb extracts. The method was applied to evaluate the phytopharmaceutical quality of St John's wort preparations available in the UK in order to test the method and investigate if they contain at least the main constituents and at what concentrations.

  12. Miniaturized Temperature-Controlled Planar Chromatography (Micro-TLC) as a Versatile Technique for Fast Screening of Micropollutants and Biomarkers Derived from Surface Water Ecosystems and During Technological Processes of Wastewater Treatment.

    PubMed

    Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2017-07-01

    There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography has the capability to work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates-under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid-enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered as a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.

  13. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  14. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  15. CT protocol management: simplifying the process by using a master protocol concept

    PubMed Central

    Bour, Robert K.; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N.

    2015-01-01

    This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two‐tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade‐offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners. PACS number: 87.57.Q PMID:26219005

  16. Comparison of PCR methods for the detection of genetic variants of carp edema virus.

    PubMed

    Adamek, Mikolaj; Matras, Marek; Jung-Schroers, Verena; Teitge, Felix; Heling, Max; Bergmann, Sven M; Reichert, Michal; Way, Keith; Stone, David M; Steinhagen, Dieter

    2017-09-20

    The infection of common carp and its ornamental variety, koi, with the carp edema virus (CEV) is often associated with the occurrence of a clinical disease called 'koi sleepy disease'. The disease may lead to high mortality in both koi and common carp populations. To prevent further spread of the infection and the disease, a reliable detection method for this virus is required. However, the high genetic variability of the CEV p4a gene used for PCR-based diagnostics could be a serious obstacle for successful and reliable detection of virus infection in field samples. By analysing 39 field samples from different geographical origins obtained from koi and farmed carp and from all 3 genogroups of CEV, using several recently available PCR protocols, we investigated which of the protocols would allow the detection of CEV from all known genogroups present in samples from Central European carp or koi populations. The comparison of 5 different PCR protocols showed that the PCR assays (both end-point and quantitative) developed in the Centre for Environment, Fisheries and Aquaculture Science exhibited the highest analytical inclusivity and diagnostic sensitivity. Currently, this makes them the most suitable protocols for detecting viruses from all known CEV genogroups.

  17. A mass spectrometry primer for mass spectrometry imaging

    PubMed Central

    Rubakhin, Stanislav S.; Sweedler, Jonathan V.

    2011-01-01

    Mass spectrometry imaging (MSI), a rapidly growing subfield of chemical imaging, employs mass spectrometry (MS) technologies to create single- and multi-dimensional localization maps for a variety of atoms and molecules. Complimentary to other imaging approaches, MSI provides high chemical specificity and broad analyte coverage. This powerful analytical toolset is capable of measuring the distribution of many classes of inorganics, metabolites, proteins and pharmaceuticals in chemically and structurally complex biological specimens in vivo, in vitro, and in situ. The MSI approaches highlighted in this Methods in Molecular Biology volume provide flexibility of detection, characterization, and identification of multiple known and unknown analytes. The goal of this chapter is to introduce investigators who may be unfamiliar with MS to the basic principles of the mass spectrometric approaches as used in MSI. In addition to guidelines for choosing the most suitable MSI method for specific investigations, cross-references are provided to the chapters in this volume that describe the appropriate experimental protocols. PMID:20680583

  18. Novel protocol for highly efficient gas-phase chemical derivatization of surface amine groups using trifluoroacetic anhydride

    NASA Astrophysics Data System (ADS)

    Duchoslav, Jiri; Kehrer, Matthias; Hinterreiter, Andreas; Duchoslav, Vojtech; Unterweger, Christoph; Fürst, Christian; Steinberger, Roland; Stifter, David

    2018-06-01

    In the current work, chemical derivatization of amine (NH2) groups with trifluoroacetic anhydride (TFAA) as an analytical method to improve the information scope of X-ray photoelectron spectroscopy (XPS) is investigated. TFAA is known to successfully label hydroxyl (OH) groups. With the introduction of a newly developed gas-phase derivatization protocol conducted at ambient pressure and using a catalyst also NH2 groups can now efficiently be labelled with a high yield and without the formation of unwanted by-products. By establishing a comprehensive and self-consistent database of reference binding energies for XPS a promising approach for distinguishing hydroxyl from amine groups is presented. The protocol was verified on different polymers, including poly(allylamine), poly(ethyleneimine), poly(vinylalcohol) and chitosan, the latter one containing both types of addressed chemical groups.

  19. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The basis of the analytical methods for all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable specific differences within the sample preparation procedures used in three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was compared. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.

  20. Difficulties in fumonisin determination: the issue of hidden fumonisins.

    PubMed

    Dall'Asta, Chiara; Mangia, Mattia; Berthiller, Franz; Molinelli, Alexandra; Sulyok, Michael; Schuhmacher, Rainer; Krska, Rudolf; Galaverna, Gianni; Dossena, Arnaldo; Marchelli, Rosangela

    2009-11-01

    In this paper, the results obtained by five independent methods for the quantification of fumonisins B(1), B(2), and B(3) in raw maize are reported. Five naturally contaminated maize samples and a reference material were analyzed in three different laboratories. Although each method was validated and common calibrants were used, a poor agreement about fumonisin contamination levels was obtained. In order to investigate the interactions among analyte and matrix leading to this lack of consistency, the occurrence of fumonisin derivatives was checked. Significant amounts of hidden fumonisins were detected for all the considered samples. Furthermore, the application of an in vitro digestion protocol to raw maize allowed for a higher recovery of native fumonisins, suggesting that the interaction occurring among analytes and matrix macromolecules is associative rather than covalent. Depending on the analytical method as well as the maize sample, only 37-68% of the total fumonisin concentrations were found to be extractable from the samples. These results are particularly impressive and significant in the case of the certified reference material, underlying the actual difficulties in ascertaining the trueness of a method for fumonisin determination, opening thus an important issue for risk assessment.

  1. Critical factors for assembling a high volume of DNA barcodes

    PubMed Central

    Hajibabaei, Mehrdad; deWaard, Jeremy R; Ivanova, Natalia V; Ratnasingham, Sujeevan; Dooh, Robert T; Kirk, Stephanie L; Mackie, Paula M; Hebert, Paul D.N

    2005-01-01

    Large-scale DNA barcoding projects are now moving toward activation while the creation of a comprehensive barcode library for eukaryotes will ultimately require the acquisition of some 100 million barcodes. To satisfy this need, analytical facilities must adopt protocols that can support the rapid, cost-effective assembly of barcodes. In this paper we discuss the prospects for establishing high volume DNA barcoding facilities by evaluating key steps in the analytical chain from specimens to barcodes. Alliances with members of the taxonomic community represent the most effective strategy for provisioning the analytical chain with specimens. The optimal protocols for DNA extraction and subsequent PCR amplification of the barcode region depend strongly on their condition, but production targets of 100K barcode records per year are now feasible for facilities working with compliant specimens. The analysis of museum collections is currently challenging, but PCR cocktails that combine polymerases with repair enzyme(s) promise future success. Barcode analysis is already a cost-effective option for species identification in some situations and this will increasingly be the case as reference libraries are assembled and analytical protocols are simplified. PMID:16214753

  2. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on development of micro-TLC methodology under temperature-controlled conditions. The main goal of present paper is to demonstrate separation and detection capability of micro-TLC technique involving simple analytical protocols without multi-steps sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed using non-previously used stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In present studies components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by freeze-drying method. Low-molecular mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific bands patterns for samples derived from surface water of the Middle Pomerania in northern part of Poland can be easily observed on obtained micro-TLC chromatograms. This approach can be useful as simple and non-expensive complementary method for fast control and screening of treated sewage water discharged by the municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimalization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that micro-TLC based analytical approach can be applied as an effective method for the internal standard (IS) substance search. Generally, described methodology can be applied for fast fractionation or screening of the whole range of target substances as well as chemo-taxonomic studies and fingerprinting of complex mixtures, which are present in biological or environmental samples. Due to low consumption of eluent (usually 0.3-1mL/run) mainly composed of water-alcohol binary mixtures, this method can be considered as environmentally friendly and green chemistry focused analytical tool, supplementary to analytical protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Comparing rapid methods for detecting Listeria in seafood and environmental samples using the most probably number (MPN) technique.

    PubMed

    Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C

    2012-02-15

    The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10(2) CFU/10 cm(2)). Two kits (VIP™ and Petrifilm™) failed to detect 10(4) CFU/10 cm(2). The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability is now entirely encompassed across the majority of pharmaceutical companies and research labs. Researchers' attention is careworn toward implementing the green analytical chemistry principles for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of the analytical procedure. Using safer solvents, the greenness profile of the methodology could be increased remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using monolithic column and green solvents as mobile phase. The use of monolithic column allows efficient separation protocols at higher flow rates, which results in short time of analysis. Two-factor three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using eco-scale as a green metrics and was found to be an excellent green method with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of produced waste. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternate for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the relation of the target analyte peak area to the peak area of a corresponding stable isotope labelled internal standard compound [direct isotope dilution analysis (DIDA)] may be not inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification either involving a six-point external calibration function, or a result calculation merely based on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte without the need for a time consuming multi-point calibration series is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory where short turnaround times often have high priority.

  6. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol.

    PubMed

    Klonoff, David C; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A; Arreaza-Rubin, Guillermo; Burk, Robert D; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W

    2016-05-01

    Inaccurate blood glucsoe monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol entitled "Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program" is attached as supplementary material. This program is needed because currently once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. © 2015 Diabetes Technology Society.

  7. 40 CFR Appendix G to Subpart A of... - UNEP Recommendations for Conditions Applied to Exemption for Essential Laboratory and Analytical...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...

  8. 40 CFR Appendix G to Subpart A of... - UNEP Recommendations for Conditions Applied to Exemption for Essential Laboratory and Analytical...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...

  9. 40 CFR Appendix G to Subpart A of... - UNEP Recommendations for Conditions Applied to Exemption for Essential Laboratory and Analytical...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...

  10. 40 CFR Appendix G to Subpart A of... - UNEP Recommendations for Conditions Applied to Exemption for Essential Laboratory and Analytical...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...

  11. 40 CFR Appendix G to Subpart A of... - UNEP Recommendations for Conditions Applied to Exemption for Essential Laboratory and Analytical...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...

  12. The combination of four analytical methods to explore skeletal muscle metabolomics: Better coverage of metabolic pathways or a marketing argument?

    PubMed

    Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H

    2018-01-30

    Metabolomics is an emerging science based on diverse high throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact on metabolic coverage of such a strategy. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and evaluate the impact of combining methods for more exhaustive metabolite covering. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound i.e., analytical (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified based on the number and identity of the compounds detected with low analytical variability (variation coefficient<30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvent for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Engineering of a miniaturized, robotic clinical laboratory

    PubMed Central

    Nourse, Marilyn B.; Engel, Kate; Anekal, Samartha G.; Bailey, Jocelyn A.; Bhatta, Pradeep; Bhave, Devayani P.; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F.; Ha, Kevin D.; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M.; Kim, Andrew N.; Lee, Lucie S.; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H.; Patel, Paul J.; Quon, Ken; Ramachandran, Pradeep L.; Rappaport, Amy R.; Roy, Joy; Sapida, Jerald F.; Sergeev, Nikolay V.; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa‐Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C.; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Holmes, Elizabeth A.

    2018-01-01

    Abstract The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (ie, the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay‐configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay‐specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti‐herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration‐cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations. PMID:29376134

  14. Engineering of a miniaturized, robotic clinical laboratory.

    PubMed

    Nourse, Marilyn B; Engel, Kate; Anekal, Samartha G; Bailey, Jocelyn A; Bhatta, Pradeep; Bhave, Devayani P; Chandrasekaran, Shekar; Chen, Yutao; Chow, Steven; Das, Ushati; Galil, Erez; Gong, Xinwei; Gessert, Steven F; Ha, Kevin D; Hu, Ran; Hyland, Laura; Jammalamadaka, Arvind; Jayasurya, Karthik; Kemp, Timothy M; Kim, Andrew N; Lee, Lucie S; Liu, Yang Lily; Nguyen, Alphonso; O'Leary, Jared; Pangarkar, Chinmay H; Patel, Paul J; Quon, Ken; Ramachandran, Pradeep L; Rappaport, Amy R; Roy, Joy; Sapida, Jerald F; Sergeev, Nikolay V; Shee, Chandan; Shenoy, Renuka; Sivaraman, Sharada; Sosa-Padilla, Bernardo; Tran, Lorraine; Trent, Amanda; Waggoner, Thomas C; Wodziak, Dariusz; Yuan, Amy; Zhao, Peter; Young, Daniel L; Robertson, Channing R; Holmes, Elizabeth A

    2018-01-01

    The ability to perform laboratory testing near the patient and with smaller blood volumes would benefit patients and physicians alike. We describe our design of a miniaturized clinical laboratory system with three components: a hardware platform (ie, the miniLab) that performs preanalytical and analytical processing steps using miniaturized sample manipulation and detection modules, an assay-configurable cartridge that provides consumable materials and assay reagents, and a server that communicates bidirectionally with the miniLab to manage assay-specific protocols and analyze, store, and report results (i.e., the virtual analyzer). The miniLab can detect analytes in blood using multiple methods, including molecular diagnostics, immunoassays, clinical chemistry, and hematology. Analytical performance results show that our qualitative Zika virus assay has a limit of detection of 55 genomic copies/ml. For our anti-herpes simplex virus type 2 immunoglobulin G, lipid panel, and lymphocyte subset panel assays, the miniLab has low imprecision, and method comparison results agree well with those from the United States Food and Drug Administration-cleared devices. With its small footprint and versatility, the miniLab has the potential to provide testing of a range of analytes in decentralized locations.

  15. Effect of analytical conditions in wavelength dispersive electron microprobe analysis on the measurement of strontium-to-calcium (Sr/Ca) ratios in otoliths of anadromous salmonids

    USGS Publications Warehouse

    Zimmerman, Christian E.; Nielsen, Roger L.

    2003-01-01

    The use of strontium-to-calcium (Sr/Ca) ratios in otoliths is becoming a standard method to describe life history type and the chronology of migrations between freshwater and seawater habitats in teleosts (e.g. Kalish, 1990; Radtke et al., 1990; Secor, 1992; Rieman et al., 1994; Radtke, 1995; Limburg, 1995; Tzeng et al. 1997; Volk et al., 2000; Zimmerman, 2000; Zimmerman and Reeves, 2000, 2002). This method provides critical information concerning the relationship and ecology of species exhibiting phenotypic variation in migratory behavior (Kalish, 1990; Secor, 1999). Methods and procedures, however, vary among laboratories because a standard method or protocol for measurement of Sr in otoliths does not exist. In this note, we examine the variations in analytical conditions in an effort to increase precision of Sr/Ca measurements. From these findings we argue that precision can be maximized with higher beam current (although there is specimen damage) than previously recommended by Gunn et al. (1992).

  16. An UPLC-MS/MS method for separation and accurate quantification of tamoxifen and its metabolites isomers.

    PubMed

    Arellano, Cécile; Allal, Ben; Goubaa, Anwar; Roché, Henri; Chatelut, Etienne

    2014-11-01

    A selective and accurate analytical method is needed to quantify tamoxifen and its phase I metabolites in a prospective clinical protocol, for evaluation of pharmacokinetic parameters of tamoxifen and its metabolites in adjuvant treatment of breast cancer. The selectivity of the analytical method is a fundamental criteria to allow the quantification of the main active metabolites (Z)-isomers from (Z)'-isomers. An UPLC-MS/MS method was developed and validated for the quantification of (Z)-tamoxifen, (Z)-endoxifen, (E)-endoxifen, Z'-endoxifen, (Z)'-endoxifen, (Z)-4-hydroxytamoxifen, (Z)-4'-hydroxytamoxifen, N-desmethyl tamoxifen, and tamoxifen-N-oxide. The validation range was set between 0.5ng/mL and 125ng/mL for 4-hydroxytamoxifen and endoxifen isomers, and between 12.5ng/mL and 300ng/mL for tamoxifen, tamoxifen N-desmethyl and tamoxifen-N-oxide. The application to patient plasma samples was performed. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  18. A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks

    PubMed Central

    Lin, Lin; Ma, Shiwei; Ma, Maode

    2014-01-01

    Clock synchronization is a very important issue for the applications of wireless sensor networks. The sensors need to keep a strict clock so that users can know exactly what happens in the monitoring area at the same time. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood average. Each sensor node collects the offset and skew rate of the neighbors. Group averaging of offset and skew rate value are calculated instead of conventional point-to-point averaging method. The sensor node then returns compensated value back to the neighbors. The propagation delay is considered and compensated. The analytical analysis of offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock. PMID:25120163

  19. Channel MAC Protocol for Opportunistic Communication in Ad Hoc Wireless Networks

    NASA Astrophysics Data System (ADS)

    Ashraf, Manzur; Jayasuriya, Aruna; Perreau, Sylvie

    2008-12-01

    Despite significant research effort, the performance of distributed medium access control methods has failed to meet theoretical expectations. This paper proposes a protocol named "Channel MAC" performing a fully distributed medium access control based on opportunistic communication principles. In this protocol, nodes access the channel when the channel quality increases beyond a threshold, while neighbouring nodes are deemed to be silent. Once a node starts transmitting, it will keep transmitting until the channel becomes "bad." We derive an analytical throughput limit for Channel MAC in a shared multiple access environment. Furthermore, three performance metrics of Channel MAC—throughput, fairness, and delay—are analysed in single hop and multihop scenarios using NS2 simulations. The simulation results show throughput performance improvement of up to 130% with Channel MAC over IEEE 802.11. We also show that the severe resource starvation problem (unfairness) of IEEE 802.11 in some network scenarios is reduced by the Channel MAC mechanism.

  20. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such valications. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validatemore » analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approches to validation and varying descriptions of the valication process at different stages in method development. This discussion focuses on valication and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.« less

  1. Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).

    PubMed

    Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D

    2017-01-01

    Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.

  2. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    DOT National Transportation Integrated Search

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  3. A Protocol Layer Trust-Based Intrusion Detection Scheme for Wireless Sensor Networks

    PubMed Central

    Wang, Jian; Jiang, Shuai; Fapojuwo, Abraham O.

    2017-01-01

    This article proposes a protocol layer trust-based intrusion detection scheme for wireless sensor networks. Unlike existing work, the trust value of a sensor node is evaluated according to the deviations of key parameters at each protocol layer considering the attacks initiated at different protocol layers will inevitably have impacts on the parameters of the corresponding protocol layers. For simplicity, the paper mainly considers three aspects of trustworthiness, namely physical layer trust, media access control layer trust and network layer trust. The per-layer trust metrics are then combined to determine the overall trust metric of a sensor node. The performance of the proposed intrusion detection mechanism is then analyzed using the t-distribution to derive analytical results of false positive and false negative probabilities. Numerical analytical results, validated by simulation results, are presented in different attack scenarios. It is shown that the proposed protocol layer trust-based intrusion detection scheme outperforms a state-of-the-art scheme in terms of detection probability and false probability, demonstrating its usefulness for detecting cross-layer attacks. PMID:28555023

  4. A Protocol Layer Trust-Based Intrusion Detection Scheme for Wireless Sensor Networks.

    PubMed

    Wang, Jian; Jiang, Shuai; Fapojuwo, Abraham O

    2017-05-27

    This article proposes a protocol layer trust-based intrusion detection scheme for wireless sensor networks. Unlike existing work, the trust value of a sensor node is evaluated according to the deviations of key parameters at each protocol layer considering the attacks initiated at different protocol layers will inevitably have impacts on the parameters of the corresponding protocol layers. For simplicity, the paper mainly considers three aspects of trustworthiness, namely physical layer trust, media access control layer trust and network layer trust. The per-layer trust metrics are then combined to determine the overall trust metric of a sensor node. The performance of the proposed intrusion detection mechanism is then analyzed using the t-distribution to derive analytical results of false positive and false negative probabilities. Numerical analytical results, validated by simulation results, are presented in different attack scenarios. It is shown that the proposed protocol layer trust-based intrusion detection scheme outperforms a state-of-the-art scheme in terms of detection probability and false probability, demonstrating its usefulness for detecting cross-layer attacks.

  5. Assessing precision, bias and sigma-metrics of 53 measurands of the Alinity ci system.

    PubMed

    Westgard, Sten; Petrides, Victoria; Schneider, Sharon; Berman, Marvin; Herzogenrath, Jörg; Orzechowski, Anthony

    2017-12-01

    Assay performance is dependent on the accuracy and precision of a given method. These attributes can be combined into an analytical Sigma-metric, providing a simple value for laboratorians to use in evaluating a test method's capability to meet its analytical quality requirements. Sigma-metrics were determined for 37 clinical chemistry assays, 13 immunoassays, and 3 ICT methods on the Alinity ci system. Analytical Performance Specifications were defined for the assays, following a rationale of using CLIA goals first, then Ricos Desirable goals when CLIA did not regulate the method, and then other sources if the Ricos Desirable goal was unrealistic. A precision study was conducted at Abbott on each assay using the Alinity ci system following the CLSI EP05-A2 protocol. Bias was estimated following the CLSI EP09-A3 protocol using samples with concentrations spanning the assay's measuring interval tested in duplicate on the Alinity ci system and ARCHITECT c8000 and i2000 SR systems, where testing was also performed at Abbott. Using the regression model, the %bias was estimated at an important medical decisions point. Then the Sigma-metric was estimated for each assay and was plotted on a method decision chart. The Sigma-metric was calculated using the equation: Sigma-metric=(%TEa-|%bias|)/%CV. The Sigma-metrics and Normalized Method Decision charts demonstrate that a majority of the Alinity assays perform at least at five Sigma or higher, at or near critical medical decision levels. More than 90% of the assays performed at Five and Six Sigma. None performed below Three Sigma. Sigma-metrics plotted on Normalized Method Decision charts provide useful evaluations of performance. The majority of Alinity ci system assays had sigma values >5 and thus laboratories can expect excellent or world class performance. Laboratorians can use these tools as aids in choosing high-quality products, further contributing to the delivery of excellent quality healthcare for patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  6. Optimisation of an analytical method and results from the inter-laboratory comparison of the migration of regulated substances from food packaging into the new mandatory European Union simulant for dry foodstuffs.

    PubMed

    Jakubowska, Natalia; Beldì, Giorgia; Peychès Bach, Aurélie; Simoneau, Catherine

    2014-01-01

    This paper presents the outcome of the development, optimisation and validation at European Union level of an analytical method for using poly(2,6-diphenyl phenylene oxide--PPPO), which is stipulated in Regulation (EU) No. 10/2011, as food simulant E for testing specific migration from plastics into dry foodstuffs. Two methods for fortifying respectively PPPO and a low-density polyethylene (LDPE) film with surrogate substances that are relevant to food contact were developed. A protocol for cleaning the PPPO and an efficient analytical method were developed for the quantification of butylhydroxytoluene (BHT), benzophenone (BP), diisobutylphthalate (DiBP), bis(2-ethylhexyl) adipate (DEHA) and 1,2-cyclohexanedicarboxylic acid, diisononyl ester (DINCH) from PPPO. A protocol for a migration test from plastics using small migration cells was also developed. The method was validated by an inter-laboratory comparison (ILC) with 16 national reference laboratories for food contact materials in the European Union. This allowed for the first time data to be obtained on the precision and laboratory performance of both migration and quantification. The results showed that the validation ILC was successful even when taking into account the complexity of the exercise. The results showed that the method performance was 7-9% repeatability standard deviation (rSD) for most substances (regardless of concentration), with 12% rSD for the high level of BHT and for DiBP at very low levels. The reproducibility standard deviation results for the 16 European Union laboratories were in the range of 20-30% for the quantification from PPPO (for the three levels of concentrations of the five substances) and 15-40% from migration experiments from the fortified plastic at 60°C for 10 days and subsequent quantification. Considering the lack of data previously available in the literature, this work has demonstrated that the validation of a method is possible both for migration from a film and for quantification into a corresponding simulant for specific migration.

  7. Analytical Measurement of Discrete Hydrogen Sulfide Pools in Biological Specimens

    PubMed Central

    Shen, Xinggui; Peter, Elvis A.; Bir, Shyamal; Wang, Rui; Kevil, Christopher G.

    2015-01-01

    Hydrogen sulfide (H2S) is a ubiquitous gaseous signaling molecule that plays a vital role in numerous cellular functions and has become the focus of many research endeavors including pharmaco-therapeutic manipulation. Amongst the challenges facing the field is the accurate measurement of biologically active H2S. We have recently reported that the typically used methylene blue method and its associated results are invalid and do not measure bonafide H2S. The complexity of analytical H2S measurement reflects the fact that hydrogen sulfide is a volatile gas and exists in the body in different forms, including a free form, an acid labile pool and as bound sulfane sulfur. Here we describe a new protocol to discretely measure specific H2S pools using the monobromobimane method coupled with RP-HPLC. This new protocol involves selective liberation, trapping and derivatization of H2S. Acid-labile H2S is released by incubating the sample in an acidic solution (pH 2.6) of 100 mM phosphate buffer with 0.1 mM DTPA, in an enclosed system to contain volatilized H2S. Volatilized H2S is then trapped in 100 mM Tris-HCl (pH 9.5, 0.1 mM DTPA) and then reacted with excess monobromobimane. In a separate aliquot, the contribution of bound sulfane sulfur pool was measured by incubating the sample with 1 mM TCEP (Tris(2-carboxyethyl)phosphine hydrochloride), a reducing agent to reduce disulfide bonds, in 100 mM phosphate buffer (pH 2.6, 0.1 mM DTPA), and H2S measurement performed in an analogous manner to the one described above. The acid labile pool was determined by subtracting the free hydrogen sulfide value from the value obtained by the acid liberation protocol. The bound sulfane sulfur pool was determined by subtracting the H2S measurement from the acid liberation protocol alone compared to that of TCEP plus acidic conditions. In summary, our new method protocol allows very sensitive and accurate measurement of the three primary biological pools of H2S including free, acid labile, and bound sulfane sulfur in various biological specimens. PMID:22561703

  8. Application Of A Potentiometric Electronic Tongue For The Determination Of Free SO2 And Other Analytical Parameters In White Wines From New Zealand

    NASA Astrophysics Data System (ADS)

    Mednova, Olga; Kirsanov, Dmitry; Rudnitskaya, Alisa; Kilmartin, Paul; Legin, Andrey

    2009-05-01

    The present study deals with a potentiometric electronic tongue (ET) multisensor system applied for the simultaneous determination of several chemical parameters for white wines produced in New Zealand. Methods in use for wine quality control are often expensive and require considerable time and skilled operation. The ET approach usually offers a simple and fast measurement protocol and allows automation for on-line analysis under industrial conditions. The ET device developed in this research is capable of quantifying the free and total SO2 content, total acids and some polyphenolic compounds in white wines with acceptable analytical errors.

  9. Simple and Sensitive Paper-Based Device Coupling Electrochemical Sample Pretreatment and Colorimetric Detection.

    PubMed

    Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C

    2016-05-17

    We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol L(-1), with a detection limit of 0.9 μmol L(-1). The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.

  10. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: Comparing meta and megaanalytical approaches for data pooling.

    PubMed

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E; Mandl, René C; Almasy, Laura; Booth, Tom; Brouwer, Rachel M; Curran, Joanne E; de Zubicaray, Greig I; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T; Hong, L Elliot; Landman, Bennett A; Lemaitre, Hervé; Lopez, Lorna M; Martin, Nicholas G; McMahon, Katie L; Mitchell, Braxton D; Olvera, Rene L; Peterson, Charles P; Starr, John M; Sussmann, Jessika E; Toga, Arthur W; Wardlaw, Joanna M; Wright, Margaret J; Wright, Susan N; Bastin, Mark E; McIntosh, Andrew M; Boomsma, Dorret I; Kahn, René S; den Braber, Anouk; de Geus, Eco J C; Deary, Ian J; Hulshoff Pol, Hilleke E; Williamson, Douglas E; Blangero, John; van 't Ent, Dennis; Thompson, Paul M; Glahn, David C

    2014-07-15

    Combining datasets across independent studies can boost statistical power by increasing the numbers of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint-analytical analyses of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta-and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (the sample-size and standard-error weighted) approaches and a mega-genetic analysis to calculate heritability estimates across-population. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2014-01-01

    Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, various of which present a temptation for sportsmen and women due to assumed/attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of the target analytes quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of MEPS technique, including the role of sample preparation in bioanalysis, the MEPS description namely MEPS formats (on- and off-line), sorbents, experimental and protocols, factors that affect the MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize MEPS recent applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    PubMed

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  14. Decision support for environmental management of industrial non-hazardous secondary materials: New analytical methods combined with simulation and optimization modeling.

    PubMed

    Little, Keith W; Koralegedara, Nadeesha H; Northeim, Coleen M; Al-Abed, Souhail R

    2017-07-01

    Non-hazardous solid materials from industrial processes, once regarded as waste and disposed in landfills, offer numerous environmental and economic advantages when put to beneficial uses (BUs). Proper management of these industrial non-hazardous secondary materials (INSM) requires estimates of their probable environmental impacts among disposal as well as BU options. The U.S. Environmental Protection Agency (EPA) has recently approved new analytical methods (EPA Methods 1313-1316) to assess leachability of constituents of potential concern in these materials. These new methods are more realistic for many disposal and BU options than historical methods, such as the toxicity characteristic leaching protocol. Experimental data from these new methods are used to parameterize a chemical fate and transport (F&T) model to simulate long-term environmental releases from flue gas desulfurization gypsum (FGDG) when disposed of in an industrial landfill or beneficially used as an agricultural soil amendment. The F&T model is also coupled with optimization algorithms, the Beneficial Use Decision Support System (BUDSS), under development by EPA to enhance INSM management. Published by Elsevier Ltd.

  15. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  16. Analysis of longitudinal data from animals where some data are missing in SPSS

    PubMed Central

    Duricki, DA; Soleman, S; Moon, LDF

    2017-01-01

    Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing) yet are not used widely by pre-clinical researchers. We provide here an easy to use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out) whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723

  17. Non-Gradient Blue Native Polyacrylamide Gel Electrophoresis.

    PubMed

    Luo, Xiaoting; Wu, Jinzi; Jin, Zhen; Yan, Liang-Jun

    2017-02-02

    Gradient blue native polyacrylamide gel electrophoresis (BN-PAGE) is a well established and widely used technique for activity analysis of high-molecular-weight proteins, protein complexes, and protein-protein interactions. Since its inception in the early 1990s, a variety of minor modifications have been made to this gradient gel analytical method. Here we provide a major modification of the method, which we call non-gradient BN-PAGE. The procedure, similar to that of non-gradient SDS-PAGE, is simple because there is no expensive gradient maker involved. The non-gradient BN-PAGE protocols presented herein provide guidelines on the analysis of mitochondrial protein complexes, in particular, dihydrolipoamide dehydrogenase (DLDH) and those in the electron transport chain. Protocols for the analysis of blood esterases or mitochondrial esterases are also presented. The non-gradient BN-PAGE method may be tailored for analysis of specific proteins according to their molecular weight regardless of whether the target proteins are hydrophobic or hydrophilic. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.

  18. USDA-ARS GRACEnet Project Protocols, Chapter 3. Chamber-based trace gas flux measurements4

    USDA-ARS?s Scientific Manuscript database

    This protocol addresses N2O, CO2 and CH4 flux measurement by soil chamber methodology. The reactivities of other gasses of interest such as NOx O3, CO, and NH3 will require different chambers and associated instrumentation. Carbon dioxide is included as an analyte with this protocol; however, when p...

  19. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    PubMed

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.

  20. Cumulative effective dose and cancer risk for pediatric population in repetitive full spine follow-up imaging: How micro dose is the EOS microdose protocol?

    PubMed

    Law, Martin; Ma, Wang-Kei; Lau, Damian; Cheung, Kenneth; Ip, Janice; Yip, Lawrance; Lam, Wendy

    2018-04-01

    To evaluate and to obtain analytic formulation for the calculation of the effective dose and associated cancer risk using the EOS microdose protocol for scoliotic pediatric patients undergoing full spine imaging at different age of exposure; to demonstrate the microdose protocol capable of delivering lesser radiation dose and hence of further reducing cancer risk induction when compared with the EOS low dose protocol; to obtain cumulative effective dose and cancer risk for both genders scoliotic pediatrics of US and Hong Kong population using the microdose protocol. Organ absorbed doses of full spine exposed scoliotic pediatric patients have been simulated with the use of EOS microdose protocol imaging parameters input to the Monte Carlo software PCXMC. Gender and age specific effective dose has been calculated with the simulated organ absorbed dose using the ICRP-103 approach. The associated radiation induced cancer risk, expressed as lifetime attributable risk (LAR), has been estimated according to the method introduced in the Biological Effects of Ionizing Radiation VII report. Values of LAR have been estimated for scoliotic patients exposed repetitively during their follow up period at different age for US and Hong Kong population. The effective doses of full spine imaging with simultaneous posteroanterior and lateral projection for patients exposed at the age between 5 and 18 years using the EOS microdose protocol have been calculated within the range of 2.54-14.75 μSv. The corresponding LAR for US and Hong Kong population was ranged between 0.04 × 10 -6 and 0.84 × 10 -6 . Cumulative effective dose and cancer risk during follow-up period can be estimated using the results and are of information to patients and their parents. With the use of computer simulation and analytic formulation, we obtained the cumulative effective dose and cancer risk at any age of exposure for pediatric patients of US and Hong Kong population undergoing repetitive microdose protocol full spine imaging. Girls would be at a statistically significant higher cumulative cancer risk than boys undergoing the same microdose full spine imaging protocol and the same follow-up schedule. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. An automated protocol for performance benchmarking a widefield fluorescence microscope.

    PubMed

    Halter, Michael; Bier, Elianna; DeRose, Paul C; Cooksey, Gregory A; Choquette, Steven J; Plant, Anne L; Elliott, John T

    2014-11-01

    Widefield fluorescence microscopy is a highly used tool for visually assessing biological samples and for quantifying cell responses. Despite its widespread use in high content analysis and other imaging applications, few published methods exist for evaluating and benchmarking the analytical performance of a microscope. Easy-to-use benchmarking methods would facilitate the use of fluorescence imaging as a quantitative analytical tool in research applications, and would aid the determination of instrumental method validation for commercial product development applications. We describe and evaluate an automated method to characterize a fluorescence imaging system's performance by benchmarking the detection threshold, saturation, and linear dynamic range to a reference material. The benchmarking procedure is demonstrated using two different materials as the reference material, uranyl-ion-doped glass and Schott 475 GG filter glass. Both are suitable candidate reference materials that are homogeneously fluorescent and highly photostable, and the Schott 475 GG filter glass is currently commercially available. In addition to benchmarking the analytical performance, we also demonstrate that the reference materials provide for accurate day to day intensity calibration. Published 2014 Wiley Periodicals Inc. Published 2014 Wiley Periodicals Inc. This article is a US government work and, as such, is in the public domain in the United States of America.

  2. Looking for new biomarkers of skin wound vitality with a cytokine-based multiplex assay: preliminary study.

    PubMed

    Peyron, Pierre-Antoine; Baccino, Éric; Nagot, Nicolas; Lehmann, Sylvain; Delaby, Constance

    2017-02-01

    Determination of skin wound vitality is an important issue in forensic practice. No reliable biomarker currently exists. Quantification of inflammatory cytokines in injured skin with MSD ® technology is an innovative and promising approach. This preliminary study aims to develop a protocol for the preparation and the analysis of skin samples. Samples from ante mortem wounds, post mortem wounds, and intact skin ("control samples") were taken from corpses at the autopsy. After an optimization of the pre-analytical protocol had been performed in terms of skin homogeneisation and proteic extraction, the concentration of TNF-α was measured in each sample with the MSD ® approach. Then five other cytokines of interest (IL-1β, IL-6, IL-10, IL-12p70 and IFN-γ) were simultaneously quantified with a MSD ® multiplex assay. The optimal pre-analytical conditions consist in a proteic extraction from a 6 mm diameter skin sample, in a PBS buffer with triton 0,05%. Our results show the linearity and the reproductibility of the TNF-α quantification with MSD ® , and an inter- and intra-individual variability of the concentrations of proteins. The MSD ® multiplex assay is likely to detect differential skin concentrations for each cytokine of interest. This preliminary study was used to develop and optimize the pre-analytical and analytical conditions of the MSD ® method using injured and healthy skin samples, for the purpose of looking for and identifying the cytokine, or the set of cytokines, that may be biomarkers of skin wound vitality.

  3. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field there still exist fundamental needs for considering pre-analytical variability that can introduce bias to the subsequent analytical process and decrease the reliability of the results and moreover confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of a range variation among sample collection methods is necessary to prevent results misinterpretation and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Protocol and standard operating procedures for common use in a worldwide multicenter study on reference values.

    PubMed

    Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George

    2013-05-01

    The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.

  5. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R

    2005-02-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high quality, appropriately anatomically tagged scientific results. In optimized protocols is a source of inefficiency in current life science research. Improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  6. Spectrophotometric Analysis of Phenolic Compounds in Grapes and Wines.

    PubMed

    Aleixandre-Tudo, Jose Luis; Buica, Astrid; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel

    2017-05-24

    Phenolic compounds are of crucial importance for red wine color and mouthfeel attributes. A large number of enzymatic and chemical reactions involving phenolic compounds take place during winemaking and aging. Despite the large number of published analytical methods for phenolic analyses, the values obtained may vary considerably. In addition, the existing scientific knowledge needs to be updated, but also critically evaluated and simplified for newcomers and wine industry partners. The most used and widely cited spectrophotometric methods for grape and wine phenolic analysis were identified through a bibliometric search using the Science Citation Index-Expanded (SCIE) database accessed through the Web of Science (WOS) platform from Thompson Reuters. The selection of spectrophotometry was based on its ease of use as a routine analytical technique. On the basis of the number of citations, as well as the advantages and disadvantages reported, the modified Somers assay appears as a multistep, simple, and robust procedure that provides a good estimation of the state of the anthocyanins equilibria. Precipitation methods for total tannin levels have also been identified as preferred protocols for these types of compounds. Good reported correlations between methods (methylcellulose precipitable vs bovine serum albumin) and between these and perceived red wine astringency, in combination with the adaptation to high-throughput format, make them suitable for routine analysis. The bovine serum albumin tannin assay also allows for the estimation of the anthocyanins content with the measurement of small and large polymeric pigments. Finally, the measurement of wine color using the CIELab space approach is also suggested as the protocol of choice as it provides good insight into the wine's color properties.

  7. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied b in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules m Mars analogues. These simulants 6om terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatograph mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth since each technique provides a unique analysis capability for Certain classes of organic molecules.

  8. MC ICP-MS δ(34)S(VCDT) measurement of dissolved sulfate in environmental aqueous samples after matrix separation by means of an anion exchange membrane.

    PubMed

    Hanousek, Ondrej; Berger, Torsten W; Prohaska, Thomas

    2016-01-01

    Analysis of (34)S/(32)S of sulfate in rainwater and soil solutions can be seen as a powerful tool for the study of the sulfur cycle. Therefore, it is considered as a useful means, e.g., for amelioration and calibration of ecological or biogeochemical models. Due to several analytical limitations, mainly caused by low sulfate concentration in rainwater, complex matrix of soil solutions, limited sample volume, and high number of samples in ecosystem studies, a straightforward analytical protocol is required to provide accurate S isotopic data on a large set of diverse samples. Therefore, sulfate separation by anion exchange membrane was combined with precise isotopic measurement by multicollector inductively coupled plasma mass spectrometry (MC ICP-MS). The separation method proved to be able to remove quantitatively sulfate from matrix cations (Ca, K, Na, or Li) which is a precondition in order to avoid a matrix-induced analytical bias in the mass spectrometer. Moreover, sulfate exchange on the resin is capable of preconcentrating sulfate from low concentrated solutions (to factor 3 in our protocol). No significant sulfur isotope fractionation was observed during separation and preconcentration. MC ICP-MS operated at edge mass resolution has enabled the direct (34)S/(32)S analysis of sulfate eluted from the membrane, with an expanded uncertainty U (k = 2) down to 0.3 ‰ (a single measurement). The protocol was optimized and validated using different sulfate solutions and different matrix compositions. The optimized method was applied in a study on solute samples retrieved in a beech (Fagus sylvatica) forest in the Vienna Woods. Both rainwater (precipitation and tree throughfall) and soil solution δ (34)SVCDT ranged between 4 and 6 ‰, the ratio in soil solution being slightly lower. The lower ratio indicates that a considerable portion of the atmospherically deposited sulfate is cycled through the organic S pool before being released to the soil solution. Nearly the same trends and variations were observed in soil solution and rainwater δ (34)SVCDT values showing that sulfate adsorption/desorption are not important processes in the studied soil.

  9. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE PAGES

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...

    2016-10-18

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each),more » were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of training and good analytical procedures needed to generate this data. As a result, when combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.« less

  10. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments.

    PubMed

    Magnusson, R; Nordlander, T; Östin, A

    2016-01-15

    Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each),more » were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of training and good analytical procedures needed to generate this data. As a result, when combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.« less

  12. Heavy vehicle driver workload assessment. Task 1, task analysis data and protocols review

    DOT National Transportation Integrated Search

    This report contains a review of available task analytic data and protocols pertinent to heavy vehicle operation and determination of the availability and relevance of such data to heavy vehicle driver workload assessment. Additionally, a preliminary...

  13. Rapid determination of anti-estrogens by gas chromatography/mass spectrometry in urine: Method validation and application to real samples.

    PubMed

    Gerace, E; Salomone, A; Abbadessa, G; Racca, S; Vincenti, M

    2012-02-01

    A fast screening protocol was developed for the simultaneous determination of nine anti-estrogenic agents (aminoglutethimide, anastrozole, clomiphene, drostanolone, formestane, letrozole, mesterolone, tamoxifen, testolactone) plus five of their metabolites in human urine. After an enzymatic hydrolysis, these compounds can be extracted simultaneously from urine with a simple liquid-liquid extraction at alkaline conditions. The analytes were subsequently analyzed by fast-gas chromatography/mass spectrometry (fast-GC/MS) after derivatization. The use of a short column, high-flow carrier gas velocity and fast temperature ramping produced an efficient separation of all analytes in about 4 min, allowing a processing rate of 10 samples/h. The present analytical method was validated according to UNI EN ISO/IEC 17025 guidelines for qualitative methods. The range of investigated parameters included the limit of detection, selectivity, linearity, repeatability, robustness and extraction efficiency. High MS-sampling rate, using a benchtop quadrupole mass analyzer, resulted in accurate peak shape definition under both scan and selected ion monitoring modes, and high sensitivity in the latter mode. Therefore, the performances of the method are comparable to the ones obtainable from traditional GC/MS analysis. The method was successfully tested on real samples arising from clinical treatments of hospitalized patients and could profitably be used for clinical studies on anti-estrogenic drug administration.

  14. Rapid determination of anti-estrogens by gas chromatography/mass spectrometry in urine: Method validation and application to real samples

    PubMed Central

    Gerace, E.; Salomone, A.; Abbadessa, G.; Racca, S.; Vincenti, M.

    2011-01-01

    A fast screening protocol was developed for the simultaneous determination of nine anti-estrogenic agents (aminoglutethimide, anastrozole, clomiphene, drostanolone, formestane, letrozole, mesterolone, tamoxifen, testolactone) plus five of their metabolites in human urine. After an enzymatic hydrolysis, these compounds can be extracted simultaneously from urine with a simple liquid–liquid extraction at alkaline conditions. The analytes were subsequently analyzed by fast-gas chromatography/mass spectrometry (fast-GC/MS) after derivatization. The use of a short column, high-flow carrier gas velocity and fast temperature ramping produced an efficient separation of all analytes in about 4 min, allowing a processing rate of 10 samples/h. The present analytical method was validated according to UNI EN ISO/IEC 17025 guidelines for qualitative methods. The range of investigated parameters included the limit of detection, selectivity, linearity, repeatability, robustness and extraction efficiency. High MS-sampling rate, using a benchtop quadrupole mass analyzer, resulted in accurate peak shape definition under both scan and selected ion monitoring modes, and high sensitivity in the latter mode. Therefore, the performances of the method are comparable to the ones obtainable from traditional GC/MS analysis. The method was successfully tested on real samples arising from clinical treatments of hospitalized patients and could profitably be used for clinical studies on anti-estrogenic drug administration. PMID:29403714

  15. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  16. Comparative Analytical Utility of DNA Derived from Alternative Human Specimens for Molecular Autopsy and Diagnostics

    PubMed Central

    Klassen, Tara L.; von Rüden, Eva-Lotta; Drabek, Janice; Noebels, Jeffrey L.; Goldman, Alica M.

    2013-01-01

    Genetic testing and research have increased the demand for high-quality DNA that has traditionally been obtained by venipuncture. However, venous blood collection may prove difficult in special populations and when large-scale specimen collection or exchange is prerequisite for international collaborative investigations. Guthrie/FTA card–based blood spots, buccal scrapes, and finger nail clippings are DNA-containing specimens that are uniquely accessible and thus attractive as alternative tissue sources (ATS). The literature details a variety of protocols for extraction of nucleic acids from a singular ATS type, but their utility has not been systematically analyzed in comparison with conventional sources such as venous blood. Additionally, the efficacy of each protocol is often equated with the overall nucleic acid yield but not with the analytical performance of the DNA during mutation detection. Together with a critical in-depth literature review of published extraction methods, we developed and evaluated an all-inclusive approach for serial, systematic, and direct comparison of DNA utility from multiple biological samples. Our results point to the often underappreciated value of these alternative tissue sources and highlight ways to maximize the ATS-derived DNA for optimal quantity, quality, and utility as a function of extraction method. Our comparative analysis clarifies the value of ATS in genomic analysis projects for population-based screening, diagnostics, molecular autopsy, medico-legal investigations, or multi-organ surveys of suspected mosaicisms. PMID:22796560

  17. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via the analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these instable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  18. Indoor Exposure Product Testing Protocols Version 2

    EPA Science Inventory

    EPA’s Office of Pollution Prevention and Toxics (OPPT) has developed a set of ten indoor exposure testing protocols intended to provide information on the purpose of the testing, general description of the sampling and analytical procedures, and references for tests that will be ...

  19. Detection of genetically modified organisms in foods by DNA amplification techniques.

    PubMed

    García-Cañas, Virginia; Cifuentes, Alejandro; González, Ramón

    2004-01-01

    In this article, the different DNA amplification techniques that are being used for detecting genetically modified organisms (GMOs) in foods are examined. This study intends to provide an updated overview (including works published till June 2002) on the principal applications of such techniques together with their main advantages and drawbacks in GMO detection in foods. Some relevant facts on sampling, DNA isolation, and DNA amplification methods are discussed. Moreover; these analytical protocols are discuissed from a quantitative point of view, including the newest investigations on multiplex detection of GMOs in foods and validation of methods.

  20. Isotope Inversion Experiment evaluating the suitability of calibration in surrogate matrix for quantification via LC-MS/MS-Exemplary application for a steroid multi-method.

    PubMed

    Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H

    2016-05-30

    For quotable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is in general not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts in respect to their functions, i.e. SIL compound is the analyte and nonlabelled substance is employed as internal standard. As a consequence, both surrogate and authentic matrix are analyte-free regarding SIL analytes, which allows a comparison of both matrices. We called this approach Isotope Inversion Experiment. As figure of merit we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application a LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment in the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were all in all very satisfying. As a consequence the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminant can be due to the manufacturing process, to raw materials used or to the migration of phthalates from packaging when plastic (polyvinyl chloride--PVC) is used. 8 phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 phthalates regulated. Analyses are carried out on a GC/MS system with electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column 30 m × 0.25 mm (i.d.) × 0.25 mm film thickness using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions, highlight a satisfactory system conformity (resolution>1.5), a common quantification limit at 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹ as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using ter-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. LABORATORY MISCONDUCT - WHAT CAN HAPPEN TO YOU?

    EPA Science Inventory

    Contracted laboratories perform a vast number of routine and special analytical services that are the foundation of decisions upon which rests the fate of the environment. Guiding these laboratories in the generation of environmental data has been the analytical protocols and ...

  3. A global multicenter study on reference values: 1. Assessment of methods for derivation and comparison of reference intervals.

    PubMed

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Qiu, Ling; Erasmus, Rajiv; Borai, Anwar; Evgina, Svetlana; Ashavaid, Tester; Khan, Dilshad; Schreier, Laura; Rolle, Reynan; Shimizu, Yoshihisa; Kimura, Shogo; Kawano, Reo; Armbruster, David; Mori, Kazuo; Yadav, Binod K

    2017-04-01

    The IFCC Committee on Reference Intervals and Decision Limits coordinated a global multicenter study on reference values (RVs) to explore rational and harmonizable procedures for derivation of reference intervals (RIs) and investigate the feasibility of sharing RIs through evaluation of sources of variation of RVs on a global scale. For the common protocol, rather lenient criteria for reference individuals were adopted to facilitate harmonized recruitment with planned use of the latent abnormal values exclusion (LAVE) method. As of July 2015, 12 countries had completed their study with total recruitment of 13,386 healthy adults. 25 analytes were measured chemically and 25 immunologically. A serum panel with assigned values was measured by all laboratories. RIs were derived by parametric and nonparametric methods. The effect of LAVE methods is prominent in analytes which reflect nutritional status, inflammation and muscular exertion, indicating that inappropriate results are frequent in any country. The validity of the parametric method was confirmed by the presence of analyte-specific distribution patterns and successful Gaussian transformation using the modified Box-Cox formula in all countries. After successful alignment of RVs based on the panel test results, nearly half the analytes showed variable degrees of between-country differences. This finding, however, requires confirmation after adjusting for BMI and other sources of variation. The results are reported in the second part of this paper. The collaborative study enabled us to evaluate rational methods for deriving RIs and comparing the RVs based on real-world datasets obtained in a harmonized manner. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of AQbD protocol, the selected critical method attributes (CMAs) were the separation criterions between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs namely Plackett-Burman and Box-Behnken designs were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between - 1.3-3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and the limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48μgg -1 in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Methodological aspects of crossover and maximum fat-oxidation rate point determination.

    PubMed

    Michallet, A-S; Tonini, J; Regnier, J; Guinot, M; Favre-Juvin, A; Bricout, V; Halimi, S; Wuyam, B; Flore, P

    2008-11-01

    Indirect calorimetry during exercise provides two metabolic indices of substrate oxidation balance: the crossover point (COP) and maximum fat oxidation rate (LIPOXmax). We aimed to study the effects of the analytical device, protocol type and ventilatory response on variability of these indices, and the relationship with lactate and ventilation thresholds. After maximum exercise testing, 14 relatively fit subjects (aged 32+/-10 years; nine men, five women) performed three submaximum graded tests: one was based on a theoretical maximum power (tMAP) reference; and two were based on the true maximum aerobic power (MAP). Gas exchange was measured concomitantly using a Douglas bag (D) and an ergospirometer (E). All metabolic indices were interpretable only when obtained by the D reference method and MAP protocol. Bland and Altman analysis showed overestimation of both indices with E versus D. Despite no mean differences between COP and LIPOXmax whether tMAP or MAP was used, the individual data clearly showed disagreement between the two protocols. Ventilation explained 10-16% of the metabolic index variations. COP was correlated with ventilation (r=0.96, P<0.01) and the rate of increase in blood lactate (r=0.79, P<0.01), and LIPOXmax correlated with the ventilation threshold (r=0.95, P<0.01). This study shows that, in fit healthy subjects, the analytical device, reference used to build the protocol and ventilation responses affect metabolic indices. In this population, and particularly to obtain interpretable metabolic indices, we recommend a protocol based on the true MAP or one adapted to include the transition from fat to carbohydrate. The correlation between metabolic indices and lactate/ventilation thresholds suggests that shorter, classical maximum progressive exercise testing may be an alternative means of estimating these indices in relatively fit subjects. However, this needs to be confirmed in patients who have metabolic defects.

  6. Fluid-chemical evidence for one billion years of fluid flow through Mesoproterozoic deep-water carbonate mounds (Nanisivik zinc district, Nunavut)

    NASA Astrophysics Data System (ADS)

    Hahn, K. E.; Turner, E. C.; Kontak, D. J.; Fayek, M.

    2018-02-01

    Ancient carbonate rocks commonly contain numerous post-depositional phases (carbonate minerals; quartz) recording successive diagenetic events that can be deciphered and tied to known or inferred geological events using a multi-pronged in situ analytical protocol. The framework voids of large, deep-water microbial carbonate seep-mounds in Arctic Canada (Mesoproterozoic Ikpiarjuk Formation) contain multiple generations of synsedimentary and late cement. An in situ analytical study of the post-seafloor cements used optical and cathodoluminescence petrography, SEM-EDS analysis, fluid inclusion (FI) microthermometry and evaporate mound analysis, LA-ICP-MS analysis, and SIMS δ18O to decipher the mounds' long-term diagenetic history. The six void-filling late cements include, in paragenetic order: inclusion-rich euhedral dolomite (ED), finely crystalline clear dolomite (FCD), hematite-bearing dolomite (HD), coarsely crystalline clear dolomite (CCD), quartz (Q), replacive calcite (RC) and late calcite (LC). Based on the combined analytical results, the following fluid-flow history is defined: (1) ED precipitation by autocementation during shallow burial (fluid 1; Mesoproterozoic); (2) progressive mixing of Ca-rich hydrothermal fluid with the connate fluid, resulting in precipitation of FCD followed by HD (fluid 2; also Mesoproterozoic); (3) precipitation of hydrothermal dolomite (CCD) from high-Ca and K-rich fluids (fluid 3; possibly Mesoproterozoic, but timing unclear); (4) hydrothermal Q precipitation (fluid 4; timing unclear), and (5) RC and LC precipitation from a meteoric-derived water (fluid 5) in or since the Mesozoic. Fluids associated with FCD, HD, and CCD may have been mobilised during deposition of the upper Bylot Supergroup; this time interval was the most tectonically active episode in the region's Mesoproterozoic to Recent history. The entire history of intermittent fluid migration and cement precipitation recorded in seemingly unimportant void-filling mineral phases spans over 1 billion years, and was decipherable only because of the in situ protocol used. The multiple-method in situ analytical protocol employed in this study substantially augments the knowledge of an area's geological history, parts of which cannot be discerned by means other than meticulous study of diagenetic phases, and should become routine in similar studies.

  7. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that handling of laboratory samples and analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, and to identify responsibilities of key personnel and participating laboratories, the chains of communication to be followed, and ensure that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  8. A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies

    NASA Astrophysics Data System (ADS)

    Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  9. A field-based cleaning protocol for sampling devices used in life-detection studies.

    PubMed

    Eigenbrode, Jennifer; Benning, Liane G; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E F

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  10. GC-MS analyses of the volatiles of Houttuynia cordata Thunb.

    PubMed

    Yang, Zhan-Nan; Luo, Shi-Qiong; Ma, Jing; Wu, Dan; Hong, Liang; Yu, Zheng-Wen

    2016-09-01

    GC-MS is the basis of analysis of plant volatiles. Several protocols employed for the assay have resulted in inconsistent results in the literature. We developed a GC-MS method, which were applied to analyze 25 volatiles (α-pinene, camphene, β-pinene, 2-methyl-2-pentenal, myrcene, (+)-limonene, eucalyptol, trans-2-hexenal, γ-terpinene, cis-3-hexeneyl-acetate, 1-hexanol, α-pinene oxide, cis-3-hexen-1-ol, trans-2-hexen-1-ol, decanal, linalool, acetyl-borneol, β-caryophyllene, 2-undecanone, 4-terpineol, borneol, decanol, eugenol, isophytol and phytol) of Houttuynia cordata Thunb. Linear behaviors for all analytes were observed with a linear regression relationship (r2>0.9991) at the concentrations tested. Recoveries of the 25 analytes were 98.56-103.77% with RSDs <3.0%. Solution extraction (SE), which involved addition of an internal standard, could avoid errors for factors in sample preparation by steam distillation (SD) and solidphase micro extraction (SPME). Less sample material (≍0.05g fresh leaves of H. cordata) could be used to determine the contents of 25 analytes by our proposed method and, after collection, did not affect the normal physiological activity or growth of H. cordata. This method can be used to monitor the metabolic accumulation of H. cordata volatiles.

  11. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    PubMed

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  12. Using functional neuroimaging combined with a think-aloud protocol to explore clinical reasoning expertise in internal medicine.

    PubMed

    Durning, Steven J; Graner, John; Artino, Anthony R; Pangaro, Louis N; Beckman, Thomas; Holmboe, Eric; Oakes, Terrance; Roy, Michael; Riedy, Gerard; Capaldi, Vincent; Walter, Robert; van der Vleuten, Cees; Schuwirth, Lambert

    2012-09-01

    Clinical reasoning is essential to medical practice, but because it entails internal mental processes, it is difficult to assess. Functional magnetic resonance imaging (fMRI) and think-aloud protocols may improve understanding of clinical reasoning as these methods can more directly assess these processes. The objective of our study was to use a combination of fMRI and think-aloud procedures to examine fMRI correlates of a leading theoretical model in clinical reasoning based on experimental findings to date: analytic (i.e., actively comparing and contrasting diagnostic entities) and nonanalytic (i.e., pattern recognition) reasoning. We hypothesized that there would be functional neuroimaging differences between analytic and nonanalytic reasoning theory. 17 board-certified experts in internal medicine answered and reflected on validated U.S. Medical Licensing Exam and American Board of Internal Medicine multiple-choice questions (easy and difficult) during an fMRI scan. This procedure was followed by completion of a formal think-aloud procedure. fMRI findings provide some support for the presence of analytic and nonanalytic reasoning systems. Statistically significant activation of prefrontal cortex distinguished answering incorrectly versus correctly (p < 0.01), whereas activation of precuneus and midtemporal gyrus distinguished not guessing from guessing (p < 0.01). We found limited fMRI evidence to support analytic and nonanalytic reasoning theory, as our results indicate functional differences with correct vs. incorrect answers and guessing vs. not guessing. However, our findings did not suggest one consistent fMRI activation pattern of internal medicine expertise. This model of employing fMRI correlates offers opportunities to enhance our understanding of theory, as well as improve our teaching and assessment of clinical reasoning, a key outcome of medical education.

  13. Applications of Functional Analytic and Martingale Methods to Problems in Queueing Network Theory.

    DTIC Science & Technology

    1983-05-14

    8217’") Air Force Office of Scientific Research Sf. ADDRESS (Cllty. State and ZIP Code) 7b. ADDRESS (City. State and ZIP Code) Directorate of Mathematical... Scientific Report on Air Force Grant #82-0167 Principal Investigator: Professor Walter A. Rosenkrantz I. Publications (1) Calculation of the LaPlace transform...whether or not a protocol for accessing a comunications channel is stable. In AFOSR 82-0167, Report No. 3 we showed that the SLOTTED ALOHA Multi access

  14. Hybrid optimal scheduling for intermittent androgen suppression of prostate cancer

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; di Bernardo, Mario; Bruchovsky, Nicholas; Aihara, Kazuyuki

    2010-12-01

    We propose a method for achieving an optimal protocol of intermittent androgen suppression for the treatment of prostate cancer. Since the model that reproduces the dynamical behavior of the surrogate tumor marker, prostate specific antigen, is piecewise linear, we can obtain an analytical solution for the model. Based on this, we derive conditions for either stopping or delaying recurrent disease. The solution also provides a design principle for the most favorable schedule of treatment that minimizes the rate of expansion of the malignant cell population.

  15. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Rodriguez-Canales, Jaime; Hanson, Jeffrey C; Hipp, Jason D; Balis, Ulysses J; Tangrea, Michael A; Emmert-Buck, Michael R; Bova, G Steven

    2013-01-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This updated chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. Improvement in this area will significantly increase life science quality and productivity. The chapter is divided into introduction, materials, protocols, and notes subheadings. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this chapter, readers are advised to read through the entire chapter first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  16. Mainstream Smoke Levels of Volatile Organic Compounds in 50 US Domestic Cigarette Brands Smoked with the ISO and Canadian Intense Protocols

    PubMed Central

    Pazo, Daniel Y.; Moliere, Fallon; Sampson, Maureen M.; Reese, Christopher M.; Agnew-Heard, Kimberly A.; Walters, Matthew J.; Holman, Matthew R.; Blount, Benjamin C.; Watson, Clifford; Chambers, David M.

    2017-01-01

    Introduction A significant portion of the increased risk of cancer and respiratory disease from exposure to cigarette smoke is attributed to volatile organic compounds (VOCs). In this study, 21 VOCs were quantified in mainstream cigarette smoke from 50 U.S. domestic brand varieties that included high market share brands and two Kentucky research cigarettes (3R4F and 1R5F). Methods Mainstream smoke was generated under ISO 3308 and Canadian Intense (CI) smoking protocols with linear smoking machines with a gas sampling bag collection followed by SPME/GC/MS analysis. Results For both protocols, mainstream smoke VOC amounts among the different brand varieties were strongly correlated between the majority of the analytes. Overall, Pearson correlation (r) ranged from 0.68 to 0.99 for ISO and 0.36 to 0.95 for CI. However, monoaromatic compounds were found to increase disproportionately compared to unsaturated, nitro, and carbonyl compounds under the CI smoking protocol where filter ventilation is blocked. Conclusions Overall, machine generated “vapor phase” amounts (μg/cigarette) are primarily attributed to smoking protocol (e.g., blocking of vent holes, puff volume, and puff duration) and filter ventilation. A possible cause for the disproportionate increase in monoaromatic compounds could be increased pyrolysis under low oxygen conditions associated with the CI protocol. PMID:27113015

  17. Development of a microchip-pulsed electrochemical method for rapid determination of L-DOPA and tyrosine in Mucuna pruriens.

    PubMed

    Li, Xinchun; Chen, Zuanguang; Yang, Fan; Pan, Jianbin; Li, Yinbao

    2013-05-01

    L-3,4-dihydroxyphenylalanine (L-DOPA) is a well-recognized therapeutic compound to Parkinson's disease. Tyrosine is a precursor for the biosynthesis of L-DOPA, both of which are widely found in traditional medicinal material, Mucuna pruriens. In this paper, we described a validated novel analytical method based on microchip capillary electrophoresis with pulsed electrochemical detection for the simultaneous measurement of L-DOPA and tyrosine in M. pruriens. This protocol adopted end-channel amperometric detection using platinum disk electrode on a homemade glass/polydimethylsiloxane electrophoresis microchip. The background buffer consisted of 10 mM borate (pH 9.5) and 0.02 mM cetyltrimethylammonium bromide, which can produce an effective resolution for the two analytes. In the optimal condition, sufficient electrophoretic separation and sensitive detection for the target analytes can be realized within 60 s. Both tyrosine and L-DOPA yielded linear response in the concentration range of 5.0-400 μM (R(2) > 0.99), and the LOD were 0.79 and 1.1 μM, respectively. The accuracy and precision of the established method were favorable. The present method shows several merits such as facile apparatus, high speed, low cost and minimal pollution, and provides a means for the pharmacologically active ingredients assay in M. pruriens. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Filter Membrane Effects on Water-Extractable Phosphorus Concentrations from Soil.

    PubMed

    Norby, Jessica; Strawn, Daniel; Brooks, Erin

    2018-03-01

    To accurately assess P concentrations in soil extracts, standard laboratory practices for monitoring P concentrations are needed. Water-extractable P is a common analytical test to determine P availability for leaching from soils, and it is used to determine best management practices. Most P analytical tests require filtration through a filter membrane with 0.45-μm pore size to distinguish between particulate and dissolved P species. However, filter membrane type is rarely specified in method protocols, and many different types of membranes are available. In this study, three common filter membrane materials (polyether sulfone, nylon, and nitrocellulose), all with 0.45-μm pore sizes, were tested for analytical differences in total P concentrations and dissolved reactive P (DRP) concentrations in water extracts from six soils sampled from two regions. Three of the extracts from the six soil samples had different total P concentrations for all three membrane types. The other three soil extracts had significantly different total P results from at least one filter membrane type. Total P concentration differences were as great as 35%. The DRP concentrations in the extracts were dependent on filter type in five of the six soil types. Results from this research show that filter membrane type is an important parameter that affects concentrations of total P and DRP from soil extracts. Thus, membrane type should be specified in soil extraction protocols. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  19. Analytical Protocol (GC/ECNIMS) for OSWER's Response to OIG Report (2005-P-00022) on Toxaphene Analysis

    EPA Science Inventory

    The research approached the large number and complexity of the analytes as four separate groups: technical toxaphene, toxaphene congeners (eight in number), chlordane, and organochlorine pesticides. This approach was advantageous because it eliminated potential interferences amon...

  20. Analysis of variation matrix array by bilinear least squares-residual bilinearization (BLLS-RBL) for resolving and quantifying of foodstuff dyes in a candy sample.

    PubMed

    Asadpour-Zeynali, Karim; Maryam Sajjadi, S; Taherzadeh, Fatemeh; Rahmanian, Reza

    2014-04-05

    Bilinear least square (BLLS) method is one of the most suitable algorithms for second-order calibration. Original BLLS method is not applicable to the second order pH-spectral data when an analyte has more than one spectroscopically active species. Bilinear least square-residual bilinearization (BLLS-RBL) was developed to achieve the second order advantage for analysis of complex mixtures. Although the modified method is useful, the pure profiles cannot be obtained and only the linear combination will be obtained. Moreover, for prediction of analyte in an unknown sample, the original algorithm of RBL may diverge; instead of converging to the desired analyte concentrations. Therefore, Gauss Newton-RLB algorithm should be used, which is not as simple as original protocol. Also, the analyte concentration can be predicted on the basis of each of the equilibrating species of the component of interest that are not exactly the same. The aim of the present work is to tackle the non-uniqueness problem in the second order calibration of monoprotic acid mixtures and divergence of RBL. Each pH-absorbance matrix was pretreated by subtraction of the first spectrum from other spectra in the data set to produce full rank array that is called variation matrix. Then variation matrices were analyzed uniquely by original BLLS-RBL that is more parsimonious than its modified counterpart. The proposed method was performed on the simulated as well as the analysis of real data. Sunset yellow and Carmosine as monoprotic acids were determined in candy sample in the presence of unknown interference by this method. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Development and application of a multi-residue method for the determination of 53 pharmaceuticals in water, sediment, and suspended solids using liquid chromatography-tandem mass spectrometry.

    PubMed

    Aminot, Yann; Litrico, Xavier; Chambolle, Mélodie; Arnaud, Christine; Pardon, Patrick; Budzindki, Hélène

    2015-11-01

    Comprehensive source and fate studies of pharmaceuticals in the environment require analytical methods able to quantify a wide range of molecules over various therapeutic classes, in aqueous and solid matrices. Considering this need, the development of an analytical method to determine 53 pharmaceuticals in aqueous phase and in solid matrices using a combination of microwave-assisted extraction, solid phase extraction, and liquid chromatography coupled with tandem mass spectrometry is reported. Method was successfully validated regarding linearity, repeatability, and overall protocol recovery. Method detection limits (MDLs) do not exceed 1 ng L(-1) for 40 molecules in aqueous matrices (6 ng L(-1) for the 13 remaining), while subnanogram per gram MDLs were reached for 38 molecules in solid phase (29 ng g(-1) for the 15 remaining). Losses due to preparative steps were assessed for the 32 analytes associated to their labeled homologue, revealing an average loss of 40 % during reconcentration, the most altering step. Presence of analytes in wastewater treatment plant (WWTP) effluent aqueous phase and suspended solids (SS) as well as in river water, SS, and sediments was then investigated on a periurban river located in the suburbs of Bordeaux, France, revealing a major contribution of WWTP effluent to the river contamination. Sorption on river SS exceeded 5 % of total concentration for amitriptyline, fluoxetine, imipramine, ritonavir, sildenafil, and propranolol and appeared to be submitted to a seasonal influence. Sediment contamination was lower than the one of SS, organic carbon content, and sediment fine element proportion was accountable for the highest measured concentrations.

  2. Thermal/Optical Methods for Elemental Carbon Quantification in Soils and Urban Dusts: Equivalence of Different Analysis Protocols

    PubMed Central

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method. PMID:24358286

  3. Analysis of longitudinal data from animals with missing values using SPSS.

    PubMed

    Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F

    2016-06-01

    Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.

  4. Are extraction methods in quantitative assays of pharmacopoeia monographs exhaustive? A comparison with pressurized liquid extraction.

    PubMed

    Basalo, Carlos; Mohn, Tobias; Hamburger, Matthias

    2006-10-01

    The extraction methods in selected monographs of the European and the Swiss Pharmacopoeia were compared to pressurized liquid extraction (PLE) with respect to the yield of constituents to be dosed in the quantitative assay for the respective herbal drugs. The study included five drugs, Belladonnae folium, Colae semen, Boldo folium, Tanaceti herba and Agni casti fructus. They were selected to cover different classes of compounds to be analyzed and different extraction methods to be used according to the monographs. Extraction protocols for PLE were optimized by varying the solvents and number of extraction cycles. In PLE, yields > 97 % of extractable analytes were typically achieved with two extraction cycles. For alkaloid-containing drugs, the addition of ammonia prior to extraction significantly increased the yield and reduced the number of extraction cycles required for exhaustive extraction. PLE was in all cases superior to the extraction protocol of the pharmacopoeia monographs (taken as 100 %), with differences ranging from 108 % in case of parthenolide in Tanaceti herba to 343 % in case of alkaloids in Boldo folium.

  5. Development of Two Analytical Methods Based on Reverse Phase Chromatographic and SDS-PAGE Gel for Assessment of Deglycosylation Yield in N-Glycan Mapping.

    PubMed

    Eckard, Anahita D; Dupont, David R; Young, Johnie K

    2018-01-01

    N -lined glycosylation is one of the critical quality attributes (CQA) for biotherapeutics impacting the safety and activity of drug product. Changes in pattern and level of glycosylation can significantly alter the intrinsic properties of the product and, therefore, have to be monitored throughout its lifecycle. Therefore fast, precise, and unbiased N -glycan mapping assay is desired. To ensure these qualities, using analytical methods that evaluate completeness of deglycosylation is necessary. For quantification of deglycosylation yield, methods such as reduced liquid chromatography-mass spectrometry (LC-MS) and reduced capillary gel electrophoresis (CGE) have been commonly used. Here we present development of two additional methods to evaluate deglycosylation yield: one based on LC using reverse phase (RP) column and one based on reduced sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE gel) with offline software (GelAnalyzer). With the advent of rapid deglycosylation workflows in the market for N -glycan profiling replacing overnight incubation, we have aimed to quantify the level of deglycosylation in a selected rapid deglycosylation workflow. Our results have shown well resolved peaks of glycosylated and deglycosylated protein species with RP-LC method allowing simple quantification of deglycosylation yield of protein with high confidence. Additionally a good correlation, ≥0.94, was found between deglycosylation yields estimated by RP-LC method and that of reduced SDS-PAGE gel method with offline software. Evaluation of rapid deglycosylation protocol from GlycanAssure™ HyPerformance assay kit performed on fetuin and RNase B has shown complete deglycosylation within the recommended protocol time when evaluated with these techniques. Using this kit, N -glycans from NIST mAb were prepared in 1.4 hr and analyzed by hydrophilic interaction chromatography (HILIC) ultrahigh performance LC (UHPLC) equipped with a fluorescence detector (FLD). 37 peaks were resolved with good resolution. Excellent sample preparation repeatability was found with relative standard deviation (RSD) of <5% for peaks with >0.5% relative area.

  6. Plant iTRAQ-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handakumbura, Pubudu; Hixson, Kim K.; Purvine, Samuel O.

    We present a simple one-­pot extraction protocol, which rapidly isolates hydrophyllic metabolites, lipids, and proteins from the same pulverized plant sample. Also detailed is a global plant proteomics sample preparation method utilizing iTRAQ multiplexing reagents that enables deep proteome coverage due to the use of HPLC fractionation of the peptides prior to mass spectrometric analysis. We have successfully used this protocol on several different plant tissues (e.g., roots, stems, leaves) from different plants (e.g., sorghum, poplar, Arabidopsis, soybean), and have been able to successfully detect and quantify thousands of proteins. Multiplexing strategies such as iTRAQ and the bioinformatics strategy outlinedmore » here, ultimately provide insight into which proteins are significantly changed in abundance between two or more groups (e.g., control, perturbation). Our bioinformatics strategy yields z-­score values, which normalize the expression data into a format that can easily be cross-­compared with other expression data (i.e., metabolomics, transcriptomics) obtained from different analytical methods and instrumentation.« less

  7. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix-and-read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information not only about the location of a molecule but also about its microscopic environment was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam War, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  8. SCIENCE MISCONDUCT ACTIVITIES IN ENVIRONMENTAL ANALYSIS - FRAUD DETECTION IN GC/MS/ICP ACTIVITIES

    EPA Science Inventory

    Contracted laboratories perform a vast number of routine and special analytical services that are the foundation of decisions upon which rests the fate of the environment. Guiding these laboratories in the generation of environmental data have been the analytical protocols and th...

  9. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  10. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  11. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  12. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  13. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  14. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2015-01-01

    Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as the most recent information about the physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). With reference to the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.

  15. Analytical Model of Large Data Transactions in CoAP Networks

    PubMed Central

    Ludovici, Alessandro; Di Marco, Piergiuseppe; Calveras, Anna; Johansson, Karl H.

    2014-01-01

    We propose a novel analytical model to study fragmentation methods in wireless sensor networks adopting the Constrained Application Protocol (CoAP) and the IEEE 802.15.4 standard for medium access control (MAC). The blockwise transfer technique proposed in CoAP and the 6LoWPAN fragmentation are included in the analysis. The two techniques are compared in terms of reliability and delay, depending on the traffic, the number of nodes and the parameters of the IEEE 802.15.4 MAC. The results are validated through Monte Carlo simulations. To the best of our knowledge, this is the first study that analytically evaluates and compares the performance of CoAP blockwise transfer and 6LoWPAN fragmentation. A major contribution is the possibility to understand the behavior of both techniques under different network conditions. Our results show that 6LoWPAN fragmentation is preferable for delay-constrained applications. For highly congested networks, the blockwise transfer slightly outperforms 6LoWPAN fragmentation in terms of reliability. PMID:25153143
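
    The abstract's analytical model is not reproduced here, but a toy Monte Carlo sketch can convey the qualitative difference between the two techniques: 6LoWPAN fragmentation loses the whole packet when any fragment is lost, while blockwise transfer can recover individual blocks through acknowledged CoAP retransmissions. All parameters (fragment count, delivery probability, retry limit) are illustrative assumptions, not values from the paper.

    ```python
    import random

    def transfer_success(p_frag: float, n_frags: int, coap_retries: int, blockwise: bool) -> bool:
        """One simulated transfer of a payload split into n_frags pieces.
        6LoWPAN: the packet fails if any single fragment is lost (no recovery
        at the adaptation layer). Blockwise: each block is an acknowledged
        CoAP exchange retried up to coap_retries times."""
        for _ in range(n_frags):
            attempts = 1 + (coap_retries if blockwise else 0)
            if not any(random.random() < p_frag for _ in range(attempts)):
                return False
        return True

    def reliability(p_frag, n_frags, retries, blockwise, runs=20_000):
        return sum(transfer_success(p_frag, n_frags, retries, blockwise) for _ in range(runs)) / runs

    for p in (0.95, 0.80):  # assumed per-fragment delivery probability after MAC retries
        print(p, "6LoWPAN:", reliability(p, 8, 4, False), "blockwise:", reliability(p, 8, 4, True))
    ```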

  16. Measurement Challenges for Carbon Nanotube Material

    NASA Technical Reports Server (NTRS)

    Sosa, Edward; Arepalli, Sivaram; Nikolaev, Pasha; Gorelik, Olga; Yowell, Leonard

    2006-01-01

    Advances in large-scale applications of carbon nanotubes demand a reliable supply of raw and processed materials. Consistent quality control of these nanomaterials is imperative in order to distinguish material inconsistency from the modifications induced by processing of nanotubes for any application. NASA Johnson Space Center recognized this need five years earlier and started a program to standardize characterization methods. The JSC team conducted two workshops (2003 and 2005) in collaboration with NIST, focusing on purity and dispersion measurement issues of carbon nanotubes [1]. In 2004, the NASA-JSC protocol was developed by combining the analytical techniques of SEM, TEM, UV-VIS-NIR absorption, Raman, and TGA [2]. This protocol is routinely used by researchers across the world as a first step in characterizing raw and purified carbon nanotubes. A suggested practice guide consisting of detailed chapters on TGA, Raman, electron microscopy and NIR absorption is in its final stages and is undergoing revisions with input from the nanotube community [3]. The possible addition of other techniques, such as XPS and ICP, to the existing protocol will be presented. Recent activities at ANSI and ISO toward implementing these protocols as nanotube characterization standards will be discussed.

  17. On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Tetsuya; Hayakawa, Tomohisa

    The relationship between the critical probability of the gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of the gossip protocol. Finally, we provide numerical results that support the theoretical analysis.
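
    As a numerical companion to the analytical result, the sketch below estimates the spanning probability of square-lattice site percolation by Monte Carlo. The lattice size and trial counts are arbitrary choices, and the known critical probability (~0.5927) is quoted only as a reference point; this is not the authors' approximation scheme.

    ```python
    import random
    from collections import deque

    def spans(grid) -> bool:
        """True if open sites connect the top row to the bottom row (4-neighbour BFS)."""
        n = len(grid)
        seen = [[False] * n for _ in range(n)]
        q = deque((0, j) for j in range(n) if grid[0][j])
        for i, j in q:
            seen[i][j] = True
        while q:
            i, j = q.popleft()
            if i == n - 1:
                return True
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < n and grid[a][b] and not seen[a][b]:
                    seen[a][b] = True
                    q.append((a, b))
        return False

    def spanning_probability(p: float, n: int = 64, trials: int = 200) -> float:
        hits = 0
        for _ in range(trials):
            grid = [[random.random() < p for _ in range(n)] for _ in range(n)]
            hits += spans(grid)
        return hits / trials

    for p in (0.55, 0.59, 0.63):  # pc for square-lattice site percolation is ~0.5927
        print(p, spanning_probability(p))
    ```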

  18. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improving image quality or reducing radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 down to 25% D0. A lesion of fixed size and contrast was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal-performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of the different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low-dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
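
    A minimal sketch of the observer-evaluation step, assuming channel outputs for signal-present and signal-absent images are available as matrices: it builds a Hotelling template from the class means and average covariance, then estimates AUC from the resulting test statistics via the Mann-Whitney construction. The synthetic channel data merely stand in for the CHO channel outputs described above.

    ```python
    import numpy as np

    def hotelling_auc(v_signal: np.ndarray, v_noise: np.ndarray) -> float:
        """Train a (channelized) Hotelling template on channel outputs
        (rows = images, cols = channels) and score AUC on the resulting
        observer test statistics via the Mann-Whitney U statistic."""
        mean_diff = v_signal.mean(axis=0) - v_noise.mean(axis=0)
        cov = 0.5 * (np.cov(v_signal, rowvar=False) + np.cov(v_noise, rowvar=False))
        w = np.linalg.solve(cov, mean_diff)        # Hotelling template
        t_s, t_n = v_signal @ w, v_noise @ w       # observer test statistics
        # AUC = P(t_signal > t_noise), estimated over all signal/noise pairs
        greater = (t_s[:, None] > t_n[None, :]).mean()
        ties = 0.5 * (t_s[:, None] == t_n[None, :]).mean()
        return float(greater + ties)

    rng = np.random.default_rng(0)
    v_n = rng.normal(0.0, 1.0, size=(200, 10))     # hypothetical channel outputs, no lesion
    v_s = rng.normal(0.3, 1.0, size=(200, 10))     # lesion present: small mean shift
    print(f"AUC ≈ {hotelling_auc(v_s, v_n):.3f}")
    ```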

  19. Concurrent measurement of cellular turbidity and hemoglobin to evaluate the antioxidant activity of plants.

    PubMed

    Bellik, Yuva; Iguer-Ouada, Mokrane

    2016-01-01

    In past decades, a multitude of analytical methods for measuring the antioxidant activity of plant extracts has been developed. However, when using methods that determine hemoglobin released from human erythrocytes treated with ginger extracts, we found hemoglobin concentrations significantly higher than in untreated control samples. This suggests that, in the presence of antioxidants, measuring hemoglobin alone is not sufficient to determine hemolysis. We show that concurrent measurement of erythrocyte concentration and hemoglobin is essential in such assays, and describe a new protocol based on simultaneous measurement of cellular turbidity and hemoglobin. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Exposure assessment for endocrine disruptors: some considerations in the design of studies.

    PubMed Central

    Rice, Carol; Birnbaum, Linda S; Cogliano, James; Mahaffey, Kathryn; Needham, Larry; Rogan, Walter J; vom Saal, Frederick S

    2003-01-01

    In studies designed to evaluate exposure-response relationships in children's development from conception through puberty, multiple factors that affect the generation of meaningful exposure metrics must be considered. These factors include multiple routes of exposure; the timing, frequency, and duration of exposure; need for qualitative and quantitative data; sample collection and storage protocols; and the selection and documentation of analytic methods. The methods for exposure data collection and analysis must be sufficiently robust to accommodate the a priori hypotheses to be tested, as well as hypotheses generated from the data. A number of issues that must be considered in study design are summarized here. PMID:14527851

  1. Guidance from an NIH Workshop on Designing, Implementing, and Reporting Clinical Studies of Soy Interventions

    PubMed Central

    Klein, Marguerite A.; Nahin, Richard L.; Messina, Mark J.; Rader, Jeanne I.; Thompson, Lilian U.; Badger, Thomas M.; Dwyer, Johanna T.; Kim, Young S.; Pontzer, Carol H.; Starke-Reed, Pamela E.; Weaver, Connie M.

    2010-01-01

    The NIH sponsored a scientific workshop, “Soy Protein/Isoflavone Research: Challenges in Designing and Evaluating Intervention Studies,” July 28–29, 2009. The workshop goal was to provide guidance for the next generation of soy protein/isoflavone human research. Session topics included population exposure to soy; the variability of the human response to soy; product composition; methods, tools, and resources available to estimate exposure and protocol adherence; and analytical methods to assess soy in foods and supplements and analytes in biologic fluids and other tissues. The intent of the workshop was to address the quality of soy studies, not the efficacy or safety of soy. Prior NIH workshops and an evidence-based review questioned the quality of data from human soy studies. If clinical studies are pursued, investigators need to ensure that the experimental designs are optimal and the studies properly executed. The workshop participants identified methodological issues that may confound study results and interpretation. Scientifically sound and useful options for dealing with these issues were discussed. The resulting guidance is presented in this document with a brief rationale. The guidance is specific to soy clinical research and does not address nonsoy-related factors that should also be considered in designing and reporting clinical studies. This guidance may be used by investigators, journal editors, study sponsors, and protocol reviewers for a variety of purposes, including designing and implementing trials, reporting results, and interpreting published epidemiological and clinical studies. PMID:20392880

  2. Development of an analytical microbial consortia method for enhancing performance monitoring at aerobic wastewater treatment plants.

    PubMed

    Razban, Behrooz; Nelson, Kristina Y; McMartin, Dena W; Cullimore, D Roy; Wall, Michelle; Wang, Dunling

    2012-01-01

    An analytical method to produce profiles of bacterial biomass fatty acid methyl esters (FAME) was developed, employing rapid agitation followed by static incubation (RASI) using selective media of wastewater microbial communities. The results were compiled to produce a unique library for comparison and performance analysis at a wastewater treatment plant (WWTP). A total of 146 samples from the aerated WWTP, comprising 73 samples each of secondary and tertiary effluent, were analyzed. For comparison purposes, all samples were evaluated via a similarity index (SI), with secondary effluents producing an SI of 0.88 with 2.7% variation and tertiary samples an SI of 0.86 with 5.0% variation. The results also highlighted significant differences between the fatty acid profiles of the tertiary and secondary effluents, indicating considerable shifts in the bacterial community profile between these treatment phases. The WWTP performance results using this method were highly replicable and reproducible, indicating that the protocol has potential as a performance-monitoring tool for aerated WWTPs. The results quickly and accurately reflect shifts in dominant bacterial communities that occur when process operations and performance change.
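
    The similarity index used by commercial FAME libraries is proprietary, so as a stand-in the sketch below scores two profiles with a cosine similarity over their shared fatty-acid features. The fatty-acid names and abundances are invented for illustration.

    ```python
    import math

    def similarity_index(profile_a: dict, profile_b: dict) -> float:
        """Cosine similarity between two FAME profiles given as
        {fatty acid name: relative abundance}. A stand-in for the
        commercial similarity index, which is computed differently."""
        names = set(profile_a) | set(profile_b)
        a = [profile_a.get(n, 0.0) for n in names]
        b = [profile_b.get(n, 0.0) for n in names]
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))

    # Hypothetical relative abundances for two effluent samples:
    secondary = {"16:0": 24.1, "18:1 w9c": 31.5, "cy19:0": 8.2}
    tertiary  = {"16:0": 28.4, "18:1 w9c": 22.0, "cy19:0": 12.9}
    print(f"SI = {similarity_index(secondary, tertiary):.2f}")
    ```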

  3. Student Career Decisions: The Limits of Rationality.

    ERIC Educational Resources Information Center

    Baumgardner, Steve R.; Rappoport, Leon

    This study compares modes of cognitive functioning revealed in student selection of a college major. Students were interviewed in-depth concerning reasons for their choice of majors. Protocol data suggested two distinct modes of thinking were evident on an analytic-intuitive dimension. For operational purposes analytic thinking was defined by…

  4. Detection methods and performance criteria for genetically modified organisms.

    PubMed

    Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly

    2002-01-01

    Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance with food labeling requirements in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancy of results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria for methods allow evaluation against a common standard. The more common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and to be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, by including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, the current GMO detection methods should be evaluated against a common set of performance criteria.

  5. Technology-assisted psychoanalysis.

    PubMed

    Scharff, Jill Savege

    2013-06-01

    Teleanalysis-remote psychoanalysis by telephone, voice over internet protocol (VoIP), or videoteleconference (VTC)-has been thought of as a distortion of the frame that cannot support authentic analytic process. Yet it can augment continuity, permit optimum frequency of analytic sessions for in-depth analytic work, and enable outreach to analysands in areas far from specialized psychoanalytic centers. Theoretical arguments against teleanalysis are presented and countered and its advantages and disadvantages discussed. Vignettes of analytic process from teleanalytic sessions are presented, and indications, contraindications, and ethical concerns are addressed. The aim is to provide material from which to judge the authenticity of analytic process supported by technology.

  6. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 4: ANALYTICAL GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  7. Development and Preliminary Evaluation of a FAP Protocol: Brief Relationship Enhancement

    ERIC Educational Resources Information Center

    Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis

    2012-01-01

    The purpose of this study was to develop a brief Functional Analytic Psychotherapy (FAP) protocol that will facilitate reliable implementation of FAP interventions, thus supporting research on FAP process and outcome. The treatment was a four-session individual therapy for clients who were interested in improving their relationship with their…

  8. Evaluation of Aspergillus PCR protocols for testing serum specimens.

    PubMed

    White, P Lewis; Mengoli, Carlo; Bretagne, Stéphane; Cuenca-Estrella, Manuel; Finnstrom, Niklas; Klingspor, Lena; Melchers, Willem J G; McCulloch, Elaine; Barnes, Rosemary A; Donnelly, J Peter; Loeffler, Juergen

    2011-11-01

    A panel of human serum samples spiked with various amounts of Aspergillus fumigatus genomic DNA was distributed to 23 centers within the European Aspergillus PCR Initiative to determine analytical performance of PCR. Information regarding specific methodological components and PCR performance was requested. The information provided was made anonymous, and meta-regression analysis was performed to determine any procedural factors that significantly altered PCR performance. Ninety-seven percent of protocols were able to detect a threshold of 10 genomes/ml on at least one occasion, with 83% of protocols reproducibly detecting this concentration. Sensitivity and specificity were 86.1% and 93.6%, respectively. Positive associations between sensitivity and the use of larger sample volumes, an internal control PCR, and PCR targeting the internal transcribed spacer (ITS) region were shown. Negative associations between sensitivity and the use of larger elution volumes (≥100 μl) and PCR targeting the mitochondrial genes were demonstrated. Most Aspergillus PCR protocols used to test serum generate satisfactory analytical performance. Testing serum requires less standardization, and the specific recommendations shown in this article will only improve performance.

  9. Evaluation of Aspergillus PCR Protocols for Testing Serum Specimens▿†

    PubMed Central

    White, P. Lewis; Mengoli, Carlo; Bretagne, Stéphane; Cuenca-Estrella, Manuel; Finnstrom, Niklas; Klingspor, Lena; Melchers, Willem J. G.; McCulloch, Elaine; Barnes, Rosemary A.; Donnelly, J. Peter; Loeffler, Juergen

    2011-01-01

    A panel of human serum samples spiked with various amounts of Aspergillus fumigatus genomic DNA was distributed to 23 centers within the European Aspergillus PCR Initiative to determine analytical performance of PCR. Information regarding specific methodological components and PCR performance was requested. The information provided was made anonymous, and meta-regression analysis was performed to determine any procedural factors that significantly altered PCR performance. Ninety-seven percent of protocols were able to detect a threshold of 10 genomes/ml on at least one occasion, with 83% of protocols reproducibly detecting this concentration. Sensitivity and specificity were 86.1% and 93.6%, respectively. Positive associations between sensitivity and the use of larger sample volumes, an internal control PCR, and PCR targeting the internal transcribed spacer (ITS) region were shown. Negative associations between sensitivity and the use of larger elution volumes (≥100 μl) and PCR targeting the mitochondrial genes were demonstrated. Most Aspergillus PCR protocols used to test serum generate satisfactory analytical performance. Testing serum requires less standardization, and the specific recommendations shown in this article will only improve performance. PMID:21940479

  10. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes, subject to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originate at the nodes at the periphery of the WSN, are successively transported to other nodes for aggregation based on information-theoretic measures of correlation, and are ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in terms of the concatenated protocol parameters. New source-to-destination routes are sought that optimize cross-layer interdependencies to achieve the "best available" performance in the WSN. The protocol design, modified from a known reactive protocol, adapts the achievable performance to the transient network conditions and resource levels. Control of network behavior is realized through the conditional rates of the MVPPs. Optimal cross-layer protocol parameters are determined by stochastic dynamic programming conditions derived from models of transient packetized sensor data flows. Moreover, the defining conditions for WSN configurations, grouping sensor nodes into clusters and establishing data aggregation at processing nodes within those clusters, lead to computationally tractable solutions to the stochastic differential equations that describe network dynamics. Closed-form solution characteristics provide an alternative to the "directed diffusion" methods for resource-efficient WSN protocols published previously by other researchers. Performance verification of the resulting cross-layer designs is obtained by embedding the optimality conditions for the protocols in actual WSN scenarios replicated in a wireless network simulation environment. A discussion of performance tradeoffs among protocol parameters is left for a sequel to this paper.

  11. Multi-residue determination of the sorption of illicit drugs and pharmaceuticals to wastewater suspended particulate matter using pressurised liquid extraction, solid phase extraction and liquid chromatography coupled with tandem mass spectrometry.

    PubMed

    Baker, David R; Kasprzyk-Hordern, Barbara

    2011-11-04

    Presented is the first comprehensive study of drugs of abuse on suspended particulate matter (SPM) in wastewater. Analysis of SPM is crucial to prevent the under-reporting of the levels of analyte that may be present in wastewater. Analytical methods to date analyse the aqueous part of wastewater samples only, removing SPM through the use of filtration or centrifugation. The development of an analytical method to determine 60 compounds on SPM using a combination of pressurised liquid extraction, solid phase extraction and liquid chromatography coupled with tandem mass spectrometry (PLE-SPE-LC-MS/MS) is reported. The range of compounds monitored included stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, and their metabolites. The method was successfully validated (parameters studied: linearity and range, recovery, accuracy, reproducibility, repeatability, matrix effects, and limits of detection and quantification). The developed methodology was applied to SPM samples collected at three wastewater treatment plants in the UK. The average proportion of analyte on SPM as opposed to in the aqueous phase was <5% for several compounds including cocaine, benzoylecgonine, MDMA, and ketamine; whereas the proportion was >10% with regard to methadone, EDDP, EMDP, BZP, fentanyl, nortramadol, norpropoxyphene, sildenafil and all antidepressants (dosulepin, amitriptyline, nortriptyline, fluoxetine and norfluoxetine). Consequently, the lack of SPM analysis in wastewater sampling protocol could lead to the under-reporting of the measured concentration of some compounds. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Development of a fast and simple gas chromatographic protocol based on the combined use of alkyl chloroformate and solid phase microextraction for the assay of polyamines in human urine.

    PubMed

    Naccarato, Attilio; Elliani, Rosangela; Cavaliere, Brunella; Sindona, Giovanni; Tagarelli, Antonio

    2018-05-11

    Polyamines are aliphatic amines with low molecular weight that are widely recognized as among the most important cancer biomarkers for early diagnosis and treatment. The goal of the work presented herein is the development of a rapid and simple method for the quantification of free polyamines (i.e., putrescine, cadaverine, spermidine, spermine) and N-monoacetylated polyamines (i.e., N1-acetylspermidine, N8-acetylspermidine, and N1-acetylspermine) in human urine. A preliminary derivatization with propyl chloroformate combined with the use of solid phase microextraction (SPME) allowed for an easy and automatable protocol involving minimal sample handling and no consumption of organic solvents. The affinity of the analytes toward five commercial SPME coatings was evaluated in univariate mode, and the best result in terms of analyte extraction was achieved using the divinylbenzene/carboxen/polydimethylsiloxane fiber. The variables affecting the performance of the SPME analysis were optimized by the multivariate approach of experimental design, in particular using a central composite design (CCD). The optimal working conditions in terms of response values are the following: extraction temperature of 40 °C, extraction time of 15 min, and no addition of NaCl. Analyses were carried out by gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS) in selected reaction monitoring (SRM) acquisition mode. The developed method was validated according to the guidelines issued by the Food and Drug Administration (FDA). The satisfactory performance in terms of linearity, sensitivity (LOQs between 0.01 and 0.1 μg/mL), matrix effect (68-121%), accuracy (inter-day values between -24% and +16%) and precision (3.3-28.4%) makes the proposed protocol suitable for the quantification of these important biomarkers in urine samples. Copyright © 2018 Elsevier B.V. All rights reserved.
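
    A small sketch of how central composite design points could be generated for the three SPME variables named above (extraction temperature, time, and salt addition). The centre points and step sizes are assumptions for illustration, not the design actually used in the study.

    ```python
    from itertools import product

    def central_composite(factors: dict, alpha: float = 1.68, n_center: int = 3) -> list:
        """Central composite design for k factors: 2**k factorial corners at
        coded +/-1, 2k axial (star) points at +/-alpha, and replicated centre
        points. `factors` maps name -> (centre, step) in real units."""
        names = list(factors)
        k = len(names)
        coded = [list(p) for p in product((-1.0, 1.0), repeat=k)]
        for j in range(k):                       # axial (star) points
            for s in (-alpha, alpha):
                point = [0.0] * k
                point[j] = s
                coded.append(point)
        coded += [[0.0] * k for _ in range(n_center)]   # centre replicates
        centre = [factors[n][0] for n in names]
        step = [factors[n][1] for n in names]
        return [{n: c + x * d for n, c, d, x in zip(names, centre, step, p)} for p in coded]

    # Hypothetical centre/step choices for the three SPME variables:
    runs = central_composite({"temp_C": (40, 10), "time_min": (15, 5), "NaCl_pct": (10, 10)})
    print(len(runs), "runs; first run:", runs[0])
    ```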

  13. An integrated platform for directly widely-targeted quantitative analysis of feces part I: Platform configuration and method validation.

    PubMed

    Song, Yuelin; Song, Qingqing; Li, Jun; Zheng, Jiao; Li, Chun; Zhang, Yuan; Zhang, Lingling; Jiang, Yong; Tu, Pengfei

    2016-07-08

    Direct analysis is of great importance for understanding the real chemical profile of a given sample, notably biological materials, because chemical degradation and diverse errors and uncertainties can result from sophisticated sample preparation protocols. In comparison with biofluids, direct analysis of solid biological samples using high performance liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) remains challenging. Herein, a new analytical platform was configured by online hyphenation of pressurized liquid extraction (PLE), turbulent flow chromatography (TFC), and LC-MS/MS. A facile but robust PLE module was constructed based on the phenomenon that noticeable back-pressure can be generated by rapid fluid passing through a narrow tube. A TFC column, which is advantageous for extracting low-molecular-weight analytes from a rushing fluid, was connected at the outlet of the PLE module to capture constituents of interest. An electronic 6-port/2-position valve was introduced between the TFC column and LC-MS/MS to divide each measurement into extraction and elution phases, whereas LC-MS/MS handled analyte separation and monitoring. As a proof of concept, simultaneous determination of 24 endogenous substances, including eighteen steroids, five eicosanoids, and one porphyrin, in feces was carried out in this paper. Method validation assays demonstrated the analytical platform to be qualified for the direct, simultaneous measurement of diverse endogenous analytes in fecal matrices. Application of this integrated platform to homolog-focused profiling of feces is discussed in a companion paper. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    PubMed

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine-learning-based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
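
    A minimal sketch, using scikit-learn, of the model-free classification-with-cross-validation pattern the study reports (e.g., adaptive boosting scored by n-fold cross-validation). The feature matrix here is synthetic noise standing in for the harmonized PPMI data, so the printed accuracy has no clinical meaning.

    ```python
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # Synthetic stand-in for multi-source features (clinical, demographic,
    # genetic, imaging); the real study used rebalanced PPMI cohorts.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(400, 20))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

    clf = AdaBoostClassifier(n_estimators=200, random_state=0)
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"5-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```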

  15. Thawing as a critical pre-analytical step in the lipidomic profiling of plasma samples: New standardized protocol.

    PubMed

    Pizarro, Consuelo; Arenzana-Rámila, Irene; Pérez-del-Notario, Nuria; Pérez-Matute, Patricia; González-Sáiz, José María

    2016-03-17

    Lipid profiling is a promising tool for the discovery and subsequent identification of biomarkers associated with various diseases. However, data quality is highly dependent on the pre-analytical methods employed. To date, potential confounding factors that may affect lipid metabolite levels after the thawing of plasma for biomarker exploration studies have not been thoroughly evaluated. In this study, by means of experimental design methodology, we performed the first in-depth examination of the ways in which thawing conditions affect lipid metabolite levels. After the optimization stage, we concluded that temperature, sample volume and the thawing method were the determining factors that had to be exhaustively controlled in the thawing process to ensure the quality of biomarker discovery. The best thawing conditions were found to be 4 °C, 0.25 mL of human plasma, and ultrasound (US) thawing. The newly proposed US thawing method was quicker than the other methods we studied, allowed more features to be identified and increased the signal of the lipids. In view of its speed, efficiency and detectability, the US thawing method appears to be a simple, economical method for the thawing of plasma samples, which could easily be applied in clinical laboratories before lipid profiling studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Virus removal retention challenge tests performed at lab scale and pilot scale during operation of membrane units.

    PubMed

    Humbert, H; Machinal, C; Labaye, Ivan; Schrotter, J C

    2011-01-01

    The determination of the virus retention capabilities of UF units during operation is essential for the operators of drinking water treatment facilities in order to guarantee efficient and stable removal of viruses over time. In previous studies, an effective method (MS2-phage challenge tests) was developed by the Water Research Center of Veolia Environnement for measuring the virus retention rates (log removal value, LRV) of commercially available hollow fiber membranes at lab scale. In the present work, the protocol for monitoring membrane performance was transferred from lab scale to pilot scale. Membrane performance was evaluated during a pilot trial and compared with the results obtained at lab scale with fibers taken from the pilot plant modules. The PFU culture method was compared with the RT-PCR method for the calculation of LRV in both cases. Preliminary tests at lab scale showed that the two methods can be used interchangeably. For tests conducted on virgin membrane, good consistency was observed between lab- and pilot-scale results with the two analytical methods used. This work shows that a reliable determination of membrane performance based on the RT-PCR analytical method can be achieved during operation of UF units.
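
    The log removal value itself is a one-line computation; below is a sketch with hypothetical MS2 titers (the feed and permeate concentrations are invented, not values from the study).

    ```python
    import math

    def log_removal_value(feed_titer: float, permeate_titer: float) -> float:
        """LRV = log10(feed concentration / permeate concentration), e.g. in
        PFU/mL for an MS2 phage challenge across a UF membrane."""
        return math.log10(feed_titer / permeate_titer)

    # Hypothetical MS2 challenge: 1e7 PFU/mL in the feed, 30 PFU/mL in the permeate
    print(f"LRV = {log_removal_value(1e7, 3e1):.1f}")  # -> 5.5
    ```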

  17. Determination of refractive and volatile elements in sediment using laser ablation inductively coupled plasma mass spectrometry.

    PubMed

    Duodu, Godfred Odame; Goonetilleke, Ashantha; Allen, Charlotte; Ayoko, Godwin A

    2015-10-22

    A wet-milling protocol was employed to produce pressed powder tablets with excellent cohesion and homogeneity suitable for laser ablation (LA) analysis of volatile and refractive elements in sediment. The influence of sample preparation on analytical performance was also investigated, including sample homogeneity, accuracy and limit of detection. Milling in a volatile solvent for 40 min ensured the sample was well mixed and could reasonably recover both volatile (Hg) and refractive (Zr) elements. With the exception of Cr (-52%) and Nb (+26%), major, minor and trace elements in STSD-1 and MESS-3 could be analysed within ±20% of the certified values. The method was compared with a total digestion method using HF by analysing 10 different sediment samples. The laser method recovers significantly higher amounts of analytes such as Ag, Cd and Sn than the total digestion method, making it a more robust method for elements across the periodic table. LA-ICP-MS also eliminates interferences from chemical reagents as well as the health and safety risks associated with digestion processes. Therefore, it can be considered an enhanced method for the analysis of heterogeneous matrices such as river sediments. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Simultaneous dispersive liquid-liquid microextraction derivatisation and gas chromatography mass spectrometry analysis of subcritical water extracts of sweet and sour cherry stems.

    PubMed

    Švarc-Gajić, Jaroslava; Clavijo, Sabrina; Suárez, Ruth; Cvetanović, Aleksandra; Cerdà, Víctor

    2018-03-01

    Cherry stems have been used in traditional medicine mostly for the treatment of urinary tract infections. Extraction with subcritical water differs substantially from conventional extraction techniques in its selectivity, efficiency, and other respects. The complexity of plant subcritical water extracts is due to the ability of subcritical water to extract different chemical classes with different physico-chemical properties and polarities in a single run. In this paper, dispersive liquid-liquid microextraction (DLLME) with simultaneous derivatisation was optimised for the analysis of complex subcritical water extracts of cherry stems to allow simple and rapid preparation prior to gas chromatography-mass spectrometry (GC-MS). After defining the optimal extracting and dispersive solvents, the optimised method was used for the identification of compounds belonging to different chemical classes in a single analytical run. The developed sample preparation protocol enabled simultaneous extraction and derivatisation, as well as convenient coupling with GC-MS analysis, reducing the analysis time and number of steps. The applied analytical protocol allowed simple and rapid chemical screening of subcritical water extracts and was used for the comparison of subcritical water extracts of sweet and sour cherry stems. Graphical abstract: DLLME GC-MS analysis of cherry stem extracts obtained by subcritical water.

  19. Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication

    PubMed Central

    Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin

    2017-01-01

    Introduction: Herbal medicines play an important role globally in the health care sector, and in industrialised countries they are often considered an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products, however, are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. Objective: To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Method: Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Results: Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products; both methods should be used in combination with appropriate hyphenated chemical methods for quality control. Conclusions: DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification is necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis published by John Wiley & Sons Ltd. PMID:28906059

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendron, R.; Engebrecht, C.

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  1. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    USGS Publications Warehouse

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, such that no more than 10 samples are analyzed between QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (recovery between 85% and 115%), most likely because these elements reside in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography; only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add considerable complication to the sampling and preservation protocols and a significant increase in cost for a continental-scale survey. Both of these issues must be considered carefully before adopting these parameters as part of the soil geochemical survey of North America.
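
    A small sketch of the recovery-based QC acceptance check described above, using the 85-115% window from the text. The certified values are placeholders chosen so that the quoted recoveries for Cr, Y, and Sb are reproduced; they are not the reference-material values.

    ```python
    def percent_recovery(measured: float, certified: float) -> float:
        """Recovery (%) of a measured concentration against a certified value."""
        return 100.0 * measured / certified

    def qc_flag(measured: float, certified: float, low: float = 85.0, high: float = 115.0) -> str:
        """Flag an element whose recovery falls outside the acceptance window."""
        r = percent_recovery(measured, certified)
        return f"{r:.0f}% {'OK' if low <= r <= high else 'OUT'}"

    # Placeholder certified values; measured values implied by the quoted recoveries:
    certified = {"Cr": 100.0, "Y": 100.0, "Sb": 100.0}
    measured  = {"Cr": 77.0,  "Y": 82.0,  "Sb": 80.0}
    for element in certified:
        print(element, qc_flag(measured[element], certified[element]))
    ```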

  2. Colloidal Mechanisms of Gold Nanoparticle Loss in Asymmetric Flow Field-Flow Fractionation.

    PubMed

    Jochem, Aljosha-Rakim; Ankah, Genesis Ngwa; Meyer, Lars-Arne; Elsenberg, Stephan; Johann, Christoph; Kraus, Tobias

    2016-10-07

    Flow field-flow fractionation is a powerful method for the analysis of nanoparticle size distributions, but its widespread use has been hampered by large analyte losses, especially of metal nanoparticles. Here, we report on the colloidal mechanisms underlying the losses. We studied gold nanoparticles (AuNPs) during asymmetrical flow field-flow fractionation (AF4) by systematically varying the particle properties and the eluent composition. Recoveries of AuNPs (core diameter 12 nm) stabilized by citrate or polyethylene glycol (PEG) at different ionic strengths were determined. We used online UV-vis detection and off-line elemental analysis to follow particle losses during full analysis runs, runs without cross-flow, and runs with parts of the instrument bypassed. The combination allowed us to calculate relative and absolute analyte losses at different stages of the analytic protocol. We found different loss mechanisms depending on the ligand. Citrate-stabilized particles degraded during analysis and suffered large losses (up to 74%). PEG-stabilized particles had smaller relative losses at moderate ionic strengths (1-20%) that depended on PEG length. Long PEGs at higher ionic strengths (≥5 mM) caused particle loss due to bridging adsorption at the membrane. Bulk agglomeration was not a relevant loss mechanism at low ionic strengths ≤5 mM for any of the studied particles. An unexpectedly large fraction of particles was lost at tubing and other internal surfaces. We propose that the colloidal mechanisms observed here are relevant loss mechanisms in many particle analysis protocols and discuss strategies to avoid them.

  3. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases

    PubMed Central

    Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-01-01

    Objectives: To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods: Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results: Transformation to the CDM resulted in minimal information loss across all 6 databases. The patients and observations excluded were removed because of identified data quality issues in the source systems; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion: The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria applied using the protocol, and identified differences in patient characteristics and coding practices across databases. Conclusion: Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757

  4. Protocol for Short- and Longer-term Spatial Learning and Memory in Mice

    PubMed Central

    Willis, Emily F.; Bartlett, Perry F.; Vukovic, Jana

    2017-01-01

    Studies on the role of the hippocampus in higher cognitive functions such as spatial learning and memory in rodents are reliant upon robust and objective behavioral tests. This protocol describes one such test—the active place avoidance (APA) task. This behavioral task involves the mouse continuously integrating visual cues to orientate itself within a rotating arena in order to actively avoid a shock zone, the location of which remains constant relative to the room. This protocol details the step-by-step procedures for a novel paradigm of the hippocampal-dependent APA task, measuring acquisition of spatial learning during a single 20-min trial (i.e., short-term memory), with spatial memory encoding and retrieval (i.e., long-term memory) assessed by trials conducted over consecutive days. Using the APA task, cognitive flexibility can be assessed using the reversal learning paradigm, as this increases the cognitive load required for efficient performance in the task. In addition to a detailed experimental protocol, this paper also describes the range of its possible applications, the expected key results, as well as the analytical methods to assess the data, and the pitfalls/troubleshooting measures. The protocol described herein is highly robust and produces replicable results, thus presenting an important paradigm that enables the assessment of subtle short-term changes in spatial learning and memory, such as those observed for many experimental interventions. PMID:29089878

  5. Spectrophotometric methods as a novel screening approach for analysis of dihydropyrimidine dehydrogenase activity before treatment with 5-fluorouracil chemotherapy.

    PubMed

    Dolegowska, B; Ostapowicz, A; Stanczyk-Dunaj, M; Blogowski, W

    2012-08-01

    5-Fluorouracil (5-FU) is one of the most commonly used chemotherapeutics in the treatment of malignancies originating from breast, prostate, ovarian, skin and gastrointestinal tissues. Around 80% of the administered dose of 5-FU is catabolized by dihydropyrimidine dehydrogenase (DPD). Patients in whom a deficiency or insufficient activity of this enzyme is observed are at high risk of developing severe, even lethal, 5-FU toxicity. According to recent studies, over 30 mutations of the DPYD gene associated with DPD deficiency/insufficiency have been discovered so far. Currently, there are several analytical methods used for measurements of DPD activity. In this paper, however, we report a novel, simple, economical and more accessible spectrophotometric method for measuring DPD activity in peripheral blood mononuclear cells (PBMCs), developed and validated through the analysis of 200 generally healthy volunteers aged 22-63. We present two spectrophotometric protocols in this study and, as a reference method, used previously described reverse-phase high-performance liquid chromatography (RP-HPLC) analysis. Based on our findings, we conclude that spectrophotometric methods may be used as a screening protocol preceding 5-FU-based chemotherapy. Nevertheless, before introduction into clinical practice, our results should be confirmed in further, larger studies.
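
    The paper's exact spectrophotometric protocol is not detailed in this abstract. As a generic illustration, the sketch below converts an absorbance rate at 340 nm into a specific enzyme activity via the Beer-Lambert law, assuming the assay follows NADPH consumption (a common design for DPD assays); the readings are hypothetical.

    ```python
    EPSILON_NADPH = 6220.0  # molar absorptivity of NADPH at 340 nm, M^-1 cm^-1 (standard value)

    def specific_activity(delta_a_per_min: float, path_cm: float,
                          assay_volume_ml: float, protein_mg: float) -> float:
        """Specific activity in nmol NADPH oxidized / min / mg protein from the
        rate of absorbance decrease at 340 nm (Beer-Lambert law)."""
        molar_per_min = delta_a_per_min / (EPSILON_NADPH * path_cm)     # mol/L/min
        nmol_per_min = molar_per_min * 1e9 * (assay_volume_ml / 1000.0)  # nmol/min in the cuvette
        return nmol_per_min / protein_mg

    # Hypothetical PBMC lysate: dA340 of 0.012/min in a 1 mL assay, 1 cm path, 0.5 mg protein
    print(f"{specific_activity(0.012, 1.0, 1.0, 0.5):.2f} nmol/min/mg")
    ```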

  6. GMOMETHODS: the European Union database of reference methods for GMO analysis.

    PubMed

    Bonfini, Laura; Van den Bulcke, Marc H; Mazzara, Marco; Ben, Enrico; Patak, Alexandre

    2012-01-01

    In order to provide reliable and harmonized information on methods for GMO (genetically modified organism) analysis we have published a database called "GMOMETHODS" that supplies information on PCR assays validated according to the principles and requirements of ISO 5725 and/or the International Union of Pure and Applied Chemistry protocol. In addition, the database contains methods that have been verified by the European Union Reference Laboratory for Genetically Modified Food and Feed in the context of compliance with an European Union legislative act. The web application provides search capabilities to retrieve primers and probes sequence information on the available methods. It further supplies core data required by analytical labs to carry out GM tests and comprises information on the applied reference material and plasmid standards. The GMOMETHODS database currently contains 118 different PCR methods allowing identification of 51 single GM events and 18 taxon-specific genes in a sample. It also provides screening assays for detection of eight different genetic elements commonly used for the development of GMOs. The application is referred to by the Biosafety Clearing House, a global mechanism set up by the Cartagena Protocol on Biosafety to facilitate the exchange of information on Living Modified Organisms. The publication of the GMOMETHODS database can be considered an important step toward worldwide standardization and harmonization in GMO analysis.

  7. Protocols and characterization data for 2D, 3D, and slice-based tumor models from the PREDECT project.

    PubMed

    de Hoogt, Ronald; Estrada, Marta F; Vidic, Suzana; Davies, Emma J; Osswald, Annika; Barbier, Michael; Santo, Vítor E; Gjerde, Kjersti; van Zoggel, Hanneke J A A; Blom, Sami; Dong, Meng; Närhi, Katja; Boghaert, Erwin; Brito, Catarina; Chong, Yolanda; Sommergruber, Wolfgang; van der Kuip, Heiko; van Weerden, Wytske M; Verschuren, Emmy W; Hickman, John; Graeser, Ralph

    2017-11-21

    Two-dimensional (2D) culture of cancer cells in vitro does not recapitulate the three-dimensional (3D) architecture, heterogeneity and complexity of human tumors. More representative models are required that better reflect key aspects of tumor biology. These are essential studies of cancer biology and immunology as well as for target validation and drug discovery. The Innovative Medicines Initiative (IMI) consortium PREDECT (www.predect.eu) characterized in vitro models of three solid tumor types with the goal to capture elements of tumor complexity and heterogeneity. 2D culture and 3D mono- and stromal co-cultures of increasing complexity, and precision-cut tumor slice models were established. Robust protocols for the generation of these platforms are described. Tissue microarrays were prepared from all the models, permitting immunohistochemical analysis of individual cells, capturing heterogeneity. 3D cultures were also characterized using image analysis. Detailed step-by-step protocols, exemplary datasets from the 2D, 3D, and slice models, and refined analytical methods were established and are presented.

  8. Protocols and characterization data for 2D, 3D, and slice-based tumor models from the PREDECT project

    PubMed Central

    de Hoogt, Ronald; Estrada, Marta F.; Vidic, Suzana; Davies, Emma J.; Osswald, Annika; Barbier, Michael; Santo, Vítor E.; Gjerde, Kjersti; van Zoggel, Hanneke J. A. A.; Blom, Sami; Dong, Meng; Närhi, Katja; Boghaert, Erwin; Brito, Catarina; Chong, Yolanda; Sommergruber, Wolfgang; van der Kuip, Heiko; van Weerden, Wytske M.; Verschuren, Emmy W.; Hickman, John; Graeser, Ralph

    2017-01-01

    Two-dimensional (2D) culture of cancer cells in vitro does not recapitulate the three-dimensional (3D) architecture, heterogeneity and complexity of human tumors. More representative models are required that better reflect key aspects of tumor biology. These are essential studies of cancer biology and immunology as well as for target validation and drug discovery. The Innovative Medicines Initiative (IMI) consortium PREDECT (www.predect.eu) characterized in vitro models of three solid tumor types with the goal to capture elements of tumor complexity and heterogeneity. 2D culture and 3D mono- and stromal co-cultures of increasing complexity, and precision-cut tumor slice models were established. Robust protocols for the generation of these platforms are described. Tissue microarrays were prepared from all the models, permitting immunohistochemical analysis of individual cells, capturing heterogeneity. 3D cultures were also characterized using image analysis. Detailed step-by-step protocols, exemplary datasets from the 2D, 3D, and slice models, and refined analytical methods were established and are presented. PMID:29160867

  9. The international fine aerosol networks

    NASA Astrophysics Data System (ADS)

    Cahill, Thomas A.

    1993-04-01

    The adoption by the United States of a PIXE-based protocol for its fine aerosol network, after open competitions involving numerous laboratories and methods, has encouraged cooperation with other countries possessing similar capabilities and similar needs. These informal cooperative programs, involving about a dozen countries at the end of 1991, almost all use PIXE as a major component of the analytical protocols. The University of California, Davis, Air Quality Group assisted such programs through indefinite loans of a quality assurance sampler, the IMPROVE Channel A, and analyses at no cost of a small fraction of the samples taken in a side-by-side configuration. In December 1991, the World Meteorological Organization chose a protocol essentially identical to IMPROVE for the Global Atmospheric Watch (GAW) network and began deploying units, the IMPROVE Channel A, to sites around the world. Preferred analyses include fine (less than about 2.5 μm) mass, ions by ion chromatography and elements by PIXE + PESA (or, lacking that, XRF). This paper will describe progress in both programs, giving examples of the utility of the data and projecting the future expansion of the network to about 20 GAW sites by 1994.

  10. A step-by-step protocol for assaying protein carbonylation in biological samples.

    PubMed

    Colombo, Graziano; Clerici, Marco; Garavaglia, Maria Elisa; Giustarini, Daniela; Rossi, Ranieri; Milzani, Aldo; Dalle-Donne, Isabella

    2016-04-15

    Protein carbonylation represents the most frequent and usually irreversible oxidative modification affecting proteins. This modification is chemically stable and this feature is particularly important for storage and detection of carbonylated proteins. Many biochemical and analytical methods have been developed during the last thirty years to assay protein carbonylation. The most successful method consists on protein carbonyl (PCO) derivatization with 2,4-dinitrophenylhydrazine (DNPH) and consequent spectrophotometric assay. This assay allows a global quantification of PCO content due to the ability of DNPH to react with carbonyl giving rise to an adduct able to absorb at 366 nm. Similar approaches were also developed employing chromatographic separation, in particular HPLC, and parallel detection of absorbing adducts. Subsequently, immunological techniques, such as Western immunoblot or ELISA, have been developed leading to an increase of sensitivity in protein carbonylation detection. Currently, they are widely employed to evaluate change in total protein carbonylation and eventually to highlight the specific proteins undergoing selective oxidation. In the last decade, many mass spectrometry (MS) approaches have been developed for the identification of the carbonylated proteins and the relative amino acid residues modified to carbonyl derivatives. Although these MS methods are much more focused and detailed due to their ability to identify the amino acid residues undergoing carbonylation, they still require too expensive equipments and, therefore, are limited in distribution. In this protocol paper, we summarise and comment on the most diffuse protocols that a standard laboratory can employ to assess protein carbonylation; in particular, we describe step-by-step the different protocols, adding suggestions coming from our on-bench experience. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR LABORATORY ANALYSIS OF HAIR SAMPLES FOR MERCURY (RTI-L-1.0)

    EPA Science Inventory

    The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...

  12. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    PubMed

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

    Lipidomic analysis of biological samples in a clinical research represents challenging task for analytical methods given by the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography - mass spectrometry (MS) lipidomic approaches represented by three analytical methods in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. Methods are compared in one laboratory using the identical analytical protocol to ensure comparable conditions. Ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and DI-MS method are used for this comparison as the most widely used methods for the lipidomic analysis together with ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomics analyses. The nontargeted analysis of pooled samples is performed using all tested methods and 610 lipid species within 23 lipid classes are identified. DI method provides the most comprehensive results due to identification of some polar lipid classes, which are not identified by UHPLC and UHPSFC methods. On the other hand, UHPSFC method provides an excellent sensitivity for less polar lipid classes and the highest sample throughput within 10min method time. The sample consumption of DI method is 125 times higher than for other methods, while only 40μL of organic solvent is used for one sample analysis compared to 3.5mL and 4.9mL in case of UHPLC and UHPSFC methods, respectively. Methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. Results show applicability of all tested methods for the lipidomic analysis of biological samples depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Methods for CT automatic exposure control protocol translation between scanner platforms.

    PubMed

    McKenney, Sarah E; Seibert, J Anthony; Lamba, Ramit; Boone, John M

    2014-03-01

    An imaging facility with a diverse fleet of CT scanners faces considerable challenges when propagating CT protocols with consistent image quality and patient dose across scanner makes and models. Although some protocol parameters can comfortably remain constant among scanners (eg, tube voltage, gantry rotation time), the automatic exposure control (AEC) parameter, which selects the overall mA level during tube current modulation, is difficult to match among scanners, especially from different CT manufacturers. Objective methods for converting tube current modulation protocols among CT scanners were developed. Three CT scanners were investigated, a GE LightSpeed 16 scanner, a GE VCT scanner, and a Siemens Definition AS+ scanner. Translation of the AEC parameters such as noise index and quality reference mAs across CT scanners was specifically investigated. A variable-diameter poly(methyl methacrylate) phantom was imaged on the 3 scanners using a range of AEC parameters for each scanner. The phantom consisted of 5 cylindrical sections with diameters of 13, 16, 20, 25, and 32 cm. The protocol translation scheme was based on matching either the volumetric CT dose index or image noise (in Hounsfield units) between two different CT scanners. A series of analytic fit functions, corresponding to different patient sizes (phantom diameters), were developed from the measured CT data. These functions relate the AEC metric of the reference scanner, the GE LightSpeed 16 in this case, to the AEC metric of a secondary scanner. When translating protocols between different models of CT scanners (from the GE LightSpeed 16 reference scanner to the GE VCT system), the translation functions were linear. However, a power-law function was necessary to convert the AEC functions of the GE LightSpeed 16 reference scanner to the Siemens Definition AS+ secondary scanner, because of differences in the AEC functionality designed by these two companies. Protocol translation on the basis of quantitative metrics (volumetric CT dose index or measured image noise) is feasible. Protocol translation has a dependency on patient size, especially between the GE and Siemens systems. Translation schemes that preserve dose levels may not produce identical image quality. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. A simple method for the small scale synthesis and solid-phase extraction purification of steroid sulfates.

    PubMed

    Waller, Christopher C; McLeod, Malcolm D

    2014-12-01

    Steroid sulfates are a major class of steroid metabolite that are of growing importance in fields such as anti-doping analysis, the detection of residues in agricultural produce or medicine. Despite this, many steroid sulfate reference materials may have limited or no availability hampering the development of analytical methods. We report simple protocols for the rapid synthesis and purification of steroid sulfates that are suitable for adoption by analytical laboratories. Central to this approach is the use of solid-phase extraction (SPE) for purification, a technique routinely used for sample preparation in analytical laboratories around the world. The sulfate conjugates of sixteen steroid compounds encompassing a wide range of steroid substitution patterns and configurations are prepared, including the previously unreported sulfate conjugates of the designer steroids furazadrol (17β-hydroxyandrostan[2,3-d]isoxazole), isofurazadrol (17β-hydroxyandrostan[3,2-c]isoxazole) and trenazone (17β-hydroxyestra-4,9-dien-3-one). Structural characterization data, together with NMR and mass spectra are reported for all steroid sulfates, often for the first time. The scope of this approach for small scale synthesis is highlighted by the sulfation of 1μg of testosterone (17β-hydroxyandrost-4-en-3-one) as monitored by liquid chromatography-mass spectrometry (LCMS). Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication.

    PubMed

    Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin; de Boer, Hugo

    2018-03-01

    Herbal medicines play an important role globally in the health care sector and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products however are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Recent literature relating to DNA based authentication of medicinal plants, herbal medicines and products are summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification are necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd.

  16. An ultra low-power and traffic-adaptive medium access control protocol for wireless body area network.

    PubMed

    Ullah, Sana; Kwak, Kyung Sup

    2012-06-01

    Wireless Body Area Network (WBAN) consists of low-power, miniaturized, and autonomous wireless sensor nodes that enable physicians to remotely monitor vital signs of patients and provide real-time feedback with medical diagnosis and consultations. It is the most reliable and cheaper way to take care of patients suffering from chronic diseases such as asthma, diabetes and cardiovascular diseases. Some of the most important attributes of WBAN is low-power consumption and delay. This can be achieved by introducing flexible duty cycling techniques on the energy constraint sensor nodes. Stated otherwise, low duty cycle nodes should not receive frequent synchronization and control packets if they have no data to send/receive. In this paper, we introduce a Traffic-adaptive MAC protocol (TaMAC) by taking into account the traffic information of the sensor nodes. The protocol dynamically adjusts the duty cycle of the sensor nodes according to their traffic-patterns, thus solving the idle listening and overhearing problems. The traffic-patterns of all sensor nodes are organized and maintained by the coordinator. The TaMAC protocol is supported by a wakeup radio that is used to accommodate emergency and on-demand events in a reliable manner. The wakeup radio uses a separate control channel along with the data channel and therefore it has considerably low power consumption requirements. Analytical expressions are derived to analyze and compare the performance of the TaMAC protocol with the well-known beacon-enabled IEEE 802.15.4 MAC, WiseMAC, and SMAC protocols. The analytical derivations are further validated by simulation results. It is shown that the TaMAC protocol outperforms all other protocols in terms of power consumption and delay.

  17. A slotted access control protocol for metropolitan WDM ring networks

    NASA Astrophysics Data System (ADS)

    Baziana, P. A.; Pountourakis, I. E.

    2009-03-01

    In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization reasons; to keep the data packets, while the control information is processed. An efficient access algorithm is applied to avoid both the data wavelengths and the receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, facing the scalability issues. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures evaluation is provided via an analytic model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol manages efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission over a dedicated data wavelength. Finally, performance measures evaluation is explored for diverse numbers of buffer size, access nodes and data wavelengths.

  18. Multiplexed Liquid Chromatography-Multiple Reaction Monitoring Mass Spectrometry Quantification of Cancer Signaling Proteins

    PubMed Central

    Chen, Yi; Fisher, Kate J.; Lloyd, Mark; Wood, Elizabeth R.; Coppola, Domenico; Siegel, Erin; Shibata, David; Chen, Yian A.; Koomen, John M.

    2017-01-01

    Quantitative evaluation of protein expression across multiple cancer-related signaling pathways (e.g. Wnt/β-catenin, TGF-β, receptor tyrosine kinases (RTK), MAP kinases, NF-κB, and apoptosis) in tumor tissues may enable the development of a molecular profile for each individual tumor that can aid in the selection of appropriate targeted cancer therapies. Here, we describe the development of a broadly applicable protocol to develop and implement quantitative mass spectrometry assays using cell line models and frozen tissue specimens from colon cancer patients. Cell lines are used to develop peptide-based assays for protein quantification, which are incorporated into a method based on SDS-PAGE protein fractionation, in-gel digestion, and liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM/MS). This analytical platform is then applied to frozen tumor tissues. This protocol can be broadly applied to the study of human disease using multiplexed LC-MRM assays. PMID:28808993

  19. IFSA: a microfluidic chip-platform for frit-based immunoassay protocols

    NASA Astrophysics Data System (ADS)

    Hlawatsch, Nadine; Bangert, Michael; Miethe, Peter; Becker, Holger; Gärtner, Claudia

    2013-03-01

    Point-of-care diagnostics (POC) is one of the key application fields for lab-on-a-chip devices. While in recent years much of the work has concentrated on integrating complex molecular diagnostic assays onto a microfluidic device, there is a need to also put comparatively simple immunoassay-type protocols on a microfluidic platform. In this paper, we present the development of a microfluidic cartridge using an immunofiltration approach. In this method, the sandwich immunoassay takes place in a porous frit on which the antibodies have immobilized. The device is designed to be able to handle three samples in parallel and up to four analytical targets per sample. In order to meet the critical cost targets for the diagnostic market, the microfluidic chip has been designed and manufactured using high-volume manufacturing technologies in mind. Validation experiments show comparable sensitivities in comparison with conventional immunofiltration kits.

  20. Bioanalytical challenge: A review of environmental and pharmaceuticals contaminants in human milk.

    PubMed

    Lopes, Bianca Rebelo; Barreiro, Juliana Cristina; Cass, Quezia Bezerra

    2016-10-25

    An overview of bioanalytical methods for the determination of environmental and pharmaceutical contaminants in human milk is presented. The exposure of children to these contaminants through lactation has been widely investigated. The human milk contains diverse proteins, lipids, and carbohydrates and the concentration of these components is drastically altered during the lactation period providing a high degree of an analytical challenge. Sample collection and pretreatment are still considered the Achilles' heel. This review presents liquid chromatographic methods developed in the last 10 years for this complex matrix with focuses in the extraction and quantification steps. Green sample preparation protocols have been emphasized. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Chemical Functionalization of Plasmonic Surface Biosensors: A Tutorial Review on Issues, Strategies, and Costs

    PubMed Central

    2017-01-01

    In an ideal plasmonic surface sensor, the bioactive area, where analytes are recognized by specific biomolecules, is surrounded by an area that is generally composed of a different material. The latter, often the surface of the supporting chip, is generally hard to be selectively functionalized, with respect to the active area. As a result, cross talks between the active area and the surrounding one may occur. In designing a plasmonic sensor, various issues must be addressed: the specificity of analyte recognition, the orientation of the immobilized biomolecule that acts as the analyte receptor, and the selectivity of surface coverage. The objective of this tutorial review is to introduce the main rational tools required for a correct and complete approach to chemically functionalize plasmonic surface biosensors. After a short introduction, the review discusses, in detail, the most common strategies for achieving effective surface functionalization. The most important issues, such as the orientation of active molecules and spatial and chemical selectivity, are considered. A list of well-defined protocols is suggested for the most common practical situations. Importantly, for the reported protocols, we also present direct comparisons in term of costs, labor demand, and risk vs benefit balance. In addition, a survey of the most used characterization techniques necessary to validate the chemical protocols is reported. PMID:28796479

  2. Algorithms and software for U-Pb geochronology by LA-ICPMS

    NASA Astrophysics Data System (ADS)

    McLean, Noah M.; Bowring, James F.; Gehrels, George

    2016-07-01

    The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.

  3. Simultaneous detection of xenon and krypton in equine plasma by gas chromatography-tandem mass spectrometry for doping control.

    PubMed

    Kwok, Wai Him; Choi, Timmy L S; So, Pui-Kin; Yao, Zhong-Ping; Wan, Terence S M

    2017-02-01

    Xenon can activate the hypoxia-inducible factors (HIFs). As such, it has been allegedly used in human sports for increasing erythropoiesis. Krypton, another noble gas with reported narcosis effect, can also be expected to be a potential and less expensive erythropoiesis stimulating agent. This has raised concern about the misuse of noble gases as doping agents in equine sports. The aim of the present study is to establish a method for the simultaneous detection of xenon and krypton in equine plasma for the purpose of doping control. Xenon- or krypton-fortified equine plasma samples were prepared according to reported protocols. The target noble gases were simultaneously detected by gas chromatography-triple quadrupole mass spectrometry using headspace injection. Three xenon isotopes at m/z 129, 131, and 132, and four krypton isotopes at m/z 82, 83, 84, and 86 were targeted in selected reaction monitoring mode (with the precursor ions and product ions at identical mass settings), allowing unambiguous identification of the target analytes. Limits of detection for xenon and krypton were about 19 pmol/mL and 98 pmol/mL, respectively. Precision for both analytes was less than 15%. The method has good specificity as background analyte signals were not observed in negative equine plasma samples (n = 73). Loss of analytes under different storage temperatures has also been evaluated. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Optimization of oncological {sup 18}F-FDG PET/CT imaging based on a multiparameter analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menezes, Vinicius O., E-mail: vinicius@radtec.com.br; Machado, Marcos A. D.; Queiroz, Cleiton C.

    2016-02-15

    Purpose: This paper describes a method to achieve consistent clinical image quality in {sup 18}F-FDG scans accounting for patient habitus, dose regimen, image acquisition, and processing techniques. Methods: Oncological PET/CT scan data for 58 subjects were evaluated retrospectively to derive analytical curves that predict image quality. Patient noise equivalent count rate and coefficient of variation (CV) were used as metrics in their analysis. Optimized acquisition protocols were identified and prospectively applied to 179 subjects. Results: The adoption of different schemes for three body mass ranges (<60 kg, 60–90 kg, >90 kg) allows improved image quality with both point spread functionmore » and ordered-subsets expectation maximization-3D reconstruction methods. The application of this methodology showed that CV improved significantly (p < 0.0001) in clinical practice. Conclusions: Consistent oncological PET/CT image quality on a high-performance scanner was achieved from an analysis of the relations existing between dose regimen, patient habitus, acquisition, and processing techniques. The proposed methodology may be used by PET/CT centers to develop protocols to standardize PET/CT imaging procedures and achieve better patient management and cost-effective operations.« less

  5. Innovative Approach for Interstitial Cystitis: Vaginal Pessaries Loaded Diazepam—A Preliminary Study

    PubMed Central

    Capra, P.; Perugini, P.; Bleve, M.; Pavanetto, P.; Musitelli, G.; Rovereto, B.; Porru, D.

    2013-01-01

    Bladder pain is a characteristic disorder of interstitial cystitis. Diazepam is well known for its antispasmodic activity in the treatment of muscular hypertonus. The aim of this work was to develop and characterize vaginal pessaries as an intravaginal delivery system of diazepam for the treatment of interstitial cystitis. In particular, the performance of two types of formulations, with and without beta-glucan, was compared. In particular, the preparation of pessaries, according to the modified Pharmacopeia protocol, the setup of the analytical method to determine diazepam, pH evaluation, dissolution profile, and photostability assay were reported. Results showed that the modified protocol permitted obtaining optimal vaginal pessaries, without air bubbles, with good consistency and handling and with good pH profiles. In order to determine the diazepam amount, calibration curves with good correlation coefficients were obtained, by the spectrophotometric method, using placebo pessaries as matrix with the addition of diazepam standard solution. This method was demonstrated sensible and accurate to determine the amount of drug in batches. Dissolution profiles showed a complete diazepam release just after 15 minutes, even if beta-glucan pessaries released drug more gradually. Finally, a possible drug photodegradation after exacerbated UV-visible exposition was evaluated. PMID:26555976

  6. Apheresis product identification in the transplant center: development of point-of-care protocols for extended blood typing of stem cell apheresis products.

    PubMed

    Cummerow, C; Schwind, P; Spicher, M; Spohn, G; Geisen, C; Seifried, E; Bönig, H

    2012-06-01

    Transfusion of the 'wrong' stem cell product would almost inevitably be lethal, yet assays to confirm the contents of the product bag, except by checking labels and paperwork, are lacking. To increase the likelihood that a product mix-up would be detected in the transplant center, we developed a simple protocol for extended blood typing and hence, for confirmation of donor/product identity, on a tube segment. Apheresis samples were applied, directly or after erythrocyte enrichment, to commercially available blood typing assays, including lateral flow cards and gel agglutination cards. Without sample modification, low hematocrit and high leukocyte count obviated definitive blood typing. Using the most simple erythrocyte enrichment protocol, that is, centrifugation, reliable blood group analysis became possible with either assay. Other, more cumbersome pre-analytical protocols were also successful but provided no advantage. The preferred method was validated on 100 samples; ABD was correctly identified in 100% of cases. Of the other Rh Ags, all except two 'small e', in both cases in heterozygous individuals, were detected; there were no false positives. A simple, inexpensive point-of-care assay for extended blood typing of apheresis products is available, which can reduce the fatal risk of administering the wrong stem cell product.

  7. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistencymore » and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015 totaling 27880, 28502, 30631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%,2.28%], [0.76%,1.8%], [0.94%,1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.« less

  8. Electroanalytical sensing of chromium(III) and (VI) utilising gold screen printed macro electrodes.

    PubMed

    Metters, Jonathan P; Kadara, Rashid O; Banks, Craig E

    2012-02-21

    We report the fabrication of gold screen printed macro electrodes which are electrochemically characterised and contrasted to polycrystalline gold macroelectrodes with their potential analytical application towards the sensing of chromium(III) and (VI) critically explored. It is found that while these gold screen printed macro electrodes have electrode kinetics typically one order of magnitude lower than polycrystalline gold macroelectrodes as is measured via a standard redox probe, in terms of analytical sensing, these gold screen printed macro electrodes mimic polycrystalline gold in terms of their analytical performance towards the sensing of chromium(III) and (VI), whilst boasting additional advantages over the macro electrode due to their disposable one-shot nature and the ease of mass production. An additional advantage of these gold screen printed macro electrodes compared to polycrystalline gold is the alleviation of the requirement to potential cycle the latter to form the required gold oxide which aids in the simplification of the analytical protocol. We demonstrate that gold screen printed macro electrodes allow the low micro-molar sensing of chromium(VI) in aqueous solutions over the range 10 to 1600 μM with a limit of detection (3σ) of 4.4 μM. The feasibility of the analytical protocol is also tested through chromium(VI) detection in environmental samples.

  9. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587

  10. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    PubMed Central

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. 
The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614

  11. RNA-DNA hybrid (R-loop) immunoprecipitation mapping: an analytical workflow to evaluate inherent biases

    PubMed Central

    Halász, László; Karányi, Zsolt; Boros-Oláh, Beáta; Kuik-Rózsa, Tímea; Sipos, Éva; Nagy, Éva; Mosolygó-L, Ágnes; Mázló, Anett; Rajnavölgyi, Éva; Halmos, Gábor; Székvölgyi, Lóránt

    2017-01-01

    The impact of R-loops on the physiology and pathology of chromosomes has been demonstrated extensively by chromatin biology research. The progress in this field has been driven by technological advancement of R-loop mapping methods that largely relied on a single approach, DNA-RNA immunoprecipitation (DRIP). Most of the DRIP protocols use the experimental design that was developed by a few laboratories, without paying attention to the potential caveats that might affect the outcome of RNA-DNA hybrid mapping. To assess the accuracy and utility of this technology, we pursued an analytical approach to estimate inherent biases and errors in the DRIP protocol. By performing DRIP-sequencing, qPCR, and receiver operator characteristic (ROC) analysis, we tested the effect of formaldehyde fixation, cell lysis temperature, mode of genome fragmentation, and removal of free RNA on the efficacy of RNA-DNA hybrid detection and implemented workflows that were able to distinguish complex and weak DRIP signals in a noisy background with high confidence. We also show that some of the workflows perform poorly and generate random answers. Furthermore, we found that the most commonly used genome fragmentation method (restriction enzyme digestion) led to the overrepresentation of lengthy DRIP fragments over coding ORFs, and this bias was enhanced at the first exons. Biased genome sampling severely compromised mapping resolution and prevented the assignment of precise biological function to a significant fraction of R-loops. The revised workflow presented herein is established and optimized using objective ROC analyses and provides reproducible and highly specific RNA-DNA hybrid detection. PMID:28341774

  12. Two methods for proteomic analysis of formalin-fixed, paraffin embedded tissue result in differential protein identification, data quality, and cost.

    PubMed

    Luebker, Stephen A; Wojtkiewicz, Melinda; Koepsell, Scott A

    2015-11-01

    Formalin-fixed paraffin-embedded (FFPE) tissue is a rich source of clinically relevant material that can yield important translational biomarker discovery using proteomic analysis. Protocols for analyzing FFPE tissue by LC-MS/MS exist, but standardization of procedures and critical analysis of data quality is limited. This study compared and characterized data obtained from FFPE tissue using two methods: a urea in-solution digestion method (UISD) versus a commercially available Qproteome FFPE Tissue Kit method (Qkit). Each method was performed independently three times on serial sections of homogenous FFPE tissue to minimize pre-analytical variations and analyzed with three technical replicates by LC-MS/MS. Data were evaluated for reproducibility and physiochemical distribution, which highlighted differences in the ability of each method to identify proteins of different molecular weights and isoelectric points. Each method replicate resulted in a significant number of new protein identifications, and both methods identified significantly more proteins using three technical replicates as compared to only two. UISD was cheaper, required less time, and introduced significant protein modifications as compared to the Qkit method, which provided more precise and higher protein yields. These data highlight significant variability among method replicates and type of method used, despite minimizing pre-analytical variability. Utilization of only one method or too few replicates (both method and technical) may limit the subset of proteomic information obtained. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Evaluating community health centers’ adoption of a new global capitation payment (eCHANGE) study protocol

    PubMed Central

    Angier, H; O’Malley, JP; Marino, M; McConnell, KJ; Cottrell, E; Jacob, RL; Likumahuwa-Ackman, S; Heintzman, J; Huguet, N; Bailey, SR; DeVoe, JE

    2017-01-01

    Primary care patient-centered medical homes (PCMHs) are an effective healthcare delivery model. Evidence regarding the most effective payment models for increased coordination efforts is sparse. This protocol paper describes the evaluation of an Alternative Payment Methodology (APM) implemented in a subset of Oregon community health centers (CHCs), using a prospective matched observational design. The APM is a primary care payment reform intervention that changed Oregon’s Medicaid payment for several CHCs from fee-for-service reimbursement to a per-member-per-month capitated payment. We will implement a difference-in-difference analytic approach to evaluate pre-post APM changes between intervention and control groups, including: 1) clinic-level outcomes, 2) patient-level clinical outcomes, and 3) patient-level econometric outcomes. Findings from the project will be of national significance, as there is a need for evidence regarding how novel payment methods might enhance PCMH capabilities and support their capacity to produce better quality and outcomes. If this capitated payment method is proven effective, study findings will inform dissemination of similar APMs nationwide. PMID:27836506

  14. Preparation of Formalin-fixed Paraffin-embedded Tissue Cores for both RNA and DNA Extraction.

    PubMed

    Patel, Palak G; Selvarajah, Shamini; Boursalie, Suzanne; How, Nathan E; Ejdelman, Joshua; Guerard, Karl-Philippe; Bartlett, John M; Lapointe, Jacques; Park, Paul C; Okello, John B A; Berman, David M

    2016-08-21

    Formalin-fixed paraffin embedded tissue (FFPET) represents a valuable, well-annotated substrate for molecular investigations. The utility of FFPET in molecular analysis is complicated both by heterogeneous tissue composition and low yields when extracting nucleic acids. A literature search revealed a paucity of protocols addressing these issues, and none that showed a validated method for simultaneous extraction of RNA and DNA from regions of interest in FFPET. This method addresses both issues. Tissue specificity was achieved by mapping cancer areas of interest on microscope slides and transferring annotations onto FFPET blocks. Tissue cores were harvested from areas of interest using 0.6 mm microarray punches. Nucleic acid extraction was performed using a commercial FFPET extraction system, with modifications to homogenization, deparaffinization, and Proteinase K digestion steps to improve tissue digestion and increase nucleic acid yields. The modified protocol yields sufficient quantity and quality of nucleic acids for use in a number of downstream analyses, including a multi-analyte gene expression platform, as well as reverse transcriptase coupled real time PCR analysis of mRNA expression, and methylation-specific PCR (MSP) analysis of DNA methylation.

  15. Evaluating community health centers' adoption of a new global capitation payment (eCHANGE) study protocol.

    PubMed

    Angier, H; O'Malley, J P; Marino, M; McConnell, K J; Cottrell, E; Jacob, R L; Likumahuwa-Ackman, S; Heintzman, J; Huguet, N; Bailey, S R; DeVoe, J E

    2017-01-01

    Primary care patient-centered medical homes (PCMHs) are an effective healthcare delivery model. Evidence regarding the most effective payment models for increased coordination efforts is sparse. This protocol paper describes the evaluation of an Alternative Payment Methodology (APM) implemented in a subset of Oregon community health centers (CHCs), using a prospective matched observational design. The APM is a primary care payment reform intervention that changed Oregon's Medicaid payment for several CHCs from fee-for-service reimbursement to a per-member-per-month capitated payment. We will implement a difference-in-difference analytic approach to evaluate pre-post APM changes between intervention and control groups, including: 1) clinic-level outcomes, 2) patient-level clinical outcomes, and 3) patient-level econometric outcomes. Findings from the project will be of national significance, as there is a need for evidence regarding how novel payment methods might enhance PCMH capabilities and support their capacity to produce better quality and outcomes. If this capitated payment method is proven effective, study findings will inform dissemination of similar APMs nationwide. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Out-of-equilibrium protocol for Rényi entropies via the Jarzynski equality.

    PubMed

    Alba, Vincenzo

    2017-06-01

    In recent years entanglement measures, such as the von Neumann and the Rényi entropies, provided a unique opportunity to access elusive features of quantum many-body systems. However, extracting entanglement properties analytically, experimentally, or in numerical simulations can be a formidable task. Here, by combining the replica trick and the Jarzynski equality we devise an alternative effective out-of-equilibrium protocol for measuring the equilibrium Rényi entropies. The key idea is to perform a quench in the geometry of the replicas. The Rényi entropies are obtained as the exponential average of the work performed during the quench. We illustrate an application of the method in classical Monte Carlo simulations, although it could be useful in different contexts, such as in quantum Monte Carlo, or experimentally in cold-atom systems. The method is most effective in the quasistatic regime, i.e., for a slow quench. As a benchmark, we compute the Rényi entropies in the Ising universality class in 1+1 dimensions. We find perfect agreement with the well-known conformal field theory predictions.

  17. Use of a deuterated internal standard with pyrolysis-GC/MS dimeric marker analysis to quantify tire tread particles in the environment.

    PubMed

    Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M

    2012-11-08

    Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.

  18. Lipids and Fatty Acids in Algae: Extraction, Fractionation into Lipid Classes, and Analysis by Gas Chromatography Coupled with Flame Ionization Detector (GC-FID).

    PubMed

    Guihéneuf, Freddy; Schmid, Matthias; Stengel, Dagmar B

    2015-01-01

    Despite the number of biochemical studies exploring algal lipids and fatty acid biosynthesis pathways and profiles, analytical methods used by phycologists for this purpose are often diverse and incompletely described. Potential confusion and potential variability of the results between studies can therefore occur due to change of protocols for lipid extraction and fractionation, as well as fatty acid methyl esters (FAME) preparation before gas chromatography (GC) analyses. Here, we describe a step-by-step procedure for the profiling of neutral and polar lipids using techniques such as solid-liquid extraction (SLE), thin-layer chromatography (TLC), and gas chromatography coupled with flame ionization detector (GC-FID). As an example, in this protocol chapter, analyses of neutral and polar lipids from the marine microalga Pavlova lutheri (an EPA/DHA-rich haptophyte) will be outlined to describe the distribution of fatty acid residues within its major lipid classes. This method has been proven to be a reliable technique to assess changes in lipid and fatty acid profiles in several other microalgal species and seaweeds.

  19. Laser Ablation in situ (U-Th-Sm)/He and U-Pb Double-Dating of Apatite and Zircon: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.

    2015-12-01

    We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm2fluence; He pit volume measurement using atomic force microscopy, and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data and Iolite software was used for U-Pb age and trace element determination. In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia and increased zircon analytical throughput by a factor of 50 over conventional methods.Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J Analytical Atomic Spectrometry, 30, 1636 - 1645.

  20. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable to tackle the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval, summarize the challenges/opportunities of medical image analytics on a large-scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Second-order data obtained by beta-cyclodextrin complexes: a novel approach for multicomponent analysis with three-way multivariate calibration methods.

    PubMed

    Khani, Rouhollah; Ghasemi, Jahan B; Shemirani, Farzaneh

    2014-10-01

    This research reports the first application of β-cyclodextrin (β-CD) complexes as a new method for generation of three way data, combined with second-order calibration methods for quantification of a binary mixture of caffeic (CA) and vanillic (VA) acids, as model compounds in fruit juices samples. At first, the basic experimental parameters affecting the formation of inclusion complexes between target analytes and β-CD were investigated and optimized. Then under the optimum conditions, parallel factor analysis (PARAFAC) and bilinear least squares/residual bilinearization (BLLS/RBL) were applied for deconvolution of trilinear data to get spectral and concentration profiles of CA and VA as a function of β-CD concentrations. Due to severe concentration profile overlapping between CA and VA in β-CD concentration dimension, PARAFAC could not be successfully applied to the studied samples. So, BLLS/RBL performed better than PARAFAC. The resolution of the model compounds was possible due to differences in the spectral absorbance changes of the β-CD complexes signals of the investigated analytes, opening a new approach for second-order data generation. The proposed method was validated by comparison with a reference method based on high-performance liquid chromatography photodiode array detection (HPLC-PDA), and no significant differences were found between the reference values and the ones obtained with the proposed method. Such a chemometrics-based protocol may be a very promising tool for more analytical applications in real samples monitoring, due to its advantages of simplicity, rapidity, accuracy, sufficient spectral resolution and concentration prediction even in the presence of unknown interferents. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Development of a validated liquid chromatography/tandem mass spectrometry method for the distinction of thyronine and thyronamine constitutional isomers and for the identification of new deiodinase substrates.

    PubMed

    Piehl, Susanne; Heberer, Thomas; Balizs, Gabor; Scanlan, Thomas S; Köhrle, Josef

    2008-10-01

    Thyronines (THs) and thyronamines (TAMs) are two groups of endogenous iodine-containing signaling molecules whose representatives differ from each other only regarding the number and/or the position of the iodine atoms. Both groups of compounds are substrates of three deiodinase isozymes, which catalyze the sequential reductive removal of iodine from the respective precursor molecule. In this study, a novel analytical method applying liquid chromatography/tandem mass spectrometry (LC-MS/MS) was developed. This method permitted the unequivocal, simultaneous identification and quantification of all THs and TAMs in the same biological sample. Furthermore, a liquid-liquid extraction procedure permitting the concurrent isolation of all THs and TAMs from biological matrices, namely deiodinase (Dio) reaction mixtures, was established. Method validation experiments with extracted TH and TAM analytes demonstrated that the method was selective, devoid of matrix effects, sensitive, linear over a wide range of analyte concentrations and robust in terms of reproducible recoveries, process efficiencies as well as intra-assay and inter-assay stability parameters. The method was applied to study the deiodination reactions of iodinated THs catalyzed by the three deiodinase isozymes. With the HPLC protocol developed herein, sufficient chromatographic separation of all constitutional TH and TAM isomers was achieved. Accordingly, the position of each iodine atom removed from a TH substrate in a Dio-catalyzed reaction was backtracked unequivocally. While several established deiodination reactions were verified, two as yet unknown reactions, namely the phenolic ring deiodination of 3',5'-diiodothyronine (3',5'-T2) by Dio2 and the tyrosyl ring deiodination of 3-monoiodothyronine (3-T1) by Dio3, were newly identified.

  3. Improving Data Transfer Throughput with Direct Search Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar

    2016-01-01

    Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists in using parallel streams at the application layer.We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and light-weight mathematical optimization algorithms, to improve the performance of data transfers by dynamicallymore » adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on productionWAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source end points, a common scenario at high performance computing facilities.« less

  4. Cell-based quantification of biomarkers from an ultra-fast microfluidic immunofluorescent staining: application to human breast cancer cell lines

    NASA Astrophysics Data System (ADS)

    Migliozzi, D.; Nguyen, H. T.; Gijs, M. A. M.

    2018-02-01

    Immunohistochemistry (IHC) is one of the main techniques currently used in the clinics for biomarker characterization. It consists in colorimetric labeling with specific antibodies followed by microscopy analysis. The results are then used for diagnosis and therapeutic targeting. Well-known drawbacks of such protocols are their limited accuracy and precision, which prevent the clinicians from having quantitative and robust IHC results. With our work, we combined rapid microfluidic immunofluorescent staining with efficient image-based cell segmentation and signal quantification to increase the robustness of both experimental and analytical protocols. The experimental protocol is very simple and based on fast-fluidic-exchange in a microfluidic chamber created on top of the formalin-fixed-paraffin-embedded (FFPE) slide by clamping it a silicon chip with a polydimethyl siloxane (PDMS) sealing ring. The image-processing protocol is based on enhancement and subsequent thresholding of the local contrast of the obtained fluorescence image. As a case study, given that the human epidermal growth factor receptor 2 (HER2) protein is often used as a biomarker for breast cancer, we applied our method to HER2+ and HER2- cell lines. We report very fast (5 minutes) immunofluorescence staining of both HER2 and cytokeratin (a marker used to define the tumor region) on FFPE slides. The image-processing program can segment cells correctly and give a cell-based quantitative immunofluorescent signal. With this method, we found a reproducible well-defined separation for the HER2-to-cytokeratin ratio for positive and negative control samples.

  5. Micelle assisted thin-film solid phase microextraction: a new approach for determination of quaternary ammonium compounds in environmental samples.

    PubMed

    Boyacı, Ezel; Pawliszyn, Janusz

    2014-09-16

    Determination of quaternary ammonium compounds (QACs) often is considered to be a challenging undertaking owing to secondary interactions of the analytes' permanently charged quaternary ammonium head or hydrophobic tail with the utilized labware. Here, for the first time, a micelle assisted thin-film solid phase microextraction (TF-SPME) using a zwitterionic detergent 3-[(3-cholamidopropyl)dimethylammonio]-1-propanesulfonate (CHAPS) as a matrix modifier is introduced as a novel approach for in-laboratory sample preparation of the challenging compounds. The proposed micelle assisted TF-SPME method offers suppression/enhancement free electrospray ionization of analytes in mass spectrometric detection, minimal interaction of the micelles with the TF-SPME coating, and chromatographic stationary phase and analysis free of secondary interactions. Moreover, it was found that the matrix modifier has multiple functions; when its concentration is found below the critical micelle concentration (CMC), the matrix modifier primarily acts as a surface deactivator; above its CMC, it acts as a stabilizer for QACs. Additionally, shorter equilibrium extraction times in the presence of the modifier demonstrated that micelles also assist in the transfer of analytes from the bulk of the sample to the surface of the coating. The developed micelle assisted TF-SPME protocol using the 96-blade system requires only 30 min of extraction and 15 min of desorption. Together with a conditioning step (15 min), the entire method is 60 min; considering the advantage of using the 96-blade system, if all the blades in the brush are used, the sample preparation time per sample is 0.63 min. Moreover, the recoveries for all analytes with the developed method were found to range within 80.2-97.3%; as such, this method can be considered an open bed solid phase extraction. The proposed method was successfully validated using real samples.

  6. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    PubMed

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German speaking Gesellschaft fur Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines into practice as these international guidelines remain nonbinding protocols, that depend on the applied analytical technique, and that need to be updated according the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  7. A Model for Developing Clinical Analytics Capacity: Closing the Loops on Outcomes to Optimize Quality.

    PubMed

    Eggert, Corinne; Moselle, Kenneth; Protti, Denis; Sanders, Dale

    2017-01-01

    Closed Loop Analytics© is receiving growing interest in healthcare as a term referring to information technology, local data and clinical analytics working together to generate evidence for improvement. The Closed Loop Analytics model consists of three loops corresponding to the decision-making levels of an organization and the associated data within each loop - Patients, Protocols, and Populations. The authors propose that each of these levels should utilize the same ecosystem of electronic health record (EHR) and enterprise data warehouse (EDW) enabled data, in a closed-loop fashion, with that data being repackaged and delivered to suit the analytic and decision support needs of each level, in support of better outcomes.

  8. Efficient Online Optimized Quantum Control for Adiabatic Quantum Computation

    NASA Astrophysics Data System (ADS)

    Quiroz, Gregory

    Adiabatic quantum computation (AQC) relies on controlled adiabatic evolution to implement a quantum algorithm. While control evolution can take many forms, properly designed time-optimal control has been shown to be particularly advantageous for AQC. Grover's search algorithm is one such example where analytically-derived time-optimal control leads to improved scaling of the minimum energy gap between the ground state and first excited state and thus, the well-known quadratic quantum speedup. Analytical extensions beyond Grover's search algorithm present a daunting task that requires potentially intractable calculations of energy gaps and a significant degree of model certainty. Here, an in situ quantum control protocol is developed for AQC. The approach is shown to yield controls that approach the analytically-derived time-optimal controls for Grover's search algorithm. In addition, the protocol's convergence rate as a function of iteration number is shown to be essentially independent of system size. Thus, the approach is potentially scalable to many-qubit systems.

  9. Identification and quantitative determination of diphenylarsenic compounds in abandoned toxic smoke canisters.

    PubMed

    Hanaoka, Shigeyuki; Nomura, Koji; Kudo, Shinichi

    2005-09-02

    Knowledge of the exact nature of the constituents of abandoned chemical weapons (ACW) is a prerequisite for their orderly destruction. Here we report the development of analytical procedures to identify diphenylchloroarsine (DA/Clark I), diphenylcyanoarsine (DC/Clark II) and related substances employed in one of the munitions known as "Red canister". Both DA and DC are relatively unstable under conventional analytical procedures without thiol derivatization. Unfortunately however, thiol drivatization affords the same volatile organo-arsenic derivative from several different diphenylarsenic compounds, making it impossible to identify and quantify the original compounds. Further, diminishing the analytical interference caused by the celluloid powder used as a stacking material in the weapons, is also essential for accurate analysis. In this study, extraction and instrumental conditions have been evaluated and an optimal protocol was determined. The analysis of Red canister samples following this protocol showed that most of the DA and DC associated with pumice had degraded to bis(diphenylarsine)oxide (BDPAO), while those associated with celluloid were dominantly degraded to diphenylarsinic acid (DPAA).

  10. High-throughput method for the quantitation of metabolites and co-factors from homocysteine-methionine cycle for nutritional status assessment.

    PubMed

    Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping

    2016-09-01

    There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.

  11. The applicability of real-time PCR in the diagnostic of cutaneous leishmaniasis and parasite quantification for clinical management: Current status and perspectives.

    PubMed

    Moreira, Otacilio C; Yadon, Zaida E; Cupolillo, Elisa

    2017-09-29

    Cutaneous leishmaniasis (CL) is spread worldwide and is the most common manifestation of leishmaniasis. Diagnosis is performed by combining clinical and epidemiological features, and through the detection of Leishmania parasites (or DNA) in tissue specimens or trough parasite isolation in culture medium. Diagnosis of CL is challenging, reflecting the pleomorphic clinical manifestations of this disease. Skin lesions vary in severity, clinical appearance, and duration, and in some cases, they can be indistinguishable from lesions related to other diseases. Over the past few decades, PCR-based methods, including real-time PCR assays, have been developed for Leishmania detection, quantification and species identification, improving the molecular diagnosis of CL. This review provides an overview of many real-time PCR methods reported for the diagnostic evaluation of CL and some recommendations for the application of these methods for quantification purposes for clinical management and epidemiological studies. Furthermore, the use of real-time PCR for Leishmania species identification is also presented. The advantages of real-time PCR protocols are numerous, including increased sensitivity and specificity and simpler standardization of diagnostic procedures. However, despite the numerous assays described, there is still no consensus regarding the methods employed. Furthermore, the analytical and clinical validation of CL molecular diagnosis has not followed international guidelines so far. A consensus methodology comprising a DNA extraction protocol with an exogenous quality control and an internal reference to normalize parasite load is still needed. In addition, the analytical and clinical performance of any consensus methodology must be accurately assessed. This review shows that a standardization initiative is essential to guide researchers and clinical laboratories towards the achievement of a robust and reproducible methodology, which will permit further evaluation of parasite load as a surrogate marker of prognosis and monitoring of aetiological treatment, particularly in multi-centric observational studies and clinical trials. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. SU-E-T-477: An Efficient Dose Correction Algorithm Accounting for Tissue Heterogeneities in LDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mashouf, S; Lai, P; Karotki, A

    2014-06-01

    Purpose: Seed brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose surrounding the brachytherapy seeds is based on American Association of Physicist in Medicine Task Group No. 43 (TG-43 formalism) which generates the dose in homogeneous water medium. Recently, AAPM Task Group No. 186 emphasized the importance of accounting for tissue heterogeneities. This can be done using Monte Carlo (MC) methods, but it requires knowing the source structure and tissue atomic composition accurately. In this work we describe an efficient analytical dose inhomogeneity correction algorithm implemented usingmore » MIM Symphony treatment planning platform to calculate dose distributions in heterogeneous media. Methods: An Inhomogeneity Correction Factor (ICF) is introduced as the ratio of absorbed dose in tissue to that in water medium. ICF is a function of tissue properties and independent of source structure. The ICF is extracted using CT images and the absorbed dose in tissue can then be calculated by multiplying the dose as calculated by the TG-43 formalism times ICF. To evaluate the methodology, we compared our results with Monte Carlo simulations as well as experiments in phantoms with known density and atomic compositions. Results: The dose distributions obtained through applying ICF to TG-43 protocol agreed very well with those of Monte Carlo simulations as well as experiments in all phantoms. In all cases, the mean relative error was reduced by at least 50% when ICF correction factor was applied to the TG-43 protocol. Conclusion: We have developed a new analytical dose calculation method which enables personalized dose calculations in heterogeneous media. The advantages over stochastic methods are computational efficiency and the ease of integration into clinical setting as detailed source structure and tissue segmentation are not needed. University of Toronto, Natural Sciences and Engineering Research Council of Canada.« less

  13. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.

  14. DNA barcode-based delineation of putative species: efficient start for taxonomic workflows

    PubMed Central

    Kekkonen, Mari; Hebert, Paul D N

    2014-01-01

    The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435

  15. Capillary ion chromatography with on-column focusing for ultra-trace analysis of methanesulfonate and inorganic anions in limited volume Antarctic ice core samples.

    PubMed

    Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett

    2015-08-28

    Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Studies have been focused until now on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of the all of the major ionic species in ice core samples is typically carried out by ion chromatography. Former methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography based analytical method has been developed for quantitative analysis of limited volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300μL per analysis, allowing for triplicate sample analysis with <1mL of sample. This new approach provides a reliable and robust analytical method for the simultaneous determination of organic and inorganic anions, including fluoride, methanesulfonate, chloride, sulfate and nitrate anions. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. How do gut feelings feature in tutorial dialogues on diagnostic reasoning in GP traineeship?

    PubMed

    Stolper, C F; Van de Wiel, M W J; Hendriks, R H M; Van Royen, P; Van Bokhoven, M A; Van der Weijden, T; Dinant, G J

    2015-05-01

    Diagnostic reasoning is considered to be based on the interaction between analytical and non-analytical cognitive processes. Gut feelings, a specific form of non-analytical reasoning, play a substantial role in diagnostic reasoning by general practitioners (GPs) and may activate analytical reasoning. In GP traineeships in the Netherlands, trainees mostly see patients alone but regularly consult with their supervisors to discuss patients and problems, receive feedback, and improve their competencies. In the present study, we examined the discussions of supervisors and their trainees about diagnostic reasoning in these so-called tutorial dialogues and how gut feelings feature in these discussions. 17 tutorial dialogues focussing on diagnostic reasoning were video-recorded and transcribed and the protocols were analysed using a detailed bottom-up and iterative content analysis and coding procedure. The dialogues were segmented into quotes. Each quote received a content code and a participant code. The number of words per code was used as a unit of analysis to quantitatively compare the contributions to the dialogues made by supervisors and trainees, and the attention given to different topics. The dialogues were usually analytical reflections on a trainee's diagnostic reasoning. A hypothetico-deductive strategy was often used, by listing differential diagnoses and discussing what information guided the reasoning process and might confirm or exclude provisional hypotheses. Gut feelings were discussed in seven dialogues. They were used as a tool in diagnostic reasoning, inducing analytical reflection, sometimes on the entire diagnostic reasoning process. The emphasis in these tutorial dialogues was on analytical components of diagnostic reasoning. Discussing gut feelings in tutorial dialogues seems to be a good educational method to familiarize trainees with non-analytical reasoning. Supervisors need specialised knowledge about these aspects of diagnostic reasoning and how to deal with them in medical education.

  17. Incorporating Aptamers in the Multiple Analyte Profiling Assays (xMAP): Detection of C-Reactive Protein.

    PubMed

    Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio

    2017-01-01

    Aptamers are short oligonucleotide sequences used in detection systems because of their high affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages; they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with apatmers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of coupling, and for an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.

  18. Obstetrical complications associated with abnormal maternal serum markers analytes.

    PubMed

    Gagnon, Alain; Wilson, R Douglas

    2008-10-01

    To review the obstetrical outcomes associated with abnormally elevated or decreased level of one or more of the most frequently measured maternal serum marker analytes used in screening for aneuploidy. To provide guidance to facilitate the management of pregnancies that have abnormal levels of one of more markers and to assess the usefulness of these markers as a screening test. Perinatal outcomes associated with abnormal levels of maternal serum markers analytes are compared with the outcomes of pregnancies with normal levels of the same analytes or the general population. The Cochrane Library and Medline were searched for English-language articles published from 1966 to February 2007, relating to maternal serum markers and perinatal outcomes. Search terms included PAPP-A (pregnancy associated plasma protein A), AFP (alphafetoprotein), hCG (human chorionic gonadotropin), estriol, unconjugated estriol, inhibin, inhibin-A, maternal serum screen, triple marker screen, quadruple screen, integrated prenatal screen, first trimester screen, and combined prenatal screen. All study types were reviewed. Randomized controlled trials were considered evidence of the highest quality, followed by cohort studies. Key individual studies on which the recommendations are based are referenced. Supporting data for each recommendation are summarized with evaluative comments and references. The evidence was evaluated using the guidelines developed by the Canadian Task Force on Preventive Health Care. The evidence collected was reviewed by the Genetics Committee of the Society of Obstetricians and Gynaecologists of Canada. The benefit expected from this guideline is to facilitate early detection of potential adverse pregnancy outcomes when risks are identified at the time of a maternal serum screen. It will help further stratification of risk and provide options for pregnancy management to minimize the impact of pregnancy complications. The potential harms resulting from such practice are associated with the so called false positive (i.e., uncomplicated pregnancies labelled at increased risk for adverse perinatal outcomes), the potential stress associated with such a label, and the investigations performed for surveillance in this situation. No cost-benefit analysis is available to assess costs and savings associated with this guideline. SUMMARY STATEMENTS: 1. An unexplained level of a maternal serum marker analyte is defined as an abnormal level after confirmation of gestational age by ultrasound and exclusion of maternal, fetal, or placental causes for the abnormal level. (III) 2. Abnormally elevated levels of serum markers are associated with adverse pregnancy outcomes in twin pregnancies, after correction for the number of fetuses. Spontaneous or planned mutifetal reductions may result in abnormal elevations of serum markers. (II-2) RECOMMENDATIONS: 1. In the first trimester, an unexplained low PAPP-A (< 0.4 MoM) and/or a low hCG (< 0.5 MoM) are associated with an increased frequency of adverse obstetrical outcomes, and, at present, no specific protocol for treatment is available. (II-2A) In the second trimester, an unexplained elevation of maternal serum AFP (> 2.5 MoM), hCG (> 3.0 MoM), and/or inhibin-A (> or =2.0 MoM) or a decreased level of maternal serum AFP (< 0.25 MoM) and/or unconjugated estriol (< 0.5 MoM) are associated with an increased frequency of adverse obstetrical outcomes, and, at present, no specific protocol for treatment is available. (II-2A) 2. 
Pregnant woman with an unexplained elevated PAPP-A or hCG in the first trimester and an unexplained low hCG or inhibin-A and an unexplained elevated unconjugated estriol in the second trimester should receive normal antenatal care, as this pattern of analytes is not associated with adverse perinatal outcomes. (II-2A) 3. The combination of second or third trimester placenta previa and an unexplained elevated maternal serum AFP should increase the index of suspicion for placenta accreta, increta, or percreta. (II-2B) An assessment (ultrasound, MRI) of the placental-uterine interface should be performed. Abnormal invasion should be strongly suspected, and the planning of delivery location and technique should be done accordingly. (III-C) 4. A prenatal consultation with the medical genetics department is recommended for low unconjugated estriol levels (<0.3 MoM), as this analyte pattern can be associated with genetic conditions. (II-2B) 5. The clinical management protocol for identification of potential adverse obstetrical outcomes should be guided by one or more abnormal maternal serum marker analyte value rather than the false positive screening results for the trisomy 21 and/or the trisomy 18 screen. (II-2B) 6. Pregnant woman who are undergoing renal dialysis or who have had a renal transplant should be offered maternal serum screening, but interpretation of the result is difficult as the level of serum hCG is not reliable. (II-2A) 7. Abnormal maternal uterine artery Doppler in association with elevated maternal serum AFP, hCG, or inhibin-A or decreased PAPP-A identifies a group of women at greater risk of IUGR and gestational hypertension with proteinuria. Uterine artery Doppler measurements may be used in the evaluation of an unexplained abnormal level of either of these markers. (II-2B) 8. Further research is recommended to identify the best protocol for pregnancy management and surveillance in women identified at increased risk of adverse pregnancy outcomes based on an abnormality of a maternal serum screening analyte. (III-A) 9. In the absence of evidence supporting any specific surveillance protocol, an obstetrician should be consulted in order to establish a fetal surveillance plan specific to the increased obstetrical risks (maternal and fetal) identified. This plan may include enhanced patient education on signs and symptoms of the most common complications, increased frequency of antenatal visits, increased ultrasound (fetal growth, amniotic fluid levels), and fetal surveillance (biophysical profile, arterial and venous Doppler), and cervical length assessment. (III-A) 10. Limited information suggests that, in women with elevated hCG in the second trimester and/or abnormal uterine artery Doppler (at 22-24 weeks), low-dose aspirin (60-81 mg daily) is associated with higher birthweight and lower incidence of gestational hypertension with proteinuria. This therapy may be used in women who are at risk. (II-2B) 11. Further studies are recommended in order to assess the benefits of low-dose aspirin, low molecular weight heparin, or other therapeutic options in pregnancies determined to be at increased risk on the basis of an abnormal maternal serum screening analyte. (III-A) 12. 
Multiple maternal serum markers screening should not be used at present as a population-based screening method for adverse pregnancy outcomes (such as preeclampsia, placental abruption, and stillbirth) outside an established research protocol, as sensitivity is low, false positive rates are high, and no management protocol has been shown to clearly improve outcomes. (II-2D) When maternal serum screening is performed for the usual clinical indication (fetal aneuploidy and/or neural tube defect), abnormal analyte results can be utilized for the identification of pregnancies at risk and to direct their clinical management. (II-2B) Further studies are recommended to determine the optimal screening method for poor maternal and/or perinatal outcomes. (III-A).

  19. A core-shell column approach to a comprehensive high-performance liquid chromatography phenolic analysis of Vitis vinifera L. and interspecific hybrid grape juices, wines, and other matrices following either solid phase extraction or direct injection.

    PubMed

    Manns, David C; Mansfield, Anna Katharine

    2012-08-17

    Four high-throughput reverse-phase chromatographic protocols utilizing two different core-shell column chemistries have been developed to analyze the phenolic profiles of complex matrices, specifically targeting juices and wines produced from interspecific hybrid grape cultivars. Following pre-fractionation via solid-phase extraction or direct injection, individual protocols were designed to resolve, identify and quantify specific chemical classes of compounds including non-anthocyanin monomeric phenolics, condensed tannins following acid hydrolysis, and anthocyanins. Detection levels ranging from 1.2 ppb to 27.5 ppb, analyte %RSDs ranging from 0.04 to 0.38, and linear ranges of quantitation approaching five orders of magnitude were achieved using conventional HPLC instrumentation. Using C(18) column chemistry, the non-anthocyanin monomeric protocol effectively separated a set of 16 relevant phenolic compounds comprised flavan-3-ols, hydroxycinnamic acids, and flavonols in under 14 min. The same column was used to develop a 15-min protocol for hydrolyzed condensed tannin analysis. Two anthocyanin protocols are presented, one utilizing the same C(18) column, best suited for anthocyanidin and monoglucoside analysis, the other utilizing a pentafluorophenyl chemistry optimized to effectively separate complex mixtures of coexisting mono- and diglucoside anthocyanins. These protocols and column chemistries have been used initially to explore a wide variety of complex phenolic matrices, including red and white juices and wines produced from Vitis vinifera and interspecific hybrid grape cultivars, juices, teas, and plant extracts. Each protocol displayed robust matrix responses as written, yet are flexible enough to be easily modified to suit specifically tailored analytical requirements. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Simultaneous, untargeted metabolic profiling of polar and nonpolar metabolites by LC-Q-TOF mass spectrometry.

    PubMed

    Kirkwood, Jay S; Maier, Claudia; Stevens, Jan F

    2013-05-01

    At its most ambitious, untargeted metabolomics aims to characterize and quantify all of the metabolites in a given system. Metabolites are often present at a broad range of concentrations and possess diverse physical properties complicating this task. Performing multiple sample extractions, concentrating sample extracts, and using several separation and detection methods are common strategies to overcome these challenges but require a great amount of resources. This protocol describes the untargeted, metabolic profiling of polar and nonpolar metabolites with a single extraction and using a single analytical platform. © 2013 by John Wiley & Sons, Inc.

  1. A Mobility-Aware QoS Signaling Protocol for Ambient Networks

    NASA Astrophysics Data System (ADS)

    Jeong, Seong-Ho; Lee, Sung-Hyuck; Bang, Jongho

    Mobility-aware quality of service (QoS) signaling is crucial to provide seamless multimedia services in the ambient environment where mobile nodes may move frequently between different wireless access networks. The mobility of an IP-based node in ambient networks affects routing paths, and as a result, can have a significant impact on the operation and state management of QoS signaling protocols. In this paper, we first analyze the impact of mobility on QoS signaling protocols and how the protocols operate in mobility scenarios. We then propose an efficient mobility-aware QoS signaling protocol which can operate adaptively in ambient networks. The key features of the protocol include the fast discovery of a crossover node where the old and new paths converge or diverge due to handover and the localized state management for seamless services. Our analytical and simulation/experimental results show that the proposed/implemented protocol works better than existing protocols in the IP-based mobile environment.

  2. Dried Blood Spots - Preparing and Processing for Use in Immunoassays and in Molecular Techniques

    PubMed Central

    Grüner, Nico; Stambouli, Oumaima; Ross, R. Stefan

    2015-01-01

    The idea of collecting blood on a paper card and subsequently using the dried blood spots (DBS) for diagnostic purposes originated a century ago. Since then, DBS testing for decades has remained predominantly focused on the diagnosis of infectious diseases especially in resource-limited settings or the systematic screening of newborns for inherited metabolic disorders and only recently have a variety of new and innovative DBS applications begun to emerge. For many years, pre-analytical variables were only inappropriately considered in the field of DBS testing and even today, with the exception of newborn screening, the entire pre-analytical phase, which comprises the preparation and processing of DBS for their final analysis has not been standardized. Given this background, a comprehensive step-by-step protocol, which covers al the essential phases, is proposed, i.e., collection of blood; preparation of blood spots; drying of blood spots; storage and transportation of DBS; elution of DBS, and finally analyses of DBS eluates. The effectiveness of this protocol was first evaluated with 1,762 coupled serum/DBS pairs for detecting markers of hepatitis B virus, hepatitis C virus, and human immunodeficiency virus infections on an automated analytical platform. In a second step, the protocol was utilized during a pilot study, which was conducted on active drug users in the German cities of Berlin and Essen. PMID:25867233

  3. 2D-DIGE in Proteomics.

    PubMed

    Pasquali, Matias; Serchi, Tommaso; Planchon, Sebastien; Renaut, Jenny

    2017-01-01

    The two-dimensional difference gel electrophoresis method is a valuable approach for proteomics. The method, using cyanine fluorescent dyes, allows the co-migration of multiple protein samples in the same gel and their simultaneous detection, thus reducing experimental and analytical time. 2D-DIGE, compared to traditional post-staining 2D-PAGE protocols (e.g., colloidal Coomassie or silver nitrate), provides faster and more reliable gel matching, limiting the impact of gel to gel variation, and allows also a good dynamic range for quantitative comparisons. By the use of internal standards, it is possible to normalize for experimental variations in spot intensities and gel patterns. Here we describe the experimental steps we follow in our routine 2D-DIGE procedure that we then apply to multiple biological questions.

  4. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate capability of pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light scattering and semi-transparent planar stationary phases. Successful visualization of stationary phases defects required signal processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. Such post-processing data handling approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. Particularly, a number of TLC and HPTLC stationary phases including silica, cellulose, aluminum oxide, polyamide and octadecylsilane coated with adsorbent layer ranging from 100 to 250μm were investigated. Presented detection protocol can be used as an efficient tool for fast screening the overall heterogeneity of any layered materials. Moreover, described procedure is very fast (few seconds including acquisition and data processing) and may be applied for fabrication processes online controlling. In spite of planar chromatographic plates this protocol can be used for assessment of different planar separation tools like paper based analytical devices or micro total analysis systems, consisted of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Reporting guidance considerations from a statistical perspective: overview of tools to enhance the rigour of reporting of randomised trials and systematic reviews.

    PubMed

    Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa

    2017-05-01

    Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach to summarise features regarding statistical methods and findings from reporting guidelines for trials and reviews was taken. We aim to enhance familiarity of statistical details that should be reported in biomedical research among statisticians and their collaborators. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews and PRISMA for Protocols (PRISMA-P). Considerations regarding sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute improve quality of biomedical publications. Authors should employ these tools for planning and reporting of their research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. Modification of the BAX System PCR assay for detecting Salmonella in beef, produce, and soy protein isolate. Performance Tested Method 100201.

    PubMed

    Peng, Linda X; Wallace, Morgan; Andaloro, Bridget; Fallon, Dawn; Fleck, Lois; Delduco, Dan; Tice, George

    2011-01-01

    The BAX System PCR assay for Salmonella detection in foods was previously validated as AOAC Research Institute (RI) Performance Tested Method (PTM) 100201. New studies were conducted on beef and produce using the same media and protocol currently approved for the BAX System PCR assay for E. coli O157:H7 multiplex (MP). Additionally, soy protein isolate was tested for matrix extension using the U.S. Food and Drug Administration-Bacteriological Analytical Manual (FDA-BAM) enrichment protocols. The studies compared the BAX System method to the U.S. Department of Agriculture culture method for detecting Salmonella in beef and the FDA-BAM culture method for detecting Salmonella in produce and soy protein isolate. Method comparison studies on low-level inoculates showed that the BAX System assay for Salmonella performed as well as or better than the reference method for detecting Salmonella in beef and produce in 8-24 h enrichment when the BAX System E. coli O157:H7 MP media was used, and soy protein isolate in 20 h enrichment with lactose broth followed by 3 h regrowth in brain heart infusion broth. An inclusivity panel of 104 Salmonella strains with diverse serotypes was tested by the BAX System using the proprietary BAX System media and returned all positive results. Ruggedness factors involved in the enrichment phase were also evaluated by testing outside the specified parameters, and none of the factors examined affected the performance of the assay.

  7. The development of a change model of "exits" during cognitive analytic therapy for the treatment of depression.

    PubMed

    Sandhu, Sundeep Kaur; Kellett, Stephen; Hardy, Gillian

    2017-11-01

    "Exits" in cognitive analytic therapy (CAT) are methods that change unhelpful patterns or roles during the final "revision" phase of the therapy. How exits are conceived and achieved is currently poorly understood. This study focussed on the revision stage to explore and define how change is accomplished in CAT. Qualitative content analysis studied transcripts of sessions 6 and 7 of a protocol delivered 8-session CAT treatment for depression. Eight participants met the study inclusion criteria, and therefore, 16 sessions were analysed. The exit model developed contained 3 distinct (but interacting) phases: (a) developing an observing self via therapist input or client self-reflection, (b) breaking out of old patterns by creating new roles and procedures, and (c) utilisation of a range of methods to support and maintain change. Levels of interrater reliability for the exit categories that formed the model were good. The revision stage of CAT emerged as a complex and dynamic process involving 3 interacting stages. Further research is recommended to understand how exits relate to durability of change and whether change processes differ according to presenting problem. Exit work in cognitive analytic therapy is a dynamic process that requires progression through stages of insight, active change, and consolidation. Development of an "observing self" is an important foundation stone for change, and cognitive analytic therapists need to work within the client's zone of proximal development. A number of aspects appear important in facilitating change, such as attending to the process and feelings generated by change talk. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Metabolomic Analysis of Key Central Carbon Metabolism Carboxylic Acids as Their 3-Nitrophenylhydrazones by UPLC/ESI-MS

    PubMed Central

    Han, Jun; Gagnon, Susannah; Eckle, Tobias; Borchers, Christoph H.

    2014-01-01

    Multiple hydroxy-, keto-, di-, and tri-carboxylic acids are among the cellular metabolites of central carbon metabolism (CCM). Sensitive and reliable analysis of these carboxylates is important for many biological and cell engineering studies. In this work, we examined 3-nitrophenylhydrazine as a derivatizing reagent and optimized the reaction conditions for the measurement of ten CCM related carboxylic compounds, including glycolate, lactate, malate, fumarate, succinate, citrate, isocitrate, pyruvate, oxaloacetate, and α-ketoglutarate as their 3-nitrophenylhydrazones using LC/MS with electrospray ionization. With the derivatization protocol which we have developed, and using negative-ion multiple reaction monitoring on a triple-quadrupole instrument, all of the carboxylates showed good linearity within a dynamic range of ca. 200 to more than 2000. The on-column limits of detection and quantitation were from high femtomoles to low picomoles. The analytical accuracies for eight of the ten analytes were determined to be between 89.5 to 114.8% (CV≤7.4%, n=6). Using a quadrupole time-of-flight instrument, the isotopic distribution patterns of these carboxylates, extracted from a 13C-labeled mouse heart, were successfully determined by UPLC/MS with full-mass detection, indicating the possible utility of this analytical method for metabolic flux analysis. In summary, this work demonstrates an efficient chemical derivatization LC/MS method for metabolomic analysis of these key CCM intermediates in a biological matrix. PMID:23580203

  9. Validation of a liquid chromatography-tandem mass spectrometry method for the detection of nicotine biomarkers in hair and an evaluation of wash procedures for removal of environmental nicotine.

    PubMed

    Miller, Eleanor I; Murray, Gordon J; Rollins, Douglas E; Tiffany, Stephen T; Wilkins, Diana G

    2011-07-01

    The aim of this exploratory study was to develop and validate a liquid chromatography-tandem mass spectrometry (LC-MS-MS) method for the quantification of nicotine, eight nicotine metabolites, and two minor tobacco alkaloids in fortified analyte-free hair and subsequently apply this method to hair samples collected from active smokers. An additional aim of the study was to include an evaluation of different wash procedures for the effective removal of environmentally deposited nicotine from tobacco smoke. An apparatus was designed for the purpose of exposing analyte-free hair to environmental tobacco smoke in order to deposit nicotine onto the hair surface. A shampoo/water wash procedure was identified as the most effective means of removing nicotine. This wash procedure was utilized for a comparison of washed and unwashed heavy smoker hair samples. Analytes and corresponding deuterated internal standards were extracted using a cation-exchange solid-phase cartridge. LC-MS-MS was carried out using an Acquity™ UPLC(®) system (Waters) and a Quattro Premier XE™ triple quadrupole MS (Waters) operated in electrospray positive ionization mode, with multiple reaction monitoring data acquisition. The developed method was applied to hair samples collected from heavy smokers (n = 3) and low-level smokers (n = 3) collected through IRB-approved protocols. Nicotine, cotinine, and nornicotine were quantified in both the washed and unwashed hair samples collected from three heavy smokers, whereas 3-hydroxycotinine was quantified in only one unwashed sample and nicotine-1'-oxide in the washed and unwashed hair samples from two heavy smokers. In contrast, nicotine-1'-oxide was quantified in one of the three low-level smoker samples; nicotine was quantified in the other two low-level smoker samples. No other analytes were detected in the hair of the three low-level smokers.

  10. QA/QC in the laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analyses protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices result in an increased administrative burden being placed on the analytical process; accordingly, in keepingmore » with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.« less

  11. QA/QC in the laboratory. Session F

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analyses protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices result in an increased administrative burden being placed on the analytical process; accordingly, in keepingmore » with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.« less

  12. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks.

    PubMed

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-11-01

    Large volume content dissemination is pursued by the growing number of high quality applications for Vehicular Ad hoc NETworks(VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. For the highly dynamical vehicular network topology, beacon-less routing protocols have been proven to be efficient in achieving a balance between the system performance and the control overhead. However, to the authors' best knowledge, the routing design for large volume content has not been well considered in the previous work, which will introduce new challenges, e.g., the enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes the forwarding decision based on the message header information and its current state, including the speed and position information. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms the traditional dissemination protocols in providing a low end-to-end delay. The analytical model is shown to exhibit a good match on the delay estimation with Monte Carlo simulations, as well.

  13. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks

    PubMed Central

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-01-01

    Large volume content dissemination is pursued by the growing number of high quality applications for Vehicular Ad hoc NETworks(VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. For the highly dynamical vehicular network topology, beacon-less routing protocols have been proven to be efficient in achieving a balance between the system performance and the control overhead. However, to the authors’ best knowledge, the routing design for large volume content has not been well considered in the previous work, which will introduce new challenges, e.g., the enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes the forwarding decision based on the message header information and its current state, including the speed and position information. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms the traditional dissemination protocols in providing a low end-to-end delay. The analytical model is shown to exhibit a good match on the delay estimation with Monte Carlo simulations, as well. PMID:27809285

  14. Screening apatites for (U-Th)/He thermochronometry via continuous ramped heating: He age components and implications for age dispersion

    NASA Astrophysics Data System (ADS)

    McDannell, Kalin T.; Zeitler, Peter K.; Janes, Darwin G.; Idleman, Bruce D.; Fayon, Annia K.

    2018-02-01

    Old slowly-cooled apatites often yield dispersed (U-Th)/He ages for a variety of reasons, some well understood and some not. Analytical protocols like careful grain selection can reduce the impact of this dispersion but add costs in time and resources and too often have proven insufficient. We assess a new analytical protocol that utilizes static-gas measurement during continuous ramped heating (CRH) as a means to rapidly screen apatite samples. In about the time required for a conventional total-gas analysis, this method can discriminate between samples showing expected volume-diffusion behavior and those showing anomalous release patterns inconsistent with their direct use in thermochronologic applications. This method also appears able to discriminate between the radiogenic and extraneous 4He fractions released by a sample, potentially allowing ages to be corrected. Well-behaved examples such as the Durango standard and other apatites with good age reproducibility show the expected smooth, sigmoidal gas-release curves predicted for volume diffusion using typical apatite kinetics, with complete exhaustion by ∼900 °C for linear heating at 20 °C/min. Secondary factors such as U and Th zoning and alpha-loss distribution have a relatively minor impact on such profiles. In contrast, samples having greater age dispersion show significant He release in the form of outgassing spikes and He release deferred to higher temperatures. Screening results for a range of samples permit us to assess the degree to which CRH screening can identify misbehaving grains, give insight into the source of extraneous He, and suggest that in some cases it may be possible to correct ages for the presence of such components.

  15. Squeezed-state quantum key distribution with a Rindler observer

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Shi, Ronghua; Guo, Ying

    2018-03-01

    Lengthening the maximum transmission distance of quantum key distribution plays a vital role in quantum information processing. In this paper, we propose a directional squeezed-state protocol with signals detected by a Rindler observer in the relativistic quantum field framework. We derive an analytical solution to the transmission problem of squeezed states from the inertial sender to the accelerated receiver. The variance of the involved signal mode is closer to optimality than that of the coherent-state-based protocol. Simulation results show that the proposed protocol has better performance than the coherent-state counterpart especially in terms of the maximal transmission distance.

  16. Development and validation of a single robust HPLC method for the characterization of a pharmaceutical starting material and impurities from three suppliers using three separate synthetic routes.

    PubMed

    Sheldon, E M; Downar, J B

    2000-08-15

    Novel approaches to the development of analytical procedures for monitoring incoming starting material in support of chemical/pharmaceutical processes are described. High technology solutions were utilized for timely process development and preparation of high quality clinical supplies. A single robust HPLC method was developed and characterized for the analysis of the key starting material from three suppliers. Each supplier used a different process for the preparation of this material and, therefore, each suppliers' material exhibited a unique impurity profile. The HPLC method utilized standard techniques acceptable for release testing in a QC/manufacturing environment. An automated experimental design protocol was used to characterize the robustness of the HPLC method. The method was evaluated for linearity, limit of quantitation, solution stability, and precision of replicate injections. An LC-MS method that emulated the release HPLC method was developed and the identities of impurities were mapped between the two methods.

  17. The Successful Diagnosis and Typing of Systemic Amyloidosis Using A Microwave-Assisted Filter-Aided Fast Sample Preparation Method and LC/MS/MS Analysis

    PubMed Central

    Zou, Lili; Shen, Kaini; Zhong, Dingrong; Zhou, Daobin; Sun, Wei; Li, Jian

    2015-01-01

    Laser microdissection followed by mass spectrometry has been successfully used for amyloid typing. However, sample contamination can interfere with proteomic analysis, and overnight digestion limits the analytical throughput. Moreover, current quantitative analysis methods are based on the spectrum count, which ignores differences in protein length and may lead to misdiagnoses. Here, we developed a microwave-assisted filter-aided sample preparation (maFASP) method that can efficiently remove contaminants with a 10-kDa cutoff ultrafiltration unit and can accelerate the digestion process with the assistance of a microwave. Additionally, two parameters (P- and D-scores) based on the exponentially modified protein abundance index were developed to define the existence of amyloid deposits and those causative proteins with the greatest abundance. Using our protocol, twenty cases of systemic amyloidosis that were well-typed according to clinical diagnostic standards (training group) and another twenty-four cases without subtype diagnoses (validation group) were analyzed. Using this approach, sample preparation could be completed within four hours. We successfully subtyped 100% of the cases in the training group, and the diagnostic success rate in the validation group was 91.7%. This maFASP-aided proteomic protocol represents an efficient approach for amyloid diagnosis and subtyping, particularly for serum-contaminated samples. PMID:25984759

  18. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.

  19. Next Generation Offline Approaches to Trace Gas-Phase Organic Compound Speciation: Sample Collection and Analysis

    NASA Astrophysics Data System (ADS)

    Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.

    2017-12-01

    Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.

  20. Automated image quality assessment for chest CT scans.

    PubMed

    Reeves, Anthony P; Xie, Yiting; Liu, Shuang

    2018-02-01

    Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.

  1. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due pre-analytical variations, such as pre-processing delay and temperature, and constant dynamical changes of biochemical processes within the samples. Therefore, proper sample handling starting from the time of collection up to the analysis is crucial to obtain high quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) Sample collection, 2) Metabolite extraction, 3) Data acquisition and 4) Data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis for biological matrices, especially body fluids. This protocol can be applied on blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: •Robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process•Small sample volume required•Rapid and cost-effective processing of biological samples•Logistic regression based determination of biomarker signatures for in-depth data analysis.

  2. A "three-in-one" sample preparation method for simultaneous determination of B-group water-soluble vitamins in infant formula using VitaFast(®) kits.

    PubMed

    Zhang, Heng; Lan, Fang; Shi, Yupeng; Wan, Zhi-Gang; Yue, Zhen-Feng; Fan, Fang; Lin, Yan-Kui; Tang, Mu-Jin; Lv, Jing-Zhang; Xiao, Tan; Yi, Changqing

    2014-06-15

    VitaFast(®) test kits designed for the microbiological assay in microtiter plate format can be applied to quantitative determination of B-group water-soluble vitamins such as vitamin B12, folic acid and biotin, et al. Compared to traditional microbiological methods, VitaFast(®) kits significantly reduce sample processing time and provide greater reliability, higher productivity and better accuracy. Recently, simultaneous determination of vitamin B12, folic acid and biotin in one sample is urgently required when evaluating the quality of infant formulae in our practical work. However, the present sample preparation protocols which are developed for individual test systems, are incompatible with simultaneous determination of several analytes. To solve this problem, a novel "three-in-one" sample preparation method is herein developed for simultaneous determination of B-group water-soluble vitamins using VitaFast(®) kits. The performance of this novel "three-in-one" sample preparation method was systematically evaluated through comparing with individual sample preparation protocols. The experimental results of the assays which employed "three-in-one" sample preparation method were in good agreement with those obtained from conventional VitaFast(®) extraction methods, indicating that the proposed "three-in-one" sample preparation method is applicable to the present three VitaFast(®) vitamin test systems, thus offering a promising alternative for the three independent sample preparation methods. The proposed new sample preparation method will significantly improve the efficiency of infant formulae inspection. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    PubMed

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  4. High-throughput quantification for a drug mixture in rat plasma-a comparison of Ultra Performance liquid chromatography/tandem mass spectrometry with high-performance liquid chromatography/tandem mass spectrometry.

    PubMed

    Yu, Kate; Little, David; Plumb, Rob; Smith, Brian

    2006-01-01

    A quantitative Ultra Performance liquid chromatography/tandem mass spectrometry (UPL/MS/MS) protocol was developed for a five-compound mixture in rat plasma. A similar high-performance liquid chromatography/tandem mass spectrometry (HPLC/MS/MS) quantification protocol was developed for comparison purposes. Among the five test compounds, three preferred positive electrospray ionization (ESI) and two preferred negative ESI. As a result, both UPLC/MS/MS and HPLC/MS/MS analyses were performed by having the mass spectrometer collecting ESI multiple reaction monitoring (MRM) data in both positive and negative ion modes during a single injection. Peak widths for most standards were 4.8 s for the HPLC analysis and 2.4 s for the UPLC analysis. There were 17 to 20 data points obtained for each of the LC peaks. Compared with the HPLC/MS/MS method, the UPLC/MS/MS method offered 3-fold decrease in retention time, up to 10-fold increase in detected peak height, with 2-fold decrease in peak width. Limits of quantification (LOQs) for both HPLC and UPLC methods were evaluated. For UPLC/MS/MS analysis, a linear range up to four orders of magnitude was obtained with r2 values ranging from 0.991 to 0.998. The LOQs for the five analytes ranged from 0.08 to 9.85 ng/mL. Three levels of quality control (QC) samples were analyzed. For the UPLC/MS/MS protocol, the percent relative standard deviation (RSD%) for low QC (2 ng/mL) ranged from 3.42 to 8.67% (N = 18). The carryover of the UPLC/MS/MS protocol was negligible and the robustness of the UPLC/MS/MS system was evaluated with up to 963 QC injections. Copyright 2006 John Wiley & Sons, Ltd.

  5. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.

  6. Use of a Deuterated Internal Standard with Pyrolysis-GC/MS Dimeric Marker Analysis to Quantify Tire Tread Particles in the Environment

    PubMed Central

    Unice, Kenneth M.; Kreider, Marisa L.; Panko, Julie M.

    2012-01-01

    Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories. PMID:23202830

  7. The identification and characterization of non-coding and coding RNAs and their modified nucleosides by mass spectrometry

    PubMed Central

    Gaston, Kirk W; Limbach, Patrick A

    2014-01-01

    The analysis of ribonucleic acids (RNA) by mass spectrometry has been a valuable analytical approach for more than 25 years. In fact, mass spectrometry has become a method of choice for the analysis of modified nucleosides from RNA isolated out of biological samples. This review summarizes recent progress that has been made in both nucleoside and oligonucleotide mass spectral analysis. Applications of mass spectrometry in the identification, characterization and quantification of modified nucleosides are discussed. At the oligonucleotide level, advances in modern mass spectrometry approaches combined with the standard RNA modification mapping protocol enable the characterization of RNAs of varying lengths ranging from low molecular weight short interfering RNAs (siRNAs) to the extremely large 23 S rRNAs. New variations and improvements to this protocol are reviewed, including top-down strategies, as these developments now enable qualitative and quantitative measurements of RNA modification patterns in a variety of biological systems. PMID:25616408

  8. The identification and characterization of non-coding and coding RNAs and their modified nucleosides by mass spectrometry.

    PubMed

    Gaston, Kirk W; Limbach, Patrick A

    2014-01-01

    The analysis of ribonucleic acids (RNA) by mass spectrometry has been a valuable analytical approach for more than 25 years. In fact, mass spectrometry has become a method of choice for the analysis of modified nucleosides from RNA isolated out of biological samples. This review summarizes recent progress that has been made in both nucleoside and oligonucleotide mass spectral analysis. Applications of mass spectrometry in the identification, characterization and quantification of modified nucleosides are discussed. At the oligonucleotide level, advances in modern mass spectrometry approaches combined with the standard RNA modification mapping protocol enable the characterization of RNAs of varying lengths ranging from low molecular weight short interfering RNAs (siRNAs) to the extremely large 23 S rRNAs. New variations and improvements to this protocol are reviewed, including top-down strategies, as these developments now enable qualitative and quantitative measurements of RNA modification patterns in a variety of biological systems.

  9. High-Resolution Sequence-Function Mapping of Full-Length Proteins

    PubMed Central

    Kowalsky, Caitlin A.; Klesmith, Justin R.; Stapleton, James A.; Kelly, Vince; Reichkitzer, Nolan; Whitehead, Timothy A.

    2015-01-01

    Comprehensive sequence-function mapping involves detailing the fitness contribution of every possible single mutation to a gene by comparing the abundance of each library variant before and after selection for the phenotype of interest. Deep sequencing of library DNA allows frequency reconstruction for tens of thousands of variants in a single experiment, yet short read lengths of current sequencers makes it challenging to probe genes encoding full-length proteins. Here we extend the scope of sequence-function maps to entire protein sequences with a modular, universal sequence tiling method. We demonstrate the approach with both growth-based selections and FACS screening, offer parameters and best practices that simplify design of experiments, and present analytical solutions to normalize data across independent selections. Using this protocol, sequence-function maps covering full sequences can be obtained in four to six weeks. Best practices introduced in this manuscript are fully compatible with, and complementary to, other recently published sequence-function mapping protocols. PMID:25790064

  10. NASA-JSC Protocol for the Characterization of Single Wall Carbon Nanotube Material Quality

    NASA Technical Reports Server (NTRS)

    Arepalli, Sivaram; Nikolaev, Pasha; Gorelik, Olga; Hadjiev, Victor; Holmes, William; Devivar, Rodrigo; Files, Bradley; Yowell, Leonard

    2010-01-01

    It is well known that the raw as well as purified single wall carbon nanotube (SWCNT) material always contain certain amount of impurities of varying composition (mostly metal catalyst and non-tubular carbon). Particular purification method also creates defects and/or functional groups in the SWCNT material and therefore affects the its dispersability in solvents (important to subsequent application development). A number of analytical characterization tools have been used successfully in the past years to assess various properties of nanotube materials, but lack of standards makes it difficult to compare these measurements across the board. In this work we report the protocol developed at NASA-JSC which standardizes measurements using TEM, SEM, TGA, Raman and UV-Vis-NIR absorption techniques. Numerical measures are established for parameters such as metal content, homogeneity, thermal stability and dispersability, to allow easy comparison of SWCNT materials. We will also report on the recent progress in quantitative measurement of non-tubular carbon impurities and a possible purity standard for SWCNT materials.

  11. Tracking emerging mycotoxins in food: development of an LC-MS/MS method for free and modified Alternaria toxins.

    PubMed

    Puntscher, Hannes; Kütt, Mary-Liis; Skrinjar, Philipp; Mikula, Hannes; Podlech, Joachim; Fröhlich, Johannes; Marko, Doris; Warth, Benedikt

    2018-05-16

    Mycotoxins produced by Alternaria fungi are ubiquitous food contaminants, but analytical methods for generating comprehensive exposure data are rare. We describe the development of an LC-MS/MS method covering 17 toxins for investigating the natural occurrence of free and modified Alternaria toxins in tomato sauce, sunflower seed oil, and wheat flour. Target analytes included alternariol (AOH), AOH-3-glucoside, AOH-9-glucoside, AOH-3-sulfate, alternariol monomethyl ether (AME), AME-3-glucoside, AME-3-sulfate, altenuene, isoaltenuene, tenuazonic acid (TeA), tentoxin (TEN), altertoxin I and II, alterperylenol, stemphyltoxin III, altenusin, and altenuic acid III. Extensive optimization resulted in a time- and cost-effective sample preparation protocol and a chromatographic baseline separation of included isomers. Overall, adequate limits of detection (0.03-9 ng/g) and quantitation (0.6-18 ng/g), intermediate precision (9-44%), and relative recovery values (75-100%) were achieved. However, stemphyltoxin III, AOH-3-sulfate, AME-3-sulfate, altenusin, and altenuic acid III showed recoveries in wheat flour below 70%, while their performance was stable and reproducible. Our pilot study with samples from the Austrian retail market demonstrated that tomato sauces (n = 12) contained AOH, AME, TeA, and TEN in concentrations up to 20, 4, 322, and 0.6 ng/g, while sunflower seed oil (n = 7) and wheat flour samples (n = 9) were contaminated at comparatively lower levels. Interestingly and of relevance for risk assessment, AOH-9-glucoside, discovered for the first time in naturally contaminated food items, and AME-3-sulfate were found in concentrations similar to their parent toxins. In conclusion, the established multi-analyte method proved to be fit for purpose for generating comprehensive Alternaria toxin occurrence data in different food matrices. Graphical abstract ᅟ.

  12. Coconut coir pith lignin: A physicochemical and thermal characterization.

    PubMed

    Asoka Panamgama, L; Peramune, P R U S K

    2018-07-01

    The structural and thermal features of coconut coir pith lignin, isolated by three different extraction protocols incorporating two different energy supply sources, were characterized by different analytical tools. The three different chemical extraction protocols were alkaline - 7.5% (w/v) NaOH, organosolv - 85% (v/v) formic and acetic acids at 7:3 (v/v) ratio and polyethylene glycol (PEG): water ratio at 80:20wt%. The two sources of energy were thermal or microwave. Raw lignins were modified by epichlorohydrin to enhance reactivity, and the characteristics of raw and modified lignins were comparatively analysed. Using the thermal energy source, the alkaline and organosolv processes obtained the highest and lowest lignin yields of 26.4±1.5wt% and 3.4±0.2wt%, respectively, as shown by wet chemical analysis. Specific functional group analysis by Fourier transform infrared spectra (FTIR) revealed that significantly different amounts of hydroxyl and carbonyl groups exist in alkaline, organosolv and PEG lignins. Thermogravimetric analysis (TGA) illustrated that the lowest degradation onset temperature was recorded for organosolv lignin, and the overall order was organosolv

  13. Assessment of bioavailable B vitamin content in food using in vitro digestibility assay and LC-MS SIDA.

    PubMed

    Paalme, Toomas; Vilbaste, Allan; Kevvai, Kaspar; Nisamedtinov, Ildar; Hälvin-Tanilas, Kristel

    2017-11-01

    Standardized analytical methods, where each B vitamin is extracted from a given sample individually using separate procedures, typically ensure that the extraction conditions provide the maximum recovery of each vitamin. However, in the human gastrointestinal tract (GIT), the extraction conditions are the same for all vitamins. Here, we present an analytically feasible extraction protocol that simulates conditions in the GIT and provides a measure of the content of bioavailable vitamins using LC-MS stable isotope dilution assay. The results show that the activities of both human gastric and duodenal juices were insufficient to liberate absorbable vitamers (AV) from pure cofactors. The use of an intestinal brush border membrane (IBBM) fraction derived from the mucosal tissue of porcine small intestine ensured at least 70% AV recovery. The rate of AV liberation, however, was strongly dependent on the cofactor, e.g., in the case of NADH, it was magnitudes higher than in the case of thiamine diphosphate. For some vitamins in some food matrices, the use of the IBBM fraction assay resulted in lower values for the content of AV than conventional vitamin determination methods. Conventional methods likely overestimate the actual bioavailability of some vitamins in these cases. Graphical abstract Assessment of bioavailable B vitamin content in food.

  14. Analysis of flavonoids from lotus (Nelumbo nucifera) leaves using high performance liquid chromatography/photodiode array detector tandem electrospray ionization mass spectrometry and an extraction method optimized by orthogonal design.

    PubMed

    Chen, Sha; Wu, Ben-Hong; Fang, Jin-Bao; Liu, Yan-Ling; Zhang, Hao-Hao; Fang, Lin-Chuan; Guan, Le; Li, Shao-Hua

    2012-03-02

    The extraction protocol of flavonoids from lotus (Nelumbo nucifera) leaves was optimized through an orthogonal design. The solvent was the most important factor comparing solvent, solvent:tissue ratio, extraction time, and temperature. The highest yield of flavonoids was achieved with 70% methanol-water and a solvent:tissue ratio of 30:1 at 4 °C for 36 h. The optimized analytical method for HPLC was a multi-step gradient elution using 0.5% formic acid (A) and CH₃CN containing 0.1% formic acid (B), at a flow rate of 0.6 mL/min. Using this optimized method, thirteen flavonoids were simultaneously separated and identified by high performance liquid chromatography coupled with photodiode array detection/electrospray ionization mass spectrometry (HPLC/DAD/ESI-MS(n)). Five of the bioactive compounds are reported in lotus leaves for the first time. The flavonoid content of the leaves of three representative cultivars was assessed under the optimized extraction and HPLC analytical conditions, and the seed-producing cultivar 'Baijianlian' had the highest flavonoid content compared with rhizome-producing 'Zhimahuoulian' and wild floral cultivar 'Honglian'. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Determination of dopamine, serotonin, and their metabolites in pediatric cerebrospinal fluid by isocratic high performance liquid chromatography coupled with electrochemical detection.

    PubMed

    Hubbard, K Elaine; Wells, Amy; Owens, Thandranese S; Tagen, Michael; Fraga, Charles H; Stewart, Clinton F

    2010-06-01

    A method to rapidly measure dopamine (DA), dihydroxyindolphenylacetic acid, homovanillic acid, serotonin (5-HT) and 5-hydroxyindoleacetic acid concentrations in cerebrospinal fluid (CSF) has not yet been reported. A rapid, sensitive, and specific HPLC method was therefore developed using electrochemical detection. CSF was mixed with an antioxidant solution prior to freezing to prevent neurotransmitter degradation. Separation of the five analytes was obtained on an ESA MD-150 x 3.2 mm column with a flow rate of 0.37 mL/min and an acetonitrile-aqueous (5 : 95, v/v) mobile phase with 75 mM monobasic sodium phosphate buffer, 0.5 mM EDTA, 0.81 mM sodium octylsulfonate and 5% tetrahydrofuran. The optimal electrical potential settings were: guard cell +325 mV, E1 -100 mV and E2 +300 mV. Within-day and between-day precisions were <10% for all analytes and accuracies ranged from 91.0 to 106.7%. DA, 5-HT, and their metabolites were stable in CSF with antioxidant solution at 4 degrees C for 8 h in the autoinjector. This method was used to measure neurotransmitters in CSF obtained from children enrolled on an institutional medulloblastoma treatment protocol. Copyright 2009 John Wiley & Sons, Ltd.

  16. Rapid identification of regulated organic chemical compounds in toys using ambient ionization and a miniature mass spectrometry system.

    PubMed

    Guo, Xiangyu; Bai, Hua; Lv, Yueguang; Xi, Guangcheng; Li, Junfang; Ma, Xiaoxiao; Ren, Yue; Ouyang, Zheng; Ma, Qiang

    2018-04-01

    Rapid, on-site analysis was achieved through significantly simplified operation procedures for a wide variety of toy samples (crayon, temporary tattoo sticker, finger paint, modeling clay, and bubble solution) using a miniature mass spectrometry system with ambient ionization capability. The labor-intensive analytical protocols involving sample workup and chemical separation, traditionally required for MS-based analysis, were replaced by direct sampling analysis using ambient ionization methods. A Mini β ion trap miniature mass spectrometer was coupled with versatile ambient ionization methods, e.g. paper spray, extraction spray and slug-flow microextraction nanoESI for direct identification of prohibited colorants, carcinogenic primary aromatic amines, allergenic fragrances, preservatives and plasticizers from raw toy samples. The use of paper substrates coated with Co 3 O 4 nanoparticles allowed a great increase in sensitivity for paper spray. Limits of detection as low as 5μgkg -1 were obtained for target analytes. The methods being developed based on the integration of ambient ionization with miniature mass spectrometer represent alternatives to current in-lab MS analysis operation, and would enable fast, outside-the-lab screening of toy products to ensure children's safety and health. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A surface plasmon resonance based biochip for the detection of patulin toxin

    NASA Astrophysics Data System (ADS)

    Pennacchio, Anna; Ruggiero, Giuseppe; Staiano, Maria; Piccialli, Gennaro; Oliviero, Giorgia; Lewkowicz, Aneta; Synak, Anna; Bojarski, Piotr; D'Auria, Sabato

    2014-08-01

    Patulin is a toxic secondary metabolite of a number of fungal species belonging to the genera Penicillium and Aspergillus. One important aspect of the patulin toxicity in vivo is an injury of the gastrointestinal tract including ulceration and inflammation of the stomach and intestine. Recently, patulin has been shown to be genotoxic by causing oxidative damage to the DNA, and oxidative DNA base modifications have been considered to play a role in mutagenesis and cancer initiation. Conventional analytical methods for patulin detection involve chromatographic analyses, such as HPLC, GC, and, more recently, techniques such as LC/MS and GC/MS. All of these methods require the use of extensive protocols and the use of expensive analytical instrumentation. In this work, the conjugation of a new derivative of patulin to the bovine serum albumin for the production of polyclonal antibodies is described, and an innovative competitive immune-assay for detection of patulin is presented. Experimentally, an important part of the detection method is based on the optical technique called surface plasmon resonance (SPR). Laser beam induced interactions between probe and target molecules in the vicinity of gold surface of the biochip lead to the shift in resonance conditions and consequently to slight but easily detectable change of reflectivity.

  18. Development of an integrated laboratory system for the monitoring of cyanotoxins in surface and drinking waters.

    PubMed

    Triantis, Theodoros; Tsimeli, Katerina; Kaloudis, Triantafyllos; Thanassoulias, Nicholas; Lytras, Efthymios; Hiskia, Anastasia

    2010-05-01

    A system of analytical processes has been developed in order to serve as a cost-effective scheme for the monitoring of cyanobacterial toxins on a quantitative basis, in surface and drinking waters. Five cyclic peptide hepatotoxins, microcystin-LR, -RR, -YR, -LA and nodularin were chosen as the target compounds. Two different enzyme-linked immunosorbent assays (ELISA) were validated in order to serve as primary quantitative screening tools. Validation results showed that the ELISA methods are sufficiently specific and sensitive with limits of detection (LODs) around 0.1 microg/L, however, matrix effects should be considered, especially with surface water samples or bacterial mass methanolic extracts. A colorimetric protein phosphatase inhibition assay (PPIA) utilizing protein phosphatase 2A and p-nitrophenyl phosphate as substrate, was applied in microplate format in order to serve as a quantitative screening method for the detection of the toxic activity associated with cyclic peptide hepatotoxins, at concentration levels >0.2 microg/L of MC-LR equivalents. A fast HPLC/PDA method has been developed for the determination of microcystins, by using a short, 50mm C18 column, with 1.8 microm particle size. Using this method a 10-fold reduction of sample run time was achieved and sufficient separation of microcystins was accomplished in less than 3 min. Finally, the analytical system includes an LC/MS/MS method that was developed for the determination of the 5 target compounds after SPE extraction. The method achieves extremely low limits of detection (<0.02 microg/L), in both surface and drinking waters and it is used for identification and verification purposes as well as for determinations at the ppt level. An analytical protocol that includes the above methods has been designed and validated through the analysis of a number of real samples. Copyright 2009 Elsevier Ltd. All rights reserved.

  19. Mainstream Smoke Levels of Volatile Organic Compounds in 50 U.S. Domestic Cigarette Brands Smoked With the ISO and Canadian Intense Protocols.

    PubMed

    Pazo, Daniel Y; Moliere, Fallon; Sampson, Maureen M; Reese, Christopher M; Agnew-Heard, Kimberly A; Walters, Matthew J; Holman, Matthew R; Blount, Benjamin C; Watson, Clifford H; Chambers, David M

    2016-09-01

    A significant portion of the increased risk of cancer and respiratory disease from exposure to cigarette smoke is attributed to volatile organic compounds (VOCs). In this study, 21 VOCs were quantified in mainstream cigarette smoke from 50U.S. domestic brand varieties that included high market share brands and 2 Kentucky research cigarettes (3R4F and 1R5F). Mainstream smoke was generated under ISO 3308 and Canadian Intense (CI) smoking protocols with linear smoking machines with a gas sampling bag collection followed by solid phase microextraction/gas chromatography/mass spectrometry (SPME/GC/MS) analysis. For both protocols, mainstream smoke VOC amounts among the different brand varieties were strongly correlated between the majority of the analytes. Overall, Pearson correlation (r) ranged from 0.68 to 0.99 for ISO and 0.36 to 0.95 for CI. However, monoaromatic compounds were found to increase disproportionately compared to unsaturated, nitro, and carbonyl compounds under the CI smoking protocol where filter ventilation is blocked. Overall, machine generated "vapor phase" amounts (µg/cigarette) are primarily attributed to smoking protocol (e.g., blocking of vent holes, puff volume, and puff duration) and filter ventilation. A possible cause for the disproportionate increase in monoaromatic compounds could be increased pyrolysis under low oxygen conditions associated with the CI protocol. This is the most comprehensive assessment of volatile organic compounds (VOCs) in cigarette smoke to date, encompassing 21 toxic VOCs, 50 different cigarette brand varieties, and 2 different machine smoking protocols (ISO and CI). For most analytes relative proportions remain consistent among U.S. cigarette brand varieties regardless of smoking protocol, however the CI smoking protocol did cause up to a factor of 6 increase in the proportion of monoaromatic compounds. This study serves as a basis to assess VOC exposure as cigarette smoke is a principle source of overall population-level VOC exposure in the United States. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  20. An analytical approach to test and design upper limb prosthesis.

    PubMed

    Veer, Karan

    2015-01-01

    In this work the signal acquiring technique, the analysis models and the design protocols of the prosthesis are discussed. The different methods to estimate the motion intended by the amputee from surface electromyogram (SEMG) signals based on time and frequency domain parameters are presented. The experiment proposed that the used techniques can help significantly in discriminating the amputee's motions among four independent activities using dual channel set-up. Further, based on experimental results, the design and working of an artificial arm have been covered under two constituents--the electronics design and the mechanical assembly. Finally, the developed hand prosthesis allows the amputated persons to perform daily routine activities easily.

  1. Distributed event-triggered consensus strategy for multi-agent systems under limited resources

    NASA Astrophysics Data System (ADS)

    Noorbakhsh, S. Mohammad; Ghaisari, Jafar

    2016-01-01

    The paper proposes a distributed structure to address an event-triggered consensus problem for multi-agent systems which aims at concurrent reduction in inter-agent communication, control input actuation and energy consumption. Following the proposed approach, asymptotic convergence of all agents to consensus requires that each agent broadcasts its sampled-state to the neighbours and updates its control input only at its own triggering instants, unlike the existing related works. Obviously, it decreases the network bandwidth usage, sensor energy consumption, computation resources usage and actuator wears. As a result, it facilitates the implementation of the proposed consensus protocol in the real-world applications with limited resources. The stability of the closed-loop system under an event-based protocol is proved analytically. Some numerical results are presented which confirm the analytical discussion on the effectiveness of the proposed design.

  2. Inter-comparison of NIOSH and IMPROVE protocols for OC and EC determination: implications for inter-protocol data conversion

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian

    2016-09-01

    Organic carbon (OC) and elemental carbon (EC) are operationally defined by analytical methods. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and Interagency Monitoring of Protected Visual Environment (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. EC discrepancy mainly (83 %) arises from a difference in peak inert mode temperature, which determines the allocation of OC4NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxide in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses direct relationship between ECNSH_TOT and ECIMP_TOR for reconstruction: M1 : ECIMP_TOR = a × ECNSH_TOT + b; while M2 deconstructs ECIMP_TOR into several terms based on analysis principles and applies regression only on the unknown terms: M2 : ECIMP_TOR = AECNSH + OC4NSH - (a × PCNSH_TOR + b), where AECNSH, apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4NSH is the carbon that evolves at the fourth temperature step of the pure helium analysis stage of NIOSH, and PCNSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. The implementation of M1 to all urban site data (without considering seasonal specificity) yields the following equation: M1(urban data) : ECIMP_TOR = 2.20 × ECNSH_TOT - 0.05. While both M1 and M2 are acceptable, M2 with site-specific parameters provides the best reconstruction performance. Secondary OC (SOC) estimation using OC and EC by the two protocols is compared. An analysis of the usability of reconstructed ECIMP_TOR and OCIMP_TOR suggests that the reconstructed values are not suitable for SOC estimation due to the poor reconstruction of the OC / EC ratio.

  3. Extending the Kerberos Protocol for Distributed Data as a Service

    DTIC Science & Technology

    2012-09-20

    exported as a UIMA [11] PEAR file for deployment to IBM Content Analytics (ICA). A UIMA PEAR file is a deployable text analytics “pipeline” (analogous...to a web application packaged in a WAR file). ICA is a text analysis and search application that supports UIMA . The key entities targeted by NLP rules...workbench. [Online]. Available: https: //www.ibm.com/developerworks/community/alphaworks/lrw/ [11] Apache UIMA . [Online]. Available: http

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Part, Florian; Zecha, Gudrun; Causon, Tim

    Highlights: • First review on detection of nanomaterials in complex waste samples. • Focus on nanoparticles in solid, liquid and gaseous waste samples. • Summary of current applicable methods for nanowaste detection and characterisation. • Limitations and challenges of characterisation of nanoparticles in waste. - Abstract: Engineered nanomaterials (ENMs) are already extensively used in diverse consumer products. Along the life cycle of a nano-enabled product, ENMs can be released and subsequently accumulate in the environment. Material flow models also indicate that a variety of ENMs may accumulate in waste streams. Therefore, a new type of waste, so-called nanowaste, is generatedmore » when end-of-life ENMs and nano-enabled products are disposed of. In terms of the precautionary principle, environmental monitoring of end-of-life ENMs is crucial to allow assessment of the potential impact of nanowaste on our ecosystem. Trace analysis and quantification of nanoparticulate species is very challenging because of the variety of ENM types that are used in products and low concentrations of nanowaste expected in complex environmental media. In the framework of this paper, challenges in nanowaste characterisation and appropriate analytical techniques which can be applied to nanowaste analysis are summarised. Recent case studies focussing on the characterisation of ENMs in waste streams are discussed. Most studies aim to investigate the fate of nanowaste during incineration, particularly considering aerosol measurements; whereas, detailed studies focusing on the potential release of nanowaste during waste recycling processes are currently not available. In terms of suitable analytical methods, separation techniques coupled to spectrometry-based methods are promising tools to detect nanowaste and determine particle size distribution in liquid waste samples. Standardised leaching protocols can be applied to generate soluble fractions stemming from solid wastes, while micro- and ultrafiltration can be used to enrich nanoparticulate species. Imaging techniques combined with X-ray-based methods are powerful tools for determining particle size, morphology and screening elemental composition. However, quantification of nanowaste is currently hampered due to the problem to differentiate engineered from naturally-occurring nanoparticles. A promising approach to face these challenges in nanowaste characterisation might be the application of nanotracers with unique optical properties, elemental or isotopic fingerprints. At present, there is also a need to develop and standardise analytical protocols regarding nanowaste sampling, separation and quantification. In general, more experimental studies are needed to examine the fate and transport of ENMs in waste streams and to deduce transfer coefficients, respectively to develop reliable material flow models.« less

  5. Design and analysis issues for economic analysis alongside clinical trials.

    PubMed

    Marshall, Deborah A; Hux, Margaret

    2009-07-01

    Clinical trials can offer a valuable and efficient opportunity to collect the health resource use and outcomes data for economic evaluation. However, economic and clinical studies differ fundamentally in the question they seek to answer. The design and analysis of trial-based cost-effectiveness studies require special consideration, which are reviewed in this article. Traditional randomized controlled trials, using an experimental design with a controlled protocol, are designed to measure safety and efficacy for product registration. Cost-effectiveness analysis seeks to measure effectiveness in the context of routine clinical practice, and requires collection of health care resources to allow estimation of cost over an equal timeframe for each treatment alternative. In assessing suitability of a trial for economic data collection, the comparator treatment and other protocol factors need to reflect current clinical practice and the trial follow-up must be sufficiently long to capture important costs and effects. The broadest available population and a measure of effectiveness reflecting important benefits for patients are preferred for economic analyses. Special analytical issues include dealing with missing and censored cost data, assessing uncertainty of the incremental cost-effectiveness ratio, and accounting for the underlying heterogeneity in patient subgroups. Careful consideration also needs to be given to data from multinational studies since practice patterns can differ across countries. Although clinical trials can be an efficient opportunity to collect data for economic evaluation, careful consideration of the suitability of the study design, and appropriate analytical methods must be applied to obtain rigorous results.

  6. Identification of polar, ionic, and highly water soluble organic pollutants in untreated industrial wastewaters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo, M.; Alonso, M.C.; Riu, J.

    1999-04-15

    This paper presents a generic protocol for the determination of polar, ionic, and highly water soluble organic pollutants on untreated industrial wastewaters involving the use of two different solid-phase extraction (SPE) methodologies followed by liquid chromatography-mass spectrometry (LC-MS). Untreated industrial wastewaters might contain natural and synthetic dissolved organic compounds with total organic carbon (TOC) values varying between 100 and 3000 mg/L. All polar, ionic and highly water soluble compounds comprising more than 95% of the organic content and with major contribution to the total toxicity of the sample cannot be analyzed by conventional gas chromatography-mass spectrometry (GC-MS), and LC-MS ismore » a good alternative. In this work two extraction procedures were used to obtain fractionated extracts of the nonionic polar compounds: a polymeric Isolute ENV + SPE cartridge for the preconcentration of anionic analytes and a sequential solid-phase extraction (SSPE) method percolating the samples first in octadecylsilica cartridge in series with the polymeric Lichrolut EN cartridge. Average recoveries ranging from 72% to 103% were obtained for a variety of 23 different analytes. Determination of nonionic pollutants was accomplished by reverse-phase liquid chromatography-atmospheric pressure chemical ionization-mass spectrometry (LC-APCI-MS), while anionic compounds were analyzed by ion pair chromatography-electrospray-mass spectrometry (IP-ESI-MS) and LC-ESI-MS. This protocol was applied to a pilot survey of textile and tannery wastewaters leading to the identification and quantification of 33 organic pollutants.« less

  7. Isolation and purification of all-trans diadinoxanthin and all-trans diatoxanthin from diatom Phaeodactylum tricornutum.

    PubMed

    Kuczynska, Paulina; Jemiola-Rzeminska, Malgorzata

    2017-01-01

    Two diatom-specific carotenoids are engaged in the diadinoxanthin cycle, an important mechanism which protects these organisms against photoinhibition caused by absorption of excessive light energy. A high-performance and economical procedure of isolation and purification of diadinoxanthin and diatoxanthin from the marine diatom Phaeodactylum tricornutum using a four-step procedure has been developed. It is based on the use of commonly available materials and does not require advanced technology. Extraction of pigments, saponification, separation by partition and then open column chromatography, which comprise the complete experimental procedure, can be performed within 2 days. This method allows HPLC grade diadinoxanthin and diatoxanthin of a purity of 99 % or more to be obtained, and the efficiency was estimated to be 63 % for diadinoxanthin and 73 % for diatoxanthin. Carefully selected diatom culture conditions as well as analytical ones ensure highly reproducible performance. A protocol can be used to isolate and purify the diadinoxanthin cycle pigments both on analytical and preparative scale.

  8. Instrumental neutron activation analysis for studying size-fractionated aerosols

    NASA Astrophysics Data System (ADS)

    Salma, Imre; Zemplén-Papp, Éva

    1999-10-01

    Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in an amount of about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute ( k0) standardization method. The calibration procedures, application protocol and the data evaluation process are described and discussed. They make it possible now to analyse a considerable number of samples, with assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be beyond the uncertainty represented by the sampling techniques and sample variability.

  9. Quality assurance and quality control of geochemical data—A primer for the research scientist

    USGS Publications Warehouse

    Geboy, Nicholas J.; Engle, Mark A.

    2011-01-01

    Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.

  10. Deterministic generation of remote entanglement with active quantum feedback

    DOE PAGES

    Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...

    2015-12-10

    We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can bemore » modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.« less

  11. Operational Implementation of a 2-Hour Prebreathe Protocol for International Space Station

    NASA Technical Reports Server (NTRS)

    Waligora, James M.; Conkin, J.; Foster, P. P.; Schneider, S.; Loftin, Karin C.; Gernhardt, Michael L.; Vann, R.

    2000-01-01

    Procedures, equipment, and analytical techniques were developed to implement the ground tested 2-hour protocol in-flight operations. The methods are: 1) The flight protocol incorporates additional safety margin over the ground tested protocol. This includes up to 20 min of additional time on enriched O2 during suit purge and pressure check, increased duration of extravehicular activity (EVA) preparation exercise during O2 prebreathing (up to 90 min vs; the tested 24 min), and reduced rates of depressurization. The ground test observations were combined with model projections of the conservative measures (using statistical models from Duke University and NASA JSQ to bound the risk of Type I and Type II decompression sickness (DCS). 2) An inflight exercise device using the in-flight ergometer and elastic tubes for upper body exercise was developed to replicate the dual cycle exercise in the ground trials. 3) A new in-flight breathing system was developed and man-tested. 4) A process to monitor inflight experience with the protocol, including the use of an in-suit Doppler bubble monitor when available, was developed. The results are: 1) The model projections of the conservative factors of the operational protocol were shown to reduce the risk of DCS to levels consistent with the observations of no DCS to date in the shuttle program. 2) Cross over trials of the dual cycle ergometer used in ground tests and the in-flight exercise system verified that02consumption and the % division of work between upper and lower body was not significantly different at the p= 0.05 level. 3) The in-flight breathing system was demonstrated to support work rates generating 75% O2(max) in 95 percentile subjects. 4) An in-flight monitoring plan with acceptance criteria was put in place for the 2-hour prebreathe protocol. And the conclusions are: The 2-hour protocol has been approved for flight, and all implementation efforts are in place to allow use of the protocol as early as flight ISS 7A, now scheduled in November of 2000.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R; Ware, Anne E

    Two catalytic fast pyrolysis (CFP) oils (bottom/heavy fraction) were analyzed in various solvents that are used in common analytical methods (nuclear magnetic resonance - NMR, gas chromatography - GC, gel permeation chromatography - GPC, thermogravimetric analysis - TGA) for oil characterization and speciation. A more accurate analysis of the CFP oils can be obtained by identification and exploitation of solvent miscibility characteristics. Acetone and tetrahydrofuran can be used to completely solubilize CFP oils for analysis by GC and tetrahydrofuran can be used for traditional organic GPC analysis of the oils. DMSO-d6 can be used to solubilize CFP oils for analysismore » by 13C NMR. The fractionation of oils into solvents that did not completely solubilize the whole oils showed that miscibility can be related to the oil properties. This allows for solvent selection based on physico-chemical properties of the oils. However, based on semi-quantitative comparisons of the GC chromatograms, the organic solvent fractionation schemes did not speciate the oils based on specific analyte type. On the other hand, chlorinated solvents did fractionate the oils based on analyte size to a certain degree. Unfortunately, like raw pyrolysis oil, the matrix of the CFP oils is complicated and is not amenable to simple liquid-liquid extraction (LLE) or solvent fractionation to separate the oils based on the chemical and/or physical properties of individual components. For reliable analyses, for each analytical method used, it is critical that the bio-oil sample is both completely soluble and also not likely to react with the chosen solvent. The adoption of the standardized solvent selection protocols presented here will allow for greater reproducibility of analysis across different users and facilities.« less

  13. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S.EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S.EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST prepared primary gas standards. Currently, absorption coefficient data has been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer’s law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 μmol/mol)−1 m−1 the average relative expanded uncertainty is 2.2 %. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.

  14. A parameterization method and application in breast tomosynthesis dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2013-09-15

    Purpose: To present a parameterization method based on singular value decomposition (SVD), and to provide analytical parameterization of the mean glandular dose (MGD) conversion factors from eight references for evaluating breast tomosynthesis dose in the Mammography Quality Standards Act (MQSA) protocol and in the UK, European, and IAEA dosimetry protocols.Methods: MGD conversion factor is usually listed in lookup tables for the factors such as beam quality, breast thickness, breast glandularity, and projection angle. The authors analyzed multiple sets of MGD conversion factors from the Hologic Selenia Dimensions quality control manual and seven previous papers. Each data set was parameterized usingmore » a one- to three-dimensional polynomial function of 2–16 terms. Variable substitution was used to improve accuracy. A least-squares fit was conducted using the SVD.Results: The differences between the originally tabulated MGD conversion factors and the results computed using the parameterization algorithms were (a) 0.08%–0.18% on average and 1.31% maximum for the Selenia Dimensions quality control manual, (b) 0.09%–0.66% on average and 2.97% maximum for the published data by Dance et al. [Phys. Med. Biol. 35, 1211–1219 (1990); ibid. 45, 3225–3240 (2000); ibid. 54, 4361–4372 (2009); ibid. 56, 453–471 (2011)], (c) 0.74%–0.99% on average and 3.94% maximum for the published data by Sechopoulos et al. [Med. Phys. 34, 221–232 (2007); J. Appl. Clin. Med. Phys. 9, 161–171 (2008)], and (d) 0.66%–1.33% on average and 2.72% maximum for the published data by Feng and Sechopoulos [Radiology 263, 35–42 (2012)], excluding one sample in (d) that does not follow the trends in the published data table.Conclusions: A flexible parameterization method is presented in this paper, and was applied to breast tomosynthesis dosimetry. The resultant data offer easy and accurate computations of MGD conversion factors for evaluating mean glandular breast dose in the MQSA protocol and in the UK, European, and IAEA dosimetry protocols. Microsoft Excel™ spreadsheets are provided for the convenience of readers.« less

  15. Realist theory construction for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B

    2016-01-01

    We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising of an: (1) emergent phase, (2) construction phase, and (3) confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase which will be presented here. The Theory Construction Phase will include: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of: Stress Process; Social Isolation; Social Exclusion; Social Services; Social Capital, Acculturation Theory and Global-economic level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress and context is presented that includes examples of mechanisms at psychological, social, cultural and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes including depression, anxiety, and health harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction including the stressors of isolation and expectations and buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis. The findings will be applied to the development of a middle range theory and subsequent programme theory for local perinatal child and family interventions.

  16. Large-scale pesticide testing in olives by liquid chromatography-electrospray tandem mass spectrometry using two sample preparation methods based on matrix solid-phase dispersion and QuEChERS.

    PubMed

    Gilbert-López, Bienvenida; García-Reyes, Juan F; Lozano, Ana; Fernández-Alba, Amadeo R; Molina-Díaz, Antonio

    2010-09-24

    In this work we have evaluated the performance of two sample preparation methodologies for the large-scale multiresidue analysis of pesticides in olives using liquid chromatography-electrospray tandem mass spectrometry (LC-MS/MS). The tested sample treatment methodologies were: (1) liquid-liquid partitioning with acetonitrile followed by dispersive solid-phase extraction clean-up using GCB, PSA and C18 sorbents (QuEChERS method - modified for fatty vegetables) and (2) matrix solid-phase dispersion (MSPD) using aminopropyl as sorbent material and a final clean-up performed in the elution step using Florisil. An LC-MS/MS method covering 104 multiclass pesticides was developed to examine the performance of these two protocols. The separation of the compounds from the olive extracts was achieved using a short C18 column (50 mm x 4.6 mm i.d.) with 1.8 microm particle size. The identification and confirmation of the compounds was based on retention time matching along with the presence (and ratio) of two typical MRM transitions. Limits of detection obtained were lower than 10 microgkg(-1) for 89% analytes using both sample treatment protocols. Recoveries studies performed on olives samples spiked at two concentration levels (10 and 100 microgkg(-1)) yielded average recoveries in the range 70-120% for most analytes when QuEChERS procedure is employed. When MSPD was the choice for sample extraction, recoveries obtained were in the range 50-70% for most of target compounds. The proposed methods were successfully applied to the analysis of real olives samples, revealing the presence of some of the target species in the microgkg(-1) range. Besides the evaluation of the sample preparation approaches, we also discuss the use of advanced software features associated to MRM method development that overcome several limitations and drawbacks associated to MS/MS methods (time segments boundaries, tedious method development/manual scheduling and acquisition limitations). This software feature recently offered by different vendors is based on an algorithm that associates retention time data for each individual MS/MS transition, so that the number of simultaneously traced transitions throughout the entire chromatographic run (dwell times and sensitivity) is maximized. Copyright 2010 Elsevier B.V. All rights reserved.

  17. Experimental control in software reliability certification

    NASA Technical Reports Server (NTRS)

    Trammell, Carmen J.; Poore, Jesse H.

    1994-01-01

    There is growing interest in software 'certification', i.e., confirmation that software has performed satisfactorily under a defined certification protocol. Regulatory agencies, customers, and prospective reusers all want assurance that a defined product standard has been met. In other industries, products are typically certified under protocols in which random samples of the product are drawn, tests characteristic of operational use are applied, analytical or statistical inferences are made, and products meeting a standard are 'certified' as fit for use. A warranty statement is often issued upon satisfactory completion of a certification protocol. This paper outlines specific engineering practices that must be used to preserve the validity of the statistical certification testing protocol. The assumptions associated with a statistical experiment are given, and their implications for statistical testing of software are described.

  18. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466

  19. Urban upgrading and its impact on health: a "quasi-experimental" mixed-methods study protocol for the BH-Viva Project.

    PubMed

    Friche, Amélia Augusta de Lima; Dias, Maria Angélica de Salles; Reis, Priscila Brandão Dos; Dias, Cláudia Silva; Caiaffa, Waleska Teixeira

    2015-11-01

    There is little scientific evidence that urban upgrading helps improve health or reduce inequities. This article presents the design for the BH-Viva Project, a "quasi-experimental", multiphase, mixed-methods study with quantitative and qualitative components, proposing an analytical model for monitoring the effects that interventions in the urban environment can have on residents' health in slums in Belo Horizonte, Minas Gerais State, Brazil. A preliminary analysis revealed intra-urban differences in age-specific mortality when comparing areas with and without interventions; the mortality rate from 2002 to 2012 was stable in the "formal city", increased in slums without interventions, and decreased in slums with interventions. BH-Viva represents an effort at advancing methodological issues, providing learning and theoretical backing for urban health research and research methods, allowing their application and extension to other urban contexts.

  20. Stereoselective Luche reduction of deoxynivalenol and three of its acetylated derivatives at C8.

    PubMed

    Fruhmann, Philipp; Hametner, Christian; Mikula, Hannes; Adam, Gerhard; Krska, Rudolf; Fröhlich, Johannes

    2014-01-10

    The trichothecene mycotoxin deoxynivalenol (DON) is a well known and common contaminant in food and feed. Acetylated derivatives and other biosynthetic precursors can occur together with the main toxin. A key biosynthetic step towards DON involves an oxidation of the 8-OH group of 7,8-dihydroxycalonectrin. Since analytical standards for the intermediates are not available and these intermediates are therefore rarely studied, we aimed for a synthetic method to invert this reaction, making a series of calonectrin-derived precursors accessible. We did this by developing an efficient protocol for stereoselective Luche reduction at C8. This method was used to access 3,7,8,15-tetrahydroxyscirpene, 3-deacetyl-7,8-dihydroxycalonectrin, 15-deacetyl-7,8-dihydroxycalonectrin and 7,8-dihydroxycalonectrin, which were characterized using several NMR techniques. Beside the development of a method which could basically be used for all type B trichothecenes, we opened a synthetic route towards different acetylated calonectrins.

  1. Faults and foibles of quantitative scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS)

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2012-06-01

    Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers > 4 (Be) present as major constituents (where the concentration C > 0.1 mass fraction, or 10 weight percent), minor (0.01<= C <= 0.1) and trace (C < 0.01, with a minimum detectable limit of ~+/- 0.0005 - 0.001 under routine measurement conditions, a level which is analyte and matrix dependent ). SEM/EDS can select specimen volumes with linear dimensions from ~ 500 nm to 5 μm depending on composition (masses ranging from ~ 10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper will describe shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in xray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open source analytical software available free at www.nist.gov that can aid the analyst in overcoming significant limitations to SEM/EDS.

  2. Developing strategies to enhance loading efficiency of erythrosensors

    NASA Astrophysics Data System (ADS)

    Bustamante Lopez, Sandra C.; Ritter, Sarah C.; Meissner, Kenith E.

    2014-02-01

    For diabetics, continuous glucose monitoring and the resulting tighter control of glucose levels ameliorate serious complications from hypoglycemia and hyperglycemia. Diabetics measure their blood glucose levels multiple times a day by finger pricks, or use implantable monitoring devices. Still, glucose and other analytes in the blood fluctuate throughout the day and the current monitoring methods are invasive, immunogenic, and/or present biodegradation problems. Using carrier erythrocytes loaded with a fluorescent sensor, we seek to develop a biodegradable, efficient, and potentially cost effective method to continuously sense blood analytes. We aim to reintroduce sensor-loaded erythrocytes to the bloodstream and conserve the erythrocytes lifetime of 120 days in the circulatory system. Here, we compare the efficiency of two loading techniques: hypotonic dilution and electroporation. Hypotonic dilution employs hypotonic buffer to create transient pores in the erythrocyte membrane, allowing dye entrance and a hypertonic buffer to restore tonicity. Electroporation relies on controlled electrical pulses that results in reversible pores formation to allow cargo entrance, follow by incubation at 37°C to reseal. As part of the cellular characterization of loaded erythrocytes, we focus on cell size, shape, and hemoglobin content. Cell recovery, loading efficiency and cargo release measurements render optimal loading conditions. The detected fluorescent signal from sensor-loaded erythrocytes can be translated into a direct measurement of analyte levels in the blood stream. The development of a suitable protocol to engineer carrier erythrocytes has profound and lasting implications in the erythrosensor's lifespan and sensing capabilities.

  3. Reanalysis of a 15-year Archive of IMPROVE Samples

    NASA Astrophysics Data System (ADS)

    Hyslop, N. P.; White, W. H.; Trzepla, K.

    2013-12-01

    The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na - Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely-documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains, Mount Rainier, and Point Reyes National Parks were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era (Figure 1). Temporal trends for some elements are affected by these changes in measurement technique while others are not (Figure 2). Figure 1. Repeatability of analyses for sulfur and vanadium at Great Smoky Mountains National Park. Each point shows the ratio of mass loadings determined by the original analysis and recent reanalysis. Major method distinctions are indicated at the top. Figure 2. Trends, based on Thiel-Sen regression, in lead concentrations based on the original and reanalysis data.

  4. Incineration of polychlorinated biphenyls in high-efficiency boilers: a viable disposal option

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, G.T.; Wolf, P.; Fennelly, P.F.

    1984-03-01

    Approximately 750 million pounds of polychlorinated biphenyls (PCBs) remain in service today in the United States. The eventual disposition of these materials and the vast stock piles already removed from commerce and use represents a formidable problem to both U.S. industry (e.g., utility companies) and federal and state environmental agencies. Despite the fact that available disposal options include the use of high-temperature incineration, disposal efforts have been significantly hampered by the lack of approved incineration facilities. The results of comprehensive PCB incineration programs conducted in accordance with EPA test protocols at each of three high-efficiency boiler sites are presented. Fluemore » gas sampling procedures included the use of both the modified method 5 PCB train and the Source Assessment Sampling System (SASS). Analytical protocols included the use of gas chromatography (GC/ECD) and combined gas chromatography/mass spectrometry (GC/MS). PCB destruction efficiency data for each of nine test runs were in excess of the 99.9% values assumed by the EPA regulation. The cumulative data set lends further credibility to the use of high-efficiency boilers as a viable disposal option for PCB contaminated (50-500 ppm) waste oils when conducted in strict accordance with existing EPA protocols.« less

  5. A Randomized Trial Comparing Mail versus In-Office Distribution of the CAHPS Clinician and Group Survey

    PubMed Central

    Anastario, Michael P; Rodriguez, Hector P; Gallagher, Patricia M; Cleary, Paul D; Shaller, Dale; Rogers, William H; Bogen, Karen; Safran, Dana Gelb

    2010-01-01

    Objective To assess the effect of survey distribution protocol (mail versus handout) on data quality and measurement of patient care experiences. Data Sources/Study Setting Multisite randomized trial of survey distribution protocols. Analytic sample included 2,477 patients of 15 clinicians at three practice sites in New York State. Data Collection/Extraction Methods Mail and handout distribution modes were alternated weekly at each site for 6 weeks. Principal Findings Handout protocols yielded an incomplete distribution rate (74 percent) and lower overall response rates (40 percent versus 58 percent) compared with mail. Handout distribution rates decreased over time and resulted in more favorable survey scores compared with mailed surveys. There were significant mode–physician interaction effects, indicating that data cannot simply be pooled and adjusted for mode. Conclusions In-office survey distribution has the potential to bias measurement and comparison of physicians and sites on patient care experiences. Incomplete distribution rates observed in-office, together with between-office differences in distribution rates and declining rates over time suggest staff may be burdened by the process and selective in their choice of patients. Further testing with a larger physician and site sample is important to definitively establish the potential role for in-office distribution in obtaining reliable, valid assessment of patient care experiences. PMID:20579126

  6. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples, the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93, for the analysis of a certified reference material, using the standardized methodologies. Conclusion The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184

  7. Scoring System for the Management of Acute Gallstone Pancreatitis: Cost Analysis of a Prospective Study.

    PubMed

    Prigoff, Jake G; Swain, Gary W; Divino, Celia M

    2016-05-01

    Predicting the presence of a persistent common bile duct (CBD) stone is a difficult and expensive task. The aim of this study is to determine if a previously described protocol-based scoring system is a cost-effective strategy. The protocol includes all patients with gallstone pancreatitis and stratifies them based on laboratory values and imaging to high, medium, and low likelihood of persistent stones. The patient's stratification then dictates the next course of management. A decision analytic model was developed to compare the costs for patients who followed the protocol versus those that did not. Clinical data model inputs were obtained from a prospective study conducted at The Mount Sinai Medical Center to validate the protocol from Oct 2009 to May 2013. The study included all patients presenting with gallstone pancreatitis regardless of disease severity. Seventy-three patients followed the proposed protocol and 32 did not. The protocol group cost an average of $14,962/patient and the non-protocol group cost $17,138/patient for procedural costs. Mean length of stay for protocol and non-protocol patients was 5.6 and 7.7 days, respectively. The proposed protocol is a cost-effective way to determine the course for patients with gallstone pancreatitis, reducing total procedural costs over 12 %.

  8. Parent skills training for parents of children or adults with developmental disorders: systematic review and meta-analysis protocol.

    PubMed

    Reichow, Brian; Kogan, Cary; Barbui, Corrado; Smith, Isaac; Yasamy, M Taghi; Servili, Chiara

    2014-08-27

    Developmental disorders, including intellectual disability and autism spectrum disorders, may limit an individual's capacity to conduct daily activities. The emotional and economic burden on families caring for an individual with a developmental disorder is substantial, and quality of life may be limited by a lack of services. Therefore, finding effective treatments to help this population should be a priority. Recent work has shown parent skills training interventions improve developmental, behavioural and family outcomes. The purpose of this review protocol is to extend previous findings by systematically analysing randomised controlled trials of parent skills training programmes for parents of children with developmental disorders including intellectual disabilities and autism spectrum disorders and use meta-analytic techniques to identify programme components reliably associated with successful outcomes of parent skills training programmes. We will include all studies conducted using randomised control trials designs that compare a group of parents receiving a parent skills training programme to a group of parents in a no-treatment control, waitlist control or treatment as usual comparison group. To locate studies, we will conduct an extensive electronic database search and then use snowball methods, with no limits to publication year or language. We will present a narrative synthesis including visual displays of study effects on child and parental outcomes and conduct a quantitative synthesis of the effects of parent skills training programmes using meta-analytic techniques. No ethical issues are foreseen and ethical approval is not required given this is a protocol for a systematic review. The findings of this study will be disseminated through peer-reviewed publications and international conference presentations. Updates of the review will be conducted, as necessary, to inform and guide practice. PROSPERO (CRD42014006993). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Tracking Matrix Effects in the Analysis of DNA Adducts of Polycyclic Aromatic Hydrocarbons

    PubMed Central

    Klaene, Joshua J.; Flarakos, Caroline; Glick, James; Barret, Jennifer T.; Zarbl, Helmut; Vouros, Paul

    2015-01-01

    LC-MS using electrospray ionization is currently the method of choice in bio-organic analysis covering a wide range of applications in a broad spectrum of biological media. The technique is noted for its high sensitivity but one major limitation which hinders achievement of its optimal sensitivity is the signal suppression due to matrix inferences introduced by the presence of co-extracted compounds during the sample preparation procedure. The analysis of DNA adducts of common environmental carcinogens is particularly sensitive to such matrix effects as sample preparation is a multistep process which involves “contamination” of the sample due to the addition of enzymes and other reagents for digestion of the DNA in order to isolate the analyte(s). This problem is further exacerbated by the need to reach low levels of quantitation (LOQ in the ppb level) while also working with limited (2-5 μg) quantities of sample. We report here on the systematic investigation of ion signal suppression contributed by each individual step involved in the sample preparation associated with the analysis of DNA adducts of polycyclic aromatic hydrocarbon (PAH) using as model analyte dG-BaP, the deoxyguanosine adduct of benzo[a]pyrene (BaP). The individual matrix contribution of each one of these sources to analyte signal was systematically addressed as were any interactive effects. The information was used to develop a validated analytical protocol for the target biomarker at levels typically encountered in vivo using as little as 2 μg of DNA and applied to a dose response study using a metabolically competent cell line. PMID:26607319

  10. Application of non-traditional stable isotopes in analytical ecogeochemistry assessed by MC ICP-MS--A critical review.

    PubMed

    Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.

  11. Quantitative analysis of substituted N,N-dimethyl-tryptamines in the presence of natural type XII alkaloids.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-10-01

    This paper reports the qualitative and quantitative analysis (QA) of mixtures of hallucinogens, N,N-dimethyltryptamine (DMT) (1), 5-methoxy- (la) and 5-hydroxy-N,N-dimethyltryptamine (1b) in the presence of beta-carbolines (indole alkaloids of type XII) ((2), (3) and (5)}. The validated electronic absorption spectroscopic (EAs) protocol achieved a concentration limit of detection (LOD) of 7.2.10(-7) mol/L {concentration limit of quantification (LOQ) of 24.10(-7) mol/L) using bands (lambda max within 260+/-0.23-262+/-0.33 nm. Metrology, including accuracy, measurement repeatability, measurement precision, trueness of measurement, and reproducibility of the measurements are presented using N,N-dimethyltryptamine (DMA) as standard. The analytical quantities of mixtures of alkaloids 4, 6 and 7 are: lambda max 317+/-0.45, 338+/-0.69 and 430+/-0.09 for 4 (LOD, 8.6.10(-7) mol/L; LOQ, 28.66(6), mol/L), as well as 528+/-0.75 nm for 6 and 7 (LOD, 8.2.10(-7) mol/L; LOQ, 27.33(3), mol/L), respectively. The partially validated protocols by high performance liquid chromatography (HPLC), electrospray ionization (ESI), mass spectrometry (MS), both in single and tandem operation (MS/MS) mode, as well as matrix/assisted laser desorption/ionization (MALDI) MS are elaborated. The Raman spectroscopic (RS) protocol for analysis of psychoactive substances, characterized by strong fluorescence RS profile was developed, with the detection limits being discussed. The known synergistic effect leading to increase the psychoactive and hallucinogenic properties and the reported acute poisoning cases from 1-7, make the present study emergent, since as well the current lack of analytical data and the herein metrology obtained contributed to the elaboration of highly selective and precise analytical protocols, which would be of interest in the field of criminal forensic analysis.

  12. What a drop can do: dried blood spots as a minimally invasive method for integrating biomarkers into population-based research.

    PubMed

    McDade, Thomas W; Williams, Sharon; Snodgrass, J Josh

    2007-11-01

    Logistical constraints associated with the collection and analysis of biological samples in community-based settings have been a significant impediment to integrative, multilevel bio-demographic and biobehavioral research. However recent methodological developments have overcome many of these constraints and have also expanded the options for incorporating biomarkers into population-based health research in international as well as domestic contexts. In particular using dried blood spot (DBS) samples-drops of whole blood collected on filter paper from a simple finger prick-provides a minimally invasive method for collecting blood samples in nonclinical settings. After a brief discussion of biomarkers more generally, we review procedures for collecting, handling, and analyzing DBS samples. Advantages of using DBS samples-compared with venipuncture include the relative ease and low cost of sample collection, transport, and storage. Disadvantages include requirements for assay development and validation as well as the relatively small volumes of sample. We present the results of a comprehensive literature review of published protocols for analysis of DBS samples, and we provide more detailed analysis of protocols for 45 analytes likely to be of particular relevance to population-level health research. Our objective is to provide investigators with the information they need to make informed decisions regarding the appropriateness of blood spot methods for their research interests.

  13. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    PubMed

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework normalisation process theory that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  14. An Organic Decontamination Method for Sampling Devices used in Life-detection Studies

    NASA Technical Reports Server (NTRS)

    Eigenbrode, Jennifer; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E.F.

    2008-01-01

    Organic decontamination of sampling and storage devices are crucial steps for life-detection, habitability, and ecological investigations of extremophiles living in the most inhospitable niches of Earth, Mars and elsewhere. However, one of the main stumbling blocks for Mars-analogue life-detection studies in terrestrial remote field-sites is the capability to clean instruments and sampling devices to organic levels consistent with null values. Here we present a new seven-step, multi-reagent cleaning and decontamination protocol that was adapted and tested on a glacial ice-coring device and on a rover-guided scoop used for sediment sampling both deployed multiple times during two field seasons of the Arctic Mars Analog Svalbard Expedition AMASE). The effectiveness of the protocols for both devices was tested by (1)in situ metabolic measurements via APT, (2)in situ lipopolysacchride (LPS) quantifications via low-level endotoxin assays, and(3) laboratory-based molecular detection via gas chromatography-mass spectrometry. Our results show that the combination and step-wise application of disinfectants with oxidative and solvation properties for sterilization are effective at removing cellular remnants and other organic traces to levels necessary for molecular organic- and life-detection studies. The validation of this seven-step protocol - specifically for ice sampling - allows us to proceed with confidence in kmskia4 analogue investigations of icy environments. However, results from a rover scoop test showed that this protocol is also suitable for null-level decontamination of sample acquisition devices. Thus, this protocol may be applicable to a variety of sampling devices and analytical instrumentation used for future astrobiology missions to Enceladus, and Europa, as well as for sample-return missions.

  15. Updated operational protocols for the U.S. Geological Survey Precipitation Chemistry Quality Assurance Project in support of the National Atmospheric Deposition Program

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn

    2017-02-06

    The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components which contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two, currently (2016) contracted NADP laboratories, Central Analytical Laboratory and Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated sampler program evaluates the overall variability of NTN measurements and bias between dissimilar precipitation gages and sample collectors.This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.

  16. Fast liquid chromatographic-tandem mass spectrometric method using mixed-mode phase chromatography and solid phase extraction for the determination of 12 mono-hydroxylated brominated diphenyl ethers in human serum.

    PubMed

    Petropoulou, Syrago-Styliani E; Duong, Wendy; Petreas, Myrto; Park, June-Soo

    2014-08-22

    Hydroxylated polybrominated diphenyl ethers (OH-PBDEs) are formed from the oxidative metabolism of polybrominated diphenyl ethers (PBDEs) in humans, rats and mice, but their quantitation in human blood and other matrices with liquid chromatography-mass spectrometric techniques has been a challenge. In this study, a novel analytical method was developed and validated using only 250 μL of human serum for the quantitation of twelve OH-PBDEs, fully chromatographically separated in a 15 min analytical run. This method includes two novel approaches: an enzymatic hydrolysis procedure and a chromatographic separation using a mixed mode chromatography column. The enzymatic hydrolysis (EH) was found critical for 4'-OH-BDE17, which was not detectable without it. For the sample clean up, a solid phase extraction protocol was developed and validated for the extraction of the 12 congeners from human serum. In addition, for the first time baseline resolution of two components was achieved that correspond to a single peak previously identified as 6'-OH-BDE99. The method was validated for linearity, accuracy, precision, matrix effects, limit of quantification, limit of detection, sample stability and overall efficiency. Recoveries (absolute and relative) ranged from 66 to 130% with relative standard deviations <21% for all analytes. Limit of detection and quantitation ranged from 4 to 90 pg mL(-1) and 6-120 pg mL(-1), respectively, with no carry over effects. This method was applied in ten commercially available human serum samples from the general US population. The mean values of the congeners detected in all samples are 4'-OH-BDE17 (34.2 pg mL(-1)), 4-OH-BDE42 (33.9 pg mL(-1)), 5-OH-BDE47 (17.5 pg mL(-1)) and 4'-OH-BDE49 (12.4 pg mL(-1)). Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by the means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Multi-class multi-residue analysis of veterinary drugs in meat using enhanced matrix removal lipid cleanup and liquid chromatography-tandem mass spectrometry.

    PubMed

    Zhao, Limian; Lucas, Derick; Long, David; Richter, Bruce; Stevens, Joan

    2018-05-11

    This study presents the development and validation of a quantitation method for the analysis of multi-class, multi-residue veterinary drugs using lipid removal cleanup cartridges, enhanced matrix removal lipid (EMR-Lipid), for different meat matrices by liquid chromatography tandem mass spectrometry detection. Meat samples were extracted using a two-step solid-liquid extraction followed by pass-through sample cleanup. The method was optimized based on the buffer and solvent composition, solvent additive additions, and EMR-Lipid cartridge cleanup. The developed method was then validated in five meat matrices, porcine muscle, bovine muscle, bovine liver, bovine kidney and chicken liver to evaluate the method performance characteristics, such as absolute recoveries and precision at three spiking levels, calibration curve linearity, limit of quantitation (LOQ) and matrix effect. The results showed that >90% of veterinary drug analytes achieved satisfactory recovery results of 60-120%. Over 97% analytes achieved excellent reproducibility results (relative standard deviation (RSD) < 20%), and the LOQs were 1-5 μg/kg in the evaluated meat matrices. The matrix co-extractive removal efficiency by weight provided by EMR-lipid cartridge cleanup was 42-58% in samples. The post column infusion study showed that the matrix ion suppression was reduced for samples with the EMR-Lipid cartridge cleanup. The reduced matrix ion suppression effect was also confirmed with <15% frequency of compounds with significant quantitative ion suppression (>30%) for all tested veterinary drugs in all of meat matrices. The results showed that the two-step solid-liquid extraction provides efficient extraction for the entire spectrum of veterinary drugs, including the difficult classes such as tetracyclines, beta-lactams etc. EMR-Lipid cartridges after extraction provided efficient sample cleanup with easy streamlined protocol and minimal impacts on analytes recovery, improving method reliability and consistency. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Heavy vehicle driver workload assessment. Task 3, task analysis data collection

    DOT National Transportation Integrated Search

    This technical report consists of a collection of task analytic data to support heavy vehicle driver workload assessment and protocol development. Data were collected from professional drivers to provide insights into the following issues: the meanin...

  20. 42 CFR 493.1251 - Standard: Procedure manual.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...

  1. 42 CFR 493.1251 - Standard: Procedure manual.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...

  2. 42 CFR 493.1251 - Standard: Procedure manual.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...

  3. Combined Heat and Power Protocol for Uniform Methods Project | Advanced

    Science.gov Websites

    Manufacturing Research | NREL Combined Heat and Power Protocol for Uniform Methods Project Combined Heat and Power Protocol for Uniform Methods Project NREL developed a protocol that provides a ; is consistent with the scope and other protocols developed for the Uniform Methods Project (UMP

  4. W-MAC: A Workload-Aware MAC Protocol for Heterogeneous Convergecast in Wireless Sensor Networks

    PubMed Central

    Xia, Ming; Dong, Yabo; Lu, Dongming

    2011-01-01

    The power consumption and latency of existing MAC protocols for wireless sensor networks (WSNs) are high in heterogeneous convergecast, where each sensor node generates different amounts of data in one convergecast operation. To solve this problem, we present W-MAC, a workload-aware MAC protocol for heterogeneous convergecast in WSNs. A subtree-based iterative cascading scheduling mechanism and a workload-aware time slice allocation mechanism are proposed to minimize the power consumption of nodes, while offering a low data latency. In addition, an efficient schedule adjustment mechanism is provided for adapting to data traffic variation and network topology change. Analytical and simulation results show that the proposed protocol provides a significant energy saving and latency reduction in heterogeneous convergecast, and can effectively support data aggregation to further improve the performance. PMID:22163753

  5. Hanford analytical sample projections FY 1998--FY 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management,more » and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.« less

  6. Systems Biology Approach in Hypertension Research.

    PubMed

    Delles, Christian; Husi, Holger

    2017-01-01

    Systems biology is an approach to study all genes, gene transcripts, proteins, metabolites, and their interactions in specific cells, tissues, organs, or the whole organism. It is based on data derived from high-throughput analytical technologies and bioinformatics tools to analyze these data, and aims to understand the whole system rather than individual aspects of it. Systems biology can be applied to virtually all conditions and diseases and therefore also to hypertension and its underlying vascular disorders. Unlike other methods in this book there is no clear-cut protocol to explain a systems biology approach. We will instead outline some of the most important and common steps in the generation and analysis of systems biology data.

  7. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    PubMed

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and a more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.

  8. The international experience of bacterial screen testing of platelet components with an automated microbial detection system: a need for consensus testing and reporting guidelines.

    PubMed

    Benjamin, Richard J; McDonald, Carl P

    2014-04-01

    The BacT/ALERT microbial detection system (bioMerieux, Inc, Durham, NC) is in routine use in many blood centers as a prerelease test for platelet collections. Published reports document wide variation in practices and outcomes. A systematic review of the English literature was performed to describe publications assessing the use of the BacT/ALERT culture system on platelet collections as a routine screen test of more than 10000 platelet components. Sixteen publications report the use of confirmatory testing to substantiate initial positive culture results but use varying nomenclature to classify the results. Preanalytical and analytical variables that may affect the outcomes differ widely between centers. Incomplete description of protocol details complicates comparison between sites. Initial positive culture results range from 539 to 10606 per million (0.054%-1.061%) and confirmed positive from 127 to 1035 per million (0.013%-0.104%) donations. False-negative results determined by outdate culture range from 662 to 2173 per million (0.066%-0.217%) and by septic reactions from 0 to 66 per million (0%-0.007%) collections. Current culture protocols represent pragmatic compromises between optimizing analytical sensitivity and ensuring the timely availability of platelets for clinical needs. Insights into the effect of protocol variations on outcomes are generally restricted to individual sites that implement limited changes to their protocols over time. Platelet manufacturers should reassess the adequacy of their BacT/ALERT screening protocols in light of the growing international experience and provide detailed documentation of all variables that may affect culture outcomes when reporting results. We propose a framework for a standardized nomenclature for reporting of the results of BacT/ALERT screening. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. A Routine Experimental Protocol for qHNMR Illustrated with Taxol⊥

    PubMed Central

    Pauli, Guido F.; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Quantitative 1H NMR (qHNMR) provides a value-added dimension to the standard spectroscopic data set involved in structure analysis, especially when analyzing bioactive molecules and elucidating new natural products. The qHNMR method can be integrated into any routine qualitative workflow without much additional effort by simply establishing quantitative conditions for the standard solution 1H NMR experiments. Moreover, examination of different chemical lots of taxol and a Taxus brevifolia extract as working examples led to a blueprint for a generic approach to performing a routinely practiced 13C-decoupled qHNMR experiment, and for recognizing its potential and main limitations. The proposed protocol is based on a newly assembled 13C GARP broadband decoupled proton acquisition sequence that reduces spectroscopic complexity by removal of carbon satellites. The method is capable of providing qualitative and quantitative NMR data simultaneously and covers various analytes from pure compounds to complex mixtures such as metabolomes. Due to a routinely achievable dynamic range of 300:1 (0.3%) or better, qHNMR qualifies for applications ranging from reference standards to biologically active compounds to metabolome analysis. Providing a “cookbook” approach to qHNMR, acquisition conditions are described that can be adapted for contemporary NMR spectrometers of all major manufacturers. PMID:17298095

  10. Stochastic gradient ascent outperforms gamers in the Quantum Moves game

    NASA Astrophysics Data System (ADS)

    Sels, Dries

    2018-04-01

    In a recent work on quantum state preparation, Sørensen and co-workers [Nature (London) 532, 210 (2016), 10.1038/nature17620] explore the possibility of using video games to help design quantum control protocols. The authors present a game called "Quantum Moves" (https://www.scienceathome.org/games/quantum-moves/) in which gamers have to move an atom from A to B by means of optical tweezers. They report that, "players succeed where purely numerical optimization fails." Moreover, by harnessing the player strategies, they can "outperform the most prominent established numerical methods." The aim of this Rapid Communication is to analyze the problem in detail and show that those claims are untenable. In fact, without any prior knowledge and starting from a random initial seed, a simple stochastic local optimization method finds near-optimal solutions which outperform all players. Counterdiabatic driving can even be used to generate protocols without resorting to numeric optimization. The analysis results in an accurate analytic estimate of the quantum speed limit which, apart from zero-point motion, is shown to be entirely classical in nature. The latter might explain why gamers are reasonably good at the game. A simple modification of the BringHomeWater challenge is proposed to test this hypothesis.

  11. Gold Nanomaterials in Consumer Cosmetics Nanoproducts: Analyses, Characterization, and Dermal Safety Assessment.

    PubMed

    Cao, Mingjing; Li, Jiayang; Tang, Jinglong; Chen, Chunying; Zhao, Yuliang

    2016-10-01

    Establishment of analytical methods of engineered nanomaterials in consumer products for their human and environmental risk assessment becomes urgent for both academic and industrial needs. Owing to the difficulties and challenges around nanomaterials in complex media, proper chemical separation and biological assays of nanomaterials from nanoproducts needs to be firstly developed. Herein, a facile and rapid method to separate and analyze gold nanomaterials in cosmetics is reported. Gold nanomaterials are successfully separated from different facial or eye creams and their physiochemical properties are analyzed by quantitative and qualitative state-of-the art techniques with high sensitivity or high spatial resolution. In turn, a protocol including quantification of gold by inductively coupled plasma mass spectrometry and thorough characterization of morphology, size distribution, and surface property by electron microscopes, atomic force microscope, and X-ray photoelectron spectroscope is developed. Subsequently, the preliminary toxicity assessment indicates that gold nanomaterials in cosmetic creams have no observable toxicity to human keratinocytes even after 24 h exposure up to a concentration of 200 μg mL -1 . The environmental scanning electron microscope reveals that gold nanomaterials are mostly attached on the cell membrane. Thus, the present study provides a full analysis protocol for toxicity assessment of gold nanomaterials in consumer products (cosmetic creams). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Determination of dissolved bromate in drinking water by ion chromatography and post column reaction: interlaboratory study.

    PubMed

    Cordeiro, Fernando; Robouch, Piotr; de la Calle, Maria Beatriz; Emteborg, Håkan; Charoud-Got, Jean; Schmitz, Franz

    2011-01-01

    A collaborative study, International Evaluation Measurement Programme-25a, was conducted in accordance with international protocols to determine the performance characteristics of an analytical method for the determination of dissolved bromate in drinking water. The method should fulfill the analytical requirements of Council Directive 98/83/EC (referred to in this work as the Drinking Water Directive; DWD). The new draft standard method under investigation is based on ion chromatography followed by post-column reaction and UV detection. The collaborating laboratories used the Draft International Organization for Standardization (ISO)/Draft International Standard (DIS) 11206 document. The existing standard method (ISO 15061:2001) is based on ion chromatography using suppressed conductivity detection, in which a preconcentration step may be required for the determination of bromate concentrations as low as 3 to 5 microg/L. The new method includes a dilution step that reduces the matrix effects, thus allowing the determination of bromate concentrations down to 0.5 microg/L. Furthermore, the method aims to minimize any potential interference of chlorite ions. The collaborative study investigated different types of drinking water, such as soft, hard, and mineral water. Other types of water, such as raw water (untreated), swimming pool water, a blank (named river water), and a bromate standard solution, were included as test samples. All test matrixes except the swimming pool water were spiked with high-purity potassium bromate to obtain bromate concentrations ranging from 1.67 to 10.0 microg/L. Swimming pool water was not spiked, as this water was incurred with bromate. Test samples were dispatched to 17 laboratories from nine different countries. Sixteen participants reported results. The repeatability RSD (RSD(r)) ranged from 1.2 to 4.1%, while the reproducibility RSD (RSDR) ranged from 2.3 to 5.9%. These precision characteristics compare favorably with those of ISO 15601. A thorough comparison of the performance characteristics is presented in this report. All method performance characteristics obtained in the frame of this collaborative study indicate that the draft ISO/DIS 11206 standard method meets the requirements set down by the DWD. It can, therefore, be considered to fit its intended analytical purpose.

  13. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.

  14. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of extra-analytical and analytical process, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compare its data with other laboratories, through external quality control. In this way it has a tool to detect the fulfillment of the objectives set, and in case of errors, allowing corrective actions to be made, and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodical assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, systematic, random and total error at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  15. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of the validation of theses procedures is today widely spread in all the domains of activities where measurements are made. Nevertheless, this simple question of acceptability or not of an analytical procedure for a given application, remains incompletely determined in several cases despite the various regulations relating to the good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the criteria of validation to be tested, but they do not propose any experimental protocol and limit themselves most often to the general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development to apply those regulatory recommendations. If these two first guides widely contributed to the use and progress of analytical validations, they present, nevertheless, weaknesses regarding the conclusions of the performed statistical tests and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to review even the bases of the analytical validation for developing harmonized approach, by distinguishing notably the diagnosis rules and the decision rules. This latter rule is based on the use of the accuracy profile, uses the notion of total error and allows to simplify the approach of the validation of an analytical procedure while checking the associated risk to its usage. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory documents.

  16. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    PubMed

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complimenting methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, J.; Giam, C.S.

    Polynuclear azaarenes in a creosote-pentachlorophenol wood preservative wastewater were analyzed. The total concentration of azaarenes was determined to be 1300 mg kg/sup -1/. Potential adverse effects of these compounds on environmental quality and health suggest a need to develop analytical protocols for measuing azaarenes in hazardous wastes.

  18. Demonstrating Patterns in the Views Of Stakeholders Regarding Ethically-Salient Issues in Clinical Research: A Novel Use of Graphical Models in Empirical Ethics Inquiry.

    PubMed

    Kim, Jane Paik; Roberts, Laura Weiss

    Empirical ethics inquiry works from the notion that stakeholder perspectives are necessary for gauging the ethical acceptability of human studies and assuring that research aligns with societal expectations. Although common, studies involving different populations often entail comparisons of trends that problematize the interpretation of results. Using graphical model selection - a technique aimed at transcending limitations of conventional methods - this report presents data on the ethics of clinical research with two objectives: (1) to display the patterns of views held by ill and healthy individuals in clinical research as a test of the study's original hypothesis and (2) to introduce graphical model selection as a key analytic tool for ethics research. In this IRB-approved, NIH-funded project, data were collected from 60 mentally ill and 43 physically ill clinical research protocol volunteers, 47 healthy protocol-consented participants, and 29 healthy individuals without research protocol experience. Respondents were queried on the ethical acceptability of research involving people with mental and physical illness (i.e., cancer, HIV, depression, schizophrenia, and post-traumatic stress disorder) and non-illness related sources of vulnerability (e.g., age, class, gender, ethnicity). Using a statistical algorithm, we selected graphical models to display interrelationships among responses to questions. Both mentally and physically ill protocol volunteers revealed a high degree of connectivity among ethically-salient perspectives. Healthy participants, irrespective of research protocol experience, revealed patterns of views that were not highly connected. Between ill and healthy protocol participants, the pattern of views is vastly different. Experience with illness was tied to dense connectivity, whereas healthy individuals expressed views with sparse connections. In offering a nuanced perspective on the interrelation of ethically relevant responses, graphical model selection has the potential to bring new insights to the field of ethics.

  19. It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two…).

    PubMed

    Rummel, John D; Kminek, Gerhard

    2018-04-01

    The last time NASA envisioned a sample return mission from Mars, the development of a protocol to support the analysis of the samples in a containment facility resulted in a "Draft Test Protocol" that outlined required preparations "for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth" (Rummel et al., 2002 ). This document comprised a specific protocol to be used to conduct a biohazard test for a returned martian sample, following the recommendations of the Space Studies Board of the US National Academy of Sciences. Given the planned launch of a sample-collecting and sample-caching rover (Mars 2020) in 2 years' time, and with a sample return planned for the end of the next decade, it is time to revisit the Draft Test Protocol to develop a sample analysis and biohazard test plan to meet the needs of these future missions. Key Words: Biohazard detection-Mars sample analysis-Sample receiving facility-Protocol-New analytical techniques-Robotic sample handling. Astrobiology 18, 377-380.

  20. High‐precision determination of lithium and magnesium isotopes utilising single column separation and multi‐collector inductively coupled plasma mass spectrometry

    PubMed Central

    Misra, Sambuddha; Lloyd, Nicholas; Elderfield, Henry; Bickle, Mike J.

    2017-01-01

    Rationale Li and Mg isotopes are increasingly used as a combined tool within the geosciences. However, established methods require separate sample purification protocols utilising several column separation procedures. This study presents a single‐step cation‐exchange method for quantitative separation of trace levels of Li and Mg from multiple sample matrices. Methods The column method utilises the macro‐porous AGMP‐50 resin and a high‐aspect ratio column, allowing quantitative separation of Li and Mg from natural waters, sediments, rocks and carbonate matrices following the same elution protocol. High‐precision isotope determination was conducted by multi‐collector inductively coupled plasma mass spectrometry (MC‐ICPMS) on the Thermo Scientific™ NEPTUNE Plus™ fitted with 1013 Ω amplifiers which allow accurate and precise measurements at ion beams ≤0.51 V. Results Sub‐nanogram Li samples (0.3–0.5 ng) were regularly separated (yielding Mg masses of 1–70 μg) using the presented column method. The total sample consumption during isotopic analysis is <0.5 ng Li and <115 ng Mg with long‐term external 2σ precisions of ±0.39‰ for δ7Li and ±0.07‰ for δ26Mg. The results for geological reference standards and seawater analysed by our method are in excellent agreement with published values despite the order of magnitude lower sample consumption. Conclusions The possibility of eluting small sample masses and the low analytical sample consumption make this method ideal for samples of limited mass or low Li concentration, such as foraminifera, mineral separates or dilute river waters. PMID:29078008

  1. Development of a passive sampler for gaseous mercury

    NASA Astrophysics Data System (ADS)

    Gustin, M. S.; Lyman, S. N.; Kilner, P.; Prestbo, E.

    2011-10-01

    Here we describe work toward development of the components of a cost effective passive sampling system for gaseous Hg that could be broadly deployed by nontechnical staff. The passive sampling system included an external shield to reduce turbulence and exposure to precipitation and dust, a diffusive housing that directly protects the collection surface during deployment and handling, and a collection surface. A protocol for cleaning and deploying the sampler and an analytical method were developed. Our final design consisted of a polycarbonate external shield enclosing a custom diffusive housing made from expanded PTFE tubing. Two collection surfaces were investigated, gold sputter-coated quartz plates and silver wires. Research showed the former would require extensive quality control for use, while the latter had interferences with other atmosphere constituents. Although the gold surface exhibited the best performance over space and time, gradual passivation would limit reuse. For both surfaces lack of contamination during shipping, deployment and storage indicated that the handling protocols developed worked well with nontechnical staff. We suggest that the basis for this passive sampling system is sound, but further exploration and development of a reliable collection surface is needed.

  2. Boundary mediated position control of traveling waves

    NASA Astrophysics Data System (ADS)

    Martens, Steffen; Ziepke, Alexander; Engel, Harald

    Reaction control is an essential task in biological systems and chemical process industry. Often, the excitable medium supporting wave propagation exhibits an irregular shape and/or is limited in size. In particular, the analytic treatment of wave phenomena is notoriously difficult due to the spatial modulation of the domain's. Recently, we have provided a first systematic treatment by applying asymptotic perturbation analysis leading to an approximate description that involves a reduction of dimensionality; the 3D RD equation with spatially dependent NFBCs on the reactants reduces to a 1D reaction-diffusion-advection equation. Here, we present a novel method to control the position ϕ (t) of traveling waves in modulated domains according to a prespecified protocol of motion. Given this protocol, the ``optimal'' geometry of reactive domains Q (x) is found as the solution of the perturbatively derived equation of motion. Noteworthy, such a boundary control can be expressed in terms of the uncontrolled wave profile and its propagation velocity, rendering detailed knowledge of the reaction kinetics unnecessary. German Science Foundation DFG through the SFB 910 ''Control of Self-Organizing Nonlinear Systems''.

  3. Laser direct-write for fabrication of three-dimensional paper-based devices.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2016-08-16

    We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.

  4. Detecting very low allele fraction variants using targeted DNA sequencing and a novel molecular barcode-aware variant caller.

    PubMed

    Xu, Chang; Nezami Ranjbar, Mohammad R; Wu, Zhong; DiCarlo, John; Wang, Yexun

    2017-01-03

    Detection of DNA mutations at very low allele fractions with high accuracy will significantly improve the effectiveness of precision medicine for cancer patients. To achieve this goal through next generation sequencing, researchers need a detection method that 1) captures rare mutation-containing DNA fragments efficiently in the mix of abundant wild-type DNA; 2) sequences the DNA library extensively to deep coverage; and 3) distinguishes low level true variants from amplification and sequencing errors with high accuracy. Targeted enrichment using PCR primers provides researchers with a convenient way to achieve deep sequencing for a small, yet most relevant region using benchtop sequencers. Molecular barcoding (or indexing) provides a unique solution for reducing sequencing artifacts analytically. Although different molecular barcoding schemes have been reported in recent literature, most variant calling has been done on limited targets, using simple custom scripts. The analytical performance of barcode-aware variant calling can be significantly improved by incorporating advanced statistical models. We present here a highly efficient, simple and scalable enrichment protocol that integrates molecular barcodes in multiplex PCR amplification. In addition, we developed smCounter, an open source, generic, barcode-aware variant caller based on a Bayesian probabilistic model. smCounter was optimized and benchmarked on two independent read sets with SNVs and indels at 5 and 1% allele fractions. Variants were called with very good sensitivity and specificity within coding regions. We demonstrated that we can accurately detect somatic mutations with allele fractions as low as 1% in coding regions using our enrichment protocol and variant caller.

  5. Non-porous membrane-assisted liquid-liquid extraction of UV filter compounds from water samples.

    PubMed

    Rodil, Rosario; Schrader, Steffi; Moeder, Monika

    2009-06-12

    A method for the determination of nine UV filter compounds [benzophenone-3 (BP-3), isoamyl methoxycinnamate, 4-methylbenzylidene camphor, octocrylene (OC), butyl methoxydibenzoylmethane, ethylhexyl dimethyl p-aminobenzoate (OD-PABA), ethylhexyl methoxycinnamate (EHMC), ethylhexyl salicylate and homosalate] in water samples was developed and evaluated. The procedure includes non-porous membrane-assisted liquid-liquid extraction (MALLE) and LC-atmospheric pressure photoionization (APPI)-MS/MS. Membrane bags made of different polymeric materials were examined to enable a fast and simple extraction of the target analytes. Among the polymeric materials tested, low- and high-density polyethylene membranes proved to be well suited to adsorb the analytes from water samples. Finally, 2 cm length tailor-made membrane bags were prepared from low-density polyethylene in order to accommodate 100 microL of propanol. The fully optimised protocol provides recoveries from 76% to 101% and limits of detection (LOD) between 0.4 ng L(-1) (OD-PABA) and 16 ng L(-1) (EHMC). The interday repeatability of the whole protocol was below 18%. The effective separation of matrix molecules was proved by only marginal matrix influence during the APPI-MS analysis since no ion suppression effects were observed. During the extraction step, the influence of the matrix was only significant when non-treated wastewater was analysed. The analysis of lake water indicated the presence of seven UV filter compounds included in this study at concentrations between 40 ng L(-1) (BP-3) and 4381 ng L(-1) (OC). In non-treated wastewater several UV filters were also detected at concentration levels as high as 5322 ng L(-1) (OC).

  6. Energy Efficient Medium Access Control Protocol for Clustered Wireless Sensor Networks with Adaptive Cross-Layer Scheduling.

    PubMed

    Sefuba, Maria; Walingo, Tom; Takawira, Fambirai

    2015-09-18

    This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs an adaptive cross-layer intra-cluster scheduling and an inter-cluster relay selection diversity. The scheduling is based on available data packets and remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit as well as reducing control packet overhead. The relay selection diversity is carried out between clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy and the channel quality indicator (CQI) for the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay for the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated based on transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol provides improved performance than traditional cluster based MAC protocols.

  7. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    NASA Astrophysics Data System (ADS)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    Protocol stack is the basis of the networked control systems (NCS). Full or partial reconfiguration of protocol stack offers both optimised communication service and system performance. Nowadays, field testing is unrealistic to determine the performance of reconfigurable protocol stack; and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been a common practice. Nevertheless, such a layered modelling analysis framework of protocol stack leads to the lack of global optimisation for protocol reconfiguration. In this article, we proposed a general modelling analysis framework for NCS based on the cross-layer concept, which is to establish an efficiency system scheduling model through abstracting the time constraint, the task interrelation, the processor and the bus sub-models from upper and lower layers (application, data link and physical layer). Cross-layer design can help to overcome the inadequacy of global optimisation based on information sharing between protocol layers. To illustrate the framework, we take controller area network (CAN) as a case study. The simulation results of deterministic and stochastic Petri-net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  8. Energy Efficient Medium Access Control Protocol for Clustered Wireless Sensor Networks with Adaptive Cross-Layer Scheduling

    PubMed Central

    Sefuba, Maria; Walingo, Tom; Takawira, Fambirai

    2015-01-01

    This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs an adaptive cross-layer intra-cluster scheduling and an inter-cluster relay selection diversity. The scheduling is based on available data packets and remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit as well as reducing control packet overhead. The relay selection diversity is carried out between clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy and the channel quality indicator (CQI) for the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay for the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated based on transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol provides improved performance than traditional cluster based MAC protocols. PMID:26393608

  9. A Convenient Method for Extraction and Analysis with High-Pressure Liquid Chromatography of Catecholamine Neurotransmitters and Their Metabolites.

    PubMed

    Xie, Li; Chen, Liqin; Gu, Pan; Wei, Lanlan; Kang, Xuejun

    2018-03-01

    The extraction and analysis of catecholamine neurotransmitters in biological fluids is of great importance in assessing nervous system function and related diseases, but their precise measurement is still a challenge. Many protocols have been described for neurotransmitter measurement by a variety of instruments, including high-pressure liquid chromatography (HPLC). However, there are shortcomings, such as complicated operation or hard-to-detect multiple targets, which cannot be avoided, and presently, the dominant analysis technique is still HPLC coupled with sensitive electrochemical or fluorimetric detection, due to its high sensitivity and good selectivity. Here, a detailed protocol is described for the pretreatment and detection of catecholamines with high pressure liquid chromatography with electrochemical detection (HPLC-ECD) in real urine samples of infants, using electrospun composite nanofibers composed of polymeric crown ether with polystyrene as adsorbent, also known as the packed-fiber solid phase extraction (PFSPE) method. We show how urine samples can be easily precleaned by a nanofiber-packed solid phase column, and how the analytes in the sample can be rapidly enriched, desorbed, and detected on an ECD system. PFSPE greatly simplifies the pretreatment procedures for biological samples, allowing for decreased time, expense, and reduction of the loss of targets. Overall, this work illustrates a simple and convenient protocol for solid-phase extraction coupled to an HPLC-ECD system for simultaneous determination of three monoamine neurotransmitters (norepinephrine (NE), epinephrine (E), dopamine (DA)) and two of their metabolites (3-methoxy-4-hydroxyphenylglycol (MHPG) and 3,4-dihydroxy-phenylacetic acid (DOPAC)) in infants' urine. The established protocol was applied to assess the differences of urinary catecholamines and their metabolites between high-risk infants with perinatal brain damage and healthy controls. Comparative analysis revealed a significant difference in urinary MHPG between the two groups, indicating that the catecholamine metabolites may be an important candidate marker for early diagnosis of cases at risk for brain damage in infants.

  10. Determination of free sulphydryl groups in wheat gluten under the influence of different time and temperature of incubation: method validation.

    PubMed

    Rakita, Slađana; Pojić, Milica; Tomić, Jelena; Torbica, Aleksandra

    2014-05-01

    The aim of the present study was to determine the characteristics of an analytical method for determination of free sulphydryl (SH) groups of wheat gluten performed with previous gluten incubation for variable times (45, 90 and 135min) at variable temperatures (30 and 37°C), in order to determine its fitness-for-purpose. It was observed that the increase in temperature and gluten incubation time caused the increase in the amount of free SH groups, with more dynamic changes at 37°C. The method characteristics identified as relevant were: linearity, limit of detection, limit of quantification, precision (repeatability and reproducibility) and measurement uncertainty, which were checked within the validation protocol, while the method performance was monitored by X- and R-control charts. Identified method characteristics demonstrated its acceptable fitness-for-purpose, when assay included previous gluten incubation at 30°C. Although the method repeatability at 37°C was acceptable, the corresponding reproducibility did not meet the performance criterion on the basis of HORRAT value (HORRAT<2). Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Development and validation of a multiresidue method for the analysis of polybrominated diphenyl ethers, new brominated and organophosphorus flame retardants in sediment, sludge and dust.

    PubMed

    Cristale, Joyce; Lacorte, Silvia

    2013-08-30

    This study presents a multiresidue method for simultaneous extraction, clean-up and analysis of priority and emerging flame retardants in sediment, sewage sludge and dust. Studied compounds included eight polybrominated diphenyl ethers congeners, nine new brominated flame retardants and ten organophosphorus flame retardants. The analytical method was based on ultrasound-assisted extraction with ethyl acetate/cyclohexane (5:2, v/v), clean-up with Florisil cartridges and analysis by gas chromatography coupled to tandem mass spectrometry (GC-EI-MS/MS). Method development and validation protocol included spiked samples, certified reference material (for dust), and participation in an interlaboratory calibration. The method proved to be efficient and robust for extraction and determination of three families of flame retardants families in the studied solid matrices. The method was applied to river sediment, sewage sludge and dust samples, and allowed detection of 24 among the 27 studied flame retardants. Organophosphate esters, BDE-209 and decabromodiphenyl ethane were the most ubiquitous contaminants detected. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry.

    PubMed

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. By aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragments ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low pico-molar range, with lower limits of quantification reaching from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa. Graphical abstract ᅟ.

  13. High-Precision In Situ 87Sr/86Sr Analyses through Microsampling on Solid Samples: Applications to Earth and Life Sciences

    PubMed Central

    Di Salvo, Sara; Casalini, Martina; Marchionni, Sara; Adani, Teresa; Ulivi, Maurizio; Tommasini, Simone; Avanzinelli, Riccardo; Mazza, Paul P. A.; Francalanci, Lorella

    2018-01-01

    An analytical protocol for high-precision, in situ microscale isotopic investigations is presented here, which combines the use of a high-performing mechanical microsampling device and high-precision TIMS measurements on micro-Sr samples, allowing for excellent results both in accuracy and precision. The present paper is a detailed methodological description of the whole analytical procedure from sampling to elemental purification and Sr-isotope measurements. The method offers the potential to attain isotope data at the microscale on a wide range of solid materials with the use of minimally invasive sampling. In addition, we present three significant case studies for geological and life sciences, as examples of the various applications of microscale 87Sr/86Sr isotope ratios, concerning (i) the pre-eruptive mechanisms triggering recent eruptions at Nisyros volcano (Greece), (ii) the dynamics involved with the initial magma ascent during Eyjafjallajökull volcano's (Iceland) 2010 eruption, which are usually related to the precursory signals of the eruption, and (iii) the environmental context of a MIS 3 cave bear, Ursus spelaeus. The studied cases show the robustness of the methods, which can be also be applied in other areas, such as cultural heritage, archaeology, petrology, and forensic sciences. PMID:29850369

  14. A protocol for a systematic review to identify allergenic tree nuts and the molecules responsible for their allergenic properties.

    PubMed

    Javed, Bushra; Padfield, Philip; Sperrin, Matthew; Simpson, Angela; Mills, E N Clare

    2017-08-01

    Food regulations require that tree nuts and derived ingredients are included on food labels in order to help individuals with IgE-mediated allergies to avoid them. However, there is no consensus regarding which tree nut species should be included in this definition and specified on food labels. Allergen detection methods used for monitoring foods target allergen molecules, but it not clear which are the most relevant molecules to choose. A modified population-exposure-comparators-outcome (PECO) approach has been developed to systematically review the evidence regarding (1) which allergenic tree nuts should be included in food allergen labelling lists and (2) which are the clinically relevant allergens which should be used as analytical targets. A search strategy and criteria against which the evidence will be evaluated have been developed. The resulting evidence will be used to rank tree nuts with regards their ability to cause IgE-mediated allergies, and allergen molecules regarding their capacity to elicit an allergic reaction. The results of the systematic review will enable risk assessors and managers to identify tree nut species that should be included in food allergen labelling lists and ensure analytical methods for determination of allergens in foods are targeting appropriate molecules. Copyright © 2017. Published by Elsevier Ltd.

  15. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. By aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragments ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low pico-molar range, with lower limits of quantification reaching from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa. [Figure not available: see fulltext.

  16. Quantitative mass spectrometry of unconventional human biological matrices

    NASA Astrophysics Data System (ADS)

    Dutkiewicz, Ewelina P.; Urban, Pawel L.

    2016-10-01

    The development of sensitive and versatile mass spectrometric methodology has fuelled interest in the analysis of metabolites and drugs in unconventional biological specimens. Here, we discuss the analysis of eight human matrices-hair, nail, breath, saliva, tears, meibum, nasal mucus and skin excretions (including sweat)-by mass spectrometry (MS). The use of such specimens brings a number of advantages, the most important being non-invasive sampling, the limited risk of adulteration and the ability to obtain information that complements blood and urine tests. The most often studied matrices are hair, breath and saliva. This review primarily focuses on endogenous (e.g. potential biomarkers, hormones) and exogenous (e.g. drugs, environmental contaminants) small molecules. The majority of analytical methods used chromatographic separation prior to MS; however, such a hyphenated methodology greatly limits analytical throughput. On the other hand, the mass spectrometric methods that exclude chromatographic separation are fast but suffer from matrix interferences. To enable development of quantitative assays for unconventional matrices, it is desirable to standardize the protocols for the analysis of each specimen and create appropriate certified reference materials. Overcoming these challenges will make analysis of unconventional human biological matrices more common in a clinical setting. This article is part of the themed issue 'Quantitative mass spectrometry'.

  17. Assessment of sample preservation techniques for pharmaceuticals, personal care products, and steroids in surface and drinking water.

    PubMed

    Vanderford, Brett J; Mawhinney, Douglas B; Trenholm, Rebecca A; Zeigler-Holady, Janie C; Snyder, Shane A

    2011-02-01

    Proper collection and preservation techniques are necessary to ensure sample integrity and maintain the stability of analytes until analysis. Data from improperly collected and preserved samples could lead to faulty conclusions and misinterpretation of the occurrence and fate of the compounds being studied. Because contaminants of emerging concern, such as pharmaceuticals and personal care products (PPCPs) and steroids, generally occur in surface and drinking water at ng/L levels, these compounds in particular require such protocols to accurately assess their concentrations. In this study, sample bottle types, residual oxidant quenching agents, preservation agents, and hold times were assessed for 21 PPCPs and steroids in surface water and finished drinking water. Amber glass bottles were found to have the least effect on target analyte concentrations, while high-density polyethylene bottles had the most impact. Ascorbic acid, sodium thiosulfate, and sodium sulfite were determined to be acceptable quenching agents and preservation with sodium azide at 4 °C led to the stability of the most target compounds. A combination of amber glass bottles, ascorbic acid, and sodium azide preserved analyte concentrations for 28 days in the tested matrices when held at 4 °C. Samples without a preservation agent were determined to be stable for all but two of the analytes when stored in amber glass bottles at 4 °C for 72 h. Results suggest that if improper protocols are utilized, reported concentrations of target PPCPs and steroids may be inaccurate.

  18. Uptake of a web-based oncology protocol system: how do cancer clinicians use eviQ cancer treatments online?

    PubMed Central

    2013-01-01

    Background The use of computerized systems to support evidence-based practice is commonplace in contemporary medicine. Despite the prolific use of electronic support systems there has been relatively little research on the uptake of web-based systems in the oncology setting. Our objective was to examine the uptake of a web-based oncology protocol system (http://www.eviq.org.au) by Australian cancer clinicians. Methods We used web-logfiles and Google Analytics to examine the characteristics of eviQ registrants from October 2009-December 2011 and patterns of use by cancer clinicians during a typical month. Results As of December 2011, there were 16,037 registrants; 85% of whom were Australian health care professionals. During a typical month 87% of webhits occurred in standard clinical hours (08:00 to 18:00 weekdays). Raw webhits were proportional to the size of clinician groups: nurses (47% of Australian registrants), followed by doctors (20%), and pharmacists (14%). However, pharmacists had up to three times the webhit rate of other clinical groups. Clinicians spent five times longer viewing chemotherapy protocol pages than other content and the protocols viewed reflect the most common cancers: lung, breast and colorectal. Conclusions Our results demonstrate eviQ is used by a range of health professionals involved in cancer treatment at the point-of-care. Continued monitoring of electronic decision support systems is vital to understanding how they are used in clinical practice and their impact on processes of care and patient outcomes. PMID:23497080

  19. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research

    PubMed Central

    Singh, Sonal

    2013-01-01

    Background: Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes.  Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation.  Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences. PMID:24555077

  20. Glycosylated hemoglobin testing in the National Social Life, Health, and Aging Project.

    PubMed

    Gregg, Forest T; O'Doherty, Katie; Schumm, L Philip; McClintock, Martha K; Huang, Elbert S

    2014-11-01

    Longitudinal biomeasures of health are still new in nationally representative social science survey research. Data measuring blood sugar control provide opportunities for understanding the development of diabetes and its complications in older adults, but researchers must be aware that some of the differences across time can be due to variations in measurement procedures. This is a well-recognized issue whenever all samples cannot be assayed at the same time and we sought to present the analytic methods to quantify and adjust for the variation. We collected and analyzed HbA1C, glycated hemoglobin, a biomeasure of average blood sugar concentrations within the past few months. Improvements were made in the collection protocol for Wave 2, and assays were performed by a different lab. The HbA1C data obtained during Wave 1 and Wave 2 are consistent with the expected population distributions for differences by gender, age, race/ethnicity, and diabetes status. Age-adjusted mean HbA1C declined slightly from Wave 1 to Wave 2 by -0.19 (95% confidence interval [CI]: -0.27, -0.10), and the average longitudinal change was -0.12 (95% CI: -0.18, -0.06). Collection of HbA1C in Wave 2 permits researchers to examine the relationship between HbA1C and new health and social measures added in Wave 2, and to identify factors related to the change in HbA1C. Changes in collection protocol and labs between waves may have yielded small systematic differences that require analysts to carefully interpret absolute HbA1C values. We recommend analytic methods for cross wave differences in HbA1C and steps to ensure cross wave comparability in future studies. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Optimization of permanent breast seed implant dosimetry incorporating tissue heterogeneity

    NASA Astrophysics Data System (ADS)

    Mashouf, Shahram

    Seed brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose around brachytherapy sources is based on the AAPM TG43 formalism, which generates the dose in homogeneous water medium. Recently, AAPM task group no. 186 (TG186) emphasized the importance of accounting for heterogeneities. In this work we introduce an analytical dose calculation algorithm in heterogeneous media using CT images. The advantages over other methods are computational efficiency and the ease of integration into clinical use. An Inhomogeneity Correction Factor (ICF) is introduced as the ratio of absorbed dose in tissue to that in water medium. ICF is a function of tissue properties and independent of the source structure. The ICF is extracted using CT images and the absorbed dose in tissue can then be calculated by multiplying the dose as calculated by the TG43 formalism times ICF. To evaluate the methodology, we compared our results with Monte Carlo simulations as well as experiments in phantoms with known density and atomic compositions. The dose distributions obtained through applying ICF to TG43 protocol agreed very well with those of Monte Carlo simulations and experiments in all phantoms. In all cases, the mean relative error was reduced by at least a factor of two when ICF correction factor was applied to the TG43 protocol. In conclusion we have developed a new analytical dose calculation method, which enables personalized dose calculations in heterogeneous media using CT images. The methodology offers several advantages including the use of standard TG43 formalism, fast calculation time and extraction of the ICF parameters directly from Hounsfield Units. The methodology was implemented into our clinical treatment planning system where a cohort of 140 patients were processed to study the clinical benefits of a heterogeneity corrected dose.

  2. Multicentric Comparative Analytical Performance Study for Molecular Detection of Low Amounts of Toxoplasma gondii from Simulated Specimens▿ †

    PubMed Central

    Sterkers, Yvon; Varlet-Marie, Emmanuelle; Cassaing, Sophie; Brenier-Pinchart, Marie-Pierre; Brun, Sophie; Dalle, Frédéric; Delhaes, Laurence; Filisetti, Denis; Pelloux, Hervé; Yera, Hélène; Bastien, Patrick

    2010-01-01

    Although screening for maternal toxoplasmic seroconversion during pregnancy is based on immunodiagnostic assays, the diagnosis of clinically relevant toxoplasmosis greatly relies upon molecular methods. A problem is that this molecular diagnosis is subject to variation of performances, mainly due to a large diversity of PCR methods and primers and the lack of standardization. The present multicentric prospective study, involving eight laboratories proficient in the molecular prenatal diagnosis of toxoplasmosis, was a first step toward the harmonization of this diagnosis among university hospitals in France. Its aim was to compare the analytical performances of different PCR protocols used for Toxoplasma detection. Each center extracted the same concentrated Toxoplasma gondii suspension and tested serial dilutions of the DNA using its own assays. Differences in analytical sensitivities were observed between assays, particularly at low parasite concentrations (≤2 T. gondii genomes per reaction tube), with “performance scores” differing by a 20-fold factor among laboratories. Our data stress the fact that differences do exist in the performances of molecular assays in spite of expertise in the matter; we propose that laboratories work toward a detection threshold defined for a best sensitivity of this diagnosis. Moreover, on the one hand, intralaboratory comparisons confirmed previous studies showing that rep529 is a more adequate DNA target for this diagnosis than the widely used B1 gene. But, on the other hand, interlaboratory comparisons showed differences that appear independent of the target, primers, or technology and that hence rely essentially on proficiency and care in the optimization of PCR conditions. PMID:20610670

  3. A large-scale superhydrophobic surface-enhanced Raman scattering (SERS) platform fabricated via capillary force lithography and assembly of Ag nanocubes for ultratrace molecular sensing.

    PubMed

    Tan, Joel Ming Rui; Ruan, Justina Jiexin; Lee, Hiang Kwee; Phang, In Yee; Ling, Xing Yi

    2014-12-28

    An analytical platform with an ultratrace detection limit in the atto-molar (aM) concentration range is vital for forensic, industrial and environmental sectors that handle scarce/highly toxic samples. Superhydrophobic surface-enhanced Raman scattering (SERS) platforms serve as ideal platforms to enhance detection sensitivity by reducing the random spreading of aqueous solution. However, the fabrication of superhydrophobic SERS platforms is generally limited due to the use of sophisticated and expensive protocols and/or suffers structural and signal inconsistency. Herein, we demonstrate a high-throughput fabrication of a stable and uniform superhydrophobic SERS platform for ultratrace molecular sensing. Large-area box-like micropatterns of the polymeric surface are first fabricated using capillary force lithography (CFL). Subsequently, plasmonic properties are incorporated into the patterned surfaces by decorating with Ag nanocubes using the Langmuir-Schaefer technique. To create a stable superhydrophobic SERS platform, an additional 25 nm Ag film is coated over the Ag nanocube-decorated patterned template followed by chemical functionalization with perfluorodecanethiol. Our resulting superhydrophobic SERS platform demonstrates excellent water-repellency with a static contact angle of 165° ± 9° and a consequent analyte concentration factor of 59-fold, as compared to its hydrophilic counterpart. By combining the analyte concentration effect of superhydrophobic surfaces with the intense electromagnetic "hot spots" of Ag nanocubes, our superhydrophobic SERS platform achieves an ultra-low detection limit of 10(-17) M (10 aM) for rhodamine 6G using just 4 μL of analyte solutions, corresponding to an analytical SERS enhancement factor of 10(13). Our fabrication protocol demonstrates a simple, cost- and time-effective approach for the large-scale fabrication of a superhydrophobic SERS platform for ultratrace molecular detection.

  4. Considerations in detecting CDC select agents under field conditions

    NASA Astrophysics Data System (ADS)

    Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul

    2008-04-01

    Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols with low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matricies are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10 3 CFU/ml, Bacillus anthracis spores at 10 3 CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10 5 PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.

  5. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  6. Determination of perfluorinated compounds in mollusks by matrix solid-phase dispersion and liquid chromatography-tandem mass spectrometry.

    PubMed

    Villaverde-de-Sáa, Eugenia; Quintana, José Benito; Rodil, Rosario; Ferrero-Refojos, Raúl; Rubí, Elisa; Cela, Rafael

    2012-01-01

    Perfluorinated compounds (PFCs) have been used for over 40 years in different commercial and industrial applications mainly as surfactants and surface protectors and have become an important class of marine emerging pollutants. This study presents the development and validation of a new analytical method to determine the simultaneous presence of eight PFCs in different kinds of mollusks using matrix solid-phase dispersion (MSPD) followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simplicity of the analytical procedure, low volume of solvent and quantity of sample required, low global price, and integration of extraction and clean-up into a single step, are the most important advantages of the developed methodology. Solvent, solid support (dispersing agent), clean-up sorbent, and their amounts were optimized by means of an experimental design. In the final method, 0.5 g of sample are dispersed with 0.2 g of diatomaceous earth and transferred into a polypropylene syringe containing 4 g of silica as clean-up sorbent. Then, analytes are eluted with 20 mL of acetonitrile. The extract is finally concentrated to a final volume of 0.5 mL in methanol, avoiding extract dryness in order to prevent evaporation losses and injected in the LC-MS/MS. The combination of this MSPD protocol with LC-MS/MS afforded detection limits from 0.05 to 0.3 ng g(-1). Also, a good linearity was established for the eight PFCs in the range from limit of quantification (LOQ) to 500 ng mL(-1) with R(2) > 0.9917. The recovery of the method was studied with three types of spiked mollusk and was in the 64-126% range. Moreover, a mussel sample was spiked and aged for more than 1 month and analyzed by the developed method and a reference method, ion-pair extraction, for comparison, producing both methods statistically equal concentration values. The method was finally applied to the determination of PFCs in different kinds of mollusks revealing concentrations up to 8.3 ng g(-1) for perfluoroundecanoic acid.

  7. An Extrapolation of a Radical Equation More Accurately Predicts Shelf Life of Frozen Biological Matrices.

    PubMed

    De Vore, Karl W; Fatahi, Nadia M; Sass, John E

    2016-08-01

    Arrhenius modeling of analyte recovery at increased temperatures to predict long-term colder storage stability of biological raw materials, reagents, calibrators, and controls is standard practice in the diagnostics industry. Predicting subzero temperature stability using the same practice is frequently criticized but nevertheless heavily relied upon. We compared the ability to predict analyte recovery during frozen storage using 3 separate strategies: traditional accelerated studies with Arrhenius modeling, and extrapolation of recovery at 20% of shelf life using either ordinary least squares or a radical equation y = B1x(0.5) + B0. Computer simulations were performed to establish equivalence of statistical power to discern the expected changes during frozen storage or accelerated stress. This was followed by actual predictive and follow-up confirmatory testing of 12 chemistry and immunoassay analytes. Linear extrapolations tended to be the most conservative in the predicted percent recovery, reducing customer and patient risk. However, the majority of analytes followed a rate of change that slowed over time, which was fit best to a radical equation of the form y = B1x(0.5) + B0. Other evidence strongly suggested that the slowing of the rate was not due to higher-order kinetics, but to changes in the matrix during storage. Predicting shelf life of frozen products through extrapolation of early initial real-time storage analyte recovery should be considered the most accurate method. Although in this study the time required for a prediction was longer than a typical accelerated testing protocol, there are less potential sources of error, reduced costs, and a lower expenditure of resources. © 2016 American Association for Clinical Chemistry.

  8. Nicotine Metabolite Ratio (3-hydroxycotinine/cotinine) in Plasma and Urine by Different Analytical Methods and Laboratories: Implications for Clinical Implementation

    PubMed Central

    Tanner, Julie-Anne; Novalen, Maria; Jatlow, Peter; Huestis, Marilyn A.; Murphy, Sharon E.; Kaprio, Jaakko; Kankaanpää, Aino; Galanti, Laurence; Stefan, Cristiana; George, Tony P.; Benowitz, Neal L.; Lerman, Caryn; Tyndale, Rachel F.

    2015-01-01

    Background The highly genetically variable enzyme CYP2A6 metabolizes nicotine to cotinine (COT) and COT to trans-3′-hydroxycotinine (3HC). The nicotine metabolite ratio (NMR, 3HC/COT) is commonly used as a biomarker of CYP2A6 enzymatic activity, rate of nicotine metabolism, and total nicotine clearance; NMR is associated with numerous smoking phenotypes, including smoking cessation. Our objective was to investigate the impact of different measurement methods, at different sites, on plasma and urinary NMR measures from ad libitum smokers. Methods Plasma (n=35) and urine (n=35) samples were sent to eight different laboratories, which employed similar and different methods of COT and 3HC measurements to derive the NMR. We used Bland-Altman analysis to assess agreement, and Pearson correlations to evaluate associations, between NMR measured by different methods. Results Measures of plasma NMR were in strong agreement between methods according to Bland-Altman analysis (ratios 0.82–1.16) and were highly correlated (all Pearson r>0.96, P<0.0001). Measures of urinary NMR were in relatively weaker agreement (ratios 0.62–1.71) and less strongly correlated (Pearson r values of 0.66–0.98, P<0.0001) between different methods. Plasma and urinary COT and 3HC concentrations, while weaker than NMR, also showed good agreement in plasma, which was better than in urine, as was observed for NMR. Conclusions Plasma is a very reliable biological source for the determination of NMR, robust to differences in these analytical protocols or assessment site. Impact Together this indicates a reduced need for differential interpretation of plasma NMR results based on the approach used, allowing for direct comparison of different studies. PMID:26014804

  9. A review of designer anabolic steroids in equine sports.

    PubMed

    Waller, Christopher C; McLeod, Malcolm D

    2017-09-01

    In recent years, the potential for anabolic steroid abuse in equine sports has increased due to the growing availability of designer steroids. These compounds are readily accessible online in 'dietary' or 'nutritional' supplements and contain steroidal compounds which have never been tested or approved as veterinary agents. They typically have unusual structures or substitution and as a result may pass undetected through current anti-doping screening protocols, making them a significant concern for the integrity of the industry. Despite considerable focus in human sports, until recently there has been limited investigation into these compounds in equine systems. To effectively respond to the threat of designer steroids, a detailed understanding of their metabolism is needed to identify markers and metabolites arising from their misuse. A summary of the literature detailing the metabolism of these compounds in equine systems is presented with an aim to identify metabolites suitable for incorporation into screening protocols by anti-doping laboratories. The future of equine anti-doping research is likely to be guided by the incorporation of alternate testing matrices into routine screening, the improvement of in vitro technologies that can mimic in vivo equine metabolism, and the improvement of instrumentation or analytical methods that allow for the development of untargeted screening, and metabolomics approaches for use in anti-doping screening protocols. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Automation of dimethylation after guanidination labeling chemistry and its compatibility with common buffers and surfactants for mass spectrometry-based shotgun quantitative proteome analysis.

    PubMed

    Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang

    2013-07-25

    Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase the labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol to handle biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained in all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation on the peptide amines. This work illustrates the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Preoperative vestibular assessment protocol of cochlear implant surgery: an analytical descriptive study.

    PubMed

    Bittar, Roseli Saraiva Moreira; Sato, Eduardo Setsuo; Ribeiro, Douglas Jósimo Silva; Tsuji, Robinson Koji

    Cochlear implants are undeniably an effective method for the recovery of hearing function in patients with hearing loss. To describe the preoperative vestibular assessment protocol in subjects who will be submitted to cochlear implants. Our institutional protocol provides the vestibular diagnosis through six simple tests: Romberg and Fukuda tests, assessment for spontaneous nystagmus, Head Impulse Test, evaluation for Head Shaking Nystagmus and caloric test. 21 patients were evaluated with a mean age of 42.75±14.38 years. Only 28% of the sample had all normal test results. The presence of asymmetric vestibular information was documented through the caloric test in 32% of the sample and spontaneous nystagmus was an important clue for the diagnosis. Bilateral vestibular areflexia was present in four subjects, unilateral arreflexia in three and bilateral hyporeflexia in two. The Head Impulse Test was a significant indicator for the diagnosis of areflexia in the tested ear (p=0.0001). The sensitized Romberg test using a foam pad was able to diagnose severe vestibular function impairment (p=0.003). The six clinical tests were able to identify the presence or absence of vestibular function and function asymmetry between the ears of the same individual. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  12. Effect of modulation of the particle size distributions in the direct solid analysis by total-reflection X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Fernández-Ruiz, Ramón; Friedrich K., E. Josue; Redrejo, M. J.

    2018-02-01

    The main goal of this work was to investigate, in a systematic way, the influence of the controlled modulation of the particle size distribution of a representative solid sample with respect to the more relevant analytical parameters of the Direct Solid Analysis (DSA) by Total-reflection X-Ray Fluorescence (TXRF) quantitative method. In particular, accuracy, uncertainty, linearity and detection limits were correlated with the main parameters of their size distributions for the following elements; Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb. In all cases strong correlations were finded. The main conclusion of this work can be resumed as follows; the modulation of particles shape to lower average sizes next to a minimization of the width of particle size distributions, produce a strong increment of accuracy, minimization of uncertainties and limit of detections for DSA-TXRF methodology. These achievements allow the future use of the DSA-TXRF analytical methodology for development of ISO norms and standardized protocols for the direct analysis of solids by mean of TXRF.

  13. Molecular imaging of cannabis leaf tissue with MeV-SIMS method

    NASA Astrophysics Data System (ADS)

    Jenčič, Boštjan; Jeromel, Luka; Ogrinc Potočnik, Nina; Vogel-Mikuš, Katarina; Kovačec, Eva; Regvar, Marjana; Siketić, Zdravko; Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož

    2016-03-01

    To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear Time-Of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and damage cross-section of reference materials, without significant alteration of the fragile biological samples during the duration of measurements in the mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices were prepared by standard shock-freezing and freeze-drying protocol and deposited on the Si wafer. We show the measured MeV-SIMS spectra showing a series of peaks in the mass area of cannabinoids, as well as their corresponding maps. The indicated molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in the areas with opened trichome morphology.

  14. Engineering fluidic delays in paper-based devices using laser direct-writing.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2015-10-21

    We report the use of a new laser-based direct-write technique that allows programmable and timed fluid delivery in channels within a paper substrate which enables implementation of multi-step analytical assays. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depth and/or the porosity of hydrophobic barriers which, when fabricated in the fluid path, produce controllable fluid delay. We have patterned these flow delaying barriers at pre-defined locations in the fluidic channels using either a continuous wave laser at 405 nm, or a pulsed laser operating at 266 nm. Using this delay patterning protocol we generated flow delays spanning from a few minutes to over half an hour. Since the channels and flow delay barriers can be written via a common laser-writing process, this is a distinct improvement over other methods that require specialist operating environments, or custom-designed equipment. This technique can therefore be used for rapid fabrication of paper-based microfluidic devices that can perform single or multistep analytical assays.

  15. 40 CFR 61.356 - Recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... test protocol and the means by which sampling variability and analytical variability were accounted for... also establish the design minimum and average temperature in the combustion zone and the combustion... the design minimum and average temperatures across the catalyst bed inlet and outlet. (C) For a boiler...

  16. HYDROLYSIS OF MTBE IN GROUND WATER SAMPLES PRESERVED WITIH HYDROCHLORIC ACID

    EPA Science Inventory

    Conventional sampling and analytical protocols have poor sensitivity for fuel oxygenates that are alcohols, such as TBA. Because alcohols tend to stay with the water samples, they are not efficiently transferred to the gas chromatograph for separation and analysis. A common tec...

  17. HYDROLYSIS OF MTBE TO TBA IN GROUND WATER SAMPLES WITH HYDROCHLORIC ACID

    EPA Science Inventory

    Conventional sampling and analytical protocols have poor sensitivity for fuel oxygenates that are alcohols, such as tert-butyl alcohol (TBA). Because alcohols are miscible or highly soluble in water, alcohols are not efficiently transferred to the gas chromatograph for analysis....

  18. UTILIZATION OF T-RFLP (TERMINAL RESTRICTION FRAGMENT LENGTH POLYMORPHISM) TO CHARACTERIZE MIXED ECTOMYCORRHIZAL FUNGAL COMMUNITIES

    EPA Science Inventory

    Studies of ectomycorrhizal community structure have used a variety of analytical regimens including sole or partial reliance on gross morphological characterization of colonized root tips. Depending on the rigor of the classification protocol, this technique can incorrectly assig...

  19. Plant gum identification in historic artworks

    PubMed Central

    Granzotto, Clara; Arslanoglu, Julie; Rolando, Christian; Tokarski, Caroline

    2017-01-01

    We describe an integrated and straightforward new analytical protocol that identifies plant gums from various sample sources including cultural heritage. Our approach is based on the identification of saccharidic fingerprints using mass spectrometry following controlled enzymatic hydrolysis. We developed an enzyme cocktail suitable for plant gums of unknown composition. Distinctive MS profiles of gums such as arabic, cherry and locust-bean gums were successfully identified. A wide range of oligosaccharidic combinations of pentose, hexose, deoxyhexose and hexuronic acid were accurately identified in gum arabic whereas cherry and locust bean gums showed respectively PentxHexy and Hexn profiles. Optimized for low sample quantities, the analytical protocol was successfully applied to contemporary and historic samples including ‘Colour Box Charles Roberson & Co’ dating 1870s and drawings from the American painter Arthur Dove (1880–1946). This is the first time that a gum is accurately identified in a cultural heritage sample using structural information. Furthermore, this methodology is applicable to other domains (food, cosmetic, pharmaceutical, biomedical). PMID:28425501

  20. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  1. Evolving Pb isotope signatures of London airborne particulate matter (PM 10)-constraints from on-filter and solution-mode MC-ICP-MS.

    PubMed

    Noble, Stephen R; Horstwood, Matthew S A; Davy, Pamela; Pashley, Vanessa; Spiro, Baruch; Smith, Steve

    2008-07-01

    Pb isotope compositions of biologically significant PM(10) atmospheric particulates from a busy roadside location in London UK were measured using solution- and laser ablation-mode MC-ICP-MS. The solution-mode data for PM(10) sampled between 1998-2001 document a dramatic shift to increasingly radiogenic compositions as leaded petrol was phased out. LA-MC-ICP-MS isotope analysis, piloted on a subset of the available samples, is shown to be a potential reconnaissance analytical technique. PM(10) particles trapped on quartz filters were liberated from the filter surface, without ablating the filter substrate, using a 266 nm UV laser and a dynamic, large diameter, low-fluence ablation protocol. The Pb isotope evolution noted in the London data set obtained by both analytical protocols is similar to that observed elsewhere in Western Europe following leaded petrol elimination. The data therefore provide important baseline isotope composition information useful for continued UK atmospheric monitoring through the early 21(st) century.

  2. Suppressing spectral diffusion of emitted photons with optical pulses

    DOE PAGES

    Fotso, H. F.; Feiguin, A. E.; Awschalom, D. D.; ...

    2016-01-22

    In many quantum architectures the solid-state qubits, such as quantum dots or color centers, are interfaced via emitted photons. However, the frequency of photons emitted by solid-state systems exhibits slow uncontrollable fluctuations over time (spectral diffusion), creating a serious problem for implementation of the photon-mediated protocols. Here we show that a sequence of optical pulses applied to the solid-state emitter can stabilize the emission line at the desired frequency. We demonstrate efficiency, robustness, and feasibility of the method analytically and numerically. Taking nitrogen-vacancy center in diamond as an example, we show that only several pulses, with the width of 1more » ns, separated by few ns (which is not difficult to achieve) can suppress spectral diffusion. As a result, our method provides a simple and robust way to greatly improve the efficiency of photon-mediated entanglement and/or coupling to photonic cavities for solid-state qubits.« less

  3. Chapter 2: Commercial and Industrial Lighting Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Gowans, Dakers; Telarico, Chad

    The Commercial and Industrial Lighting Evaluation Protocol (the protocol) describes methods to account for gross energy savings resulting from the programmatic installation of efficient lighting equipment in large populations of commercial, industrial, and other nonresidential facilities. This protocol does not address savings resulting from changes in codes and standards, or from education and training activities. A separate Uniform Methods Project (UMP) protocol, Chapter 3: Commercial and Industrial Lighting Controls Evaluation Protocol, addresses methods for evaluating savings resulting from lighting control measures such as adding time clocks, tuning energy management system commands, and adding occupancy sensors.

  4. Determination of adsorbable organic halogens in surface water samples by combustion-microcoulometry versus combustion-ion chromatography titration.

    PubMed

    Kinani, Aziz; Sa Lhi, Hacène; Bouchonnet, Stéphane; Kinani, Said

    2018-03-02

    Adsorbable Organic Halogen (AOX) is an analytical parameter of considerable interest since it allows to evaluate the amount of organohalogen disinfection by-products (OXBPs) present in a water sample. Halogen speciation of AOX into adsorbable organic chlorine, bromine and iodine, respectively AOCl, AOBr and AOI, is extremely important since it has been shown that iodinated and brominated organic by-products tend to be more toxic than their chlorinated analogues. Chemical speciation of AOX can be performed by combustion-ion chromatography (C-IC). In the present work, the effectiveness of the nitrate wash according to ISO 9562 standard method protocol to eliminate halide ions interferences was firstly examined. False positive AOX values were observed when chloride concentration exceeded 100 ppm. The improvements made to the washing protocol have eliminated chloride interference for concentrations up to 1000 ppm. A C-IC method for chemical speciation of AOX into AOCl, AOBr, and AOI has been developed and validated. The most important analytical parameters were investigated. The following optimal conditions were established: an aqueous solution containing 2.4 mM sodium bicarbonate/2.0 mM sodium carbonate, and 2% acetone (v/v) as mobile phase, 2 mL of aqueous sodium thiosulfate (500 ppm) as absorption solution, 0.2 mL min -1 as water inlet flow rate for hydropyrolysis, and 10 min as post-combustion time. The method was validated according to NF T90-210 standard method. Calibration curves fitted through a quadratic equation show coefficients of determination (r 2 ) greater than 0.9998, and RSD less than 5%. The LOQs were 0.9, 4.3, and 5.7 μg L -1 Cl for AOCl, AOBr, and AOI, respectively. The accuracy, in terms of relative error, was within a ± 10% interval. The applicability of the validated method was demonstrated by the analysis of twenty four water samples from three rivers in France. The measurements reveals AOX amounts above 10 μg L -1 Cl in all untreated samples, suggesting the presence of organohalogen compounds in the sampled rivers. On weight concentration basis, AOCl accounted for 77-100% of AOX in the treated water samples. A good agreement between the conventional AOX method and the developed C-IC method was found. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Protocol for Tier 2 Evaluation of Vapor Intrusion at Corrective Action Sites

    DTIC Science & Technology

    2012-07-01

    622.92 600.12 437.08 433.44 411.1 Sulfur Hexafluoride (SF6) by NIOSH 6602 Modified Sulfur Hexafluoride 600 130 380 290 120 370 Notes: 1. VOC and SF6...6602 Modified Sulfur Hexafluoride 2400 2600 24 1500 260 14 18 1000 Notes: 1. VOC and SF6 samples were analyzed by Columbia Analytical Services, Inc. in...NIOSH 6602 Modified Sulfur Hexafluoride 3900 15 1800 1700 24 1600 Notes: 1. VOC and SF6 samples were analyzed by Columbia Analytical Services, Inc. in

  6. Development of an enrichment method for endogenous phosphopeptide characterization in human serum.

    PubMed

    La Barbera, Giorgia; Capriotti, Anna Laura; Cavaliere, Chiara; Ferraris, Francesca; Laus, Michele; Piovesana, Susy; Sparnacci, Katia; Laganà, Aldo

    2018-01-01

    The work describes the development of an enrichment method for the analysis of endogenous phosphopeptides in serum. Endogenous peptides can play significant biological roles, and some of them could be exploited as future biomarkers. In this context, blood is one of the most useful biofluids for screening, but a systematic investigation of the endogenous peptides, especially phosphorylated ones, is still lacking, mainly due to the lack of suitable analytical methods. Thus, in this paper, different phosphopeptide enrichment strategies were pursued, based either on metal oxide affinity chromatography (MOAC, in the form of commercial TiO 2 spin columns or magnetic graphitized carbon black-TiO 2 composite), or on immobilized metal ion affinity chromatography (IMAC, in the form of Ti 4+ -IMAC magnetic material or commercial Fe 3+ -IMAC spin columns). While MOAC strategies proved completely unsuccessful, probably due to interfering phospholipids displacing phosphopeptides, the IMAC materials performed very well. Different sample preparation strategies were tested, comprising direct dilution with the loading buffer, organic solvent precipitation, and lipid removal from the matrix, as well as the addition of phosphatase inhibitors during sample handling for maximized endogenous phosphopeptide enrichment. All data were acquired by a shotgun peptidomics approach, in which peptide samples were separated by reversed-phase nanoHPLC hyphenated with high-resolution tandem mass spectrometry. The devised method allowed the identification of 176 endogenous phosphopeptides in fresh serum added with inhibitors by the direct dilution protocol and the Ti 4+ -IMAC magnetic material enrichment, but good results could also be obtained from the commercial Fe 3+ -IMAC spin column adapted to the batch enrichment protocol.

  7. Rate-loss analysis of an efficient quantum repeater architecture

    NASA Astrophysics Data System (ADS)

    Guha, Saikat; Krovi, Hari; Fuchs, Christopher A.; Dutton, Zachary; Slater, Joshua A.; Simon, Christoph; Tittel, Wolfgang

    2015-08-01

    We analyze an entanglement-based quantum key distribution (QKD) architecture that uses a linear chain of quantum repeaters employing photon-pair sources, spectral-multiplexing, linear-optic Bell-state measurements, multimode quantum memories, and classical-only error correction. Assuming perfect sources, we find an exact expression for the secret-key rate, and an analytical description of how errors propagate through the repeater chain, as a function of various loss-and-noise parameters of the devices. We show via an explicit analytical calculation, which separately addresses the effects of the principle nonidealities, that this scheme achieves a secret-key rate that surpasses the Takeoka-Guha-Wilde bound—a recently found fundamental limit to the rate-vs-loss scaling achievable by any QKD protocol over a direct optical link—thereby providing one of the first rigorous proofs of the efficacy of a repeater protocol. We explicitly calculate the end-to-end shared noisy quantum state generated by the repeater chain, which could be useful for analyzing the performance of other non-QKD quantum protocols that require establishing long-distance entanglement. We evaluate that shared state's fidelity and the achievable entanglement-distillation rate, as a function of the number of repeater nodes, total range, and various loss-and-noise parameters of the system. We extend our theoretical analysis to encompass sources with nonzero two-pair-emission probability, using an efficient exact numerical evaluation of the quantum state propagation and measurements. We expect our results to spur formal rate-loss analysis of other repeater protocols and also to provide useful abstractions to seed analyses of quantum networks of complex topologies.

  8. Toroid cavity/coil NMR multi-detector

    DOEpatents

    Gerald, II, Rex E.; Meadows, Alexander D.; Gregar, Joseph S.; Rathke, Jerome W.

    2007-09-18

    An analytical device for rapid, non-invasive nuclear magnetic resonance (NMR) spectroscopy of multiple samples using a single spectrometer is provided. A modified toroid cavity/coil detector (TCD), and methods for conducting the simultaneous acquisition of NMR data for multiple samples including a protocol for testing NMR multi-detectors are provided. One embodiment includes a plurality of LC resonant circuits including spatially separated toroid coil inductors, each toroid coil inductor enveloping its corresponding sample volume, and tuned to resonate at a predefined frequency using a variable capacitor. The toroid coil is formed into a loop, where both ends of the toroid coil are brought into coincidence. Another embodiment includes multiple micro Helmholtz coils arranged on a circular perimeter concentric with a central conductor of the toroid cavity.

  9. Modelling and temporal performances evaluation of networked control systems using (max, +) algebra

    NASA Astrophysics Data System (ADS)

    Ammour, R.; Amari, S.

    2015-01-01

    In this paper, we address the problem of temporal performances evaluation of producer/consumer networked control systems. The aim is to develop a formal method for evaluating the response time of this type of control systems. Our approach consists on modelling, using Petri nets classes, the behaviour of the whole architecture including the switches that support multicast communications used by this protocol. (max, +) algebra formalism is then exploited to obtain analytical formulas of the response time and the maximal and minimal bounds. The main novelty is that our approach takes into account all delays experienced at the different stages of networked automation systems. Finally, we show how to apply the obtained results through an example of networked control system.

  10. Derivatization coupled to headspace programmed-temperature vaporizer gas chromatography with mass spectrometry for the determination of amino acids: Application to urine samples.

    PubMed

    González Paredes, Rosa María; García Pinto, Carmelo; Pérez Pavón, José Luis; Moreno Cordero, Bernardo

    2016-09-01

    A new method based on headspace programmed-temperature vaporizer gas chromatography with mass spectrometry has been developed and validated for the determination of amino acids (alanine, sarcosine, ethylglycine, valine, leucine, and proline) in human urine samples. Derivatization with ethyl chloroformate was employed successfully to determine the amino acids. The derivatization reaction conditions as well as the variables of the headspace sampling were optimized. The existence of a matrix effect was checked and the analytical characteristics of the method were determined. The limits of detection were 0.15-2.89 mg/L, and the limits of quantification were 0.46-8.67 mg/L. The instrumental repeatability was 1.6-11.5%. The quantification of the amino acids in six urine samples from healthy subjects was performed with the method developed with the one-point standard additions protocol, with norleucine as the internal standard. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Development of an Analytical Protocol for Determination of Cyanide in Human Biological Samples Based on Application of Ion Chromatography with Pulsed Amperometric Detection

    PubMed Central

    Ruman, Marek; Narkowicz, Sylwia; Namieśnik, Jacek

    2017-01-01

    A simple and accurate ion chromatography (IC) method with pulsed amperometric detection (PAD) was proposed for the determination of cyanide ion in urine, sweat, and saliva samples. The sample pretreatment relies on alkaline digestion and application of Dionex OnGuard II H cartridge. Under the optimized conditions, the method showed good linearity in the range of 1–100 μg/L for urine, 5–100 μg/L for saliva, and 3–100 μg/L for sweat samples with determination coefficients (R) > 0.992. Low detection limits (LODs) in the range of 1.8 μg/L, 5.1 μg/L, and 5.8 μg/L for urine, saliva, and sweat samples, respectively, and good repeatability (CV < 3%, n = 3) were obtained. The proposed method has been successfully applied to the analysis of human biological samples. PMID:29348966

  12. Metabolite profiling on apple volatile content based on solid phase microextraction and gas-chromatography time of flight mass spectrometry.

    PubMed

    Aprea, Eugenio; Gika, Helen; Carlin, Silvia; Theodoridis, Georgios; Vrhovsek, Urska; Mattivi, Fulvio

    2011-07-15

    A headspace SPME GC-TOF-MS method was developed for the acquisition of metabolite profiles of apple volatiles. As a first step, an experimental design was applied to find out the most appropriate conditions for the extraction of apple volatile compounds by SPME. The selected SPME method was applied in profiling of four different apple varieties by GC-EI-TOF-MS. Full scan GC-MS data were processed by MarkerLynx software for peak picking, normalisation, alignment and feature extraction. Advanced chemometric/statistical techniques (PCA and PLS-DA) were used to explore data and extract useful information. Characteristic markers of each variety were successively identified using the NIST library thus providing useful information for variety classification. The developed HS-SPME sampling method is fully automated and proved useful in obtaining the fingerprint of the volatile content of the fruit. The described analytical protocol can aid in further studies of the apple metabolome. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Development of an Analytical Protocol for Determination of Cyanide in Human Biological Samples Based on Application of Ion Chromatography with Pulsed Amperometric Detection.

    PubMed

    Jaszczak, Ewa; Ruman, Marek; Narkowicz, Sylwia; Namieśnik, Jacek; Polkowska, Żaneta

    2017-01-01

    A simple and accurate ion chromatography (IC) method with pulsed amperometric detection (PAD) was proposed for the determination of cyanide ion in urine, sweat, and saliva samples. The sample pretreatment relies on alkaline digestion and application of Dionex OnGuard II H cartridge. Under the optimized conditions, the method showed good linearity in the range of 1-100  μ g/L for urine, 5-100  μ g/L for saliva, and 3-100  μ g/L for sweat samples with determination coefficients ( R ) > 0.992. Low detection limits (LODs) in the range of 1.8  μ g/L, 5.1  μ g/L, and 5.8  μ g/L for urine, saliva, and sweat samples, respectively, and good repeatability (CV < 3%, n = 3) were obtained. The proposed method has been successfully applied to the analysis of human biological samples.

  14. Development of a method of clozapine dosage by selective electrode to the iodides.

    PubMed

    Teyeb, Hassen; Douki, Wahiba; Najjar, Mohamed Fadhel

    2012-07-01

    Clozapine (Leponex(®)), the main neuroleptic indicated in the treatment of resistant schizophrenia, requires therapeutic monitoring because of its side effects and the individual variability in metabolism. In addition, several cases of intoxication by this drug were described in the literature. In this work, we studied the indirect dosage of clozapine by selective electrode to the iodides for the optimization of an analytical protocol allowing therapeutic monitoring and the diagnosis of intoxication and/or overdose. Our results showed that the developed method is linear between 0.05 and 12.5 µg/mL (r = 0.980), with a limit of detection of 0.645.10(-3) µg/mL. It presents good precision (coefficient of variation less than 4%) and accuracy (coefficient less than 10%) for all the studied concentrations. With a domain of linearity covering a wide margin of concentrations, this method can be applicable to the dosage of clozapine in tablets and in different biological matrices, such as plasma, urines, and postmortem samples.

  15. Development of an LC-MS/MS method for the determination of endogenous cortisol in hair using (13)C3-labeled cortisol as surrogate analyte.

    PubMed

    Binz, Tina M; Braun, Ueli; Baumgartner, Markus R; Kraemer, Thomas

    2016-10-15

    Hair cortisol levels are increasingly applied as a measure for stress in humans and mammals. Cortisol is an endogenous compound and is always present within the hair matrix. Therefore, "cortisol-free hair matrix" is a critical point for any analytical method to accurately quantify especially low cortisol levels. The aim of this project was to modify current methods used for hair cortisol analysis to more accurately determine low endogenous cortisol concentrations in hair. For that purpose, (13)C3-labeled cortisol, which is not naturally present in hair (above 13C natural abundance levels), was used for calibration and comparative validation applying cortisol versus (13)C3-labeled cortisol. Cortisol was extracted from 20mg hair (standard sample amount) applying an optimized single step extraction protocol. An LC-MS/MS method was developed for the quantitative analysis of cortisol using either cortisol or (13)C3-cortisol as calibrators and D7-cortisone as internal standard (IS). The two methods (cortisol/(13)C3-labeled cortisol) were validated in a concentration range up to 500pg/mg and showed good linearity for both analytes (cortisol: R(2)=0.9995; (13)C3-cortisol R(2)=0.9992). Slight differences were observed for limit of detection (LOD) (0.2pg/mg/0.1pg/mg) and limit of quantification (LOQ) (1pg/mg/0.5pg/mg). Precision was good with a maximum deviation of 8.8% and 10% for cortisol and (13)C3-cortisol respectively. Accuracy and matrix effects were good for both analytes except for the quality control (QC) low cortisol. QC low (2.5pg/mg) showed matrix effects (126.5%, RSD 35.5%) and accuracy showed a deviation of 26% when using cortisol to spike. These effects are likely to be caused by the unknown amount of endogenous cortisol in the different hair samples used to determine validation parameters like matrix effect, LOQ and accuracy. No matrix effects were observed for the high QC (400pg/mg) samples. Recovery was good with 92.7%/87.3% (RSD 9.9%/6.2%) for QC low and 102.3%/82.1% (RSD 5.8%/11.4%) for QC high. After successful validation the applicability of the method could be proven. The study shows that the method is especially useful for determining low endogenous cortisol concentrations as they occur in cow hair for example. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Isolation and determination of ivermectin in post-mortem and in vivo tissues of dung beetles using a continuous solid phase extraction method followed by LC-ESI+-MS/MS

    PubMed Central

    Ortiz, Antonio J.; Cortez, Vieyle; Azzouz, Abdelmonaim

    2017-01-01

    A new analytical method based on solvent extraction, followed by continuous solid-phase extraction (SPE) clean-up using a polymeric sorbent, was demonstrated to be applicable for the detection of ivermectin in complex biological matrices of dung beetles (hemolymph, excreta or dry tissues) using liquid chromatography combined with positive electrospray ionization tandem mass spectrometry (LC/ESI+–MS/MS). Using a signal-to-noise ratio of 3:1, the limit of detection (LOD) in the insect matrices at trace levels was 0.01 ng g–1 and the limit of quantification (LOQ) was 0.1 ng g–1. The proposed method was successfully used to quantitatively determine the levels of ivermectin in the analysis of small samples in in vivo and post mortem samples, demonstrating the usefulness for quantitative analyses that are focused on future pharmacokinetic and bioavailability studies in insects and the establishment of a new protocol to study the impact of ivermectin on non-target arthropods such as dung beetles and other insects that are related with the “dung community”. Because satisfactory precision and accuracy values were obtained in both in vivo matrices, we suggest that the method can be consistently used for quantitative determinations that are focused on future pharmacokinetic and bioavailability studies in insects. Furthermore, this new analytical method was successfully applied to biological samples of dead dung beetles from the field suggesting that the method can be used to establish a new routine analysis of ivermectin residues in insect carcasses that is applied to complement typical mortality tests. PMID:28207908

  17. Size analysis of polyglutamine protein aggregates using fluorescence detection in an analytical ultracentrifuge.

    PubMed

    Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong

    2013-01-01

    Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence detected analytical ultracentrifugation is one approach that can offer significant insight into aggregation formation and kinetics. While this technique has traditionally been used with purified proteins, it is now possible for substantial information to be collected with studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.

  18. Time- and cost-saving apparatus for analytical sample filtration

    Treesearch

    William R. Kenealy; Joseph C. Destree

    2005-01-01

    Simple and cost-effective protocols were developed for removing particulates from samples prior to analysis by high performance liquid chromatography and gas chromatography. A filter and vial holder were developed for use with a 96-well filtration plate. The device saves preparation time and costs.

  19. Evaluating Trends in Historical PM2.5 Element Concentrations by Reanalyzing a 15-Year Sample Archive

    NASA Astrophysics Data System (ADS)

    Hyslop, N. P.; White, W. H.; Trzepla, K.

    2014-12-01

    The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na - Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely-documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains (GRSM), Mount Rainier (MORA), and Point Reyes National Parks (PORE) were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era. The graph below compares the trend estimates for all the elements measured by IMPROVE based on the original and repeat analyses; the elements identified in color are measured above the detection limit more than 90% of the time. The trend estimates are sensitive to the treatment of non-detect data. The original and reanalysis trends are indistinguishable (have overlapping confidence intervals) for most of the well-detected elements.

  20. Cross-reactivity by botanicals used in dietary supplements and spices using the multiplex xMAP food allergen detection assay (xMAP FADA).

    PubMed

    Pedersen, Ronnie O; Nowatzke, William L; Cho, Chung Y; Oliver, Kerry G; Garber, Eric A E

    2018-06-18

    Food allergies affect some 15 million Americans. The only treatment for food allergies is a strict avoidance diet. To help ensure the reliability of food labels, analytical methods are employed; the most common being enzyme-linked immunosorbent assays (ELISAs). However, the commonly employed ELISAs are single analyte-specific and cannot distinguish between false positives due to cross-reactive homologous proteins; making the method of questionable utility for regulatory purposes when analyzing for unknown or multiple food allergens. Also, should the need arise to detect additional analytes, extensive research must be undertaken to develop new ELISAs. To address these and other limitations, a multiplex immunoassay, the xMAP® food allergen detection assay (xMAP FADA), was developed using 30 different antibodies against 14 different food allergens plus gluten. Besides incorporating two antibodies for the detection of most analytes, the xMAP FADA also relies on two different extraction protocols; providing multiple confirmatory end-points. Using the xMAP FADA, the cross-reactivities of 45 botanicals used in dietary supplements and spices commercially sold in the USA were assessed. Only a few displayed cross-reactivities with the antibodies in the xMAP FADA at levels exceeding 0.0001%. The utility of the xMAP FADA was exemplified by its ability to detect and distinguish between betel nut, saw palmetto, and acai which are in the same family as coconut. Other botanicals examined included allspice, amchur, anise seed, black pepper, caraway seed, cardamom, cayenne red pepper, sesame seed, poppy seed, white pepper, and wheat grass. The combination of direct antibody detection, multi-antibody profiling, high sensitivity, and a modular design made it possible for the xMAP FADA to distinguish between homologous antigens, provide multiple levels of built-in confirmatory analysis, and optimize the bead set cocktail to address specific needs.

  1. Loch Vale Watershed Project quality assurance report, 1995-1998

    USGS Publications Warehouse

    Allstott, E.J.; Bashkin, Michael A.; Baron, Jill S.

    1999-01-01

    The Loch Vale Watershed (LVWS) project was initiated in 1980 by the National Park Service with funding from the Aquatic Effects Research Program of the National Acid Precipitation Assessment Program. Initial research objectives were to understand the processes that would either mitigate or accelerate the effects of pollution on soil and surface water chemistry, and to build a record in which long-term trends could be identified and examined.It is important for all data collected in Loch Vale to meet the high standards of quality set forth in previous LVWS QA/QC reports and LVWS Methods Manuals. Given the ever-widening usage of data collected in Loch Vale, it is equally important to provide users of that data with a report assuring that all data are sound. Parameters covered in this report are the quality of meteorological measurements, hydrological measurements, surface water chemistry, and similarities in catch efficiency of two raingage types in Loch Vale for the period of 1995-1998.Routine sampling of weather conditions, precipitation chemistry, and stream/lake water chemistry began in 1982. Since then, all samples and data have been analyzed according to widely accepted and published methods. Weather data have been collected, analyzed, and stored by LVWS project personnel. Methods for the handling of meteorological data are well documented (Denning 1988, Edwards 1991, Newkirk 1995,and Allstott 1995). Precipitation chemistry has always been collected according to National Atmospheric Deposition Program protocol (Bigelow 1988), and analyzed at the Central Analytical Laboratory of the Illinois State Water Survey in Champaign, IL. QA/QC procedures of the National Atmospheric Deposition Program are well documented (Aubertin 1990). Protocols for sampling surface waters are also well documented (Newkirk 1995). Analysis of surface water chemistry has been performed using standard EPA protocol at the US Forest Service's Rocky Mt. Station Biogeochemistry Laboratory since 1993.

  2. Prediction of psychosis across protocols and risk cohorts using automated language analysis

    PubMed Central

    Corcoran, Cheryl M.; Carrillo, Facundo; Fernández‐Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C.; Bearden, Carrie E.; Cecchi, Guillermo A.

    2018-01-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer‐based natural language processing analyses, we previously showed that, among English‐speaking clinical (e.g., ultra) high‐risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross‐validate these automated linguistic analytic methods in a second larger risk cohort, also English‐speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine‐learning speech classifier – comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns – that had an 83% accuracy in predicting psychosis onset (intra‐protocol), a cross‐validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross‐protocol), and a 72% accuracy in discriminating the speech of recent‐onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at‐risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. PMID:29352548

  3. Prediction of psychosis across protocols and risk cohorts using automated language analysis.

    PubMed

    Corcoran, Cheryl M; Carrillo, Facundo; Fernández-Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C; Bearden, Carrie E; Cecchi, Guillermo A

    2018-02-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer-based natural language processing analyses, we previously showed that, among English-speaking clinical (e.g., ultra) high-risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross-validate these automated linguistic analytic methods in a second larger risk cohort, also English-speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine-learning speech classifier - comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns - that had an 83% accuracy in predicting psychosis onset (intra-protocol), a cross-validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross-protocol), and a 72% accuracy in discriminating the speech of recent-onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at-risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. © 2018 World Psychiatric Association.

  4. Nonesterified fatty acid determination for functional lipidomics: comprehensive ultrahigh performance liquid chromatography-tandem mass spectrometry quantitation, qualification, and parameter prediction.

    PubMed

    Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang

    2012-02-07

    Despite their central importance for lipid metabolism, straightforward quantitative methods for determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in organic solvent solution yields high accuracy, including both the unbound and initially protein-bound fractions, while avoiding interferences from hydrolysis of esterified fatty acids from other lipid classes. Sample preparation is fast and nonexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds by a regression model predicting all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species based on chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma is described, the highest numbers ever reported for a LC-MS application. Accuracy and precision are within widely accepted limits. The use of qualifier ions supports unequivocal analyte verification. © 2012 American Chemical Society

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kail, Brian W; Link, Dirk D; Morreale, Bryan D

    A method for selectively determining both free fatty acids (FFA) and triacylglycerides (TAGs) in biological oils was investigated and optimized using gas chromatography after esterification of the target species to their corresponding fatty acid methyl esters (FAMEs). The method used acid catalyzed esterification in methanolic solutions under conditions of varying severity to achieve complete conversion of more reactive FFAs while preserving the concentration of TAGs. Complete conversion of both free acids and glycerides to corresponding FAMEs was found to require more rigorous reaction conditions involving heating to 120°C for up to 2 h. Method validation was provided using gas chromatography–flamemore » ionization detection, gas chromatography–mass spectrometry, and liquid chromatography–mass spectrometry. The method improves on existing methods because it allows the total esterified lipid to be broken down by FAMEs contributed by FFA compared to FAMEs from both FFA and TAGs. Single and mixed-component solutions of pure fatty acids and triglycerides, as well as a sesame oil sample to simulate a complex biological oil, were used to optimize the methodologies. Key parameters that were investigated included: HCl-to-oil ratio, temperature and reaction time. Pure free fatty acids were found to esterify under reasonably mild conditions (10 min at 50°C with a 2.1:1 HCl to fatty acid ratio) with 97.6 ± 2.3% recovery as FAMEs, while triglycerides were largely unaffected under these reaction conditions. The optimized protocol demonstrated that it is possible to use esterification reactions to selectively determine the free acid content, total lipid content, and hence, glyceride content in biological oils. This protocol also allows gas chromatography analysis of FAMEs as a more ideal analyte than glyceride species in their native state.« less

  6. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying the network authentication protocols. In order to improve the efficiency of modeling and verification on the protocols with the model checking technology, this paper first proposes a universal formalization description method of the protocol. Combined with the model checker SPIN, the method can expediently verify the properties of the protocol. By some modeling simplified strategies, this paper can model several protocols efficiently, and reduce the states space of the model. Compared with the previous literature, this paper achieves higher degree of automation, and better efficiency of verification. Finally based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the method of model checking is effective, which is useful for the other authentication protocols.

  7. Simultaneous determination of three pesticide adjuvant residues in plant-derived agro-products using liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Hui; Jiang, Zejun; Cao, Xiaolin; Su, Hang; Shao, Hua; Jin, Fen; Abd El-Aty, A M; Wang, Jing

    2017-12-15

    Herein, an accurate and reliable isotope-labelled internal standard method was developed and validated for simultaneous determination of three polar pesticide adjuvants, namely 2-pyrrolidone, N-methyl-2-pyrrolidone, and N-ethyl-2-pyrrolidone in plant-derived agro-products. Matrices, including apple, cabbage, tomato, cucumber, rice, and wheat were extracted with a modified quick, easy, cheap, effective, rugged, and safe "QuEChERS" method and purified with a new clean-up sorbent (Z-Sep). A hydrophilic interaction liquid chromatography column (HILIC), exhibiting a lipophilic-hydrophilic character, was used to separate the three analytes over 10min using liquid chromatography-tandem mass spectrometry (LC-MS/MS). Matrix effects in various matrices were evaluated and an isotope-labelled internal standard method was employed to compensate for ion enhancement/suppression effects. At three fortification levels (2.0, 5.0, and 20.0μg/kg), the mean recoveries ranged from 78.5 to 112.1% with relative standard deviations (RSDs)<11.0% for all tested analytes. The limits of detection (LODs) and quantification (LOQs) were 0.04-0.45 and 0.12-1.58μg/kg in various matrices, respectively. The developed experimental protocol was successfully applied to monitor different samples purchased from local markets in Beijing, China. In conclusion, the developed method exhibited both high sensitivity and satisfactory accuracy and is suitable for the simultaneous determination of the three tested pesticide adjuvant residues in agro-products of plant origin. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Hydrodynamic size-based separation and characterization of protein aggregates from total cell lysates

    PubMed Central

    Tanase, Maya; Zolla, Valerio; Clement, Cristina C; Borghi, Francesco; Urbanska, Aleksandra M; Rodriguez-Navarro, Jose Antonio; Roda, Barbara; Zattoni, Andrea; Reschiglian, Pierluigi; Cuervo, Ana Maria; Santambrogio, Laura

    2016-01-01

    Herein we describe a protocol that uses hollow-fiber flow field-flow fractionation (FFF) coupled with multiangle light scattering (MALS) for hydrodynamic size-based separation and characterization of complex protein aggregates. The fractionation method, which requires 1.5 h to run, was successfully modified from the analysis of protein aggregates, as found in simple protein mixtures, to complex aggregates, as found in total cell lysates. In contrast to other related methods (filter assay, analytical ultracentrifugation, gel electrophoresis and size-exclusion chromatography), hollow-fiber flow FFF coupled with MALS allows a flow-based fractionation of highly purified protein aggregates and simultaneous measurement of their molecular weight, r.m.s. radius and molecular conformation (e.g., round, rod-shaped, compact or relaxed). The polyethersulfone hollow fibers used, which have a 0.8-mm inner diameter, allow separation of as little as 20 μg of total cell lysates. In addition, the ability to run the samples in different denaturing and nondenaturing buffer allows defining true aggregates from artifacts, which can form during sample preparation. The protocol was set up using Paraquat-induced carbonylation, a model that induces protein aggregation in cultured cells. This technique will advance the biochemical, proteomic and biophysical characterization of molecular-weight aggregates associated with protein mutations, as found in many CNS degenerative diseases, or chronic oxidative stress, as found in aging, and chronic metabolic and inflammatory conditions. PMID:25521790

  9. Direct dating of archaeological pottery by compound-specific 14C analysis of preserved lipids.

    PubMed

    Stott, Andrew W; Berstan, Robert; Evershed, Richard P; Bronk-Ramsey, Christopher; Hedges, Robert E M; Humm, Martin J

    2003-10-01

    A methodology is described demonstrating the utility of the compound-specific 14C technique as a direct means of dating archaeological pottery. The method uses automated preparative capillary gas chromatography employing wide-bore capillary columns to isolate individual compounds from lipid extracts of archaeological potsherds in high purity (>95%) and amounts (>200 microg) sufficient for radiocarbon dating using accelerator mass spectrometry (AMS). A protocol was developed and tested on n-alkanes and n-carboxylic acids possessing a broad range of 14C ages. Analytical blanks and controls allowed background 14C measurements to be assessed and potential sources of errors to be detected, i.e., contamination with modern or dead 14C, isotopic fraction effects, etc. A "Russian doll" method was developed to transfer isolated target compounds onto tin powder/capsules prior to combustion and AMS analyses. The major advantage of the compound-specific technique is that 14C dates obtained for individual compounds can be directly linked to the commodities processed in the vessels during their use, e.g., animal fats. The compound-specific 14C dating protocol was validated on a suite of ancient pottery whose predicted ages spanned a 5000-year date range. Initial results indicate that meaningful correlations can be obtained between the predicted date of pottery and that of the preserved lipids. These findings constitute an important step forward to the direct dating of archaeological pottery.

  10. Mycoestrogen determination in cow milk: Magnetic solid-phase extraction followed by liquid chromatography and tandem mass spectrometry analysis.

    PubMed

    Capriotti, Anna Laura; Cavaliere, Chiara; Foglia, Patrizia; La Barbera, Giorgia; Samperi, Roberto; Ventura, Salvatore; Laganà, Aldo

    2016-12-01

    Recently, magnetic solid-phase extraction has gained interest because it presents various operational advantages over classical solid-phase extraction. Furthermore, magnetic nanoparticles are easy to prepare, and various materials can be used in their synthesis. In the literature, there are only few studies on the determination of mycoestrogens in milk, although their carryover in milk has occurred. In this work, we wanted to develop the first (to the best of our knowledge) magnetic solid-phase extraction protocol for six mycoestrogens from milk, followed by liquid chromatography and tandem mass spectrometry analysis. Magnetic graphitized carbon black was chosen as the adsorbent, as this carbonaceous material, which is very different from the most diffuse graphene and carbon nanotubes, had already shown selectivity towards estrogenic compounds in milk. The graphitized carbon black was decorated with Fe 3 O 4 , which was confirmed by the characterization analyses. A milk deproteinization step was avoided, using only a suitable dilution in phosphate buffer as sample pretreatment. The overall process efficiency ranged between 52 and 102%, whereas the matrix effect considered as signal suppression was below 33% for all the analytes even at the lowest spiking level. The obtained method limits of quantification were below those of other published methods that employ classical solid-phase extraction protocols. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Impact of the Injection Protocol on an Impurity's Stationary State

    NASA Astrophysics Data System (ADS)

    Gamayun, Oleksandr; Lychkovskiy, Oleg; Burovski, Evgeni; Malcomson, Matthew; Cheianov, Vadim V.; Zvonarev, Mikhail B.

    2018-06-01

    We examine stationary-state properties of an impurity particle injected into a one-dimensional quantum gas. We show that the value of the impurity's end velocity lies between zero and the speed of sound in the gas and is determined by the injection protocol. This way, the impurity's constant motion is a dynamically emergent phenomenon whose description goes beyond accounting for the kinematic constraints of the Landau approach to superfluidity. We provide exact analytic results in the thermodynamic limit and perform finite-size numerical simulations to demonstrate that the predicted phenomena are within the reach of the ultracold gas experiments.

  12. Postnatal gestational age estimation using newborn screening blood spots: a proposed validation protocol

    PubMed Central

    Murphy, Malia S Q; Hawken, Steven; Atkinson, Katherine M; Milburn, Jennifer; Pervin, Jesmin; Gravett, Courtney; Stringer, Jeffrey S A; Rahman, Anisur; Lackritz, Eve; Chakraborty, Pranesh; Wilson, Kumanan

    2017-01-01

    Background Knowledge of gestational age (GA) is critical for guiding neonatal care and quantifying regional burdens of preterm birth. In settings where access to ultrasound dating is limited, postnatal estimates are frequently used despite the issues of accuracy associated with postnatal approaches. Newborn metabolic profiles are known to vary by severity of preterm birth. Recent work by our group and others has highlighted the accuracy of postnatal GA estimation algorithms derived from routinely collected newborn screening profiles. This protocol outlines the validation of a GA model originally developed in a North American cohort among international newborn cohorts. Methods Our primary objective is to use blood spot samples collected from infants born in Zambia and Bangladesh to evaluate our algorithm’s capacity to correctly classify GA within 1, 2, 3 and 4 weeks. Secondary objectives are to 1) determine the algorithm's accuracy in small-for-gestational-age and large-for-gestational-age infants, 2) determine its ability to correctly discriminate GA of newborns across dichotomous thresholds of preterm birth (≤34 weeks, <37 weeks GA) and 3) compare the relative performance of algorithms derived from newborn screening panels including all available analytes and those restricted to analyte subsets. The study population will consist of infants born to mothers already enrolled in one of two preterm birth cohorts in Lusaka, Zambia, and Matlab, Bangladesh. Dried blood spot samples will be collected and sent for analysis in Ontario, Canada, for model validation. Discussion This study will determine the validity of a GA estimation algorithm across ethnically diverse infant populations and assess population specific variations in newborn metabolic profiles. PMID:29104765

  13. Reference Standardization for Mass Spectrometry and High-resolution Metabolomics Applications to Exposome Research.

    PubMed

    Go, Young-Mi; Walker, Douglas I; Liang, Yongliang; Uppal, Karan; Soltow, Quinlyn A; Tran, ViLinh; Strobel, Frederick; Quyyumi, Arshed A; Ziegler, Thomas R; Pennell, Kurt D; Miller, Gary W; Jones, Dean P

    2015-12-01

    The exposome is the cumulative measure of environmental influences and associated biological responses throughout the lifespan, including exposures from the environment, diet, behavior, and endogenous processes. A major challenge for exposome research lies in the development of robust and affordable analytic procedures to measure the broad range of exposures and associated biologic impacts occurring over a lifetime. Biomonitoring is an established approach to evaluate internal body burden of environmental exposures, but use of biomonitoring for exposome research is often limited by the high costs associated with quantification of individual chemicals. High-resolution metabolomics (HRM) uses ultra-high resolution mass spectrometry with minimal sample preparation to support high-throughput relative quantification of thousands of environmental, dietary, and microbial chemicals. HRM also measures metabolites in most endogenous metabolic pathways, thereby providing simultaneous measurement of biologic responses to environmental exposures. The present research examined quantification strategies to enhance the usefulness of HRM data for cumulative exposome research. The results provide a simple reference standardization protocol in which individual chemical concentrations in unknown samples are estimated by comparison to a concurrently analyzed, pooled reference sample with known chemical concentrations. The approach was tested using blinded analyses of amino acids in human samples and was found to be comparable to independent laboratory results based on surrogate standardization or internal standardization. Quantification was reproducible over a 13-month period and extrapolated to thousands of chemicals. The results show that reference standardization protocol provides an effective strategy that will enhance data collection for cumulative exposome research. In principle, the approach can be extended to other types of mass spectrometry and other analytical methods. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Reference Standardization for Mass Spectrometry and High-resolution Metabolomics Applications to Exposome Research

    PubMed Central

    Go, Young-Mi; Walker, Douglas I.; Liang, Yongliang; Uppal, Karan; Soltow, Quinlyn A.; Tran, ViLinh; Strobel, Frederick; Quyyumi, Arshed A.; Ziegler, Thomas R.; Pennell, Kurt D.; Miller, Gary W.; Jones, Dean P.

    2015-01-01

    The exposome is the cumulative measure of environmental influences and associated biological responses throughout the lifespan, including exposures from the environment, diet, behavior, and endogenous processes. A major challenge for exposome research lies in the development of robust and affordable analytic procedures to measure the broad range of exposures and associated biologic impacts occurring over a lifetime. Biomonitoring is an established approach to evaluate internal body burden of environmental exposures, but use of biomonitoring for exposome research is often limited by the high costs associated with quantification of individual chemicals. High-resolution metabolomics (HRM) uses ultra-high resolution mass spectrometry with minimal sample preparation to support high-throughput relative quantification of thousands of environmental, dietary, and microbial chemicals. HRM also measures metabolites in most endogenous metabolic pathways, thereby providing simultaneous measurement of biologic responses to environmental exposures. The present research examined quantification strategies to enhance the usefulness of HRM data for cumulative exposome research. The results provide a simple reference standardization protocol in which individual chemical concentrations in unknown samples are estimated by comparison to a concurrently analyzed, pooled reference sample with known chemical concentrations. The approach was tested using blinded analyses of amino acids in human samples and was found to be comparable to independent laboratory results based on surrogate standardization or internal standardization. Quantification was reproducible over a 13-month period and extrapolated to thousands of chemicals. The results show that reference standardization protocol provides an effective strategy that will enhance data collection for cumulative exposome research. In principle, the approach can be extended to other types of mass spectrometry and other analytical methods. PMID:26358001

  15. Constraints on the timing of Marinoan ``Snowball Earth'' glaciation by 187Re-187Os dating of a Neoproterozoic, post-glacial black shale in Western Canada

    NASA Astrophysics Data System (ADS)

    Kendall, Brian S.; Creaser, Robert A.; Ross, Gerald M.; Selby, David

    2004-06-01

    New Re-Os isotopic data were obtained from chlorite-grade black shales from the upper Old Fort Point Formation (Windermere Supergroup), a post-glacial Neoproterozoic marker horizon in western Canada. A Re-Os isochron date of 634±57 Ma (MSWD=65, n=5) was determined using the conventional inverse aqua regia digestion medium. However, dissolution of the same samples with a new CrO 3-H 2SO 4 dissolution technique [Chem. Geol. 200 (2003) 225] yielded a much more precise date of 607.8±4.7 Ma (MSWD=1.2). Both dates are in agreement with existing U-Pb age constraints that bracket the Old Fort Point Formation between ˜685 and ˜570 Ma. The distinctive Re-Os systematics recorded by the two analytical protocols is explained by dissolution of a variably radiogenic, detrital Os component by the aqua regia method. In contrast, the CrO 3-H 2SO 4 technique minimizes this detrital component by selectively dissolving organic matter that is dominated by hydrogenous (seawater) Re and Os. The date of 607.8±4.7 Ma is thus interpreted as the depositional age for the upper Old Fort Point Formation providing a minimum age constraint for the timing of the second Windermere glaciation in western Canada. This ice age is correlative with the Marinoan (˜620-600 Ma) ice age and older than the ˜580-Ma Gaskiers glaciation of northeastern North America. The new Re-Os age determined from the CrO 3-H 2SO 4 digestion technique thus provides further support to a growing body of evidence for a global Marinoan glacial episode. Such an interpretation would not be discernable from the imprecise Re-Os date obtained with the aqua regia protocol. These results also indicate the potential for Re-Os radiometric dating of black shales that was not previously recognized. Importantly, neither chlorite-grade metamorphism nor the low organic content (TOC <1%) of the Old Fort Point Formation precluded the determination of a precise Re-Os depositional age using the CrO 3-H 2SO 4 analytical protocol.

  16. VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR SEMI-VOLATILE ORGANIC COMPOUNDS

    EPA Science Inventory

    There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a s...

  17. An Evaluative Methodology for Virtual Communities Using Web Analytics

    ERIC Educational Resources Information Center

    Phippen, A. D.

    2004-01-01

    The evaluation of virtual community usage and user behaviour has its roots in social science approaches such as interview, document analysis and survey. Little evaluation is carried out using traffic or protocol analysis. Business approaches to evaluating customer/business web site usage are more advanced, in particular using advanced web…

  18. A Protocol-Analytic Study of Metacognition in Mathematical Problem Solving.

    ERIC Educational Resources Information Center

    Cai, Jinfa

    1994-01-01

    Metacognitive behaviors of subjects having high (n=2) and low (n=2) levels of mathematical experience were compared across four cognitive processes in mathematical problem solving: orientation, organization, execution, and verification. High-experience subjects engaged in self-regulation and spent more time on orientation and organization. (36…

  19. Application And Implication Of Nanomaterials In The Environment: An Overview Of Current Research At The Environmental Protection Agency (Romania)

    EPA Science Inventory

    The purpose of this presentation is to teach a course on analytical techniques, quality assurance, environmental research protocols, and basic soil environmental chemistry at the Environmental Health Center and Babes Bolyai University in Cluj, Romania. FOR FURTHER INFORMATI...

  20. A Trio of Human Molecular Genetics PCR Assays

    ERIC Educational Resources Information Center

    Reinking, Jeffrey L.; Waldo, Jennifer T.; Dinsmore, Jannett

    2013-01-01

    This laboratory exercise demonstrates three different analytical forms of the polymerase chain reaction (PCR) that allow students to genotype themselves at four different loci. Here, we present protocols to allow students to a) genotype a non-coding polymorphic Variable Number of Tandem Repeat (VNTR) locus on human chromosome 5 using conventional…

  1. Where Young People See Science: Everyday Activities Connected to Science

    ERIC Educational Resources Information Center

    Zimmerman, Heather Toomey; Bell, Philip

    2014-01-01

    This project analyses the prevalence and social construction of science in the everyday activities of multicultural, multilingual children in one urban community. Using cross-setting ethnographic fieldwork (i.e. home, museum, school, community), we developed an ecologically grounded interview protocol and analytical scheme for gauging students'…

  2. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One of the advantages of analytical method validation is translated into a level of confidence about the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science, impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary alloy Cu-Au of different compositions, was used during the validation protocol following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters more frequently requested to get the accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.

  3. Quantification of trace elements and speciation of iron in atmospheric particulate matter

    NASA Astrophysics Data System (ADS)

    Upadhyay, Nabin

    Trace metal species play important roles in atmospheric redox processes and in the generation of oxidants in cloud systems. The chemical impact of these elements on atmospheric and cloud chemistry is dependent on their occurrence, solubility and speciation. First, analytical protocols have been developed to determine trace elements in particulate matter samples collected for carbonaceous analysis. The validated novel protocols were applied to the determination of trace elements in particulate samples collected in the remote marine atmosphere and urban areas in Arizona to study air pollution issues. The second part of this work investigates on solubility and speciation in environmental samples. A detailed study on the impact of the nature and strength of buffer solutions on solubility and speciation of iron lead to a robust protocol, allowing for comparative measurements in matrices representative of cloud water conditions. Application of this protocol to samples from different environments showed low iron solubility (less than 1%) in dust-impacted events and higher solubility (5%) in anthropogenically impacted urban samples. In most cases, Fe(II) was the dominant oxidation state in the soluble fraction of iron. The analytical protocol was then applied to investigate iron processing by fogs. Field observations showed that only a small fraction (1%) of iron was scavenged by fog droplets for which each of the soluble and insoluble fraction were similar. A coarse time resolution limited detailed insights into redox cycling within fog system. Overall results suggested that the major iron species in the droplets was Fe(1I) (80% of soluble iron). Finally, the occurrence and sources of emerging organic pollutants in the urban atmosphere were investigated. Synthetic musk species are ubiquitous in the urban environment (less than 5 ng m-3) and investigations at wastewater treatment plants showed that wastewater aeration basins emit a substantial amount of these species to the atmosphere.

  4. Analysis of potential genotoxic impurities in rabeprazole active pharmaceutical ingredient via Liquid Chromatography-tandem Mass Spectrometry, following quality-by-design principles for method development.

    PubMed

    Iliou, Katerina; Malenović, Anđelija; Loukas, Yannis L; Dotsikas, Yannis

    2018-02-05

    A novel Liquid Chromatography-tandem mass spectrometry (LC-MS/MS) method is presented for the quantitative determination of two potential genotoxic impurities (PGIs) in rabeprazole active pharmaceutical ingredient (API). In order to overcome the analytical challenges in the trace analysis of PGIs, a development procedure supported by Quality-by-Design (QbD) principles was evaluated. The efficient separation between rabeprazole and the two PGIs in the shortest analysis time was set as the defined analytical target profile (ATP) and to this purpose utilization of a switching valve allowed the flow to be sent to waste when rabeprazole was eluted. The selected critical quality attributes (CQAs) were the separation criterion s between the critical peak pair and the capacity factor k of the last eluted compound. The effect of the following critical process parameters (CPPs) on the CQAs was studied: %ACN content, the pH and the concentration of the buffer salt in the mobile phase, as well as the stationary phase of the analytical column. D-Optimal design was implemented to set the plan of experiments with UV detector. In order to define the design space, Monte Carlo simulations with 5000 iterations were performed. Acceptance criteria were met for C 8 column (50×4mm, 5μm) , and the region having probability π≥95% to achieve satisfactory values of all defined CQAs was computed. The working point was selected with the mobile phase consisting ‎of ACN, ammonium formate 11mM at a ratio 31/69v/v with pH=6,8 for the water phase. The LC protocol was transferred to LC-MS/MS and validated according to ICH guidelines. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Development of a robust analytical framework for assessing landbird trends, dynamics and relationships with environmental covariates in the North Coast and Cascades Network

    USGS Publications Warehouse

    Ray, Chris; Saracco, James; Jenkins, Kurt J.; Huff, Mark; Happe, Patricia J.; Ransom, Jason I.

    2017-01-01

    During 2015-2016, we completed development of a new analytical framework for landbird population monitoring data from the National Park Service (NPS) North Coast and Cascades Inventory and Monitoring Network (NCCN). This new tool for analysis combines several recent advances in modeling population status and trends using point-count data and is designed to supersede the approach previously slated for analysis of trends in the NCCN and other networks, including the Sierra Nevada Network (SIEN). Advances supported by the new model-based approach include 1) the use of combined data on distance and time of detection to estimate detection probability without assuming perfect detection at zero distance, 2) seamless accommodation of variation in sampling effort and missing data, and 3) straightforward estimation of the effects of downscaled climate and other local habitat characteristics on spatial and temporal trends in landbird populations. No changes in the current field protocol are necessary to facilitate the new analyses. We applied several versions of the new model to data from each of 39 species recorded in the three mountain parks of the NCCN, estimating trends and climate relationships for each species during 2005-2014. Our methods and results are also reported in a manuscript in revision for the journal Ecosphere (hereafter, Ray et al.). Here, we summarize the methods and results outlined in depth by Ray et al., discuss benefits of the new analytical framework, and provide recommendations for its application to synthetic analyses of long-term data from the NCCN and SIEN. All code necessary for implementing the new analyses is provided within the Appendices to this report, in the form of fully annotated scripts written in the open-access programming languages R and JAGS.

  6. Exact results for the Floquet coin toss for driven integrable models

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Utso; Maity, Somnath; Banik, Uddipan; Dutta, Amit

    2018-05-01

    We study an integrable Hamiltonian reducible to free fermions, which is subjected to an imperfect periodic driving with the amplitude of driving (or kicking), randomly chosen from a binary distribution like a coin-toss problem. The randomness present in the driving protocol destabilizes the periodic steady state reached in the limit of perfectly periodic driving, leading to a monotonic rise of the stroboscopic residual energy with the number of periods (N ) for such Hamiltonians. We establish that a minimal deviation from the perfectly periodic driving in the present case using such protocols would always result in a bounded heating up of the system with N to an asymptotic finite value. Exploiting the completely uncorrelated nature of the randomness and the knowledge of the stroboscopic Floquet operator in the perfectly periodic situation, we provide an exact analytical formalism to derive the disorder averaged expectation value of the residual energy through a disorder operator. This formalism not only leads to an immense numerical simplification, but also enables us to derive an exact analytical form for the residual energy in the asymptotic limit which is universal, i.e., independent of the bias of coin-toss and the protocol chosen. Furthermore, this formalism clearly establishes the nature of the monotonic growth of the residual energy at intermediate N while clearly revealing the possible nonuniversal behavior of the same.

  7. Method for Derivatization and Detection of Chemical Weapons Convention Related Sulfur Chlorides via Electrophilic Addition with 3-Hexyne.

    PubMed

    Goud, D Raghavender; Pardasani, Deepak; Purohit, Ajay Kumar; Tak, Vijay; Dubey, Devendra Kumar

    2015-07-07

    Sulfur monochloride (S2Cl2) and sulfur dichloride (SCl2) are important precursors of the extremely toxic chemical warfare agent sulfur mustard and classified, respectively, into schedule 3.B.12 and 3.B.13 of the Chemical Weapons Convention (CWC). Hence, their detection and identification is of vital importance for verification of CWC. These chemicals are difficult to detect directly using chromatographic techniques as they decompose and do not elute. Until now, the use of gas chromatographic approaches to follow the derivatized sulfur chlorides is not reported in the literature. The electrophilic addition reaction of sulfur monochloride and sulfur dichloride toward 3-hexyne was explored for the development of a novel derivatization protocol, and the products were subjected to gas chromatography-mass spectrometric (GC-MS) analysis. Among various unsaturated reagents like alkenes and alkynes, symmetrical alkyne 3-hexyne was optimized to be the suitable derivatizing agent for these analytes. Acetonitrile was found to be the suitable solvent for the derivatization reaction. The sample preparation protocol for the identification of these analytes from hexane spiked with petrol matrix was also optimized. Liquid-liquid extraction followed by derivatization was employed for the identification of these analytes from petrol matrix. Under the established conditions, the detection and quantification limits are 2.6 μg/mL, 8.6 μg/mL for S2Cl2 and 2.3 μg/mL, 7.7 μg/mL for SCl2, respectively, in selected ion monitoring (SIM) mode. The calibration curve had a linear relationship with y = 0.022x - 0.331 and r(2) = 0.992 for the working range of 10 to 500 μg/mL for S2Cl2 and y = 0.007x - 0.064 and r(2) = 0.991 for the working range of 10 to 100 μg/mL for SCl2, respectively. The intraday RSDs were between 4.80 to 6.41%, 2.73 to 6.44% and interday RSDs were between 2.20 to 7.25% and 2.34 to 5.95% for S2Cl2 and SCl2, respectively.

  8. Use of Raman spectroscopy to identify carbon nanotube contamination at an analytical balance workstation.

    PubMed

    Braun, Elizabeth I; Huang, An; Tusa, Carolyn A; Yukica, Michael A; Pantano, Paul

    2016-12-01

    Carbon nanotubes (CNTs) are cylindrical molecules of carbon with diverse commercial applications. CNTs are also lightweight, easily airborne, and have been shown to be released during various phases of production and use. Therefore, as global CNT production increases, so do concerns that CNTs could pose a safety threat to those who are exposed to them. This makes it imperative to fully understand CNT release scenarios to make accurate risk assessments and to implement effective control measures. However, the current suite of direct-reading and off-line instrumentation used to monitor the release of CNTs in workplaces lack high chemical specificity, which complicates risk assessments when the sampling and/or measurements are performed at a single site where multiple CNT types are handled in the presence of naturally occurring background particles, or dust. Herein, we demonstrate the utility of Raman spectroscopy to unequivocally identify whether particulate matter collected from a multi-user analytical balance workstation comprised CNTs, as well as, whether the contamination included CNTs that were synthesized by a Ni/Y-catalyzed electric-arc method or a Co/Mo-catalyzed chemical vapor deposition method. Identifying the exact CNT type generated a more accurate risk assessment by knowing the metallic impurities involved, and it also led to the identification of the users who handled these CNTs, a review of their handling techniques, and an improved protocol for safely weighing CNTs.

  9. La–Ce isotope measurements by multicollector-ICPMS† †Electronic supplementary information (ESI) available. See DOI: 10.1039/c7ja00256d

    PubMed Central

    Münker, Carsten; Strub, Erik

    2017-01-01

    The 138La–138Ce decay system (half-life 1.02 × 1011 years) is a potentially highly useful tool to unravel information about the timing of geological processes and about the interaction of geological reservoirs on earth, complementing information from the more popular 147Sm–143Nd and 176Lu–176Hf isotope systems. Previously published analytical protocols were limited to TIMS. Here we present for the first time an analytical protocol that employs MC-ICPMS, with an improved precision and sensitivity. To perform sufficiently accurate La–Ce measurements, an efficient ion-chromatographic procedure is required to separate Ce from the other rare earth elements (REE) and Ba quantitatively. This study presents an improved ion-chromatographic procedure that separates La and Ce from rock samples using a three-step column separation. After REE separation by cation exchange, Ce is separated employing an Ln Spec column and selective oxidation. In the last step, a cation clean-up chemistry is performed to remove all remaining interferences. Our MC-ICPMS measurement protocol includes all stable Ce isotopes (136Ce, 138Ce, 140Ce and 142Ce), by employing a 1010 ohm amplifier for the most abundant isotope 140Ce. An external reproducibility of ±0.25ε-units (2 r.s.d) has been routinely achieved for 138Ce measurements for as little as 150–600 ng Ce, depending on the sample–skimmer cone combinations being used. Because the traditionally used JMC-304 Ce reference material is not commercially available anymore, a new reference material was prepared from AMES laboratory Ce metal (Cologne-AMES). In order to compare the new material with the previously reported isotopic composition of AMES material prepared at Mainz (Mainz-AMES), Cologne-AMES and JMC-304 were measured relative to each other in the same analytical session, demonstrating isotope heterogeneity between the two AMES and different JMC-304 batches used in the literature. To enable sufficiently precise age correction of radiogenic 138Ce and to perform isochron dating, a protocol was developed where La and Ce concentrations are determined by isotope dilution (ID), using an isotope tracer enriched in 138La and 142Ce. The new protocols were applied to determine the variations of Ce isotope compositions and La–Ce concentrations of certified geochemical reference materials (CRMs): BCR-2, BCR-1, BHVO-2, JR-1, JA-2, JB-3, JG-1, JR-1, JB-1b, AGV-1 and one in-house La Palma standard. PMID:29456283

  10. Risk of Deep vein thrombosis in neurosurgery: State of the art on prophylaxis protocols and best clinical practices.

    PubMed

    Ganau, Mario; Prisco, Lara; Cebula, Helene; Todeschi, Julien; Abid, Houssem; Ligarotti, Gianfranco; Pop, Raoul; Proust, Francois; Chibbaro, Salvatore

    2017-11-01

    To analytically discuss some protocols in Deep vein thrombosis (DVT)/pulmonary Embolism (PE) prophylaxis currently use in Neurosurgical Departments around the world. Analysis of the prophylaxis protocols in the English literature: An analytical and narrative review of literature concerning DVT prophylaxis protocols in Neurosurgery have been conducted by a PubMed search (back to 1978). 80 abstracts were reviewed, and 74 articles were extracted. The majority of DVT seems to develop within the first week after a neurosurgical procedure, and a linear correlation between the duration of surgery and DVT occurrence has been highlighted. The incidence of DVT seems greater for cranial (7.7%) than spinal procedures (1.5%). Although intermittent pneumatic compression (IPC) devices provided adequate reduction of DVT/PE in some cranial and combined cranial/spinal series, low-dose subcutaneous unfractionated heparin (UFH) or low molecular-weight heparin (LMWH) further reduced the incidence, not always of DVT, but of PE. Nevertheless, low-dose heparin-based prophylaxis in cranial and spinal series risks minor and major postoperative haemorrhages: 2-4% in cranial series, 3.4% minor and 3.4% major haemorrhages in combined cranial/spinal series, and a 0.7% incidence of major/minor haemorrhages in spinal series. This analysis showed that currently most of the articles are represented by case series and case reports. As long as clear guidelines will not be defined and universally applied to this diverse group of patients, any prophylaxis for DVT and PE should be tailored to the individual patient with cautious assessment of benefits versus risks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. EPA Method 1615. Measurement of Enterovirus and Norovirus Occurrence in Water by Culture and RT-qPCR. Part III. Virus Detection by RT-qPCR

    PubMed Central

    Fout, G. Shay; Cashdollar, Jennifer L.; Griffin, Shannon M.; Brinkman, Nichole E.; Varughese, Eunice A.; Parshionikar, Sandhya U.

    2016-01-01

    EPA Method 1615 measures enteroviruses and noroviruses present in environmental and drinking waters. This method was developed with the goal of having a standardized method for use in multiple analytical laboratories during monitoring period 3 of the Unregulated Contaminant Monitoring Rule. Herein we present the protocol for extraction of viral ribonucleic acid (RNA) from water sample concentrates and for quantitatively measuring enterovirus and norovirus concentrations using reverse transcription-quantitative PCR (RT-qPCR). Virus concentrations for the molecular assay are calculated in terms of genomic copies of viral RNA per liter based upon a standard curve. The method uses a number of quality controls to increase data quality and to reduce interlaboratory and intralaboratory variation. The method has been evaluated by examining virus recovery from ground and reagent grade waters seeded with poliovirus type 3 and murine norovirus as a surrogate for human noroviruses. Mean poliovirus recoveries were 20% in groundwaters and 44% in reagent grade water. Mean murine norovirus recoveries with the RT-qPCR assay were 30% in groundwaters and 4% in reagent grade water. PMID:26862985

  12. Criteria for the Collection of Useful Respirator Performance Data in the Workplace

    PubMed Central

    Janssen, Larry; Zhuang, Ziqing; Shaffer, Ronald

    2016-01-01

    Workplace protection factors (WPFs) are intended to measure the ability of a respiratory protective device (RPD) to reduce contaminant exposure when used in the context of an effective respiratory protection program. In 1992, members of the American Industrial Hygiene Association Respiratory Protection Committee (RPC) published a review of important issues and considerations for measuring respirator performance in the workplace. The RPC recognized that respirator testing in workplaces can have a variety of objectives and endpoints, and that not all workplace measurements are WPFs. That paper addressed concerns in the general categories of 1) study objectives; 2) site selection; 3) subject selection and preparation; 4) sampling and analytical methods; and 5) data analysis. No specific protocol for measuring WPFs was recommended by the RPC, and attempts to reach a U.S. consensus on a WPF protocol since 1992 have not succeeded. Numerous studies have implemented the principles for WPF measurement described in the RPC paper. Modifications to the original recommendations have been made to reflect the current state of the art. This article describes what has been learned in recent years in each of the five categories identified in the 1992 discussion. Because of the wide variety of workplaces and work activities, contaminants and respiratory protective devices, a strict protocol is not appropriate for collecting WPF data. Rather, the minimum requirements for the collection and presentation of meaningful respirator performance data in the workplace are described. Understanding of these principles will permit useful RPD performance data to be generated. PMID:24579751

  13. Distributed Wireless Power Transfer With Energy Feedback

    NASA Astrophysics Data System (ADS)

    Lee, Seunghyun; Zhang, Rui

    2017-04-01

    Energy beamforming (EB) is a key technique for achieving efficient radio-frequency (RF) transmission enabled wireless energy transfer (WET). By optimally designing the waveforms from multiple energy transmitters (ETs) over the wireless channels, they can be constructively combined at the energy receiver (ER) to achieve an EB gain that scales with the number of ETs. However, the optimal design of EB waveforms requires accurate channel state information (CSI) at the ETs, which is challenging to obtain practically, especially in a distributed system with ETs at separate locations. In this paper, we study practical and efficient channel training methods to achieve optimal EB in a distributed WET system. We propose two protocols with and without centralized coordination, respectively, where distributed ETs either sequentially or in parallel adapt their transmit phases based on a low-complexity energy feedback from the ER. The energy feedback only depends on the received power level at the ER, where each feedback indicates one particular transmit phase that results in the maximum harvested power over a set of previously used phases. Simulation results show that the two proposed training protocols converge very fast in practical WET systems even with a large number of distributed ETs, while the protocol with sequential ET phase adaptation is also analytically shown to converge to the optimal EB design with perfect CSI by increasing the training time. Numerical results are also provided to evaluate the performance of the proposed distributed EB and training designs as compared to other benchmark schemes.

  14. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.

  15. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method for movement within method operable design region (MODR). Unlike current methods, analytical method developed using analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend among pharmaceutical industry to implement analytical quality by design (AQbD) in method development process as a part of risk management, pharmaceutical development, and pharmaceutical quality system (ICH Q10). Owing to the lack explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in pharmaceutical quality system and also to correlate with product quality by design and pharmaceutical analytical technology (PAT).

  16. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is known now as AQbD (analytical quality by design). It allows the analytical method for movement within method operable design region (MODR). Unlike current methods, analytical method developed using analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend among pharmaceutical industry to implement analytical quality by design (AQbD) in method development process as a part of risk management, pharmaceutical development, and pharmaceutical quality system (ICH Q10). Owing to the lack explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in pharmaceutical quality system and also to correlate with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  17. Refinement for fault-tolerance: An aircraft hand-off protocol

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Schneider, Fred B.; Dehn, Jon

    1994-01-01

    Part of the Advanced Automation System (AAS) for air-traffic control is a protocol to permit flight hand-off from one air-traffic controller to another. The protocol must be fault-tolerant and, therefore, is subtle -- an ideal candidate for the application of formal methods. This paper describes a formal method for deriving fault-tolerant protocols that is based on refinement and proof outlines. The AAS hand-off protocol was actually derived using this method; that derivation is given.

  18. DIGE Analysis of Human Tissues.

    PubMed

    Gelfi, Cecilia; Capitanio, Daniele

    2018-01-01

    Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed to not interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, once a proper labeling protocol is followed. The two-dye or three-dye systems can be adopted and their choice depends on specific applications. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generates robust quantitative results making data validation by independent technologies successful.

  19. Microplastic Exposure Assessment in Aquatic Environments: Learning from Similarities and Differences to Engineered Nanoparticles.

    PubMed

    Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo

    2017-03-07

    Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments and research into their behavior and fate has been sharply increasing in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferrable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENP and MPs based on their similarities as particulate contaminants, whereas critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MPs environmental risk assessment.

  20. Compilation of annual reports of the Navy ELF (Extremely Low Frequency) communications system ecological monitoring program. Volume 1: Tabs A-E

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bruhn, J.; Cattelino, P.; Janke, R.; Jurgensen, M.; Mroz, G.; Reed, E. J.; Trettin, C.

    1984-07-01

    A long-term program of studying ELF electromagnetic influences on ecosystems in northwestern Wisconsin and the Upper Peninsula of Michigan is being conducted. Selection of study sites, monitoring protocols, and analytical methods were initiated in 1982. Data collection was initiated in 1983. Progress is described for studying the terrestrial, aquatic, and wetland ecosystems for the 10 projects comprising the ecological monitoring program. The 10 projects contain Herbaceous Plant Cover and Tree Studies; Litter Decomposition and Microflora; The Effects of Exposing the Slime Mold Physarum polycephalum; Soil Amoeba; Soil and Litter Arthropoda and Earthworm Studies; Biological Studies on Pollinating Insects (Megachilid Bees); Small Vertebrates (Small Mammals and Nesting Birds); Aquatic Ecosystems; Wetland Studies; and Field Studies of Effects of ELF on Migrating Birds.

  1. Modified salting-out method: high-yield, high-quality genomic DNA extraction from whole blood using laundry detergent.

    PubMed

    Nasiri, H; Forouzandeh, M; Rasaee, M J; Rahbarizadeh, F

    2005-01-01

    Different approaches have been used to extract DNA from whole blood. In most of these methods enzymes (such as proteinase K and RNAse A) or toxic organic solvents (such as phenol or guanidine isothiocyanate) are used. Since these enzymes are expensive, and most of the materials that are used routinely are toxic, it is desirable to apply an efficient DNA extraction procedure that does not require the use of such materials. In this study, genomic DNA was extracted by the salting-out method, but instead of using an analytical-grade enzyme and chemical detergents, as normally used for DNA isolation, a common laundry powder was used. Different concentrations of the powder were tested, and proteins were precipitated by NaCl-saturated distilled water. Finally, DNA precipitation was performed with the use of 96% ethanol. From the results, we conclude that the optimum concentration of laundry powder for the highest yield and purity of isolated DNA is 30 mg/mL. The procedure was optimized, and a final protocol is suggested. Following the same protocol, DNA was extracted from 100 blood samples, and their amounts were found to be >50 microg/mL of whole blood. The integrity of the DNA fragments was confirmed by agarose gel electrophoresis. Furthermore, the extracted DNA was used as a template for PCR reaction. The results obtained from PCR showed that the final solutions of extracted DNA did not contain any inhibitory material for the enzyme used in the PCR reaction, and indicated that the isolated DNA was of good quality. These results show that this method is simple, fast, safe, and cost-effective, and can be used in medical laboratories and research centers. Copyright 2005 Wiley-Liss, Inc.

  2. Outgassing and dimensional changes of polymer matrix composites in space

    NASA Technical Reports Server (NTRS)

    Tennyson, R. C.; Matthews, R.

    1993-01-01

    A thermal-vacuum outgassing model and test protocol for predicting outgassing times and dimensional changes for polymer matrix composites is described. Experimental results derived from a 'control' sample are used to provide the basis for analytical predictions to compare with the outgassing response of Long Duration Exposure Facility (LDEF) flight samples.

  3. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Addleman, Raymond S.; Naes, Benjamin E.; McNamara, Bruce K.

    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguard needs the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmentalmore » sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared performance to traditional IAEA’s Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical application supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguard applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks and a summary of progress, accomplishments and highlights are provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for evaluation). • Explored improvements to carbonate-peroxide rapid uranium extraction chemistry. • Evaluated new sampling materials and methods (in collaboration with ORNL). • Demonstrated successful ES extractions from standard and novel swipes for a wide range uranium compounds of interest including UO 2F 2 and UO 2(NO 3) 2, U 3O 8 and uranium ore concentrate. • Completed initial discussions with commercial suppliers of PTFE swipe materials. • Submitted one manuscript for publication. Two additional drafts are being prepared. Principal progress and accomplishments on Task 2, Optimize Materials and Methods for Direct SIMS Environmental Sample Analysis, are listed below. • Designed a SIMS swipe sample holder that retrofits into existing equipment and provides simple, effective, and rapid mounting of ES samples for direct assay while enabling automation and laboratory integration. • Identified preferred conductive sampling materials with better performance characteristics. • Ran samples on the new PNNL NWAL equivalent Cameca 1280 SIMS system. • Obtained excellent agreement between isotopic ratios for certified materials and direct SIMS assay of very low levels of LEU and HEU UO 2F 2 particles on carbon fiber sampling material. Sample activities range from 1 to 500 CPM (uranium mass on sample is dependent upon specific isotope ratio but is frequently in the subnanogram range). 
• Found that the presence of the UF molecular ions, as measured by SIMS, provides chemical information about the particle that is separate from the uranium isotopics and strongly suggests that those particles originated from an UF6 enrichment activity. • Submitted one manuscript for publication. Another manuscript is in preparation.« less

  4. A splay tree-based approach for efficient resource location in P2P networks.

    PubMed

    Zhou, Wei; Tan, Zilong; Yao, Shaowen; Wang, Shipu

    2014-01-01

    Resource location in structured P2P system has a critical influence on the system performance. Existing analytical studies of Chord protocol have shown some potential improvements in performance. In this paper a splay tree-based new Chord structure called SChord is proposed to improve the efficiency of locating resources. We consider a novel implementation of the Chord finger table (routing table) based on the splay tree. This approach extends the Chord finger table with additional routing entries. Adaptive routing algorithm is proposed for implementation, and it can be shown that hop count is significantly minimized without introducing any other protocol overheads. We analyze the hop count of the adaptive routing algorithm, as compared to Chord variants, and demonstrate sharp upper and lower bounds for both worst-case and average case settings. In addition, we theoretically analyze the hop reducing in SChord and derive the fact that SChord can significantly reduce the routing hops as compared to Chord. Several simulations are presented to evaluate the performance of the algorithm and support our analytical findings. The simulation results show the efficiency of SChord.

  5. Methodology for and the determination of the major constituents and metabolites of the Amazonian botanical medicine ayahuasca in human urine.

    PubMed

    McIlhenny, Ethan H; Riba, Jordi; Barbanoj, Manel J; Strassman, Rick; Barker, Steven A

    2011-09-01

    Ayahuasca, also known as caapi or yage among various South American groups, holds a highly esteemed and millennia-old position in these cultures' medical and religious pharmacopeia. There is now an increasing interest in the potential for modern medical applications of ayahuasca, as well as concerns regarding its increasing potential for abuse. Toxicological and clinical research to address these issues will require information regarding its metabolism and clearance. Thus, a rapid, sensitive and specific method for characterization and quantitation of the major constituents and of the metabolites of ayahuasca in urine is needed. The present research provides a protocol for conducting such analyses. The characteristics of the method, conducted by sample dilution and using HPLC-electrospray ionization (ESI)-selected reaction monitoring (SRM)-tandem mass spectrometry, are presented. The application of the analytical protocol to urine samples collected from three individuals that were administered ayahuasca has also been demonstrated. The data show that the major metabolite of the hallucinogenic component of ayahuasca, N,N-dimethyltryptamine (DMT), is the corresponding N-oxide, the first time this metabolite has been described in in vivo studies in humans. Further, very little DMT was detected in urine, despite the inhibition of monoamine oxidase afforded by the presence of the harmala alkaloids in ayahuasca. The major harmala alkaloid excreted was tetrahydroharmine. Other excretion products and metabolites were also identified and quantified. The method described would be suitable for use in further toxicological and clinical research on ayahuasca. Copyright © 2010 John Wiley & Sons, Ltd.

  6. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    PubMed

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while maintaining the AD disease effect. Compared to conventional regression methods, our method showed the best performance for in terms of controlling the protocol effect while preserving disease information. Protocol-specific w-score standardization effectively resolved the concerns of conventional regression methods. It showed the best performance for improving the compatibility of a T1 MR post-processed feature, cortical thickness. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  8. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  9. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  10. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  11. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  12. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  13. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  14. Automatically measuring brain ventricular volume within PACS using artificial intelligence.

    PubMed

    Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon

    2018-01-01

    The picture archiving and communications system (PACS) is currently the standard platform to manage medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at a voxel level, decrypted and uncompressed that allows analytical capabilities while not perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94% and holds promise for even better accuracy by incorporating the unlimited data available. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented and runs fully automatically within the PACS.

  15. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    PubMed

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.

  16. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  17. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  18. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  19. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  20. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

Top