Sample records for common analytical method

  1. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, which was the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
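
    The calculation the abstract attributes to NORMINV can be sketched with its Python analogue, `scipy.stats.norm.ppf`. The sketch below (not the paper's exact worked example; the 95% reference limits and the bias/imprecision values are illustrative) computes the fraction of a Gaussian reference population falling outside shared reference limits for a method with normalized bias b and imprecision ratio s:

```python
# Fraction of a Gaussian reference population outside shared 95% reference
# limits, for a method with normalized bias b and imprecision ratio s
# (both expressed in multiples of the reference standard deviation).
# scipy.stats.norm.ppf is the Python analogue of Excel's NORMINV.
from scipy.stats import norm

def fraction_outside(b: float, s: float) -> float:
    lower, upper = norm.ppf(0.025), norm.ppf(0.975)   # the +/-1.96 SD limits
    below = norm.cdf((lower - b) / s)                 # mass pushed below the lower limit
    above = 1.0 - norm.cdf((upper - b) / s)           # mass pushed above the upper limit
    return below + above

# An unbiased method with nominal imprecision leaves the expected 5% outside.
print(round(fraction_outside(0.0, 1.0), 3))  # 0.05
```

    Any combination of bias and imprecision yielding the same total fraction outside the limits defines an equivalent performance specification, which is the equivalence the paper exploits.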

  2. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  3. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  4. Reliability of proton NMR spectroscopy for the assessment of frying oil oxidation

    USDA-ARS's Scientific Manuscript database

    Although there are many analytical methods developed to assess oxidation of edible oil, it is still common to see a lack of consistency in results from different methods. This inconsistency is expected since there are numerous oxidation products and any analytical method measuring only one kind of o...

  5. ANALYTICAL METHOD COMPARISONS BY ESTIMATES OF PRECISION AND LOWER DETECTION LIMIT

    EPA Science Inventory

    The paper describes the use of principal component analysis to estimate the operating precision of several different analytical instruments or methods simultaneously measuring a common sample of a material whose actual value is unknown. This approach is advantageous when none of ...
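
    The idea the abstract describes can be sketched as follows (this is an illustrative reconstruction, not the EPA paper's exact procedure): several instruments measure the same samples, the shared first principal component captures the common sample variation, and each instrument's residual spread estimates its operating precision.

```python
# Sketch: estimate per-instrument precision from joint measurements of common
# samples whose true values are unknown, by removing the shared first
# principal component and inspecting the residual spread per instrument.
import numpy as np

rng = np.random.default_rng(1)
true_values = rng.normal(10.0, 2.0, size=200)          # unknown sample values
noise_sd = np.array([0.1, 0.3, 0.6])                   # per-instrument imprecision (assumed)
X = true_values[None, :] + rng.normal(0, noise_sd[:, None], size=(3, 200))

Xc = X - X.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
shared = S[0] * np.outer(U[:, 0], Vt[0])               # common-signal component
residual_sd = (Xc - shared).std(axis=1, ddof=1)
print(residual_sd)  # ordered like the per-instrument noise levels
```

    The appeal of the approach, as the abstract notes, is that no instrument needs to be treated as a reference: precision estimates emerge from the joint structure alone.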

  6. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    PubMed

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as assay system-specific reference intervals in China.
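
    The parametric reference-interval step mentioned above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's full protocol (the latent abnormal values exclusion step is omitted): Box-Cox-transform the reference values toward normality, take mean ± 1.96 SD, then back-transform.

```python
# Minimal sketch of a parametric reference-interval derivation via Box-Cox.
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(42)
values = rng.lognormal(mean=1.0, sigma=0.3, size=3000)   # stand-in analyte data

transformed, lam = stats.boxcox(values)                  # estimate lambda, transform
m, s = transformed.mean(), transformed.std(ddof=1)
lower, upper = inv_boxcox(np.array([m - 1.96 * s, m + 1.96 * s]), lam)

coverage = np.mean((values >= lower) & (values <= upper))
print(round(coverage, 3))  # close to 0.95
```

    The back-transformed limits bracket roughly the central 95% of the skewed reference distribution, which is what the modified Box-Cox approach is designed to achieve without assuming Gaussian raw data.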

  7. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China

    PubMed Central

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li’an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-01-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box–Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as assay system-specific reference intervals in China. PMID:26945390

  8. Fluorescence polarization immunoassays for rapid, accurate, and sensitive determination of mycotoxins

    USDA-ARS's Scientific Manuscript database

    Analytical methods for the determination of mycotoxins in foods are commonly based on chromatographic techniques (GC, HPLC or LC-MS). Although these methods permit a sensitive and accurate determination of the analyte, they require skilled personnel and are time-consuming, expensive, and unsuitable ...

  9. A Structural and Correlational Analysis of Two Common Measures of Personal Epistemology

    ERIC Educational Resources Information Center

    Laster, Bonnie Bost

    2010-01-01

    Scope and Method of Study: The current inquiry is a factor analytic study which utilizes first and second order factor analytic methods to examine the internal structures of two measurements of personal epistemological beliefs: the Schommer Epistemological Questionnaire (SEQ) and Epistemic Belief Inventory (EBI). The study also examines the…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, W.W.; Sullivan, H.H.

    Electroless nickel-plate characteristics are substantially influenced by percent phosphorus concentrations. Available ASTM analytical methods are designed for phosphorus concentrations of less than one percent, compared to the 4.0 to 20.0% concentrations common in electroless nickel plate. A variety of analytical adaptations are applied across the industry, resulting in poor data continuity. This paper presents a statistical comparison of five analytical methods and recommends accurate and precise procedures for use in percent phosphorus determinations in electroless nickel plate. 2 figures, 1 table.

  11. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and gain practical experience of the difference between performing an external standardization and a standard addition.
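
    The standard-addition calculation the experiment is built around can be sketched in a few lines (all concentrations and response factors below are invented for illustration): spike known increments into the sample, fit signal against added concentration, and extrapolate to the x-intercept to recover the analyte concentration despite matrix effects on the slope.

```python
# Standard addition: the sample matrix may change the sensitivity (slope),
# but the x-intercept of signal vs. added concentration still recovers the
# analyte concentration present in the sample.
import numpy as np

added = np.array([0.0, 1.0, 2.0, 3.0])        # spike concentrations (arbitrary units)
true_conc, sensitivity = 2.0, 0.5             # hypothetical sample conc. and response factor
signal = sensitivity * (true_conc + added)    # spikes see the same matrix as the analyte

slope, intercept = np.polyfit(added, signal, 1)
estimated = intercept / slope                 # |x-intercept| = analyte concentration
print(round(estimated, 3))  # 2.0
```

    An external standardization would instead calibrate in a clean solvent, where a different sensitivity would bias the result; that contrast is the point of the exercise.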

  12. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. 8 phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit at 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices.
Copyright © 2012 Elsevier B.V. All rights reserved.
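
    The quantitation scheme named above, external calibration with an internal standard, can be sketched as follows (peak-area ratios and concentrations below are made-up numbers, not data from the paper): calibrate the analyte/internal-standard peak-area ratio against standard concentrations, then read unknowns off the fitted line.

```python
# External calibration with an internal standard: using the area ratio
# (analyte peak area / internal standard peak area) cancels injection-volume
# and instrument-drift variability between runs.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # calibration standards, ug/mL
ratio = np.array([0.24, 0.51, 0.99, 1.52, 2.48])  # analyte area / IS area (illustrative)

slope, intercept = np.polyfit(conc, ratio, 1)

unknown_ratio = 1.20                               # measured ratio for a sample extract
unknown_conc = (unknown_ratio - intercept) / slope
print(round(unknown_conc, 2))                      # about 2.4 ug/mL
```

    Any dilution or extraction factor applied during sample preparation would then be multiplied back in to report the concentration in the original cosmetic matrix.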

  13. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

    Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Beyond the common goal of big data analysis, different perspectives and terms on big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. 3D-MICE: integration of cross-sectional and longitudinal imputation for multi-analyte longitudinal clinical data.

    PubMed

    Luo, Yuan; Szolovits, Peter; Dighe, Anand S; Baron, Jason M

    2018-06-01

    A key challenge in clinical data mining is that most clinical datasets contain missing data. Since many commonly used machine learning algorithms require complete datasets (no missing data), clinical analytic approaches often entail an imputation procedure to "fill in" missing data. However, although most clinical datasets contain a temporal component, most commonly used imputation methods do not adequately accommodate longitudinal time-based data. We sought to develop a new imputation algorithm, 3-dimensional multiple imputation with chained equations (3D-MICE), that can perform accurate imputation of missing clinical time series data. We extracted clinical laboratory test results for 13 commonly measured analytes (clinical laboratory tests). We imputed missing test results for the 13 analytes using 3 imputation methods: multiple imputation with chained equations (MICE), Gaussian process (GP), and 3D-MICE. 3D-MICE utilizes both MICE and GP imputation to integrate cross-sectional and longitudinal information. To evaluate imputation method performance, we randomly masked selected test results and imputed these masked results alongside results missing from our original data. We compared predicted results to measured results for masked data points. 3D-MICE performed significantly better than MICE and GP-based imputation in a composite of all 13 analytes, predicting missing results with a normalized root-mean-square error of 0.342, compared to 0.373 for MICE alone and 0.358 for GP alone. 3D-MICE offers a novel and practical approach to imputing clinical laboratory time series data. 3D-MICE may provide an additional tool for use as a foundation in clinical predictive analytics and intelligent clinical decision support.
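
    The evaluation protocol described in the abstract (mask known results, impute them, and score with a normalized root-mean-square error) can be sketched on synthetic data. This illustrates only the masking-and-scoring step with a simple column-mean baseline, not the 3D-MICE algorithm itself:

```python
# Masking-based evaluation of an imputation method: hide a random subset of
# known values, impute them, and compare imputed vs. true values with a
# range-normalized RMSE (pooled across analytes with different units).
import numpy as np

rng = np.random.default_rng(7)
# 500 patients x 3 analytes with different scales (synthetic stand-ins).
data = rng.normal(loc=[5.0, 100.0, 1.5], scale=[1.0, 15.0, 0.2], size=(500, 3))

mask = rng.random(data.shape) < 0.1            # hide ~10% of entries
masked = np.where(mask, np.nan, data)

# Baseline imputation: fill each masked entry with its analyte's observed mean.
imputed = np.where(mask, np.nanmean(masked, axis=0), masked)

# Normalize each analyte's error by its observed range before pooling.
ranges = np.nanmax(masked, axis=0) - np.nanmin(masked, axis=0)
err = (imputed - data) / ranges
nrmse = np.sqrt(np.mean(err[mask] ** 2))
print(round(nrmse, 3))
```

    A stronger imputer such as 3D-MICE would be judged by the same score computed over the same masked entries, which is how the paper's 0.342 vs. 0.373 vs. 0.358 comparison is obtained.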

  15. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer

  16. Analytical and quasi-Bayesian methods as development of the iterative approach for mixed radiation biodosimetry.

    PubMed

    Słonecka, Iwona; Łukasik, Krzysztof; Fornalski, Krzysztof W

    2018-06-04

    The present paper proposes two methods of calculating components of the dose absorbed by the human body after exposure to a mixed neutron and gamma radiation field. The article presents a novel approach to replace the common iterative method in its analytical form, thus reducing the calculation time. It also shows a possibility of estimating the neutron and gamma doses when their ratio in a mixed beam is not precisely known.

  17. Keeping It Simple: Can We Estimate Malting Quality Potential Using an Isothermal Mashing Protocol and Common Laboratory Instrumentation?

    USDA-ARS's Scientific Manuscript database

    Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...

  18. Assessing the impact of natural policy experiments on socioeconomic inequalities in health: how to apply commonly used quantitative analytical methods?

    PubMed

    Hu, Yannan; van Lenthe, Frank J; Hoffmann, Rasmus; van Hedel, Karen; Mackenbach, Johan P

    2017-04-20

    The scientific evidence-base for policies to tackle health inequalities is limited. Natural policy experiments (NPE) have drawn increasing attention as a means to evaluating the effects of policies on health. Several analytical methods can be used to evaluate the outcomes of NPEs in terms of average population health, but it is unclear whether they can also be used to assess the outcomes of NPEs in terms of health inequalities. The aim of this study therefore was to assess whether, and to demonstrate how, a number of commonly used analytical methods for the evaluation of NPEs can be applied to quantify the effect of policies on health inequalities. We identified seven quantitative analytical methods for the evaluation of NPEs: regression adjustment, propensity score matching, difference-in-differences analysis, fixed effects analysis, instrumental variable analysis, regression discontinuity and interrupted time-series. We assessed whether these methods can be used to quantify the effect of policies on the magnitude of health inequalities either by conducting a stratified analysis or by including an interaction term, and illustrated both approaches in a fictitious numerical example. All seven methods can be used to quantify the equity impact of policies on absolute and relative inequalities in health by conducting an analysis stratified by socioeconomic position, and all but one (propensity score matching) can be used to quantify equity impacts by inclusion of an interaction term between socioeconomic position and policy exposure. Methods commonly used in economics and econometrics for the evaluation of NPEs can also be applied to assess the equity impact of policies, and our illustrations provide guidance on how to do this appropriately. The low external validity of results from instrumental variable analysis and regression discontinuity makes these methods less desirable for assessing policy effects on population-level health inequalities. 
Increased use of the methods in social epidemiology will help to build an evidence base to support policy making in the area of health inequalities.
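
    The stratified approach the record above recommends, here shown for difference-in-differences, can be sketched on synthetic data (populations, effect sizes and the secular trend below are all invented for illustration): estimate the policy effect separately within each socioeconomic stratum and compare the two estimates to quantify the equity impact.

```python
# Difference-in-differences, stratified by socioeconomic position (SES):
# DiD = (treated post - treated pre) - (control post - control pre),
# computed within each SES stratum to reveal differential policy effects.
import numpy as np

rng = np.random.default_rng(3)

def did(pre_t, post_t, pre_c, post_c):
    return (post_t.mean() - pre_t.mean()) - (post_c.mean() - pre_c.mean())

true_effects = {"low_ses": 2.0, "high_ses": 0.5}   # assumed policy effects by stratum
estimates = {}
for stratum, effect in true_effects.items():
    base = rng.normal(50.0, 5.0, size=(4, 5000))   # pre_t, post_t, pre_c, post_c
    pre_t, post_t, pre_c, post_c = base
    post_t = post_t + 1.0 + effect                 # secular trend + policy effect
    post_c = post_c + 1.0                          # secular trend only
    estimates[stratum] = did(pre_t, post_t, pre_c, post_c)

print({k: round(v, 1) for k, v in estimates.items()})
```

    The gap between the stratum-specific estimates is the absolute equity impact; the same logic carries over to the interaction-term formulation the authors describe for regression-based methods.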

  19. Analytical method for the simultaneous determination of polyfunctional amines used as monomers in the manufacture of food packaging materials.

    PubMed

    Paseiro-Cerrato, R; de Quirós, A Rodríguez-Bernaldo; Sendón, Raquel; Bustos, Juana; Ruíz, E; Cruz, J M; Paseiro-Losada, P

    2011-10-07

    This paper describes the development of a multi-analyte method for the determination of polyfunctional amines commonly used as monomers in the manufacture of food contact materials. Amines were analyzed by high-performance liquid chromatography with diode-array detection (HPLC-DAD) after derivatization with dansyl chloride. The chromatographic analysis and the derivatization conditions were optimized. The proposed method was validated in terms of linearity, limits of detection and repeatability. The method showed an excellent sensitivity (LOD ≤ 0.05 μg/mL) and appropriate repeatability (RSD (n = 7) ≤ 5%). LC-MS/MS was used as a confirmatory technique. The stability of the amines in five food simulants (distilled water, 3% acetic acid, 10% ethanol, 50% ethanol and olive oil) under the most common testing conditions (10 days at 40 °C) was also studied. Results showed that the amines had acceptable stability in aqueous simulants, but in olive oil a loss of 100% was observed for all analytes. Copyright © 2011. Published by Elsevier B.V.

  20. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  1. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
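
    The parallelism check described above can be sketched numerically (the response factors and the endogenous plasma level below are invented for illustration, not values from the paper): compare the calibration slope obtained in the surrogate matrix with the slope from standard addition in the real matrix; similar slopes support using the surrogate-matrix calibration for the biological samples.

```python
# Parallelism assessment via standard addition: if the response per unit of
# added analyte is the same in surrogate matrix and in plasma, the two
# calibration approaches are interchangeable for quantitation.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0])       # spiked amino acid, uM

surrogate_resp = 0.080 * added                 # surrogate matrix: no endogenous analyte
plasma_resp = 0.078 * (added + 25.0)           # plasma: endogenous baseline ~25 uM (assumed)

slope_surrogate = np.polyfit(added, surrogate_resp, 1)[0]
slope_plasma = np.polyfit(added, plasma_resp, 1)[0]

ratio = slope_plasma / slope_surrogate
parallel = 0.9 <= ratio <= 1.1                 # an illustrative acceptance window
print(round(ratio, 3), parallel)               # 0.975 True
```

    The plasma line's nonzero intercept reflects the endogenous analyte; only the slopes are compared, which is why standard addition isolates the matrix effect from the endogenous background.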

  2. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    PubMed

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.

  3. Introducing Chemometrics to the Analytical Curriculum: Combining Theory and Lab Experience

    ERIC Educational Resources Information Center

    Gilbert, Michael K.; Luttrell, Robert D.; Stout, David; Vogt, Frank

    2008-01-01

    Beer's law is an ideal technique that works only in certain situations. A method for dealing with more complex conditions needs to be integrated into the analytical chemistry curriculum. For that reason, the capabilities and limitations of two common chemometric algorithms, classical least squares (CLS) and principal component regression (PCR),…

  4. Enumeration of sugars and sugar alcohols hydroxyl groups by aqueous-based acetylation and MALDI-TOF mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    A method is described for enumerating hydroxyl groups on analytes in aqueous media, and applied to some common polyalcohols (erythritol, mannitol, and xylitol) and selected carbohydrates. The analytes were derivatized in water with vinyl acetate in the presence of a sodium phosphate buffer. ...

  5. Maximum Likelihood Estimation in Meta-Analytic Structural Equation Modeling

    ERIC Educational Resources Information Center

    Oort, Frans J.; Jak, Suzanne

    2016-01-01

    Meta-analytic structural equation modeling (MASEM) involves fitting models to a common population correlation matrix that is estimated on the basis of correlation coefficients reported by a number of independent studies. MASEM typically consists of two stages. The method that has been found to perform best in terms of statistical…

  6. Analytic modeling of aerosol size distributions

    NASA Technical Reports Server (NTRS)

    Deepack, A.; Box, G. P.

    1979-01-01

    Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.

  7. Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources

    NASA Astrophysics Data System (ADS)

    Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi

    2017-01-01

    Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, correlation among different random power injections (such as wind power) makes PPF difficult to calculate. Monte Carlo simulation (MCS) and analytical methods are the two commonly used approaches to solving PPF. MCS has high accuracy but is very time consuming. Analytical methods such as the cumulant method (CM) are computationally efficient, but calculating the cumulants is inconvenient when wind power output does not follow any standard distribution, especially when correlated wind sources are considered. In this paper, an improved Monte Carlo simulation method (IMCS) is proposed. The joint empirical distribution is applied to model the different wind power outputs. This method combines the advantages of both MCS and analytical methods: it not only has high computing efficiency but also provides solutions with sufficient accuracy, making it very suitable for on-line analysis.
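
    The sampling step of such a method can be sketched with a Gaussian copula over empirical marginals. This is one illustrative reading of sampling from a "joint empirical distribution", not the paper's exact algorithm, and the wind-farm data below are synthetic:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical historical outputs (MW) of two nearby wind farms: skewed, non-Gaussian
hist_a = rng.weibull(2.0, 5000) * 40.0
hist_b = rng.weibull(2.0, 5000) * 60.0

def correlated_injections(hist1, hist2, rho, n):
    """Draw correlated injection samples from the joint empirical distribution:
    correlated normals -> uniforms (normal CDF) -> empirical quantiles."""
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    z = rng.standard_normal((n, 2)) @ L.T
    u = np.vectorize(lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))))(z)
    return np.quantile(hist1, u[:, 0]), np.quantile(hist2, u[:, 1])

p_a, p_b = correlated_injections(hist_a, hist_b, rho=0.7, n=20000)
p_total = p_a + p_b  # total wind injection; each sample would feed one power-flow run
print(p_total.mean(), p_total.std())
```

    Each drawn pair would be injected into a deterministic power-flow solve; the distribution of the resulting bus voltages and line flows is the PPF output.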

  8. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method with regression by partial least squares is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently combined in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. Different parameters were evaluated in order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration. Adequate selection of the spectral regions proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. Recovery for the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
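
    The calibration idea can be sketched with a minimal PLS1 (NIPALS) implementation on synthetic overlapped spectra; the band positions, noise level, and factor count below are assumptions for illustration, not the paper's values:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 (NIPALS): regression coefficients for mean-centered data."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        qa = yc @ t / tt                # y loading
        Xc = Xc - np.outer(t, p)        # deflate
        yc = yc - qa * t
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, x_mean, y_mean

def pls1_predict(x_new, B, x_mean, y_mean):
    return (x_new - x_mean) @ B + y_mean

# Synthetic calibration set: two heavily overlapped bands; predict component 1
rng = np.random.default_rng(2)
wl = np.linspace(250.0, 290.0, 81)
s1 = np.exp(-0.5 * ((wl - 265.0) / 8.0) ** 2)  # first component's band (assumed shape)
s2 = np.exp(-0.5 * ((wl - 272.0) / 8.0) ** 2)  # second, overlapping band
C = rng.uniform(0.1, 1.0, (30, 2))             # training concentrations
X = C @ np.vstack([s1, s2]) + rng.normal(0.0, 1e-3, (30, 81))

B, xm, ym = pls1_fit(X, C[:, 0], n_components=2)
x_test = np.array([0.6, 0.4]) @ np.vstack([s1, s2])
pred = float(pls1_predict(x_test, B, xm, ym))
print(pred)  # recovery near 100% of the true 0.6
```

    Unlike classical least squares, PLS needs only calibration mixtures of known concentration, not the pure-component spectra themselves.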

  9. An analytical method for prediction of stability lobes diagram of milling of large-size thin-walled workpiece

    NASA Astrophysics Data System (ADS)

    Yao, Jiming; Lin, Bin; Guo, Yu

    2017-01-01

    Unlike the milling of common thin-walled workpieces, the milling of large-size thin-walled workpieces is also prone to chatter in the axial direction along the spindle, because of the low stiffness of the workpiece in this direction. An analytical method for predicting the stability lobes of milling of large-size thin-walled workpieces is presented in this paper. The method considers not only the frequency response function of the tool point but also that of the workpiece.

  10. Coding and Commonality Analysis: Non-ANOVA Methods for Analyzing Data from Experiments.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    The advantages and disadvantages of three analytic methods used to analyze experimental data in educational research are discussed. The same hypothetical data set is used with all methods for a direct comparison. The Analysis of Variance (ANOVA) method and its several analogs are collectively labeled OVA methods and are evaluated. Regression…

  11. Detection methods and performance criteria for genetically modified organisms.

    PubMed

    Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly

    2002-01-01

    Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance with food labeling in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancies in results between such widely differing methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria allow methods to be evaluated against a common standard. The more common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and to reject, on a routine basis, data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, the current GMO detection methods should be evaluated against a common set of performance criteria.

  12. Forensic investigation of plutonium metal: a case study of CRM 126

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal

    In this study, a certified plutonium metal reference material (CRM 126) with a known production history was examined using analytical methods commonly employed in nuclear forensics for provenancing and attribution. The measured plutonium isotopic composition and actinide assay are consistent with values reported on the reference material certificate, and model ages from the U/Pu and Am/Pu chronometers agree with the documented production timeline. These results confirm the utility of these analytical methods and highlight the importance of a holistic approach in the forensic study of unknown materials.

  13. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  14. A comparative review of optical surface contamination assessment techniques

    NASA Technical Reports Server (NTRS)

    Heaney, James B.

    1987-01-01

    This paper reviews the relative sensitivities and practicalities of the common surface analytical methods used to detect and identify unwelcome adsorbates on optical surfaces. The compared methods include visual inspection, simple reflectometry and transmissometry, ellipsometry, infrared absorption and attenuated total reflectance (ATR) spectroscopy, Auger electron spectroscopy (AES), scanning electron microscopy (SEM), secondary ion mass spectrometry (SIMS), and mass accretion determined by quartz crystal microbalance (QCM). The discussion is biased toward methods that apply optical thin-film analytical techniques to spacecraft optical contamination problems. Examples are cited from both ground-based and in-orbit experiments.

  15. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Given the shapes of the PDFs and experimental data, different optimization schemes can be applied to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of the optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10⁴). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, which makes the method useful for various applications.

  16. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was three- to sevenfold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
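
    The subtraction of the analytical component from the duplicate-based total variation follows from independent coefficients of variation combining in quadrature, CV_T² = CV_A² + CV_P². A small sketch with illustrative numbers in the reported ranges (not the paper's per-analyte values):

```python
import math

def total_cv(cv_analytical, cv_preanalytical):
    """Independent uncertainty components combine in quadrature."""
    return math.hypot(cv_analytical, cv_preanalytical)

def preanalytical_cv(cv_total, cv_analytical):
    """Recover the pre-analytical component from duplicate-based total variation."""
    return math.sqrt(cv_total**2 - cv_analytical**2)

# Illustrative values (%) in the ranges reported above
cv_a, cv_t = 10.0, 45.0
cv_p = preanalytical_cv(cv_t, cv_a)
print(cv_p)         # pre-analytical component dominates
print(2.0 * cv_t)   # half-width of the 95%-uncertainty interval, +/- 2*CV_T
```

    Because the components add in quadrature, a pre-analytical CV several times the analytical CV leaves the total almost unchanged when the analytical CV is further reduced, which is the practical point of the study.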

  17. Common decision limits --The need for harmonised immunoassays.

    PubMed

    Sturgeon, Catharine M

    2014-05-15

    The main aim of clinical guidelines is to encourage the best clinical outcome for patients and the best use of resources, no matter where patients are investigated or managed. Where guidelines incorporate decision limits based on levels of analytes in serum, plasma or urine, these limits may determine whether or not to treat, or may be used to tailor further treatment. Consideration should be given to the effect of method-related differences in results when implementing common decision limits. Available evidence suggests that for some analytes the implications for the patient may be serious, e.g. missed or unnecessary prostatic biopsies when prostate specific antigen is measured. Major causes of between-method differences are reviewed and means of addressing them considered. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Advances in simultaneous DSC-FTIR microspectroscopy for rapid solid-state chemical stability studies: some dipeptide drugs as examples.

    PubMed

    Lin, Shan-Yang; Wang, Shun-Li

    2012-04-01

    The solid-state chemistry of drugs has grown in importance in the pharmaceutical industry for the development of useful active pharmaceutical ingredients (APIs) and stable dosage forms. The stability of drugs in various solid dosage forms is an important issue because solid dosage forms are the most common pharmaceutical formulations in clinical use. In solid-state stability studies of drugs, an ideal accelerated method must not only be selected from among different complicated methods, but must also detect the formation of degradation products. In this review article, an analytical technique combining differential scanning calorimetry and Fourier-transform infrared (DSC-FTIR) microspectroscopy simulates the accelerated stability test and simultaneously detects the decomposition products in real time. The pharmaceutical dipeptides aspartame hemihydrate, lisinopril dihydrate, and enalapril maleate, either with or without Eudragit E, were used as test examples. This one-step simultaneous DSC-FTIR technique for real-time detection of diketopiperazine (DKP) directly evidenced the dehydration process and the formation of DKP, an impurity common in pharmaceutical dipeptides. Data on DKP formation in various dipeptides, as determined by different analytical methods, have been collected and compiled. Although many analytical methods have been applied, the combined DSC-FTIR technique is an easy and fast analytical method that not only simulates accelerated drug stability testing but also simultaneously explores phase transformation and degradation due to thermally related reactions. This technique offers quick and proper interpretation. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed the performances of the methods used to be assessed and compared. The three components were: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  20. Rapid and sensitive analytical method for monitoring of 12 organotin compounds in natural waters.

    PubMed

    Vahčič, Mitja; Milačič, Radmila; Sčančar, Janez

    2011-03-01

    A rapid analytical method was developed for the simultaneous determination of 12 different organotin compounds (OTC): methyl-, butyl-, phenyl- and octyl-tins in natural water samples. It comprises in situ derivatisation (using NaBEt4) of OTC in a salty or fresh water sample matrix adjusted to pH 6 with Tris-citrate buffer, extraction of the ethylated OTC into hexane, separation of the OTC in the organic phase on a 15 m GC column, and subsequent quantitative determination of the separated OTC by ICP-MS. To optimise the pH of ethylation, phosphate, carbonate and Tris-citrate buffers were investigated as alternatives to the commonly applied sodium acetate-acetic acid buffer. The ethylation yields in Tris-citrate buffer were found to be better for TBT, MOcT and DOcT in comparison with the commonly used acetate buffer. Iso-octane and hexane were examined as the organic phase for extraction of the ethylated OTC. The advantage of hexane was its ability to allow quantitative determination of TMeT. A 15 m GC column was used for separation of the studied OTC under the optimised separation conditions, and its performance was compared with that of a 30 m column. The analytical method developed enables sensitive simultaneous determination of 12 different OTC and appreciably shortens the analysis time for larger series of water samples. LODs obtained for the newly developed method ranged from 0.05-0.06 ng Sn L-1 for methyl-, 0.11-0.45 ng Sn L-1 for butyl-, 0.11-0.16 ng Sn L-1 for phenyl-, and 0.07-0.10 ng Sn L-1 for octyl-tins. By applying the developed analytical method, marine water samples from the Northern Adriatic Sea, containing mainly butyl- and methyl-tin species, were analysed to confirm the proposed method's applicability.

  1. Besifloxacin: A Critical Review of Its Characteristics, Properties, and Analytical Methods.

    PubMed

    Tótoli, Eliane Gandolpho; Salgado, Hérida Regina Nunes

    2018-03-04

    Bacterial conjunctivitis has a high impact on the health of the population, since it represents more than a third of the ocular pathologies reported by health services worldwide. There is a high incidence of bacterial resistance to the antimicrobials most commonly used for the treatment of conjunctivitis. In this context, besifloxacin stands out, since it is a fluoroquinolone developed exclusively for topical ophthalmic use, presenting a low risk of developing resistance due to its reduced systemic exposure. Bausch & Lomb markets it as an ophthalmic suspension under the trade name Besivance™. A literature review of besifloxacin is presented, covering its pharmaceutical and clinical characteristics and the analytical methods used to measure the drug in pharmaceutical products and biological samples. High-performance liquid chromatography is the method most used for this purpose. A discussion of Green Chemistry is also presented, focusing on the importance of developing green analytical methods for the analysis of drugs.

  2. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also proposed.
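
    The simplest case of this point can be worked for a linear model, where the variance with input correlations has a closed form and the error from assuming independence is explicit. The coefficients below are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Linear model y = a*x1 + b*x2 with correlated inputs; the analytic variance is
# Var(y) = a^2*s1^2 + b^2*s2^2 + 2*a*b*rho*s1*s2
a, b = 2.0, -1.0
s1, s2, rho = 0.5, 0.3, 0.6

var_analytic = a**2 * s1**2 + b**2 * s2**2 + 2.0 * a * b * rho * s1 * s2
var_independent = a**2 * s1**2 + b**2 * s2**2  # what ignoring the correlation gives

# Monte Carlo check of the analytic result
rng = np.random.default_rng(3)
cov = np.array([[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
y = a * x[:, 0] + b * x[:, 1]
print(var_analytic, var_independent, y.var())
```

    Here the negative cross term makes the independence assumption overstate the output variance; with a*b*rho > 0 it would understate it instead.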

  3. Forensic investigation of plutonium metal: a case study of CRM 126

    DOE PAGES

    Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal; ...

    2016-11-01

    In this study, a certified plutonium metal reference material (CRM 126) with a known production history was examined using analytical methods commonly employed in nuclear forensics for provenancing and attribution. The measured plutonium isotopic composition and actinide assay are consistent with values reported on the reference material certificate, and model ages from the U/Pu and Am/Pu chronometers agree with the documented production timeline. These results confirm the utility of these analytical methods and highlight the importance of a holistic approach in the forensic study of unknown materials.

  4. Periodized Daubechies wavelets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Restrepo, J.M.; Leaf, G.K.; Schlossnagle, G.

    1996-03-01

    The properties of periodized Daubechies wavelets on [0,1] are detailed, along with their counterparts, which form a basis for L²(R). Numerical examples illustrate the analytical estimates for convergence and demonstrate, by comparison with Fourier spectral methods, the superiority of wavelet projection methods for approximations. The analytical solution for inner products of periodized wavelets and their derivatives, known as connection coefficients, is presented, and its use is illustrated in the approximation of two commonly used differential operators. The periodization of the connection coefficients in Galerkin schemes is presented in detail.

  5. Uptake of recommended common reference intervals for chemical pathology in Australia.

    PubMed

    Jones, Graham Rd; Koetsier, Sabrina

    2017-05-01

    Background Reference intervals are a vital part of reporting numerical pathology results. It is known, however, that variation in reference intervals between laboratories is common, even when the analytical methods support common reference intervals. In response to this, in Australia, the Australasian Association of Clinical Biochemists, together with the Royal College of Pathologists of Australasia, published in 2014 a set of recommended common reference intervals for 11 common serum analytes (sodium, potassium, chloride, bicarbonate, creatinine male, creatinine female, calcium, calcium adjusted for albumin, phosphate, magnesium, lactate dehydrogenase, alkaline phosphatase and total protein). Methods Uptake of the recommended common reference intervals in Australian laboratories was assessed using data from four annual cycles of the RCPAQAP reference intervals external quality assurance programme. Results Over three years, from 2013 to 2016, use of the recommended upper and lower reference limits increased from 40% to 83%. Nearly half of the intervals in use by enrolled laboratories in 2016 had been changed in this time period, indicating an active response to the guidelines. Conclusions These data support the activities of the Australasian Association of Clinical Biochemists and the Royal College of Pathologists of Australasia in demonstrating a change in laboratory behaviour to reduce unnecessary variation in reference intervals and thus provide a consistent message to doctors and patients irrespective of the laboratory used.

  6. Rapid qualitative and quantitative analysis of proanthocyanidin oligomers and polymers by UPLC-MS/MS

    USDA-ARS?s Scientific Manuscript database

    Proanthocyanidins (PAs) are a structurally complex and bioactive group of tannins. Detailed analysis of PA concentration, composition, and structure typically requires the use of one or more time-consuming analytical methods. For example, the commonly employed thiolysis and phloroglucinolysis method...

  7. Accuracy verification and identification of matrix effects. The College of American Pathologists' Protocol.

    PubMed

    Eckfeldt, J H; Copeland, K R

    1993-04-01

    Proficiency testing using stabilized control materials has been used for decades as a means of monitoring and improving performance in the clinical laboratory. Often, the commonly used proficiency testing materials exhibit "matrix effects" that cause them to behave differently from fresh human specimens in certain clinical analytic systems. Because proficiency testing is the primary method in which regulatory agencies have chosen to evaluate clinical laboratory performance, the College of American Pathologists (CAP) has proposed guidelines for investigating the influence of matrix effects on their Survey results. The purpose of this investigation was to determine the feasibility, usefulness, and potential problems associated with this CAP Matrix Effect Analytical Protocol, in which fresh patient specimens and CAP proficiency specimens are analyzed simultaneously by a field method and a definitive, reference, or other comparative method. The optimal outcome would be that both the fresh human and CAP Survey specimens agree closely with the comparative method result. However, this was not always the case. Using several different analytic configurations, we were able to demonstrate matrix and calibration biases for several of the analytes investigated.

  8. Validation of Analytical Damping Ratio by Fatigue Stress Limit

    NASA Astrophysics Data System (ADS)

    Foong, Faruq Muhammad; Chung Ket, Thein; Beng Lee, Ooi; Aziz, Abdul Rashid Abdul

    2018-03-01

    The optimisation process of a vibration energy harvester is usually restricted to experimental approaches due to the lack of an analytical equation to describe the damping of the system. This study derives an analytical equation describing the first-mode damping ratio of a clamp-free cantilever beam under harmonic base excitation by combining the transverse equation of motion of the beam with the damping-stress equation. This equation, unlike other common damping determination methods, is independent of experimental inputs or finite element simulations and can be solved using a simple iterative convergence method. The derived equation was found to be correct for cases where the maximum bending stress in the beam is below the fatigue limit stress of the beam. However, an increasing trend in the error between the experimental and analytical results was observed at high stress levels. Hence, the fatigue limit stress was used as a parameter to define the validity of the analytical equation.

  9. Big data in sleep medicine: prospects and pitfalls in phenotyping

    PubMed Central

    Bianchi, Matt T; Russo, Kathryn; Gabbidon, Harriett; Smith, Tiaundra; Goparaju, Balaji; Westover, M Brandon

    2017-01-01

    Clinical polysomnography (PSG) databases are a rich resource in the era of “big data” analytics. We explore the uses and potential pitfalls of clinical data mining of PSG using statistical principles and analysis of clinical data from our sleep center. We performed retrospective analysis of self-reported and objective PSG data from adults who underwent overnight PSG (diagnostic tests, n=1835). Self-reported symptoms overlapped markedly between the two most common categories, insomnia and sleep apnea, with the majority reporting symptoms of both disorders. Standard clinical metrics routinely reported on objective data were analyzed for basic properties (missing values, distributions), pairwise correlations, and descriptive phenotyping. Of 41 continuous variables, including clinical and PSG-derived measures, none passed testing for normality. Objective findings of sleep apnea and periodic limb movements were common, with 51% having an apnea–hypopnea index (AHI) >5 per hour and 25% having a leg movement index >15 per hour. Different visualization methods are shown for common variables to explore population distributions. Phenotyping methods based on clinical databases are discussed for sleep architecture, sleep apnea, and insomnia. Inferential pitfalls are discussed using the current dataset and case examples from the literature. The increasing availability of clinical databases for large-scale analytics holds important promise in sleep medicine, especially as it becomes increasingly important to demonstrate the utility of clinical testing methods in management of sleep disorders. Awareness of the strengths, as well as caution regarding the limitations, will maximize the productive use of big data analytics in sleep medicine. PMID:28243157

  10. Non-monetary valuation using Multi-Criteria Decision Analysis: Sensitivity of additive aggregation methods to scaling and compensation assumptions

    EPA Science Inventory

    Analytical methods for Multi-Criteria Decision Analysis (MCDA) support the non-monetary valuation of ecosystem services for environmental decision making. Many published case studies transform ecosystem service outcomes into a common metric and aggregate the outcomes to set land ...

  11. GC/FT-IR ANALYSIS OF THE THERMALLY LABILE COMPOUND TRIS (2,3-DIBROMOPROPYL) PHOSPHATE

    EPA Science Inventory

A fast and convenient GC method has been developed for a compound [tris(2,3-dibromopropyl)phosphate] that poses a difficult analytical problem for both GC (thermal instability/low volatility) and LC (not amenable to commonly available, sensitive detectors) analysis. This method em...

  12. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    PubMed

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed owing to increased interest in their clinical effects. Method development was based on commonly used stationary (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on Zorbax Eclipse XDB C8 (150 × 4.6 mm, 5 μm; Agilent) and ACN-H2O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers and testing samples from a significant number of healthy reference individuals. Intervals cited from the historical literature are often suboptimal because they are based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined, and gender-specific or combined intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated as suitable for their patient populations. Laboratories using different methodologies may be able to adapt the intervals for their facilities using the reference interval transference technique, based on a method comparison study.
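The 95% central range used above is, in the nonparametric case, simply the span from the 2.5th to the 97.5th percentile of the reference results. A minimal sketch (this is not the Rhoads EP Evaluator implementation, and the rank-based percentile rule shown is one of several in common use):

```python
import random

def reference_interval(results, central=0.95):
    """Return (lower, upper) bounds of the central reference interval,
    e.g. the 2.5th and 97.5th percentiles for a 95% central range."""
    ranked = sorted(results)
    n = len(ranked)
    tail = (1.0 - central) / 2.0

    def percentile(p):
        # Simple rank-based estimate: rank = p * (n + 1), interpolated.
        rank = p * (n + 1)
        lo = max(int(rank) - 1, 0)
        hi = min(lo + 1, n - 1)
        frac = rank - int(rank)
        return ranked[lo] + frac * (ranked[hi] - ranked[lo])

    return percentile(tail), percentile(1.0 - tail)

# Example: simulated sodium results (mmol/L) from 120 reference individuals,
# the IFCC-recommended minimum sample size.
random.seed(1)
sodium = [random.gauss(140, 2.5) for _ in range(120)]
low, high = reference_interval(sodium)
```

With 120 reference individuals the interval lands close to mean ± 1.96 SD for Gaussian data, which is why the IFCC minimum implies a particular level of quality for shared intervals.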

  14. Method development and validation for simultaneous quantification of 15 drugs of abuse and prescription drugs and 7 of their metabolites in whole blood relevant in the context of driving under the influence of drugs--usefulness of multi-analyte calibration.

    PubMed

    Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas

    2014-11-01

    In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. 
Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Flexible nano- and microliter injections on a single liquid chromatography-mass spectrometry system: Minimizing sample preparation and maximizing linear dynamic range.

    PubMed

    Lubin, Arnaud; Sheng, Sheng; Cabooter, Deirdre; Augustijns, Patrick; Cuyckens, Filip

    2017-11-17

A lack of knowledge of the expected concentration range, or an insufficient linear dynamic range of the analytical method, is a common challenge for the analytical scientist. Samples above the upper limit of quantification are typically diluted and reanalyzed. The analysis of undiluted, highly concentrated samples can cause contamination of the system, while the dilution step is time consuming and, as is the case for any sample preparation step, potentially leads to precipitation, adsorption or degradation of the analytes. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. A Short Research Note on Calculating Exact Distribution Functions and Random Sampling for the 3D NFW Profile

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Howlett, Cullan

    2018-06-01

In this short note we publish the analytic quantile function for the Navarro, Frenk & White (NFW) profile. All known published and coded methods for sampling from the 3D NFW PDF use either accept-reject or numeric interpolation (sometimes via a lookup table) to project uniform random samples through the quantile distribution function and produce samples of the radius. This is a common requirement in N-body initial condition (IC), halo occupation distribution (HOD), and semi-analytic modelling (SAM) work for correctly assigning particles or galaxies to positions given an assumed concentration for the NFW profile. Using this analytic description allows for much faster and cleaner code to solve a common numeric problem in modern astronomy. We release R and Python versions of simple code that achieves this sampling, which we note is trivial to reproduce in any modern programming language.
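For readers without the paper to hand, the numeric-inversion approach that the analytic quantile function replaces can be sketched as follows: invert the NFW enclosed-mass CDF by bisection for each uniform draw. This is a pure-Python illustration of the "common numeric problem"; the released R and Python code instead evaluates the analytic quantile directly.

```python
import math, random

def nfw_cdf(x, c):
    """Fraction of mass enclosed within scaled radius x = r/r_s for an
    NFW halo truncated at concentration c, so x lies in [0, c]."""
    mu = lambda t: math.log(1.0 + t) - t / (1.0 + t)
    return mu(x) / mu(c)

def nfw_sample_radius(c, u=None, tol=1e-10):
    """Inverse-transform sampling: numerically invert the CDF by
    bisection for a uniform draw u. Returns a scaled radius r/r_s."""
    if u is None:
        u = random.random()
    lo, hi = 0.0, c
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if nfw_cdf(mid, c) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Draw radii for a halo with an assumed concentration c = 10.
random.seed(7)
radii = [nfw_sample_radius(c=10.0) for _ in range(1000)]
```

Each sample costs tens of CDF evaluations here, which is the overhead an analytic quantile function removes.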

  17. Dextroamphetamine: a pharmacologic countermeasure for space motion sickness and orthostatic dysfunction

    NASA Technical Reports Server (NTRS)

    Snow, L. Dale

    1996-01-01

Dextroamphetamine has potential as a pharmacologic agent for the alleviation of two common health effects associated with microgravity. As an adjuvant to Space Motion Sickness (SMS) medication, dextroamphetamine can enhance treatment efficacy by reducing undesirable Central Nervous System (CNS) side effects of SMS medications. Secondly, dextroamphetamine may be useful for the prevention of symptoms of post-mission orthostatic intolerance caused by cardiovascular deconditioning during spaceflight. There is interest in developing an intranasal delivery form of dextroamphetamine for use as a countermeasure in microgravity conditions. Development of this dosage form will require an analytical detection method with sensitivity in the low ng range (1 to 100 ng/mL). During the 1995 Summer Faculty Fellowship Program, two analytical methods were developed and evaluated for their suitability as quantitative procedures for dextroamphetamine in studies of product stability, bioavailability assessment, and pharmacokinetic evaluation. In developing some of the analytical methods, beta-phenylethylamine, a primary amine structurally similar to dextroamphetamine, was used. The first analytical procedure to be evaluated involved hexane extraction and subsequent fluorescamine labeling of beta-phenylethylamine. The second analytical procedure to be evaluated involved quantitation of dextroamphetamine by an Enzyme-Linked ImmunoSorbent Assay (ELISA).

  18. Smart phone: a popular device supports amylase activity assay in fisheries research.

    PubMed

    Thongprajukaew, Karun; Choodum, Aree; Sa-E, Barunee; Hayee, Ummah

    2014-11-15

Colourimetric determinations of amylase activity were developed based on a standard dinitrosalicylic acid (DNS) staining method, using maltose as the analyte. Intensities and absorbances of red, green and blue (RGB) were obtained with iPhone imaging and Adobe Photoshop image analysis. The correlation between green intensity and analyte concentration was highly significant, and the developed method showed excellent analytical accuracy. The common iPhone has sufficient imaging ability for accurate quantification of maltose concentrations. Detection limits, sensitivity and linearity were comparable to a spectrophotometric method, with better inter-day precision. In quantifying amylase specific activity from a commercial source (P>0.02) and fish samples (P>0.05), differences compared with spectrophotometric measurements were not significant. We have demonstrated that iPhone imaging with image analysis in Adobe Photoshop has potential for field and laboratory studies of amylase. Copyright © 2014 Elsevier Ltd. All rights reserved.
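The green-channel calibration described above amounts to a linear fit of channel intensity against standard concentrations, then inverting the fitted line for unknowns. A hedged sketch with made-up intensities (the paper's actual calibration values are not given in the abstract):

```python
# Hypothetical calibration points: (maltose mg/mL, mean green intensity)
# of DNS-stained standards. Darker colour at higher concentration gives
# a negative slope. Illustrative values only.
standards = [(0.0, 212.0), (0.5, 181.0), (1.0, 150.0), (1.5, 119.0), (2.0, 88.0)]

def fit_line(points):
    """Ordinary least squares for y = a + b*x."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

a, b = fit_line(standards)          # intensity = a + b * concentration

def concentration(green_intensity):
    """Invert the calibration line for an unknown sample."""
    return (green_intensity - a) / b
```

A real workflow would also check the calibration's linear range and run blanks, but the core computation is this two-parameter fit.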

  19. Ultraviolet, Visible, and Fluorescence Spectroscopy

    NASA Astrophysics Data System (ADS)

    Penner, Michael H.

Spectroscopy in the ultraviolet-visible (UV-Vis) range is one of the most commonly encountered laboratory techniques in food analysis. Diverse examples, such as the quantification of macrocomponents (total carbohydrate by the phenol-sulfuric acid method), quantification of microcomponents (thiamin by the thiochrome fluorometric procedure), estimates of rancidity (lipid oxidation status by the thiobarbituric acid test), and surveillance testing (enzyme-linked immunoassays), are presented in this text. In each of these cases, the analytical signal on which the assay is based is either the emission or absorption of radiation in the UV-Vis range. This signal may be inherent in the analyte, such as the absorbance of radiation in the visible range by pigments, or a result of a chemical reaction involving the analyte, such as the colorimetric copper-based Lowry method for the analysis of soluble protein.

  20. An analytical approach to obtaining JWL parameters from cylinder tests

    NASA Astrophysics Data System (ADS)

    Sutton, B. D.; Ferguson, J. W.; Hodgson, A. N.

    2017-01-01

    An analytical method for determining parameters for the JWL Equation of State from cylinder test data is described. This method is applied to four datasets obtained from two 20.3 mm diameter EDC37 cylinder tests. The calculated pressure-relative volume (p-Vr) curves agree with those produced by hydro-code modelling. The average calculated Chapman-Jouguet (CJ) pressure is 38.6 GPa, compared to the model value of 38.3 GPa; the CJ relative volume is 0.729 for both. The analytical pressure-relative volume curves produced agree with the one used in the model out to the commonly reported expansion of 7 relative volumes, as do the predicted energies generated by integrating under the p-Vr curve. The calculated energy is within 1.6% of that predicted by the model.

  1. Development and validation of an UHPLC-MS/MS method for β2-agonists quantification in human urine and application to clinical samples.

    PubMed

    Bozzolino, Cristina; Leporati, Marta; Gani, Federica; Ferrero, Cinzia; Vincenti, Marco

    2018-02-20

A fast analytical method for the simultaneous detection of 24 β2-agonists in human urine was developed and validated. The method covers the therapeutic drugs most commonly administered, but also potentially abused β2-agonists. The procedure is based on enzymatic deconjugation with β-glucuronidase followed by SPE clean-up using mixed-phase cartridges with both ion-exchange and lipophilic properties. Instrumental analysis conducted by UHPLC-MS/MS allowed high peak resolution and rapid chromatographic separation, with reduced time and costs. The method was fully validated according to ISO 17025:2005 principles. The following parameters were determined for each analyte: specificity, selectivity, linearity, limit of detection, limit of quantification, precision, accuracy, matrix effect, recovery and carry-over. The method was tested on real samples obtained from patients subjected to clinical treatment under chronic or acute therapy with either formoterol, indacaterol, salbutamol, or salmeterol. The drugs were administered using pressurized metered dose inhalers. All β2-agonists administered to the patients were detected in the real samples. The method proved adequate to accurately measure the concentration of these analytes in the real samples. The observed analytical data are discussed with reference to the administered dose and the duration of the therapy. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Tests of Measurement Invariance without Subgroups: A Generalization of Classical Methods

    ERIC Educational Resources Information Center

    Merkle, Edgar C.; Zeileis, Achim

    2013-01-01

    The issue of measurement invariance commonly arises in factor-analytic contexts, with methods for assessment including likelihood ratio tests, Lagrange multiplier tests, and Wald tests. These tests all require advance definition of the number of groups, group membership, and offending model parameters. In this paper, we study tests of measurement…

  3. An Alternative Approach to Conceptualizing Interviews in HRD Research

    ERIC Educational Resources Information Center

    Wang, Jia; Roulston, Kathryn J.

    2007-01-01

    Qualitative researchers in human resource development (HRD) frequently use in-depth interviews as a research method. Yet reports from qualitative studies in HRD commonly pay little or no analytical attention to the co-construction of interview data. That is, reports of qualitative research projects often treat interviews as a transparent method of…

  4. Analysis of polymeric phenolics in red wines using different techniques combined with gel permeation chromatography fractionation.

    PubMed

    Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén

    2006-04-21

    A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.

  5. Utilizing global data to estimate analytical performance on the Sigma scale: A global comparative analysis of methods, instruments, and manufacturers through external quality assurance and proficiency testing programs.

    PubMed

    Westgard, Sten A

    2016-06-01

    To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
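The Sigma-metric underlying these assessments is the standard Westgard calculation: allowable total error minus absolute bias, divided by imprecision, all expressed as percentages of the target concentration. A minimal sketch (the numbers are illustrative, not taken from the study):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, with all terms as percentages.
    TEa is the allowable total error (e.g. a CLIA PT criterion); the
    CV here stands in for the instrument-group imprecision used as a
    surrogate in EQA/PT data."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative: a glucose method with a 10% TEa, 1% bias and 1.5% CV
# lands at Six Sigma, above the Five Sigma threshold where optimized
# QC design can be implemented.
glucose_sigma = sigma_metric(10.0, 1.0, 1.5)
```

Ranking instruments and manufacturers then reduces to computing this metric per analyte and summarizing, e.g., the fraction of analytes above Five Sigma.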

  6. Researching Mental Health Disorders in the Era of Social Media: Systematic Review

    PubMed Central

    Vadillo, Miguel A; Curcin, Vasa

    2017-01-01

    Background Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. Objective The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. Methods We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. Results The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Conclusions Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorder is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. PMID:28663166

  7. Freeze-thaw approach: A practical sample preparation strategy for residue analysis of multi-class veterinary drugs in chicken muscle.

    PubMed

    Zhang, Meiyu; Li, Erfen; Su, Yijuan; Song, Xuqin; Xie, Jingmeng; Zhang, Yingxia; He, Limin

    2018-06-01

Seven drugs from different classes, namely fluoroquinolones (enrofloxacin, ciprofloxacin, sarafloxacin), sulfonamides (sulfadimidine, sulfamonomethoxine), and macrolides (tilmicosin, tylosin), were administered orally to chickens as test compounds. A simple extraction step after cryogenic freezing allowed the effective extraction of multi-class veterinary drug residues from minced chicken muscle by vortex mixing. On the basis of the optimized freeze-thaw approach, a convenient, selective, and reproducible liquid chromatography with tandem mass spectrometry method was developed. At three spiking levels in blank and medicated chicken muscles, average recoveries of the analytes were in the range of 71-106 and 63-119%, respectively. All the relative standard deviations were <20%. The limits of quantification of analytes were 0.2-5.0 ng/g. Regardless of the drug levels in the chickens, there were no significant differences (P > 0.05) in the average contents of almost any of the analytes in medicated chickens between this method and specific methods in the literature for the determination of specific analytes. Finally, the developed method was successfully extended to the monitoring of residues of 55 common veterinary drugs in food animal muscles. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Responding to Harmful Algal Blooms: Treatment Optimization

    EPA Science Inventory

    This presentation discusses: (1) analytical methods for toxins and cyanobacteria within the context of monitoring a treatment process, (2) toxin and cell removal capacities for common drinking water treatment processes, (3) issues to consider when evaluating a treatment facility...

  9. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  10. Simultaneous determination of thirteen different steroid hormones using micro UHPLC-MS/MS with on-line SPE system.

    PubMed

    Márta, Zoltán; Bobály, Balázs; Fekete, Jenő; Magda, Balázs; Imre, Tímea; Mészáros, Katalin Viola; Bálint, Mária; Szabó, Pál Tamás

    2018-02-20

Ultratrace analysis of sample components requires excellent analytical performance in terms of limits of quantitation (LOQ). Micro UHPLC coupled to sensitive tandem mass spectrometry provides a state-of-the-art solution to such analytical problems. Using on-line SPE with column switching on a micro UHPLC-MS/MS system allowed LOQs to be decreased without any complex sample preparation protocol. The presented method reaches satisfactorily low LOQ values for the analysis of thirteen different steroid molecules from human plasma without the commonly used off-line SPE or compound derivatization. Steroids were determined using two simple sample preparation methods, targeting higher and lower plasma steroid concentrations. In the first method, higher analyte concentrations were determined directly after protein precipitation with methanol. The organic phase obtained from the precipitation was diluted with water and directly injected into the LC-MS system. In the second method, low steroid levels were determined by concentrating the organic phase after steroid extraction. In this case, analytes were extracted with ethyl acetate and reconstituted in 90/10 water/acetonitrile following evaporation to dryness. This step provided much lower LOQs, outperforming previously published values. The method has been validated and subsequently applied to clinical laboratory measurement. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Exact analytical modeling of magnetic vector potential in surface inset permanent magnet DC machines considering magnet segmentation

    NASA Astrophysics Data System (ADS)

    Jabbari, Ali

    2018-01-01

Surface inset permanent magnet DC machines can be used as an alternative in automation systems due to their high efficiency and robustness. Magnet segmentation is a common technique for mitigating pulsating torque components in permanent magnet machines. An accurate computation of the air-gap magnetic field distribution is necessary to calculate machine performance. An exact analytical method for magnetic vector potential calculation in surface inset permanent magnet machines considering magnet segmentation is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in polar coordinates using the sub-domain method. One of the main contributions of the paper is to derive an expression for the magnetic vector potential in the segmented PM region by using hyperbolic functions. The developed method is applied to the performance computation of two prototype surface inset segmented-magnet motors under open-circuit and on-load conditions. The results of these models are validated against the finite element method (FEM).

  12. New analytic results for speciation times in neutral models.

    PubMed

    Gernhard, Tanja

    2008-05-01

In this paper, we investigate the standard Yule model and a recently studied model of speciation and extinction, the "critical branching process." We develop an analytic approach, as opposed to the common simulation approach, for calculating the speciation times in a reconstructed phylogenetic tree. Simple expressions for the density and the moments of the speciation times are obtained. Methods for dating a speciation event become valuable if no time scale is available for the reconstructed phylogenetic tree. A missing time scale could be due to supertree methods, morphological data, or molecular data that violates the molecular clock. Our analytic approach is particularly useful for the model with extinction, since simulations of birth-death processes conditioned on obtaining n extant species today are quite delicate. Further, simulations are very time consuming for large n under both models.
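As context for the simulation approach the paper improves upon: under the pure-birth Yule model, the waiting time between the kth and (k+1)th speciation is exponential with rate k·λ, so an unconditioned tree's speciation times can be simulated forward in a few lines. The delicate part, which the analytic results avoid, is conditioning on exactly n extant species today; this sketch shows only the unconditioned process.

```python
import random

def yule_speciation_times(n, lam, rng=random):
    """Simulate one pure-birth (Yule) tree forward in time: with k
    lineages present, the wait until the next speciation event is
    Exponential(k * lam). Returns the n-1 speciation times from the root."""
    t = 0.0
    times = []
    for k in range(1, n):
        t += rng.expovariate(k * lam)
        times.append(t)
    return times

random.seed(3)
times = yule_speciation_times(n=10, lam=1.0)
```

Repeating this many times and averaging is the simulation route to the moments that the paper instead derives in closed form.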

  13. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.

This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  14. An Analytical Approach to Obtaining JWL Parameters from Cylinder Tests

    NASA Astrophysics Data System (ADS)

    Sutton, Ben; Ferguson, James

    2015-06-01

    An analytical method for determining parameters for the JWL equation of state (EoS) from cylinder test data is described. This method is applied to four datasets obtained from two 20.3 mm diameter EDC37 cylinder tests. The calculated parameters and pressure-volume (p-V) curves agree with those produced by hydro-code modelling. The calculated Chapman-Jouguet (CJ) pressure is 38.6 GPa, compared to the model value of 38.3 GPa; the CJ relative volume is 0.729 for both. The analytical pressure-volume curves produced agree with the one used in the model out to the commonly reported expansion of 7 relative volumes, as do the predicted energies generated by integrating under the p-V curve. The calculated and model energies are 8.64 GPa and 8.76 GPa respectively.

  15. Comparison of adjoint and analytical Bayesian inversion methods for constraining Asian sources of carbon monoxide using satellite (MOPITT) measurements of CO columns

    NASA Astrophysics Data System (ADS)

    Kopacz, Monika; Jacob, Daniel J.; Henze, Daven K.; Heald, Colette L.; Streets, David G.; Zhang, Qiang

    2009-02-01

    We apply the adjoint of an atmospheric chemical transport model (GEOS-Chem CTM) to constrain Asian sources of carbon monoxide (CO) with 2° × 2.5° spatial resolution using Measurement of Pollution in the Troposphere (MOPITT) satellite observations of CO columns in February-April 2001. Results are compared to the more common analytical method for solving the same Bayesian inverse problem and applied to the same data set. The analytical method is more exact but because of computational limitations it can only constrain emissions over coarse regions. We find that the correction factors to the a priori CO emission inventory from the adjoint inversion are generally consistent with those of the analytical inversion when averaged over the large regions of the latter. The adjoint solution reveals fine-scale variability (cities, political boundaries) that the analytical inversion cannot resolve, for example, in the Indian subcontinent or between Korea and Japan, and some of that variability is of opposite sign which points to large aggregation errors in the analytical solution. Upward correction factors to Chinese emissions from the prior inventory are largest in central and eastern China, consistent with a recent bottom-up revision of that inventory, although the revised inventory also sees the need for upward corrections in southern China where the adjoint and analytical inversions call for downward correction. Correction factors for biomass burning emissions derived from the adjoint and analytical inversions are consistent with a recent bottom-up inventory on the basis of MODIS satellite fire data.
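The "analytical method" in this comparison is the closed-form maximum a posteriori solution of the linear Gaussian Bayesian inverse problem, which is exact but requires forming and inverting matrices whose size limits the state resolution. A toy-dimension sketch (NumPy; the dimensions, Jacobian, and covariances are illustrative, not the study's):

```python
import numpy as np

# Linear Gaussian inverse problem: y = K x + error, with prior
# x ~ N(xa, Sa) and observation error ~ N(0, So). In the study, x would
# be regional CO emission scaling factors and y MOPITT CO columns.
rng = np.random.default_rng(0)
n_state, n_obs = 4, 20
K = rng.normal(size=(n_obs, n_state))           # forward-model Jacobian
x_true = np.array([1.2, 0.8, 1.5, 1.0])         # "true" scaling factors
y = K @ x_true + 0.05 * rng.normal(size=n_obs)  # noisy observations

xa = np.ones(n_state)                           # a priori scaling factors
Sa_inv = np.eye(n_state) / 0.5**2               # inverse prior covariance
So_inv = np.eye(n_obs) / 0.05**2                # inverse obs covariance

# Closed-form posterior:
#   S_hat = (K^T So^-1 K + Sa^-1)^-1
#   x_hat = xa + S_hat K^T So^-1 (y - K xa)
S_hat = np.linalg.inv(K.T @ So_inv @ K + Sa_inv)
x_hat = xa + S_hat @ K.T @ So_inv @ (y - K @ xa)
```

The matrix inversion is what forces the analytical inversion onto coarse regions; the adjoint method avoids it by iteratively minimizing the same Bayesian cost function, at the price of no closed-form posterior covariance.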

  16. Analytical methods for human biomonitoring of pesticides. A review.

    PubMed

    Yusa, Vicent; Millet, Maurice; Coscolla, Clara; Roca, Marta

    2015-09-03

    Biomonitoring of both currently-used and banned-persistent pesticides is a very useful tool for assessing human exposure to these chemicals. In this review, we present current approaches and recent advances in the analytical methods for determining the biomarkers of exposure to pesticides in the most commonly used specimens, such as blood, urine, and breast milk, and in emerging non-invasive matrices such as hair and meconium. We critically discuss the main applications for sample treatment, and the instrumental techniques currently used to determine the most relevant pesticide biomarkers. We finally look at the future trends in this field.

  17. Selecting Surrogates for an Alkylphenol Ethoxylate Analytical Method in Sewage and Soil Matrices

    EPA Science Inventory

    Alkylphenol ethoxylates (APEs) are nonionic surfactants commonly used in industrial detergents. These products contain complex mixtures of branched and linear chains. APEs and their degradation products, alkylphenols, are highly toxic to aquatic organisms, potentially estrogeni...

  18. IN-SITU OXIDATION OF 1,4-DIOXANE (LABORATORY RESULTS)

    EPA Science Inventory

    Interest in the solvent stabilizer, 1,4-dioxane, is increasing because analytical detection limits have decreased indicating its presence at chlorinated volatile organic compound contaminated sites. The most common method for removing 1,4-dioxane from contaminated water is advanc...

  19. Recursively constructing analytic expressions for equilibrium distributions of stochastic biochemical reaction networks.

    PubMed

    Meng, X Flora; Baetica, Ania-Ariadna; Singhal, Vipul; Murray, Richard M

    2017-05-01

    Noise is often indispensable to key cellular activities, such as gene expression, necessitating the use of stochastic models to capture its dynamics. The chemical master equation (CME) is a commonly used stochastic model of Kolmogorov forward equations that describe how the probability distribution of a chemically reacting system varies with time. Finding analytic solutions to the CME can have benefits, such as expediting simulations of multiscale biochemical reaction networks and aiding the design of distributional responses. However, analytic solutions are rarely known. A recent method of computing analytic stationary solutions relies on gluing simple state spaces together recursively at one or two states. We explore the capabilities of this method and introduce algorithms to derive analytic stationary solutions to the CME. We first formally characterize state spaces that can be constructed by performing single-state gluing of paths, cycles or both sequentially. We then study stochastic biochemical reaction networks that consist of reversible, elementary reactions with two-dimensional state spaces. We also discuss extending the method to infinite state spaces and designing the stationary behaviour of stochastic biochemical reaction networks. Finally, we illustrate the aforementioned ideas using examples that include two interconnected transcriptional components and biochemical reactions with two-dimensional state spaces.
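
    For the simplest CMEs the stationary solution is available without any gluing machinery, which gives a useful reference point. A sketch for a one-species birth-death network (production at rate k, first-order degradation at rate gamma), whose stationary distribution is Poisson with mean k/gamma:

```python
import math

def birth_death_stationary(k, gamma, n_max):
    """Stationary distribution of the CME for 0 -> X (rate k),
    X -> 0 (rate gamma per molecule), truncated at n_max copies.

    Detailed balance on this 1-D chain gives
    pi(n+1) = pi(n) * k / (gamma * (n + 1)).
    """
    pi = [1.0]
    for n in range(n_max):
        pi.append(pi[n] * k / (gamma * (n + 1)))
    z = sum(pi)                      # normalize over the truncated state space
    return [p / z for p in pi]

pi = birth_death_stationary(k=4.0, gamma=1.0, n_max=40)
# Analytic result for comparison: Poisson with mean k/gamma = 4
poisson = [math.exp(-4.0) * 4.0**n / math.factorial(n) for n in range(41)]
```

    The recursion is exactly the detailed-balance relation on a path-shaped state space, the simplest case the gluing method generalizes.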

  20. Recursively constructing analytic expressions for equilibrium distributions of stochastic biochemical reaction networks

    PubMed Central

    Baetica, Ania-Ariadna; Singhal, Vipul; Murray, Richard M.

    2017-01-01

    Noise is often indispensable to key cellular activities, such as gene expression, necessitating the use of stochastic models to capture its dynamics. The chemical master equation (CME) is a commonly used stochastic model of Kolmogorov forward equations that describe how the probability distribution of a chemically reacting system varies with time. Finding analytic solutions to the CME can have benefits, such as expediting simulations of multiscale biochemical reaction networks and aiding the design of distributional responses. However, analytic solutions are rarely known. A recent method of computing analytic stationary solutions relies on gluing simple state spaces together recursively at one or two states. We explore the capabilities of this method and introduce algorithms to derive analytic stationary solutions to the CME. We first formally characterize state spaces that can be constructed by performing single-state gluing of paths, cycles or both sequentially. We then study stochastic biochemical reaction networks that consist of reversible, elementary reactions with two-dimensional state spaces. We also discuss extending the method to infinite state spaces and designing the stationary behaviour of stochastic biochemical reaction networks. Finally, we illustrate the aforementioned ideas using examples that include two interconnected transcriptional components and biochemical reactions with two-dimensional state spaces. PMID:28566513

  1. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  2. Statistical Learning Theory for High Dimensional Prediction: Application to Criterion-Keyed Scale Development

    PubMed Central

    Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul

    2016-01-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
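
    The core idea, choosing model complexity by cross-validated prediction error rather than within-sample fit, can be sketched compactly with ridge regression (one form of regularization) on simulated item-pool data. Everything below is illustrative, not the analysis from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, y, lam):
    """Ridge solution: (X^T X + lam I)^-1 X^T y (no intercept, for brevity)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_mse(X, y, lam, k=5):
    """k-fold cross-validated MSE: an estimate of expected prediction error."""
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for f in folds:
        train = np.setdiff1d(idx, f)          # fit on k-1 folds
        beta = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[f] - X[f] @ beta) ** 2))  # test on held-out fold
    return np.mean(errs)

# Simulated "item pool": 100 subjects, 50 items, only 5 truly predictive
X = rng.standard_normal((100, 50))
beta_true = np.zeros(50)
beta_true[:5] = 1.0
y = X @ beta_true + rng.standard_normal(100)

lambdas = [0.01, 0.1, 1.0, 10.0, 100.0]
best_lam = min(lambdas, key=lambda l: cv_mse(X, y, l))
```

    The penalty chosen by cross-validation trades a little bias for much lower variance, which is precisely the EPE-minimization logic the abstract describes.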

  3. Analytical methods for determination of mycotoxins: a review.

    PubMed

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods in use, many of them lab-based, but to our knowledge no single technique stands out above the rest, although analytical liquid chromatography, commonly coupled with mass spectrometry, is likely to be the most popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), and solid phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE); and (c) others such as ELISA. Current trends, advantages and disadvantages, and future prospects of these methods are also discussed.

  4. The combination of four analytical methods to explore skeletal muscle metabolomics: Better coverage of metabolic pathways or a marketing argument?

    PubMed

    Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H

    2018-01-30

    Metabolomics is an emerging science based on diverse high throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact on metabolic coverage of such a strategy. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and evaluate the impact of combining methods for more exhaustive metabolite covering. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound i.e., analytical (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified based on the number and identity of the compounds detected with low analytical variability (variation coefficient<30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvent for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. 
This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods.

  5. Ethnographic/Qualitative Research: Theoretical Perspectives and Methodological Strategies.

    ERIC Educational Resources Information Center

    Butler, E. Dean

    This paper examines the metatheoretical concepts associated with ethnographic/qualitative educational inquiry and overviews the more commonly utilized research designs, data collection methods, and analytical approaches. The epistemological and ontological assumptions of this newer approach differ greatly from those of the traditional educational…

  6. Psychological Flexibility, ACT, and Organizational Behavior

    ERIC Educational Resources Information Center

    Bond, Frank W.; Hayes, Steven C.; Barnes-Holmes, Dermot

    2006-01-01

    This paper offers organizational behavior management (OBM) a behavior analytically consistent way to expand its analysis of, and methods for changing, organizational behavior. It shows how Relational Frame Theory (RFT) suggests that common, problematic, psychological processes emerge from language itself, and they produce psychological…

  7. Liquid chromatography method to determine polyamines in thermosetting polymers.

    PubMed

    Dopico-García, M S; López-Vilariño, J M; Fernández-Martínez, G; González-Rodríguez, M V

    2010-05-14

    A simple, robust and sensitive analytical method to determine three polyamines commonly used as hardeners in epoxy resin systems and in the manufacture of polyurethane is reported. The studied polyamines are: one tetramine, TETA (triethylenetetramine), and two diamines, IPDA (isophorone diamine) and TCD-diamine (4,7-methano-1H-indene-5,?-dimethanamine, octahydro-). The latter has an incompletely defined structure and, as far as we know, has not been previously determined by other methods. All three polyamines contain primary amines; TETA also contains secondary amines. The analytical method involves derivatization with 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate, used for the first time for these compounds, followed by high performance liquid chromatography (HPLC) analysis with a fluorescence (FL) detector (lambda excitation 248 nm, lambda emission 395 nm). HPLC-DAD-LTQ Orbitrap MS was used to provide structural information about the derivatized compounds. The hybrid linear ion trap LTQ Orbitrap mass spectrometer, introduced in recent years, provides high mass accuracy. The structures of the derivatized analytes were identified from the protonated molecular ions [M+H](+) and corresponded to the fully labelled analytes. The following analytical parameters were determined for the HPLC-FL method: linearity, precision (2.5-10%), instrumental precision intraday (0.8-1.5%) and interday (2.9-6.3%), and detection limits (0.02-0.14 mg L(-1)). The stability of stock solutions and derivatized compounds was also investigated. The method was applied to determine the free amine content in epoxy resin dust collected in workplaces.

  8. Analytical solutions for sequentially coupled one-dimensional reactive transport problems Part I: Mathematical derivations

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Clement, T. P.

    2008-02-01

    Multi-species reactive transport equations coupled through sorption and sequential first-order reactions are commonly used to model sites contaminated with radioactive wastes, chlorinated solvents and nitrogenous species. Although researchers have been attempting to solve various forms of these reactive transport equations for over 50 years, a general closed-form analytical solution to this problem is not available in the published literature. In Part I of this two-part article, we derive a closed-form analytical solution to this problem for spatially-varying initial conditions. The proposed solution procedure employs a combination of Laplace and linear transform methods to uncouple and solve the system of partial differential equations. Two distinct solutions are derived for Dirichlet and Cauchy boundary conditions each with Bateman-type source terms. We organize and present the final solutions in a common format that represents the solutions to both boundary conditions. In addition, we provide the mathematical concepts for deriving the solution within a generic framework that can be used for solving similar transport problems.
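
    The sequential first-order kinetics at the heart of these transport solutions reduce, in the absence of transport, to the classic Bateman solution, which is simple to evaluate directly. A sketch with hypothetical rate constants (the paper's full solutions add advection, dispersion and sorption on top of this):

```python
import math

def bateman(c0, ks, t):
    """Concentrations of a sequential first-order chain A1 -> A2 -> ... at time t.

    c0: initial concentration of the first species (others start at zero).
    ks: first-order rate constants k1..kn, assumed all distinct.
    Classic Bateman (1910) solution.
    """
    n = len(ks)
    c = []
    for i in range(1, n + 1):
        prod_k = math.prod(ks[:i - 1])  # product k1..k_{i-1} (empty product = 1)
        s = 0.0
        for j in range(i):
            denom = math.prod(ks[m] - ks[j] for m in range(i) if m != j)
            s += math.exp(-ks[j] * t) / denom
        c.append(c0 * prod_k * s)
    return c

# Example: a three-member decay chain with hypothetical rates (1/day)
c = bateman(1.0, [0.05, 0.03, 0.01], t=30.0)
```

    The first species decays as a single exponential; daughters grow in and then decay, exactly the Bateman-type source terms carried through the paper's boundary conditions.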

  9. Argon thermochronology of mineral deposits; a review of analytical methods, formulations, and selected applications

    USGS Publications Warehouse

    Snee, Lawrence W.

    2002-01-01

    40Ar/39Ar geochronology is an experimentally robust and versatile method for constraining time and temperature in geologic processes. The argon method is the most broadly applied in mineral-deposit studies. Standard analytical methods and formulations exist, making the fundamentals of the method well defined. A variety of graphical representations exist for evaluating argon data. A broad range of minerals found in mineral deposits, alteration zones, and host rocks commonly is analyzed to provide age, temporal duration, and thermal conditions for mineralization events and processes. All are discussed in this report. The usefulness of and evolution of the applicability of the method are demonstrated in studies of the Panasqueira, Portugal, tin-tungsten deposit; the Cornubian batholith and associated mineral deposits, southwest England; the Red Mountain intrusive system and associated Urad-Henderson molybdenum deposits; and the Eastern Goldfields Province, Western Australia.

  10. Measurement of "total" microcystins using the MMPB/LC/MS ...

    EPA Pesticide Factsheets

    The detection and quantification of microcystins, a family of toxins associated with harmful algal blooms, is complicated by their structural diversity and a lack of commercially available analytical standards for method development. As a result, most detection methods have focused on either a subset of microcystin congeners, as in US EPA Method 544, or on techniques which are sensitive to structural features common to most microcystins, as in the anti-ADDA ELISA method. A recent development has been the use of 2-methyl-3-methoxy-4-phenylbutyric acid (MMPB), which is produced by chemical oxidation of the ADDA moiety in most microcystin congeners, as a proxy for the sum of congeners present. Conditions for the MMPB derivatization were evaluated and applied to water samples obtained from various HAB-impacted surface waters, and results were compared with congener-based LC/MS/MS and ELISA methods.

  11. Multi-analyte method development for analysis of brominated flame retardants (BFRs) and PBDE metabolites in human serum.

    PubMed

    Lu, Dasheng; Jin, Yu'e; Feng, Chao; Wang, Dongli; Lin, Yuanjie; Qiu, Xinlei; Xu, Qian; Wen, Yimin; She, Jianwen; Wang, Guoquan; Zhou, Zhijun

    2017-09-01

    Analytical methods measuring brominated flame retardants (BFRs) of different chemical polarities in human serum are commonly labor-intensive and tedious. Our study used acidified diatomaceous earth as solid-phase extraction (SPE) adsorbent and defatting material to simultaneously determine the most abundant BFRs and their metabolites with different polarities in human serum samples. The analytes include three types of commercial BFRs, tetrabromobisphenol A (TBBPA), hexabromocyclododecane (HBCD) isomers, and polybrominated diphenyl ethers (PBDEs), and the dominant hydroxylated BDE (OH-PBDE) and methoxylated BDE (MeO-PBDE) metabolites of PBDEs. The sample eluents were sequentially analyzed for PBDEs and MeO-BDEs on online gel permeation chromatography/gas chromatography-electron capture-negative ionization mass spectrometry (online GPC GC-ECNI-MS) and for TBBPA, HBCD, and OH-BDEs on liquid chromatography-tandem mass spectrometry (LC-MS/MS). Method recoveries were 67-134% with a relative standard deviation (RSD) of less than 20%. Method detection limits (MDLs) were 0.30-4.20 pg/mL fresh weight (f.w.) for all analytes, except for BDE-209 at 16 pg/mL f.w. The methodology was also applied in a pilot study, which analyzed ten real samples from healthy donors in China; the majority of target analytes were detected with a detection rate of more than 80%. To our knowledge, this is the first method to determine most types of BFRs in a single aliquot of human serum. This new analytical method is specific, sensitive, accurate, and time-saving for routine biomonitoring of these BFRs and for integrated assessment of the health risk of BFR exposure.
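
    The abstract does not say how the MDLs were derived; a common convention (the EPA-style single-laboratory procedure) multiplies the standard deviation of low-level spike replicates by a one-sided 99% Student's t value. A sketch under that assumption, with invented replicate values:

```python
import statistics

# One-sided 99% Student's t critical values, keyed by degrees of freedom
# (n - 1), taken from standard tables
T99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    """EPA-style MDL estimate: t(n-1, 0.99) * s of low-level spike replicates.

    A simplified sketch; the BFR paper does not state how its MDLs
    were actually derived.
    """
    n = len(replicates)
    s = statistics.stdev(replicates)  # sample standard deviation
    return T99[n - 1] * s

# Seven replicate low-level spikes of a hypothetical analyte, pg/mL
spikes = [0.92, 1.05, 0.98, 1.10, 0.95, 1.02, 0.99]
mdl = method_detection_limit(spikes)
```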

  12. S-curve networks and an approximate method for estimating degree distributions of complex networks

    NASA Astrophysics Data System (ADS)

    Guo, Jin-Li

    2010-12-01

    In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Using statistics on China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model based on the S curve (logistic curve) and forecasts the growing trend of IPv4 addresses in China. The forecast provides reference values for optimizing the distribution of IPv4 address resources and for the development of IPv6. Based on the observed laws of IPv4 growth, namely bulk growth and a finite growth limit, the paper proposes a finite network model with bulk growth, called an S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., the Barabási-Albert method) is not suitable for this network. The paper develops an approximate method to predict the growth dynamics of the individual nodes and uses this to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees well with simulation, obeying an approximately power-law form. This method can overcome a shortcoming of the Barabási-Albert method commonly used in current network research.
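
    With the carrying capacity (the finite growth limit) fixed, fitting the S curve reduces to a line fit after a logit transform, which is a simple way to reproduce this kind of forecast. A sketch on synthetic data, not the paper's IPv4 statistics:

```python
import math

def logistic(t, K, r, t0):
    """S-curve (logistic): y(t) = K / (1 + exp(-r (t - t0)))."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def fit_logistic_known_K(ts, ys, K):
    """Given the carrying capacity K, the logit transform
    ln(y / (K - y)) = r t - r t0 is linear in t, so r and t0
    follow from an ordinary least-squares line fit."""
    zs = [math.log(y / (K - y)) for y in ys]
    n = len(ts)
    tbar = sum(ts) / n
    zbar = sum(zs) / n
    r = sum((t - tbar) * (z - zbar) for t, z in zip(ts, zs)) / \
        sum((t - tbar) ** 2 for t in ts)
    t0 = tbar - zbar / r
    return r, t0

# Synthetic growth data from a known S-curve, then recover r and t0
K, r_true, t0_true = 400.0, 0.8, 6.0
ts = list(range(12))
ys = [logistic(t, K, r_true, t0_true) for t in ts]
r_fit, t0_fit = fit_logistic_known_K(ts, ys, K)
```

    On noise-free data the transform recovers the growth rate and inflection time exactly; with real data, K itself would also need to be estimated.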

  13. A Method for Analyzing Commonalities in Clinical Trial Target Populations

    PubMed Central

    He, Zhe; Carini, Simona; Hao, Tianyong; Sim, Ida; Weng, Chunhua

    2014-01-01

    ClinicalTrials.gov presents great opportunities for analyzing commonalities in clinical trial target populations to facilitate knowledge reuse when designing eligibility criteria of future trials or to reveal potential systematic biases in selecting population subgroups for clinical research. Towards this goal, this paper presents a novel data resource for enabling such analyses. Our method includes two parts: (1) parsing and indexing eligibility criteria text; and (2) mining common eligibility features and attributes of common numeric features (e.g., A1c). We designed and built a database called “Commonalities in Target Populations of Clinical Trials” (COMPACT), which stores structured eligibility criteria and trial metadata in a readily computable format. We illustrate its use in an example analytic module called CONECT using COMPACT as the backend. Type 2 diabetes is used as an example to analyze commonalities in the target populations of 4,493 clinical trials on this disease. PMID:25954450
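
    Step (1), parsing numeric eligibility features out of criteria text, can be illustrated with a toy regular-expression extractor; the pattern, units, and feature names below are invented for illustration and are far simpler than the paper's actual parser:

```python
import re

# Toy pattern for clauses like "HbA1c >= 7.0 %" or "age >= 18 years"
PATTERN = re.compile(
    r"(?P<feature>[A-Za-z][A-Za-z0-9 ]*?)\s*"   # feature name, e.g. "HbA1c"
    r"(?P<op><=|>=|<|>|=)\s*"                    # comparison operator
    r"(?P<value>\d+(?:\.\d+)?)\s*"               # numeric threshold
    r"(?P<unit>%|years|kg/m2)?"                  # optional unit
)

def extract_features(criterion):
    """Return (feature, operator, value, unit) tuples from one criterion."""
    out = []
    for clause in re.split(r"\band\b|[,;]", criterion):
        m = PATTERN.search(clause)
        if m:
            out.append((m.group("feature").strip().lower(), m.group("op"),
                        float(m.group("value")), m.group("unit")))
    return out

feats = extract_features("HbA1c >= 7.0 % and age >= 18 years")
```

    Structured tuples like these are what make cross-trial commonality queries (e.g. the distribution of A1c cutoffs) computable at all.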

  14. An Improved Method for the Extraction and Thin-Layer Chromatography of Chlorophyll A and B from Spinach

    ERIC Educational Resources Information Center

    Quach, Hao T.; Steeper, Robert L.; Griffin, William G.

    2004-01-01

    A simple and fast method, which resolves chlorophyll a and b from spinach leaves on analytical plates while minimizing the appearance of chlorophyll degradation products is shown. An improved mobile phase for the Thin-layer chromatographic analysis of spinach extract that allows for the complete resolution of the common plant pigments found in…

  15. The Effectiveness of Circular Equating as a Criterion for Evaluating Equating.

    ERIC Educational Resources Information Center

    Wang, Tianyou; Hanson, Bradley A.; Harris, Deborah J.

    Equating a test form to itself through a chain of equatings, commonly referred to as circular equating, has been widely used as a criterion to evaluate the adequacy of equating. This paper uses both analytical methods and simulation methods to show that this criterion is in general invalid in serving this purpose. For the random groups design done…

  16. Quantitative determination of α-Arbutin, β-Arbutin, Kojic acid, nicotinamide, hydroquinone, resorcinol, 4-methoxyphenol, 4-ethoxyphenol and ascorbic acid from skin whitening products by HPLC-UV

    USDA-ARS?s Scientific Manuscript database

    Development of an analytical method for the simultaneous determination of multifarious skin whitening agents will provide an efficient tool to analyze skin whitening cosmetics. An HPLC-UV method was developed for quantitative analysis of six commonly used whitening agents, α-arbutin, β-arbutin, koji...

  17. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  18. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  19. Variability in, variability out: best practice recommendations to standardize pre-analytical variables in the detection of circulating and tissue microRNAs.

    PubMed

    Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M

    2017-05-01

    microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.

  20. Selection and authentication of botanical materials for the development of analytical methods.

    PubMed

    Applequist, Wendy L; Miller, James S

    2013-05-01

    Herbal products, for example botanical dietary supplements, are widely used. Analytical methods are needed to ensure that botanical ingredients used in commercial products are correctly identified and that research materials are of adequate quality and are sufficiently characterized to enable research to be interpreted and replicated. Adulteration of botanical material in commerce is common for some species. The development of analytical methods for specific botanicals, and accurate reporting of research results, depend critically on correct identification of test materials. Conscious efforts must therefore be made to ensure that the botanical identity of test materials is rigorously confirmed and documented through preservation of vouchers, and that their geographic origin and handling are appropriate. Use of material with an associated herbarium voucher that can be botanically identified is always ideal. Indirect methods of authenticating bulk material in commerce, for example use of organoleptic, anatomical, chemical, or molecular characteristics, are not always acceptable for the chemist's purposes. Familiarity with botanical and pharmacognostic literature is necessary to determine what potential adulterants exist and how they may be distinguished.

  1. Analytical Modeling for the Bending Resonant Frequency of Multilayered Microresonators with Variable Cross-Section

    PubMed Central

    Herrera-May, Agustín L.; Aguilera-Cortés, Luz A.; Plascencia-Mora, Hector; Rodríguez-Morales, Ángel L.; Lu, Jian

    2011-01-01

    Multilayered microresonators commonly use sensitive coatings or piezoelectric layers for detection of mass and gas. Most of these microresonators have a variable cross-section that complicates the prediction of their fundamental resonant frequency (generally of the bending mode) through conventional analytical models. In this paper, we present an analytical model to estimate the first resonant frequency and deflection curve of single-clamped multilayered microresonators with variable cross-section. The analytical model is obtained using the Rayleigh and Macaulay methods, as well as the Euler-Bernoulli beam theory. Our model is applied to two multilayered microresonators with piezoelectric excitation reported in the literature. Both microresonators are composed of layers of seven different materials. The results of our analytical model agree very well with those obtained from finite element models (FEMs) and experimental data. Our analytical model can be used to determine the suitable dimensions of the microresonator’s layers in order to obtain a microresonator that operates at a resonant frequency necessary for a particular application. PMID:22164071
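
    The limiting case that such models reduce to can be illustrated with the classic Euler-Bernoulli result for a uniform, single-material, single-clamped beam, whose first bending frequency has a closed form with (beta1*L) = 1.8751. This is a generic sketch, not the paper's multilayer model; all dimensions and material values below are illustrative assumptions.

```python
import math

def cantilever_f1(E, I, rho, A, L):
    """First bending resonant frequency (Hz) of a uniform single-clamped
    Euler-Bernoulli beam; (beta1 * L) = 1.8751 for the first mode."""
    beta1_L = 1.8751
    return (beta1_L ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))

# Illustrative silicon microcantilever: 200 um long, 40 um wide, 2 um thick
E, rho = 169e9, 2330.0            # Young's modulus (Pa), density (kg/m^3)
L, w, t = 200e-6, 40e-6, 2e-6     # beam dimensions (m)
I = w * t ** 3 / 12               # second moment of area of the cross-section
A = w * t                         # cross-sectional area
print(round(cantilever_f1(E, I, rho, A, L) / 1e3, 1), "kHz")  # tens of kHz
```

    A variable cross-section or a multilayer stack replaces the single EI and rho*A above with position-dependent effective values, which is what makes the closed form above no longer apply directly.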

  2. Instrumental Surveillance of Water Quality.

    ERIC Educational Resources Information Center

    Miller, J. A.; And Others

    The role that analytical instrumentation plays in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…

  3. Masked mycotoxins in corn: an update

    USDA-ARS?s Scientific Manuscript database

    Mycotoxins are frequent contaminants in corn infested with Aspergillus and Fusarium molds. Consumption of mycotoxin products have been shown to be harmful to both humans and animals. Mycotoxins can be “masked” or “hidden” from detection by common antibody-based and chemical analytical methods. The “...

  4. A SIMPLE COLORIMETRIC METHOD TO DETECT BIOLOGICAL EVIDENCE OF HUMAN EXPOSURE TO MICROCYSTINS

    EPA Science Inventory

    Toxic cyanobacteria are contaminants of surface waters worldwide. Microcystins are some of the most commonly detected toxins. Biological evidence of human exposure may be difficult to obtain due to limitations associated with cost, laboratory capacity, analytic support, and exp...

  5. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influence was observed between different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  6. Results of the International Energy Agency Round Robin on Fast Pyrolysis Bio-oil Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas C.; Meier, Dietrich; Oasmaa, Anja

    An international round robin study of the production of fast pyrolysis bio-oil was undertaken. Fifteen institutions in six countries contributed. Three biomass samples were distributed to the laboratories for processing in fast pyrolysis reactors. Samples of the bio-oil produced were transported to a central analytical laboratory for analysis. The round robin was focused on validating the pyrolysis community's understanding of the production of fast pyrolysis bio-oil by providing a common feedstock for bio-oil preparation. The round robin included: • distribution of 3 feedstock samples from a common source to each participating laboratory; • preparation of fast pyrolysis bio-oil in each laboratory with the 3 feedstocks provided; • return of the 3 bio-oil products (minimum 500 ml) with operational description to a central analytical laboratory for bio-oil property determination. The analyses of interest were: density, viscosity, dissolved water, filterable solids, CHN, S, trace element analysis, ash, total acid number, pyrolytic lignin, and accelerated aging of bio-oil. In addition, an effort was made to compare the bio-oil components to the products of analytical pyrolysis through GC/MS analysis. The results showed that clear differences can occur in fast pyrolysis bio-oil properties by applying different reactor technologies or configurations. The comparison to the analytical pyrolysis method suggested that Py-GC/MS could serve as a rapid screening method for bio-oil composition when produced in fluid-bed reactors. Furthermore, hot vapor filtration generally resulted in the most favorable bio-oil product with respect to water, solids, viscosity, and total acid number. These results can be helpful in understanding the variation in bio-oil production methods and their effects on bio-oil product composition.

  7. A numerical test of the topographic bias

    NASA Astrophysics Data System (ADS)

    Sjöberg, L. E.; Joud, M. S. S.

    2018-02-01

    In 1962 A. Bjerhammar introduced the method of analytical continuation in physical geodesy, implying that surface gravity anomalies are downward continued into the topographic masses down to an internal sphere (the Bjerhammar sphere). The method also includes analytical upward continuation of the potential to the surface of the Earth to obtain the quasigeoid. One can show that the common remove-compute-restore technique for geoid determination also includes an analytical continuation as long as the complete density distribution of the topography is not known. The analytical continuation implies that the downward continued gravity anomaly and/or potential is in error by the so-called topographic bias, for which L. E. Sjöberg postulated a simple formula in 2007. Here we numerically test the postulated formula by comparing it with the bias obtained by analytical downward continuation of the external potential of a homogeneous ellipsoid to an inner sphere. The result shows that the postulated formula holds: at the equator of the ellipsoid, where the external potential is downward continued 21 km, the computed and postulated topographic biases agree to within less than a millimetre (when the potential is scaled to the unit of metre).

  8. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

    Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentrations by each analytical method along the 2007 traverse. A simple matrix of element versus method, populated with a value based on the significance classification, provides a means of ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example using vanadium results from a single leach technique illustrates the several possible interpretations of the data.

  9. Integrating Allergen Analysis Within a Risk Assessment Framework: Approaches to Development of Targeted Mass Spectrometry Methods for Allergen Detection and Quantification in the iFAAM Project.

    PubMed

    Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare

    2018-01-01

    Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.

  10. A general statistical test for correlations in a finite-length time series.

    PubMed

    Hanson, Jeffery A; Yang, Haw

    2008-06-07

    The statistical properties of the autocorrelation function from a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test the existence of correlations in a time series. The statistical test is verified by computer simulations, and an application to single-molecule fluorescence spectroscopy is discussed.
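
    The flavor of such a test can be sketched in miniature: under the i.i.d. null hypothesis the lag-k sample autocorrelation is approximately normal with variance 1/n, so |r(k)| > 1.96/sqrt(n) rejects the null at roughly the 5% level. This is a generic textbook illustration of the idea, not the authors' derivation (which also treats the Fourier-transform estimator and non-uniform uncertainties across lags).

```python
import math, random

def autocorr(x, lag):
    """Lag-k sample autocorrelation (moving-average estimator)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def correlated(x, lag, z=1.96):
    """Reject the i.i.d. null when |r(lag)| exceeds z / sqrt(n)."""
    return abs(autocorr(x, lag)) > z / math.sqrt(len(x))

random.seed(42)
white = [random.gauss(0, 1) for _ in range(2000)]   # i.i.d. series: null holds
ar1 = [0.0]
for _ in range(1999):                               # AR(1) series: strongly correlated
    ar1.append(0.9 * ar1[-1] + random.gauss(0, 1))

print(correlated(white, 1), correlated(ar1, 1))
```

    For a finite series the approximation degrades at large lags, where fewer terms enter the sum; that lag dependence of the uncertainty is precisely what the abstract's analytical expressions quantify.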

  11. Comparison of methods for determination of total oil sands-derived naphthenic acids in water samples.

    PubMed

    Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed

    2017-11-01

    There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, the measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultra-performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently had the highest NA concentrations and greatest standard error. Total NA concentration was not statistically different between solid phase extraction and liquid-liquid extraction sample preparations. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to the analytical method standardization for NA analysis in oil sands related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Sign changes as a universal concept in first-passage-time calculations

    NASA Astrophysics Data System (ADS)

    Braun, Wilhelm; Thul, Rüdiger

    2017-01-01

    First-passage-time problems are ubiquitous across many fields of study, including transport processes in semiconductors and biological synapses, evolutionary game theory and percolation. Despite their prominence, first-passage-time calculations have proven to be particularly challenging. Analytical results to date have often been obtained under strong conditions, leaving most of the exploration of first-passage-time problems to direct numerical computations. Here we present an analytical approach that allows the derivation of first-passage-time distributions for the wide class of nondifferentiable Gaussian processes. We demonstrate that the concept of sign changes naturally generalizes the common practice of counting crossings to determine first-passage events. Our method works across a wide range of time-dependent boundaries and noise strengths, thus alleviating common hurdles in first-passage-time calculations.

  13. Analytical Plug-In Method for Kernel Density Estimator Applied to Genetic Neutrality Study

    NASA Astrophysics Data System (ADS)

    Troudi, Molka; Alimi, Adel M.; Saoudi, Samir

    2008-12-01

    The plug-in method enables optimization of the bandwidth of the kernel density estimator in order to estimate probability density functions (pdfs). Here, a faster procedure than the common plug-in method is proposed. The mean integrated square error (MISE) depends directly upon [InlineEquation not available: see fulltext.], which is linked to the second-order derivative of the pdf. As we introduce an analytical approximation of [InlineEquation not available: see fulltext.], the pdf is estimated only once, at the end of the iterations. These two kinds of algorithm are tested on different random variables having distributions that are known to be difficult to estimate. Finally, they are applied to genetic data in order to better characterise, on average, the neutrality of Tunisian Berber populations.
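
    For reference, the quantity the plug-in procedure optimizes can be seen in a minimal Gaussian kernel density estimator using the simpler Silverman rule-of-thumb bandwidth; the plug-in method instead refines the bandwidth from an estimate of the pdf's second derivative. This is a generic sketch with illustrative values, not the article's accelerated algorithm.

```python
import math, random

def silverman_bandwidth(x):
    """Rule-of-thumb bandwidth h = 1.06 * sd * n^(-1/5); the plug-in
    method replaces this with an estimate driven by the pdf's curvature."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
    return 1.06 * sd * n ** (-1 / 5)

def kde(x, data, h):
    """Gaussian kernel density estimate at point x."""
    n = len(data)
    c = 1 / (n * h * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(500)]
h = silverman_bandwidth(sample)
# Density at the mode of a standard normal is 1/sqrt(2*pi) ~ 0.399,
# so the estimate here should land in that neighborhood.
print(round(kde(0.0, sample, h), 3))
```

    The rule of thumb is exact only for Gaussian data; for the hard-to-estimate distributions the abstract mentions, a curvature-aware bandwidth is what separates the plug-in approach from this baseline.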

  14. New methods for new questions: obstacles and opportunities.

    PubMed

    Foster, E Michael; Kalil, Ariel

    2008-03-01

    Two forces motivate this special section, "New Methods for New Questions in Developmental Psychology." First are recent developments in social science methodology and the increasing availability of those methods in common software packages. Second, at the same time psychologists' understanding of developmental phenomena has continued to grow. At their best, these developments in theory and methods work in tandem, fueling each other. Newer methods make it possible for scientists to better test their ideas; better ideas lead methodologists to techniques that better reflect, capture, and quantify the underlying processes. The articles in this special section represent a sampling of these new methods and new questions. The authors describe common themes in these articles and identify barriers to future progress, such as the lack of data sharing by and analytical training for developmentalists.

  15. Ion-pairing HPLC methods to determine EDTA and DTPA in small molecule and biological pharmaceutical formulations.

    PubMed

    Wang, George; Tomasella, Frank P

    2016-06-01

    Ion-pairing high-performance liquid chromatography-ultraviolet (HPLC-UV) methods were developed to determine two commonly used chelating agents, ethylenediaminetetraacetic acid (EDTA) in Abilify® (a small molecule drug with aripiprazole as the active pharmaceutical ingredient) oral solution and diethylenetriaminepentaacetic acid (DTPA) in Yervoy® (a monoclonal antibody drug with ipilimumab as the active pharmaceutical ingredient) intravenous formulation. Since the analytes, EDTA and DTPA, do not contain chromophores, transition metal ions (Cu2+, Fe3+), which generate highly stable metallocomplexes with the chelating agents, were added during sample preparation to enhance UV detection. The use of metallocomplexes with ion-pairing chromatography provides the ability to achieve the desired sensitivity and selectivity in the development of the method. Specifically, the sample preparation involving metallocomplex formation allowed sensitive UV detection. Copper was utilized for the determination of EDTA and iron was utilized for the determination of DTPA. In the case of EDTA, a gradient mobile phase separated the components of the formulation from the analyte. In the method for DTPA, the active drug substance, ipilimumab, was eluted in the void. In addition, the optimization of the concentration of the ion-pairing reagent was discussed as a means of enhancing the retention of the aminopolycarboxylic acids (APCAs), including EDTA and DTPA, and the specificity of the method. The analytical method development was designed based on the chromatographic properties of the analytes, the nature of the sample matrix, and the intended purpose of the method. Validation data were presented for the two methods. Finally, both methods were successfully utilized in determining the fate of the chelates.

  16. Shedding light on food fraud: spectrophotometric and spectroscopic methods as a tool against economically motivated adulteration of food

    NASA Astrophysics Data System (ADS)

    Petronijević, R. B.; Velebit, B.; Baltić, T.

    2017-09-01

    Intentional modification of food or substitution of food ingredients with the aim of gaining profit is food fraud or economically motivated adulteration (EMA). EMA has appeared throughout the food supply chain and, following the global expansion of the food market, has become a worldwide problem for the global economy. Food frauds have involved oils, milk and meat products, infant formula, honey, juices, spices, etc. New legislation was enacted in the last decade in order to fight EMA. Effective analytical methods for food fraud detection are few and still in development. The majority of the methods in common use today for EMA detection are time consuming and inappropriate for use on the production line or outside the laboratory. The next step in the evolution of analytical techniques to combat food fraud is the development of fast, accurate methods applicable using portable or handheld devices. Spectrophotometric and spectroscopic methods combined with chemometric analysis, perhaps in combination with other rapid physico-chemical techniques, could be the answer. This review discusses some analytical techniques based on spectrophotometry and spectroscopy which are used to reveal food fraud and EMA.

  17. A short history, principles, and types of ELISA, and our laboratory experience with peptide/protein analyses using ELISA.

    PubMed

    Aydin, Suleyman

    2015-10-01

    Playing a critical role in the metabolic homeostasis of living systems, the circulating concentrations of peptides/proteins are influenced by a variety of patho-physiological events. These peptide/protein concentrations in biological fluids are measured using various methods, the most common of which is the enzymatic immunoassay (EIA/ELISA), and these measurements guide clinicians in diagnosing and monitoring diseases that afflict biological systems. All techniques in which enzymes are employed to show antigen-antibody reactions are generally referred to as enzymatic immunoassay (EIA/ELISA) methods, since the basic principles of EIA and ELISA are the same. The main objective of this review is to present an overview of the historical journey that led to the invention of EIA/ELISA, an indispensable method for medical and research laboratories; the types of ELISA developed after its invention [direct (the first ELISA method invented), indirect, sandwich and competitive methods]; problems encountered during peptide/protein analyses (pre-analytical, analytical and post-analytical); rules to be followed to prevent these problems; and our laboratory experience of more than 15 years. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Statistical learning theory for high dimensional prediction: Application to criterion-keyed scale development.

    PubMed

    Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R

    2016-12-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how 3 common SLT algorithms-supervised principal components, regularization, and boosting-can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach-or perhaps because of them-SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
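
    The core SLT idea the article builds on, estimating expected prediction error (EPE) out of sample rather than maximizing within-sample fit, can be sketched with plain k-fold cross-validation. The model and data below are illustrative stand-ins, not the article's scale-construction pipeline or any of its three algorithms.

```python
import random

def kfold_mse(xs, ys, k, fit, predict):
    """Estimate expected prediction error by k-fold cross-validation:
    hold out each fold in turn, fit on the rest, average squared error."""
    idx = list(range(len(xs)))
    random.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    total, count = 0.0, 0
    for fold in folds:
        held = set(fold)
        train = [i for i in idx if i not in held]
        model = fit([xs[i] for i in train], [ys[i] for i in train])
        for i in fold:
            total += (predict(model, xs[i]) - ys[i]) ** 2
            count += 1
    return total / count

def fit_line(xs, ys):
    """Ordinary least-squares line fit (illustrative stand-in model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return (my - b * mx, b)

def predict_line(model, x):
    a, b = model
    return a + b * x

random.seed(1)
xs = [i / 10 for i in range(100)]
ys = [2.0 + 3.0 * x + random.gauss(0, 0.5) for x in xs]
cv_error = kfold_mse(xs, ys, 5, fit_line, predict_line)
print(round(cv_error, 3))  # close to the irreducible noise variance, 0.25
```

    A more flexible model would fit the training folds better but could raise the held-out error; picking the complexity that minimizes the cross-validated error is the EPE-minimization principle the abstract contrasts with within-sample likelihood maximization.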

  19. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  20. A rapid miniaturized residue analytical method for the determination of zoxamide and its two acid metabolites in ginseng roots using UPLC-MS/MS.

    PubMed

    Podhorniak, Lynda V

    2014-04-30

    A miniaturized residue method was developed for the analysis of the fungicide zoxamide and its metabolites in dried ginseng root. The zoxamide metabolites, 3,5-dichloro-1,4-benzenedicarboxylic acid (DCBC) and 3,5-dichloro-4-hydroxymethylbenzoic acid (DCHB), are small acid molecules that have not been previously extracted from the ginseng matrix with common multiresidue methods. The presented extraction method effectively and rapidly recovers both the zoxamide parent compound and its acid metabolites from fortified ginseng root. The metabolites are extracted with an alkaline glycine buffer and the aqueous ginseng mixture is partitioned with ethyl acetate. In addition, this method avoids the use of derivatization of the small acid molecules by using UPLC-MS/MS instrumental analysis. In a quantitative validation of the analytical method at three levels for zoxamide (0.007 (LOD), 0.02 (LOQ), and 0.2 mg/kg) and four levels (0.07 (LOD), 0.2 (LOQ), and 0.6 and 6 mg/kg) for both metabolites, acceptable method performances were achieved with recoveries ranging from 86 to 107% (at levels of LOQ and 3×, 10×, and 30× the LOQ) with <20% RSD for the three analytes in accordance with international guidelines.

  1. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893

  2. A new unconditionally stable and consistent quasi-analytical in-stream water quality solution scheme for CSTR-based water quality simulators

    NASA Astrophysics Data System (ADS)

    Woldegiorgis, Befekadu Taddesse; van Griensven, Ann; Pereira, Fernando; Bauwens, Willy

    2017-06-01

    The most common numerical solutions used in CSTR-based in-stream water quality simulators are susceptible to instabilities and/or solution inconsistencies. Usually, they cope with instability problems by adopting computationally expensive small time steps. However, some simulators use fixed computation time steps and hence do not have the flexibility to do so. This paper presents a novel quasi-analytical solution for CSTR-based water quality simulators of an unsteady system. The robustness of the new method is compared with the commonly used fourth-order Runge-Kutta methods, the Euler method, and three versions of the SWAT model (SWAT2012, SWAT-TCEQ, and ESWAT). The performance of each method is tested in different hypothetical experiments. Besides the hypothetical data, a real case study is used for comparison. The growth factors we derived as stability measures for the different methods, together with the R-factor as a consistency measure, turned out to be very useful for determining the most robust method. The new method outperformed all the numerical methods used in the hypothetical comparisons. The application to the Zenne River (Belgium) shows that the new method provides stable and consistent BOD simulations, whereas the SWAT2012 model is shown to be unstable for the standard daily computation time step. The new method unconditionally yields robust solutions and is therefore a reliable scheme for CSTR-based water quality simulators that use first-order reaction formulations.
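
    The gist of an unconditionally stable quasi-analytical step can be illustrated for a single CSTR with first-order decay: holding the inflow and rate constant within a step, dC/dt = (Q/V)*(Cin - C) - k*C integrates exactly, so the update is stable for any step size, whereas explicit Euler diverges when the step exceeds the stability limit. This is a one-reactor sketch of the general idea, not the paper's scheme for full water-quality reaction networks; all parameter values are illustrative.

```python
import math

def euler_step(c, dt, q_over_v, c_in, k):
    """Explicit Euler update for dC/dt = (Q/V)*(Cin - C) - k*C."""
    return c + dt * (q_over_v * (c_in - c) - k * c)

def analytical_step(c, dt, q_over_v, c_in, k):
    """Exact one-step solution with inputs frozen over the step:
    C relaxes toward its steady state c_ss with rate a = Q/V + k."""
    a = q_over_v + k
    c_ss = q_over_v * c_in / a
    return c_ss + (c - c_ss) * math.exp(-a * dt)

# A stiff setting: with dt = 1 and a = Q/V + k = 3.5, explicit Euler's
# amplification factor is |1 - a*dt| = 2.5 > 1, so it diverges, while
# the analytical step converges to steady state for any dt.
c_euler = c_exact = 10.0
for _ in range(50):
    c_euler = euler_step(c_euler, 1.0, 0.5, 2.0, 3.0)
    c_exact = analytical_step(c_exact, 1.0, 0.5, 2.0, 3.0)
print(abs(c_euler) > 1e6, round(c_exact, 4))
```

    Chaining such exact per-step solutions reactor by reactor is what lets a fixed-time-step simulator stay stable without shrinking its computation interval.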

  3. Much Ado about Nothing--Or at Best, Very Little

    ERIC Educational Resources Information Center

    Widaman, Keith F.

    2014-01-01

    Latent variable structural equation modeling has become the analytic method of choice in many domains of research in psychology and allied social sciences. One important aspect of a latent variable model concerns the relations hypothesized to hold between latent variables and their indicators. The most common specification of structural equation…

  4. Ultra-sensitive detection using integrated waveguide technologies

    USDA-ARS?s Scientific Manuscript database

    There is a pressing need to detect analytes at very low concentrations, such as food- and water-borne pathogens (e.g. E. coli O157:H7) and biothreat agents (e.g., anthrax, toxins). Common fluorescence detection methods, such as 96 well plate readers, are not sufficiently sensitive for low concentra...

  5. ANALYTICAL METHODOLOGY FOR THE DETERMINATION OF KEPONE (TRADEMARK) RESIDUES IN FISH, SHELLFISH, AND HI-VOL AIR FILTERS

    EPA Science Inventory

    The recent discovery of the pollution of the environment with Kepone has resulted in a tremendous interest in the development of residue methodology for the compound. Current multiresidue methods for the determination of the common organochlorinated pesticides do not yield good q...

  6. Methods to Assess Bioavailability of Hydrophobic Organic Contaminants: Principles, Operations, and Limitations

    PubMed Central

    Cui, Xinyi; Mayer, Philipp; Gan, Jay

    2013-01-01

    Many important environmental contaminants are hydrophobic organic contaminants (HOCs), which include PCBs, PAHs, PBDEs, DDT and other chlorinated insecticides, among others. Owing to their strong hydrophobicity, HOCs ultimately accumulate in soil or sediment, where their ecotoxicological effects are closely regulated by sorption and thus bioavailability. The last two decades have seen a dramatic increase in research efforts to develop and apply partitioning-based methods and biomimetic extractions for measuring HOC bioavailability. However, the many variations of both analytical methods and associated measurement endpoints are often a source of confusion for users. In this review, we distinguish the most commonly used analytical approaches based on their measurement objectives, and illustrate their practical operational steps, strengths and limitations using simple flowcharts. This review may serve as guidance for new users on the selection and use of established methods, and as a reference for experienced investigators to identify potential topics for further research. PMID:23064200

  7. Suitability of analytical methods to measure solubility for the purpose of nanoregulation.

    PubMed

    Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike

    2016-01-01

    Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk assessment point of view its disposal can be treated much in the same way as that of "ordinary" chemicals, which will simplify testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements governed by the cosmetics regulation, as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a large research gap remains related to the need to ensure data reliability.

  8. Simultaneous Screening and Quantification of Basic, Neutral and Acidic Drugs in Blood Using UPLC-QTOF-MS.

    PubMed

    Bidny, Sergei; Gago, Kim; Chung, Phuong; Albertyn, Desdemona; Pasin, Daniel

    2017-04-01

    An analytical method using ultra-performance liquid chromatography (UPLC) quadrupole time-of-flight mass spectrometry (QTOF-MS) was developed and validated for the targeted toxicological screening and quantification of commonly used pharmaceuticals and drugs of abuse in postmortem blood using a 100 µL sample. It screens for more than 185 drugs and metabolites and quantifies more than 90 drugs. The selected compounds include classes of pharmaceuticals and drugs of abuse such as antidepressants, antipsychotics, analgesics (including narcotic analgesics), anti-inflammatory drugs, benzodiazepines, beta-blockers, amphetamines, new psychoactive substances (NPS), cocaine and metabolites. Compounds were extracted into acetonitrile using a salting-out assisted liquid-liquid extraction (SALLE) procedure. The extracts were analyzed using a Waters ACQUITY UPLC coupled with a XEVO QTOF mass spectrometer. Separation of the analytes was achieved by gradient elution using a Waters ACQUITY HSS C18 column (2.1 mm × 150 mm, 1.8 μm). The mass spectrometer was operated in both positive and negative electrospray ionization modes. The high-resolution mass spectrometry (HRMS) data were acquired using the patented Waters MSE acquisition mode, which collects low- and high-energy spectra alternately during the same acquisition. Positive identification of target analytes was based on accurate mass measurements of the molecular ion and product ion, the peak area ratio and retention times. Calibration curves were linear over the concentration range 0.05-2 mg/L for basic and neutral analytes and 0.1-6 mg/L for acidic analytes, with correlation coefficients (r2) > 0.96 for most analytes. The limits of detection (LOD) were between 0.001 and 0.05 mg/L for all analytes. Good recoveries were achieved, ranging from 80% to 100% for most analytes using the SALLE method. The method was validated for sensitivity, selectivity, accuracy, precision, stability, carryover and matrix effects. 
The developed method was tested on a number of authentic forensic samples producing consistent results that correlated with results obtained from other validated methods. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
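
The reported calibration linearity (r² > 0.96) comes from an ordinary least-squares fit of detector response against concentration. A minimal sketch of that computation, using invented peak-area data for a hypothetical basic analyte over the stated 0.05-2 mg/L range:

```python
def linear_calibration(conc, resp):
    """Ordinary least-squares fit resp ~ slope*conc + intercept, plus r^2."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
    ss_tot = sum((y - my) ** 2 for y in resp)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical 6-point calibration over 0.05-2 mg/L (peak areas invented)
conc = [0.05, 0.1, 0.25, 0.5, 1.0, 2.0]
resp = [510, 1020, 2490, 5050, 9980, 20100]
slope, intercept, r2 = linear_calibration(conc, resp)
```

An unknown sample's concentration is then read back as `(area - intercept) / slope`.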

  9. ODF Maxima Extraction in Spherical Harmonic Representation via Analytical Search Space Reduction

    PubMed Central

    Aganj, Iman; Lenglet, Christophe; Sapiro, Guillermo

    2015-01-01

    By revealing complex fiber structure through the orientation distribution function (ODF), q-ball imaging has recently become a popular reconstruction technique in diffusion-weighted MRI. In this paper, we propose an analytical dimension reduction approach to ODF maxima extraction. We show that by expressing the ODF, or any antipodally symmetric spherical function, in the common fourth order real and symmetric spherical harmonic basis, the maxima of the two-dimensional ODF lie on an analytically derived one-dimensional space, from which we can detect the ODF maxima. This method reduces the computational complexity of the maxima detection, without compromising the accuracy. We demonstrate the performance of our technique on both artificial and human brain data. PMID:20879302

  10. A novel approach for quantitation of nonderivatized sialic acid in protein therapeutics using hydrophilic interaction chromatographic separation and nano quantity analyte detection.

    PubMed

    Chemmalil, Letha; Suravajjala, Sreekanth; See, Kate; Jordan, Eric; Furtado, Marsha; Sun, Chong; Hosselet, Stephen

    2015-01-01

    This paper describes a novel approach for the quantitation of nonderivatized sialic acid in glycoproteins, separated by hydrophilic interaction chromatography and detected by a Nano Quantity Analyte Detector (NQAD). The NQAD detection technique is based on measuring the change in the size of a dry aerosol and converting the particle count rate into a chromatographic output signal. The NQAD is suitable for the detection of sialic acid, which lacks a sufficiently active chromophore or fluorophore. The water condensation particle counting technology allows the analyte to be enlarged using water vapor, providing the highest sensitivity. Derivatization-free analysis of glycoproteins by the HPLC/NQAD method with a PolyGLYCOPLEX™ amide column correlates well with an HPLC method using precolumn derivatization with 1,2-diamino-4,5-methylenedioxybenzene (DMB), as well as with Dionex-based high-pH anion-exchange chromatography (ion chromatography) with pulsed amperometric detection (HPAEC-PAD). With the elimination of the derivatization step, the HPLC/NQAD method is more efficient than the HPLC/DMB method. The HPLC/NQAD method is also more reproducible than the HPAEC-PAD method, which suffers from high variability because of electrode fouling during analysis. Overall, the HPLC/NQAD method offers a broad linear dynamic range as well as excellent precision, accuracy, repeatability, reliability, and ease of use, with acceptable comparability to the commonly used HPAEC-PAD and HPLC/DMB methods. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  11. Frontiers of two-dimensional correlation spectroscopy. Part 2. Perturbation methods, fields of applications, and types of analytical probes

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. The fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.

  12. Simultaneous LC-MS/MS determination of 40 legal and illegal psychoactive drugs in breast and bovine milk.

    PubMed

    López-García, Ester; Mastroianni, Nicola; Postigo, Cristina; Valcárcel, Yolanda; González-Alonso, Silvia; Barceló, Damia; López de Alda, Miren

    2018-04-15

    This work presents a fast, sensitive and reliable multi-residue methodology based on fat and protein precipitation and liquid chromatography-tandem mass spectrometry for the determination of common legal and illegal psychoactive drugs, and major metabolites, in breast milk. One-fourth of the 40 target analytes is investigated for the first time in this biological matrix. The method was validated in breast milk and also in various types of bovine milk, as tranquilizers are occasionally administered to food-producing animals. Absolute recoveries were satisfactory for 75% of the target analytes. The use of isotopically labeled compounds assisted in correcting analyte losses due to ionization suppression matrix effects (higher in whole milk than in the other investigated milk matrices) and ensured the reliability of the results. Average method limits of quantification ranged between 0.4 and 6.8 ng/mL. Application of the developed method showed the presence of caffeine in breast milk samples (12-179 ng/mL). Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  14. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stays. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive, goal-oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by combining multiple analytic methods to provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
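
The idea of comparing variable rankings across analytic approaches can be sketched on toy data: a univariate (correlation-based) ranking and a one-split decision stump ranking applied to the same invented patient records. The feature names and values below are hypothetical; the example only illustrates how two methods can converge on the same predictor.

```python
# Each row: (heart_rate, lactate, age, sepsis_label) -- invented toy data
data = [
    (80, 1.0, 50, 0), (85, 1.2, 60, 0), (90, 1.1, 45, 0), (95, 1.4, 70, 0),
    (100, 3.5, 55, 1), (88, 4.0, 65, 1), (110, 3.8, 40, 1), (92, 4.2, 75, 1),
]
labels = [row[3] for row in data]
names = ["heart_rate", "lactate", "age"]

def point_biserial(values, labels):
    """|correlation| of a continuous feature with a binary outcome."""
    n = len(values)
    m = sum(values) / n
    sd = (sum((v - m) ** 2 for v in values) / n) ** 0.5
    n1 = sum(labels)
    m1 = sum(v for v, y in zip(values, labels) if y) / n1
    m0 = sum(v for v, y in zip(values, labels) if not y) / (n - n1)
    p = n1 / n
    return abs((m1 - m0) / sd * (p * (1 - p)) ** 0.5)

def stump_accuracy(values, labels):
    """Best accuracy of a one-split decision stump on this feature."""
    best = 0.0
    for t in sorted(set(values)):
        for sign in (1, -1):
            preds = [1 if sign * v > sign * t else 0 for v in values]
            acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
            best = max(best, acc)
    return best

def column(f):
    return [row[names.index(f)] for row in data]

# Both approaches rank the features; here they agree on the top predictor
corr_top = max(names, key=lambda f: point_biserial(column(f), labels))
stump_top = max(names, key=lambda f: stump_accuracy(column(f), labels))
```

On real data the rankings need not match exactly, which is precisely the comparison the paper reports.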

  15. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM) that has been used in China since ancient times as an anticonvulsant, analgesic, sedative, anti-asthmatic and immunomodulatory drug. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism and excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, the discovery of new therapeutic uses, and the understanding of the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  16. Two Analyte Calibration From The Transient Response Of Potentiometric Sensors Employed With The SIA Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cartas, Raul; Mimendia, Aitor; Valle, Manel del

    2009-05-23

    Calibration models for multi-analyte electronic tongues have commonly been built using a set of sensors, at least one per analyte under study. The complex signals recorded with these systems are formed by the sensors' responses to the analytes of interest plus interferents, from which a multivariate response model is then developed. This work describes a data treatment method for the simultaneous quantification of two species in solution employing the signal from a single sensor. The approach used here takes advantage of the complex information recorded in one electrode's transient after insertion of the sample to build the calibration models for both analytes. The transient signal from the electrode was first processed by the discrete wavelet transform, to extract useful information and reduce the signal length, and then by artificial neural networks to fit a model. Two different potentiometric sensors were used as a case study to corroborate the effectiveness of the approach.
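
The wavelet preprocessing step described here can be sketched with a one-level orthonormal Haar transform (a simple stand-in for whatever wavelet family the authors used): the approximation coefficients halve the signal length while retaining nearly all the energy of a smooth transient, which is what makes them a compact input for a neural network. The transient below is invented.

```python
import math

def haar_dwt(signal):
    """One level of the orthonormal Haar wavelet transform.

    Returns (approximation, detail) coefficients, each half the input length.
    """
    assert len(signal) % 2 == 0
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    return approx, detail

# A smooth, slowly decaying "electrode transient" (invented): most of its
# energy lands in the approximation coefficients, so keeping only those
# halves the feature vector fed to the neural network.
transient = [math.exp(-t / 8.0) for t in range(16)]
approx, detail = haar_dwt(transient)
```

Because the transform is orthonormal, the total signal energy is exactly preserved across the two coefficient sets (Parseval), so discarding the low-energy details loses very little information.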

  17. Molecular detection of Borrelia burgdorferi sensu lato – An analytical comparison of real-time PCR protocols from five different Scandinavian laboratories

    PubMed Central

    Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.

    2017-01-01

    Introduction Lyme borreliosis (LB) is the most common tick-transmitted disease in Europe. The diagnosis of LB today is based on the patient's medical history, clinical presentation and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of blinded samples (reference panels): i) cDNA extracted and transcribed from water spiked with cultured Borrelia strains, ii) cerebrospinal fluid spiked with cultured Borrelia strains, and iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene; however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template, the analytical sensitivity was in general higher for the protocols using DNA as template, regardless of the target gene. The analytical specificity of all eight protocols was high. However, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae or Borrelia japonica. PMID:28937997

  18. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  19. Ray Tracing and Modal Methods for Modeling Radio Propagation in Tunnels With Rough Walls

    PubMed Central

    Zhou, Chenming

    2017-01-01

    At the ultrahigh frequencies common to portable radios, tunnels such as mine entries are often modeled by hollow dielectric waveguides. The roughness condition of the tunnel walls has an influence on radio propagation, and therefore should be taken into account when an accurate power prediction is needed. This paper investigates how wall roughness affects radio propagation in tunnels, and presents a unified ray tracing and modal method for modeling radio propagation in tunnels with rough walls. First, general analytical formulas for modeling the influence of the wall roughness are derived, based on the modal method and the ray tracing method, respectively. Second, the equivalence of the ray tracing and modal methods in the presence of wall roughness is mathematically proved, by showing that the ray tracing-based analytical formula converges to the modal-based formula through the Poisson summation formula. The derivation and findings are verified by simulation results based on ray tracing and modal methods. PMID:28935995

  20. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
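
The canonical double-gyre flow mentioned above has a standard closed-form velocity field, which any of the four Lagrangian methods would take as input together with a trajectory integrator. A minimal sketch with the standard parameter values; the field is divergence-free by construction, which can be checked numerically:

```python
import math

# Canonical double-gyre stream function on the domain [0,2] x [0,1]
# (standard parameter values from the literature)
A, EPS, OM = 0.1, 0.25, 2.0 * math.pi / 10.0

def velocity(x, y, t):
    """(u, v) from the stream function psi = A*sin(pi*f(x,t))*sin(pi*y)."""
    a = EPS * math.sin(OM * t)
    b = 1.0 - 2.0 * EPS * math.sin(OM * t)
    f = a * x * x + b * x
    dfdx = 2.0 * a * x + b
    u = -math.pi * A * math.sin(math.pi * f) * math.cos(math.pi * y)
    v = math.pi * A * math.cos(math.pi * f) * math.sin(math.pi * y) * dfdx
    return u, v

def rk4_step(x, y, t, dt):
    """One classical 4th-order Runge-Kutta step of tracer advection."""
    u1, v1 = velocity(x, y, t)
    u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
    u3, v3 = velocity(x + 0.5 * dt * u2, y + 0.5 * dt * v2, t + 0.5 * dt)
    u4, v4 = velocity(x + dt * u3, y + dt * v3, t + dt)
    return (x + dt * (u1 + 2 * u2 + 2 * u3 + u4) / 6.0,
            y + dt * (v1 + 2 * v2 + 2 * v3 + v4) / 6.0)

# Advect one tracer for 10 time units; the domain boundaries are invariant,
# so the trajectory must remain inside [0,2] x [0,1]
x, y = 1.0, 0.5
for n in range(100):
    x, y = rk4_step(x, y, n * 0.1, 0.1)
```

The geometric and probabilistic methods would operate on dense grids of such trajectories; the cluster- and braid-based methods on sparse sets of them.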

  1. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    PubMed

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  2. Compensating for Effects of Humidity on Electronic Noses

    NASA Technical Reports Server (NTRS)

    Homer, Margie; Ryan, Margaret A.; Manatt, Kenneth; Zhou, Hanying; Manfreda, Allison

    2004-01-01

    A method of compensating for the effects of humidity on the readouts of electronic noses has been devised and tested. The method is especially appropriate for use in environments in which humidity is not or cannot be controlled, for example, in the vicinity of a chemical spill, which can be accompanied by large local changes in humidity. Heretofore, it has been common practice to treat water vapor as merely another analyte, the concentration of which is determined, along with those of the other analytes, in a computational process based on deconvolution. This practice works well but leaves room for improvement: changes in humidity can give rise to large changes in electronic-nose responses. If corrections for humidity are not made, the large humidity-induced responses may swamp the smaller responses associated with low concentrations of analytes. The present method offers an improvement. The underlying concept is simple: one augments an electronic nose with a separate humidity sensor and a separate temperature sensor. The outputs of these sensors are used to generate values that are subtracted from the readings of the other sensors in the electronic nose, to correct for the temperature-dependent contributions of humidity to those readings. Hence, in principle, what remains after correction is the contribution of the analytes only. Laboratory experiments on a first-generation electronic nose have shown that this method is effective and improves the success rate of identification of analyte/water mixtures. Work on a second-generation device was in progress at the time of reporting the information for this article.
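
The subtraction scheme described here can be sketched as a least-squares baseline model: fit each sensor's clean-air response to humidity and temperature, then subtract the predicted baseline from field readings. The coefficients and readings below are invented for illustration.

```python
def fit_humidity_model(hum, temp, resp):
    """Least-squares fit resp ~ c0 + c1*hum + c2*temp (centered 2x2 normal equations)."""
    n = len(resp)
    mh, mt, mr = sum(hum) / n, sum(temp) / n, sum(resp) / n
    shh = sum((h - mh) ** 2 for h in hum)
    stt = sum((t - mt) ** 2 for t in temp)
    sht = sum((h - mh) * (t - mt) for h, t in zip(hum, temp))
    shr = sum((h - mh) * (r - mr) for h, r in zip(hum, resp))
    strr = sum((t - mt) * (r - mr) for t, r in zip(temp, resp))
    det = shh * stt - sht * sht
    c1 = (stt * shr - sht * strr) / det
    c2 = (shh * strr - sht * shr) / det
    return mr - c1 * mh - c2 * mt, c1, c2

def compensate(reading, hum, temp, coeffs):
    """Subtract the predicted humidity/temperature baseline from a raw reading."""
    c0, c1, c2 = coeffs
    return reading - (c0 + c1 * hum + c2 * temp)

# Clean-air calibration data (invented): response is baseline only, no analyte
hum = [20, 30, 40, 50, 60, 35, 45]
temp = [22, 24, 26, 23, 25, 27, 21]
resp = [0.5 * h + 0.2 * t for h, t in zip(hum, temp)]
coeffs = fit_humidity_model(hum, temp, resp)

# Field reading at 55% RH, 24 C with an analyte contributing 3.0 response units
raw = 0.5 * 55 + 0.2 * 24 + 3.0
corrected = compensate(raw, 55, 24, coeffs)
```

After correction, only the analyte's 3.0-unit contribution remains, which is the quantity the deconvolution stage should see.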

  3. Research on bathymetry estimation by Worldview-2 based with the semi-analytical model

    NASA Astrophysics Data System (ADS)

    Sheng, L.; Bai, J.; Zhou, G.-W.; Zhao, Y.; Li, Y.-C.

    2015-04-01

    The South Sea Islands of China are far from the mainland, reefs make up more than 95% of the South Sea, and most reefs are scattered over sensitive disputed areas. Accurate methods for obtaining reef bathymetry therefore urgently need to be developed. Commonly used methods, including sonar, airborne laser and remote sensing estimation, are limited by the long distances, large areas and sensitive locations involved. Remote sensing data provide an effective way to estimate bathymetry over large areas without physical contact, by exploiting the relationship between spectral information and water depth. Taking into account the water quality of the South Sea of China, this paper develops a bathymetry estimation method that requires no measured water depths. First, a semi-analytical optimization model of the theoretical interpretation models is studied, with a genetic algorithm used to optimize the model. In addition, an OpenMP parallel computing algorithm is introduced to greatly increase the speed of the semi-analytical optimization model. One island in the South Sea of China is selected as the study area, and measured water depths are used to evaluate the accuracy of the bathymetry estimated from Worldview-2 multispectral images. The results show that the semi-analytical optimization model based on the genetic algorithm performs well in the study area, and that the accuracy of the estimated bathymetry in the 0-20 m shallow-water zone is acceptable. The semi-analytical optimization model based on a genetic algorithm thus solves the problem of bathymetry estimation without water depth measurements. More generally, this paper provides a new bathymetry estimation method for sensitive reefs far from the mainland.
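
The genetic-algorithm inversion of a semi-analytical model can be sketched as follows: a toy forward model maps depth to reflectance (a hypothetical stand-in for the paper's interpretation model, with invented coefficients), and a simple GA inverts a synthetic observation without any measured depths.

```python
import math
import random

random.seed(42)

def simulate_reflectance(depth):
    """Hypothetical semi-analytical forward model (invented coefficients):
    remote-sensing reflectance decays exponentially with water depth."""
    return 0.12 * math.exp(-0.35 * depth) + 0.02

def fitness(depth, observed):
    # Negative squared error between modelled and observed reflectance
    return -(simulate_reflectance(depth) - observed) ** 2

def genetic_search(observed, pop_size=40, gens=80, lo=0.0, hi=20.0):
    """Toy genetic algorithm: truncation selection, blend crossover,
    Gaussian mutation, with the top half carried over unchanged (elitism)."""
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda d: fitness(d, observed), reverse=True)
        parents = ranked[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b) + random.gauss(0.0, 0.3)
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return max(pop, key=lambda d: fitness(d, observed))

# Invert a synthetic observation made at a true depth of 7 m
true_depth = 7.0
observed = simulate_reflectance(true_depth)
estimated_depth = genetic_search(observed)
```

In the paper's setting the forward model is evaluated per pixel, which is why an OpenMP-parallel implementation pays off.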

  4. Difficulties in fumonisin determination: the issue of hidden fumonisins.

    PubMed

    Dall'Asta, Chiara; Mangia, Mattia; Berthiller, Franz; Molinelli, Alexandra; Sulyok, Michael; Schuhmacher, Rainer; Krska, Rudolf; Galaverna, Gianni; Dossena, Arnaldo; Marchelli, Rosangela

    2009-11-01

    In this paper, the results obtained by five independent methods for the quantification of fumonisins B(1), B(2), and B(3) in raw maize are reported. Five naturally contaminated maize samples and a reference material were analyzed in three different laboratories. Although each method was validated and common calibrants were used, agreement among the reported fumonisin contamination levels was poor. In order to investigate the analyte-matrix interactions leading to this lack of consistency, the occurrence of fumonisin derivatives was checked. Significant amounts of hidden fumonisins were detected in all the considered samples. Furthermore, the application of an in vitro digestion protocol to raw maize allowed a higher recovery of native fumonisins, suggesting that the interaction occurring between the analytes and matrix macromolecules is associative rather than covalent. Depending on the analytical method as well as the maize sample, only 37-68% of the total fumonisin concentrations were found to be extractable from the samples. These results are particularly striking in the case of the certified reference material, underlining the actual difficulties in ascertaining the trueness of a method for fumonisin determination and thus opening an important issue for risk assessment.

  5. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  7. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE PAGES

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.; ...

    2017-08-25

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.
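
    The core bias mechanism described above — naive counts understate density whenever detection is imperfect — can be illustrated with a minimal simulation. This is a hedged sketch with made-up parameters (TRUE_DENSITY, detection probability p), not the study's actual simulation framework:

```python
import random

random.seed(42)

TRUE_DENSITY = 2.0           # animals per unit area (hypothetical)
AREA = 100.0                 # area sampled (hypothetical)
N = int(TRUE_DENSITY * AREA)

def naive_estimate(p, trials=2000):
    """Average count/AREA over trials; biased low when detection p < 1."""
    total = 0.0
    for _ in range(trials):
        detected = sum(1 for _ in range(N) if random.random() < p)
        total += detected / AREA
    return total / trials

biased = naive_estimate(p=0.5)   # roughly half the true density
corrected = biased / 0.5         # dividing by p recovers the truth
```

Real estimators must also account for the effective area sampled, which is where scale of movement enters; this toy ignores that entirely.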

  9. An analytical model for pressure of volume fractured tight oil reservoir with horizontal well

    NASA Astrophysics Data System (ADS)

    Feng, Qihong; Dou, Kaiwen; Zhang, Xianmin; Xing, Xiangdong; Xia, Tian

    2017-05-01

    Tight oil reservoirs have poorer properties than the conventional reservoirs commonly seen before: porosity and permeability are low, and flow behavior is very complex. Therefore, ordinary depletion methods are ineffective here. Volume fracturing goes beyond the conventional EOR mechanism by enlarging the contact area between fractures and reservoir, thereby improving the production of each well. To forecast production effectively, we use the traditional dual-porosity model, build an analytical model for the pressure of a volume-fractured tight oil reservoir with a horizontal well, and obtain the analytical solution in the Laplace domain. We then construct the log-log plot of dimensionless pressure versus time by Stehfest numerical inversion. Influential factors on pressure, such as cross flow, skin factor and threshold pressure gradient, are then analyzed. This model provides a useful method for tight oil production forecasting and has guiding significance for production capacity prediction and dynamic analysis.
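
    Converting a Laplace-domain pressure solution into the time-domain log-log plot requires numerical inversion; the Stehfest algorithm is the standard choice in well-test analysis. Below is a generic, self-contained sketch of that algorithm — the transform F passed in is a placeholder, not the paper's reservoir model:

```python
from math import factorial, log

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) (Stehfest, N even)."""
    h = N // 2
    ln2 = log(2.0)
    total = 0.0
    for i in range(1, N + 1):
        # Stehfest weight V_i
        V = 0.0
        for k in range((i + 1) // 2, min(i, h) + 1):
            V += (k**h * factorial(2 * k)
                  / (factorial(h - k) * factorial(k) * factorial(k - 1)
                     * factorial(i - k) * factorial(2 * k - i)))
        V *= (-1) ** (i + h)
        total += V * F(i * ln2 / t)
    return total * ln2 / t

# Sanity checks on known transform pairs:
f1 = stehfest_invert(lambda s: 1.0 / s, 5.0)        # f(t) = 1
f2 = stehfest_invert(lambda s: 1.0 / s**2, 2.0)     # f(t) = t
```

The method works well for smooth, monotone pressure responses; oscillatory functions need other inversion schemes (e.g. Fourier-series based).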

  10. Magnetic Nanoparticles for Antibiotics Detection

    PubMed Central

    Cristea, Cecilia; Tertis, Mihaela; Galatus, Ramona

    2017-01-01

    Widespread use of antibiotics has led to pollution of waterways, potentially creating resistance among freshwater bacterial communities. Microorganisms resistant to commonly prescribed antibiotics ("superbugs") have increased dramatically over the last decades. The presence of antibiotics in water, food and beverages, in both their metabolized and un-metabolized forms, is of concern for humans: daily exposure to small quantities can accumulate and lead to the development of antibiotic resistance, or multiply the risk of allergic reaction. Conventional analytical methods used to quantify antibiotics are relatively expensive and generally require long analysis times, and field analyses are difficult to perform. In this context, electrochemical and optical sensing devices are of interest, offering great potential for a broad range of analytical applications. This review focuses on the application of magnetic nanoparticles in the design of different analytical methods, mainly sensors, used for the detection of antibiotics in different matrices (human fluids, environmental, food and beverage samples). PMID:28538684

  11. Analyzing Response Times in Tests with Rank Correlation Approaches

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jorg-Tobias

    2013-01-01

    It is common practice to log-transform response times before analyzing them with standard factor analytical methods. However, sometimes the log-transformation is not capable of linearizing the relation between the response times and the latent traits. Therefore, a more general approach to response time analysis is proposed in the current…
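
    The rank-based idea can be sketched as follows. The data are illustrative only (a hypothetical monotone but non-log-linear trait-RT relation), not the authors' model: Spearman's rank correlation is invariant under monotone transformations, so it recovers a perfect monotone relation even where a log-transform fails to linearize it.

```python
def _rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    for pos, i in enumerate(order):
        ranks[i] = pos + 1.0          # no ties in this toy example
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    # Spearman rho = Pearson correlation of the ranks
    return pearson(_rank(x), _rank(y))

trait = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]       # latent trait levels
rts = [2.0 ** (-t) + 1.0 for t in trait]     # monotone, not log-linear
rho = spearman(trait, rts)                   # -1.0: perfect monotone relation
```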

  12. The Use of Cluster Analysis in Typological Research on Community College Students

    ERIC Educational Resources Information Center

    Bahr, Peter Riley; Bielby, Rob; House, Emily

    2011-01-01

    One useful and increasingly popular method of classifying students is known commonly as cluster analysis. The variety of techniques that comprise the cluster analytic family are intended to sort observations (for example, students) within a data set into subsets (clusters) that share similar characteristics and differ in meaningful ways from other…
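
    The clustering idea can be sketched with a minimal k-means implementation. The two student features and their values below are hypothetical, and real typological studies use more careful initialization, standardization, and cluster validation:

```python
def kmeans(points, k, iters=20):
    # farthest-point initialization keeps this toy example deterministic
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(
            sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)))
    for _ in range(iters):
        # assign each point to its nearest center
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute centers as group means
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# two well-separated toy groups (e.g. standardized credits earned vs GPA)
data = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0),
        (5.0, 5.1), (5.2, 4.9), (5.1, 5.0)]
centers, groups = kmeans(data, k=2)
```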

  13. Designing Evaluations. 2012 Revision. Applied Research and Methods. GAO-12-208G

    ERIC Educational Resources Information Center

    US Government Accountability Office, 2012

    2012-01-01

    GAO assists congressional decision makers in their deliberations by furnishing them with analytical information on issues and options. Many diverse methodologies are needed to develop sound and timely answers to the questions the Congress asks. To provide GAO evaluators with basic information about the more commonly used methodologies, GAO's…

  14. Beyond single-stream with the Schrödinger method

    NASA Astrophysics Data System (ADS)

    Uhlemann, Cora; Kopp, Michael

    2016-10-01

    We investigate large scale structure formation of collisionless dark matter in the phase space description based on the Vlasov-Poisson equation. We present the Schrödinger method, originally proposed by \\cite{WK93} as numerical technique based on the Schrödinger Poisson equation, as an analytical tool which is superior to the common standard pressureless fluid model. Whereas the dust model fails and develops singularities at shell crossing the Schrödinger method encompasses multi-streaming and even virialization.

  15. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

    PubMed

    Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

    2018-01-01

    Rank Preserving Structural Failure Time models are one of the most commonly used statistical methods to adjust for treatment switching in oncology clinical trials. The method is often applied in a decision analytic model without appropriately accounting for additional uncertainty when determining the allocation of health care resources. The aim of the study is to describe novel approaches to adequately account for uncertainty when using a Rank Preserving Structural Failure Time model in a decision analytic model. Using two examples, we tested and compared the performance of the novel Test-based method with the resampling bootstrap method and with the conventional approach of no adjustment. In the first example, we simulated life expectancy using a simple decision analytic model based on a hypothetical oncology trial with treatment switching. In the second example, we applied the adjustment method on published data when no individual patient data were available. Mean estimates of overall and incremental life expectancy were similar across methods. However, the bootstrapped and test-based estimates consistently produced greater estimates of uncertainty compared with the estimate without any adjustment applied. Similar results were observed when using the test-based approach on published data, showing that failing to adjust for uncertainty led to smaller confidence intervals. Both the bootstrapping and test-based approaches provide a solution to appropriately incorporate uncertainty, with the benefit that the latter can be implemented by researchers in the absence of individual patient data. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
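
    The resampling bootstrap idea referred to above can be sketched as follows. The survival times are invented and the statistic is a simple mean, not the authors' RPSFT-adjusted estimator; the point is that the resampled interval, not just the point estimate, should feed the decision model:

```python
import random

random.seed(1)
survival_months = [4.2, 7.5, 9.1, 12.0, 15.3, 18.8, 22.4, 25.0, 30.1, 36.5]

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI: resample with replacement, recompute stat."""
    reps = sorted(stat([random.choice(data) for _ in data])
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(survival_months, mean)
```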

  16. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
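
    The basic qHNMR quantitation relation — the analyte/standard molar ratio equals the ratio of per-proton integrals, converted to mass via molecular weights — can be sketched numerically. The integral values and the standard's mass and molecular weight below are hypothetical; EGCG's molecular weight (458.4) is used only for illustration:

```python
def qhnmr_mass(I_analyte, n_protons_a, MW_a, I_std, n_protons_std, MW_std, mass_std):
    """Mass of analyte from per-proton integral ratio against a standard."""
    molar_ratio = (I_analyte / n_protons_a) / (I_std / n_protons_std)
    return molar_ratio * (MW_a / MW_std) * mass_std

# EGCG (MW 458.4) against a hypothetical internal standard (MW 204.2)
m = qhnmr_mass(I_analyte=1.50, n_protons_a=1, MW_a=458.4,
               I_std=3.00, n_protons_std=2, MW_std=204.2, mass_std=2.0)
```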

  17. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles using ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Researching Mental Health Disorders in the Era of Social Media: Systematic Review.

    PubMed

    Wongkoblap, Akkapon; Vadillo, Miguel A; Curcin, Vasa

    2017-06-29

    Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorders is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. ©Akkapon Wongkoblap, Miguel A Vadillo, Vasa Curcin. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2017.

  19. Quantification of 4 antidepressants and a metabolite by LC-MS for therapeutic drug monitoring.

    PubMed

    Choong, Eva; Rudaz, Serge; Kottelat, Astrid; Haldemann, Sophie; Guillarme, Davy; Veuthey, Jean-Luc; Eap, Chin B

    2011-06-01

    A liquid chromatography method coupled to mass spectrometry was developed for the quantification of bupropion, its metabolite hydroxy-bupropion, moclobemide, reboxetine and trazodone in human plasma. The validation of the analytical procedure was assessed according to Société Française des Sciences et Techniques Pharmaceutiques and the latest Food and Drug Administration guidelines. The sample preparation was performed with 0.5 mL of plasma extracted on a cation-exchange solid phase 96-well plate. The separation was achieved in 14 min on a C18 XBridge column (2.1 mm×100 mm, 3.5 μm) using a 50 mM ammonium acetate pH 9/acetonitrile mobile phase in gradient mode. The compounds of interest were analysed in the single ion monitoring mode on a single quadrupole mass spectrometer working in positive electrospray ionisation mode. Two ions were selected per molecule to increase the number of identification points and to avoid as much as possible any false positives. Since selectivity is always a critical point for routine therapeutic drug monitoring, more than sixty common comedications for the psychiatric population were tested. For each analyte, the analytical procedure was validated to cover the common range of concentrations measured in plasma samples: 1-400 ng/mL for reboxetine and bupropion, 2-2000 ng/mL for hydroxy-bupropion, moclobemide, and trazodone. For all investigated compounds, reliable performance in terms of accuracy, precision, trueness, recovery, selectivity and stability was obtained. One year after its implementation in a routine process, this method demonstrated a high robustness with accurate values over the wide concentration range commonly observed among a psychiatric population. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities to the enhancement of knowledge and facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. As a result of the void of Earth science data analytics publication material, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of tools and techniques that are available and still needed to support ESDA.

  1. An automated real-time free phenytoin assay to replace the obsolete Abbott TDx method.

    PubMed

    Williams, Christopher; Jones, Richard; Akl, Pascale; Blick, Kenneth

    2014-01-01

    Phenytoin is a commonly used anticonvulsant that is highly protein bound with a narrow therapeutic range. The unbound fraction, free phenytoin (FP), is responsible for pharmacologic effects; therefore, it is essential to measure both FP and total serum phenytoin levels. Historically, the Abbott TDx method has been widely used for the measurement of FP and was the method used in our laboratory. However, the FP TDx assay was recently discontinued by the manufacturer, so we had to develop an alternative methodology. We evaluated the Beckman Coulter DxC800-based FP method for linearity, analytical sensitivity, and precision. The analytical measurement range of the method was 0.41 to 5.30 microg/mL. Within-run and between-run precision studies yielded CVs of 3.8% and 5.5%, respectively. The method compared favorably with the TDx method, yielding the regression equation DxC800 = 0.9 × TDx + 0.10, with r2 = 0.97 (n = 97). The new FP assay appears to be an acceptable alternative to the TDx method.
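
    Within-run CV figures of the kind quoted above are computed from replicate measurements in the usual way; a sketch with hypothetical replicate values (not the study's raw data):

```python
def cv_percent(values):
    """Coefficient of variation (%) from replicate measurements."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

replicates = [2.05, 2.10, 1.98, 2.02, 2.12]   # free phenytoin, microg/mL
cv = cv_percent(replicates)                    # about 2.8%
```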

  2. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  3. A Comparison of Analytical and Data Preprocessing Methods for Spectral Fingerprinting

    PubMed Central

    LUTHRIA, DEVANAND L.; MUKHOPADHYAY, SUDARSAN; LIN, LONG-ZE; HARNLY, JAMES M.

    2013-01-01

    Spectral fingerprinting, as a method of discriminating between plant cultivars and growing treatments for a common set of broccoli samples, was compared for six analytical instruments. Spectra were acquired for finely powdered solid samples using Fourier transform infrared (FT-IR) and Fourier transform near-infrared (NIR) spectrometry. Spectra were also acquired for unfractionated aqueous methanol extracts of the powders using molecular absorption in the ultraviolet (UV) and visible (VIS) regions and mass spectrometry with negative (MS−) and positive (MS+) ionization. The spectra were analyzed using nested one-way analysis of variance (ANOVA) and principal component analysis (PCA) to statistically evaluate the quality of discrimination. All six methods showed statistically significant differences between the cultivars and treatments. The significance of the statistical tests was improved by the judicious selection of spectral regions (IR and NIR), masses (MS+ and MS−), and derivatives (IR, NIR, UV, and VIS). PMID:21352644
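
    The one-way ANOVA used above to test between-group differences can be sketched at a single spectral variable. The absorbance values for the two hypothetical cultivars are invented; a large F statistic indicates the group means differ relative to within-group scatter:

```python
def one_way_anova_F(groups):
    """F = (between-group MS) / (within-group MS) for a list of groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

cultivar_a = [0.52, 0.55, 0.53]   # hypothetical absorbances, cultivar A
cultivar_b = [0.61, 0.63, 0.60]   # hypothetical absorbances, cultivar B
F = one_way_anova_F([cultivar_a, cultivar_b])
```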

  4. Choosing and Using Introns in Molecular Phylogenetics

    PubMed Central

    Creer, Simon

    2007-01-01

    Introns are now commonly used in molecular phylogenetics in an attempt to recover gene trees that are concordant with species trees, but there are a range of genomic, logistical and analytical considerations that are infrequently discussed in empirical studies that utilize intron data. This review outlines expedient approaches for locus selection, overcoming paralogy problems, recombination detection methods and the identification and incorporation of LVHs in molecular systematics. A range of parsimony and Bayesian analytical approaches are also described in order to highlight the methods that can currently be employed to align sequences and treat indels in subsequent analyses. By covering the main points associated with the generation and analysis of intron data, this review aims to provide a comprehensive introduction to using introns (or any non-coding nuclear data partition) in contemporary phylogenetics. PMID:19461984

  5. Correlation of finite element free vibration predictions using random vibration test data. M.S. Thesis - Cleveland State Univ.

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1994-01-01

    Finite element analysis is regularly used during the engineering cycle of mechanical systems to predict the response to static, thermal, and dynamic loads. The finite element model (FEM) used to represent the system is often correlated with physical test results to determine the validity of the analytical results provided. Results from dynamic testing provide one means for performing this correlation. One of the most common methods of measuring accuracy is classical modal testing, whereby vibratory mode shapes are compared to mode shapes provided by finite element analysis. The degree of correlation between the test and analytical mode shapes can be shown mathematically using the cross-orthogonality check. A great deal of time and effort can be expended in generating the set of test-acquired mode shapes needed for the cross-orthogonality check. In most situations, response data from vibration tests are digitally processed to generate the mode shapes from a combination of modal parameters, forcing functions, and recorded response data. An alternate method is proposed in which the same correlation of analytical and test-acquired mode shapes can be achieved without conducting the modal survey. Instead, a procedure is detailed in which a minimum of test information, specifically the acceleration response data from a random vibration test, is used to generate a set of equivalent local accelerations to be applied to the reduced analytical model at discrete points corresponding to the test measurement locations. The static solution of the analytical model then produces a set of deformations that, once normalized, can be used to represent the test-acquired mode shapes in the cross-orthogonality relation. The method proposed has been shown to provide accurate results for both a simple analytical model and a complex space flight structure.
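
    The cross-orthogonality check multiplies the test and FEM modal matrices through the mass matrix; for mass-normalized modes the result should approach the identity matrix (near-unity diagonal, near-zero off-diagonal). A toy 2-DOF sketch with hypothetical numbers, assuming a perfect test/analysis match:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

M = [[2.0, 0.0], [0.0, 1.0]]                       # diagonal mass matrix
phi_fem = [[0.5, 0.5],                             # columns = mode shapes,
           [0.70710678, -0.70710678]]              # mass-normalized
phi_test = phi_fem                                 # perfect match assumed

# cross-orthogonality matrix: phi_test^T * M * phi_fem
xor = matmul(transpose(phi_test), matmul(M, phi_fem))
```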

  6. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

    The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil-gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the use of cross-references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  7. Collaborative derivation of reference intervals for major clinical laboratory tests in Japan.

    PubMed

    Ichihara, Kiyoshi; Yomamoto, Yoshikazu; Hotta, Taeko; Hosogaya, Shigemi; Miyachi, Hayato; Itoh, Yoshihisa; Ishibashi, Midori; Kang, Dongchon

    2016-05-01

    Three multicentre studies of reference intervals were conducted recently in Japan. The Committee on Common Reference Intervals of the Japan Society of Clinical Chemistry sought to establish common reference intervals for 40 laboratory tests which were measured in common in the three studies and regarded as well harmonized in Japan. The study protocols were comparable with recruitment mostly from hospital workers with body mass index ≤28 and no medications. Age and sex distributions were made equal to obtain a final data size of 6345 individuals. Between-subgroup differences were expressed as the SD ratio (between-subgroup SD divided by SD representing the reference interval). Between-study differences were all within acceptable levels, and thus the three datasets were merged. By adopting SD ratio ≥0.50 as a guide, sex-specific reference intervals were necessary for 12 assays. Age-specific reference intervals for females partitioned at age 45 were required for five analytes. The reference intervals derived by the parametric method resulted in appreciable narrowing of the ranges by applying the latent abnormal values exclusion method in 10 items which were closely associated with prevalent disorders among healthy individuals. Sex- and age-related profiles of reference values, derived from individuals with no abnormal results in major tests, showed peculiar patterns specific to each analyte. Common reference intervals for nationwide use were developed for 40 major tests, based on three multicentre studies by advanced statistical methods. Sex- and age-related profiles of reference values are of great relevance not only for interpreting test results, but for applying clinical decision limits specified in various clinical guidelines. © The Author(s) 2015.
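
    The SD-ratio criterion described above can be sketched numerically. The subgroup means and reference SD below are hypothetical, and the between-subgroup SD here is taken as the simple population SD of the subgroup means, which is only a sketch of the study's statistical treatment:

```python
def sd(xs):
    """Population standard deviation."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def sd_ratio(subgroup_means, reference_sd):
    """Between-subgroup SD divided by the SD representing the reference interval."""
    return sd(subgroup_means) / reference_sd

# e.g. male/female means of a hypothetical analyte, reference SD = 8.0
ratio = sd_ratio([48.0, 56.0], reference_sd=8.0)
needs_partition = ratio >= 0.50    # guide used in the study
```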

  8. Clustering, Seriation, and Subset Extraction of Confusion Data

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Steinley, Douglas

    2006-01-01

    The study of confusion data is a well established practice in psychology. Although many types of analytical approaches for confusion data are available, among the most common methods are the extraction of 1 or more subsets of stimuli, the partitioning of the complete stimulus set into distinct groups, and the ordering of the stimulus set. Although…

  9. Analytical method for determining rill detachment of purple soil as compared with that of loess soil

    USDA-ARS?s Scientific Manuscript database

    Rills are commonly found on sloping farmlands in both the loess and purple soil regions of China. Rill erosion is an important component of slope water erosion, and primary sediment sources in small catchments in the areas. A comparative study on rill erosion on loess and purple soils is important t...

  10. Do placebo based validation standards mimic real batch products behaviour? Case studies.

    PubMed

    Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E

    2011-06-01

    Analytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step concerns the transferability of quantitative performance from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels, as well as using samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were varying degrees of difference in the accuracy of results obtained with the two types of samples. Nonetheless, the spiked placebo validation standards were shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples to the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Analytical interference of HBOC-201 (Hemopure, a synthetic hemoglobin-based oxygen carrier) on four common clinical chemistry platforms.

    PubMed

    Korte, Erik A; Pozzi, Nicole; Wardrip, Nina; Ayyoubi, M Tayyeb; Jortani, Saeed A

    2018-07-01

    There are 13 million blood transfusions each year in the US. Limitations in the donor pool, storage capabilities, mass casualties, access in remote locations and reactivity of donors all limit the availability of transfusable blood products to patients. HBOC-201 (Hemopure®) is a second-generation glutaraldehyde polymer of bovine hemoglobin that can serve as an "oxygen bridge" to maintain oxygen-carrying capacity while transfusion products are unavailable. Hemopure offers the advantages of extended shelf life, ambient storage, and limited reactive potential, but its extracellular location can also cause significant interference in modern laboratory analyzers, similar to severe hemolysis. Observed error in 26 commonly measured analytes was determined on 4 different analytical platforms in plasma from a patient therapeutically transfused with Hemopure, as well as in donor blood spiked with Hemopure at a level equivalent to the therapeutic loading dose (10% v/v). Significant negative error ratios >50% of the total allowable error (>0.5tAE) were reported in 23/104 assays (22.1%), positive bias of >0.5tAE in 26/104 assays (25.0%), and acceptable bias between -0.5tAE and 0.5tAE in 44/104 (42.3%). Analysis failed in the presence of Hemopure in 11/104 (10.6%). Observed error is further subdivided by platform, wavelength, dilution and reaction method. Administration of Hemopure (or other hemoglobin-based oxygen carriers) presents a challenge to laboratorians tasked with analyzing patient specimens. We provide laboratorians with a reference to evaluate patient samples, select optimal analytical platforms for specific analytes, and predict possible bias beyond the 4 analytical platforms included in this study. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review aims to provide pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used, principal component analysis and partial least squares regression, along with their advantages, common pitfalls and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
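
    To give a concrete flavour of the first of the two methods named above, the leading principal component of two-variable data can be computed analytically from the 2×2 covariance matrix. A minimal sketch in Python (function name hypothetical; real chemometrics work would use a dedicated package and many more variables):

```python
import math

def pca_2d(data):
    """Leading principal component of two-variable data, via the
    analytic eigendecomposition of the 2x2 covariance matrix.
    Returns (largest eigenvalue, unit loading vector)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)   # largest eigenvalue
    vx, vy = sxy, lam - sxx                       # eigenvector for lam
    norm = math.hypot(vx, vy)
    if norm == 0.0:  # axis-aligned data: covariance already diagonal
        return lam, (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    return lam, (vx / norm, vy / norm)
```

    For process spectra with hundreds of wavelengths the same idea generalizes to high-dimensional covariance matrices, which is what PCA-based PAT software computes.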

  13. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  14. An assessment of the Nguyen and Pinder method for slug test analysis. [In situ estimates of ground water contamination]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, J.J. Jr.; Hyder, Z.

    The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.

  15. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    NASA Astrophysics Data System (ADS)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been a major contributor to premature motor failures, bearing deterioration, shaft voltage build-up and electromagnetic interference. Intelligent control methods like Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions, like passive filters, shielded cables and EMI filters, add to the volume and cost of the entire system. Smart SVPWM techniques therefore come with a very important advantage: they are an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltage and other advanced features. The conventional indirect space vector pulse-width modulation (SVPWM) method of controlling matrix converters involves the use of two adjacent active vectors and one zero vector for both the rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited for operation of the converter in different ranges of modulation indices for varying machine speeds. This results in lower common mode voltage and improves the harmonic spectrum of the output voltage, without increasing the number of switching transitions compared to conventional modulation. In summary, the responsibility of formulating output voltages with a particular magnitude and frequency has been transferred solely to the rectifying stage of the IMC. Estimation of the degree of distortion in the three-phase output voltage is another facet discussed in this thesis.
An understanding of the SVPWM technique and the switching sequence of the space vectors in detail gives the potential to estimate the RMS value of the switched output voltage of any converter, which in turn aids the sizing and design of output passive filters. An analytical estimation method is presented to achieve this purpose for an IMC. Knowledge of the fundamental component of the output voltage can be used to calculate its Total Harmonic Distortion (THD). The effectiveness of the proposed SVPWM algorithms and the analytical estimation technique is substantiated by simulations in MATLAB/Simulink and experiments on a laboratory prototype of the IMC. Comparison plots are provided to contrast the performance of the proposed methods with the conventional SVPWM method. The behavior of output voltage distortion and CMV with variation in operating parameters such as modulation index and output frequency is also analyzed.
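
    The THD calculation mentioned above follows from a standard relation between a waveform's total RMS value and the RMS of its fundamental; a minimal sketch in Python (function name hypothetical), not the thesis's full switching-sequence-based estimation:

```python
import math

def thd_from_rms(v_rms_total, v1_rms):
    """Total harmonic distortion of a switched output voltage given
    its overall RMS value and the RMS of its fundamental component:
    THD = sqrt(Vrms^2 - V1^2) / V1."""
    if not 0 < v1_rms <= v_rms_total:
        raise ValueError("fundamental RMS must be positive and not exceed total RMS")
    return math.sqrt(v_rms_total ** 2 - v1_rms ** 2) / v1_rms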

  16. In situ ionic liquid dispersive liquid-liquid microextraction and direct microvial insert thermal desorption for gas chromatographic determination of bisphenol compounds.

    PubMed

    Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel

    2016-01-01

    A new procedure based on direct insert microvial thermal desorption injection allows the direct analysis of ionic liquid extracts by gas chromatography-mass spectrometry (GC-MS). For this purpose, an in situ ionic liquid dispersive liquid-liquid microextraction (in situ IL DLLME) has been developed for the quantification of bisphenol A (BPA), bisphenol Z (BPZ) and bisphenol F (BPF). Different parameters affecting the extraction efficiency of the microextraction technique and the thermal desorption step were studied. The optimized procedure, determining the analytes as acetyl derivatives, provided detection limits of 26, 18 and 19 ng L⁻¹ for BPA, BPZ and BPF, respectively. The release of the three analytes from plastic containers was monitored using this newly developed analytical method. Analysis of the migration test solutions for 15 different plastic containers in daily use identified the presence of the analytes at concentrations ranging between 0.07 and 37 μg L⁻¹ in six of the samples studied, BPA being the most commonly found and at higher concentrations than the other analytes.

  17. Methods to assess bioavailability of hydrophobic organic contaminants: Principles, operations, and limitations.

    PubMed

    Cui, Xinyi; Mayer, Philipp; Gan, Jay

    2013-01-01

    Many important environmental contaminants are hydrophobic organic contaminants (HOCs), which include PCBs, PAHs, PBDEs, DDT and other chlorinated insecticides, among others. Owing to their strong hydrophobicity, HOCs have their final destination in soil or sediment, where their ecotoxicological effects are closely regulated by sorption and thus bioavailability. The last two decades have seen a dramatic increase in research efforts in developing and applying partitioning based methods and biomimetic extractions for measuring HOC bioavailability. However, the many variations of both analytical methods and associated measurement endpoints are often a source of confusion for users. In this review, we distinguish the most commonly used analytical approaches based on their measurement objectives, and illustrate their practical operational steps, strengths and limitations using simple flowcharts. This review may serve as guidance for new users on the selection and use of established methods, and a reference for experienced investigators to identify potential topics for further research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Sex-specific reference intervals of hematologic and biochemical analytes in Sprague-Dawley rats using the nonparametric rank percentile method.

    PubMed

    He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping

    2017-01-01

    Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine treatment-induced effects and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have neither used an internationally recommended statistical method nor been stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to Clinical and Laboratory Standards Institute guideline C28-A3 and the American Society for Veterinary Clinical Pathology guidelines. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer and 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using the nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol and urea compared to males. Sex partitioning was required for most hematologic and biochemical analytes in Sprague-Dawley rats.
We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally derived, sex-specific reference intervals allows a more precise evaluation of animal quality and of experimental results for Sprague-Dawley rats in our toxicology safety assessments.
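
    The nonparametric rank percentile estimate used above can be sketched as follows; a minimal illustration in Python assuming the common rank formula r = p(n+1) with linear interpolation (function names hypothetical; the guideline's 90% confidence intervals on the limits are not shown):

```python
def rank_percentile(sorted_values, p):
    """Nonparametric percentile estimate: rank r = p * (n + 1), with
    linear interpolation between neighbouring order statistics and
    clamping at the extremes."""
    n = len(sorted_values)
    r = p * (n + 1)
    if r <= 1:
        return sorted_values[0]
    if r >= n:
        return sorted_values[-1]
    k = int(r)  # lower of the two neighbouring ranks
    frac = r - k
    return sorted_values[k - 1] + frac * (sorted_values[k] - sorted_values[k - 1])

def reference_interval(values):
    """Central 95% reference interval (2.5th and 97.5th percentiles)."""
    s = sorted(values)
    return rank_percentile(s, 0.025), rank_percentile(s, 0.975)
```

    With 250 animals per sex, the 2.5th-percentile rank falls between the 6th and 7th ordered observations, which is why rank-based methods need the large sample sizes this study used.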

  19. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  20. A highly sensitive method for the simultaneous UHPLC-MS/MS analysis of clonidine, morphine, midazolam and their metabolites in blood plasma using HFIP as the eluent additive.

    PubMed

    Veigure, Rūta; Aro, Rudolf; Metsvaht, Tuuli; Standing, Joseph F; Lutsar, Irja; Herodes, Koit; Kipper, Karin

    2017-05-01

    In intensive care units, the precise administration of sedatives and analgesics is crucial in order to avoid under- or over-sedation and to ensure appropriate pain control. Both can be harmful to the patient, causing side effects or pain and suffering. This is especially important in the case of pediatric patients, and dose-response relationships require studies using pharmacokinetic-pharmacodynamic modeling. The aim of this work was to develop and validate a rapid ultra-high performance liquid chromatographic-tandem mass spectrometric method for the analysis of three common sedative and analgesic agents: morphine, clonidine and midazolam, and their metabolites (morphine-3-glucuronide, morphine-6-glucuronide and 1'-hydroxymidazolam) in blood plasma at trace-level concentrations. Low concentrations and low sampling volumes may be expected in pediatric patients; we report the lowest limit of quantification for all analytes as 0.05 ng/mL using only 100 μL of blood plasma. The analytes were separated chromatographically on a C18 column using the weak ion-pairing additive 1,1,1,3,3,3-hexafluoro-2-propanol and methanol. The method was fully validated, and a matrix-matched calibration range of 0.05-250 ng/mL was attained for all analytes. In addition, between-day accuracy for all analytes remained within 93-108%, and precision remained within 1.5-9.6% at all concentration levels over the calibration range. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  1. Teaching dermatoscopy of pigmented skin tumours to novices: comparison of analytic vs. heuristic approach.

    PubMed

    Tschandl, P; Kittler, H; Schmid, K; Zalaudek, I; Argenziano, G

    2015-06-01

    There are two strategies for approaching the dermatoscopic diagnosis of pigmented skin tumours, namely the verbal-based analytic method and the more visual-global heuristic method. It is not known whether one or the other is more efficient in teaching dermatoscopy. To compare the two teaching methods in short-term training of dermatoscopy for medical students, fifty-seven medical students in the last year of the curriculum were given a 1-h lecture based on either the heuristic or the analytic approach to dermatoscopy. Before and after this session, they were shown the same 50 lesions and asked to diagnose them and rate the chance of malignancy. Test lesions consisted of melanomas, basal cell carcinomas, nevi, seborrhoeic keratoses, benign vascular tumours and dermatofibromas. Performance measures were diagnostic accuracy regarding malignancy, as measured by the area under the receiver operating characteristic curve (range: 0-1), as well as per cent correct diagnoses (range: 0-100%). Diagnostic accuracy and per cent correct diagnoses increased by +0.21 and +32.9% (heuristic teaching) and +0.19 and +35.7% (analytic teaching), respectively (P for all <0.001). Neither for diagnostic accuracy (P = 0.585) nor for per cent correct diagnoses (P = 0.298) was there a difference between the two groups. Short-term training of dermatoscopy for medical students allows significant improvement in diagnostic abilities. Choosing a heuristic or analytic method does not influence this effect in short training using common pigmented skin lesions. © 2014 European Academy of Dermatology and Venereology.
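
    The area-under-the-curve accuracy measure reported above has a simple rank interpretation: it is the probability that a randomly chosen malignant lesion receives a higher malignancy rating than a randomly chosen benign one. A minimal empirical sketch in Python (function name hypothetical):

```python
def empirical_auc(malignant_ratings, benign_ratings):
    """Empirical AUC: fraction of (malignant, benign) pairs in which
    the malignant lesion is rated higher; ties count one half."""
    wins = 0.0
    for m in malignant_ratings:
        for b in benign_ratings:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(malignant_ratings) * len(benign_ratings))
```

    Perfect separation of malignant from benign ratings gives 1.0; ratings carrying no information give 0.5, which anchors the 0-1 range quoted in the abstract.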

  2. A global multicenter study on reference values: 1. Assessment of methods for derivation and comparison of reference intervals.

    PubMed

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Qiu, Ling; Erasmus, Rajiv; Borai, Anwar; Evgina, Svetlana; Ashavaid, Tester; Khan, Dilshad; Schreier, Laura; Rolle, Reynan; Shimizu, Yoshihisa; Kimura, Shogo; Kawano, Reo; Armbruster, David; Mori, Kazuo; Yadav, Binod K

    2017-04-01

    The IFCC Committee on Reference Intervals and Decision Limits coordinated a global multicenter study on reference values (RVs) to explore rational and harmonizable procedures for the derivation of reference intervals (RIs) and to investigate the feasibility of sharing RIs through evaluation of sources of variation of RVs on a global scale. For the common protocol, rather lenient criteria for reference individuals were adopted to facilitate harmonized recruitment, with planned use of the latent abnormal values exclusion (LAVE) method. As of July 2015, 12 countries had completed their studies with a total recruitment of 13,386 healthy adults; 25 analytes were measured chemically and 25 immunologically. A serum panel with assigned values was measured by all laboratories. RIs were derived by parametric and nonparametric methods. The effect of the LAVE method is prominent in analytes that reflect nutritional status, inflammation and muscular exertion, indicating that inappropriate results are frequent in any country. The validity of the parametric method was confirmed by the presence of analyte-specific distribution patterns and successful Gaussian transformation using the modified Box-Cox formula in all countries. After successful alignment of RVs based on the panel test results, nearly half the analytes showed variable degrees of between-country differences. This finding, however, requires confirmation after adjusting for BMI and other sources of variation; the results are reported in the second part of this paper. The collaborative study enabled us to evaluate rational methods for deriving RIs and for comparing RVs based on real-world datasets obtained in a harmonized manner. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
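
    The parametric derivation referred to above transforms reference values toward a Gaussian shape before taking mean ± 1.96 SD and back-transforming. A minimal sketch in Python using a two-parameter power transform of the Box-Cox family (the study's modified formula and its fitted parameters are not reproduced here; function names and the fixed λ are illustrative):

```python
import math
import statistics

def power_transform(x, lam, shift=0.0):
    """Box-Cox-style transform ((x + a)^lam - 1) / lam; log when lam = 0."""
    y = x + shift
    return math.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def power_inverse(z, lam, shift=0.0):
    """Inverse of power_transform."""
    y = math.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)
    return y - shift

def parametric_ri(values, lam, shift=0.0):
    """Central 95% reference interval via mean +/- 1.96 SD in the
    transformed (near-Gaussian) scale, back-transformed."""
    t = [power_transform(v, lam, shift) for v in values]
    m, sd = statistics.fmean(t), statistics.stdev(t)
    return (power_inverse(m - 1.96 * sd, lam, shift),
            power_inverse(m + 1.96 * sd, lam, shift))
```

    In practice λ (and the shift) are fitted to maximize Gaussianity of the transformed values; with λ = 1 the transform is linear and the interval reduces to the familiar mean ± 1.96 SD.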

  3. Simultaneous analysis of historical, emerging and novel brominated flame retardants in food and feed using a common extraction and purification method.

    PubMed

    Bichon, Emmanuelle; Guiffard, Ingrid; Vénisseau, Anaïs; Lesquin, Elodie; Vaccher, Vincent; Marchand, Philippe; Le Bizec, Bruno

    2018-08-01

    Brominated Flame Retardants (BFRs) are still widely used for industrial purposes. These contaminants may enter the food chain, where they mainly occur in food of animal origin. The aim of our work was to provide a single method able to quantify the widest range of BFRs in feed and food items. After freeze-drying and grinding, a pressurized liquid extraction was carried out. The extract was purified on acidified silica, Florisil® and carbon columns, and the four separated fractions were analyzed by gas and liquid chromatography coupled to high resolution and tandem mass spectrometry. Isotopic dilution was used preferentially when commercial labelled compounds were available. Analytical sensitivity was in accordance with the expectations of Recommendation 2014/118/EU for PBDEs, HBCDDs, TBBPA, TBBPA-bME, EHTBB and BEHTEBP. Additional BFRs were included in this analytical method with the same level of performance (LOQs below 0.01 ng g⁻¹ ww): PBBs, pTBX, TBCT, PBBz, PBT, PBEB, HBBz, BTBPE, OBIND and T23BPIC. However, some of the BFRs listed in Recommendation 2014/118/EU are not yet covered by our analytical method, i.e. TBBPA-bOHEE, TBBPA-bAE, TBBPA-bGE, TBBPA-bDiBPrE, TBBPS, TBBPS-bME, TDBPP, EBTEBPI, HBCYD and DBNPG. Measurement uncertainty was fully calculated for 21 of the 31 analytes monitored in the method; reproducibility uncertainty was below 23% in isotopic dilution. Certified reference materials are now required to better characterize the trueness of this method, which has been applied in the French National Control Plans. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Lagrangian based methods for coherent structure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.

  5. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  6. Modeling systematic errors: polychromatic sources of Beer-Lambert deviations in HPLC/UV and nonchromatographic spectrophotometric assays.

    PubMed

    Galli, C

    2001-07-01

    It is well established that the use of polychromatic radiation in spectrophotometric assays leads to excursions from the Beer-Lambert limit. This Note models the resulting systematic error as a function of assay spectral width, slope of the molecular extinction coefficient, and analyte concentration. The theoretical calculations are compared with recent experimental results, and a parameter is introduced that can be used to estimate the magnitude of the systematic error in both chromatographic and nonchromatographic spectrophotometric assays. It is important to realize that the polychromatic radiation employed in common laboratory equipment can yield assay errors up to approximately 4%, even at absorption levels generally considered 'safe' (i.e. absorbance <1). Thus careful consideration of instrumental spectral width, analyte concentration, and the slope of the molecular extinction coefficient is required to ensure robust analytical methods.
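
    The deviation modeled in this Note arises because the detector averages transmitted intensity over the spectral window before the logarithm is taken. A minimal numeric sketch in Python (uniform source intensity across the band is an assumption; the function name is hypothetical):

```python
import math

def apparent_absorbance(conc, band_eps, path=1.0):
    """Observed absorbance for a polychromatic band: transmitted
    intensity is summed over wavelengths first, then -log10 is taken.
    band_eps: extinction coefficients across the spectral window,
    assuming equal source intensity at each wavelength."""
    mean_t = sum(10.0 ** (-eps * path * conc) for eps in band_eps) / len(band_eps)
    return -math.log10(mean_t)

# With a sloped extinction coefficient across the band, the apparent
# absorbance falls below the monochromatic Beer-Lambert prediction
# for the mean absorptivity, and the shortfall grows with concentration.
observed = apparent_absorbance(1.0, [0.5, 1.5])
```

    A single-wavelength band recovers the ideal linear law exactly, which is why narrowing the instrumental spectral width suppresses the error discussed above.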

  7. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most common analytical procedure described for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  8. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    PubMed Central

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound. PMID:28116217

  9. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), covering technical risks as well as risks related to human failure. An FMEA team broke the NIR analytical method down into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPN = O x D x S). Failure modes with the highest RPN scores were subjected to corrective actions, and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices of up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
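
    The RPN arithmetic from the abstract is straightforward to reproduce. A minimal sketch with hypothetical failure modes; the process steps and the O, D, and S scores below are invented for illustration, not taken from the study:

```python
# Hypothetical failure modes for an NIR screening procedure; O (occurrence),
# D (detectability) and S (severity) are each scored on a 1-10 scale.
failure_modes = [
    {"step": "sample presentation",  "mode": "vial mislabelled",   "O": 4, "D": 7, "S": 9},
    {"step": "spectrum acquisition", "mode": "probe not seated",   "O": 6, "D": 3, "S": 5},
    {"step": "library matching",     "mode": "outdated reference", "O": 2, "D": 8, "S": 8},
]

# Risk Priority Number for each mode: RPN = O x D x S
for fm in failure_modes:
    fm["RPN"] = fm["O"] * fm["D"] * fm["S"]

# Highest-risk modes first: these are the candidates for corrective action,
# after which the FMEA is repeated and an improvement index
# (RPN before / RPN after) can be computed.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
```

With these invented scores the mislabelled-vial mode (RPN = 4 x 7 x 9 = 252) would be addressed first.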

  10. Toxin Detection by Surface Plasmon Resonance

    PubMed Central

    Hodnik, Vesna; Anderluh, Gregor

    2009-01-01

    Significant efforts have been invested in recent years in the development of analytical methods for fast toxin detection in food and water. Immunochemical methods such as ELISA, together with spectroscopy and chromatography, are the most widely used for toxin detection. Different techniques have also been coupled, e.g., liquid chromatography with mass spectrometry (LC-MS), in order to detect concentrations as low as possible. Surface plasmon resonance (SPR) is one of the newer biophysical methods that enables rapid toxin detection. Moreover, this method has already been incorporated into portable sensors for on-site determinations. In this paper we describe some of the most common methods for toxin detection, with an emphasis on SPR. PMID:22573957

  11. Determination of mycotoxins in foods: current state of analytical methods and limitations.

    PubMed

    Köppen, Robert; Koch, Matthias; Siegel, David; Merkel, Stefan; Maul, Ronald; Nehls, Irene

    2010-05-01

    Mycotoxins are natural contaminants produced by a range of fungal species. Their common occurrence in food and feed poses a threat to the health of humans and animals. This threat is caused either by the direct contamination of agricultural commodities or by a "carry-over" of mycotoxins and their metabolites into animal tissues, milk, and eggs after feeding of contaminated hay or corn. As a consequence of their diverse chemical structures and varying physical properties, mycotoxins exhibit a wide range of biological effects. Individual mycotoxins can be genotoxic, mutagenic, carcinogenic, teratogenic, and oestrogenic. To protect consumer health and to reduce economic losses, surveillance and control of mycotoxins in food and feed has become a major objective for producers, regulatory authorities and researchers worldwide. However, the variety of chemical structures makes it impossible to use one single technique for mycotoxin analysis. Hence, a vast number of analytical methods has been developed and validated. The heterogeneity of food matrices combined with the demand for a fast, simultaneous and accurate determination of multiple mycotoxins creates enormous challenges for routine analysis. The most crucial issues will be discussed in this review. These are (1) the collection of representative samples, (2) the performance of classical and emerging analytical methods based on chromatographic or immunochemical techniques, (3) the validation of official methods for enforcement, and (4) the limitations and future prospects of the current methods.

  12. Pooling sheep faecal samples for the assessment of anthelmintic drug efficacy using McMaster and Mini-FLOTAC in gastrointestinal strongyle and Nematodirus infection.

    PubMed

    Kenyon, Fiona; Rinaldi, Laura; McBean, Dave; Pepe, Paola; Bosco, Antonio; Melville, Lynsey; Devin, Leigh; Mitchell, Gillian; Ianniello, Davide; Charlier, Johannes; Vercruysse, Jozef; Cringoli, Giuseppe; Levecke, Bruno

    2016-07-30

    In small ruminants, faecal egg counts (FECs) and the reduction in FECs (FECR) are the most common methods for assessing the intensity of gastrointestinal (GI) nematode infections and anthelmintic drug efficacy, respectively. The main limitation of these methods is the time and cost of conducting FECs on a representative number of individual animals. A cost-saving alternative would be to examine pooled faecal samples; however, little is known about whether pooling gives representative results. In the present study, we compared the FECR results obtained by an individual and a pooled examination strategy across different pool sizes and analytical sensitivities of the FEC techniques. A survey was conducted on 5 sheep farms in Scotland, where anthelmintic resistance is known to be widespread. Lambs were treated with fenbendazole (4 groups), levamisole (3 groups), ivermectin (3 groups) or moxidectin (1 group). For each group, individual faecal samples were collected from 20 animals at baseline (D0) and 14 days after (D14) anthelmintic administration. Faecal samples were analyzed as pools of 3-5, 6-10, and 14-20 individual samples. Both individual and pooled samples were screened for GI strongyle and Nematodirus eggs using two FEC techniques with three different levels of analytical sensitivity: Mini-FLOTAC (analytical sensitivity of 10 eggs per gram of faeces (EPG)) and McMaster (analytical sensitivity of 15 or 50 EPG). For both Mini-FLOTAC and McMaster (analytical sensitivity of 15 EPG), there was perfect agreement in classifying the efficacy of the anthelmintic as 'normal', 'doubtful' or 'reduced' regardless of pool size. When using the McMaster method (analytical sensitivity of 50 EPG), anthelmintic efficacy was often falsely classified as 'normal', or assessment was not possible due to zero FECs at D0, and this became more pronounced as the pool size increased. In conclusion, pooling ovine faecal samples holds promise as a cost-saving and efficient strategy for assessing GI nematode FECR. However, for the assessment of FECR one will need to consider the baseline FEC, the pool size and the analytical sensitivity of the method. Copyright © 2016. Published by Elsevier B.V.
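
    The FECR statistic underlying the study can be sketched as follows. The egg counts and the 'normal'/'doubtful'/'reduced' cut-offs below are illustrative assumptions, not the paper's data or its exact classification criteria:

```python
def fecr(pre_epg, post_epg):
    """Percentage reduction in mean faecal egg count after treatment (D0 vs D14)."""
    pre = sum(pre_epg) / len(pre_epg)
    post = sum(post_epg) / len(post_epg)
    return 100.0 * (1.0 - post / pre)

def classify(reduction, normal=95.0, doubtful=90.0):
    # Illustrative cut-offs; the study's classification criteria may differ.
    if reduction >= normal:
        return "normal"
    if reduction >= doubtful:
        return "doubtful"
    return "reduced"

pre_counts = [450, 300, 600, 150, 500]      # hypothetical D0 counts, EPG
post_counts = [30, 0, 60, 10, 20]           # hypothetical D14 counts, EPG
reduction = fecr(pre_counts, post_counts)   # 94.0% -> "doubtful"
```

Counts below a technique's analytical sensitivity register as zero, which is why the 50-EPG McMaster variant could misclassify efficacy or fail entirely at low baseline counts.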

  13. Kinetic Titration Series with Biolayer Interferometry

    PubMed Central

    Frenzel, Daniel; Willbold, Dieter

    2014-01-01

    Biolayer interferometry is a method for analyzing protein interactions in real time. In this study, we illustrate its usefulness for the quantitative analysis of high-affinity protein-ligand interactions, employing a kinetic titration series to characterize two pairs of interaction partners: immunoglobulin G and protein G B1, and scFv IC16 and amyloid beta (1–42). Kinetic titration series are commonly used in surface plasmon resonance and involve sequential injections of analyte over a desired concentration range on a single ligand-coated sensor chip, without waiting for complete dissociation between the injections. We show that applying this method to biolayer interferometry is straightforward and i) circumvents problems in data evaluation caused by unavoidable sensor differences, ii) saves resources and iii) increases throughput when screening a multitude of different analyte/ligand combinations. PMID:25229647

  14. Kinetic titration series with biolayer interferometry.

    PubMed

    Frenzel, Daniel; Willbold, Dieter

    2014-01-01

    Biolayer interferometry is a method for analyzing protein interactions in real time. In this study, we illustrate its usefulness for the quantitative analysis of high-affinity protein-ligand interactions, employing a kinetic titration series to characterize two pairs of interaction partners: immunoglobulin G and protein G B1, and scFv IC16 and amyloid beta (1-42). Kinetic titration series are commonly used in surface plasmon resonance and involve sequential injections of analyte over a desired concentration range on a single ligand-coated sensor chip, without waiting for complete dissociation between the injections. We show that applying this method to biolayer interferometry is straightforward and i) circumvents problems in data evaluation caused by unavoidable sensor differences, ii) saves resources and iii) increases throughput when screening a multitude of different analyte/ligand combinations.
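
    The staircase signal of a kinetic titration can be simulated with the standard 1:1 (Langmuir) binding model, integrating each constant-concentration step analytically and carrying the accumulated response into the next injection. The rate constants and concentrations below are assumptions for illustration, not values from the study:

```python
import math

def response_step(R0, C, ka, kd, Rmax, t):
    """1:1 binding response after time t at constant analyte concentration C,
    starting from response R0 (no regeneration between steps)."""
    if C > 0:
        k_obs = ka * C + kd
        R_eq = ka * C * Rmax / k_obs   # plateau response for this concentration
    else:
        k_obs, R_eq = kd, 0.0          # pure dissociation phase
    return R_eq + (R0 - R_eq) * math.exp(-k_obs * t)

ka, kd, Rmax = 1.0e5, 1.0e-3, 1.0      # assumed: M^-1 s^-1, s^-1, sensor capacity
R, responses = 0.0, []
for C in [1e-9, 3e-9, 1e-8, 3e-8, 1e-7]:   # sequential injections, no regeneration
    R = response_step(R, C, ka, kd, Rmax, t=120.0)
    responses.append(R)
```

Fitting ka and kd to such a staircase is what kinetic-titration evaluation software does; this sketch only shows why the signal steps upward with each injection on the same sensor.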

  15. Problems and solutions of polyethylene glycol co-injection method in multiresidue pesticide analysis by gas chromatography-mass spectrometry: evaluation of instability phenomenon in type II pyrethroids and its suppression by novel analyte protectants.

    PubMed

    Akutsu, Kazuhiko; Kitagawa, Yoko; Yoshimitsu, Masato; Takatori, Satoshi; Fukui, Naoki; Osakada, Masakazu; Uchida, Kotaro; Azuma, Emiko; Kajimura, Keiji

    2018-05-01

    Polyethylene glycol 300 is commonly used as a base material for "analyte protection" in multiresidue pesticide analysis by gas chromatography-mass spectrometry. However, the disadvantage of the co-injection method using polyethylene glycol 300 is that it causes peak instability in α-cyano pyrethroids (type II pyrethroids) such as fluvalinate. In this study, we confirmed this instability phenomenon in type II pyrethroids and developed novel analyte protectants for acetone/n-hexane solutions to suppress it. Our findings revealed that among the examined additive compounds, three lipophilic ascorbic acid derivatives, 3-O-ethyl-L-ascorbic acid, 6-O-palmitoyl-L-ascorbic acid, and 6-O-stearoyl-L-ascorbic acid, could effectively stabilize the type II pyrethroids in the presence of polyethylene glycol 300. A mixture of the three ascorbic acid derivatives and polyethylene glycol 300 proved to be an effective analyte protectant for multiresidue pesticide analysis. Further, we designed and evaluated a new combination of analyte protectant compounds that avoids polyethylene glycol and the troublesome hydrophilic compounds altogether. Consequently, we obtained a set of 10 medium- and long-chain saturated fatty acids as an effective analyte protectant suitable for acetone/n-hexane solutions that did not cause peak instability in type II pyrethroids. These analyte protectants will be useful in multiresidue pesticide analysis by gas chromatography-mass spectrometry in terms of ruggedness and quantitative reliability. Graphical abstract: Comparison of the effectiveness of adding lipophilic ascorbic acid derivatives in controlling the instability of fluvalinate in the presence of polyethylene glycol 300.

  16. Alkaloids Profiling of Fumaria capreolata by Analytical Platforms Based on the Hyphenation of Gas Chromatography and Liquid Chromatography with Quadrupole-Time-of-Flight Mass Spectrometry.

    PubMed

    Contreras, María Del Mar; Bribi, Noureddine; Gómez-Caravaca, Ana María; Gálvez, Julio; Segura-Carretero, Antonio

    2017-01-01

    Two analytical platforms, gas chromatography (GC) coupled to quadrupole-time-of-flight (QTOF) mass spectrometry (MS) and reversed-phase ultrahigh-performance liquid chromatography (UHPLC) coupled to diode array (DAD) and QTOF detection, were applied to study the alkaloid profile of Fumaria capreolata. These mass analyzers enabled tentative identification of the alkaloids by matching their accurate mass signals and suggested molecular formulae with those previously reported in libraries and databases. Moreover, the proposed structures were corroborated by studying the fragmentation patterns obtained on both platforms. In this way, 8 and 26 isoquinoline alkaloids were characterized using GC-QTOF-MS and RP-UHPLC-DAD-QTOF-MS, respectively, belonging to the following subclasses: protoberberine, protopine, aporphine, benzophenanthridine, spirobenzylisoquinoline, morphinandienone, and benzylisoquinoline. The latter analytical method was also selected to determine the concentration of protopine (9.6 ± 0.7 mg/g), a potential active compound of the extract, at 280 nm. In conclusion, although GC-MS has been commonly used for the analysis of this type of phytochemical, RP-UHPLC-DAD-QTOF-MS provided essential complementary information. This analytical method can be applied to the quality control of the phytopharmaceuticals containing Fumaria extracts currently found on the market.

  17. Alkaloids Profiling of Fumaria capreolata by Analytical Platforms Based on the Hyphenation of Gas Chromatography and Liquid Chromatography with Quadrupole-Time-of-Flight Mass Spectrometry

    PubMed Central

    Bribi, Noureddine; Gómez-Caravaca, Ana María

    2017-01-01

    Two analytical platforms, gas chromatography (GC) coupled to quadrupole-time-of-flight (QTOF) mass spectrometry (MS) and reversed-phase ultrahigh-performance liquid chromatography (UHPLC) coupled to diode array (DAD) and QTOF detection, were applied to study the alkaloid profile of Fumaria capreolata. These mass analyzers enabled tentative identification of the alkaloids by matching their accurate mass signals and suggested molecular formulae with those previously reported in libraries and databases. Moreover, the proposed structures were corroborated by studying the fragmentation patterns obtained on both platforms. In this way, 8 and 26 isoquinoline alkaloids were characterized using GC-QTOF-MS and RP-UHPLC-DAD-QTOF-MS, respectively, belonging to the following subclasses: protoberberine, protopine, aporphine, benzophenanthridine, spirobenzylisoquinoline, morphinandienone, and benzylisoquinoline. The latter analytical method was also selected to determine the concentration of protopine (9.6 ± 0.7 mg/g), a potential active compound of the extract, at 280 nm. In conclusion, although GC-MS has been commonly used for the analysis of this type of phytochemical, RP-UHPLC-DAD-QTOF-MS provided essential complementary information. This analytical method can be applied to the quality control of the phytopharmaceuticals containing Fumaria extracts currently found on the market. PMID:29348751

  18. Development of sampling and analytical methods for concerted determination of commonly used chloroacetanilide, chlorotriazine, and 2,4-D herbicides in hand-wash, dermal-patch, and air samples.

    PubMed

    Tucker, S P; Reynolds, J M; Wickman, D C; Hines, C J; Perkins, J B

    2001-06-01

    Sampling and analytical methods were developed for commonly used chloroacetanilide, chlorotriazine, and 2,4-D herbicides in hand washes, on dermal patches, and in air. Eight herbicides selected for study were alachlor, atrazine, cyanazine, 2,4-dichlorophenoxyacetic acid (2,4-D), metolachlor, simazine, and two esters of 2,4-D, the 2-butoxyethyl ester (2,4-D, BE) and the 2-ethylhexyl ester (2,4-D, EH). The hand-wash method consisted of shaking the worker's hand in 150 mL of isopropanol in a polyethylene bag for 30 seconds. The dermal-patch method entailed attaching a 10-cm x 10-cm x 0.6-cm polyurethane foam (PUF) patch to the worker for exposure; recovery of the herbicides was achieved by extraction with 40 mL of isopropanol. The air method involved sampling with an OVS-2 tube (which contained an 11-mm quartz fiber filter and two beds of XAD-2 resin) and recovery with 2 mL of 10:90 methanol:methyl t-butyl ether. Analysis of each of the three sample types was performed by gas chromatography with an electron-capture detector. Diazomethane in solution was employed to convert 2,4-D as the free acid to the methyl ester in each of the three methods for ease of gas chromatography. Silicic acid was added to sample solutions to quench excess diazomethane. Limits of detection for all eight herbicides were matrix-dependent and, generally, less than 1 microgram per sample for each matrix. Sampling and analytical methods met NIOSH evaluation criteria for all herbicides in hand-wash samples, for seven herbicides in air samples (all herbicides except cyanazine), and for six herbicides in dermal-patch samples (all herbicides except cyanazine and 2,4-D). Speciation of 2,4-D esters and simultaneous determination of 2,4-D acid were possible without losses of the esters or of other herbicides (acetanilides and triazines) being determined.

  19. Dithizone-modified graphene oxide nano-sheet as a sorbent for pre-concentration and determination of cadmium and lead ions in food.

    PubMed

    Moghadam Zadeh, Hamid Reza; Ahmadvand, Parvaneh; Behbahani, Ali; Amini, Mostafa M; Sayar, Omid

    2015-01-01

    A graphene oxide nano-sheet was modified with dithizone as a novel sorbent for the selective pre-concentration and determination of Cd(II) and Pb(II) in food. The sorbent was characterised by various analytical methods, and the effective parameters for Cd(II) and Pb(II) adsorption were optimised in this work. The high adsorption capacity and selectivity of this sorbent make the method capable of fast determination of the Cd(II) and Pb(II) content in complicated matrices, even at μg l(-1) levels, using commonly available instrumentation. The precision of this method was < 1.9% over 10 duplicate determinations, and its accuracy was verified using standard reference materials. Finally, the method was applied to the determination of Cd(II) and Pb(II) ions in common food samples, and satisfactory results were obtained.

  20. John Herschel's Graphical Method

    NASA Astrophysics Data System (ADS)

    Hankins, Thomas L.

    2011-01-01

    In 1833 John Herschel published an account of his graphical method for determining the orbits of double stars. He had hoped to be the first to determine such orbits, but Felix Savary in France and Johann Franz Encke in Germany beat him to the punch using analytical methods. Herschel was convinced, however, that his graphical method was much superior to analytical methods, because it used the judgment of the hand and eye to correct the inevitable errors of observation. Line graphs of the kind used by Herschel became common only in the 1830s, so Herschel was introducing a new method. He also found computation fatiguing and devised a "wheeled machine" to help him out. Encke was skeptical of Herschel's methods. He said that he lived for calculation and that the English would be better astronomers if they calculated more. It is difficult to believe that the entire Scientific Revolution of the 17th century took place without graphs and that only a few examples appeared in the 18th century. Herschel promoted the use of graphs, not only in astronomy, but also in the study of meteorology and terrestrial magnetism. Because he was the most prominent scientist in England, Herschel's advocacy greatly advanced graphical methods.

  1. Stability indicating simplified HPLC method for simultaneous analysis of resveratrol and quercetin in nanoparticles and human plasma.

    PubMed

    Kumar, Sandeep; Lather, Viney; Pandita, Deepti

    2016-04-15

    Resveratrol and quercetin are well-known polyphenolic compounds present in common foods that have demonstrated enormous potential in the treatment of a wide variety of diseases. Owing to their exciting synergistic potential and combination delivery applications, we developed a simple and rapid RP-HPLC method based on isosbestic-point detection. The separation was carried out on a Phenomenex Synergi 4 μm Hydro-RP 80 Å column using methanol:acetonitrile (ACN):0.1% phosphoric acid (60:10:30) as the mobile phase. The method was able to quantify nanograms of both analytes simultaneously at a single wavelength (269 nm), making it highly sensitive, rapid, and economical. Additionally, forced degradation studies of resveratrol and quercetin were established, and the method's applicability was evaluated on PLGA nanoparticles and human plasma. The analyte peaks were well resolved in the presence of degradation products and excipients. The simplicity of the developed method makes it well suited for routine in vitro and in vivo analysis of resveratrol and quercetin. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  3. Method for the analysis of triadimefon and ethofumesate from dislodgeable foliar residues on turfgrass by solid-phase extraction and in-vial elution.

    PubMed

    Runes, H B; Jenkins, J J; Field, J A

    1999-08-01

    Triadimefon, a fungicide, and ethofumesate, an herbicide, are commonly applied to turfgrass in the Pacific Northwest, resulting in foliar residues. A simple and rapid method was developed to determine triadimefon and ethofumesate concentrations from dislodgeable foliar residues on turfgrass. Turfgrass samples were washed, and wash water containing surfactant (a 0.126% solution) was collected for residue analysis. This analytical method utilizes a 25 mm C(8) Empore disk and in-vial elution to quantitatively determine triadimefon and ethofumesate in 170 mL aqueous samples. The analytes were eluted by placing the disk in a 2 mL autosampler vial with 980 microL of ethyl acetate and 20 microL of 2-chlorolepidine, the internal standard, for analysis by GC/MS. The method quantitation limits are 0.29 microg/L for ethofumesate and 0.59 microg/L for triadimefon. The method detection limits are 0.047 microg/L and 0.29 microg/L for ethofumesate and triadimefon, respectively. Concentrations of triadimefon and ethofumesate from dislodgeable foliar residues from a field study are reported.

  4. Can matrix solid phase dispersion (MSPD) be more simplified? Application of solventless MSPD sample preparation method for GC-MS and GC-FID analysis of plant essential oil components.

    PubMed

    Wianowska, Dorota; Dawidowicz, Andrzej L

    2016-05-01

    This paper proposes a new variant of matrix solid phase dispersion (MSPD) with a solventless blending step and demonstrates its analytical capabilities in the chromatographic analysis of plant volatiles. The results show that the use of a dispersing solvent is redundant, as the sorption capacity of the octadecyl brush is sufficient for quantitative retention of volatiles from 9 plants differing in their essential oil composition. The extraction efficiency of the proposed simplified MSPD method is equivalent to that of the commonly applied MSPD variant with an organic dispersing liquid and to pressurized liquid extraction, a much more complex, technically advanced and highly efficient plant extraction technique. The equivalency of these methods is confirmed by analysis of variance. The proposed solventless MSPD method is precise, accurate, and reproducible. The recovery of essential oil components estimated by the MSPD method exceeds 98%, which is satisfactory for analytical purposes. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Quantification of Acetaminophen and Its Metabolites in Plasma Using UPLC-MS: Doors Open to Therapeutic Drug Monitoring in Special Patient Populations.

    PubMed

    Flint, Robert B; Mian, Paola; van der Nagel, Bart; Slijkhuis, Nuria; Koch, Birgit C P

    2017-04-01

    Acetaminophen (APAP, paracetamol) is the most commonly used drug for pain and fever in both the United States and Europe, and is considered safe when used at registered dosages. Nevertheless, differences between specific populations lead to remarkable changes in exposure to potentially toxic metabolites. Furthermore, extended knowledge of metabolite formation after intoxication is required to optimize antidote treatment. Therefore, the authors aimed to develop and validate a quick and easy analytical method for the simultaneous quantification of APAP, APAP-glucuronide, APAP-sulfate, APAP-cysteine, APAP-glutathione, APAP-mercapturate, and protein-derived APAP-cysteine in human plasma by ultraperformance liquid chromatography-electrospray ionization-tandem mass spectrometry. The internal standard was APAP-D4 for all analytes. Chromatographic separation was achieved with a reversed-phase Acquity ultraperformance liquid chromatography HSS T3 column with a runtime of only 4.5 minutes per injected sample. Gradient elution was performed with a mobile phase consisting of ammonium acetate and formic acid in Milli-Q ultrapure water or in methanol at a flow rate of 0.4 mL/minute. A plasma volume of only 10 μL was required to achieve both adequate accuracy and precision. Calibration curves of all analytes were linear. All analytes were stable for at least 48 hours in the autosampler; the high quality control of APAP-glutathione was stable for 24 hours. The method was validated according to the U.S. Food and Drug Administration guidelines. This method allows quantification of APAP and 6 metabolites, which serves research purposes as well as therapeutic drug monitoring. The advantage of this method is the combination of a minimal injection volume, a short runtime, an easy sample preparation method, and the ability to quantify APAP and all 6 metabolites.

  6. CEDS Addresses: Rubric Elements

    ERIC Educational Resources Information Center

    US Department of Education, 2015

    2015-01-01

    Common Education Data Standards (CEDS) Version 4 introduced a common data vocabulary for defining rubrics in a data system. The CEDS elements support digital representations of both holistic and analytic rubrics. This document shares examples of holistic and analytic project rubrics, available CEDS Connections, and a logical model showing the…

  7. Selective and rapid determination of tadalafil and finasteride using solid phase extraction by high performance liquid chromatography and tandem mass spectrometry.

    PubMed

    Pappula, Nagaraju; Kodali, Balaji; Datla, Peda Varma

    2018-04-15

    A highly selective and fast liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for the simultaneous determination of tadalafil (TDL) and finasteride (FNS) in human plasma. The method was successfully applied to the analysis of TDL and FNS samples in a clinical study. The method was validated as per the USFDA (United States Food and Drug Administration), EMA (European Medicines Agency), and ANVISA (Agência Nacional de Vigilância Sanitária, Brazil) bioanalytical method validation guidelines. Glyburide (GLB) was used as a common internal standard (ISTD) for both analytes. The selected multiple reaction monitoring (MRM) transitions for mass spectrometric analysis were m/z 390.2/268.2, m/z 373.3/305.4 and m/z 494.2/369.1 for TDL, FNS and the ISTD, respectively. The extraction of the analytes and ISTD was accomplished by a simple solid phase extraction (SPE) procedure. Rapid analysis was achieved on a Zorbax Eclipse C18 column (50 × 4.6 mm, 5 μm). The calibration ranges for TDL and FNS were 5-800 ng/ml and 0.2-30 ng/ml, respectively. The precision and accuracy, linearity, recovery and matrix-effect results of the method were acceptable: accuracy was in the range of 92.9%-106.4% and precision was good, with %CV less than 8.1%. Copyright © 2018 Elsevier B.V. All rights reserved.
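
    Quantitation in an LC-MS/MS assay of this kind rests on a calibration curve of analyte/ISTD peak-area ratio versus nominal concentration. A minimal sketch with invented calibration data (not values from the paper):

```python
import numpy as np

# Hypothetical calibrators spanning the TDL range (5-800 ng/ml) and the
# analyte/ISTD peak-area ratios they might produce.
conc = np.array([5.0, 25.0, 100.0, 200.0, 400.0, 800.0])
ratio = np.array([0.011, 0.054, 0.210, 0.425, 0.838, 1.690])

slope, intercept = np.polyfit(conc, ratio, 1)   # unweighted linear fit

def back_calculate(peak_ratio):
    """Concentration (ng/ml) back-calculated from an observed peak-area ratio."""
    return (peak_ratio - intercept) / slope
```

In practice, bioanalytical guidelines usually call for weighted regression (e.g. 1/x or 1/x²) when the range spans more than two orders of magnitude; the unweighted fit is kept here for brevity.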

  8. Rapid Detection of Transition Metals in Welding Fumes Using Paper-Based Analytical Devices

    PubMed Central

    Volckens, John

    2014-01-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments. PMID:24515892

  9. Rapid detection of transition metals in welding fumes using paper-based analytical devices.

    PubMed

    Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John

    2014-05-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable, yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments.

  10. Rapid, simultaneous and interference-free determination of three rhodamine dyes illegally added into chilli samples using excitation-emission matrix fluorescence coupled with second-order calibration method.

    PubMed

    Chang, Yue-Yue; Wu, Hai-Long; Fang, Huan; Wang, Tong; Liu, Zhi; Ouyang, Yang-Zi; Ding, Yu-Jie; Yu, Ru-Qin

    2018-06-15

    In this study, a smart and green analytical method based on a second-order calibration algorithm coupled with excitation-emission matrix (EEM) fluorescence was developed for the determination of rhodamine dyes illegally added to chilli samples. The proposed method not only retains the high sensitivity of traditional fluorescence methods but also fully exploits the "second-order advantage". Pure analyte signals were successfully extracted from severely interfered EEM profiles using the alternating trilinear decomposition (ATLD) algorithm, even in the presence of common fluorescence problems such as scattering, peak overlap and unknown interferences. Notably, the unknown interferents can represent different kinds of backgrounds, not only a constant background. In addition, the use of an interpolation method avoided loss of information about the analytes of interest. Replacing a complicated "chemical or physical separation" strategy with "mathematical separation" is both more efficient and more environmentally friendly. A series of statistical parameters, including figures of merit and intra-day (≤1.9%) and inter-day (≤6.6%) RSDs, was calculated to validate the accuracy of the proposed method. Furthermore, the authoritative HPLC-FLD method was adopted to verify its qualitative and quantitative results. The comparison of the two methods also showed that the ATLD-EEM method is accurate, rapid, simple and green, and is expected to develop into an attractive alternative for the simultaneous and interference-free determination of rhodamine dyes illegally added to complex matrices. Copyright © 2018. Published by Elsevier B.V.
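The "mathematical separation" above rests on a trilinear decomposition of the EEM stack. As a rough illustration, the sketch below fits the trilinear model with plain alternating least squares, a simplified stand-in for ATLD (not the authors' algorithm), on a synthetic, noiseless two-component tensor.

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: rows indexed by (i, j), j fastest
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def trilinear_als(X, rank, iters=500):
    """Fit the trilinear model X[i,j,k] ~ sum_r A[i,r]*B[j,r]*C[k,r]
    by plain alternating least squares (a simplified stand-in for ATLD)."""
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    A = rng.uniform(0.5, 1.5, (I, rank))
    B = rng.uniform(0.5, 1.5, (J, rank))
    C = rng.uniform(0.5, 1.5, (K, rank))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(iters):
        A = X1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Toy "EEM stack": 2 fluorophores across excitation x emission x samples
rng = np.random.default_rng(42)
ex, em, conc = (rng.uniform(0.5, 1.5, (n, 2)) for n in (8, 10, 6))
X = np.einsum('ir,jr,kr->ijk', ex, em, conc)
A, B, C = trilinear_als(X, rank=2)
rel_err = (np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - X)
           / np.linalg.norm(X))
```

In the real method the recovered C columns carry the relative analyte concentrations per sample, which is what makes quantification possible despite unknown interferents.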

  11. Integration of gas chromatography mass spectrometry methods for differentiating ricin preparation methods.

    PubMed

    Wunschel, David S; Melville, Angela M; Ehrhardt, Christopher J; Colburn, Heather A; Victry, Kristin D; Antolick, Kathryn C; Wahl, Jon H; Wahl, Karen L

    2012-05-07

    The investigation of crimes involving chemical or biological agents is infrequent, but presents unique analytical challenges. The protein toxin ricin is encountered more frequently than other agents and is found in the seeds of Ricinus communis, commonly known as the castor plant. Typically, the toxin is extracted from castor seeds utilizing a variety of different recipes that result in varying purity of the toxin. Moreover, these various purification steps can also leave or differentially remove a variety of exogenous and endogenous residual components with the toxin that may indicate the type and number of purification steps involved. We have applied three gas chromatography-mass spectrometry (GC-MS) based analytical methods to measure the variation in seed carbohydrates and castor oil ricinoleic acid, as well as the presence of solvents used for purification. These methods were applied to the same samples prepared using four previously identified toxin preparation methods, starting from four varieties of castor seeds. The individual data sets for seed carbohydrate profiles, ricinoleic acid, or acetone amount each provided information capable of differentiating different types of toxin preparations across seed types. However, the integration of the data sets using multivariate factor analysis provided a clear distinction of all samples based on the preparation method, independent of the seed source. In particular, the abundance of mannose, arabinose, fucose, ricinoleic acid, and acetone were shown to be important differentiating factors. These complementary tools provide a more confident determination of the method of toxin preparation than would be possible using a single analytical method.
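The final integration step, combining the seed carbohydrate, ricinoleic acid, and solvent data sets in a multivariate factor analysis, can be illustrated with a small PCA sketch. The marker abundances below are invented for two hypothetical preparation types, and PCA via SVD is used as a generic stand-in for the unspecified factor-analysis procedure.

```python
import numpy as np

# Rows: samples; columns: hypothetical marker abundances
# [mannose, arabinose, fucose, ricinoleic acid, acetone]
rng = np.random.default_rng(1)
crude  = np.array([10.0, 8.0, 3.0, 6.0, 0.1]) + 0.1 * rng.standard_normal((5, 5))
washed = np.array([ 2.0, 1.5, 0.5, 1.0, 4.0]) + 0.1 * rng.standard_normal((5, 5))
X = np.vstack([crude, washed])

# PCA via SVD on the mean-centered matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s          # sample coordinates on the principal components
pc1 = scores[:, 0]      # first component separates the two preparations
```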

  12. On the analytical modeling of the nonlinear vibrations of pretensioned space structures

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Belvin, W. K.

    1983-01-01

    Pretensioned structures are receiving considerable attention as candidate large space structures; a typical example is a hoop-column antenna. The large number of preloaded members requires efficient analytical methods for concept validation and design. Validation through analysis is especially important since ground testing may be limited by gravity effects and structural size. The objective of the present investigation is to examine the analytical modeling of pretensioned members undergoing nonlinear vibrations. Two approximate nonlinear analyses are developed to model general structural arrangements that include beam-columns and pretensioned cables attached to a common nucleus, such as may occur at a joint of a pretensioned structure. Attention is given to structures undergoing nonlinear steady-state oscillations due to sinusoidal excitation forces. Three analyses (linear, quasi-linear, and nonlinear) are conducted and applied to study the response of a relatively simple cable-stiffened structure.

  13. The Analytic Information Warehouse (AIW): a Platform for Analytics using Electronic Health Record Data

    PubMed Central

    Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.

    2013-01-01

    Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960
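As a toy illustration of a "derived variable" computed over a common data model, the sketch below derives a 30-day readmission flag from a minimal list of admission records. The tuple schema is invented for illustration and is not the AIW or UHC CDB schema.

```python
from datetime import date

def readmitted_within_30_days(admissions):
    """Derive a per-admission flag: was the same patient readmitted within
    30 days of this admission's discharge? `admissions` is a list of
    (patient_id, admit_date, discharge_date) tuples, a toy stand-in for a
    common data model."""
    by_patient = {}
    for pid, admit, discharge in admissions:
        by_patient.setdefault(pid, []).append((admit, discharge))
    flags = []
    for pid, admit, discharge in admissions:
        # True if any later admission by the same patient starts within 30 days
        flag = any(0 < (a - discharge).days <= 30
                   for a, _ in by_patient[pid])
        flags.append(flag)
    return flags

events = [
    ("p1", date(2012, 1, 3), date(2012, 1, 7)),
    ("p1", date(2012, 1, 20), date(2012, 1, 25)),  # 13 days after discharge
    ("p2", date(2012, 2, 1), date(2012, 2, 4)),
]
flags = readmitted_within_30_days(events)  # [True, False, False]
```

Once such a variable is specified against the common model, the same definition can be re-run unchanged over data transformed from any source schema, which is the reuse the AIW architecture aims for.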

  14. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to strongly depend on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material which can be modeled in a similar fashion or extended to other microelectromechanical systems. PMID:19865512
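The optimization loop the abstract describes, an iterative optimizer minimizing a goal such as minimum detectable force, can be sketched in one dimension. The noise and sensitivity models below are entirely made up for illustration (the real design couples dopant profile, bandwidth, and damping), and golden-section search stands in for the paper's optimizer.

```python
import math

def mdf(d):
    """Toy minimum-detectable-force model (invented, not the paper's):
    noise grows as sqrt(1 + d), sensitivity saturates as d / (1 + d),
    so MDF = noise / sensitivity = (1 + d)**1.5 / d for design variable d."""
    return (1.0 + d) ** 1.5 / d

def golden_section(f, lo, hi, tol=1e-8):
    """Minimize a unimodal 1-D function by golden-section search,
    standing in for an iterative optimizer over the full design space."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c           # minimum lies in [a, d_old]
            c = b - phi * (b - a)
        else:
            a, c = c, d           # minimum lies in [c_old, b]
            d = a + phi * (b - a)
    return 0.5 * (a + b)

best = golden_section(mdf, 0.1, 10.0)  # analytically, the minimum is at d = 2
```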

  15. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. In contrast to current approaches, the proposed probabilistic DHP AC method takes the uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.
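The validation step described above, checking that a quadratic value function satisfies the Bellman equation in a linear quadratic problem, can be reproduced for a toy deterministic case; the paper's stochastic setting adds uncertainty terms not modeled in this sketch.

```python
import numpy as np

# Toy deterministic LQ problem: x_{t+1} = A x_t + B u_t,
# stage cost x'Qx + u'Ru (system matrices are illustrative)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])

# Value iteration on the discrete-time Riccati equation
P = np.eye(2)
for _ in range(100000):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal gain for P
    P_next = Q + A.T @ P @ (A - B @ K)
    if np.max(np.abs(P_next - P)) < 1e-12:
        P = P_next
        break
    P = P_next
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Bellman check: V(x) = x'Px must equal x'Qx + u'Ru + V(Ax + Bu) at u = -Kx
x = np.array([[1.0], [-0.5]])
u = -K @ x
xn = A @ x + B @ u
lhs = (x.T @ P @ x).item()
rhs = (x.T @ Q @ x + u.T @ R @ u + xn.T @ P @ xn).item()
```

Here P plays the role of the analytically derived correct cost, and the equality of `lhs` and `rhs` is the fixed-point property a critic network's target values are checked against.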

  16. Quantitative determination of 43 common drugs and drugs of abuse in human serum by HPLC-MS/MS.

    PubMed

    Bassan, David M; Erdmann, Freidoon; Krüll, Ralf

    2011-04-01

    An analytical procedure for the simultaneous determination in human serum of 43 common drugs of abuse and their metabolites belonging to the different chemical and toxicological classes of amphetamines, benzodiazepines, dibenzazepines, cocaine, lysergic acid diethylamide, opioids, phencyclidine, tricyclic antidepressants, and zolpidem, using 33 deuterated standards, is presented. The sample treatment was developed to be a very simple protein precipitation and filtration. All analyses were performed by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry in positive ionization mode. All analytes were calibrated up to 550 μg/L. The limit of detection ranged from 0.6 ng/mL (EDDP) to 13.7 ng/mL (flunitrazepam). The method has been validated according to the guidelines of the Gesellschaft für Toxikologische und Forensische Chemie, using three multiple reaction monitoring (MRM) transitions and retention time for positive compound identification, instead of two MRMs, in anticipation of the new guidelines for January 2011.
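The positive-identification logic, three MRM transitions plus retention time, amounts to a simple decision rule. The sketch below is an illustrative version only; the tolerance values and data are hypothetical, not the GTFCh criteria.

```python
def identify(obs, ref, rt_tol=0.2, ratio_tol=0.25):
    """Positive identification if the retention time matches within rt_tol
    minutes AND each qualifier-to-base MRM intensity ratio is within a
    relative ratio_tol of the reference standard's ratio.
    Tolerances here are illustrative, not the GTFCh values."""
    if abs(obs["rt"] - ref["rt"]) > rt_tol:
        return False
    base, ref_base = obs["mrm"][0], ref["mrm"][0]
    for o, r in zip(obs["mrm"][1:], ref["mrm"][1:]):
        if abs(o / base - r / ref_base) > ratio_tol * (r / ref_base):
            return False
    return True

# Hypothetical reference: retention time (min) and three MRM intensities
ref = {"rt": 4.1, "mrm": [1000.0, 450.0, 120.0]}
hit = identify({"rt": 4.15, "mrm": [8000.0, 3700.0, 1000.0]}, ref)
miss = identify({"rt": 5.3, "mrm": [8000.0, 3700.0, 1000.0]}, ref)
```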

  17. Approaches for the analysis of low molecular weight compounds with laser desorption/ionization techniques and mass spectrometry.

    PubMed

    Bergman, Nina; Shevchenko, Denys; Bergquist, Jonas

    2014-01-01

    This review summarizes various approaches for the analysis of low molecular weight (LMW) compounds by different laser desorption/ionization mass spectrometry techniques (LDI-MS). It is common to use an agent to assist the ionization, and small molecules are normally difficult to analyze by, e.g., matrix assisted laser desorption/ionization mass spectrometry (MALDI-MS) using the common matrices available today, because the latter are generally small organic compounds themselves. This often results in severe suppression of analyte peaks, or interference of the matrix and analyte signals in the low mass region. However, intrinsic properties of several LDI techniques such as high sensitivity, low sample consumption, high tolerance towards salts and solid particles, and rapid analysis have stimulated scientists to develop methods to circumvent matrix-related issues in the analysis of LMW molecules. Recent developments within this field as well as historical considerations and future prospects are presented in this review.

  18. Scope and limitations of carbohydrate hydrolysis for de novo glycan sequencing using a hydrogen peroxide/metallopeptide-based glycosidase mimetic.

    PubMed

    Peng, Tianyuan; Wooke, Zachary; Pohl, Nicola L B

    2018-03-22

    Acidic hydrolysis is commonly used as a first step to break down oligo- and polysaccharides into monosaccharide units for structural analysis. While easy to set up and amenable to mass spectrometry detection, acid hydrolysis is not without its drawbacks. For example, ring-destruction side reactions and degradation products, along with difficulties in optimizing conditions from analyte to analyte, greatly limit its broad utility. Herein we report studies on a hydrogen peroxide/CuGGH metallopeptide-based glycosidase mimetic designed for more efficient and controllable carbohydrate hydrolysis. A library of methyl glycosides consisting of ten common monosaccharide substrates, along with oligosaccharide substrates, was screened with the artificial glycosidase for hydrolytic activity in a high-throughput format with a robotic liquid handling system. The artificial glycosidase was found to be active towards most screened linkages, including alpha- and beta-anomers, thus serving as a potential alternative to traditional acidic hydrolysis approaches for oligosaccharides. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  20. Global standardization measurement of cerebral spinal fluid for Alzheimer's disease: an update from the Alzheimer's Association Global Biomarkers Consortium.

    PubMed

    Carrillo, Maria C; Blennow, Kaj; Soares, Holly; Lewczuk, Piotr; Mattsson, Niklas; Oberoi, Pankaj; Umek, Robert; Vandijck, Manu; Salamone, Salvatore; Bittner, Tobias; Shaw, Leslie M; Stephenson, Diane; Bain, Lisa; Zetterberg, Henrik

    2013-03-01

    Recognizing that international collaboration is critical for the acceleration of biomarker standardization efforts and the efficient development of improved diagnosis and therapy, the Alzheimer's Association created the Global Biomarkers Standardization Consortium (GBSC) in 2010. The consortium brings together representatives of academic centers, industry, and the regulatory community with the common goal of developing internationally accepted common reference standards and reference methods for the assessment of cerebrospinal fluid (CSF) amyloid β42 (Aβ42) and tau biomarkers. Such standards are essential to ensure that analytical measurements are reproducible and consistent across multiple laboratories and across multiple kit manufacturers. Analytical harmonization for CSF Aβ42 and tau will help reduce confusion in the AD community regarding the absolute values associated with the clinical interpretation of CSF biomarker results and enable worldwide comparison of CSF biomarker results across AD clinical studies. Copyright © 2013 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  1. Abundance of common species, not species richness, drives delivery of a real-world ecosystem service.

    PubMed

    Winfree, Rachael; Fox, Jeremy W; Williams, Neal M; Reilly, James R; Cariveau, Daniel P

    2015-07-01

    Biodiversity-ecosystem functioning experiments have established that species richness and composition are both important determinants of ecosystem function in an experimental context. Determining whether this result holds for real-world ecosystem services has remained elusive, however, largely due to the lack of analytical methods appropriate for large-scale, associational data. Here, we use a novel analytical approach, the Price equation, to partition the contribution to ecosystem services made by species richness, composition and abundance in four large-scale data sets on crop pollination by native bees. We found that abundance fluctuations of dominant species drove ecosystem service delivery, whereas richness changes were relatively unimportant because they primarily involved rare species that contributed little to function. Thus, the mechanism behind our results was the skewed species-abundance distribution. Our finding that a few common species, not species richness, drive ecosystem service delivery could have broad generality given the ubiquity of skewed species-abundance distributions in nature. © 2015 John Wiley & Sons Ltd/CNRS.

  2. Uniform GTD solution for the diffraction by metallic tapes on panelled compact-range reflectors

    NASA Technical Reports Server (NTRS)

    Somers, G. A.; Pathak, P. H.

    1992-01-01

    Metallic tape is commonly used to cover the interpanel gaps which occur in paneled compact-range reflectors. It is therefore of interest to study the effect of the scattering by the tape on the field in the target zone of the range. An analytical solution is presented for the target zone fields scattered by 2D metallic tapes. It is formulated by the generalized scattering matrix technique in conjunction with the Wiener-Hopf procedure. An extension to treat 3D tapes can be accomplished using the 2D solution via the equivalent current concept. The analytical solution is compared with a reference moment method solution to confirm the accuracy of the former.

  3. Simple and Sensitive Paper-Based Device Coupling Electrochemical Sample Pretreatment and Colorimetric Detection.

    PubMed

    Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C

    2016-05-17

    We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of the most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol/L, with a detection limit of 0.9 μmol/L. The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.
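A linear calibration like the 5-60 μmol/L curve above, together with a detection limit, can be reproduced in a few lines. The signal values below are invented, and the LOD is estimated with the common ICH-style 3.3·σ/slope convention, which is not necessarily the estimator the authors used.

```python
import numpy as np

# Hypothetical calibration data: procaine concentration (umol/L) vs signal
conc   = np.array([5.0, 10.0, 20.0, 30.0, 45.0, 60.0])
signal = np.array([0.052, 0.101, 0.198, 0.305, 0.449, 0.601])

slope, intercept = np.polyfit(conc, signal, 1)   # least-squares line
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)       # residual standard deviation (n - 2 dof)
lod = 3.3 * sigma / slope           # common ICH-style LOD estimate
```

For this toy data the estimate lands near 0.9 μmol/L; with real replicate blanks, the standard deviation of the blank is another common choice for σ.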

  4. An analytical optimization model for infrared image enhancement via local context

    NASA Astrophysics Data System (ADS)

    Xu, Yongjian; Liang, Kun; Xiong, Yiru; Wang, Hui

    2017-12-01

    The requirement for high-quality infrared images, with little distortion and appropriate contrast, is constantly increasing in both military and civilian areas, yet infrared images commonly suffer from shortcomings such as low contrast. In this paper, we propose a novel infrared image histogram enhancement algorithm based on local context. By constraining the enhanced image to have high local contrast, a regularized analytical optimization model is proposed to enhance infrared images. The local contrast is determined by evaluating whether two intensities are neighbors and calculating their differences. The comparison on 8-bit images shows that the proposed method can enhance infrared images with more details and lower noise.

  5. Critical Factors in Data Governance for Learning Analytics

    ERIC Educational Resources Information Center

    Elouazizi, Noureddine

    2014-01-01

    This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…

  6. Laboratory Production of Lemon Liqueur (Limoncello) by Conventional Maceration and a Two-Syringe System to Illustrate Rapid Solid-Liquid Dynamic Extraction

    ERIC Educational Resources Information Center

    Naviglio, Daniele; Montesano, Domenico; Gallo, Monica

    2015-01-01

    Two experimental techniques of solid-liquid extraction are compared relating to the lab-scale production of lemon liqueur, most commonly named "limoncello"; the first is the official method of maceration for the solid-liquid extraction of analytes and is widely used to extract active ingredients from a great variety of natural products;…

  7. SU-C-9A-04: Alternative Analytic Solution to the Paralyzable Detector Model to Calculate Deadtime and Deadtime Loss

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siman, W; Kappadath, S

    2014-06-01

    Purpose: Common methods to solve for deadtime include (1) the dual-source method, which assumes two equal activities; (2) model fitting, which requires multiple acquisitions as the source decays; and (3) the lossless model, which assumes no deadtime loss at low count rates. We propose a new analytic alternative solution to calculate deadtime for a paralyzable gamma camera. Methods: Deadtime T can be calculated analytically from two distinct observed count rates M1 and M2 when the ratio of the true count rates alpha=N2/N1 is known. Alpha can be measured as a ratio of two measured activities using dose calibrators or via radioactive decay. Knowledge of alpha creates a system with 2 equations and 2 unknowns, i.e., T and N1. To verify the validity of the proposed method, projections of a non-uniform phantom (4GBq 99mTc) were acquired on a Siemens Symbia S multiple times over 48 hours. Each projection has >100kcts. The deadtime for each projection was calculated by fitting the data to a paralyzable model and also by using the proposed 2-acquisition method. The two estimates of deadtime were compared using the Bland-Altman method. In addition, the dependency of uncertainty in T on uncertainty in alpha was investigated for several imaging conditions. Results: The results strongly suggest that the 2-acquisition method is equivalent to the fitting method. The Bland-Altman analysis yielded a mean difference in deadtime estimate of ∼0.076us (95%CI: -0.049us, 0.103us) between the 2-acquisition and model fitting methods. The 95% limits of agreement were calculated to be -0.104 to 0.256us. The uncertainty in deadtime calculated using the proposed method is highly dependent on the uncertainty in the ratio alpha. Conclusion: The 2-acquisition method was found to be equivalent to the parameter fitting method. The proposed method offers a simpler and more practical way to analytically solve for a paralyzable detector deadtime, especially during physics testing.
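The 2-acquisition system described above has a closed-form solution for the paralyzable model M = N·exp(−N·T). The algebra below follows directly from that model; it is my derivation, not necessarily the authors' exact formulation.

```python
import math

def deadtime_two_rates(M1, M2, alpha):
    """Paralyzable model: M = N * exp(-N * T).
    Given observed rates M1, M2 and the known true-rate ratio
    alpha = N2 / N1, solve for deadtime T and true rate N1.
    Derivation: M2/M1 = alpha * exp(-(alpha - 1) * N1 * T), hence
    N1*T = ln(alpha * M1 / M2) / (alpha - 1), and N1 = M1 * exp(N1*T)."""
    x = math.log(alpha * M1 / M2) / (alpha - 1.0)  # x = N1 * T
    N1 = M1 * math.exp(x)
    return x / N1, N1

# Round trip: simulate observed rates for T = 1 us, N1 = 1e5 cps, alpha = 2
T_true, N1_true, alpha = 1e-6, 1e5, 2.0
M1 = N1_true * math.exp(-N1_true * T_true)
M2 = alpha * N1_true * math.exp(-alpha * N1_true * T_true)
T_est, N1_est = deadtime_two_rates(M1, M2, alpha)
```

The sensitivity of T to errors in alpha noted in the abstract is visible here too: alpha enters both through the logarithm and the (alpha − 1) divisor, so small alpha errors are amplified when the two rates are close.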

  8. Pesticide data for selected Wyoming streams, 1976-78

    USGS Publications Warehouse

    Butler, David L.

    1987-01-01

    In 1976, the U.S. Geological Survey, in cooperation with the Wyoming Department of Agriculture, started a monitoring program to determine pesticide concentrations in Wyoming streams. This program was incorporated into the water-quality data-collection system already in operation. Samples were collected at 20 sites for analysis of various insecticides, herbicides, polychlorinated biphenyls, and polychlorinated naphthalenes. The results through 1978 revealed small concentrations of pesticides. The compounds most commonly found in water and bottom-material samples were DDE (39 percent of the concentrations equal to or greater than the minimum reported concentrations of the analytical methods), DDD (20 percent), dieldrin (21 percent), and polychlorinated biphenyls (29 percent). The herbicides most commonly found in water samples were 2,4-D (29 percent of the concentrations equal to or greater than the minimum reported concentrations of the analytical method) and picloram (23 percent). Most concentrations were significantly less than concentrations thought to be harmful to freshwater aquatic life based on available toxicity data. However, for some pesticides, U.S. Environmental Protection Agency water-quality criteria for freshwater aquatic life are based on bioaccumulation factors that result in criteria concentrations less than the minimum reported concentrations of the analytical methods. It is not known if certain pesticides were present at concentrations less than the minimum reported concentrations that exceeded these criteria.

  9. Method and apparatus for optimized sampling of volatilizable target substances

    DOEpatents

    Lindgren, Eric R.; Phelan, James M.

    2004-10-12

    An apparatus for capturing target analytes from gases such as soil gas. Target analytes may include emanations from explosive materials or from residues of explosive materials. The apparatus employs principles of sorption common to solid phase microextraction, and is best used in conjunction with an analysis instrument such as a gas chromatograph. To sorb target analytes, the apparatus uses various sorptive structures. Depending upon the embodiment, those structures may include a capillary tube with an interior surface on which sorptive material (similar to that on the surface of a SPME fiber) is supported, along with means for moving gases through the capillary tube so that the gases come into close proximity to the sorptive material. In one disclosed embodiment, at least one such sorptive structure is associated with an enclosure including an opening in communication with the surface of a soil region potentially contaminated with buried explosive material such as unexploded ordnance. Emanations from explosive materials can pass into and accumulate in the enclosure, where they are sorbed by the sorptive structures. Also disclosed is the use of heating means such as microwave horns to drive target analytes into the soil gas from solid and liquid phase components of the soil.

  10. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict sensor activities for test analytes not included in the training set used for model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
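At its core, the QSAR workflow above regresses sensor response on molecular descriptors and then predicts responses for unseen analytes. The sketch below uses ordinary least squares on invented descriptor values; GFA descriptor selection and the validation statistics are omitted.

```python
import numpy as np

# Hypothetical descriptor matrix: rows = training analytes, columns =
# made-up descriptors (e.g. molar volume, polarizability, H-bond acidity)
X_train = np.array([
    [1.0, 0.2, 0.0],
    [1.4, 0.5, 0.3],
    [0.8, 0.9, 0.1],
    [1.9, 0.4, 0.6],
    [1.1, 0.7, 0.2],
])
# Synthetic film-resistance changes generated from known weights,
# so the fit can be checked exactly
true_w = np.array([2.0, -1.0, 3.0])
y_train = X_train @ true_w

# Ordinary least squares fit with an intercept column
D = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(D, y_train, rcond=None)

# Predict the response for a test analyte outside the training set
x_test = np.array([1.2, 0.3, 0.4])
y_pred = float(np.hstack([x_test, [1.0]]) @ coef)
```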

  11. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not proven adequate for many mission critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast amount of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Incorporating such dimensional knowledge into data organization can improve the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before writing to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR is able to enable efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, the experimental results on the Jaguar supercomputer demonstrate a read-performance improvement of up to 73 times over the original I/O method.
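STAR's central idea, organizing data so queries along the time dimension are efficient, can be illustrated with an in-memory sketch: making time the fastest-varying axis turns a per-grid-point time series into one contiguous run. The array shapes below are toy stand-ins, not GEOS-5 output.

```python
import numpy as np

# Synthetic model output: 365 daily snapshots on a small 8 x 16 grid
# (a stand-in for large climate grids), written snapshot-by-snapshot,
# which makes time the SLOWEST-varying axis of the stored layout.
data = np.arange(365 * 8 * 16, dtype=np.float64).reshape(365, 8, 16)

# A time-series query at one grid point must gather across all snapshots:
ts_strided = data[:, 3, 7]     # strided, non-contiguous view

# STAR-style reorganization: make time the FASTEST-varying dimension so a
# per-point time series becomes one contiguous run in memory / on disk.
reorg = np.ascontiguousarray(np.moveaxis(data, 0, -1))  # shape (8, 16, 365)
ts_contig = reorg[3, 7, :]     # contiguous read
```

On disk the same idea translates to one sequential read instead of 365 scattered seeks, which is where the post-processing speedups come from.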

  12. Micro- and nanofluidic systems in devices for biological, medical and environmental research

    NASA Astrophysics Data System (ADS)

    Evstrapov, A. A.

    2017-11-01

    The use of micro- and nanofluidic systems in modern analytical instruments enables a number of unique capabilities and ultra-high measurement sensitivity. The possibility of manipulating individual biological objects (cells, bacteria, viruses, proteins, nucleic acids) in a liquid medium has driven the development of microchip-platform devices for methods including chromatographic and electrophoretic analysis, the polymerase chain reaction, nucleic acid sequencing, immunoassay, and cytometric studies. Developments in micro- and nanofabrication technologies, materials science, surface chemistry, analytical chemistry, and cell engineering have led to the creation of unique systems such as “lab-on-a-chip” and “human-on-a-chip”. This article discusses materials and methods of fabricating functional structures that are common in microfluidics. Examples are shown of the integration of nanoscale structures into microfluidic devices to implement new features and improve the technical characteristics of devices and systems.

  13. Rapid and sensitive detection of synthetic cannabinoids AMB-FUBINACA and α-PVP using surface enhanced Raman scattering (SERS)

    NASA Astrophysics Data System (ADS)

    Islam, Syed K.; Cheng, Yin Pak; Birke, Ronald L.; Green, Omar; Kubic, Thomas; Lombardi, John R.

    2018-04-01

    The application of surface enhanced Raman scattering (SERS) has been reported as a fast and sensitive analytical method for the trace detection of the two most commonly known synthetic cannabinoids, AMB-FUBINACA and alpha-pyrrolidinovalerophenone (α-PVP). AMB-FUBINACA and α-PVP are two of the most dangerous synthetic cannabinoids and have been reported to cause numerous deaths in the United States. While instruments such as GC-MS and LC-MS have traditionally been recognized as the analytical tools for the detection of these synthetic drugs, SERS has recently been gaining ground in their analysis due to its sensitivity in trace analysis and its effectiveness as a rapid method of detection. The present study shows a limit of detection as low as picomolar concentrations for AMB-FUBINACA, while for α-PVP the limit of detection is in the nanomolar range.

  14. Analytical assessment of woven fabrics under vertical stabbing - The role of protective clothing.

    PubMed

    Hejazi, Sayyed Mahdi; Kadivar, Nastaran; Sajjadi, Ali

    2016-02-01

    Knives are being used more commonly in street fights and muggings. This work therefore presents an analytical model for woven fabrics under vertical stabbing loads. The model is based on the energy method, and the fabric is assumed to be unidirectional and comprised of N layers. The ultimate stab resistance of the fabric was thus determined from the structural parameters of the fabric and the geometrical characteristics of the blade. Moreover, protective clothing is nowadays considered a strategic branch of the technical textile industry. The main idea of the present work is to improve the stab resistance of woven textiles by using a metal coating method. Finally, a series of vertical stabbing tests were conducted on cotton, polyester and polyamide fabrics. It was found that the model predicts the ultimate stab resistance of the sample fabrics with good accuracy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Application of analytical methods in authentication and adulteration of honey.

    PubMed

    Siddiqui, Amna Jabbar; Musharraf, Syed Ghulam; Choudhary, M Iqbal; Rahman, Atta-Ur-

    2017-02-15

    Honey is synthesized from flower nectar and it is famous for its tremendous therapeutic potential since ancient times. Many factors influence the basic properties of honey including the nectar-providing plant species, bee species, geographic area, and harvesting conditions. Quality and composition of honey is also affected by many other factors, such as overfeeding of bees with sucrose, harvesting prior to maturity, and adulteration with sugar syrups. Due to the complex nature of honey, it is often challenging to authenticate the purity and quality by using common methods such as physicochemical parameters and more specialized procedures need to be developed. This article reviews the literature (between 2000 and 2016) on the use of analytical techniques, mainly NMR spectroscopy, for authentication of honey, its botanical and geographical origin, and adulteration by sugar syrups. NMR is a powerful technique and can be used as a fingerprinting technique to compare various samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Analysis of titanium content in titanium tetrachloride solution

    NASA Astrophysics Data System (ADS)

    Bi, Xiaoguo; Dong, Yingnan; Li, Shanshan; Guan, Duojiao; Wang, Jianyu; Tang, Meiling

    2018-03-01

    Strontium titanate, barium titanate and lead titanate are new types of functional ceramic material with good prospects, and titanium tetrachloride is a common raw material in the production of such products, which exhibit excellent electrochemical performance owing to the ferroelectric temperature coefficient effect. In this article, three methods are used to assay samples of titanium tetrachloride solution: back titration, replacement titration and gravimetric analysis. The results show that the back titration method has many advantages, such as relatively simple operation, easy judgment of the titration end point, and better accuracy and precision of analytical results, with a relative standard deviation within 0.2%. It is therefore the preferred conventional analysis method for mass production.
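A back-titration result reduces to simple mole arithmetic. The sketch below is a hedged illustration only: the EDTA/zinc reagents, the 1:1 stoichiometries, and all volumes and concentrations are assumptions for the example, not details taken from the record.

```python
# Illustrative back-titration calculation for titanium content.
# Assumed procedure: a known excess of standard EDTA is added to the
# TiCl4 aliquot, and the unreacted EDTA is back-titrated with a
# standard zinc solution; both Ti:EDTA and Zn:EDTA react 1:1.

M_TI = 47.867  # molar mass of titanium, g/mol

def titanium_mass_g(c_edta, v_edta_ml, c_zn, v_zn_ml):
    """Mass of Ti (g) in the aliquot from back-titration data.

    c_edta, c_zn -- titrant concentrations, mol/L
    v_edta_ml    -- volume of EDTA added in excess, mL
    v_zn_ml      -- volume of Zn titrant consumed at the end point, mL
    """
    mol_edta_total = c_edta * v_edta_ml / 1000.0
    mol_edta_excess = c_zn * v_zn_ml / 1000.0   # 1:1 Zn:EDTA
    mol_ti = mol_edta_total - mol_edta_excess   # 1:1 Ti:EDTA
    return mol_ti * M_TI

# Hypothetical run: 25.00 mL of 0.0500 M EDTA, back-titrated with
# 10.00 mL of 0.0500 M Zn solution.
mass = titanium_mass_g(0.0500, 25.00, 0.0500, 10.00)
```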

  17. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
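The flipping trick that adapts right-censored survival machinery to left-censored concentration data can be sketched in a few lines. This is a minimal illustration with hypothetical values, not a reproduction of the authors' S-language software:

```python
# Kaplan-Meier ECDF for multiply left-censored data via the flipping
# trick: x -> flip - x turns nondetects ("< detection limit") into
# right-censored survival times, to which standard K-M applies.

def km_ecdf_left_censored(values, censored, flip=100.0):
    """Return [(value, estimated P(X < value))] at detected values.

    values   -- measured value, or the detection limit where censored
    censored -- True where the value is a nondetect
    flip     -- any constant larger than every value
    """
    data = sorted(
        ((flip - v, 0 if c else 1) for v, c in zip(values, censored)),
        key=lambda p: (p[0], 1 - p[1]),  # at ties, events before censored
    )
    surv, at_risk = 1.0, len(data)
    out = []
    for t, event in data:
        if event:  # an uncensored (detected) observation
            surv *= 1.0 - 1.0 / at_risk
            out.append((flip - t, surv))  # S(t) = P(X < flip - t)
        at_risk -= 1
    return sorted(out)

# Hypothetical data: one nondetect reported as "<0.5".
values = [1.0, 2.0, 0.5, 3.0, 4.0]
censored = [False, False, True, False, False]
ecdf = km_ecdf_left_censored(values, censored)
```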

  18. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or “QCP”), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology in identifying clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model is seen to diverge from the simulated biosignals in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system was found to identify pathophysiologic conditions in a timeframe in which they would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to detect a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
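Since SBM itself is a proprietary method, the following is only a generic kernel-similarity sketch of the residual-based detection idea it describes: reconstruct a new observation from stored "normal" exemplars and flag a large reconstruction residual. All data and the bandwidth are synthetic assumptions.

```python
import numpy as np

# Generic kernel similarity-based estimator (illustration only, not
# the proprietary SBM algorithm). A new observation is reconstructed
# as a similarity-weighted blend of stored normal exemplars; a large
# residual signals possible deterioration.

def sbm_residual(memory, x, bandwidth=1.0):
    """memory: (n_exemplars, n_signals) normal data; x: (n_signals,)."""
    d2 = np.sum((memory - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel similarity
    w /= w.sum()
    estimate = w @ memory                      # expected "normal" state
    return np.linalg.norm(x - estimate)        # reconstruction residual

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 0.1, size=(200, 3))  # synthetic normal operation
ok = sbm_residual(normal, np.array([0.0, 0.05, -0.05]))   # in-range state
bad = sbm_residual(normal, np.array([1.0, 1.0, 1.0]))     # shifted state
```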

  19. Enantioresolution of (RS)-baclofen by liquid chromatography: A review.

    PubMed

    Batra, Sonika; Bhushan, Ravi

    2017-01-01

    Baclofen is a commonly used racemic drug and has a simple chemical structure in terms of the presence of only one stereogenic center. Since the desirable pharmacological effect is in only one enantiomer, several possibilities exist for the other enantiomer for evaluation of the disposition of the racemic mixture of the drug. This calls for the development of enantioselective analytical methodology. This review summarizes and evaluates different methods of enantioseparation of (RS)-baclofen using both direct and indirect approaches, application of certain chiral reagents and chiral stationary phases (though very expensive). Methods of separation of diastereomers of (RS)-baclofen prepared with different chiral derivatizing reagents (under microwave irradiation at ease and in less time) on reversed-phase achiral columns or via a ligand exchange approach providing high-sensitivity detection by the relatively less expensive methods of TLC and HPLC are discussed. The methods may be helpful for determination of enantiomers in biological samples and in pharmaceutical formulations for control of enantiomeric purity and can be practiced both in analytical laboratories and industry for routine analysis and R&D activities. Copyright © 2016 John Wiley & Sons, Ltd.

  20. Solvent signal suppression for high-resolution MAS-DNP

    NASA Astrophysics Data System (ADS)

    Lee, Daniel; Chaudhari, Sachin R.; De Paëpe, Gaël

    2017-05-01

    Dynamic nuclear polarization (DNP) has become a powerful tool to substantially increase the sensitivity of high-field magic angle spinning (MAS) solid-state NMR experiments. The addition of dissolved hyperpolarizing agents usually results in the presence of solvent signals that can overlap and obscure those of interest from the analyte. Here, two methods are proposed to suppress DNP solvent signals: a Forced Echo Dephasing experiment (FEDex) and TRAnsfer of Populations in DOuble Resonance Echo Dephasing (TRAPDORED) NMR. These methods reintroduce a heteronuclear dipolar interaction that is specific to the solvent, thereby forcing a dephasing of recoupled solvent spins and leaving acquired NMR spectra free of associated resonance overlap with the analyte. The potency of these methods is demonstrated on sample types common to MAS-DNP experiments, namely a frozen solution (of L-proline) and a powdered solid (progesterone), both containing deuterated glycerol as a DNP solvent. The proposed methods are efficient, simple to implement, compatible with other NMR experiments, and extendable beyond spectral editing of DNP solvents. The sensitivity gains from MAS-DNP in conjunction with FEDex or TRAPDORED then permit rapid and uninterrupted sample analysis.

  1. Pre-analytical method for NMR-based grape metabolic fingerprinting and chemometrics.

    PubMed

    Ali, Kashif; Maltese, Federica; Fortes, Ana Margarida; Pais, Maria Salomé; Verpoorte, Robert; Choi, Young Hae

    2011-10-10

    Although metabolomics aims at profiling all the metabolites in organisms, data quality is quite dependent on the pre-analytical methods employed. In order to evaluate current methods, different pre-analytical methods were compared and used for the metabolic profiling of grapevine as a model plant. Five grape cultivars from Portugal were analyzed in this study in combination with chemometrics. A common extraction method with deuterated water and methanol was found effective in the case of amino acids, organic acids, and sugars. For secondary metabolites like phenolics, solid phase extraction with C-18 cartridges showed good results. Principal component analysis, in combination with NMR spectroscopy, was applied and showed clear distinction among the cultivars. Primary metabolites such as choline, sucrose, and leucine were found discriminating for 'Alvarinho', while elevated levels of alanine, valine, and acetate were found in 'Arinto' (white varieties). Among the red cultivars, higher signals for citrate and GABA in 'Touriga Nacional', succinate and fumarate in 'Aragonês', and malate, ascorbate, fructose and glucose in 'Trincadeira', were observed. Based on the phenolic profile, 'Arinto' was found to have higher levels of phenolics compared with 'Alvarinho'. 'Trincadeira' showed the lowest phenolics content, while higher levels of flavonoids and phenylpropanoids were found in 'Aragonês' and 'Touriga Nacional', respectively. It is shown that the metabolite composition of the extract is highly affected by the extraction procedure, and this consideration has to be taken into account for metabolomics studies. Copyright © 2011 Elsevier B.V. All rights reserved.
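The principal component analysis step used here to separate cultivars can be sketched with a small synthetic data table. The 5x4 matrix below is invented for illustration, not NMR bucket data:

```python
import numpy as np

# Minimal PCA via SVD: mean-center the samples-by-variables matrix
# and project onto the top principal axes; sample scores on PC1/PC2
# are what is plotted to look for cultivar clustering.

def pca_scores(X, n_components=2):
    """Scores of each row of X on the leading principal components."""
    Xc = X - X.mean(axis=0)                       # mean-center variables
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T               # project onto top PCs

X = np.array([[1.0, 0.2, 3.1, 0.5],
              [1.1, 0.1, 3.0, 0.4],
              [4.0, 2.2, 0.9, 1.5],
              [4.2, 2.1, 1.0, 1.6],
              [2.5, 1.0, 2.0, 1.0]])
scores = pca_scores(X)
```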

  2. Development of an analytical method to assess the occupational health risk of therapeutic monoclonal antibodies using LC-HRMS.

    PubMed

    Reinders, Lars M H; Klassen, Martin D; Jaeger, Martin; Teutenberg, Thorsten; Tuerk, Jochen

    2018-04-01

    Monoclonal antibodies are a group of commonly used therapeutics whose occupational health risk is still controversial. The side effects of long-term low-dose exposure are insufficiently evaluated; hence, discussions are often theoretical or based on extrapolating side effects from therapeutic dosages. While some research groups recommend applying the precautionary principle for monoclonal antibodies, others consider the exposure risk too low for measures taken towards occupational health and safety. However, both groups agree that airborne monoclonal antibodies have the biggest risk potential. Therefore, we developed a peptide-based analytical method for occupational exposure monitoring of airborne monoclonal antibodies. The method will allow the collection of data on occupational exposure to monoclonal antibodies. Thus, the mean daily intake for personnel in pharmacies and the pharmaceutical industry can be determined for the first time and will help to substantiate the risk assessment with relevant data. The introduced monitoring method includes air sampling, sample preparation and detection by liquid chromatography coupled with high-resolution mass spectrometry of individual monoclonal antibodies as well as a sum parameter. For method development and validation, a chimeric (rituximab), a humanised (trastuzumab) and a fully humanised (daratumumab) monoclonal antibody were used. A limit of detection between 1 μg per sample for daratumumab and 25 μg per sample for the collective peptide was achieved. Graphical abstract Demonstration of the analytical workflow, from the release of monoclonal antibodies to their detection as single substances as well as a sum parameter.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahoo, Satiprasad; Dhar, Anirban, E-mail: anirban.dhar@gmail.com; Kar, Amlanjyoti

    Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, regional environmental vulnerability assessment in Hirakud command area of Odisha, India is envisaged based on Grey Analytic Hierarchy Process method (Grey–AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey–AHP combines the advantages of classical analytic hierarchy process (AHP) and grey clustering method for accurate estimation of weight coefficients. It is a new method for environmental vulnerability assessment. Environmental vulnerability index (EVI) uses natural, environmental and human impact related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. EVI map has been classified into four environmental vulnerability zones (EVZs) namely: ‘low’, ‘moderate’ ‘high’, and ‘extreme’ encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view. EVI map shows close correlation with elevation. Effectiveness of the zone classification is evaluated by using grey clustering method. General effectiveness is in between “better” and “common classes”. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on Grey Analytic Hierarchy Process (AHP) • The effectiveness evaluation by means of a grey clustering method with support from AHP • Use of grey approach eliminates the excessive dependency on the experience of experts.
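The classical AHP step inside Grey-AHP derives factor weights as the principal eigenvector of a pairwise-comparison matrix. A minimal sketch, assuming a hypothetical 3x3 comparison of three vulnerability factors (the matrix entries are invented):

```python
import numpy as np

# AHP weighting sketch: for a reciprocal pairwise-comparison matrix A
# (A[i, j] = judged importance of factor i relative to factor j), the
# weight vector is the normalized principal right eigenvector.

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # principal eigenvector
    v = np.abs(v)                                   # fix arbitrary sign
    return v / v.sum()                              # normalize to sum 1

# Hypothetical judgments, e.g. rainfall vs slope vs soil.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
w = ahp_weights(A)
```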

  4. Spectrophotometric Investigations of Macrolide Antibiotics: A Brief Review

    PubMed Central

    Keskar, Mrudul R; Jugade, Ravin M

    2015-01-01

    Macrolides, one of the most commonly used classes of antibiotics, are a group of drugs produced by Streptomyces species. They belong to the polyketide class of natural products. Their activity is due to the presence of a large macrolide lactone ring with deoxy sugar moieties. They are protein synthesis inhibitors and broad-spectrum antibiotics, active against both gram-positive and gram-negative bacteria. Different analytical techniques have been reported for the determination of macrolides, such as chromatographic methods, flow injection methods, spectrofluorometric methods, spectrophotometric methods, and capillary electrophoresis methods. Among these methods, spectrophotometric methods are sensitive and cost effective for the analysis of various antibiotics in pharmaceutical formulations as well as biological samples. This article reviews different spectrophotometric methods for the determination of macrolide antibiotics. PMID:26609215
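The spectrophotometric determinations reviewed here ultimately rest on the Beer-Lambert law, which is worth a one-line sketch (the example absorbance and molar absorptivity are invented numbers, not values from the review):

```python
# Beer-Lambert law: A = epsilon * l * c, so an unknown concentration
# is read off a calibration line as c = A / (epsilon * l).

def molar_concentration(absorbance, epsilon, path_cm=1.0):
    """Concentration (mol/L) from absorbance, molar absorptivity
    epsilon (L mol^-1 cm^-1) and cuvette path length (cm)."""
    return absorbance / (epsilon * path_cm)

# Hypothetical assay: A = 0.5 with epsilon = 1.0e4 L/(mol cm).
c = molar_concentration(0.5, 1.0e4)
```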

  5. Effectiveness of focused source generation methods with consideration of interaural time and level difference.

    PubMed

    Zheng, Jianwen; Lu, Jing; Chen, Kai

    2013-07-01

    Several methods have been proposed for the generation of a focused source, usually a virtual monopole source positioned between the loudspeaker array and the listener. The problem of pre-echoes in the common analytical methods has been noted, and the most concise method for coping with this problem is the angular weight method. In this paper, the interaural time and level differences, which are closely related to the localization cues of human auditory systems, are used to further investigate the effectiveness of focused source generation methods. It is demonstrated that the combination of the angular weight method and the numerical pressure matching method has comparatively better performance in a given reconstructed area.
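One standard analytical model for the interaural time difference cue mentioned above is Woodworth's spherical-head formula; a sketch with an assumed head radius (the paper's own evaluation procedure is not reproduced here):

```python
import math

# Woodworth spherical-head model: for a distant source at azimuth
# theta, ITD = (a / c) * (sin(theta) + theta), with head radius a
# and speed of sound c. Head radius below is an assumed typical value.

def itd_woodworth(azimuth_rad, head_radius=0.0875, c=343.0):
    """Interaural time difference in seconds for a far-field source."""
    return head_radius / c * (math.sin(azimuth_rad) + azimuth_rad)

front = itd_woodworth(0.0)          # source straight ahead -> zero ITD
side = itd_woodworth(math.pi / 2)   # source at 90 degrees, ~0.66 ms
```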

  6. Cross-reactivity by botanicals used in dietary supplements and spices using the multiplex xMAP food allergen detection assay (xMAP FADA).

    PubMed

    Pedersen, Ronnie O; Nowatzke, William L; Cho, Chung Y; Oliver, Kerry G; Garber, Eric A E

    2018-06-18

    Food allergies affect some 15 million Americans. The only treatment for food allergies is a strict avoidance diet. To help ensure the reliability of food labels, analytical methods are employed, the most common being enzyme-linked immunosorbent assays (ELISAs). However, the commonly employed ELISAs are single-analyte specific and cannot distinguish between false positives due to cross-reactive homologous proteins, making the method of questionable utility for regulatory purposes when analyzing for unknown or multiple food allergens. Also, should the need arise to detect additional analytes, extensive research must be undertaken to develop new ELISAs. To address these and other limitations, a multiplex immunoassay, the xMAP® food allergen detection assay (xMAP FADA), was developed using 30 different antibodies against 14 different food allergens plus gluten. Besides incorporating two antibodies for the detection of most analytes, the xMAP FADA also relies on two different extraction protocols, providing multiple confirmatory end-points. Using the xMAP FADA, the cross-reactivities of 45 botanicals used in dietary supplements and spices commercially sold in the USA were assessed. Only a few displayed cross-reactivities with the antibodies in the xMAP FADA at levels exceeding 0.0001%. The utility of the xMAP FADA was exemplified by its ability to detect and distinguish between betel nut, saw palmetto, and acai, which are in the same family as coconut. Other botanicals examined included allspice, amchur, anise seed, black pepper, caraway seed, cardamom, cayenne red pepper, sesame seed, poppy seed, white pepper, and wheat grass. The combination of direct antibody detection, multi-antibody profiling, high sensitivity, and a modular design made it possible for the xMAP FADA to distinguish between homologous antigens, provide multiple levels of built-in confirmatory analysis, and optimize the bead set cocktail to address specific needs.

  7. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    PubMed Central

    Wei, Haoran; Vikesland, Peter J.

    2015-01-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection. PMID:26658696
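The pH trigger described above follows from the Henderson-Hasselbalch relation: below its pKa, a basic analyte is predominantly protonated and hence positively charged, enabling electrostatic alignment on the AuNP surface. A minimal sketch using the CBZ pKa quoted in the abstract:

```python
# Henderson-Hasselbalch sketch for a monoprotic base: the fraction in
# the protonated (charged) form is 1 / (1 + 10^(pH - pKa)).

def fraction_protonated(ph, pka):
    """Fraction of a monoprotic base present as its protonated form."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

# Carbamazepine, pKa 2.3 (from the abstract):
below = fraction_protonated(1.5, 2.3)  # pH < pKa -> mostly charged
above = fraction_protonated(7.0, 2.3)  # neutral pH -> mostly uncharged
```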

  8. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    NASA Astrophysics Data System (ADS)

    Wei, Haoran; Vikesland, Peter J.

    2015-12-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection.

  9. Analytical interference of 4-hydroxy-3-methoxymethamphetamine with the measurement of plasma free normetanephrine by ultra-high pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Dunand, Marielle; Donzelli, Massimiliano; Rickli, Anna; Hysek, Cédric M; Liechti, Matthias E; Grouzmann, Eric

    2014-08-01

    The diagnosis of pheochromocytoma relies on the measurement of plasma free metanephrines, an assay whose reliability has been considerably improved by ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Here we report an analytical interference between 4-hydroxy-3-methoxymethamphetamine (HMMA), a metabolite of 3,4-methylenedioxymethamphetamine (MDMA, "Ecstasy"), and normetanephrine (NMN), since they share a common pharmacophore resulting in the same product ion after fragmentation. Synthetic HMMA was spiked into plasma samples containing various concentrations of NMN, and the intensity of the interference was determined by UHPLC-MS/MS before and after improvement of the analytical method. Through careful adjustment of the chromatographic conditions, including a change of the UHPLC analytical column, we were able to distinguish the two compounds. HMMA interference in NMN determination should be seriously considered, since MDMA activates the sympathetic nervous system and, if HMMA is confounded with NMN, may lead to false-positive tests in the differential diagnosis of pheochromocytoma. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  10. Double power series method for approximating cosmological perturbations

    NASA Astrophysics Data System (ADS)

    Wren, Andrew J.; Malik, Karim A.

    2017-04-01

    We introduce a double power series method for finding approximate analytical solutions for systems of differential equations commonly found in cosmological perturbation theory. The method was set out, in a noncosmological context, by Feshchenko, Shkil' and Nikolenko (FSN) in 1966, and is applicable to cases where perturbations are on subhorizon scales. The FSN method is essentially an extension of the well-known Wentzel-Kramers-Brillouin (WKB) method for finding approximate analytical solutions for ordinary differential equations. The FSN method we use is applicable well beyond perturbation theory to solve systems of ordinary differential equations, linear in the derivatives, that also depend on a small parameter, which here we take to be related to the inverse wave-number. We use the FSN method to find new approximate oscillating solutions in linear order cosmological perturbation theory for a flat radiation-matter universe. Together with this model's well-known growing and decaying Mészáros solutions, these oscillating modes provide a complete set of subhorizon approximations for the metric potential, radiation and matter perturbations. Comparison with numerical solutions of the perturbation equations shows that our approximations can be made accurate to within a typical error of 1%, or better. We also set out a heuristic method for error estimation. A Mathematica notebook which implements the double power series method is made available online.
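A minimal sketch of the leading-order WKB idea that the FSN method extends: for y'' + k²q(x)y = 0 with large k, the mode y ≈ q^(-1/4) cos(k∫√q dx) tracks the numerical solution. The coefficient q(x) = 1 + x below is an arbitrary illustrative choice, not a cosmological model:

```python
import math

def q(x):
    """Illustrative slowly varying coefficient."""
    return 1.0 + x

def wkb(x, k):
    """Leading-order WKB mode q^(-1/4) * cos(k * S(x)), where
    S(x) = integral_0^x sqrt(1 + t) dt = (2/3)((1+x)^1.5 - 1)."""
    phase = k * (2.0 / 3.0) * ((1.0 + x) ** 1.5 - 1.0)
    return q(x) ** -0.25 * math.cos(phase)

def numeric(k, x_end=1.0, n=20000):
    """RK4 for y'' = -k^2 q(x) y, started from the WKB initial data
    y(0) = 1, y'(0) = -1/4 (derivative of the WKB mode at x = 0)."""
    h = x_end / n
    x, y, v = 0.0, 1.0, -0.25

    def f(x, y, v):
        return v, -k * k * q(x) * y

    for _ in range(n):
        k1y, k1v = f(x, y, v)
        k2y, k2v = f(x + h / 2, y + h / 2 * k1y, v + h / 2 * k1v)
        k3y, k3v = f(x + h / 2, y + h / 2 * k2y, v + h / 2 * k2v)
        k4y, k4v = f(x + h, y + h * k3y, v + h * k3v)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        x += h
    return y

k = 100.0                              # "subhorizon" regime: large k
err = abs(numeric(k) - wkb(1.0, k))    # small for large k
```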

  11. Stopping power and dose calculations with analytical and Monte Carlo methods for protons and prompt gamma range verification

    NASA Astrophysics Data System (ADS)

    Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet

    2018-07-01

    In this study, we have performed calculations of stopping power, depth dose, and range verification for proton beams using dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. As analytical approaches, the Drude model was applied within dielectric theory, and the effective charge approach with Roothaan-Hartree-Fock charge densities was used within Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. The lung and breast tissues investigated are related to the most common types of cancer throughout the world. The results were compared with each other and with the available data in the literature. In addition, the obtained results were verified against prompt gamma range data. For both stopping power values and depth-dose distributions, it was found that the Monte Carlo values give better results than the analytical ones; the results that agree best with ICRU data in terms of stopping power are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth-dose distributions of the examined tissues, although the Bragg curves for the Monte Carlo codes almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verification against prompt gamma photon results was attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results are within 2%-5% and the Monte Carlo values within 0%-2% of those of the prompt gammas.
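The analytical side of such calculations can be illustrated with the basic Bethe stopping-power formula for protons in water (a simplified sketch: no shell, density, or effective-charge corrections; the mean excitation energy I = 75 eV and Z/A for water are standard tabulated values, not taken from this record):

```python
import math

# Simple Bethe mass stopping power for protons:
#   -dE/d(rho*x) = K (Z/A) (1/beta^2) [ln(2 m_e c^2 beta^2 gamma^2 / I) - beta^2]

ME_C2 = 0.510999e6   # electron rest energy, eV
MP_C2 = 938.272e6    # proton rest energy, eV
K = 0.307075         # Bethe constant, MeV cm^2/mol

def bethe_water(E_MeV, Z_over_A=0.5551, I_eV=75.0):
    """Mass stopping power in MeV cm^2/g for a proton of kinetic
    energy E_MeV in water."""
    gamma = 1.0 + E_MeV * 1e6 / MP_C2
    beta2 = 1.0 - 1.0 / gamma ** 2
    arg = 2.0 * ME_C2 * beta2 * gamma ** 2 / I_eV
    return K * Z_over_A / beta2 * (math.log(arg) - beta2)

s100 = bethe_water(100.0)   # ~7.3 MeV cm^2/g, near ICRU-table values
```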

  12. The Analytical Limits of Modeling Short Diffusion Timescales

    NASA Astrophysics Data System (ADS)

    Bradshaw, R. W.; Kent, A. J.

    2016-12-01

    Chemical and isotopic zoning in minerals is widely used to constrain the timescales of magmatic processes, such as magma mixing and crystal residence, via diffusion modeling. Forward modeling of diffusion relies on fitting diffusion profiles to measured compositional gradients. However, an individual measurement is essentially an average composition for a segment of the gradient defined by the spatial resolution of the analysis. Thus the analytical spatial resolution can potentially limit the timescales that can be determined for an element of given diffusivity, particularly where the scale of the gradient approaches that of the measurement. Here we use a probabilistic modeling approach to investigate the effect of analytical spatial resolution on timescales estimated from diffusion modeling. Our method investigates how accurately the age of a synthetic diffusion profile can be obtained by modeling an "unknown" profile derived from discrete sampling of the synthetic compositional gradient at a given spatial resolution. We also include the effects of analytical uncertainty and of the position of measurements relative to the diffusion gradient. We apply this method to the spatial resolutions of common microanalytical techniques (LA-ICP-MS, SIMS, EMP, NanoSIMS). Our results confirm that for a given diffusivity, higher spatial resolution gives access to shorter timescales, and that each analytical spacing has a minimum timescale, below which it overestimates the timescale. For example, for Ba diffusion in plagioclase at 750 °C, timescales are accurate (within 20%) above 10, 100, 2,600, and 71,000 years at 0.3, 1, 5, and 25 μm spatial resolution, respectively. For Sr diffusion in plagioclase at 750 °C, timescales are accurate above 0.02, 0.2, 4, and 120 years at the same spatial resolutions. Our results highlight the importance of selecting appropriate analytical techniques for estimating accurate diffusion-based timescales.
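The direction of the resolution bias can be captured by a back-of-the-envelope variance argument (a sketch, not the paper's probabilistic method): averaging an erf-shaped diffusion profile over a spot of width w adds roughly w²/12 to the profile variance 2Dt, so the recovered timescale is inflated by about w²/(24D). The diffusivity below is illustrative, not the paper's Ba or Sr value:

```python
# Approximate apparent timescale recovered from a boxcar-averaged
# erf profile: variance 2*D*t gains ~w^2/12 from the averaging, so
# t_apparent ~ t + w^2 / (24*D). Units: w in um, D in um^2/yr, t in yr.

def apparent_time(t, w, D):
    """Roughly recovered timescale for true time t and spot width w."""
    return t + w * w / (24.0 * D)

D = 1.0e-2  # hypothetical diffusivity, um^2/yr
t_fine = apparent_time(10.0, 0.3, D)   # small spot: modest bias
t_coarse = apparent_time(10.0, 5.0, D) # large spot: order-of-magnitude bias
```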

  13. Meta-analysis as Statistical and Analytical Method of Journal’s Content Scientific Evaluation

    PubMed Central

    Masic, Izet; Begic, Edin

    2015-01-01

    Introduction: A meta-analysis is a statistical and analytical method that combines and synthesizes different independent studies and integrates their results into one common result. Goal: Analysis of the journals “Medical Archives”, “Materia Socio Medica” and “Acta Informatica Medica”, which are indexed in the most eminent databases of the biomedical milieu. Material and methods: The study was retrospective and descriptive in character and covered the calendar year 2014. It included six issues of each of the three journals (18 issues in total). Results: A total of 291 articles was published in this period (110 in the “Medical Archives”, 97 in “Materia Socio Medica” and 84 in “Acta Informatica Medica”). Original articles made up the largest number; smaller numbers were published as professional articles, review articles and case reports. Clinical topics were most common in the first two journals, while articles in the journal “Acta Informatica Medica” belonged to the field of medical informatics, as part of the pre-clinical medical disciplines. Articles usually required a period of fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe. The authors were most often from the territory of Bosnia and Herzegovina, followed by Iran, Kosovo and Macedonia. Conclusion: The number of articles published each year is increasing, with greater participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broader spectrum of topics and a growing number of original articles. Greater support from the wider scientific community is needed for the further development of all three of the aforementioned journals. PMID:25870484

  14. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food.

    PubMed

    Della Pelle, Flavio; Compagnone, Dario

    2018-02-04

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of applications in real samples, in addition to the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate that NM-based approaches represent valid alternatives to classical methods for polyphenol analysis and are mature enough to be integrated into rapid food quality assessment, in the lab or directly in the field.

  15. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food

    PubMed Central

    2018-01-01

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of applications in real samples, in addition to the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate that NM-based approaches represent valid alternatives to classical methods for polyphenol analysis and are mature enough to be integrated into rapid food quality assessment, in the lab or directly in the field. PMID:29401719

  16. Selective identification and quantification of saccharin by liquid chromatography and fluorescence detection.

    PubMed

    Bruno, Sergio N F; Cardoso, Carlos R; Maciel, Márcia Mosca A; Vokac, Lidmila; da Silva Junior, Ademário I

    2014-09-15

    High-pressure liquid chromatography with ultraviolet detection (HPLC-UV) is one of the most commonly used methods to identify and quantify saccharin in non-alcoholic beverages. However, due to the wide variety of interfering UV spectra in saccharin-containing beverage matrices, this method cannot always measure the analyte accurately. We have developed a new, highly effective method to identify and quantify saccharin using HPLC with fluorescence detection (HPLC-FLD). The chosen excitation wavelength (250 nm) and emission wavelength (440 nm) increased selectivity for all matrices and ensured that few changes were required in the mobile phase or other parameters. The presence of saccharin in non-diet beverages - a common fraud in which it replaces more expensive sucrose - was confirmed by comparing coincident peaks as well as the emission spectra of standards and samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Regional Input-Output Tables and Trade Flows: an Integrated and Interregional Non-survey Approach

    DOE PAGES

    Boero, Riccardo; Edwards, Brian Keith; Rivera, Michael Kelly

    2017-03-20

    Regional input–output tables and trade flows: an integrated and interregional non-survey approach. Regional Studies. Regional analyses require detailed and accurate information about dynamics happening within and between regional economies. However, regional input–output tables and trade flows are rarely observed and they must be estimated using up-to-date information. Common estimation approaches vary widely but consider tables and flows independently. Here, by using commonly used economic assumptions and available economic information, this paper presents a method that integrates the estimation of regional input–output tables and trade flows across regions. Examples of the method implementation are presented and compared with other approaches, suggesting that the integrated approach provides advantages in terms of estimation accuracy and analytical capabilities.
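
    As background to the non-survey estimation problem, the sketch below shows one commonly used building block: the simple location quotient (SLQ) adjustment of national technical coefficients. The three-sector numbers are hypothetical, and this is not the integrated method proposed in the paper.

```python
# National technical coefficients (hypothetical 3-sector example).
A_national = [
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
]
# Simple location quotients: regional output share of a sector / national share.
lq = [1.4, 0.6, 0.9]

def slq_regionalize(A, lq):
    # Scale each supplying row i by min(1, LQ_i): sectors under-represented in
    # the region (LQ < 1) are assumed to import part of the inputs they supply.
    return [[A[i][j] * min(1.0, lq[i]) for j in range(len(A))] for i in range(len(A))]

A_region = slq_regionalize(A_national, lq)
```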

  18. Regional Input-Output Tables and Trade Flows: an Integrated and Interregional Non-survey Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith; Rivera, Michael Kelly

    Regional input–output tables and trade flows: an integrated and interregional non-survey approach. Regional Studies. Regional analyses require detailed and accurate information about dynamics happening within and between regional economies. However, regional input–output tables and trade flows are rarely observed and they must be estimated using up-to-date information. Common estimation approaches vary widely but consider tables and flows independently. Here, by using commonly used economic assumptions and available economic information, this paper presents a method that integrates the estimation of regional input–output tables and trade flows across regions. Examples of the method implementation are presented and compared with other approaches, suggesting that the integrated approach provides advantages in terms of estimation accuracy and analytical capabilities.

  19. Monte Carlo simulation of the radiant field produced by a multiple-lamp quartz heating system

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    1991-01-01

    A method is developed for predicting the radiant heat flux distribution produced by a reflected bank of tungsten-filament tubular-quartz radiant heaters. The method is correlated with experimental results from two cases, one consisting of a single lamp and a flat reflector and the other consisting of a single lamp and a parabolic reflector. The simulation methodology, computer implementation, and experimental procedures are discussed. Analytical refinements necessary for comparison with experiment are discussed and applied to a multilamp, common reflector heating system.
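
    The core of such a simulation is Monte Carlo ray counting. The sketch below is a simplified illustration (a single point source and a flat rectangular target, no reflector or spectral model, all geometry assumed): it estimates the fraction of isotropically emitted rays striking the target and checks it against the analytic solid-angle formula for a rectangle viewed from a point above its center.

```python
import math, random

def mc_hit_fraction(n, half_a=1.0, half_b=1.0, h=1.0, seed=1):
    # Fraction of isotropically emitted rays from a point at height h that
    # strike a rectangle of half-widths (half_a, half_b) centered below it.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z = rng.uniform(-1.0, 1.0)            # uniform direction on the sphere
        phi = rng.uniform(0.0, 2.0 * math.pi)
        if z >= 0.0:
            continue                          # upward rays cannot hit the target
        r = math.sqrt(1.0 - z * z)
        dx, dy = r * math.cos(phi), r * math.sin(phi)
        t = h / -z                            # ray-plane intersection at z = 0
        if abs(dx * t) <= half_a and abs(dy * t) <= half_b:
            hits += 1
    return hits / n

frac = mc_hit_fraction(200_000)
# Analytic solid angle of a rectangle seen from a point above its center.
a, b, h = 1.0, 1.0, 1.0
omega = 4.0 * math.atan(a * b / (h * math.sqrt(a * a + b * b + h * h)))
exact = omega / (4.0 * math.pi)  # = 1/6 for a = b = h = 1
```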

  20. A simple validated multi-analyte method for detecting drugs in oral fluid by ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS).

    PubMed

    Zheng, Yufang; Sparve, Erik; Bergström, Mats

    2018-06-01

    A UPLC-MS/MS method was developed to identify and quantitate 37 commonly abused drugs in oral fluid. Drugs of interest included amphetamines, benzodiazepines, cocaine, opiates, opioids, phencyclidine and tetrahydrocannabinol. Sample preparation and extraction are simple, and analysis times are short. Validation showed satisfactory performance at relevant concentrations. The possibility of contaminated samples, as well as interpretation in relation to well-known matrices such as urine, will demand further study. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Two stage algorithm vs commonly used approaches for the suspect screening of complex environmental samples analyzed via liquid chromatography high resolution time of flight mass spectroscopy: A test study.

    PubMed

    Samanipour, Saer; Baz-Lomba, Jose A; Alygizakis, Nikiforos A; Reid, Malcolm J; Thomaidis, Nikolaos S; Thomas, Kevin V

    2017-06-09

    LC-HR-QTOF-MS has recently become a commonly used approach for the analysis of complex samples. However, identification of small organic molecules in complex samples with the highest level of confidence is a challenging task. Here we report on the implementation of a two stage algorithm for LC-HR-QTOF-MS datasets. We compared the performance of the two stage algorithm, implemented via NIVA_MZ_Analyzer™, with two commonly used approaches (i.e. feature detection and XIC peak picking, implemented via UNIFI by Waters and TASQ by Bruker, respectively) for the suspect analysis of four influent wastewater samples. We first evaluated the cross-platform compatibility of LC-HR-QTOF-MS datasets generated via instruments from two different manufacturers (i.e. Waters and Bruker). Our data showed that, with an appropriate spectral weighting function, the spectra recorded by the two tested instruments are comparable for our analytes. As a consequence, we were able to perform full spectral comparison between the data generated via the two studied instruments. Four extracts of wastewater influent were analyzed for 89 analytes, giving 356 detection cases in total. The analytes were divided into 158 detection cases of artificial suspect analytes (i.e. verified by target analysis) and 198 true suspects. The two stage algorithm resulted in a zero rate of false positive detection, based on the artificial suspect analytes, while producing a rate of false negative detection of 0.12. For the conventional approaches, the rates of false positive detection varied between 0.06 for UNIFI and 0.15 for TASQ. The rates of false negative detection for these methods ranged between 0.07 for TASQ and 0.09 for UNIFI. The effect of background signal complexity on the two stage algorithm was evaluated through the generation of a synthetic signal. We further discuss the boundaries of applicability of the two stage algorithm.
The importance of background knowledge and experience in evaluating the reliability of results during the suspect screening was evaluated. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. 
Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
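
    The screening metrics the abstract notes are absent from IBMWA output are straightforward to compute by hand from a confusion matrix. The sketch below uses hypothetical counts and a standard Wald interval for the odds ratio.

```python
import math

def diagnostics(tp, fp, fn, tn):
    # Standard metrics derivable from a 2x2 confusion matrix.
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn)
    # Wald 95% confidence interval for the odds ratio, computed on the log scale.
    se = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se)
    return {"sensitivity": sens, "specificity": spec,
            "odds_ratio": odds_ratio, "or_95ci": (lo, hi)}

m = diagnostics(tp=80, fp=10, fn=20, tn=90)  # hypothetical counts
```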

  3. Consensus Classification Using Non-Optimized Classifiers.

    PubMed

    Brownfield, Brett; Lemos, Tony; Kalivas, John H

    2018-04-03

    Classifying samples into categories is a common problem in analytical chemistry and other fields. Classification is usually based on only one method, but numerous classifiers are available, some complex, such as neural networks, and others simple, such as k-nearest neighbors. Regardless, most classification schemes require optimization of one or more tuning parameters for best classification accuracy, sensitivity, and specificity. A process not requiring exact selection of tuning parameter values would be useful. To improve classification, several ensemble approaches have been used in past work to combine classification results from multiple optimized single classifiers. The collection of classifications for a particular sample is then combined by a fusion process such as majority vote to form the final classification. Presented in this Article is a method to classify a sample by combining multiple classification methods without specifically classifying the sample by each method; that is, the classification methods are not optimized. The approach is demonstrated on three analytical data sets. The first is a beer authentication set with samples measured on five instruments, allowing fusion of multiple instruments in three ways. The second data set is composed of textile samples from three classes based on Raman spectra. This data set is used to demonstrate the ability to classify simultaneously with different data preprocessing strategies, thereby reducing the need to determine the ideal preprocessing method, a common prerequisite for accurate classification. The third data set contains three wine cultivars for three classes measured on 13 unique chemical and physical variables. In all cases, fusion of nonoptimized classifiers improves classification. Also presented are atypical uses of Procrustes analysis and extended inverted signal correction (EISC) for distinguishing sample similarities to respective classes.
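
    The fusion idea can be illustrated with a toy example. The sketch below is not the Article's non-optimized consensus method; it simply combines three simple classifiers (nearest-mean on raw features, nearest-mean on arbitrarily rescaled features standing in for a second preprocessing, and 1-NN) by majority vote.

```python
from collections import Counter
import math

def nearest_mean(train, labels, x):
    # Classify by the closest class centroid.
    best, best_d = None, float("inf")
    for c in set(labels):
        pts = [p for p, l in zip(train, labels) if l == c]
        mean = [sum(col) / len(pts) for col in zip(*pts)]
        d = math.dist(x, mean)
        if d < best_d:
            best, best_d = c, d
    return best

def one_nn(train, labels, x):
    # Classify by the single nearest training sample.
    return min(zip(train, labels), key=lambda pl: math.dist(x, pl[0]))[1]

def scaled(v, s):
    return [a * b for a, b in zip(v, s)]

def fused_predict(train, labels, x):
    # Majority vote over three member classifiers; the rescaling plays the
    # role of an alternative preprocessing of the same data.
    s = [1.0, 0.2]  # arbitrary per-feature rescaling
    votes = [
        nearest_mean(train, labels, x),
        nearest_mean([scaled(p, s) for p in train], labels, scaled(x, s)),
        one_nn(train, labels, x),
    ]
    return Counter(votes).most_common(1)[0][0]

train = [[0.0, 0.0], [1.0, 0.2], [0.2, 0.1], [5.0, 5.0], [5.5, 4.8], [4.8, 5.2]]
labels = ["a", "a", "a", "b", "b", "b"]
```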

  4. The Effect of Multispectral Image Fusion Enhancement on Human Efficiency

    DTIC Science & Technology

    2017-03-20

    human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish...applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows...augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of

  5. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand, while also allowing on-site forensic investigation, is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and these are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  6. Circular Functions Based Comprehensive Analysis of Plastic Creep Deformations in the Fiber Reinforced Composites

    NASA Astrophysics Data System (ADS)

    Monfared, Vahid

    2016-12-01

    An analytically based model is presented for behavioral analysis of plastic deformations in reinforced materials using circular (trigonometric) functions. The analytical method is proposed to predict the creep behavior of fibrous composites based on basic and constitutive equations under a tensile axial stress. The new insight of the work is the prediction of some important behaviors of the creeping matrix. In the present model, the prediction of these behaviors is simpler than with the available methods. Principal creep strain rate behavior is very noteworthy for designing fibrous composites that creep. Analysis of this behavior in reinforced materials is necessary for failure, fracture, and fatigue studies in the creep of short fiber composites. Shuttles, spaceships, turbine blades and discs, and nozzle guide vanes are commonly subjected to creep effects. Also, predicting creep behavior is significant for designing optoelectronic and photonic advanced composites with optical fibers. As a result, uniform behavior with constant gradient is seen in the principal creep strain rate, and creep rupture may happen at the fiber end. Finally, good agreement is found when comparing the obtained analytical and FEM results.

  7. Reference materials for cellular therapeutics.

    PubMed

    Bravery, Christopher A; French, Anna

    2014-09-01

    The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  8. Laser Ablation in situ (U-Th-Sm)/He and U-Pb Double-Dating of Apatite and Zircon: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.

    2015-12-01

    We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm² fluence; He pit volume measurement using atomic force microscopy; and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data, and Iolite software was used for U-Pb age and trace element determination.
    In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia, and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J. Analytical Atomic Spectrometry, 30, 1636-1645.
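
    The (U-Th-Sm)/He dating underlying this work rests on the standard helium-ingrowth equation, which is transcendental in time. The sketch below is a simplified illustration (standard decay constants, arbitrary parent amounts, no alpha-ejection or zoning corrections, and not the authors' freeware): it forward-computes He for a known age and recovers that age by bisection.

```python
import math

# Decay constants in 1/yr (standard values).
L238, L235, L232, L147 = 1.55125e-10, 9.8485e-10, 4.9475e-11, 6.539e-12

def he_produced(t, u238, th232, sm147, u235_ratio=137.88):
    # Alpha (4He) production over time t, amounts in consistent molar units:
    # 8 alphas per 238U chain, 7 per 235U, 6 per 232Th, 1 per 147Sm.
    u235 = u238 / u235_ratio
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1)
            + 1 * sm147 * (math.exp(L147 * t) - 1))

def he_age(he, u238, th232, sm147, lo=0.0, hi=4.5e9):
    # The age equation cannot be inverted in closed form; solve by bisection
    # (he_produced is monotonically increasing in t).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, th232, sm147) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

he = he_produced(1.0e8, u238=10.0, th232=20.0, sm147=50.0)  # 100 Myr synthetic grain
age = he_age(he, 10.0, 20.0, 50.0)
```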

  9. Calibration and accuracy analysis of a focused plenoptic camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2014-08-01

    In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression for the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated using a method already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve-fitting approach based on Taylor series approximation. Both model-based methods show significant advantages compared to the curve-fitting method: they need fewer reference points for calibration and, moreover, supply a function which is valid beyond the calibration range. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and compared to the analytical evaluation.
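
    The claimed advantage of a model-based calibration over Taylor-series curve fitting, validity beyond the calibration range, can be illustrated on synthetic data. The depth model z = a + b/v below is an assumed stand-in for a physically motivated projection model, not the paper's actual one; both fits use plain least squares via the normal equations.

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for small linear systems.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lstsq(basis_rows, y):
    # Normal equations: (B^T B) x = B^T y.
    n = len(basis_rows[0])
    BtB = [[sum(r[i] * r[j] for r in basis_rows) for j in range(n)] for i in range(n)]
    Bty = [sum(r[i] * yi for r, yi in zip(basis_rows, y)) for i in range(n)]
    return solve(BtB, Bty)

# Synthetic calibration: metric depth z vs. estimated virtual depth v,
# generated from the assumed model z = 2 + 3/v.
vs = [1.0 + 0.1 * i for i in range(11)]                 # calibration range [1, 2]
zs = [2.0 + 3.0 / v for v in vs]

a, b = lstsq([[1.0, 1.0 / v] for v in vs], zs)          # model-based fit
c0, c1, c2 = lstsq([[1.0, v, v * v] for v in vs], zs)   # Taylor-style quadratic fit

v_test = 4.0                                            # beyond calibration range
z_true = 2.0 + 3.0 / v_test
err_model = abs(a + b / v_test - z_true)
err_poly = abs(c0 + c1 * v_test + c2 * v_test ** 2 - z_true)
```

On the calibration range both fits are accurate; extrapolated to v = 4, the polynomial degrades while the model-based function stays exact.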

  10. Data Analytics in Procurement Fraud Prevention

    DTIC Science & Technology

    2014-05-30

    Certified Fraud Examiners CAC common access card COR contracting officer’s representative CPAR Contractor Performance Assessment Reporting System DCAA...using analytics to predict patterns occurring in known credit card fraud investigations to prevent future schemes before they happen. The goal of...or iTunes. 4. Distributional Analytics Distributional analytics are used to detect anomalies within data. Through the use of distributional
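
    One classic distributional check used in fraud analytics is Benford's law on leading digits. The sketch below is a generic illustration of the idea, not a method taken from the report: multiplicative data tend to follow Benford's law, while artificially flat data deviate sharply.

```python
import math

# Benford's expected leading-digit frequencies: P(d) = log10(1 + 1/d).
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit_freqs(values):
    # Observed leading-digit distribution of nonzero values.
    digits = [int(str(abs(v))[0]) for v in values if v]
    n = len(digits)
    return {d: digits.count(d) / n for d in range(1, 10)}

def max_benford_deviation(values):
    # Largest absolute gap between observed and expected digit frequencies.
    freqs = first_digit_freqs(values)
    return max(abs(freqs.get(d, 0.0) - BENFORD[d]) for d in range(1, 10))

growth_series = [2 ** k for k in range(200)]  # multiplicative: Benford-like
flat_series = list(range(100, 200))           # leading digit always 1: suspicious
```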

  11. Business Analytics in Practice and in Education: A Competency-Based Perspective

    ERIC Educational Resources Information Center

    Mamonov, Stanislav; Misra, Ram; Jain, Rashmi

    2015-01-01

    Business analytics is a fast-growing area in practice. The rapid growth of business analytics in practice in the recent years is mirrored by a corresponding fast evolution of new educational programs. While more than 130 graduate and undergraduate degree programs in business analytics have been launched in the past 5 years, no commonly accepted…

  12. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in each of two different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. In so doing, it becomes apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths - the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
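
    A percentile bootstrap for the indirect effect in a two-condition within-participant design can be sketched as follows. This is a deliberately simplified version of the path-analytic idea (a = mean within-person change in M, b = OLS slope of the change in Y on the change in M), run on synthetic data; it is not the authors' macros.

```python
import random

def slope(x, y):
    # OLS slope of y on x.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def indirect_effect(m_diff, y_diff):
    # Simplified indirect effect: a * b from within-person difference scores.
    a = sum(m_diff) / len(m_diff)
    b = slope(m_diff, y_diff)
    return a * b

def bootstrap_ci(m_diff, y_diff, reps=2000, seed=7):
    # Percentile bootstrap confidence interval for the indirect effect.
    rng = random.Random(seed)
    n = len(m_diff)
    stats = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(indirect_effect([m_diff[i] for i in idx],
                                     [y_diff[i] for i in idx]))
    stats.sort()
    return stats[int(0.025 * reps)], stats[int(0.975 * reps)]

rng = random.Random(1)
m_diff = [1.0 + rng.gauss(0, 0.3) for _ in range(50)]   # mediator changes (a ~ 1)
y_diff = [0.8 * m + rng.gauss(0, 0.3) for m in m_diff]  # outcome changes (b ~ 0.8)
lo, hi = bootstrap_ci(m_diff, y_diff)
```

With this strong synthetic effect the interval excludes zero, which is the bootstrap criterion for inferring mediation.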

  13. Analytical model and error analysis of arbitrary phasing technique for bunch length measurement

    NASA Astrophysics Data System (ADS)

    Chen, Qushan; Qin, Bin; Chen, Wei; Fan, Kuanjun; Pei, Yuanji

    2018-05-01

    An analytical model of an RF phasing method using arbitrary phase scanning for bunch length measurement is reported. We set up a statistical model instead of a linear chirp approximation to analyze the energy modulation process. It is found that, assuming a short bunch (σφ/2π → 0) and a small relative energy spread (σγ/γr → 0), the energy spread (Y = σγ²) at the exit of the traveling wave linac has a parabolic relationship with the cosine of the injection phase (X = cosφr|z=0), i.e., Y = AX² + BX + C. Analogous to quadrupole strength scanning for emittance measurement, this phase scanning method can be used to obtain the bunch length by measuring the energy spread at different injection phases. The injection phases can be randomly chosen, which is significantly different from the commonly used zero-phasing method. Further, the systematic error of the reported method, such as the influence of the space charge effect, is analyzed. This technique will be especially useful at low energies, when the beam quality is dramatically degraded and hard to measure using the zero-phasing method.
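
    The parabolic relationship above means the scan reduces to a least-squares parabola fit in X = cosφ. The sketch below fits synthetic scan data at arbitrary injection phases with hypothetical coefficients; the mapping from A, B, C back to bunch length is not reproduced here.

```python
import math

def fit_parabola(xs, ys):
    # Least-squares fit of y = A x^2 + B x + C via the 3x3 normal equations,
    # solved by Gaussian elimination with partial pivoting.
    n = len(xs)
    Sx = sum(xs); Sx2 = sum(x**2 for x in xs)
    Sx3 = sum(x**3 for x in xs); Sx4 = sum(x**4 for x in xs)
    Sy = sum(ys)
    Sxy = sum(x * y for x, y in zip(xs, ys))
    Sx2y = sum(x * x * y for x, y in zip(xs, ys))
    M = [[Sx4, Sx3, Sx2, Sx2y], [Sx3, Sx2, Sx, Sxy], [Sx2, Sx, n, Sy]]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    C = M[2][3] / M[2][2]
    B = (M[1][3] - M[1][2] * C) / M[1][1]
    A = (M[0][3] - M[0][1] * B - M[0][2] * C) / M[0][0]
    return A, B, C

# Synthetic scan: energy spread squared Y at arbitrary (non-zero-crossing) phases.
A_true, B_true, C_true = 2.0, -0.5, 0.3               # hypothetical coefficients
phases = [10, 35, 60, 95, 130, 155, 200, 250, 300, 340]  # degrees, arbitrary
X = [math.cos(math.radians(p)) for p in phases]
Y = [A_true * x * x + B_true * x + C_true for x in X]
A_fit, B_fit, C_fit = fit_parabola(X, Y)
```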

  14. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    PubMed

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
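    The adjustment of positivity rates for imperfect test sensitivity and specificity mentioned above can be illustrated with the classical Rogan-Gladen estimator. This is a sketch of that single-test correction only, with made-up numbers, not the integrated multi-specimen, multi-pathogen approach the authors call for.

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Adjust an apparent positivity rate for imperfect sensitivity
    and specificity (Rogan-Gladen estimator), clamped to [0, 1]."""
    est = (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)

# Example: 30% of specimens test positive with an 80%-sensitive,
# 95%-specific assay (hypothetical values).
print(rogan_gladen(0.30, 0.80, 0.95))
```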

  15. Two important limitations relating to the spiking of environmental samples with contaminants of emerging concern: How close to the real analyte concentrations are the reported recovered values?

    PubMed

    Michael, Costas; Bayona, Josep Maria; Lambropoulou, Dimitra; Agüera, Ana; Fatta-Kassinos, Despo

    2017-06-01

    Occurrence and effects of contaminants of emerging concern pose a special challenge to environmental scientists. The investigation of these effects requires reliable, valid, and comparable analytical data. To this effect, two critical aspects are raised herein concerning the limitations of the produced analytical data. The first relates to the inherent difficulty that exists in the analysis of environmental samples, namely the lack of knowledge, in many cases, of the form(s) in which the contaminant is present in the sample. Thus, the produced analytical data can only refer to the amount of the free contaminant, ignoring the amount that may be present in other forms, e.g., in chelated or conjugated form. The other important aspect refers to the way in which the spiking procedure is generally performed to determine the recovery of the analytical method. Spiking environmental samples, in particular solid samples, with standard solution followed by immediate extraction, as is the common practice, can lead to an overestimation of the recovery. This is so because no time is given to the system to establish possible equilibria between the solid matter - inorganic and/or organic - and the contaminant. Therefore, the spiking procedure needs to be reconsidered by including a study of the extractable amount of the contaminant versus the time elapsed between spiking and the extraction of the sample. This study can become an element of the validation package of the method.
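    The recovery computation at issue can be stated concisely. The sketch below uses hypothetical concentrations; the authors' point is that this same formula, applied immediately after spiking, can overstate the recovery achievable for natively bound analyte.

```python
def percent_recovery(measured_spiked, measured_unspiked, amount_added):
    """Apparent recovery of a spiked analyte, in percent.

    If extraction is performed immediately after spiking, the value can
    overestimate the recovery for analyte equilibrated with the matrix.
    """
    return 100.0 * (measured_spiked - measured_unspiked) / amount_added

# Hypothetical sediment sample: 2.0 ug/kg native level, spiked with
# 10.0 ug/kg; 11.5 ug/kg measured after immediate extraction.
print(percent_recovery(11.5, 2.0, 10.0))  # 95.0
```

    Repeating the measurement at increasing spike-to-extraction delays, as the authors propose, would reveal whether the apparent recovery declines as equilibria are established.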

  16. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data

    PubMed Central

    Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria

    2017-01-01

    Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372

  17. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation) and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
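    A minimal individuals (XmR) control chart calculation, a common SPC chart for single measurements over time, can be sketched as follows. The data are hypothetical; the constant 2.66 is the standard individuals-chart factor (3 divided by the bias constant d2 = 1.128).

```python
import numpy as np

def xmr_limits(x):
    """Center line and control limits for an individuals (XmR) chart.

    Points outside mean +/- 2.66 * (average moving range) signal
    special cause variation; points inside reflect common cause
    variation.
    """
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x)).mean()  # average moving range
    center = x.mean()
    return center - 2.66 * mr, center, center + 2.66 * mr

# Hypothetical weekly outcome measurements before an intervention.
lcl, cl, ucl = xmr_limits([12, 14, 13, 15, 12, 14, 13, 16, 14, 13])
print(lcl, cl, ucl)
```

    Post-intervention points plotted against these limits (or a run of points on one side of the center line) would indicate special cause variation attributable to the innovation.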

  18. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  19. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  20. Using business analytics to improve outcomes.

    PubMed

    Rivera, Jose; Delaney, Stephen

    2015-02-01

    Orlando Health has brought its hospital and physician practice revenue cycle systems into better balance using four sets of customized analytics: physician performance analytics, which gauge the total net revenue for every employed physician; patient-pay analytics, which provide financial risk scores for all patients on both the hospital and physician practice sides; revenue management analytics, which bridge the gap between the back-end central business office and front-end physician practice managers and administrators; and enterprise management analytics, which allow the hospitals and physician practices to share important information about common patients.

  1. Occurrence of Organic Wastewater Compounds in the Tinkers Creek Watershed and Two Other Tributaries to the Cuyahoga River, Northeast Ohio

    USGS Publications Warehouse

    Tertuliani, J.S.; Alvarez, D.A.; Furlong, E.T.; Meyer, M.T.; Zaugg, S.D.; Koltun, G.F.

    2008-01-01

    The U.S. Geological Survey - in cooperation with the Ohio Water Development Authority; National Park Service; Cities of Aurora, Bedford, Bedford Heights, Solon, and Twinsburg; and Portage and Summit Counties - and in collaboration with the Ohio Environmental Protection Agency, conducted a study to determine the occurrence and distribution of organic wastewater compounds (OWCs) in the Tinkers Creek watershed in northeastern Ohio. In the context of this report, OWCs refer to a wide range of compounds such as antibiotics, prescription and nonprescription pharmaceuticals, personal-care products, household and industrial compounds (for example, antimicrobials, fragrances, surfactants, fire retardants, and so forth), and a variety of other chemicals. Canisters containing polar organic chemical integrative sampler (POCIS) and semipermeable membrane device (SPMD) media were deployed instream for a 28-day period in May and June 2006 at locations upstream and downstream from seven wastewater-treatment-plant (WWTP) outfalls in the Tinkers Creek watershed, at a site on Tinkers Creek downstream from all WWTP discharges, and at one reference site each in two nearby watersheds (Yellow Creek and Furnace Run) that drain to the Cuyahoga River. Streambed-sediment samples also were collected at each site when the canisters were retrieved. POCIS and SPMDs are referred to as 'passive samplers' because they sample compounds that they are exposed to without the use of mechanical or moving parts. OWCs detected in POCIS and SPMD extracts are referred to in this report as 'detections in water' because both POCIS and SPMDs provided time-weighted measures of concentration in the stream over the exposure period. Streambed sediments also reflect exposure to OWCs in the stream over a long period of time and provide another OWC exposure pathway for aquatic organisms. Four separate laboratory methods were used to analyze for 32 antibiotic, 20 pharmaceutical, 57 to 66 wastewater, and 33 hydrophobic compounds. 
POCIS and streambed-sediment extracts were analyzed by both the pharmaceutical and wastewater methods. POCIS extracts also were analyzed by the antibiotic method, and SPMD extracts were analyzed by the hydrophobic-compound method. Analytes associated with a given laboratory method are referred to in aggregate by the method name (for example, antibiotic-method analytes are referred to as 'antibiotic compounds') even though some analytes associated with the method may not be strictly classified as such. In addition, some compounds were included in the analyte list for more than one laboratory method. For a given sample matrix, individual compounds detected by more than one analytical method are included independently in counts for each method. A total of 12 antibiotic, 20 pharmaceutical, 41 wastewater, and 22 hydrophobic compounds were detected in water at one or more sites. Eight pharmaceutical and 37 wastewater compounds were detected in streambed sediments. The numbers of detections at reference sites tended to be in the low range of detection counts observed in the Tinkers Creek watershed for a given analytical method. Also, the total numbers of compounds detected in water and sediment at the reference sites were less than the total numbers of compounds detected at sites in the Tinkers Creek watershed. With the exception of hydrophobic compounds, it was common at most sites to have more compounds detected in samples collected downstream from WWTP outfalls than in corresponding samples collected upstream from the outfalls. This was particularly true for antibiotic, pharmaceutical, and wastewater compounds in water. In contrast, it was common to have more hydrophobic compounds detected in samples collected upstream from WWTP outfalls than downstream. Caffeine, fluoranthene, N,N-diethyl-meta-toluamide (DEET), phenanthrene, and pyrene were detected in water at all sites in the Tinkers Creek watershed, irrespective of whether the site was upstream or downstream from a WWTP outfall.

  2. Consistency of FMEA used in the validation of analytical procedures.

    PubMed

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detectability (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA always be carried out under the supervision of an experienced FMEA facilitator and that the FMEA team have at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
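    The RPN ranking and percentile classification described above can be sketched as follows. The failure modes and S/O/D scores below are invented for illustration and do not come from the study.

```python
import numpy as np

# Hypothetical failure modes for an HPLC-DAD-MS procedure, scored 1-10
# for severity (S), occurrence (O), and detectability (D); RPN = S*O*D.
scores = {
    "wrong mobile-phase composition": (7, 4, 3),
    "column degradation":             (6, 5, 4),
    "MS detector drift":              (8, 4, 5),
    "sample mix-up":                  (9, 2, 2),
    "integration error":              (5, 6, 3),
}
rpn = {mode: s * o * d for mode, (s, o, d) in scores.items()}

# Classify against the 90th (urgent) and 75th (necessary) percentiles
# of the RPN values, mirroring the study's thresholds.
p75, p90 = np.percentile(list(rpn.values()), [75, 90])
for mode, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    action = "urgent" if value >= p90 else "necessary" if value >= p75 else "none"
    print(f"{mode}: RPN={value:3d} -> {action}")
```

    Because each team chooses its own scales, the same failure mode can land in different percentile bands for different teams, which is exactly the inconsistency the study reports.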

  3. An Analytical Solution for Yaw Maneuver Optimization on the International Space Station and Other Orbiting Space Vehicles

    NASA Technical Reports Server (NTRS)

    Dobrinskaya, Tatiana

    2015-01-01

    This paper suggests a new method for optimizing yaw maneuvers on the International Space Station (ISS). Yaw rotations are the most common large maneuvers on the ISS often used for docking and undocking operations, as well as for other activities. When maneuver optimization is used, large maneuvers, which were performed on thrusters, could be performed either using control moment gyroscopes (CMG), or with significantly reduced thruster firings. Maneuver optimization helps to save expensive propellant and reduce structural loads - an important factor for the ISS service life. In addition, optimized maneuvers reduce contamination of the critical elements of the vehicle structure, such as solar arrays. This paper presents an analytical solution for optimizing yaw attitude maneuvers. Equations describing pitch and roll motion needed to counteract the major torques during a yaw maneuver are obtained. A yaw rate profile is proposed. Also the paper describes the physical basis of the suggested optimization approach. In the obtained optimized case, the torques are significantly reduced. This torque reduction was compared to the existing optimization method which utilizes the computational solution. It was shown that the attitude profiles and the torque reduction have a good match for these two methods of optimization. The simulations using the ISS flight software showed similar propellant consumption for both methods. The analytical solution proposed in this paper has major benefits with respect to computational approach. In contrast to the current computational solution, which only can be calculated on the ground, the analytical solution does not require extensive computational resources, and can be implemented in the onboard software, thus, making the maneuver execution automatic. The automatic maneuver significantly simplifies the operations and, if necessary, allows to perform a maneuver without communication with the ground. 
It also reduces the probability of command errors. The suggested analytical solution provides a new method of maneuver optimization that is less complicated, automatic, and more universal. The maneuver optimization approach presented in this paper can be used not only for the ISS but also for other orbiting space vehicles.

  4. Fluorescence metrology used for analytics of high-quality optical materials

    NASA Astrophysics Data System (ADS)

    Engel, Axel; Haspel, Rainer; Rupertus, Volker

    2004-09-01

    Optical glasses, glass ceramics, and crystals are used for various specialized applications in telecommunication, biomedical, optical, and microlithography technology. In order to qualify and control material quality during research and production, several specialized ultra-trace analysis methods are applied at Schott Glas. One focus of the activities is the determination of impurities in the sub-ppb regime, because such impurity levels are required, for example, for pure materials used in microlithography. Such impurities are determined using analytical methods like LA-ICP-MS or neutron activation analysis. On the other hand, direct and non-destructive optical analysis is attractive because it additionally reflects the requirements of the optical applications. Typical examples are absorption and laser resistivity measurements of optical material using precision spectral photometers, or in-situ transmission measurements by means of lamps or UV lasers. The chemical analytical methods have the drawback that they are time consuming and rather expensive, whereas the sensitivity of the absorption method will not be sufficient to characterize future needs (absorption coefficients well below 10-3 cm-1). For non-destructive qualification against current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of highest sensitivity (more than one order of magnitude higher than state-of-the-art UV absorption spectroscopy) with fast measurement and evaluation cycles (several minutes compared to the several hours necessary for chemical analysis). 
An overview is given of the spectral characteristics of specified standards, which are necessary to establish the analytical system. The elementary fluorescence and absorption of rare earth element impurities, as well as luminescence induced by impurity-related crystal defects, were investigated. Quantitative numbers are given for the relative quantum yield as well as for the excitation cross section for doped glass and calcium fluoride.

  5. Measurement analysis of two radials with a common-origin point and its application.

    PubMed

    Liu, Zhenyao; Yang, Jidong; Zhu, Weiwei; Zhou, Shang; Tan, Xuanping

    2017-08-01

    In spectral analysis, a chemical component is usually identified by its characteristic spectra, especially the peaks. If two components have overlapping spectral peaks, they are generally considered to be indistinguishable in current analytical chemistry textbooks and related literature. However, if the intensities of the overlapping major spectral peaks are additive, and have different rates of change with respect to variations in the concentration of the individual components, a simple method, named the 'common-origin ray', for the simultaneous determination of two components can be established. Several case studies highlighting its applications are presented. Copyright © 2017 John Wiley & Sons, Ltd.
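    While the paper's 'common-origin ray' method is not reproduced here, its premise, that overlapping signals are additive with component-specific rates of change, can be illustrated with a classical two-channel linear solve. The sensitivities and concentrations below are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the authors' exact algorithm): two components
# whose peaks overlap contribute additively to the measured intensity,
# each with its own sensitivity (rate of change with concentration).
# Measurements at two channels give a solvable 2x2 linear system:
#   I1 = k11*c1 + k12*c2
#   I2 = k21*c1 + k22*c2
K = np.array([[3.2, 1.1],   # hypothetical sensitivities, channel 1
              [0.9, 2.7]])  # hypothetical sensitivities, channel 2
c_true = np.array([0.50, 0.25])  # concentrations to recover
I = K @ c_true                   # simulated additive intensities

c_est = np.linalg.solve(K, I)
print(c_est)
```

    The system is solvable precisely because the two components respond to concentration at different rates, the same property the common-origin ray method exploits.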

  6. Stability of ricinine, abrine, and alpha-amanitin in finished tap ...

    EPA Pesticide Factsheets

    Journal Article Ricinine and abrine are potential indicators of drinking water contamination by the biotoxins ricin and abrin, respectively. Simultaneous detection of ricinine and abrine, along with α-amanitin, another potential biotoxin water contaminant, was achieved through the use of automated sample preparation via solid phase extraction and detection using liquid chromatography/tandem mass spectrometry. Performance of the method was characterized over eight analytical batches with quality control samples analyzed over 10 days. For solutions of analytes prepared with appropriate preservatives, the minimum reporting level (MRL) was 0.50 μg/L for ricinine and abrine and 2.0 μg/L for α-amanitin. Among the analytes, the accuracy of the analysis ranged between 93 and 100% at concentrations of 1-2.5 x the MRL, with analytical precision ranging from 4 to 8%. Five drinking waters representing a range of water quality parameters and disinfection practices were fortified with the analytes and analyzed over a 28-day period to determine their storage stability in these waters. Ricinine was observed to be stable for 28 days in all tap waters. The analytical signal decreased within 5 hrs of sample preparation for abrine and α-amanitin in some waters, but afterwards remained stable for 28 days. The magnitude of the decrease correlated with common water quality parameters potentially related to sorption of contaminants onto dissolved and colloidal components within

  7. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  8. Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions

    ERIC Educational Resources Information Center

    Berge, Maria; Ingerman, Åke

    2017-01-01

    Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…

  9. Steroid hormones in environmental matrices: extraction method comparison.

    PubMed

    Andaluri, Gangadhar; Suri, Rominder P S; Graham, Kendon

    2017-11-09

    The U.S. Environmental Protection Agency (EPA) has developed methods for the analysis of steroid hormones in water, soil, sediment, and municipal biosolids by HRGC/HRMS (EPA Method 1698). Following the guidelines provided in US-EPA Method 1698, the extraction methods were validated with reagent water and applied to municipal wastewater, surface water, and municipal biosolids using GC/MS/MS for the analysis of the nine most commonly detected steroid hormones. This is the first reported comparison of the separatory funnel extraction (SFE), continuous liquid-liquid extraction (CLLE), and Soxhlet extraction methods developed by the U.S. EPA. Furthermore, a solid phase extraction (SPE) method was also developed in-house for the extraction of steroid hormones from aquatic environmental samples. This study provides valuable information regarding the robustness of the different extraction methods. Statistical analysis of the data showed that SPE-based methods provided better recovery efficiencies and lower variability for the steroid hormones, followed by SFE. The analytical methods developed in-house for extraction of biosolids showed a wide recovery range; however, the variability was low (≤ 7% RSD). Soxhlet extraction and CLLE are lengthy procedures and have been shown to provide highly variable recovery efficiencies. The results of this study provide guidance for better sample preparation strategies in analytical methods for steroid hormone analysis, and SPE adds to the choice of methods for environmental sample analysis.

  10. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  11. Maximum residue level validation of triclabendazole marker residues in bovine liver, muscle and milk matrices by ultra high pressure liquid chromatography tandem mass spectrometry.

    PubMed

    Whelan, Michelle; O'Mahony, John; Moloney, Mary; Cooper, Kevin M; Furey, Ambrose; Kennedy, D Glenn; Danaher, Martin

    2013-02-01

    Triclabendazole is the only anthelmintic drug that is active against immature, mature, and adult stages of fluke. The objective of this work was to develop an analytical method to quantify and confirm the presence of triclabendazole residues around the MRL. In this work, a new analytical method was developed, which extended the dynamic range to 1-100 and 5-1000 μg kg(-1) for milk and tissue, respectively. This was achieved using a mobile phase containing trifluoroacetic acid (pK(a) of 0.3), which resulted in the formation of the protonated pseudomolecular ions, [M+H](+), of triclabendazole metabolites. Insufficient ionisation with common mobile phase additives due to low pK(a) values (<2) was identified as the cause of poor linearity. The new mobile phase conditions allowed the analysis of triclabendazole residues in liver, muscle and milk encompassing their EU maximum residue levels (MRLs) (250, 225 and 10 μg kg(-1), respectively). Triclabendazole residues were extracted using a modified QuEChERS method and analysed by positive electrospray ionisation mass spectrometry with all analytes eluted by 2.23 min. The method was validated at the MRL according to Commission Decision (CD) 2002/657/EC criteria. The decision limit (CCα) of the method was in the range of 250.8-287.2, 2554.9-290.8 and 10.9-12.1 μg kg(-1) for liver, muscle and milk, respectively. The performance of the method was successfully verified for triclabendazole in muscle by participating in a proficiency study, and the method was also applied to incurred liver, muscle and milk samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  13. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  14. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  15. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  16. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  17. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  18. Analytical and finite element performance evaluation of embedded piezoelectric sensors in polyethylene

    NASA Astrophysics Data System (ADS)

    Safaei, Mohsen; Anton, Steven R.

    2017-04-01

    A common application of piezoelectric transducers is to obtain operational data from working structures and dynamic components. Collected data can then be used to evaluate the dynamic characteristics of the system, perform structural health monitoring, or implement various other assessments. In some applications, piezoelectric transducers are bonded inside the host structure to satisfy system requirements; for example, piezoelectric transducers can be embedded inside the biopolymers of total joint replacements to evaluate the functionality of the artificial joint. The interactions between the piezoelectric device (inhomogeneity) and the surrounding polymer matrix determine the mechanical behavior of the matrix and the electromechanical behavior of the sensor. In this work, an analytical approach is employed to evaluate the electromechanical performance of 2-D plane strain piezoelectric elements of both circular- and rectangular-shaped inhomogeneities. These piezoelectric elements are embedded inside medical grade ultra-high molecular weight (UHMW) polyethylene, a material commonly used for bearing surfaces of joint replacements, such as total knee replacements (TKRs). Using the well-known Eshelby inhomogeneity solution, the stress and electric fields inside the circular (elliptical) inhomogeneity are obtained by decoupling the solution into purely elastic and purely dielectric systems of equations. For rectangular (non-elliptical) inhomogeneities, an approximation method based on the boundary integral function is utilized and the same decoupling method is employed. To validate the analytical results, a finite element analysis is performed for both the circular and rectangular inhomogeneities and the error for each case is calculated. For the elliptical geometry, the error is less than 1% for the stress and electric fields inside and outside the piezoelectric inhomogeneity, whereas for the non-elliptical geometry the error is 11% and 7% for the stress and electric field inside the inhomogeneity, respectively.

  19. Development and application of a database of food ingredient fraud and economically motivated adulteration from 1980 to 2010.

    PubMed

    Moore, Jeffrey C; Spink, John; Lipp, Markus

    2012-04-01

    Food ingredient fraud and economically motivated adulteration are emerging risks, but a comprehensive compilation of information about known problematic ingredients and detection methods does not currently exist. The objectives of this research were to collect such information from publicly available articles in scholarly journals and the general media, organize it into a database, and review and analyze the data to identify trends. The result is a database, to be published in the US Pharmacopeial Convention's Food Chemicals Codex, 8th edition, comprising 1305 records, including 1000 records with analytical methods collected from 677 references. Olive oil, milk, honey, and saffron were the most common targets for adulteration reported in scholarly journals, and potentially harmful issues identified include spices diluted with lead chromate and lead tetraoxide, substitution of Chinese star anise with toxic Japanese star anise, and melamine adulteration of high-protein-content foods. High-performance liquid chromatography and infrared spectroscopy were the most common analytical detection procedures, and chemometric data analysis was used in a large number of reports. Future expansion of this database will include additional publicly available articles published before 1980 and in other languages, as well as data outside the public domain. The authors recommend in-depth analyses of individual incidents. This report describes the development and application of a database of food ingredient fraud issues from publicly available references. The database provides baseline information and data useful to governments, agencies, and individual companies assessing the risks of specific products produced in specific regions as well as products distributed and sold in other regions. In addition, the report describes current analytical technologies for detecting food fraud and identifies trends and developments.
    © 2012 U.S. Pharmacopeia. Journal of Food Science © 2012 Institute of Food Technologists®

  20. Separating method factors and higher order traits of the Big Five: a meta-analytic multitrait-multimethod approach.

    PubMed

    Chang, Luye; Connelly, Brian S; Geeza, Alexis A

    2012-02-01

    Though most personality researchers now recognize that ratings of the Big Five are not orthogonal, the field has been divided about whether these trait intercorrelations are substantive (i.e., driven by higher order factors) or artifactual (i.e., driven by correlated measurement error). We used a meta-analytic multitrait-multirater study to estimate trait correlations after common method variance was controlled. Our results indicated that common method variance substantially inflates trait correlations, and, once controlled, correlations among the Big Five became relatively modest. We then evaluated whether two different theories of higher order factors could account for the pattern of Big Five trait correlations. Our results did not support Rushton and colleagues' (Rushton & Irwing, 2008; Rushton et al., 2009) proposed general factor of personality, but Digman's (1997) α and β metatraits (relabeled by DeYoung, Peterson, and Higgins (2002) as Stability and Plasticity, respectively) produced a viable fit. However, our models showed considerable overlap between Stability and Emotional Stability and between Plasticity and Extraversion, raising the question of whether these metatraits are redundant with their dominant Big Five traits. This pattern of findings was robust when we included only studies whose observers were intimately acquainted with targets. Our results underscore the importance of using a multirater approach to studying personality and the need to separate the causes and outcomes of higher order metatraits from those of the Big Five. We discuss the implications of these findings for the array of research fields in which personality is studied.

  1. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
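
    As a simplified illustration of the detrending ideas the DPXA method builds on, the plain DCCA cross-correlation coefficient can be sketched as follows. This is a minimal sketch, not the authors' DPXA code: the function name, the single window size, and the linear detrending choice are assumptions for illustration.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Minimal DCCA cross-correlation coefficient rho(s) for window size s.

    Illustrative sketch of the detrended cross-correlation family of
    methods; DPXA additionally removes the effect of common external
    series via partial correlation, which is omitted here.
    """
    # Integrated (profile) series.
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    n = len(X) // s
    t = np.arange(s)
    f2x = f2y = f2xy = 0.0
    for i in range(n):
        xi = X[i * s:(i + 1) * s]
        yi = Y[i * s:(i + 1) * s]
        # Detrend each window with a linear least-squares fit.
        rx = xi - np.polyval(np.polyfit(t, xi, 1), t)
        ry = yi - np.polyval(np.polyfit(t, yi, 1), t)
        f2x += np.mean(rx ** 2)
        f2y += np.mean(ry ** 2)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)
```

    A series compared with itself gives a coefficient of 1, and two strongly coupled series give values close to 1, which is the behaviour the DPXA coefficients refine in the presence of common external forces.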

  2. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  3. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  4. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  5. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  6. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  7. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  8. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  9. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...

  10. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  11. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  12. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9% ± 10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6% ± 8.6). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
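
    The "peak fitting" style of quantification can be illustrated with a toy example: fit a single Gaussian to a synthetic resonance and take the fitted peak area as the metabolite measure. This is a minimal sketch with invented peak parameters (a "creatine-like" line near 3.0 ppm), not the pipeline of any specific MRS software package.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    """Single Gaussian line shape."""
    return amp * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

# Synthetic resonance near 3.0 ppm (illustrative values only).
ppm = np.linspace(2.5, 3.5, 400)
noisy = gaussian(ppm, 5.0, 3.0, 0.05) \
    + 0.02 * np.random.default_rng(7).normal(size=ppm.size)

# Peak fitting: estimate amplitude, position and width, then quantify
# the metabolite by the fitted peak area = amp * |sigma| * sqrt(2*pi).
popt, _ = curve_fit(gaussian, ppm, noisy, p0=(1.0, 3.0, 0.1))
area = popt[0] * abs(popt[2]) * np.sqrt(2.0 * np.pi)
```

    A basis-spectrum approach would instead fit the whole spectrum as a linear combination of full metabolite basis spectra, which is why the two methods can respond differently to a subtle change in a single peak.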

  13. Quantitation by Portable Gas Chromatography: Mass Spectrometry of VOCs Associated with Vapor Intrusion

    PubMed Central

    Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.

    2010-01-01

    Development of a robust, reliable technique that permits the rapid quantitation of volatile organic chemicals is an important first step to remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows for the rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969

  14. The structure of biodiversity – insights from molecular phylogeography

    PubMed Central

    Hewitt, Godfrey M

    2004-01-01

    DNA techniques, analytical methods and palaeoclimatic studies are greatly advancing our knowledge of the global distribution of genetic diversity, and how it evolved. Such phylogeographic studies are reviewed from Arctic, Temperate and Tropical regions, seeking commonalities of cause in the resulting genetic patterns. The genetic diversity is differently patterned within and among regions and biomes, and is related to their histories of climatic changes. This has major implications for conservation science. PMID:15679920

  15. Meta-analysis as Statistical and Analytical Method of Journal's Content Scientific Evaluation.

    PubMed

    Masic, Izet; Begic, Edin

    2015-02-01

    A meta-analysis is a statistical and analytical method that combines and synthesizes different independent studies and integrates their results into one common result. We analyzed the journals "Medical Archives", "Materia Socio Medica" and "Acta Informatica Medica", which are indexed in the most eminent biomedical databases. The study was retrospective and descriptive in character and covered the 2014 calendar year. It included six issues of each of the three journals (18 issues in total). In this period a total of 291 articles was published (110 in "Medical Archives", 97 in "Materia Socio Medica", and 84 in "Acta Informatica Medica"). Original articles formed the largest group; smaller numbers were published as professional papers, review articles and case reports. Clinical topics were most common in the first two journals, while articles in "Acta Informatica Medica" belonged to the field of medical informatics, a pre-clinical medical discipline. Articles usually required fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe; the authors were most often from Bosnia and Herzegovina, followed by Iran, Kosovo and Macedonia. The number of articles published each year is increasing, with growing participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broader spectrum of topics and a growing number of original articles. Greater support from the wider scientific community is needed for the further development of all three journals.

  16. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
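
    The Monte Carlo route for more than three domains can be sketched by sampling domain centres uniformly on the unit sphere and histogramming their pairwise angular separations. This is an illustrative sketch under the simplifying assumption of uniform, non-interacting centres (the paper's models include domain correlations); the function names are invented.

```python
import numpy as np

def random_points_on_sphere(n, rng):
    """Uniformly distributed points on the unit sphere (normalised Gaussians)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def pair_angles(points):
    """Great-circle separations (radians) for every pair of domain centres."""
    dots = np.clip(points @ points.T, -1.0, 1.0)
    iu = np.triu_indices(len(points), k=1)
    return np.arccos(dots[iu])

# A crude pair-separation estimate: histogram over many independent draws
# of 10 domain centres (for uniform centres the density is sin(theta)/2).
rng = np.random.default_rng(0)
angles = np.concatenate([pair_angles(random_points_on_sphere(10, rng))
                         for _ in range(200)])
hist, edges = np.histogram(angles, bins=30, range=(0.0, np.pi), density=True)
```

    Comparing such a simulated histogram against an analytical pair distribution function is the basic consistency check behind the Monte Carlo treatment mentioned in the abstract.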

  17. Efficient Numerical Diagonalization of Hermitian 3 × 3 Matrices

    NASA Astrophysics Data System (ADS)

    Kopp, Joachim

    A very common problem in science is the numerical diagonalization of symmetric or Hermitian 3 × 3 matrices. Since standard "black box" packages may be too inefficient if the number of matrices is large, we study several alternatives. We consider optimized implementations of the Jacobi, QL, and Cuppen algorithms and compare them with an analytical method relying on Cardano's formula for the eigenvalues and on vector cross products for the eigenvectors. Jacobi is the most accurate, but also the slowest method, while QL and Cuppen are good general-purpose algorithms. The analytical algorithm outperforms the others by more than a factor of 2, but becomes inaccurate or may even fail completely if the matrix entries differ greatly in magnitude. This can mostly be circumvented by using a hybrid method, which falls back to QL if conditions are such that the analytical calculation might become too inaccurate. For all algorithms, we give an overview of the underlying mathematical ideas, and present detailed benchmark results. C and Fortran implementations of our code are available for download from .
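
    The analytical route via Cardano's formula can be sketched for the eigenvalues alone. The following is a minimal illustration of the standard trigonometric form for a real symmetric 3 × 3 matrix, not Kopp's optimized C/Fortran code; the function name is invented, and none of the paper's accuracy safeguards are included.

```python
import numpy as np

def eigvals_sym3x3(A):
    """Eigenvalues of a real symmetric 3x3 matrix, analytically.

    Shift A to a traceless matrix B, scale so the characteristic cubic
    becomes lambda^3 - 3*lambda - d = 0, and solve it with the
    trigonometric (Cardano) substitution lambda = 2*cos(theta).
    """
    q = np.trace(A) / 3.0
    B = A - q * np.eye(3)          # traceless part
    p = np.sqrt(np.trace(B @ B) / 6.0)
    if p < 1e-30:                  # A is (numerically) q * identity
        return np.array([q, q, q])
    d = np.linalg.det(B / p)       # cubic is lambda^3 - 3*lambda - d = 0
    # Clamp against round-off before arccos; cos(3*theta) = d / 2.
    phi = np.arccos(np.clip(d / 2.0, -1.0, 1.0)) / 3.0
    lam = q + 2.0 * p * np.cos(phi + np.array([0.0, -2.0, 2.0]) * np.pi / 3.0)
    return np.sort(lam)
```

    As the abstract warns, a one-shot formula like this loses accuracy when matrix entries differ greatly in magnitude, which motivates the hybrid fallback to QL.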

  18. Bridging meta-analysis and the comparative method: a test of seed size effect on germination after frugivores' gut passage.

    PubMed

    Verdú, Miguel; Traveset, Anna

    2004-02-01

    Most studies using meta-analysis try to establish relationships between traits across taxa from interspecific databases, and thus the phylogenetic relatedness among these taxa should be taken into account to avoid pseudoreplication derived from common ancestry. This paper illustrates, with a representative example of the relationship between seed size and the effect of frugivores' gut passage on seed germination, that meta-analytic procedures can also be phylogenetically corrected by means of the comparative method. The conclusions obtained from the meta-analytic and phylogenetic approaches are very different. The meta-analysis revealed that the positive effect that gut passage had on seed germination increased with seed size in the case of passage through birds, whereas it decreased in the case of passage through non-flying mammals. However, once the phylogenetic relatedness among plant species was taken into account, the effects of gut passage on seed germination did not depend on seed size and were similar between birds and non-flying mammals. Some methodological considerations are given to improve the bridge between meta-analysis and the comparative method.

  19. Effect of processing on recovery and variability associated with immunochemical analytical methods for multiple allergens in a single matrix: sugar cookies.

    PubMed

    Khuda, Sefat; Slate, Andrew; Pereira, Marion; Al-Taher, Fadwa; Jackson, Lauren; Diaz-Amigo, Carmen; Bigley, Elmer C; Whitaker, Thomas; Williams, Kristina M

    2012-05-02

    Among the major food allergies, peanut, egg, and milk are the most common. The immunochemical detection of food allergens depends on various factors, such as the food matrix and processing method, which can affect allergen conformation and extractability. This study aimed to (1) develop matrix-specific incurred reference materials for allergen testing, (2) determine whether multiple allergens in the same model food can be simultaneously detected, and (3) establish the effect of processing on reference material stability and allergen detection. Defatted peanut flour, whole egg powder, and spray-dried milk were added to cookie dough at seven incurred levels before baking. Allergens were measured using five commercial enzyme-linked immunosorbent assay (ELISA) kits. All kits showed decreased recovery of all allergens after baking. Analytical coefficients of variation for most kits increased with baking time, but decreased with incurred allergen level. Thus, food processing negatively affects the recovery and variability of peanut, egg, and milk detection in a sugar cookie matrix when using immunochemical methods.

  20. Development of a comprehensive analytical platform for the detection and quantitation of food fraud using a biomarker approach. The oregano adulteration case study.

    PubMed

    Wielogorska, Ewa; Chevallier, Olivier; Black, Connor; Galvin-King, Pamela; Delêtre, Marc; Kelleher, Colin T; Haughey, Simon A; Elliott, Christopher T

    2018-01-15

    Due to the increasing number of food fraud incidents, there is an inherent need for the development and implementation of analytical platforms enabling the detection and quantitation of adulteration. In this study, a set of unique biomarkers of commonly found oregano adulterants became the targets in the development of an LC-MS/MS method which underwent a rigorous in-house validation. The method presented very high selectivity and specificity, excellent linearity (R² > 0.988), low decision limits and detection capabilities (<2%), acceptable accuracy (intra-assay 92-113%, inter-assay 69-138%) and precision (CV < 20%). The method was compared with an established FTIR screening assay and revealed a good correlation of qualitative and quantitative results (R² > 0.81). An assessment of 54 suspected adulterated oregano samples revealed that almost 90% of them contained at least one bulking agent, with a median level of adulteration of 50%. Such innovative methodologies need to be established as routine testing procedures to detect and ultimately deter food fraud. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. 40 CFR 161.180 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 161.180... DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.180 Enforcement analytical method. An analytical method suitable for enforcement purposes must be...

  2. A comparative evaluation of the analytical performances of Capillarys 2 Flex Piercing, Tosoh HLC-723 G8, Premier Hb9210, and Roche Cobas c501 Tina-quant Gen 2 analyzers for HbA1c determination.

    PubMed

    Wu, Xiaobin; Chao, Yan; Wan, Zemin; Wang, Yunxiu; Ma, Yan; Ke, Peifeng; Wu, Xinzhong; Xu, Jianhua; Zhuang, Junhua; Huang, Xianzhang

    2016-10-15

    Haemoglobin A1c (HbA1c) is widely used in the management of diabetes. Therefore, the reliability and comparability among different analytical methods for its detection have become very important. A comparative evaluation of the analytical performances (precision, linearity, accuracy, method comparison, and interferences including bilirubin, triglyceride, cholesterol, labile HbA1c (LA1c), vitamin C, aspirin, fetal haemoglobin (HbF), and haemoglobin E (HbE)) was performed on the Capillarys 2 Flex Piercing (Capillarys 2FP) (Sebia, France), Tosoh HLC-723 G8 (Tosoh G8) (Tosoh, Japan), Premier Hb9210 (Trinity Biotech, Ireland) and Roche Cobas c501 (Roche c501) (Roche Diagnostics, Germany). Good precision was shown at both low and high HbA1c levels on all four systems, with all individual CVs below 2% (IFCC units) or 1.5% (NGSP units). Linearity analysis for each analyzer achieved a good correlation coefficient (R² > 0.99) over the entire range tested. The analytical bias of the four systems against the IFCC targets was less than ±6% (NGSP units), indicating good accuracy. Method comparison showed a great correlation and agreement between methods. Very high levels of triglycerides and cholesterol (≥15.28 and ≥8.72 mmol/L, respectively) led to falsely low HbA1c concentrations on the Roche c501. Elevated HbF induced false HbA1c detection on the Capillarys 2FP (>10%), Tosoh G8 (>30%), Premier Hb9210 (>15%), and Roche c501 (>5%). On the Tosoh G8, HbE induced an extra peak on the chromatogram, and significantly lower results were reported. The four HbA1c methods commonly used with commercial analyzers showed good reliability and comparability, although some interferences may falsely alter the result.
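
    The precision claims can be made concrete with a small sketch: the coefficient of variation of replicate results, plus the IFCC-to-NGSP master equation (NGSP % = 0.09148 × IFCC mmol/mol + 2.152), which explains why the CV threshold differs between the two unit systems. The replicate values below are invented for illustration.

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation in %, the usual imprecision measure."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

def ifcc_to_ngsp(ifcc_mmol_mol):
    """IFCC (mmol/mol) to NGSP (%) via the master equation."""
    return 0.09148 * ifcc_mmol_mol + 2.152

# Invented replicate HbA1c results in IFCC units (mmol/mol).
replicates = np.array([48.0, 49.0, 50.0, 51.0, 52.0])
cv_ifcc = cv_percent(replicates)                # about 3.16 %
cv_ngsp = cv_percent(ifcc_to_ngsp(replicates))  # smaller, about 2.15 %
```

    Because the master equation has an additive intercept, the same absolute scatter yields a smaller relative CV in NGSP units, which is why the abstract quotes 2% (IFCC) alongside 1.5% (NGSP) as matching precision limits.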

  3. 40 CFR 158.355 - Enforcement analytical method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Enforcement analytical method. 158.355... DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An analytical method suitable for enforcement purposes must be provided for each active ingredient in the...

  4. The determination of mercury in mushrooms by CV-AAS and ICP-AES techniques.

    PubMed

    Jarzynska, Grazyna; Falandysz, Jerzy

    2011-01-01

    This research addresses analytical problems in the determination of hazardous mercury in environmental materials and the validity of published results on the content of this element in wild-grown mushrooms. The total mercury content was analyzed in several species of wild-grown mushrooms and in certified reference materials of herbal origin, using two analytical methods. The reference method was the commonly known and well-validated cold-vapour atomic absorption spectroscopy (CV-AAS) after direct sample pyrolysis coupled to a gold wool trap. The second was a procedure involving a final mercury measurement by inductively-coupled plasma atomic emission spectroscopy (ICP-AES) at λ 194.163 nm, which some authors have used to report high mercury content in large sets of wild-grown mushrooms. We found that the ICP-AES method at λ 194.163 nm gave inaccurate and imprecise results. These findings imply that, because total mercury determination by ICP-AES at λ 194.163 nm is unsuitable, reports of high concentrations of this metal in large sets of wild-grown mushrooms examined with this method must be treated with caution, since the data are highly biased.

  5. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    PubMed

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
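
    The Poisson lognormal model at the heart of the method can be sketched generatively: each taxon's latent abundance is lognormal, and the observed 16S read counts are Poisson draws around it. This is a minimal generative sketch with assumed parameter names, not the authors' estimation code.

```python
import numpy as np

def sample_poisson_lognormal(n_taxa, mu, sigma, rng):
    """Draw taxon counts: lognormal latent abundances, Poisson observation."""
    latent = rng.lognormal(mean=mu, sigma=sigma, size=n_taxa)
    return rng.poisson(latent)

rng = np.random.default_rng(1)
counts = sample_poisson_lognormal(10_000, mu=1.0, sigma=0.5, rng=rng)
# Marginal mean is exp(mu + sigma^2 / 2), here exp(1.125), about 3.08.
```

    Fitting mu and sigma to observed count histograms, rather than simulating, is the estimation direction the method actually uses; the generative view above just shows why the distribution matches overdispersed 16S survey counts.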

  6. Management of thyroid cytological material, pre-analytical procedures and bio-banking.

    PubMed

    Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo

    2018-06-09

    Thyroid nodules are common and increasingly detected due to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare and the mortality from aggressive thyroid cancer remains constant. FNAC (Fine Needle Aspiration Cytology) is a standard method for diagnosing thyroid malignancy and the discrimination of malignant nodules from goiter. As the examined nodules on thyroid FNAC are often small incidental findings, it is important to maintain a low rate of undetermined diagnoses requiring further clinical work up or surgery. The most important factors determining the accuracy of the cytological diagnosis and suitability for biobanking of thyroid FNACs are the quality of the sample and availability of adequate tissue for auxiliary studies. This article analyses technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on-site evaluation (ROSE), sample collection methods (conventional slides, liquid based methods (LBC), cell blocks) and storage (bio-banking). The spectrum of the special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks and molecular methods) required for improving the precision of the cytological diagnosis of the thyroid nodules is discussed. This article is protected by copyright. All rights reserved.

  7. A Gaussian beam method for ultrasonic non-destructive evaluation modeling

    NASA Astrophysics Data System (ADS)

    Jacquet, O.; Leymarie, N.; Cassereau, D.

    2018-05-01

    The propagation of high-frequency ultrasonic body waves can be efficiently estimated with a semi-analytic Dynamic Ray Tracing approach using paraxial approximation. Although this asymptotic field estimation avoids the computational cost of numerical methods, it may encounter several limitations in reproducing identified highly interferential features. Nevertheless, some can be managed by allowing paraxial quantities to be complex-valued. This gives rise to localized solutions, known as paraxial Gaussian beams. Whereas their propagation and transmission/reflection laws are well-defined, the fact remains that the adopted complexification introduces additional initial conditions. While their choice is usually performed according to strategies specifically tailored to limited applications, a Gabor frame method has been implemented to indiscriminately initialize a reasonable number of paraxial Gaussian beams. Since this method can be applied for a usefully wide range of ultrasonic transducers, the typical case of the time-harmonic piston radiator is investigated. Compared to the commonly used Multi-Gaussian Beam model [1], a better agreement is obtained throughout the radiated field between the results of numerical integration (or analytical on-axis solution) and the resulting Gaussian beam superposition. Sparsity of the proposed solution is also discussed.
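
    The paraxial Gaussian beams such superpositions are built from can be written compactly with a complex beam parameter q(z). A minimal sketch (our own illustration, not the authors' implementation), using the e^{+ikz} convention with the waist at z = 0:

```python
import cmath
import math

def gaussian_beam_field(r, z, w0, wavelength):
    """Scalar paraxial Gaussian beam, e^{+ikz} convention, waist at z = 0.

    Uses the complex beam parameter q(z) = z - i*z_R; the field is
    normalized to unit amplitude at the waist (r = 0, z = 0).
    """
    k = 2.0 * math.pi / wavelength
    zr = math.pi * w0 ** 2 / wavelength   # Rayleigh range
    q0 = -1j * zr
    q = q0 + z
    return (q0 / q) * cmath.exp(1j * k * r ** 2 / (2.0 * q))

w0, lam = 1.0e-3, 1.0e-6                  # 1 mm waist, 1 um wavelength
on_axis_waist = abs(gaussian_beam_field(0.0, 0.0, w0, lam))   # 1.0 by construction
on_axis_far = abs(gaussian_beam_field(0.0, 10.0, w0, lam))    # reduced by diffraction
```

    At the waist the radial profile is exp(-r**2/w0**2); away from the waist the on-axis amplitude falls as z_R/|q|, which is the divergence the Rayleigh range encodes.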

  8. Review of Processing and Analytical Methods for Francisella ...

    EPA Pesticide Factsheets

    The etiological agent of tularemia, Francisella tularensis, is a resilient organism within the environment and can be acquired many ways (infectious aerosols and dust, contaminated food and water, infected carcasses, and arthropod bites). However, isolating F. tularensis from environmental samples can be challenging due to its nutritionally fastidious and slow-growing nature. In order to determine the current state of the science regarding available processing and analytical methods for detection and recovery of F. tularensis from water and soil matrices, a review of the literature was conducted. During the review, analysis via culture, immunoassays, and genomic identification were the most commonly found methods for F. tularensis detection within environmental samples. Other methods included combined culture and genomic analysis for rapid quantification of viable microorganisms and use of one assay to identify multiple pathogens from a single sample. Gaps in the literature that were identified during this review suggest that further work to integrate culture and genomic identification would advance our ability to detect and to assess the viability of Francisella spp. The optimization of DNA extraction, whole genome amplification with inhibition-resistant polymerases, and multiagent microarray detection would also advance biothreat detection.

  9. Rapid Screening Method for New Psychoactive Substances of Forensic Interest: Electrochemistry and Analytical Determination of Phenethylamines Derivatives (NBOMe) via Cyclic and Differential Pulse Voltammetry.

    PubMed

    Andrade, Ana Flávia B; Mamo, Samuel Kasahun; Gonzalez-Rodriguez, Jose

    2017-02-07

    The NBOMe derivatives are phenethylamines derived from the 2C class of hallucinogens. Only a few human pharmacologic studies have been conducted on these drugs, and several cases of intoxication and deaths have been reported. Presently, NBOMe are not a part of the routine drugs-of-abuse screening procedure for many police forces, and there are no rapid immunoassay screening tests that can detect the presence of those compounds. In this article, the voltammetric behavior of 25B NBOMe and 25I NBOMe was investigated and their electroanalytical characteristics determined for the first time. A novel, fast, and sensitive screening method for the identification of the two most common NBOMes (25B-NBOMe and 25I-NBOMe) in real samples is reported. The method uses the electrochemical oxidation of these molecules to produce an analytical signal that can be related to the NBOMe concentration with an average lower limit of quantitation of 0.01 mg/mL for both of them. The method is selective enough to identify the two compounds individually, even given the great similarity in their structure.

  10. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  11. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  12. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  13. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  14. 7 CFR 94.103 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  15. Methodological flaws introduce strong bias into molecular analysis of microbial populations.

    PubMed

    Krakat, N; Anjum, R; Demirel, B; Schröder, P

    2017-02-01

    In this study, we report how different cell disruption methods, PCR primers and in silico analyses can seriously bias results from microbial population studies, with consequences for the credibility and reproducibility of the findings. Our results emphasize the pitfalls of commonly used experimental methods that can seriously weaken the interpretation of results. Four different cell lysis methods, three commonly used primer pairs and various computer-based analyses were applied to investigate the microbial diversity of a fermentation sample composed of chicken dung. The fault-prone, but still frequently used, amplified rRNA gene restriction analysis was chosen to identify common weaknesses. In contrast to other studies, we focused on the complete analytical process, from cell disruption to in silico analysis, and identified potential error rates. This identified a wide disagreement of results between applied experimental approaches leading to very different community structures depending on the chosen approach. The interpretation of microbial diversity data remains a challenge. In order to accurately investigate the taxonomic diversity and structure of prokaryotic communities, we suggest a multi-level approach combining DNA-based and DNA-independent techniques. The identified weaknesses of commonly used methods to study microbial diversity can be overcome by a multi-level approach, which produces more reliable data about the fate and behaviour of microbial communities of engineered habitats such as biogas plants, so that the best performance can be ensured. © 2016 The Society for Applied Microbiology.

  16. Differentiation between Staphylococcus aureus and Staphylococcus epidermidis strains using Raman spectroscopy.

    PubMed

    Rebrošová, Katarína; Šiler, Martin; Samek, Ota; Růžička, Filip; Bernatová, Silvie; Ježek, Jan; Zemánek, Pavel; Holá, Veronika

    2017-08-01

    Raman spectroscopy is an analytical method with a broad range of applications across multiple scientific fields. We report on a possibility to differentiate between two important Gram-positive species commonly found in clinical material - Staphylococcus aureus and Staphylococcus epidermidis - using this rapid noninvasive technique. For this, we tested 87 strains, 41 of S. aureus and 46 of S. epidermidis, directly from colonies grown on a Mueller-Hinton agar plate using Raman spectroscopy. The method paves the way for separation of these two species even with high numbers of samples and can therefore potentially be used in clinical diagnostics.

  17. PAT-tools for process control in pharmaceutical film coating applications.

    PubMed

    Knop, Klaus; Kleinebudde, Peter

    2013-12-05

    Recent development of analytical techniques to monitor the coating process of pharmaceutical solid dosage forms such as pellets and tablets are described. The progress from off- or at-line measurements to on- or in-line applications is shown for the spectroscopic methods near infrared (NIR) and Raman spectroscopy as well as for terahertz pulsed imaging (TPI) and image analysis. The common goal of all these methods is to control or at least to monitor the coating process and/or to estimate the coating end point through timely measurements. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Offline solid-phase extraction for preconcentration of pharmaceuticals and personal care products in environmental water and their simultaneous determination using the reversed phase high-performance liquid chromatography method.

    PubMed

    G Archana; Dhodapkar, Rita; Kumar, Anupama

    2016-09-01

    The present study reports a precise and simple offline solid-phase extraction (SPE) coupled with reversed-phase high-performance liquid chromatography (RP-HPLC) method for the simultaneous determination of five representative and commonly present pharmaceuticals and personal care products (PPCPs), a new class of emerging pollutants in the aquatic environment. The target analytes, ciprofloxacin, acetaminophen, caffeine, benzophenone and irgasan, were separated by a simple HPLC method. The column used was a reversed-phase C18 column, and the mobile phase was 1 % acetic acid and methanol (20:80 v/v) under isocratic conditions, at a flow rate of 1 mL min(-1). The analytes were separated and detected within 15 min using the photodiode array detector (PDA). The calibration curves were linear with correlation coefficients of 0.98-0.99. The limit of detection (LOD), limit of quantification (LOQ), precision, accuracy and ruggedness demonstrated the reproducibility, specificity and sensitivity of the developed method. Prior to the analysis, the SPE was performed using a C18 cartridge to preconcentrate the targeted analytes from the environmental water samples. The developed method was applied to evaluate and fingerprint PPCPs in sewage collected from a residential engineering college campus, polluted water bodies such as the Nag river and Pili river, and the influent and effluent samples from a sewage treatment plant (STP) situated at Nagpur city, in the peak summer season. This method is useful for estimation of pollutants present in microquantities in surface water bodies and treated sewage, compared with the nanolevel pollutants detected by mass spectrometry (MS) detectors.

  19. Characterization of titanium dioxide nanoparticles in food products: analytical methods to define nanoparticles.

    PubMed

    Peters, Ruud J B; van Bemmel, Greet; Herrera-Rivera, Zahira; Helsper, Hans P F G; Marvin, Hans J P; Weigel, Stefan; Tromp, Peter C; Oomen, Agnes G; Rietveld, Anton G; Bouwmeester, Hans

    2014-07-09

    Titanium dioxide (TiO2) is a common food additive used to enhance the white color, brightness, and sometimes flavor of a variety of food products. In this study 7 food grade TiO2 materials (E171), 24 food products, and 3 personal care products were investigated for their TiO2 content and the number-based size distribution of TiO2 particles present in these products. Three principally different methods have been used to determine the number-based size distribution of TiO2 particles: electron microscopy, asymmetric flow field-flow fractionation combined with inductively coupled plasma mass spectrometry, and single-particle inductively coupled plasma mass spectrometry. The results show that all E171 materials have similar size distributions with primary particle sizes in the range of 60-300 nm. Depending on the analytical method used, 10-15% of the particles in these materials had sizes below 100 nm. In 24 of the 27 foods and personal care products detectable amounts of titanium were found ranging from 0.02 to 9.0 mg TiO2/g product. The number-based size distributions for TiO2 particles in the food and personal care products showed that 5-10% of the particles in these products had sizes below 100 nm, comparable to that found in the E171 materials. Comparable size distributions were found using the three principally different analytical methods. Although the applied methods are considered state of the art, they showed practical size limits for TiO2 particles in the range of 20-50 nm, which may introduce a significant bias in the size distribution because particles <20 nm are excluded. This shows the inability of current state-of-the-art methods to support the European Union recommendation for the definition of nanomaterials.
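
    The number-based metric used throughout this record (the percentage of particles below 100 nm) is straightforward to compute from a measured size list; a minimal sketch with hypothetical data:

```python
def nano_fraction(diameters_nm, cutoff_nm=100.0):
    """Number-based fraction of particles with diameter below the cutoff."""
    below = sum(1 for d in diameters_nm if d < cutoff_nm)
    return below / len(diameters_nm)

# Hypothetical number-based size measurements (nm) for an E171-like sample.
sizes = [65, 80, 95, 110, 140, 180, 220, 260, 300, 120]
frac = nano_fraction(sizes)  # 0.3, i.e. 30 % of particles below 100 nm
```

    Note that a mass-based fraction would weight each particle by d**3 and give a much smaller number, which is why the number-based definition matters for the EU recommendation.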

  20. Uncertainties in the deprojection of the observed bar properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Yanfei; Shen, Juntai; Li, Zhao-Yu, E-mail: jshen@shao.ac.cn

    2014-08-10

    In observations, it is important to deproject the two fundamental quantities characterizing a bar, i.e., its length (a) and ellipticity (e), to face-on values before any careful analyses. However, systematic estimation of the uncertainties of the commonly used deprojection methods is still lacking. Simulated galaxies are well suited in this study. We project two simulated barred galaxies onto a two-dimensional (2D) plane with different bar orientations and disk inclination angles (i). Bar properties are measured and deprojected with the popular deprojection methods in the literature. Generally speaking, deprojection uncertainties increase with increasing i. All of the deprojection methods behave badly when i is larger than 60°, due to the vertical thickness of the bar. Thus, future statistical studies of barred galaxies should exclude galaxies more inclined than 60°. At moderate inclination angles (i ≤ 60°), 2D deprojection methods (analytical and image stretching), and Fourier-based methods (Fourier decomposition and bar-interbar contrast) perform reasonably well with uncertainties ∼10% in both the bar length and ellipticity, whereas the uncertainties of the one-dimensional (1D) analytical deprojection can be as high as 100% in certain extreme cases. We find that different bar measurement methods show systematic differences in the deprojection uncertainties. We further discuss the deprojection uncertainty factors with the emphasis on the most important one, i.e., the three-dimensional structure of the bar itself. We construct two triaxial toy bar models that can qualitatively reproduce the results of the 1D and 2D analytical deprojections; they confirm that the vertical thickness of the bar is the main source of uncertainties.
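
    The 1D analytical deprojection discussed here treats the bar as a line segment: the component along the disk line of nodes is unchanged, while the perpendicular component is stretched by 1/cos(i). A sketch of this common textbook formula (whether it matches the exact variant used in the paper is an assumption on our part):

```python
import math

def deproject_bar_length_1d(a_obs, phi_deg, i_deg):
    """1D analytical deprojection of a bar, treated as a line segment.

    a_obs   : observed (projected) bar semi-major axis
    phi_deg : angle between the bar and the disk line of nodes, in the sky plane
    i_deg   : disk inclination (0 = face-on)
    """
    phi = math.radians(phi_deg)
    i = math.radians(i_deg)
    # Component along the line of nodes is unchanged; the perpendicular
    # component is stretched by 1/cos(i) when deprojecting to face-on.
    return a_obs * math.sqrt(math.cos(phi) ** 2 + (math.sin(phi) / math.cos(i)) ** 2)
```

    The failure mode the abstract describes is visible here: as i approaches 90° the 1/cos(i) factor diverges, and a bar with finite vertical thickness no longer behaves like a flat line segment.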

  1. Algorithms and software for U-Pb geochronology by LA-ICPMS

    NASA Astrophysics Data System (ADS)

    McLean, Noah M.; Bowring, James F.; Gehrels, George

    2016-07-01

    The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with spatial resolution at the scale of tens of micrometers. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.

  2. ASSESSMENT OF ANALYTICAL METHODS USED TO MEASURE CHANGES IN BODY COMPOSITION IN THE ELDERLY AND RECOMMENDATIONS FOR THEIR USE IN PHASE II CLINICAL TRIALS

    PubMed Central

    Lustgarten, M.S.; Fielding, R.A.

    2012-01-01

    It is estimated that in the next 20 years, the number of people older than 65 years of age will rise from 40 to 70 million and will account for 19% of the total population. Age-related decreases in muscle mass and function, known as sarcopenia, have been shown to be related to functional limitation, frailty and an increased risk of morbidity and mortality. Therefore, with an increasing elderly population, interventions that can improve muscle mass content and/or function are essential. However, analytical techniques used for measurement of muscle mass in young subjects may not be valid for use in the elderly. Therefore, the purpose of this review is to examine the applied specificity and accuracy of methods that are commonly used for measurement of muscle mass in aged subjects, and to propose specific recommendations for the use of body composition measures in phase II clinical trials of function-promoting anabolic therapies. PMID:21528163

  3. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples. 
The remaining quarter of Tier 1 compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. 
The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1 by adding as many additional Tier 1 compounds as are analytically compatible. About 35 percent of the Tier 1 compounds for sediment are high priority on the basis of measured occurrence. A total of 74 compounds, or 42 percent, are high priority on the basis of predicted likelihood of occurrence according to physical-chemical properties, and either have potential toxicity to aquatic life, high pesticide usage, or both. The remaining 22 percent of Tier 1 pesticide compounds were either degradates of Tier 1 parent compounds or included for other reasons. As with water, the Tier 1 pesticide compounds for sediment are distributed across the major pesticide-use groups; insecticides and their degradates are the largest fraction, making up 45 percent of Tier 1. In contrast to water, organochlorines, at 17 percent, are the largest chemical class for Tier 1 in sediment, which is to be expected because there is continued widespread detection in sediments of persistent organochlorine pesticides and their degradates at concentrations high enough for potential effects on aquatic life. Compared to water, there are fewer available benchmarks with which to compare contaminant concentrations in sediment, but a total of 19 Tier 1 compounds have at least one sediment benchmark or screening value for aquatic organisms. Of the 175 compounds in Tier 1, 77 percent have high aquatic-life toxicity, as defined for this process. This evaluation of pesticides and degradates resulted in two lists of compounds that are priorities for USGS analytical methods development, one for water and one for sediment. 
These lists will be used as the basis for redesigning and enhancing USGS analytical capabilities for pesticides in order to capture as many high-priority pesticide compounds as possible using an economically feasible approach.
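
    The tiering logic described above can be summarized as a simple decision rule; the sketch below is purely illustrative (criteria names and thresholds are hypothetical, not the USGS screening procedure itself):

```python
def classify_tier(compound):
    """Assign a pesticide compound to Tier 1 (high), 2 (moderate) or 3 (low priority).

    The compound is a dict of boolean screening flags; all flag names here
    are hypothetical stand-ins for the criteria described in the text.
    """
    near_benchmark = compound.get("predicted_conc_near_benchmark", False)
    frequent = compound.get("frequently_detected", False)
    parent_tier1 = compound.get("degradate_of_tier1_parent", False)
    if near_benchmark or frequent or parent_tier1:
        return 1
    if compound.get("moderate_use_or_occurrence", False):
        return 2
    return 3

# A compound predicted to approach a human-health or aquatic-life benchmark.
tier = classify_tier({"predicted_conc_near_benchmark": True})  # 1
```

    Real screening combines quantitative occurrence data, predicted concentrations and benchmarks rather than boolean flags; the point here is only the three-way partition.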

  4. Evaluation and performance of desorption electrospray ionization using a triple quadrupole mass spectrometer for quantitation of pharmaceuticals in plasma.

    PubMed

    Kennedy, Joseph H; Wiseman, Justin M

    2010-02-01

    The present work describes the methodology and investigates the performance of desorption electrospray ionization (DESI) combined with a triple quadrupole mass spectrometer for the quantitation of small drug molecules in human plasma. Amoxepine, atenolol, carbamazepine, clozapine, prazosin, propranolol and verapamil were selected as target analytes while terfenadine was selected as the internal standard common to each of the analytes. Protein precipitation of human plasma using acetonitrile was utilized for all samples. Limits of detection were determined for all analytes in plasma and shown to be in the range 0.2-40 ng/mL. Quantitative analysis of amoxepine, prazosin and verapamil was performed over the range 20-7400 ng/mL and shown to be linear in all cases with R(2) >0.99. In most cases, the precision (relative standard deviation) and accuracy (relative error) of each method were less than or equal to 20%. The performance of the combined techniques made it possible to analyze each sample in 15 s, illustrating DESI tandem mass spectrometry (MS/MS) as a powerful tool for the quantitation of analytes in deproteinized human plasma. Copyright 2010 John Wiley & Sons, Ltd.

  5. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.4 Section 94.4 Agriculture... POULTRY AND EGG PRODUCTS Mandatory Analyses of Egg Products § 94.4 Analytical methods. The majority of analytical methods used by the USDA laboratories to perform mandatory analyses for egg products are listed as...

  6. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  7. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  8. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  9. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  10. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances will be discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently. Highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
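
    Of the validation parameters listed, accuracy (bias) and precision (coefficient of variation) are the most mechanical to compute from replicate measurements; a minimal sketch with hypothetical data:

```python
from statistics import mean, stdev

def bias_and_cv(measured, nominal):
    """Accuracy (bias, %) and precision (CV, %) from replicate measurements."""
    m = mean(measured)
    bias_pct = 100.0 * (m - nominal) / nominal
    cv_pct = 100.0 * stdev(measured) / m
    return bias_pct, cv_pct

# Hypothetical replicates of a quality-control sample spiked at 10 units.
bias_pct, cv_pct = bias_and_cv([9.8, 10.1, 10.3, 9.9, 10.2], nominal=10.0)
```

    For an endogenous analyte the "nominal" value is the complication: the matrix already contains the substance, so bias must be assessed against a spiked increment or a surrogate matrix rather than a true blank, which is exactly the departure from conventional validation the paper discusses.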

  11. Analysis of intracellular and extracellular microcystin variants in sediments and pore waters by accelerated solvent extraction and high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Zastepa, Arthur; Pick, Frances R; Blais, Jules M; Saleem, Ammar

    2015-05-04

The fate and persistence of microcystin cyanotoxins in aquatic ecosystems remains poorly understood in part due to the lack of analytical methods for microcystins in sediments. Existing methods have been limited to the extraction of a few extracellular microcystins of similar chemistry. We developed a single analytical method, consisting of accelerated solvent extraction, hydrophilic-lipophilic balance solid phase extraction, and reversed phase high performance liquid chromatography-tandem mass spectrometry, suitable for the extraction and quantitation of both intracellular and extracellular cyanotoxins in sediments as well as pore waters. Recoveries of nine microcystins, representing the chemical diversity of microcystins, and nodularin (a marine analogue) ranged between 75 and 98% with one, microcystin-RR (MC-RR), at 50%. Chromatographic separation of these analytes was achieved within 7.5 min and the method detection limits were between 1.1 and 2.5 ng g(-1) dry weight (dw). The robustness of the method was demonstrated on sediment cores collected from seven Canadian lakes of diverse geography and trophic states. Individual microcystin variants reached a maximum concentration of 829 ng g(-1) dw on sediment particles and 132 ng mL(-1) in pore waters and could be detected in sediments as deep as 41 cm (>100 years in age). MC-LR, -RR, and -LA were more often detected while MC-YR, -LY, -LF, and -LW were less common. The analytical method enabled us to estimate sediment-pore water distribution coefficients (K(d)): MC-RR had the highest affinity for sediment particles (log K(d)=1.3) while MC-LA had the lowest affinity (log K(d)=-0.4), partitioning mainly into pore waters. Our findings confirm that sediments serve as a reservoir for microcystins but suggest that some variants may diffuse into overlying water thereby constituting a new route of exposure following the dissipation of toxic blooms.
The method is well suited to determine the fate and persistence of different microcystins in aquatic systems. Copyright © 2015 Elsevier B.V. All rights reserved.
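
The sediment-pore water distribution coefficients reported above follow directly from paired concentrations: K(d) is the ratio of the particle-bound concentration (ng g(-1)) to the pore-water concentration (ng mL(-1)). A minimal sketch, with hypothetical concentration pairs chosen only to reproduce the reported log K(d) values of 1.3 and -0.4:

```python
import math

def log_kd(c_sediment, c_porewater):
    """log10 of Kd = (ng/g on particles) / (ng/mL in pore water); Kd in mL/g."""
    return math.log10(c_sediment / c_porewater)

# Hypothetical concentration pairs chosen to match the reported values
print(round(log_kd(20.0, 1.0), 1))  # MC-RR-like: strongly particle-bound
print(round(log_kd(0.4, 1.0), 1))   # MC-LA-like: mainly in pore water
```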

  12. An analytical review of vasculobiliary injury in laparoscopic and open cholecystectomy

    PubMed Central

    Strasberg, Steven M; Helton, W Scott

    2011-01-01

Objectives Biliary injuries are frequently accompanied by vascular injuries, which may worsen the bile duct injury and cause liver ischemia. We performed an analytical review with the aim of defining vasculobiliary injury and setting out the important issues in this area. Methods A literature search of relevant terms was performed using OvidSP. Bibliographies of papers were also searched to obtain older literature. Results Vasculobiliary injury was defined as: an injury to both a bile duct and a hepatic artery and/or portal vein; the bile duct injury may be caused by operative trauma, be ischaemic in origin or both, and may or may not be accompanied by various degrees of hepatic ischaemia. Right hepatic artery (RHA) vasculobiliary injury (VBI) is the most common variant. Injury to the RHA likely extends the biliary injury to a higher level than the gross observed mechanical injury. VBI results in slow hepatic infarction in about 10% of patients. Repair of the artery is rarely possible and the overall benefit is unclear. Injuries involving the portal vein or common or proper hepatic arteries are much less common, but have more serious effects including rapid infarction of the liver. Conclusions Routine arteriography is recommended in patients with a biliary injury if early repair is contemplated. Consideration should be given to delaying repair of a biliary injury in patients with occlusion of the RHA. Patients with injuries to the portal vein or proper or common hepatic arteries should be emergently referred to tertiary care centers. PMID:21159098

  13. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  14. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

Background Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. Results This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction.
This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples. PMID:23050842

  15. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    PubMed

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction.
This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples.

  16. Cross-reactivity profiles of legumes and tree nuts using the xMAP® multiplex food allergen detection assay.

    PubMed

    Cho, Chung Y; Oles, Carolyn; Nowatzke, William; Oliver, Kerry; Garber, Eric A E

    2017-10-01

The homology between proteins in legumes and tree nuts makes it common for individuals with food allergies to be allergic to multiple legumes and tree nuts. This propensity for allergenic and antigenic cross-reactivity means that commonly employed commercial immunodiagnostic assays (e.g., dipsticks) for the detection of food allergens may not always accurately detect, identify, and quantitate legumes and tree nuts unless additional orthogonal analytical methods or secondary measures of analysis are employed. The xMAP® Multiplex Food Allergen Detection Assay (FADA) was used to determine the cross-reactivity patterns and the utility of multi-antibody antigenic profiling to distinguish between legumes and tree nuts. Pure legumes and tree nuts extracted using buffered detergent displayed a high level of cross-reactivity that decreased upon dilution or when using a buffer (UD buffer) designed to increase the stringency of binding conditions and reduce the occurrence of false positives due to plant-derived lectins. Testing for unexpected food allergens, or screening for multiple food allergens, often involves not knowing the identity of the allergen present, its concentration, or the degree of modification during processing. As such, the analytical response measured may represent multiple antigens of varying antigenicity (cross-reactivity). This problem of multiple potential analytes usually goes unresolved: the focus narrows to the primary analyte (the antigen the antibody was raised against), and quantitative interpretation of the content of the analytical sample becomes problematic. The alternative solution offered here is the use of an antigenic profile as generated by the xMAP FADA using multiple antibodies (bead sets). By comparing the antigenic profile to standards, the allergen may be identified along with an estimate of the concentration present.
Cluster analysis of the xMAP FADA data was also performed and agreed with the known phylogeny of the legumes and tree nuts being analyzed. Graphical abstract The use of cluster analysis to compare the multi-antigen profiles of food allergens.

  17. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling.

    PubMed

    Plagnol, Vincent; Woodhouse, Samuel; Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg; Forshew, Tim

    2018-01-01

Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present in the VAF range 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications.
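
Sensitivity figures of the kind reported for this assay come from counts of detected versus missed variants, TP / (TP + FN). A minimal sketch; the counts 956 and 5 are hypothetical, chosen only to show how a figure near 99.48% can arise, and are not the study's actual tallies:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of known variants detected: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

# Hypothetical counts chosen only to illustrate the calculation
print(round(100 * sensitivity(956, 5), 2))
```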

  18. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling

    PubMed Central

    Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg

    2018-01-01

Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present in the VAF range 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications. PMID:29543828

  19. The dynamics of a stabilised Wien bridge oscillator

    NASA Astrophysics Data System (ADS)

    Lerner, L.

    2016-11-01

    We present for the first time analytic solutions for the nonlinear dynamics of a Wien bridge oscillator stabilised by three common methods: an incandescent lamp, signal diodes, and the field effect transistor. The results can be used to optimise oscillator design, and agree well with measurements. The effect of operational amplifier marginal nonlinearity on oscillator performance at high frequencies is clarified. The oscillator circuits and their analysis can be used to demonstrate nonlinear dynamics in the undergraduate laboratory.

  20. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.

Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 μmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GCxGC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.

  1. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE PAGES

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.; ...

    2016-09-27

Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 μmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GCxGC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.

  2. Versatile fabrication of paper-based microfluidic devices with high chemical resistance using scholar glue and magnetic masks.

    PubMed

    Cardoso, Thiago M G; de Souza, Fabrício R; Garcia, Paulo T; Rabelo, Denilson; Henry, Charles S; Coltro, Wendell K T

    2017-06-29

Simple methods have been developed for fabricating microfluidic paper-based analytical devices (μPADs), but few of these devices can be used with organic solvents and/or aqueous solutions containing surfactants. This study describes a simple fabrication strategy for μPADs that uses readily available scholar glue to create hydrophobic flow barriers that are resistant to surfactants and organic solvents. Microfluidic structures were defined by magnetic masks, made with either neodymium magnets or magnetic sheets to define the pattern, and were created by spraying an aqueous solution of glue on the paper surface. The glue-coated paper was then exposed to UV/Vis light for cross-linking to maximize chemical resistance. Examples of microzone arrays and microfluidic devices are demonstrated. μPADs fabricated with scholar glue retained their barriers when used with surfactants, organic solvents, and strong/weak acids and bases, unlike common wax-printed barriers. Paper microzones and microfluidic devices were successfully used for colorimetric assays of clinically relevant analytes commonly detected in urinalysis, demonstrating the low background of the barrier material and its general applicability to sensing. The proposed fabrication method is attractive for both its ability to be used with diverse chemistries and the low cost and simplicity of the materials and process. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Comparison of four extraction/methylation analytical methods to measure fatty acid composition by gas chromatography in meat.

    PubMed

    Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S

    2008-05-09

Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (in situ or one-step method, saponification method, classic method and a combination of classic extraction and saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, which showed higher variation than with the former methods. The combination of extraction and methylation steps had high recovery values, but the precision, repeatability and reproducibility were not acceptable. Therefore, the saponification method would be more convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However, the classic method would be the method of choice for the determination of the different lipid classes.
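
Recovery and precision figures of the kind compared in this study follow directly from replicate measurements of spiked samples: recovery is the mean measured amount over the spiked amount, and precision is usually expressed as a coefficient of variation. A minimal sketch with hypothetical replicate data (the spike level and readings are illustrative, not taken from the paper):

```python
import statistics

def recovery_and_cv(measured, spiked):
    """Mean recovery (%) and coefficient of variation (%) for spiked replicates."""
    mean = statistics.mean(measured)
    return 100.0 * mean / spiked, 100.0 * statistics.stdev(measured) / mean

# Hypothetical replicate measurements (mg) of a 10 mg fatty-acid spike
rec, cv = recovery_and_cv([9.1, 9.4, 9.0, 9.3, 9.2], 10.0)
print(round(rec, 1), round(cv, 1))
```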

  4. Analysis of organic acids and acylglycines for the diagnosis of related inborn errors of metabolism by GC- and HPLC-MS.

    PubMed

    la Marca, Giancarlo; Rizzo, Cristiano

    2011-01-01

    The analysis of organic acids in urine is commonly included in routine procedures for detecting many inborn errors of metabolism. Many analytical methods allow for both qualitative and quantitative determination of organic acids, mainly in urine but also in plasma, serum, whole blood, amniotic fluid, and cerebrospinal fluid. Liquid-liquid extraction and solid-phase extraction using anion exchange or silica columns are commonly employed approaches for sample treatment. Before analysis can be carried out using gas chromatography-mass spectrometry, organic acids must be converted into more thermally stable, volatile, and chemically inert forms, mainly trimethylsilyl ethers, esters, or methyl esters.

  5. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    PubMed

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
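
The 3.3σ/10σ convention behind many reported LOD and LOQ values can be made concrete. The sketch below shows one common calculation (LOD = 3.3·SD/slope, LOQ = 10·SD/slope, per ICH-style guidance); the blank readings and calibration slope are hypothetical, not taken from the cited study:

```python
import statistics

def lod_loq(blank_signals, slope):
    """LOD = 3.3*SD/slope and LOQ = 10*SD/slope from blank replicates."""
    sd = statistics.stdev(blank_signals)
    return 3.3 * sd / slope, 10.0 * sd / slope

blanks = [0.011, 0.013, 0.012, 0.010, 0.014]  # hypothetical blank readings (a.u.)
slope = 0.52                                  # hypothetical calibration slope
lod, loq = lod_loq(blanks, slope)
print(round(lod, 4), round(loq, 4))
```

Because several such conventions exist (signal-to-noise, blank-based, calibration-based), the same data can yield noticeably different LOD/LOQ values, which is the discrepancy this record highlights.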

  6. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that there now exists a high degree of predictability. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor, which displays some fluorescence modulation when the analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays in an attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.

  7. Simultaneous quantification of five major active components in capsules of the traditional Chinese medicine ‘Shu-Jin-Zhi-Tong’ by high performance liquid chromatography

    PubMed Central

    Yang, Xing-Xin; Zhang, Xiao-Xia; Chang, Rui-Miao; Wang, Yan-Wei; Li, Xiao-Ni

    2011-01-01

    A simple and reliable high performance liquid chromatography (HPLC) method has been developed for the simultaneous quantification of five major bioactive components in ‘Shu-Jin-Zhi-Tong’ capsules (SJZTC), for the purposes of quality control of this commonly prescribed traditional Chinese medicine. Under the optimum conditions, excellent separation was achieved, and the assay was fully validated in terms of linearity, precision, repeatability, stability and accuracy. The validated method was applied successfully to the determination of the five compounds in SJZTC samples from different production batches. The HPLC method can be used as a valid analytical method to evaluate the intrinsic quality of SJZTC. PMID:29403711

  8. Comparative Measurement of Microcystins in Diverse Surface ...

    EPA Pesticide Factsheets

The measurement of microcystins, cyanotoxins associated with cyanobacterial blooms which are increasingly prevalent in inland waters, is complicated by the diversity of congeners which have been observed in the environment. At present, more than 150 microcystin congeners have been identified, and this poses a significant challenge to analytical methods intended to assess human health risks in surface and drinking water systems. The most widely employed analytical method at present is the ADDA-ELISA technique, which is potentially sensitive to all microcystins, but it is primarily intended as a semi-quantitative method, and questions have been raised regarding the potential for cross-reactivity and false positives. LC-MS/MS methods targeting specific congeners, such as US EPA Method 544, are intended for use as a secondary confirmation following a positive ELISA response, but these techniques can target only those congeners for which commercial standards are available. Accordingly, they are not suitable for ascertaining the safety of a given water sample, given the potential for omitting unknown microcystin congeners which might be present. An alternative approach involves oxidative transformation of microcystins to a common product, 2-methyl-3-methoxy-4-phenylbutyric acid, or MMPB. Measuring MMPB by LC-MS/MS can potentially provide a metric for the sum of all microcystin congeners present in a sample, subject to the efficiency and overall yield of conversion. The

  9. A review of available analytical technologies for qualitative and quantitative determination of nitramines.

    PubMed

    Lindahl, Sofia; Gundersen, Cathrine Brecke; Lundanes, Elsa

    2014-08-01

This review aims to summarize the available analytical methods in the open literature for the determination of some aliphatic and cyclic nitramines. Nitramines covered in this review are the ones that can be formed from the use of amines in post-combustion CO2 capture (PCC) plants and end up in the environment. Since the literature is quite scarce regarding the determination of nitramines in aqueous and soil samples, methods for determination of nitramines in other matrices have also been included. Because the nitramines are found in complex matrices and/or at very low concentrations, an extraction step is often necessary before their determination. Liquid-liquid extraction (LLE) using dichloromethane and solid phase extraction (SPE) with an activated carbon based material have been the two most common extraction methods. Gas chromatography (GC) or reversed phase liquid chromatography (RPLC) has been used, often combined with mass spectrometry (MS), in the final determination step. Presently there is no comprehensive method available that can be used for determination of all nitramines included in this review. The lowest concentration limit of quantification (cLOQ) is in the ng L(-1) range; however, most methods appear to have a cLOQ in the μg L(-1) range, if the cLOQ has been given.

  10. Comparative measurement of microcystins in diverse surface ...

    EPA Pesticide Factsheets

The measurement of microcystins, cyanotoxins associated with cyanobacterial blooms which are increasingly prevalent in inland waters, is complicated by the diversity of congeners which have been observed in the environment. At present, more than 150 microcystin congeners have been identified, and this poses a significant challenge to analytical methods intended to assess human health risks in surface and drinking water systems. The most widely employed analytical method at present is the ADDA-ELISA technique, which is potentially sensitive to all microcystins, but it is primarily intended as a semi-quantitative method, and questions have been raised regarding the potential for cross-reactivity and false positives. LC-MS/MS methods targeting specific congeners, such as US EPA Method 544, are intended for use as a secondary confirmation following a positive ELISA response, but these techniques can target only those congeners for which commercial standards are available. Accordingly, they are not suitable for ascertaining the safety of a given water sample, given the potential for omitting unknown microcystin congeners which might be present. An alternative approach involves oxidative transformation of microcystins to a common product, 2-methyl-3-methoxy-4-phenylbutyric acid, or MMPB. Measuring MMPB by LC-MS/MS can potentially provide a metric for the sum of all microcystin congeners present in a sample, subject to the efficiency and overall yield of conversion. The pres

  11. Rapid monitoring of glycerol in fermentation growth media: Facilitating crude glycerol bioprocess development.

    PubMed

    Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier

    2014-04-01

    Recently, the need to valorise crude glycerol from the biodiesel industry has generated many studies of practical and economic applications. Amongst them, fermentations based on glycerol media for the production of high-value metabolites are prominent applications. This has generated a need to develop analytical techniques that allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive to be adopted in research as well as in industrial applications. In this study, three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and a new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the other conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations for monitoring fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Development of Electrospun Nanomaterials and their Applications in Separation Science

    NASA Astrophysics Data System (ADS)

    Newsome, Toni Elwell

    In separations, efficiency is inversely related to the diameter of the sorbent particles of the stationary phase. Thus, materials research in separation science has primarily been directed towards reducing the diameter of the sorbent particle used in the stationary phase. In this dissertation, innovative methods designed for the fabrication and application of electrospun sorbent nanomaterials for separation science are described. Electrospinning is a facile, cost-effective technique that relies on repulsive electrostatic forces to produce nanofibers from a viscoelastic solution. Here, electrospinning is used to generate polymer, carbon, and silica-based nanofibers which are employed as sorbent nanomaterials in extractions and separations. Electrospun carbon nanofibers have proven to be ideal extractive phases for solid-phase microextraction (SPME) when coupled to gas chromatography (GC) for headspace sampling of volatile analytes. Herein, these carbon nanofibers were employed in the direct extraction of nonvolatile analytes and coupled to liquid chromatography (LC) for the first time. The high surface area of the coatings led to enhanced extraction efficiencies; they offered a 3- to 33-fold increase in efficiency relative to a commercial SPME phase. Carbon nanofibers proved to be stable when immersed in liquids common to LC, demonstrating the enhanced stability of these coatings in SPME coupled to LC relative to conventional SPME fibers. The enhanced chemical and mechanical stability of the carbon SPME coatings considerably expanded the range of compounds applicable to SPME and extended the lifetimes of the fibers. Electrospun nanofibers have also proven to be ideal stationary phases in ultra-thin layer chromatography (UTLC). Nanofibers provide faster separations and enhanced separation efficiencies compared to commercial particle-based stationary phases in a relatively short distance. 
Here, the electrospun-UTLC technology was extended for the first time to nanofibers composed of silica, the most commonly used surface for TLC. An electrospinning method was optimized to produce silica-based nanofibers with the smallest diameter possible (300-380 nm) while maintaining homogeneous nanofiber morphology. Highly efficient separations were performed in 15 mm with observed plate heights as low as 8.6 μm. Silica-based nanofibers proved to be chemically stable with a wide variety of TLC reagents, demonstrating the enhanced compatibility of these phases with common TLC methods relative to polymer and carbon nanofiber UTLC plates. The extension of electrospun UTLC to silica-based nanofibers vastly expanded the range of analytes and TLC methods which can be used with this technology. The main disadvantage of conventional TLC development methods is that the mobile phase velocity decreases with increasing separation distance. Here, the chromatographic performance of electrospun polymer stationary phases was further improved by using a forced-flow mobile phase in planar electrochromatography (PEC), in which mobile phase velocity does not diminish with increasing distance. Separations were performed on polymer nanofiber UTLC plates in 1-2 min. Compared to UTLC, PEC offered unique selectivity, decreased analysis times (>4 times faster), and enhanced efficiency (2-3 times lower plate height). In addition, two-dimensional (2D) separations of a complex analyte mixture using UTLC followed by PEC required only 11 min and exhibited a significant increase in separation number (70-77).

  13. Solving the linear inviscid shallow water equations in one dimension, with variable depth, using a recursion formula

    NASA Astrophysics Data System (ADS)

    Hernandez-Walls, R.; Martín-Atienza, B.; Salinas-Matus, M.; Castillo, J.

    2017-11-01

    When solving the linear inviscid shallow water equations with variable depth in one dimension using finite differences, a tridiagonal system of equations must be solved. Here we present an approach, which is more efficient than the commonly used numerical method, to solve this tridiagonal system of equations using a recursion formula. We illustrate this approach with an example in which we solve for a rectangular channel to find the resonance modes. Our numerical solution agrees very well with the analytical solution. This new method is easy for undergraduate students to use and understand, so it can be implemented in undergraduate courses such as Numerical Methods, Linear Algebra or Differential Equations.
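    The abstract does not reproduce the recursion formula itself. As a minimal sketch of the kind of recursion involved (the classic Thomas-algorithm forward/backward sweep for a tridiagonal system; the authors' exact formula may differ), the solver can be written as:

```python
import numpy as np

def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system via forward elimination and back
    substitution (Thomas algorithm).

    a: sub-diagonal (length n, a[0] unused)
    b: main diagonal (length n)
    c: super-diagonal (length n, c[-1] unused)
    d: right-hand side (length n)
    """
    n = len(d)
    cp = np.zeros(n)  # modified super-diagonal
    dp = np.zeros(n)  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal entries.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution: recover the solution from the last row upward.
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

    For a diagonally dominant system, such as the one arising from the finite-difference discretization described, this recursion runs in O(n) time rather than the O(n³) of a general dense solve.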

  14. A survey of analytical methods employed for monitoring of Advanced Oxidation/Reduction Processes for decomposition of selected perfluorinated environmental pollutants.

    PubMed

    Trojanowicz, Marek; Bobrowski, Krzysztof; Szostek, Bogdan; Bojanowska-Czajka, Anna; Szreder, Tomasz; Bartoszewicz, Iwona; Kulisa, Krzysztof

    2018-01-15

    The monitoring of Advanced Oxidation/Reduction Processes (AO/RPs) for the evaluation of the yield and mechanisms of decomposition of perfluorinated compounds (PFCs) is often a more difficult task than their determination in environmental, biological or food samples with complex matrices. This is mostly due to the formation of hundreds, or even thousands, of both intermediate and final products. The AO/RPs considered, involving free radical reactions, include photolytic and photocatalytic processes, Fenton reactions, sonolysis, ozonation, the application of ionizing radiation and several wet oxidation processes. The main attention is paid to the PFCs most commonly occurring in the environment, namely PFOA and PFOS. The most powerful and widely exploited method for this purpose is without a doubt LC/MS/MS, which allows the identification and trace quantitation of all species, with detectability and resolution power depending on the particular instrumental configuration. GC/MS is often employed for the monitoring of volatile fluorocarbons, confirming the formation of radicals in the processes of C–C and C–S bond cleavage. For the direct monitoring of radicals participating in the reactions of PFC decomposition, molecular spectroscopy is employed, especially electron paramagnetic resonance (EPR). UV/Vis spectrophotometry as a detection method is of special importance in the evaluation of the kinetics of radical reactions with the use of pulse radiolysis methods. Ion chromatography is most commonly employed for determining the yield of mineralization of PFCs, along with potentiometry with an ion-selective electrode and measurements of general parameters such as Total Organic Carbon and Total Organic Fluoride. The presented review is based on about 100 original papers published in both analytical and environmental journals. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Recommendations for choosing an analysis method that controls Type I error for unbalanced cluster sample designs with Gaussian outcomes.

    PubMed

    Johnson, Jacqueline L; Kreidler, Sarah M; Catellier, Diane J; Murray, David M; Muller, Keith E; Glueck, Deborah H

    2015-11-30

    We used theoretical and simulation-based approaches to study Type I error rates for one-stage and two-stage analytic methods for cluster-randomized designs. The one-stage approach uses the observed data as outcomes and accounts for within-cluster correlation using a general linear mixed model. The two-stage model uses the cluster specific means as the outcomes in a general linear univariate model. We demonstrate analytically that both one-stage and two-stage models achieve exact Type I error rates when cluster sizes are equal. With unbalanced data, an exact size α test does not exist, and Type I error inflation may occur. Via simulation, we compare the Type I error rates for four one-stage and six two-stage hypothesis testing approaches for unbalanced data. With unbalanced data, the two-stage model, weighted by the inverse of the estimated theoretical variance of the cluster means, and with variance constrained to be positive, provided the best Type I error control for studies having at least six clusters per arm. The one-stage model with Kenward-Roger degrees of freedom and unconstrained variance performed well for studies having at least 14 clusters per arm. The popular analytic method of using a one-stage model with denominator degrees of freedom appropriate for balanced data performed poorly for small sample sizes and low intracluster correlation. Because small sample sizes and low intracluster correlation are common features of cluster-randomized trials, the Kenward-Roger method is the preferred one-stage approach. Copyright © 2015 John Wiley & Sons, Ltd.
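    As a rough sketch of the preferred two-stage approach, the code below computes cluster means and inverse-variance weights of the form 1/(σ_b² + σ_w²/n_i), with the between-cluster variance constrained to be non-negative. The method-of-moments variance decomposition and the function names are assumptions for illustration; the paper's exact estimator is not reproduced in this summary.

```python
import numpy as np

def cluster_summaries(clusters):
    """Second-stage summary for one arm: cluster means plus weights equal
    to the inverse of the estimated theoretical variance of each cluster
    mean, sigma_b^2 + sigma_w^2 / n_i, with sigma_b^2 constrained >= 0."""
    means = np.array([np.mean(c) for c in clusters])
    sizes = np.array([len(c) for c in clusters], dtype=float)
    # Pooled within-cluster variance.
    ss = sum(((np.asarray(c) - np.mean(c)) ** 2).sum() for c in clusters)
    df = sum(len(c) - 1 for c in clusters)
    sigma_w2 = ss / df
    # Between-cluster variance by method of moments, constrained positive.
    sigma_b2 = max(0.0, means.var(ddof=1) - np.mean(sigma_w2 / sizes))
    weights = 1.0 / (sigma_b2 + sigma_w2 / sizes)
    return means, weights

def weighted_mean(means, weights):
    """Inverse-variance-weighted mean of the cluster means for one arm."""
    return float(np.sum(weights * means) / np.sum(weights))
```

    With balanced clusters the weights are equal and the approach reduces to an unweighted analysis of cluster means, which is why both one-stage and two-stage models achieve exact Type I error rates in that case.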

  16. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  17. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  18. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  19. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...

  20. A multi-targeted liquid chromatography-mass spectrometry screening procedure for the detection in human urine of drugs non-prohibited in sport commonly used by the athletes.

    PubMed

    Mazzarino, Monica; Cesarei, Lorenzo; de la Torre, Xavier; Fiacco, Ilaria; Robach, Paul; Botrè, Francesco

    2016-01-05

    This work presents an analytical method for the simultaneous analysis in human urine of 38 pharmacologically active compounds (19 benzodiazepine-like substances, 7 selective serotonin reuptake inhibitors, 4 azole antifungal drugs, 5 inhibitors of the phosphodiesterase type 4 and 3 inhibitors of the phosphodiesterase type 5) by liquid chromatography coupled with tandem mass spectrometry. These substance classes include both the most common "non-banned" drugs used by athletes (based on the information reported on the "doping control form") and those drugs suspected to be performance enhancing and/or to act as masking agents under particular conditions. The chromatographic separation was performed on a reversed-phase octadecyl column using acetonitrile and ultra-purified water, both with 0.1% formic acid, as mobile phases. Detection was carried out using a triple quadrupole mass spectrometric analyser with positive electrospray ionization and selected reaction monitoring as the acquisition mode. Sample pre-treatment consisted of enzymatic hydrolysis followed by liquid-liquid extraction under neutral conditions using tert-butyl methyl ether. The analytical procedure, once developed, was validated in terms of sensitivity (lower limits of detection in the range of 1-50 ng mL(-1)), specificity (no interferences were detected at the retention times of the analytes under investigation), recovery (≥60%, with satisfactory repeatability, CV% lower than 10), matrix effect (lower than 30%) and reproducibility of retention times (CV% lower than 0.1) and of relative abundances (CV% lower than 15). 
The performance and applicability of the method were evaluated by analyzing real samples containing benzodiazepines (alprazolam, diazepam, zolpidem or zopiclone) or inhibitors of the phosphodiesterase type 5 (sildenafil or vardenafil), and samples obtained by incubating two of the phosphodiesterase type 4 inhibitors studied (cilomilast or roflumilast) with pooled human liver microsomes. All the parent compounds, together with their main phase I metabolites, were clearly detected using the analytical procedure developed here. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Capacitive touch sensing : signal and image processing algorithms

    NASA Astrophysics Data System (ADS)

    Baharav, Zachi; Kakarala, Ramakrishna

    2011-03-01

    Capacitive touch sensors have been in use for many years, and recently gained center stage with their ubiquitous use in smartphones. In this work we analyze the most common method of projected capacitive sensing, absolute capacitive sensing, together with the most common sensing pattern, the diamond-shaped sensor. After a brief introduction to the problem and the reasons behind its popularity, we formulate the problem as reconstruction from projections. We derive analytic solutions for two simple cases: a circular finger on a wire grid, and a square finger on a square grid. The solutions give insight into the ambiguities of finding finger location from sensor readings. The main contribution of our paper is the discussion of interpolation algorithms, including simple linear interpolation, curve fitting (parabolic and Gaussian), filtering, general look-up tables, and combinations thereof. We conclude with observations on the limits of the present algorithmic methods, and point to possible future research.
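    The parabolic and Gaussian curve-fitting interpolators mentioned can be illustrated with the classic three-point peak estimators (an illustrative sketch, not necessarily the authors' exact formulation): a parabola is fitted through the three sensor readings around the maximum, and its vertex gives the sub-sensor finger position.

```python
import math

def parabolic_offset(y_left, y_center, y_right):
    """Offset of the peak of a parabola fitted through three adjacent
    sensor readings, in sensor-pitch units relative to the center sensor
    (result lies in [-0.5, 0.5] when y_center is the maximum)."""
    denom = y_left - 2.0 * y_center + y_right
    if denom == 0.0:
        return 0.0  # flat profile: no sub-sensor information
    return 0.5 * (y_left - y_right) / denom

def gaussian_offset(y_left, y_center, y_right):
    """Same idea under a Gaussian peak model: a Gaussian becomes a
    parabola in the log domain, so fit the parabola to the logarithms
    of the (positive) readings."""
    return parabolic_offset(math.log(y_left), math.log(y_center),
                            math.log(y_right))
```

    The Gaussian variant is exact when the sensor response really is Gaussian in position, while the parabolic variant is exact for a quadratic profile; both degrade gracefully and are cheap enough for per-frame use.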

  2. Analytical method for promoting process capability of shock absorption steel.

    PubMed

    Sung, Wen-Pei; Shih, Ming-Hsiang; Chen, Kuen-Suan

    2003-01-01

    Mechanical properties and low-cycle fatigue are two factors that must be considered in developing new types of steel for shock absorption. Process capability and process control are significant factors in achieving the purpose of research and development programs. Commonly used evaluation methods fail to measure process yield and process centering, so this paper uses the Taguchi loss function as a basis for establishing an evaluation method, and the steps for assessing the quality of mechanical properties and process control, for an iron and steel manufacturer. This method can serve research, development, and manufacturing, and lays a foundation for enhancing process control and for selecting manufacturing processes more reliably than the other commonly used decision-making methods.
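    The paper's actual evaluation steps are not given in this summary; as a hedged sketch of the Taguchi-loss idea, the standard quadratic loss and the Taguchi-style capability index Cpm (which, unlike Cp, penalizes off-target centering as well as spread) can be written as:

```python
import math

def taguchi_loss(y, target, k=1.0):
    """Taguchi quadratic loss: any deviation from the target incurs loss,
    even inside the specification limits."""
    return k * (y - target) ** 2

def cpm(data, lsl, usl, target):
    """Taguchi-style capability index Cpm. The denominator combines the
    process spread with the squared off-target bias, so the index reflects
    both process yield and process centering."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / (n - 1)
    tau = math.sqrt(var + (mu - target) ** 2)
    return (usl - lsl) / (6.0 * tau)
```

    A process that drifts off target sees its Cpm fall even if its spread is unchanged, which is exactly the centering sensitivity that the commonly used yield-only indices miss.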

  3. Sources and preparation of data for assessing trends in concentrations of pesticides in streams of the United States, 1992–2010

    USGS Publications Warehouse

    Martin, Jeffrey D.; Eberle, Michael; Nakagaki, Naomi

    2011-01-01

    This report updates a previously published water-quality dataset of 44 commonly used pesticides and 8 pesticide degradates suitable for a national assessment of trends in pesticide concentrations in streams of the United States. Water-quality samples collected from January 1992 through September 2010 at stream-water sites of the U.S. Geological Survey (USGS) National Water-Quality Assessment (NAWQA) Program and the National Stream Quality Accounting Network (NASQAN) were compiled, reviewed, selected, and prepared for trend analysis. The principal steps in data review for trend analysis were to (1) identify analytical schedule, (2) verify sample-level coding, (3) exclude inappropriate samples or results, (4) review pesticide detections per sample, (5) review high pesticide concentrations, and (6) review the spatial and temporal extent of NAWQA pesticide data and selection of analytical methods for trend analysis. The principal steps in data preparation for trend analysis were to (1) select stream-water sites for trend analysis, (2) round concentrations to a consistent level of precision for the concentration range, (3) identify routine reporting levels used to report nondetections unaffected by matrix interference, (4) reassign the concentration value for routine nondetections to the maximum value of the long-term method detection level (maxLT-MDL), (5) adjust concentrations to compensate for temporal changes in bias of recovery of the gas chromatography/mass spectrometry (GCMS) analytical method, and (6) identify samples considered inappropriate for trend analysis. Samples analyzed at the USGS National Water Quality Laboratory (NWQL) by the GCMS analytical method were the most extensive in time and space and, consequently, were selected for trend analysis. Stream-water sites with 3 or more water years of data with six or more samples per year were selected for pesticide trend analysis. 
The selection criteria described in the report produced a dataset of 21,988 pesticide samples at 212 stream-water sites. Only 21,144 pesticide samples, however, are considered appropriate for trend analysis.
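    Preparation steps (2) and (4) above can be sketched as follows. The three-significant-figure rule and the function names are hypothetical illustrations; the report's actual precision conventions are not reproduced in this summary.

```python
import math

def round_sig(x, sig=3):
    """Round a concentration to a consistent number of significant figures,
    so precision scales with the concentration range."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - int(math.floor(math.log10(abs(x)))))

def prepare_concentration(value, detected, max_lt_mdl, sig=3):
    """Reassign a routine nondetection to the maximum long-term method
    detection level (maxLT-MDL), then round to the chosen precision."""
    conc = value if detected else max_lt_mdl
    return round_sig(conc, sig)
```

    Reassigning nondetections to a common censoring level keeps the trend tests from being driven by laboratory reporting-level changes rather than by real changes in concentration.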

  4. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    PubMed

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

    Recent studies suggest that vitamin D deficiency is linked to increased risk of common human health problems. To define vitamin D 'status', most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supported liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites as well as to separate an interfering isobar, 7α-hydroxy-4-cholesten-3-one (7αC4, a bile acid precursor), and was validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220 μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variations. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  6. Screening for illicit and medicinal drugs in whole blood using fully automated SPE and ultra-high-performance liquid chromatography with TOF-MS with data-independent acquisition.

    PubMed

    Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav; Rasmussen, Brian Schou; Müller, Irene Breum; Johansen, Sys Stybe; Linnet, Kristian

    2013-07-01

    A broad forensic screening method for 256 analytes in whole blood based on a fully automated SPE robotic extraction and ultra-high-performance liquid chromatography (UHPLC) with TOF-MS with data-independent acquisition has been developed. The limit of identification was evaluated for all 256 compounds, and 95 of these compounds were validated with regard to matrix effects, extraction recovery, and process efficiency. The limit of identification ranged from 0.001 to 0.1 mg/kg, and the process efficiency exceeded 50% for 73 of the 95 analytes. As an example of application, 1335 forensic traffic cases were analyzed with the presented screening method. Of these, 992 cases (74%) were positive for one or more traffic-relevant drugs above the Danish legal limits. Commonly abused drugs such as amphetamine, cocaine, and frequent types of benzodiazepines were the major findings. Nineteen less frequently encountered drugs were detected, e.g. buprenorphine, butylone, cathine, fentanyl, lysergic acid diethylamide, m-chlorophenylpiperazine, 3,4-methylenedioxypyrovalerone, mephedrone, 4-methylamphetamine, p-fluoroamphetamine, and p-methoxy-N-methylamphetamine. In conclusion, using UHPLC-TOF-MS screening with data-independent acquisition resulted in the detection of common drugs of abuse as well as new designer drugs and more rarely occurring drugs. Thus, TOF-MS screening of blood samples constitutes a practical way for screening traffic cases, with the exception of Δ9-tetrahydrocannabinol, which should be handled in a separate method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Simultaneous Quantification of Free and Glucuronidated Cannabinoids in Human Urine by Liquid Chromatography-Tandem Mass Spectrometry

    PubMed Central

    Scheidweiler, Karl B.; Desrosiers, Nathalie A.; Huestis, Marilyn A.

    2012-01-01

    Background Cannabis is the most commonly abused illicit drug and is routinely quantified during urine drug testing. We conducted controlled drug administration studies investigating the efficacy of urinary cannabinoid glucuronide metabolites for documenting recency of cannabis intake and for determining the stability of urinary cannabinoids. Methods A liquid chromatography-tandem mass spectrometry method was developed and validated for quantifying Δ9-tetrahydrocannabinol (THC), 11-hydroxy-THC (11-OH-THC), 11-nor-9-carboxy-THC (THCCOOH), cannabidiol, cannabinol, THC-glucuronide and THCCOOH-glucuronide in 0.5 ml human urine via supported-liquid extraction. Chromatography was performed on an Ultra Biphenyl column with a gradient of 10 mmol/l ammonium acetate, pH 6.15, and 15% methanol in acetonitrile at 0.4 ml/min. Analytes were monitored by positive and negative mode electrospray ionization and multiple reaction monitoring mass spectrometry. Results Linear ranges were 0.5–50 ng/ml for THC-glucuronide, 1–100 ng/ml for THCCOOH, 11-OH-THC and cannabidiol, 2–100 ng/ml for THC and cannabinol, and 5–500 ng/ml for THCCOOH-glucuronide (R2>0.99). Mean extraction efficiencies were 34–73%, with analytical recovery (bias) 80.5–118.0% and total imprecision 3.0–10.2% coefficient of variation. Conclusion This method simultaneously quantifies urinary cannabinoids and phase II glucuronide metabolites, and enables evaluation of urinary cannabinoid glucuronides for documenting recency of cannabis intake and cannabinoid stability. The assay is applicable for routine urine cannabinoid testing. PMID:22771478

  8. THE EFFECT OF MERCURY CONTROLS ON WALLBOARD MANUFACTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandra Meischen

    2004-07-01

    Pending EPA regulations may mandate 70 to 90% mercury removal efficiency from utility flue gas. A mercury control option is the trapping of oxidized mercury in wet flue gas desulfurization (FGD) systems. The potential doubling of mercury in the FGD material, and its effect on mercury volatility at temperatures common to wallboard manufacture, is a concern that could limit the growing byproduct use of FGD material. Prediction of mercury fate is limited by lack of information on the form of mercury in the FGD material. The parts-per-billion mercury concentrations prevent the identification of mercury compounds by common analytical methods. A sensitive analytical method, cold vapor atomic fluorescence, coupled with leaching and thermodecomposition methods, were evaluated for their potential to identify mercury compounds in FGD material. The results of the study suggest that the mercury form is dominated by the calcium sulfate matrix and is probably associated with the sulfate form in the FGD material. Additionally, to determine the effect of high-mercury-concentration FGD material on wallboard manufacture, a laboratory FGD unit was built to trap the oxidized mercury generated in a simulated flue gas. Although the laboratory-prepared FGD material did not contain the mercury concentrations anticipated, further thermal tests determined that mercury begins to evolve from FGD material at 380 to 390 F; consequently, lowering the drying temperature should mitigate mercury evolution if necessary. Mercury evolution also diminishes as the weight of the wallboard sample increases. Consequently, mercury evolution may not be a significant problem in wallboard manufacture.

  9. Development of a multi-matrix LC-MS/MS method for urea quantitation and its application in human respiratory disease studies.

    PubMed

    Wang, Jianshuang; Gao, Yang; Dorshorst, Drew W; Cai, Fang; Bremer, Meire; Milanowski, Dennis; Staton, Tracy L; Cape, Stephanie S; Dean, Brian; Ding, Xiao

    2017-01-30

    In human respiratory disease studies, liquid samples such as nasal secretion (NS), lung epithelial lining fluid (ELF), or upper airway mucosal lining fluid (MLF) are frequently collected, but their volumes often remain unknown. The lack of volume information makes it hard to estimate the actual concentration of recovered active pharmaceutical ingredient or biomarkers. Urea has been proposed to serve as a sample volume marker because it can freely diffuse through most body compartments and is less affected by disease states. Here, we report an easy and reliable LC-MS/MS method for cross-matrix measurement of urea in serum, plasma, universal transfer medium (UTM), synthetic absorptive matrix elution buffer 1 (SAMe1) and synthetic absorptive matrix elution buffer 2 (SAMe2), which are commonly sampled in human respiratory disease studies. The method uses two stable-isotope-labeled urea isotopologues, [15N2]-urea and [13C,15N2]-urea, as the surrogate analyte and the internal standard, respectively. This approach provides the best measurement consistency across different matrices. The analyte extraction was individually optimized in each matrix. Specifically in UTM, SAMe1 and SAMe2, the unique salting-out assisted liquid-liquid extraction (SALLE) not only dramatically reduces the matrix interferences but also improves the assay recovery. The use of an HILIC column largely increases the analyte retention. The typical run time is 3.6 min, which allows for high throughput analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio, and has made progress in single channels, excluding text. Visual analytics has focused on user interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  11. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Oncorhynchus mykiss): reaction norm and factor analytic models

    PubMed Central

    2014-01-01

    Background Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore its possible causes. Methods Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest were recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water-temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. Results The combination of days to harvest multiplied by daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and the other environments. Akaike and Bayesian information criteria indicated that the factor analytic model was more parsimonious than the reaction norm model. 
Conclusions Day*Degree and photoperiod were identified as environmental variables responsible for the strong GxE interaction for body weight at harvest in rainbow trout across four environments. Both the reaction norm and the factor analytic models can help identify the environmental variables responsible for GxE interaction. A factor analytic model is preferred over a reaction norm model when limited information on differences in environmental variables between farms is available. PMID:24571451

  12. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  13. Aberrant Gene Expression in Humans

    PubMed Central

    Yang, Ence; Ji, Guoli; Brinkmeyer-Langford, Candice L.; Cai, James J.

    2015-01-01

    Gene expression as an intermediate molecular phenotype has been a focus of research interest. In particular, studies of expression quantitative trait loci (eQTL) have offered promise for understanding gene regulation through the discovery of genetic variants that explain variation in gene expression levels. Existing eQTL methods are designed for assessing the effects of common variants, but not rare variants. Here, we address the problem by establishing a novel analytical framework for evaluating the effects of rare or private variants on gene expression. Our method starts from the identification of outlier individuals that show markedly different gene expression from the majority of a population, and then reveals the contributions of private SNPs to the aberrant gene expression in these outliers. Using population-scale mRNA sequencing data, we identify outlier individuals using a multivariate approach. We find that outlier individuals are more readily detected with respect to gene sets that include genes involved in cellular regulation and signal transduction, and less likely to be detected with respect to the gene sets with genes involved in metabolic pathways and other fundamental molecular functions. Analysis of polymorphic data suggests that private SNPs of outlier individuals are enriched in the enhancer and promoter regions of corresponding aberrantly-expressed genes, suggesting a specific regulatory role of private SNPs, while the commonly-occurring regulatory genetic variants (i.e., eQTL SNPs) show little evidence of involvement. Additional data suggest that non-genetic factors may also underlie aberrant gene expression. Taken together, our findings advance a novel viewpoint relevant to situations wherein common eQTLs fail to predict gene expression when heritable, rare inter-individual variation exists. 
The analytical framework we describe, taking into consideration the reality of differential phenotypic robustness, may be valuable for investigating complex traits and conditions. PMID:25617623

  14. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreiss, J.; Maenhaut, W.; Alves, C.; Bossi, R.; Bjerke, A.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Gülcin, A.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2014-07-01

    The monosaccharide anhydrides (MAs) levoglucosan, galactosan and mannosan are products of incomplete combustion and pyrolysis of cellulose and hemicelluloses, and are found to be major constituents of biomass burning aerosol particles. Hence, ambient aerosol particle concentrations of levoglucosan are commonly used to study the influence of residential wood burning, agricultural waste burning and wild fire emissions on ambient air quality. A European-wide intercomparison on the analysis of the three monosaccharide anhydrides was conducted based on ambient aerosol quartz fiber filter samples collected at a Norwegian urban background site during winter. Thus, the samples' content of MAs is representative for biomass burning particles originating from residential wood burning. The purpose of the intercomparison was to examine the comparability of the great diversity of analytical methods used for analysis of levoglucosan, mannosan and galactosan in ambient aerosol filter samples. Thirteen laboratories participated, of which three applied High-Performance Anion-Exchange Chromatography (HPAEC), four used High-Performance Liquid Chromatography (HPLC) or Ultra-Performance Liquid Chromatography (UPLC), and six resorted to Gas Chromatography (GC). The analytical methods used were of such diversity that they should be considered as thirteen different analytical methods. All of the thirteen laboratories reported levels of levoglucosan, whereas nine reported data for mannosan and/or galactosan. Eight of the thirteen laboratories reported levels for all three isomers. The accuracy for levoglucosan, presented as the mean percentage error (PE) for each participating laboratory, varied from -63 to 23%; however, for 62% of the laboratories the mean PE was within ±10%, and for 85% the mean PE was within ±20%. 
For mannosan, the corresponding range was -60 to 69%, but as for levoglucosan, the range was substantially smaller for a subselection of the laboratories; i.e., for 33% of the laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan, and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood vs. hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- on filter samples, a constituent that has been analyzed by numerous laboratories for several decades, typically by ion chromatography, and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wild fires and/or agricultural fires.
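
    The accuracy metric used throughout the intercomparison, the mean percentage error of a laboratory's reported concentrations against the reference values, is straightforward to compute. A minimal sketch with hypothetical numbers (not data from the study):

```python
def mean_percentage_error(reported, reference):
    """Mean percentage error (PE) of reported concentrations
    against reference values, in percent."""
    pes = [100.0 * (r - ref) / ref for r, ref in zip(reported, reference)]
    return sum(pes) / len(pes)

# Hypothetical levoglucosan results (ng/m3) for one laboratory
reference = [100.0, 250.0, 400.0]
reported = [95.0, 240.0, 390.0]
pe = mean_percentage_error(reported, reference)
print(f"{pe:.1f}%")  # -3.8%, within the ±10% band most laboratories achieved
```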

  15. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreis, J.; Maenhaut, W.; Abbaszade, G.; Alves, C.; Bjerke, A.; Bonnier, N.; Bossi, R.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2015-01-01

    The monosaccharide anhydrides (MAs) levoglucosan, galactosan and mannosan are products of incomplete combustion and pyrolysis of cellulose and hemicelluloses, and are found to be major constituents of biomass burning (BB) aerosol particles. Hence, ambient aerosol particle concentrations of levoglucosan are commonly used to study the influence of residential wood burning, agricultural waste burning and wildfire emissions on ambient air quality. A European-wide intercomparison on the analysis of the three monosaccharide anhydrides was conducted based on ambient aerosol quartz fiber filter samples collected at a Norwegian urban background site during winter. Thus, the samples' content of MAs is representative for BB particles originating from residential wood burning. The purpose of the intercomparison was to examine the comparability of the great diversity of analytical methods used for analysis of levoglucosan, mannosan and galactosan in ambient aerosol filter samples. Thirteen laboratories participated, of which three applied high-performance anion-exchange chromatography (HPAEC), four used high-performance liquid chromatography (HPLC) or ultra-performance liquid chromatography (UPLC) and six resorted to gas chromatography (GC). The analytical methods used were of such diversity that they should be considered as thirteen different analytical methods. All of the thirteen laboratories reported levels of levoglucosan, whereas nine reported data for mannosan and/or galactosan. Eight of the thirteen laboratories reported levels for all three isomers. The accuracy for levoglucosan, presented as the mean percentage error (PE) for each participating laboratory, varied from -63 to 20%; however, for 62% of the laboratories the mean PE was within ±10%, and for 85% the mean PE was within ±20%. For mannosan, the corresponding range was -60 to 69%, but as for levoglucosan, the range was substantially smaller for a subselection of the laboratories; i.e. 
    for 33% of the laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood versus hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- (sulfate) on filter samples, a constituent that has been analysed by numerous laboratories for several decades, typically by ion chromatography, and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wildfires and/or agricultural fires.

  16. Determination of eight artificial sweeteners and common Stevia rebaudiana glycosides in non-alcoholic and alcoholic beverages by reversed-phase liquid chromatography coupled with tandem mass spectrometry.

    PubMed

    Kubica, Paweł; Namieśnik, Jacek; Wasik, Andrzej

    2015-02-01

    The method for the determination of acesulfame-K, saccharin, cyclamate, aspartame, sucralose, alitame, neohesperidin dihydrochalcone, neotame and five common steviol glycosides (rebaudioside A, rebaudioside C, steviol, steviolbioside and stevioside) in soft and alcoholic beverages was developed using high-performance liquid chromatography and tandem mass spectrometry with electrospray ionisation (HPLC-ESI-MS/MS). To the best of our knowledge, this is the first work that presents an HPLC-ESI-MS/MS method which allows for the simultaneous determination of all EU-authorised high-potency sweeteners (thaumatin being the only exception) in one analytical run. The minimalistic sample preparation procedure consisted of only two operations: dilution and centrifugation. Linearity, limits of detection and quantitation, repeatability, and trueness of the method were evaluated. The obtained recoveries at three tested concentration levels varied from 97.0 to 105.7%, with relative standard deviations lower than 4.1%. The proposed method was successfully applied for the determination of sweeteners in 24 samples of different soft and alcoholic drinks.

  17. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  18. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  19. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  20. Filter Membrane Effects on Water-Extractable Phosphorus Concentrations from Soil.

    PubMed

    Norby, Jessica; Strawn, Daniel; Brooks, Erin

    2018-03-01

    To accurately assess P concentrations in soil extracts, standard laboratory practices for monitoring P concentrations are needed. Water-extractable P is a common analytical test to determine P availability for leaching from soils, and it is used to determine best management practices. Most P analytical tests require filtration through a filter membrane with 0.45-μm pore size to distinguish between particulate and dissolved P species. However, filter membrane type is rarely specified in method protocols, and many different types of membranes are available. In this study, three common filter membrane materials (polyether sulfone, nylon, and nitrocellulose), all with 0.45-μm pore sizes, were tested for analytical differences in total P concentrations and dissolved reactive P (DRP) concentrations in water extracts from six soils sampled from two regions. Three of the extracts from the six soil samples had different total P concentrations for all three membrane types. The other three soil extracts had significantly different total P results from at least one filter membrane type. Total P concentration differences were as great as 35%. The DRP concentrations in the extracts were dependent on filter type in five of the six soil types. Results from this research show that filter membrane type is an important parameter that affects concentrations of total P and DRP from soil extracts. Thus, membrane type should be specified in soil extraction protocols. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  1. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  2. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  3. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  4. APC: A New Code for Atmospheric Polarization Computations

    NASA Technical Reports Server (NTRS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  5. Olodaterol and vilanterol detection in sport drug testing.

    PubMed

    Chundela, Zdenek; Große, Joachim

    2015-01-01

    The possibility of the detection of olodaterol and vilanterol, two novel β2 -agonists, in human urine for the purpose of sport drug testing was investigated. Compounds of interest were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) employing methods commonly used in the World Anti-Doping Agency (WADA) accredited laboratories. For both substances, the respective parent compound was found to be a suitable target analyte for monitoring therapeutic dose administration. Copyright © 2015 John Wiley & Sons, Ltd.

  6. NASA Lewis Research Center's Program on Icing Research

    NASA Technical Reports Server (NTRS)

    Reinmann, J. J.; Shaw, R. J.; Olsen, W. A., Jr.

    1982-01-01

    The helicopter and general aviation, light transport, and commercial transport aircraft share common icing requirements: highly effective, lightweight, low power consuming deicing systems, and detailed knowledge of the aeropenalties due to ice on aircraft surfaces. To meet current and future needs, NASA has a broadbased icing research program which covers both research and engineering applications, and is well coordinated with the FAA, DOD, universities, industry, and some foreign governments. Research activity in ice protection systems, icing instrumentation, experimental methods, analytical modeling, and in-flight research are described.

  7. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work, a new modification of the standard addition method, called the "net analyte signal standard addition method" (NASSAM), is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept, and can be applied to the determination of an analyte in the presence of known interferents. Unlike the H-point standard addition method, the accuracy of its predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
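
    The net analyte signal underlying this family of methods is, in essence, the part of a measured spectrum orthogonal to the interferent spectra. A minimal NumPy sketch of that projection (an illustration of the general concept, not the authors' implementation; all spectra are toy values):

```python
import numpy as np

def net_analyte_signal(r, S_interf):
    """Project a measured spectrum r onto the subspace orthogonal to the
    interferent spectra (columns of S_interf); the remainder is the part
    of the signal attributable only to the analyte."""
    # Orthogonal projector onto the interferent column space
    P = S_interf @ np.linalg.pinv(S_interf)
    return (np.eye(len(r)) - P) @ r

# Toy 4-channel spectra: one analyte, one known interferent
s_analyte = np.array([1.0, 0.5, 0.2, 0.0])
s_interf = np.array([[0.0], [0.3], [0.6], [1.0]])
r = 2.0 * s_analyte + 1.5 * s_interf[:, 0]  # mixture spectrum
nas = net_analyte_signal(r, s_interf)
print(abs(nas @ s_interf[:, 0]) < 1e-10)  # True: interferent projected out
```

    Because the projection is linear and annihilates the interferent, the NAS of the mixture is exactly twice the NAS of the pure analyte spectrum here, which is what makes univariate-style standard addition possible on the projected signal.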

  8. Bacteriophage-based nanoprobes for rapid bacteria separation

    NASA Astrophysics Data System (ADS)

    Chen, Juhong; Duncan, Bradley; Wang, Ziyuan; Wang, Li-Sheng; Rotello, Vincent M.; Nugen, Sam R.

    2015-10-01

    The lack of practical methods for bacterial separation remains a hindrance for the low-cost and successful development of rapid detection methods from complex samples. Antibody-tagged magnetic particles are commonly used to pull analytes from a liquid sample. While this method is well-established, improvements in capture efficiencies would result in an increase of the overall detection assay performance. Bacteriophages represent a low-cost and more consistent biorecognition element as compared to antibodies. We have developed nanoscale bacteriophage-tagged magnetic probes, where T7 bacteriophages were bound to magnetic nanoparticles. The nanoprobe allowed the specific recognition and attachment to E. coli cells. The phage magnetic nanoprobes were directly compared to antibody-conjugated magnetic nanoprobes. The capture efficiencies of bacteriophages and antibodies on nanoparticles for the separation of E. coli K12 at varying concentrations were determined. The results indicated a similar bacteria capture efficiency between the two nanoprobes. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr03779d
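
    The capture efficiency compared in this study is simply the fraction of input cells recovered on the magnetic particles; a trivial sketch with hypothetical plate counts (not values from the paper):

```python
def capture_efficiency(cfu_captured: float, cfu_input: float) -> float:
    """Fraction of input cells recovered by the magnetic probe, in percent."""
    return 100.0 * cfu_captured / cfu_input

# Hypothetical plate counts (CFU/mL) comparing the two probe types
phage_eff = capture_efficiency(4.2e4, 1.0e5)
antibody_eff = capture_efficiency(4.5e4, 1.0e5)
print(phage_eff, antibody_eff)  # 42.0 45.0 -- similar, as the study reports
```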

  9. Eco-analytical Methodology in Environmental Problems Monitoring

    NASA Astrophysics Data System (ADS)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions will influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists is developing quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. It is thus expected to find the optimal trajectory of industrial development to prevent irreversible problems in the biosphere that could halt the progress of civilization.

  10. Deflection Shape Reconstructions of a Rotating Five-blade Helicopter Rotor from TLDV Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fioretti, A.; Castellini, P.; Tomasini, E. P.

    2010-05-28

    Helicopters are aircraft that are subjected to high levels of vibration, mainly due to their spinning rotors. These are made of two or more blades attached by hinges to a central hub, which can make the dynamic behaviour difficult to study. However, they share some common dynamic properties with those expected in bladed discs, so the analytical modelling of rotors can be performed using some of the assumptions adopted for bladed discs. This paper presents results of a vibration study performed on a scaled helicopter rotor model which was rotating at a fixed rotational speed and excited by an air jet. A simplified analytical model of that rotor was also produced to help the identification of the vibration patterns measured using a single point tracking-SLDV measurement method.

  11. Preanalytical requirements of urinalysis

    PubMed Central

    Delanghe, Joris; Speeckaert, Marijn

    2014-01-01

    Urine may be a waste product, but it contains an enormous amount of information. Well-standardized procedures for collection, transport, sample preparation and analysis should become the basis of an effective diagnostic strategy for urinalysis. As reproducibility of urinalysis has been greatly improved due to recent technological progress, preanalytical requirements of urinalysis have gained importance and have become stricter. Since the patients themselves often sample urine specimens, urinalysis is very susceptible to preanalytical issues. Various sampling methods and inappropriate specimen transport can cause important preanalytical errors. The use of preservatives may be helpful for particular analytes. Unfortunately, a universal preservative that allows a complete urinalysis does not (yet) exist. The preanalytical aspects are also of major importance for newer applications (e.g. metabolomics). The present review deals with the current preanalytical problems and requirements for the most common urinary analytes. PMID:24627718

  12. A Sensitive and Effective Proteomic Approach to Identify She-Donkey’s and Goat’s Milk Adulterations by MALDI-TOF MS Fingerprinting

    PubMed Central

    Di Girolamo, Francesco; Masotti, Andrea; Salvatori, Guglielmo; Scapaticci, Margherita; Muraca, Maurizio; Putignani, Lorenza

    2014-01-01

    She-donkey’s milk (DM) and goat’s milk (GM) are commonly used in newborn and infant feeding because they are less allergenic than other milk types. It is, therefore, mandatory to avoid adulteration and contamination by other milk allergens, developing fast and efficient analytical methods to assess the authenticity of these precious nutrients. In this experimental work, a sensitive and robust matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) profiling was designed to assess the genuineness of DM and GM milks. This workflow allows the identification of DM and GM adulteration at levels of 0.5%, thus, representing a sensitive tool for milk adulteration analysis, if compared with other laborious and time-consuming analytical procedures. PMID:25110863

  13. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance.

    PubMed

    Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve

    2010-10-01

    This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" where users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight about people's collective health status of whole populations. Several health related tool examples are described and demonstrated as practical means through which health professionals might create clear location specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  14. A novel analysis method for paired-sample microbial ecology experiments

    DOE PAGES

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.; ...

    2016-05-06

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
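
    The Poisson lognormal compound distribution on which the method rests is easy to simulate with the standard library alone. A hedged sketch (parameters are illustrative, and this simulates the distribution rather than reproducing the authors' fitting code): each taxon count is a Poisson draw whose rate is itself lognormally distributed.

```python
import math
import random

def sample_poisson_lognormal(mu, sigma, n, seed=0):
    """Draw n taxon counts from a Poisson-lognormal distribution:
    each count is Poisson with a rate drawn from LogNormal(mu, sigma)."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n):
        rate = math.exp(rng.gauss(mu, sigma))
        # Simple multiplicative (Knuth-style) Poisson sampler;
        # adequate for moderate rates
        threshold, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        counts.append(k)
    return counts

counts = sample_poisson_lognormal(mu=1.0, sigma=1.0, n=5000)
# Heavy right tail relative to a plain Poisson with the same mean
print(max(counts), sum(counts) / len(counts))
```

    The lognormally distributed rate is what produces the over-dispersed, heavy-tailed abundance distributions typical of 16S rRNA taxon counts.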

  15. A novel analysis method for paired-sample microbial ecology experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".

  16. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship-Quasi-Experimental Designs.

    PubMed

    Schweizer, Marin L; Braun, Barbara I; Milstone, Aaron M

    2016-10-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt, nonrandomized interventions. Quasi-experimental studies can be categorized into 3 major types: interrupted time-series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship, including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. Infect Control Hosp Epidemiol 2016;1-6.
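
    Of the designs listed, the interrupted time-series design lends itself to a compact illustration: segmented regression estimates the level change at the intervention while controlling for the underlying trend. A minimal sketch on simulated data (not an analysis from the paper):

```python
import numpy as np

# Interrupted time series: monthly infection rates with a level change
# after an intervention at month 12 (simulated, illustrative values).
rng = np.random.default_rng(42)
t = np.arange(24)
post = (t >= 12).astype(float)
rates = 10.0 - 0.1 * t - 3.0 * post + rng.normal(0, 0.3, size=t.size)

# Design matrix: intercept, underlying trend, post-intervention level change
X = np.column_stack([np.ones_like(t, dtype=float), t, post])
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
intercept, trend, level_change = coef
print(round(level_change, 1))  # close to the simulated -3.0 drop
```

    Fitting the pre-intervention trend explicitly is what distinguishes this from a naive before/after comparison, which would confound the secular trend with the intervention effect.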

  17. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship – Quasi-Experimental Designs

    PubMed Central

    Schweizer, Marin L.; Braun, Barbara I.; Milstone, Aaron M.

    2016-01-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt non-randomized interventions. Quasi-experimental studies can be categorized into three major types: interrupted time series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. PMID:27267457

  18. Environmental analysis of higher brominated diphenyl ethers and decabromodiphenyl ethane.

    PubMed

    Kierkegaard, Amelie; Sellström, Ulla; McLachlan, Michael S

    2009-01-16

    Methods for environmental analysis of higher brominated diphenyl ethers (PBDEs), in particular decabromodiphenyl ether (BDE209), and the recently discovered environmental contaminant decabromodiphenyl ethane (deBDethane) are reviewed. The extensive literature on analysis of BDE209 has identified several critical issues, including contamination of the sample, degradation of the analyte during sample preparation and GC analysis, and the selection of appropriate detection methods and surrogate standards. The limited experience with the analysis of deBDethane suggests that there are many commonalities with BDE209. The experience garnered from the analysis of BDE209 over the last 15 years will greatly facilitate progress in the analysis of deBDethane.

  19. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    ERIC Educational Resources Information Center

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  20. Capacity Enablers and Barriers for Learning Analytics: Implications for Policy and Practice

    ERIC Educational Resources Information Center

    Wolf, Mary Ann; Jones, Rachel; Hall, Sara; Wise, Bob

    2014-01-01

    The field of learning analytics is being discussed in many circles as an emerging concept in education. In many districts and states, the core philosophy behind learning analytics is not entirely new; for more than a decade, discussions of data-driven decision making and the use of data to drive instruction have been common. Still, the U.S.…

  1. One-year test-retest reliability of intrinsic connectivity network fMRI in older adults

    PubMed Central

    Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.

    2014-01-01

    “Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential for use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remain limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491
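
    The intraclass coefficients used here as a reliability threshold (ICC > 0.4) can be computed from a subjects-by-sessions score table. Below is a minimal sketch of the two-way random-effects, absolute-agreement ICC(2,1), a common choice for test-retest designs; the abstract does not specify this exact variant, so treat it as an assumption:

```python
def icc_2_1(scores):
    """Two-way random-effects, absolute-agreement ICC(2,1) for a
    subjects x sessions table (test-retest reliability)."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(r[j] for r in scores) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # between sessions
    sse = sum((scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))       # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    Perfectly repeated measurements give ICC = 1; session-to-session noise and systematic session effects both pull the value down.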

  2. The cloud radiation impact from optics simulation and airborne observation

    NASA Astrophysics Data System (ADS)

    Melnikova, Irina; Kuznetsov, Anatoly; Gatebe, Charles

    2017-02-01

    The analytical approach of inverse asymptotic formulas of radiative transfer theory is used to solve inverse problems of cloud optics. The method has the advantage of imposing no strict a priori constraints on the desired solution. Observations were made in extended stratus cloudiness above a homogeneous ocean surface. Data from NASA's Cloud Absorption Radiometer (CAR), acquired during two airborne experiments (SAFARI-2000 and ARCTAS-2008), were analyzed. The analytical method of inverse asymptotic formulas was used to retrieve cloud optical parameters (optical thickness, single scattering albedo and asymmetry parameter of the phase function) and ground albedo in all 8 spectral channels independently. Because the method is free from a priori restrictions and is not tied to assumed parameter values, it has been applied to data sets of different origin and observation geometry. Results obtained from different airborne, satellite and ground radiative experiments proved consistent and showed common features in the values of cloud parameters and their spectral dependence (Vasiluev, Melnikova, 2004; Gatebe et al., 2014). The optical parameters retrieved here are used to calculate radiative divergence, reflected and transmitted irradiance, and heating rates in the cloudy atmosphere, which agree with previous observational data.

  3. Standardization in laboratory medicine: Adoption of common reference intervals to the Croatian population.

    PubMed

    Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea

    2016-03-26

    Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference values is considered an intrinsic part of the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals, which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients' care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization of analytical methods, the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing biological samples from populations with similar socio-demographic and ethnic characteristics. In this review, we outline the results of the harmonization processes in Croatia in the field of population-based reference intervals for clinically relevant blood and serum constituents, which are in accordance with ongoing activities for worldwide standardization and harmonization based on traceability in laboratory medicine.
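
    Population-based reference intervals of the kind discussed here are conventionally the central 95% of a healthy reference population, estimated nonparametrically from at least 120 individuals. A minimal rank-based percentile sketch (the interpolation rule shown is one common convention, not necessarily the one used in the Croatian studies):

```python
def reference_interval(values, lower_pct=0.025, upper_pct=0.975):
    """Nonparametric reference interval from a reference sample:
    rank-based percentiles with rank r = p * (n + 1), linearly
    interpolated between adjacent order statistics."""
    xs = sorted(values)
    n = len(xs)
    if n < 120:
        raise ValueError("IFCC recommends at least 120 reference individuals")
    def pct(p):
        r = p * (n + 1)
        lo = min(max(int(r), 1), n)   # clamp to valid ranks
        hi = min(lo + 1, n)
        frac = r - int(r)
        return xs[lo - 1] + frac * (xs[hi - 1] - xs[lo - 1])
    return pct(lower_pct), pct(upper_pct)
```

    With exactly 120 individuals, the 2.5th and 97.5th percentiles fall between the 3rd/4th and 117th/118th ordered observations, which is why 120 is the usual minimum sample size.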

  4. Analysis Of Dynamic Interactions Between Solar Array Simulators And Spacecraft Power Conditioning And Distribution Units

    NASA Astrophysics Data System (ADS)

    Valdivia, V.; Barrado, A.; Lazaro, A.; Rueda, P.; Tonicello, F.; Fernandez, A.; Mourra, O.

    2011-10-01

    Solar array simulators (SASs) are hardware devices commonly used instead of actual solar arrays (SAs) during the design of spacecraft power conditioning and distribution units (PCDUs) and during spacecraft assembly, integration, and testing. However, the dynamic responses of SASs and actual SAs usually differ. This fact plays an important role, since the dynamic response of the SAS may significantly influence the dynamic behaviour of the PCDU under certain conditions, even leading to instability. This paper deals with the dynamic interactions between SASs and PCDUs. Several methods for dynamic characterization of SASs are discussed, and the response of commercial SASs widely used in the space industry is compared to that of actual SAs. The interactions are then analyzed experimentally using a boost converter connected to the aforementioned SASs, demonstrating their critical importance. The interactions are first tackled analytically by means of small-signal models, and finally a black-box modelling method for SASs is proposed as a useful tool for analyzing the interactions through simulation. The capabilities of both the analytical method and the black-box model to predict the interactions are demonstrated.

  5. Estimation of the biserial correlation and its sampling variance for use in meta-analysis.

    PubMed

    Jacobs, Perke; Viechtbauer, Wolfgang

    2017-06-01

    Meta-analyses are often used to synthesize the findings of studies examining the correlational relationship between two continuous variables. When only dichotomous measurements are available for one of the two variables, the biserial correlation coefficient can be used to estimate the product-moment correlation between the two underlying continuous variables. Unlike the point-biserial correlation coefficient, biserial correlation coefficients can therefore be integrated with product-moment correlation coefficients in the same meta-analysis. The present article describes the estimation of the biserial correlation coefficient for meta-analytic purposes and reports simulation results comparing different methods for estimating the coefficient's sampling variance. The findings indicate that commonly employed methods yield inconsistent estimates of the sampling variance across a broad range of research situations. In contrast, consistent estimates can be obtained using two methods that appear to be unknown in the meta-analytic literature. A variance-stabilizing transformation for the biserial correlation coefficient is described that allows for the construction of confidence intervals for individual coefficients with close to nominal coverage probabilities in most of the examined conditions. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
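
    The conversion this abstract relies on, from a point-biserial correlation to the biserial correlation under the assumption that the dichotomous variable arises by thresholding a latent normal variable, uses the standard factor sqrt(p(1-p))/phi(z). A minimal sketch (the paper's variance-stabilizing transformation and sampling-variance estimators are not reproduced here):

```python
from statistics import NormalDist

def biserial_from_point_biserial(r_pb, p):
    """Convert a point-biserial correlation r_pb to the biserial
    correlation, assuming the dichotomy thresholds a latent normal
    variable; p is the proportion of cases in one group."""
    nd = NormalDist()
    z = nd.inv_cdf(p)   # threshold on the latent normal scale
    phi = nd.pdf(z)     # standard normal density at the threshold
    return r_pb * (p * (1.0 - p)) ** 0.5 / phi
```

    At p = 0.5 the factor is sqrt(pi/2) ≈ 1.25, and it grows as the split becomes more unbalanced, which is why the biserial coefficient always exceeds the point-biserial in magnitude.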

  6. Quantum networks in divergence-free circuit QED

    NASA Astrophysics Data System (ADS)

    Parra-Rodriguez, A.; Rico, E.; Solano, E.; Egusquiza, I. L.

    2018-04-01

    Superconducting circuits are one of the leading quantum platforms for quantum technologies. With growing system complexity, it is of crucial importance to develop scalable circuit models that contain the minimum information required to predict the behaviour of the physical system. Based on microwave engineering methods, divergent and non-divergent Hamiltonian models in circuit quantum electrodynamics have been proposed to explain the dynamics of superconducting quantum networks coupled to infinite-dimensional systems, such as transmission lines and general impedance environments. Here, we study systematically common linear coupling configurations between networks and infinite-dimensional systems. The main result is that the simple Lagrangian models for these configurations present an intrinsic natural length that provides a natural ultraviolet cutoff. This length is due to the unavoidable dressing of the environment modes by the network. In this manner, the coupling parameters between their components correctly manifest their natural decoupling at high frequencies. Furthermore, we show the requirements to correctly separate infinite-dimensional coupled systems in local bases. We also compare our analytical results with other analytical and approximate methods available in the literature. Finally, we propose several applications of these general methods to analogue quantum simulation of multi-spin-boson models in non-perturbative coupling regimes.

  7. A facile and low-cost micro fabrication material: flash foam.

    PubMed

    He, Yong; Xiao, Xiao; Wu, Yan; Fu, Jian-zhong

    2015-08-28

    Although many microfabrication methods have been reported, the preliminary replication templates used in most microfabrication still depend on expensive and time-consuming photolithography. This paper explores an alternative replication template based on a daily-use material, flash foam (FF), and proposes a facile microfabrication method, flash foam stamp lithography (FFSL). When FF is exposed through a mask with the desired pattern, the negative of the pattern is transferred to its surface and microstructures are formed due to shrinkage of the exposed area. As FF is commonly used in personal stamps, FFSL is very simple and cost-effective. In this paper, we demonstrate that FF is a good, low-cost template for many microfabrication methods, such as micro casting and soft lithography; designing and fabricating microstructures immediately in a personal office thus becomes possible with FFSL. Furthermore, we demonstrate that multi-scale microstructures can be easily fabricated by double exposure with FFSL. Skin texture is used as another case to demonstrate that FFSL can fabricate structures of different depths in a single exposure. As a result, FF shows a promising future in biology and analytical chemistry, such as rapid, low-cost fabrication of point-of-care diagnostics and microfluidic analytical devices.

  8. Application of Liquid Chromatography/Ion Trap Mass Spectrometry Technique to Determine Ergot Alkaloids in Grain Products

    PubMed Central

    Szymczyk, Krystyna; Jędrzejczak, Renata; Roszko, Marek

    2015-01-01

    A liquid chromatography/ion trap mass spectrometry-based method to determine six ergot alkaloids and their isomers is presented. The samples were cleaned on neutral alumina-based solid-phase extraction cartridges. The following method parameters were obtained (depending on the analyte and spiking level): method recovery from 63.0 to 104.6%, relative standard deviation below 18%, linear range from 1 to 325 µg/kg, linear correlation coefficient not less than 0.98. The developed analytical procedure was applied to determine the levels of ergot alkaloids in 65 samples of selected rye-based food products (flour – 34 samples, bran – 12 samples, rye – 18 samples, flakes – 1 sample). Measurable levels of alkaloids were found in the majority of the analysed samples, particularly in rye flour. Additionally, alkaloids were determined in ergot sclerotia isolated from rye grains. Total content was nearly 0.01% (97.9 mg/kg). However, the alkaloid profile was dominated by ergocristine at 45.6% (44.7 mg/kg), an alkaloid not commonly found in the tested food products. Ergocorninine at 0.2% (0.2 mg/kg) was the least abundant alkaloid. PMID:27904328

  9. Standardization in laboratory medicine: Adoption of common reference intervals to the Croatian population

    PubMed Central

    Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea

    2016-01-01

    Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference values is considered an intrinsic part of the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals, which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients’ care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization of analytical methods, the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing biological samples from populations with similar socio-demographic and ethnic characteristics. In this review, we outline the results of the harmonization processes in Croatia in the field of population-based reference intervals for clinically relevant blood and serum constituents, which are in accordance with ongoing activities for worldwide standardization and harmonization based on traceability in laboratory medicine. PMID:27019800

  10. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method: the performance of the PMV machine predicted analytically is quantitatively compared with the finite element results, and the two agree well. Finally, experimental results are given to further show the validity of the analysis.

  11. Experimental investigation and numerical simulation of 3He gas diffusion in simple geometries: implications for analytical models of 3He MR lung morphometry.

    PubMed

    Parra-Robles, J; Ajraoui, S; Deppe, M H; Parnell, S R; Wild, J M

    2010-06-01

    Models of lung acinar geometry have been proposed to analytically describe the diffusion of (3)He in the lung (as measured with pulsed gradient spin echo (PGSE) methods) as a possible means of characterizing lung microstructure from measurement of the (3)He ADC. In this work, major limitations in these analytical models are highlighted in simple diffusion weighted experiments with (3)He in cylindrical models of known geometry. The findings are substantiated with numerical simulations based on the same geometry using finite difference representation of the Bloch-Torrey equation. The validity of the existing "cylinder model" is discussed in terms of the physical diffusion regimes experienced and the basic reliance of the cylinder model and other ADC-based approaches on a Gaussian diffusion behaviour is highlighted. The results presented here demonstrate that physical assumptions of the cylinder model are not valid for large diffusion gradient strengths (above approximately 15 mT/m), which are commonly used for (3)He ADC measurements in human lungs. (c) 2010 Elsevier Inc. All rights reserved.
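
    The Gaussian-diffusion assumption the authors probe underlies the usual ADC estimate from a pair of PGSE signals, S = S0·exp(-b·ADC). A minimal sketch (illustrative, not the authors' processing pipeline); the abstract's point is precisely that this mono-exponential model breaks down at large gradient strengths:

```python
import math

def adc_from_pgse(s0, s, b):
    """Apparent diffusion coefficient under the Gaussian
    (mono-exponential) signal model S = S0 * exp(-b * ADC),
    i.e. ADC = ln(S0 / S) / b. Valid only where diffusion
    attenuation is actually Gaussian."""
    if not (0.0 < s <= s0) or b <= 0.0:
        raise ValueError("need 0 < S <= S0 and b > 0")
    return math.log(s0 / s) / b
```

    For 3He lung measurements, b is set by the gradient strength and timing; the abstract reports that above roughly 15 mT/m the Gaussian assumption, and hence this formula, no longer holds.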

  12. A simple microplate-based method for the determination of α-amylase activity using the glucose assay kit (GOD method).

    PubMed

    Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini

    2016-11-15

    For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of a large number of samples, reduces labor, time and reagents and is also suitable for kinetic studies. This method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260μL and an assay time of 40min including the pre-incubation steps. The new method is tested for linearity, sensitivity, precision, reproducibility and applicability. The new method is also compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
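
    Inhibitory activity in assays of this kind is typically reported as percent inhibition of the enzyme relative to an uninhibited control. The generic blank-corrected formula can be sketched as follows (parameter names are illustrative; the paper's exact correction scheme may differ):

```python
def percent_inhibition(a_control, a_sample,
                       a_control_blank=0.0, a_sample_blank=0.0):
    """Percent inhibition of alpha-amylase from GOD-assay absorbances:
    100 * (control - sample) / control, each blank-corrected."""
    ctrl = a_control - a_control_blank
    samp = a_sample - a_sample_blank
    if ctrl <= 0.0:
        raise ValueError("blank-corrected control absorbance must be positive")
    return (ctrl - samp) / ctrl * 100.0
```

    Plotting percent inhibition against inhibitor concentration then yields the IC50, the usual summary statistic for screening work of the kind described.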

  13. Nanomaterial-Based Electrochemical Immunosensors for Clinically Significant Biomarkers

    PubMed Central

    Ronkainen, Niina J.; Okon, Stanley L.

    2014-01-01

    Nanotechnology has played a crucial role in the development of biosensors over the past decade. The development, testing, optimization, and validation of new biosensors has become a highly interdisciplinary effort involving experts in chemistry, biology, physics, engineering, and medicine. The sensitivity, the specificity and the reproducibility of biosensors have improved tremendously as a result of incorporating nanomaterials in their design. In general, nanomaterials-based electrochemical immunosensors amplify the sensitivity by facilitating greater loading of the larger sensing surface with biorecognition molecules as well as improving the electrochemical properties of the transducer. The most common types of nanomaterials and their properties will be described. In addition, the utilization of nanomaterials in immunosensors for biomarker detection will be discussed since these biosensors have enormous potential for a myriad of clinical uses. Electrochemical immunosensors provide a specific and simple analytical alternative as evidenced by their brief analysis times, inexpensive instrumentation, lower assay cost as well as good portability and amenability to miniaturization. The role nanomaterials play in biosensors, their ability to improve detection capabilities in low concentration analytes yielding clinically useful data and their impact on other biosensor performance properties will be discussed. Finally, the most common types of electroanalytical detection methods will be briefly touched upon. PMID:28788700

  14. A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS.

    PubMed

    Lou, Xianwen; de Waal, Bas F M; Milroy, Lech-Gustav; van Dongen, Joost L J

    2015-05-01

    In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte suppression effect (ASE). In our method, analytes are deposited on top of the surface of matrix preloaded on the MALDI plate. To prevent embedding of analyte into the matrix crystals, the sample solutions were prepared without matrix, and care was taken not to re-dissolve the preloaded matrix. The results with model mixtures of peptides, synthetic polymers and lipids show that detection of analyte ions that were completely suppressed using the conventional dried-droplet method could be effectively recovered by using our method. Our findings suggest that the incorporation of analytes into the matrix crystals contributes substantially to ASE. By reducing ASE, our method should be useful for the direct MALDI MS analysis of multicomponent mixtures. Copyright © 2015 John Wiley & Sons, Ltd.

  15. Capillary electrophoresis-based immunoassays: principles and quantitative applications.

    PubMed

    Moser, Annette C; Hage, David S

    2008-08-01

    The use of CE as a tool to conduct immunoassays has been an area of increasing interest over the last decade. This approach combines the efficiency, small sample requirements, and relatively high speed of CE with the selectivity of antibodies as binding agents. This review examines the various assay formats and detection modes that have been reported for these assays, along with some representative applications. Most CE immunoassays in the past have employed homogeneous methods in which the sample and reagents are allowed to react in solution. These homogeneous methods have been conducted as both competitive binding immunoassays and as noncompetitive binding immunoassays. Fluorescent labels are most commonly used for detection in these assays, but enzyme labels have also been utilized for such work. Some additional work has been performed in CE immunoassays with heterogeneous methods in which either antibodies or an analog of the analyte is immobilized to a solid support. These heterogeneous methods can be used for the selective isolation of analytes prior to their separation by CE or to remove a given species from a sample/reagent mixture prior to analysis by CE. These CE immunoassays can be used with a variety of detection modes, such as fluorescence, UV/Vis absorbance, chemiluminescence, electrochemical measurements, MS, and surface plasmon resonance.

  16. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method based on Bayesian analysis for time series data from an agent-based simulation of the international emissions trading market, and compares it with a discrete Fourier transform (DFT) based method. The purpose is to demonstrate analytical methods that map time series data, such as market prices, into a lower-dimensional representation. These analytical methods revealed the following: (1) the classification methods represent time series data as distances in a mapped space, which makes understanding and inference easier than working with the raw time series; (2) the methods can analyze uncertain time series data, including both stationary and non-stationary processes, generated via agent-based simulation; and (3) the Bayesian method can resolve differences as small as 1% in the agents' emission reduction targets.
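
    The Fourier-based mapping can be illustrated generically: project each time series onto the magnitudes of its first few DFT coefficients and compare series by distance in that feature space. This is a stand-in sketch under that assumption, not the authors' implementation:

```python
import cmath
import math

def dft_features(series, n_coeffs=4):
    """Map a time series to the magnitudes of its first few DFT
    coefficients (a low-dimensional frequency-domain signature)."""
    n = len(series)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(series))) / n
            for k in range(n_coeffs)]

def feature_distance(a, b, n_coeffs=4):
    """Euclidean distance between two series in DFT-feature space."""
    fa, fb = dft_features(a, n_coeffs), dft_features(b, n_coeffs)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(fa, fb)))
```

    Clustering or nearest-neighbour classification over such distances is one common way to group simulated market-price trajectories; the paper's Bayesian alternative assigns series to classes by posterior probability instead.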

  17. An overview of methods for comparative effectiveness research.

    PubMed

    Meyer, Anne-Marie; Wheeler, Stephanie B; Weinberger, Morris; Chen, Ronald C; Carpenter, William R

    2014-01-01

    Comparative effectiveness research (CER) is a broad category of outcomes research encompassing many different methods employed by researchers and clinicians from numerous disciplines. The goal of cancer-focused CER is to generate new knowledge to assist cancer stakeholders in making informed decisions that will improve health care and outcomes of both individuals and populations. There are numerous CER methods that may be used to examine specific questions, including randomized controlled trials, observational studies, systematic literature reviews, and decision sciences modeling. Each has its strengths and weaknesses. To both inform and serve as a reference for readers of this issue of Seminars in Radiation Oncology as well as the broader oncology community, we describe CER and several of the more commonly used approaches and analytical methods. © 2013 Published by Elsevier Inc.

  18. Exact exchange-correlation potentials of singlet two-electron systems

    NASA Astrophysics Data System (ADS)

    Ryabinkin, Ilya G.; Ospadov, Egor; Staroverov, Viktor N.

    2017-10-01

    We suggest a non-iterative analytic method for constructing the exchange-correlation potential, v_XC(r), of any singlet ground-state two-electron system. The method is based on a convenient formula for v_XC(r) in terms of quantities determined only by the system's electronic wave function, exact or approximate, and is essentially different from the Kohn-Sham inversion technique. When applied to Gaussian-basis-set wave functions, the method yields finite-basis-set approximations to the corresponding basis-set-limit v_XC(r), whereas the Kohn-Sham inversion produces physically inappropriate (oscillatory and divergent) potentials. The effectiveness of the procedure is demonstrated by computing accurate exchange-correlation potentials of several two-electron systems (the helium isoelectronic series, H2, H3+) using common ab initio methods and Gaussian basis sets.

  19. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  20. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 23 2014-07-01 2014-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  1. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 24 2013-07-01 2013-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  2. 77 FR 41336 - Analytical Methods Used in Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-13

    ... Methods Used in Periodic Reporting AGENCY: Postal Regulatory Commission. ACTION: Notice of filing. SUMMARY... proceeding to consider changes in analytical methods used in periodic reporting. This notice addresses... informal rulemaking proceeding to consider changes in the analytical methods approved for use in periodic...

  3. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  4. 40 CFR 141.704 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods. 141.704 Section... Monitoring Requirements § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  5. Recently published analytical methods for determining alcohol in body materials : alcohol countermeasures literature review

    DOT National Transportation Integrated Search

    1974-10-01

    The author has brought the review of published analytical methods for determining alcohol in body materials up to date. The review deals with analytical methods for alcohol in blood and other body fluids and tissues; breath alcohol methods; factors ...

  6. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle of quality for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive, deductive and systematic, relying on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features.
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.

  7. Evaluation of an in-practice wet-chemistry analyzer using canine and feline serum samples.

    PubMed

    Irvine, Katherine L; Burt, Kay; Papasouliotis, Kostas

    2016-01-01

    A wet-chemistry biochemical analyzer was assessed for in-practice veterinary use. Its small size may make it a cost-effective option for low-throughput in-house biochemical analyses in first-opinion practice. The objectives of our study were to determine imprecision, total observed error, and acceptability of the analyzer for measurement of common canine and feline serum analytes, and to compare clinical sample results with those from a commercial reference analyzer. Imprecision was determined by within- and between-run repeatability for canine and feline pooled samples, and for manufacturer-supplied quality control material (QCM). Total observed error (TEobs) was determined for pooled samples and QCM. Performance was assessed for canine and feline pooled samples by sigma metric determination. Agreement and errors between the in-practice and reference analyzers were determined for canine and feline clinical samples by Bland-Altman and Deming regression analyses. Within- and between-run precision was high for most analytes, and TEobs(%) was mostly lower than total allowable error. Performance based on sigma metrics was good (σ > 4) for many analytes and marginal (σ > 3) for most of the remainder. Correlation between the analyzers was very high for most canine analytes and high for most feline analytes. Between-analyzer bias was generally attributed to high constant error. The in-practice analyzer showed good overall performance, with only calcium and phosphate analyses identified as significantly problematic. Agreement for most analytes was insufficient for transposition of reference intervals, and we recommend that in-practice-specific reference intervals be established in the laboratory. © 2015 The Author(s).
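    The quality statistics above follow standard laboratory-medicine conventions: the sigma metric relates allowable total error (TEa) to observed bias and imprecision (CV), and total observed error is commonly estimated by combining bias with a coverage multiple of the CV. A minimal sketch with hypothetical TEa, bias, and CV values (the z = 2 multiplier is one common convention, not a value taken from this study):

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric: (allowable total error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def total_observed_error(bias_pct: float, cv_pct: float, z: float = 2.0) -> float:
    """One common estimate of total error: |bias| + z * CV (percent)."""
    return abs(bias_pct) + z * cv_pct

# Hypothetical analyte: TEa = 12%, bias = 2%, CV = 2%
sigma = sigma_metric(12.0, 2.0, 2.0)   # 5.0 -> "good" by the sigma > 4 criterion
te = total_observed_error(2.0, 2.0)    # 6.0, below the 12% allowable error
```
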

  8. Medio-lateral Knee Fluency in Anterior Cruciate Ligament-Injured Athletes During Dynamic Movement Trials

    PubMed Central

    Panos, Joseph A.; Hoffman, Joshua T.; Wordeman, Samuel C.; Hewett, Timothy E.

    2016-01-01

    Background Correction of neuromuscular impairments after anterior cruciate ligament injury is vital to successful return to sport. Frontal plane knee control during landing is a common measure of lower-extremity neuromuscular control, and asymmetries in neuromuscular control of the knee can predispose injured athletes to additional injury and associated morbidities. Therefore, this study investigated the effects of anterior cruciate ligament injury on knee biomechanics during landing. Methods Two-dimensional frontal plane video of single leg drop, cross over drop, and drop vertical jump dynamic movement trials was analyzed for twenty injured and reconstructed athletes. The position of the knee joint center was tracked in ImageJ software for 500 milliseconds after landing to calculate medio-lateral knee motion velocities and determine normal fluency, the number of times per second knee velocity changed direction. The inverse of this calculation, analytical fluency, was used to associate larger numerical values with fluent movement. Findings Analytical fluency was decreased in involved limbs for single leg drop trials (P=0.0018). Importantly, analytical fluency for single leg drop differed compared to cross over drop trials for involved (P<0.001), but not uninvolved limbs (P=0.5029). For involved limbs, analytical fluency values exhibited a stepwise trend in relative magnitudes. Interpretation Decreased analytical fluency in involved limbs is consistent with previous studies. Fluency asymmetries observed during single leg drop tasks may be indicative of aberrant landing strategies in the involved limb. Analytical fluency differences in unilateral tasks for injured limbs may represent neuromuscular impairment as a result of injury. PMID:26895446
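    As defined above, normal fluency counts medio-lateral direction reversals per second over the 500 ms landing window, and analytical fluency is its inverse. A minimal sketch with a hypothetical velocity trace (not study data):

```python
def normal_fluency(velocities, duration_s):
    """Count sign changes (direction reversals) in a velocity series and
    normalize by the trial duration, giving reversals per second."""
    reversals = 0
    prev = None
    for v in velocities:
        if v == 0:
            continue  # skip zero-velocity frames when detecting reversals
        if prev is not None and (v > 0) != (prev > 0):
            reversals += 1
        prev = v
    return reversals / duration_s

def analytical_fluency(velocities, duration_s):
    """Inverse of normal fluency, so larger values mean more fluent motion."""
    nf = normal_fluency(velocities, duration_s)
    return float('inf') if nf == 0 else 1.0 / nf

# Hypothetical medio-lateral velocity trace over a 0.5 s window:
v = [0.2, 0.1, -0.3, -0.1, 0.4, 0.2, -0.2]  # 3 direction reversals
nf = normal_fluency(v, 0.5)       # 6.0 reversals per second
af = analytical_fluency(v, 0.5)   # ~0.167
```
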

  9. [The socio-hygienic monitoring as an integral system for health risk assessment and risk management at the regional level].

    PubMed

    Kuzmin, S V; Gurvich, V B; Dikonskaya, O V; Malykh, O L; Yarushin, S V; Romanov, S V; Kornilkov, A S

    2013-01-01

    The information and analytical framework for the introduction of health risk assessment and risk management methodologies in the Sverdlovsk Region is the system of socio-hygienic monitoring. Risk management techniques that take into account the choice of the most cost-effective and efficient actions for improving the sanitary and epidemiologic situation at the level of a region, municipality, or business entity of the Russian Federation have been developed and proposed. To assess the efficiency of planning and of activities for health risk management, common methodological approaches and the economic methods of cost-effectiveness and cost-benefit analysis, provided in methodological recommendations and introduced in the Russian Federation, are applied.

  10. From thermometric to spectrophotometric kinetic-catalytic methods of analysis. A review.

    PubMed

    Cerdà, Víctor; González, Alba; Danchana, Kaewta

    2017-05-15

    Kinetic-catalytic analytical methods have proved to be very simple and highly sensitive strategies for chemical analysis that rely on uncomplicated instrumentation [1,2]. Molecular absorption spectrophotometry is commonly used as the detection technique; however, other detection systems, such as electrochemical or thermometric ones, offer interesting possibilities since they are not affected by the color or turbidity of the samples. This review traces the field from early experience with thermometric kinetic-catalytic methods to our current work exploiting spectrophotometric flow techniques to automate this kind of reaction, including the use of integrated chips. Procedures for the determination of inorganic and organic species in organic and inorganic matrices are presented. Copyright © 2017 Elsevier B.V. All rights reserved.
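    Kinetic-catalytic determinations typically exploit the proportionality between the initial reaction rate and the catalyst (analyte) concentration, so calibration reduces to a linear fit of rate versus concentration. A minimal sketch with hypothetical rate data (not taken from the review):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: initial rate (absorbance/min) vs catalyst conc (ug/L)
conc = [0.0, 1.0, 2.0, 4.0, 8.0]
rate = [0.002, 0.011, 0.021, 0.040, 0.081]
slope, intercept = fit_line(conc, rate)

# Interpolate an unknown from its measured initial rate:
unknown_rate = 0.030
c_unknown = (unknown_rate - intercept) / slope  # ~2.9 ug/L for these numbers
```
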

  11. Development of a quantitative LC-MS/MS analytical method coupled with turbulent flow chromatography for digoxin for the in vitro P-gp inhibition assay.

    PubMed

    Smalley, James; Marino, Anthony M; Xin, Baomin; Olah, Timothy; Balimane, Praveen V

    2007-07-01

    Caco-2 cells, the human colon carcinoma cells, are typically used for screening compounds for their permeability characteristics and P-glycoprotein (P-gp) interaction potential during discovery and development. The P-gp inhibition of test compounds is assessed by performing bi-directional permeability studies with digoxin, a well-established P-gp substrate probe. Studies performed with digoxin alone, as well as with digoxin in the presence of test compounds as putative inhibitors, constitute the P-gp inhibition assay used to assess the potential liability of discovery compounds. Radiolabeled (3)H-digoxin is commonly used in such studies, followed by liquid scintillation counting. This manuscript describes the development of a sensitive, accurate, and reproducible LC-MS/MS method for the analysis of digoxin and its internal standard digitoxin using on-line extraction turbulent flow chromatography coupled to tandem mass spectrometric detection, which is amenable to high throughput with the use of 96-well plates. The standard curve for digoxin was linear between 10 nM and 5000 nM with a regression coefficient (R(2)) of 0.99. The applicability and reliability of the analysis method were evaluated by successful demonstration of an efflux ratio (permeability B to A over permeability A to B) greater than 10 for digoxin in Caco-2 cells. Additional evaluations were performed on 13 marketed compounds by conducting inhibition studies in Caco-2 cells using classical P-gp inhibitors (ketoconazole, cyclosporin, verapamil, quinidine, saquinavir, etc.) and comparing the results to historical data from (3)H-digoxin studies. Similarly, P-gp inhibition studies with the LC-MS/MS analytical method for digoxin were also performed for 21 additional test compounds classified as negative, moderate, and potent P-gp inhibitors spanning multiple chemotypes, and the results were compared with the historical P-gp inhibition data from the (3)H-digoxin studies. 
A very good correlation coefficient (R(2)) of 0.89 between the results from the two analytical methods affords an attractive LC-MS/MS analytical option for labs that need to conduct the P-gp inhibition assay without using radiolabeled compounds.
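    The assay's acceptance criterion rests on the efflux ratio defined in the abstract: apparent permeability in the basolateral-to-apical direction divided by apical-to-basolateral. A minimal sketch with hypothetical Papp values (the units and numbers below are illustrative, not study data):

```python
def efflux_ratio(papp_b_to_a: float, papp_a_to_b: float) -> float:
    """Efflux ratio = apparent permeability B->A divided by A->B."""
    return papp_b_to_a / papp_a_to_b

# Hypothetical Papp pair (e.g. in 1e-6 cm/s units); the abstract's acceptance
# criterion for digoxin was an efflux ratio greater than 10.
er = efflux_ratio(15.0, 1.2)   # 12.5 -> passes the > 10 criterion
```
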

  12. Headspace single drop microextraction versus dispersive liquid-liquid microextraction using magnetic ionic liquid extraction solvents.

    PubMed

    An, Jiwoo; Rahn, Kira L; Anderson, Jared L

    2017-05-15

    A headspace single drop microextraction (HS-SDME) method and a dispersive liquid-liquid microextraction (DLLME) method were developed using two tetrachloromanganate ([MnCl4](2-))-based magnetic ionic liquids (MILs) as extraction solvents for the determination of twelve aromatic compounds, including four polyaromatic hydrocarbons, by reversed phase high-performance liquid chromatography (HPLC). The analytical performance of the developed HS-SDME method was compared to the DLLME approach employing the same MILs. In the HS-SDME approach, the magnetic field generated by the magnet was exploited to suspend the MIL solvent from the tip of a rod magnet. The utilization of MILs in HS-SDME resulted in a highly stable microdroplet under elevated temperatures and long extraction times, overcoming a common challenge of droplet instability encountered in traditional SDME approaches. The low UV absorbance of the [MnCl4](2-)-based MILs permitted direct analysis of the analyte-enriched extraction solvent by HPLC. In HS-SDME, the effects of ionic strength of the sample solution, temperature of the extraction system, extraction time, stir rate, and headspace volume on extraction efficiencies were examined. Coefficients of determination (R(2)) ranged from 0.994 to 0.999 and limits of detection (LODs) varied from 0.04 to 1.0 μg L(-1), with relative recoveries from lake water ranging from 70.2% to 109.6%. For the DLLME method, parameters including disperser solvent type and volume, ionic strength of the sample solution, mass of extraction solvent, and extraction time were studied and optimized. Coefficients of determination for the DLLME method varied from 0.997 to 0.999 with LODs ranging from 0.05 to 1.0 μg L(-1). Relative recoveries from lake water samples ranged from 68.7% to 104.5%. 
Overall, the DLLME approach permitted faster extraction times and higher enrichment factors for analytes with low vapor pressure whereas the HS-SDME approach exhibited better extraction efficiencies for analytes with relatively higher vapor pressure. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. A solid phase extraction procedure for the determination of Cd(II) and Pb(II) ions in food and water samples by flame atomic absorption spectrometry.

    PubMed

    Daşbaşı, Teslima; Saçmacı, Şerife; Ülgen, Ahmet; Kartal, Şenol

    2015-05-01

    A relatively rapid, accurate and precise solid phase extraction method is presented for the determination of cadmium(II) and lead(II) in various food and water samples. Quantitation is carried out by flame atomic absorption spectrometry (FAAS). The method is based on the retention of the trace metal ions on Dowex Marathon C, a strong acid cation exchange resin. Some important parameters affecting the analytical performance of the method, such as pH, flow rate and volume of the sample solution; type, concentration, volume and flow rate of the eluent; and matrix effects on the retention of the metal ions, were investigated. Common coexisting ions did not interfere with the separation and determination of the analytes. The detection limits (3σb) for Cd(II) and Pb(II) were found to be 0.13 and 0.18 μg L(-1), respectively, while the limit of quantification values (10σb) were computed as 0.43 and 0.60 μg L(-1) for the same sequence of analytes. The precision (as relative standard deviation) was lower than 4% at the 5 μg L(-1) Cd(II) and 10 μg L(-1) Pb(II) levels, and the preconcentration factor was found to be 250. The accuracy of the proposed procedure was verified by analysing the certified reference materials SPS-WW2 Batch 108 wastewater level 2 and INCT-TL-1 tea leaves, with satisfactory results. In addition, recovery studies (⩾95%) were carried out to confirm the accuracy of the method. The method was applied to the determination of the analytes in various natural waters (lake water, tap water, waste water with boric acid, waste water with H2SO4) and food samples (pomegranate flower, organic pear, radish leaf, lamb meat, etc.), and good results were obtained. Cadmium was almost absent from the food samples, whereas lead was present at low levels of 0.13-1.12 μg g(-1). Copyright © 2014 Elsevier Ltd. All rights reserved.
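    The 3σb and 10σb criteria above are the usual blank-based definitions: LOD = 3σb/m and LOQ = 10σb/m, where σb is the standard deviation of replicate blank signals and m is the calibration slope. A minimal sketch with hypothetical blank readings and slope (not the paper's raw data):

```python
from statistics import stdev

def lod_loq(blank_signals, slope):
    """Blank-based detection and quantification limits:
    LOD = 3*sigma_b / m, LOQ = 10*sigma_b / m, with m the calibration slope."""
    sigma_b = stdev(blank_signals)
    return 3 * sigma_b / slope, 10 * sigma_b / slope

# Hypothetical blank absorbances and calibration slope (absorbance per ug/L):
blanks = [0.0021, 0.0019, 0.0023, 0.0018, 0.0022, 0.0020]
lod, loq = lod_loq(blanks, slope=0.0045)
# By construction LOQ/LOD = 10/3, matching the 0.13 -> 0.43 and
# 0.18 -> 0.60 ug/L pairs reported in the abstract.
```
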

  14. Measuring Prices in Health Care Markets Using Commercial Claims Data.

    PubMed

    Neprash, Hannah T; Wallace, Jacob; Chernew, Michael E; McWilliams, J Michael

    2015-12-01

    To compare methods of price measurement in health care markets. Truven Health Analytics MarketScan commercial claims. We constructed medical prices indices using three approaches: (1) a "sentinel" service approach based on a single common service in a specific clinical domain, (2) a market basket approach, and (3) a spending decomposition approach. We constructed indices at the Metropolitan Statistical Area level and estimated correlations between and within them. Price indices using a spending decomposition approach were strongly and positively correlated with indices constructed from broad market baskets of common services (r > 0.95). Prices of single common services exhibited weak to moderate correlations with each other and other measures. Market-level price measures that reflect broad sets of services are likely to rank markets similarly. Price indices relying on individual sentinel services may be more appropriate for examining specialty- or service-specific drivers of prices. © Health Research and Educational Trust.
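    A market-basket index of the kind described in approach (2) can be sketched as a weighted average of one market's service prices relative to national mean prices. The services, prices, and weights below are hypothetical, not MarketScan values:

```python
def basket_price_index(market_prices, national_prices, weights):
    """Fixed-basket price index for one market: weighted average of the
    market's service prices relative to national mean prices (1.0 = parity)."""
    total_w = sum(weights.values())
    return sum(
        weights[s] * market_prices[s] / national_prices[s] for s in weights
    ) / total_w

# Hypothetical 3-service basket for one Metropolitan Statistical Area:
market = {"office_visit": 120.0, "mri": 900.0, "colonoscopy": 1500.0}
national = {"office_visit": 100.0, "mri": 1000.0, "colonoscopy": 1500.0}
w = {"office_visit": 0.5, "mri": 0.3, "colonoscopy": 0.2}
idx = basket_price_index(market, national, w)  # ~1.07: ~7% above national prices
```

A "sentinel service" index, by contrast, would be the single ratio for one service, e.g. `market["office_visit"] / national["office_visit"]`, which explains why such indices correlate only weakly with basket-based measures.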

  15. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  16. 40 CFR 141.25 - Analytical methods for radioactivity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods for radioactivity... § 141.25 Analytical methods for radioactivity. (a) Analysis for the following contaminants shall be conducted to determine compliance with § 141.66 (radioactivity) in accordance with the methods in the...

  17. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  18. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  19. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  20. 7 CFR 93.13 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.13 Section 93.13 Agriculture... PROCESSED FRUITS AND VEGETABLES Peanuts, Tree Nuts, Corn and Other Oilseeds § 93.13 Analytical methods... manuals: (a) Approved Methods of the American Association of Cereal Chemists (AACC), American Association...

  1. Generation of gas-phase ions from charged clusters: an important ionization step causing suppression of matrix and analyte ions in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Lou, Xianwen; van Dongen, Joost L J; Milroy, Lech-Gustav; Meijer, E W

    2016-12-30

    Ionization in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a very complicated process. It has been reported that quaternary ammonium salts show extremely strong matrix and analyte suppression effects which cannot satisfactorily be explained by charge transfer reactions. Further investigation of the causes of these effects can be useful to improve our understanding of the MALDI process. The dried-droplet and modified thin-layer methods were used as sample preparation methods. In the dried-droplet method, analytes were co-crystallized with matrix, whereas in the modified thin-layer method analytes were deposited on the surface of matrix crystals. Model compounds, tetrabutylammonium iodide ([N(Bu)4]I), cesium iodide (CsI), trihexylamine (THA) and polyethylene glycol 600 (PEG 600), were selected as the test analytes given their ability to generate exclusively pre-formed ions, protonated ions and metal ion adducts, respectively, in MALDI. The strong matrix suppression effect (MSE) observed using the dried-droplet method might disappear using the modified thin-layer method, which suggests that the incorporation of analytes in matrix crystals contributes to the MSE. By depositing analytes on the matrix surface instead of incorporating them in the matrix crystals, the competition for evaporation/ionization from charged matrix/analyte clusters could be weakened, resulting in a reduced MSE. Further supporting evidence for this inference was found by studying the analyte suppression effect using the same two sample deposition methods. By comparing differences between the mass spectra obtained via the two sample preparation methods, we present evidence suggesting that the generation of gas-phase ions from charged matrix/analyte clusters may induce significant suppression of matrix and analyte ions. The results suggest that the generation of gas-phase ions from charged matrix/analyte clusters is an important ionization step in MALDI-MS. 
Copyright © 2016 John Wiley & Sons, Ltd.

  2. Determination of different recreational drugs in sweat by headspace solid-phase microextraction gas chromatography mass spectrometry (HS-SPME GC/MS): Application to drugged drivers.

    PubMed

    Gentili, Stefano; Mortali, Claudia; Mastrobattista, Luisa; Berretta, Paolo; Zaami, Simona

    2016-09-10

    A procedure based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography/mass spectrometry (GC/MS) has been developed for the determination of the most commonly used drugs of abuse in the sweat of drivers stopped during roadside controls. The DrugWipe 5A sweat screening device was used to collect sweat with a specific pad rubbed gently over the forehead skin surface. The procedure involved acid hydrolysis and HS-SPME extraction of the drugs of abuse, except Δ(9)-tetrahydrocannabinol, which was directly extracted under alkaline-medium HS-SPME conditions, followed by GC separation of the analytes on a capillary column and MS detection by electron impact ionisation. The method was linear from the limit of quantification (LOQ) to 50 ng drug per pad (r(2)≥0.99), with intra- and inter-assay precision and accuracy always better than 15% and an analytical recovery between 95.1% and 102.8%, depending on the analyte considered. Using the validated method, sweat from 60 apparently intoxicated drivers was found positive for one or more drugs of abuse, showing sweat-patch testing to be a viable, economical and simple alternative to conventional (blood and/or urine) and non-conventional (oral fluid) testing for drugs of abuse in drugged drivers. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Detection of nucleophosmin 1 mutations by quantitative real-time polymerase chain reaction versus capillary electrophoresis: a comparative study.

    PubMed

    Barakat, Fareed H; Luthra, Rajyalakshmi; Yin, C Cameron; Barkoh, Bedia A; Hai, Seema; Jamil, Waqar; Bhakta, Yaminiben I; Chen, Su; Medeiros, L Jeffrey; Zuo, Zhuang

    2011-08-01

    Nucleophosmin 1 (NPM1) is the most commonly mutated gene in acute myeloid leukemia. Detection of NPM1 mutations is useful for stratifying patients for therapy, predicting prognosis, and assessing for minimal residual disease. Several methods have been developed to rapidly detect NPM1 mutations in genomic DNA and/or messenger RNA specimens. To directly compare a quantitative real-time polymerase chain reaction (qPCR) assay with a widely used capillary electrophoresis assay for detecting NPM1 mutations, we adopted and modified a qPCR assay designed to detect the 6 most common NPM1 mutations and performed the assay in parallel with the capillary electrophoresis assay in 207 bone marrow aspirate or peripheral blood samples from patients with a range of hematolymphoid neoplasms. The qPCR assay demonstrated a higher analytical sensitivity than the capillary electrophoresis assay (1/1000 versus 1/40, respectively). The capillary electrophoresis assay generated 10 equivocal results that needed to be repeated, whereas the qPCR assay generated only 1 equivocal result. After test conditions were optimized, the qPCR and capillary electrophoresis methods produced 100% concordant results, 85 positive and 122 negative. Given its higher analytical sensitivity and specificity, the qPCR assay is less likely to generate equivocal results than the capillary electrophoresis assay. Moreover, the qPCR assay is quantitative, faster, cheaper, less prone to contamination, and well suited for monitoring minimal residual disease.

  4. Application and potential of capillary electroseparation methods to determine antioxidant phenolic compounds from plant food material.

    PubMed

    Hurtado-Fernández, Elena; Gómez-Romero, María; Carrasco-Pancorbo, Alegría; Fernández-Gutiérrez, Alberto

    2010-12-15

    Antioxidants are one of the most common active ingredients of nutritionally functional foods which can play an important role in the prevention of oxidation and cellular damage inhibiting or delaying the oxidative processes. In recent years there has been an increased interest in the application of antioxidants to medical treatment as information is constantly gathered linking the development of human diseases to oxidative stress. Within antioxidants, phenolic molecules are an important category of compounds, commonly present in a wide variety of plant food materials. Their correct determination is pivotal nowadays and involves their extraction from the sample, analytical separation, identification, quantification and interpretation of the data. The aim of this review is to provide an overview about all the necessary steps of any analytical procedure to achieve the determination of phenolic compounds from plant matrices, paying particular attention to the application and potential of capillary electroseparation methods. Since it is quite complicated to establish a classification of plant food material, and to structure the current review, we will group the different matrices as follows: fruits, vegetables, herbs, spices and medicinal plants, beverages, vegetable oils, cereals, legumes and nuts and other matrices (including cocoa beans and bee products). At the end of the overview, we include two sections to explain the usefulness of the data about phenols provided by capillary electrophoresis and the newest trends. Copyright 2010 Elsevier B.V. All rights reserved.

  5. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    PubMed

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using a graphics-processing-unit parallel framework named the "compute unified device architecture" (CUDA). A series of simulation experiments is carried out to test the accuracy and accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is reduced by a factor of 38.9 with a GTX 580 graphics card.

  6. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  7. Finite-analytic numerical solution of heat transfer in two-dimensional cavity flow

    NASA Technical Reports Server (NTRS)

    Chen, C.-J.; Naseri-Neshat, H.; Ho, K.-S.

    1981-01-01

    Heat transfer in cavity flow is numerically analyzed by a new numerical method called the finite-analytic method. The basic idea of the finite-analytic method is the incorporation of local analytic solutions in the numerical solutions of linear or nonlinear partial differential equations. In the present investigation, the local analytic solutions for temperature, stream function, and vorticity distributions are derived. When the local analytic solution is evaluated at a given nodal point, it gives an algebraic relationship between a nodal value in a subregion and its neighboring nodal points. A system of algebraic equations is solved to provide the numerical solution of the problem. The finite-analytic method is used to solve heat transfer in the cavity flow at high Reynolds number (1000) for Prandtl numbers of 0.1, 1, and 10.

  8. Simple, rapid, and environmentally friendly method for the separation of isoflavones using ultra-high performance supercritical fluid chromatography.

    PubMed

    Wu, Wenjie; Zhang, Yuan; Wu, Hanqiu; Zhou, Weie; Cheng, Yan; Li, Hongna; Zhang, Chuanbin; Li, Lulu; Huang, Ying; Zhang, Feng

    2017-07-01

    Isoflavones are natural substances that exhibit hormone-like pharmacological activities. The separation of isoflavones remains an analytical challenge because of their similar structures. We show that ultra-high performance supercritical fluid chromatography can be an appropriate tool to achieve the fast separation of 12 common dietary isoflavones. Among the five tested columns, the Torus DEA column was found to be the most effective for the separation of these isoflavones. The impact of individual parameters on the retention time and separation factor was evaluated, and these parameters were optimized to develop a simple, rapid, and green method for the separation of the 12 target analytes. The separation took only 12.91 min using gradient elution with methanol as the organic modifier and formic acid as an additive. These isoflavones were determined with limits of quantitation ranging from 0.10 to 0.50 μg/mL, which was sufficient for reliable determination in various matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
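    The separation factor the authors optimize is conventionally defined from retention factors: k = (tR - t0)/t0 for each peak and α = k2/k1 for an adjacent pair. A minimal sketch with hypothetical retention times (not values from the paper):

```python
def retention_factor(t_r: float, t_0: float) -> float:
    """Chromatographic retention factor k = (tR - t0) / t0."""
    return (t_r - t_0) / t_0

def separation_factor(t_r1: float, t_r2: float, t_0: float) -> float:
    """Selectivity alpha = k2 / k1 for two adjacent peaks, with t_r2 > t_r1,
    so alpha >= 1 and alpha = 1 means the peaks co-elute."""
    return retention_factor(t_r2, t_0) / retention_factor(t_r1, t_0)

# Hypothetical retention times (min) for two closely eluting isoflavones,
# with a hypothetical hold-up time t0 of 1.20 min:
alpha = separation_factor(5.80, 6.10, 1.20)  # ~1.065
```
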

  9. Differentiation of Organically and Conventionally Grown Tomatoes by Chemometric Analysis of Combined Data from Proton Nuclear Magnetic Resonance and Mid-infrared Spectroscopy and Stable Isotope Analysis.

    PubMed

    Hohmann, Monika; Monakhova, Yulia; Erich, Sarah; Christoph, Norbert; Wachter, Helmut; Holzgrabe, Ulrike

    2015-11-04

    Because the basic suitability of proton nuclear magnetic resonance spectroscopy ((1)H NMR) to differentiate organic from conventional tomatoes was recently proven, we assessed an approach to optimizing (1)H NMR classification models (comprising 205 authentic tomato samples overall) by including additional data from isotope ratio mass spectrometry (IRMS, δ(13)C, δ(15)N, and δ(18)O) and mid-infrared (MIR) spectroscopy. Both individual and combined analytical methods ((1)H NMR + MIR, (1)H NMR + IRMS, MIR + IRMS, and (1)H NMR + MIR + IRMS) were examined using principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), linear discriminant analysis (LDA), and common components and specific weight analysis (ComDim). With regard to classification abilities, fused data of (1)H NMR + MIR + IRMS yielded better validation results (ranging between 95.0 and 100.0%) than the individual methods ((1)H NMR, 91.3-100%; MIR, 75.6-91.7%), suggesting that the combined examination of analytical profiles enhances the authentication of organically produced tomatoes.
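    Combining blocks from different instruments is often done by low-level data fusion: autoscale each block so no instrument dominates by sheer numeric scale, then concatenate the blocks sample-wise before chemometric modeling. A minimal sketch with hypothetical mini-blocks; the paper's actual fusion and preprocessing choices may differ:

```python
from statistics import mean, stdev

def autoscale(block):
    """Column-wise autoscaling (zero mean, unit variance) of one data block."""
    cols = list(zip(*block))
    scaled_cols = []
    for col in cols:
        m, s = mean(col), stdev(col)
        scaled_cols.append([(x - m) / s for x in col])
    return [list(row) for row in zip(*scaled_cols)]

def low_level_fusion(*blocks):
    """Autoscale each block, then concatenate the blocks sample-wise."""
    scaled = [autoscale(b) for b in blocks]
    return [[x for b in scaled for x in b[i]] for i in range(len(blocks[0]))]

# Hypothetical mini-blocks for 4 samples: 3 NMR buckets, 2 MIR bands,
# and 3 isotope ratios (delta 13C, 15N, 18O):
nmr  = [[1.2, 0.4, 3.1], [1.0, 0.5, 2.9], [1.5, 0.3, 3.3], [0.9, 0.6, 2.8]]
mir  = [[0.11, 0.42], [0.09, 0.40], [0.13, 0.45], [0.08, 0.39]]
irms = [[-27.1, 4.2, 10.5], [-26.8, 3.9, 10.1], [-27.5, 4.5, 10.9], [-26.5, 3.7, 9.8]]
fused = low_level_fusion(nmr, mir, irms)  # 4 samples x 8 fused variables
```

The fused matrix can then be fed to PCA, PLS-DA, or LDA exactly as a single-instrument data matrix would be.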

  10. Performance analysis of EM-based blind detection for ON-OFF keying modulation over atmospheric optical channels

    NASA Astrophysics Data System (ADS)

    Dabiri, Mohammad Taghi; Sadough, Seyed Mohammad Sajad

    2018-04-01

    In free-space optical (FSO) links, atmospheric turbulence leads to scintillation in the received signal. Due to its ease of implementation, intensity modulation with direct detection (IM/DD) based on ON-OFF keying (OOK) is a popular signaling scheme in these systems. To detect OOK symbols over a turbulence channel in a blind way, i.e., without sending pilot symbols, an expectation-maximization (EM)-based detection method was recently proposed in the FSO literature. However, the performance of EM-based detection methods depends severely on the length of the observation interval (Ls). To choose the optimum value of Ls at target bit error rates (BERs) of FSO communications, which are commonly lower than 10^-9, Monte Carlo simulations would be very cumbersome and require a very long processing time. To facilitate performance evaluation, in this letter we derive analytic expressions for the BER and outage probability. Numerical results validate the accuracy of our derived analytic expressions. Our results may serve to evaluate the optimum value of Ls without resorting to time-consuming Monte Carlo simulations.
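    The blind-detection idea can be illustrated with a toy EM fit: received OOK intensities are modeled as a two-component Gaussian mixture, EM estimates the two level means from unlabeled samples, and the midpoint serves as the decision threshold. A simplified sketch with a fixed, shared variance and simulated data; the letter's actual channel model and estimator are not reproduced here:

```python
import math
import random

def em_ook_threshold(samples, iters=30):
    """Blind OOK threshold sketch: fit a two-component, equal- and
    fixed-variance Gaussian mixture with EM, return the midpoint of the
    two estimated level means as the decision threshold."""
    mu0, mu1 = min(samples), max(samples)               # crude initialization
    var = (max(samples) - min(samples)) ** 2 / 12 + 1e-12
    for _ in range(iters):
        # E-step: responsibility of the "on" level for each sample
        resp = []
        for x in samples:
            e0 = math.exp(-(x - mu0) ** 2 / (2 * var))
            e1 = math.exp(-(x - mu1) ** 2 / (2 * var))
            resp.append(e1 / (e0 + e1))
        # M-step: re-estimate the two level means from soft assignments
        s1 = sum(resp)
        s0 = len(samples) - s1
        mu1 = sum(r * x for r, x in zip(resp, samples)) / s1
        mu0 = sum((1 - r) * x for r, x in zip(resp, samples)) / s0
    return (mu0 + mu1) / 2.0

# Hypothetical noisy OOK burst: "off" level near 0, "on" level near 1:
random.seed(1)
bits = [random.randint(0, 1) for _ in range(200)]
rx = [b + random.gauss(0, 0.1) for b in bits]
thr = em_ook_threshold(rx)                    # close to 0.5 for this data
decided = [1 if x > thr else 0 for x in rx]   # blind decisions, no pilots
```
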

  11. Simultaneous Determination of Food-Related Biogenic Amines and Precursor Amino Acids Using in Situ Derivatization Ultrasound-Assisted Dispersive Liquid-Liquid Microextraction by Ultra-High-Performance Liquid Chromatography Tandem Mass Spectrometry.

    PubMed

    He, Yongrui; Zhao, Xian-En; Wang, Renjun; Wei, Na; Sun, Jing; Dang, Jun; Chen, Guang; Liu, Zhiqiang; Zhu, Shuyun; You, Jinmao

    2016-11-02

    A simple, rapid, sensitive, selective, and environmentally friendly method, based on in situ derivatization ultrasound-assisted dispersive liquid-liquid microextraction (in situ DUADLLME) coupled with ultra-high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) using multiple reaction monitoring (MRM) mode has been developed for the simultaneous determination of food-related biogenic amines and amino acids. A new mass-spectrometry-sensitive derivatization reagent 4'-carbonyl chloride rosamine (CCR) was designed, synthesized, and first reported. Parameters and conditions of in situ DUADLLME and UHPLC-MS/MS were optimized in detail. Under the optimized conditions, the in situ DUADLLME was completed speedily (within 1 min) with high derivatization efficiencies (≥98.5%). With the cleanup and concentration of microextraction step, good analytical performance was obtained for the analytes. The results showed that this method was accurate and practical for quantification of biogenic amines and amino acids in common food samples (red wine, beer, wine, cheese, sausage, and fish).

  12. The mean and variance of phylogenetic diversity under rarefaction

    PubMed Central

    Matsen, Frederick A.

    2013-01-01

    Summary Phylogenetic diversity (PD) depends on sampling depth, which complicates the comparison of PD between samples of different depth. One approach to dealing with differing sample depth for a given diversity statistic is to rarefy, which means to take a random subset of a given size of the original sample. Exact analytical formulae for the mean and variance of species richness under rarefaction have existed for some time, but no such solution exists for PD. We have derived exact formulae for the mean and variance of PD under rarefaction. We confirm that these formulae are correct by comparing the exact mean and variance to those calculated by repeated random (Monte Carlo) subsampling of a dataset of stem counts of woody shrubs of Toohey Forest, Queensland, Australia. We also demonstrate the application of the method using two examples: identifying hotspots of mammalian diversity in Australasian ecoregions, and characterising the human vaginal microbiome. There is a very high degree of correspondence between the analytical and random subsampling methods for calculating the mean and variance of PD under rarefaction, although the Monte Carlo method requires a large number of random draws to converge on the exact solution for the variance. Rarefaction of mammalian PD of ecoregions in Australasia to a common standard of 25 species reveals very different rank orderings of ecoregions, indicating quite different hotspots of diversity than those obtained for unrarefied PD. The application of these methods to the vaginal microbiome shows that a classical score used to quantify bacterial vaginosis is correlated with the shape of the rarefaction curve. The analytical formulae for the mean and variance of PD under rarefaction are both exact and more efficient than repeated subsampling. Rarefaction of PD allows for many applications where comparison of samples of different depth is required. PMID:23833701
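
    The paper's PD formulae are not reproduced here, but the exact-versus-Monte-Carlo comparison it describes can be sketched for the species-richness case the abstract mentions, using Hurlbert's classical expectation for richness under rarefaction. All counts below are hypothetical stand-ins:

```python
import math
import random

def expected_richness(counts, n):
    """Hurlbert's exact expected species richness in a rarefied sample of size n."""
    N = sum(counts)
    # Probability species i appears is 1 - C(N - Ni, n) / C(N, n)
    return sum(1 - math.comb(N - Ni, n) / math.comb(N, n) for Ni in counts)

def monte_carlo_richness(counts, n, draws=20_000, seed=0):
    """Mean richness over repeated random subsamples of size n."""
    rng = random.Random(seed)
    pool = [sp for sp, c in enumerate(counts) for _ in range(c)]
    return sum(len(set(rng.sample(pool, n))) for _ in range(draws)) / draws

counts = [50, 30, 10, 5, 3, 1, 1]   # hypothetical stem counts per species
exact = expected_richness(counts, 20)
approx = monte_carlo_richness(counts, 20)
```

    As the abstract notes for PD, the Monte Carlo estimate converges to the exact value but needs many draws, and even more for the variance than for the mean; the closed-form expression is both exact and cheaper.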

  13. The mean and variance of phylogenetic diversity under rarefaction.

    PubMed

    Nipperess, David A; Matsen, Frederick A

    2013-06-01

    Phylogenetic diversity (PD) depends on sampling depth, which complicates the comparison of PD between samples of different depth. One approach to dealing with differing sample depth for a given diversity statistic is to rarefy, which means to take a random subset of a given size of the original sample. Exact analytical formulae for the mean and variance of species richness under rarefaction have existed for some time, but no such solution exists for PD. We have derived exact formulae for the mean and variance of PD under rarefaction. We confirm that these formulae are correct by comparing the exact mean and variance to those calculated by repeated random (Monte Carlo) subsampling of a dataset of stem counts of woody shrubs of Toohey Forest, Queensland, Australia. We also demonstrate the application of the method using two examples: identifying hotspots of mammalian diversity in Australasian ecoregions, and characterising the human vaginal microbiome. There is a very high degree of correspondence between the analytical and random subsampling methods for calculating the mean and variance of PD under rarefaction, although the Monte Carlo method requires a large number of random draws to converge on the exact solution for the variance. Rarefaction of mammalian PD of ecoregions in Australasia to a common standard of 25 species reveals very different rank orderings of ecoregions, indicating quite different hotspots of diversity than those obtained for unrarefied PD. The application of these methods to the vaginal microbiome shows that a classical score used to quantify bacterial vaginosis is correlated with the shape of the rarefaction curve. The analytical formulae for the mean and variance of PD under rarefaction are both exact and more efficient than repeated subsampling. Rarefaction of PD allows for many applications where comparison of samples of different depth is required.

  14. Electrolyte system strategies for anionic ITP with ESI-MS detection. 3. The ITP spacer technique in moving-boundary systems and configurations with two self-maintained ITP subsystems.

    PubMed

    Gebauer, Petr; Malá, Zdena; Boček, Petr

    2014-03-01

    This contribution is the third part of the project on strategies used in the selection and tuning of electrolyte systems for anionic ITP with ESI-MS detection. The strategy presented here is based on the creation of self-maintained ITP subsystems in moving-boundary systems and describes two new principal approaches offering physical separation of analyte zones from their common ITP stack and/or simultaneous selective stacking of two different analyte groups. Both strategic directions are based on extending the number of components forming the electrolyte system by adding a third suitable anion. The first method is the application of the spacer technique to moving-boundary anionic ITP systems, the second method is a technique utilizing a moving-boundary ITP system in which two ITP subsystems exist and move with mutually different velocities. It is essential for ESI detection that both methods can be based on electrolyte systems containing only several simple chemicals, such as simple volatile organic acids (formic and acetic) and their ammonium salts. The properties of both techniques are defined theoretically and discussed from the viewpoint of their applicability to trace analysis by ITP-ESI-MS. Examples of system design for selected model separations of preservatives and pharmaceuticals illustrate the validity of the theoretical model and application potential of the proposed techniques by both computer simulations and experiments. Both new methods enhance the application range of ITP-MS and may be beneficial particularly for complex multicomponent samples or for analytes with identical molecular mass. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. HPLC-MS method for the quantification of nine anti-HIV drugs from dry plasma spot on glass filter and their long term stability in different conditions.

    PubMed

    D'Avolio, Antonio; Simiele, Marco; Siccardi, Marco; Baietto, Lorena; Sciandra, Mauro; Bonora, Stefano; Di Perri, Giovanni

    2010-09-05

    A bioanalytical method for the determination of the most commonly prescribed protease inhibitors (saquinavir, atazanavir, amprenavir, darunavir, lopinavir and ritonavir) and non-nucleoside reverse transcriptase inhibitors (etravirine, efavirenz and nevirapine) was developed by modifying our previous HPLC-MS chromatographic run; the method was validated, and a complete short- and long-term stability evaluation was carried out. One hundred microlitres of plasma were distributed on a glass-fibre collection filter (Glass-Microfibre from Sartorius); the filter then underwent thermal treatment, both for drying and for HIV inactivation, and was stored at room temperature, 4°C or -20°C. The analytes were extracted from the filter disc using tert-butylmethylether at basic pH, after the addition of the internal standard quinoxaline. The extract was dried and reconstituted, the chromatographic separation was performed on a reversed-phase C-18 column (150 mm x 2.0 mm), and the analytes were quantified using a single-quadrupole mass spectrometer. The method was validated considering the concentration ranges encountered in clinical trials and routine clinical practice. The assay was linear over the concentration ranges tested. Accuracies ranged from 92.1% to 111.9%, and the intra-day and inter-day relative standard deviations for all quality control levels ranged from 0.2 to 12.9% and 3.1 to 14.4%, respectively. Analytes in dried plasma spots were stable for longer when the drying/inactivation step was carried out before storage than in samples not dried/inactivated before analysis. The drying/inactivation step allows shipment of samples at room temperature without any risk; the developed and validated method therefore enables easy and cheap sample shipment for therapeutic drug monitoring and pharmacokinetic studies. 2010 Elsevier B.V. All rights reserved.

  16. A fit-for-purpose LC-MS/MS method for the simultaneous quantitation of ATP and 2,3-DPG in human K2EDTA whole blood.

    PubMed

    Kim, Hyeryun; Kosinski, Penelope; Kung, Charles; Dang, Lenny; Chen, Yue; Yang, Hua; Chen, Yuan-Shek; Kramer, Jordyn; Liu, Guowen

    2017-09-01

    Many hemolytic anemias result in major metabolic abnormalities; two common metabolite abnormalities are increased levels of 2,3-diphosphoglycerate (2,3-DPG) and decreased levels of adenosine triphosphate (ATP). To better monitor the concentration changes of these metabolites, the development of a reliable LC-MS/MS method to quantitatively profile the concentrations of 2,3-DPG and ATP in whole blood is essential for understanding the effects of investigational therapeutics. Accurate quantification of both compounds poses great challenges to bioanalytical scientists due to their polar, ionic and endogenous nature. Here we present an LC-MS/MS method for the reliable simultaneous quantification of 2,3-DPG and ATP from K2EDTA human whole blood (WB). Whole blood samples were spiked with stable-isotope-labeled internal standards, processed by protein precipitation extraction, and analyzed using zwitterionic ion chromatography-hydrophilic interaction chromatography (ZIC-HILIC) coupled with tandem mass spectrometry. The linear analytical range of the assay was 50-3000 μg/mL. The fit-for-purpose method demonstrated excellent accuracy and precision: the overall accuracy was within ±10.5% (%RE) for both analytes, and the intra- and inter-assay precision (%CV) was less than 6.7% and 6.2%, respectively, for both analytes. ATP and 2,3-DPG were found to be stable in human K2EDTA blood for at least 8 h at 4°C, for 96 days when stored at -70°C, and after three freeze/thaw cycles. The assay has been successfully applied to K2EDTA human whole blood samples to support clinical studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Pyrrolizidine alkaloids in honey: comparison of analytical methods.

    PubMed

    Kempf, M; Wittig, M; Reinhard, A; von der Ohe, K; Blacquière, T; Raezke, K-P; Michel, R; Schreier, P; Beuerle, T

    2011-03-01

    Pyrrolizidine alkaloids (PAs) are a structurally diverse group of toxicologically relevant secondary plant metabolites. Currently, two analytical methods are used to determine the PA content in honey; to achieve reasonably high sensitivity and selectivity, mass spectrometric detection is required. One method is an HPLC-ESI-MS-MS approach, the other a sum-parameter method utilising HRGC-EI-MS operated in selected ion monitoring (SIM) mode. To date, no fully validated or standardised method exists to measure the PA content in honey. To establish an LC-MS method, several hundred standard pollen analysis results of raw honey were analysed; possible PA plants were identified, together with typical, commercially available marker PA-N-oxides (PANOs). Three distinct honey sets were analysed with both methods. Set A consisted of pure Echium honey (61-80% Echium pollen). Echium is an attractive bee plant; it is quite common in all temperate zones worldwide and is one of the major sources of PA contamination in honey. Although only echimidine/echimidine-N-oxide were available as references for the LC-MS target approach, the results of the two analytical techniques matched very well (n = 8; PA content ranging from 311 to 520 µg kg⁻¹). The second set (B) consisted of randomly picked raw honeys, mostly originating from Eupatorium spp. (0-15%), another common PA plant, usually characterised by the occurrence of lycopsamine-type PAs. Again, the results showed good consistency in terms of PA-positive samples and quantification results (n = 8; ranging from 0 to 625 µg kg⁻¹ retronecine equivalents). The last set (C) was obtained by deliberately placing beehives in areas with a high abundance of Jacobaea vulgaris (ragwort) in the Veluwe region (the Netherlands). J. vulgaris increasingly invades the countryside in Central Europe, especially areas with reduced farming or sites undergoing natural restoration. Honey from two seasons (2007 and 2008) was sampled. While only trace amounts of ragwort pollen were detected (0-6.3%), in some cases extremely high PA values were found (n = 31; ranging from 0 to 13019 µg kg⁻¹; average = 1261 or 76 µg kg⁻¹ for GC-MS and LC-MS, respectively). Here the two methods gave significantly different quantification results, with the GC-MS sum parameter showing higher values (differing on average by a factor of 17). The main reason for the discrepancy is most likely the incomplete coverage of the J. vulgaris PA pattern: major J. vulgaris PAs, such as the jacobine-type PAs or erucifoline/acetylerucifoline, were not available as reference compounds for the LC-MS target approach. Based on this direct comparison, both methods are considered from various perspectives, and the respective strengths and weaknesses of each method are presented in detail.

  18. Selected Analytical Methods for Environmental Remediation and Recovery (SAM) - Home

    EPA Pesticide Factsheets

    The SAM Home page provides access to all information provided in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM), and includes a query function allowing users to search methods by analyte, sample type and instrumentation.

  19. Analytical Studies on the Synchronization of a Network of Linearly-Coupled Simple Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Sivaganesh, G.; Arulgnanam, A.; Seethalakshmi, A. N.; Selvaraj, S.

    2018-05-01

    We present explicit generalized analytical solutions for a network of linearly-coupled simple chaotic systems. Analytical solutions are obtained for the normalized state equations of a network of linearly-coupled systems driven by a common chaotic drive system. Two-parameter bifurcation diagrams revealing the various hidden synchronization regions, such as complete, phase and phase-lag synchronization, are obtained using the analytical results. The synchronization dynamics and their stability are studied using phase portraits and the master stability function, respectively. Further, experimental results for linearly-coupled simple chaotic systems are presented to confirm the analytical results. The synchronization dynamics of a network of chaotic systems studied analytically is reported for the first time.

  20. 40 CFR 141.89 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Analytical methods. 141.89 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Control of Lead and Copper § 141.89 Analytical methods. (a... shall be conducted with the methods in § 141.23(k)(1). (1) Analyses for alkalinity, calcium...

  1. 40 CFR 141.89 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 24 2013-07-01 2013-07-01 false Analytical methods. 141.89 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Control of Lead and Copper § 141.89 Analytical methods. (a... shall be conducted with the methods in § 141.23(k)(1). (1) Analyses for alkalinity, calcium...

  2. 40 CFR 141.89 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 23 2014-07-01 2014-07-01 false Analytical methods. 141.89 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Control of Lead and Copper § 141.89 Analytical methods. (a... shall be conducted with the methods in § 141.23(k)(1). (1) Analyses for alkalinity, calcium...

  3. 40 CFR 141.89 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Analytical methods. 141.89 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Control of Lead and Copper § 141.89 Analytical methods. (a... shall be conducted with the methods in § 141.23(k)(1). (1) Analyses for alkalinity, calcium...

  4. 75 FR 49930 - Stakeholder Meeting Regarding Re-Evaluation of Currently Approved Total Coliform Analytical Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... Currently Approved Total Coliform Analytical Methods AGENCY: Environmental Protection Agency (EPA). ACTION... of currently approved Total Coliform Rule (TCR) analytical methods. At these meetings, stakeholders will be given an opportunity to discuss potential elements of a method re-evaluation study, such as...

  5. 40 CFR 141.89 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods. 141.89 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Control of Lead and Copper § 141.89 Analytical methods. (a... shall be conducted with the methods in § 141.23(k)(1). (1) Analyses for alkalinity, calcium...

  6. Major to ultra trace element bulk rock analysis of nanoparticulate pressed powder pellets by LA-ICP-MS

    NASA Astrophysics Data System (ADS)

    Peters, Daniel; Pettke, Thomas

    2016-04-01

    An efficient, clean procedure for bulk rock major to trace element analysis by 193 nm excimer LA-ICP-MS analysis of nanoparticulate pressed powder pellets (PPPs) employing a binder is presented. Sample powders are milled in water suspension in a planetary ball mill, reducing the average grain size by about one order of magnitude compared to common dry milling protocols. Microcrystalline cellulose (MCC) is employed as a binder, improving the mechanical strength of the PPPs and the ablation behaviour, because MCC absorbs 193 nm laser light well. Use of the MCC binder allows cohesive pellets to be produced from materials that cannot be pelletized in their pure form, such as quartz powder. Rigorous blank quantification was performed on synthetic quartz treated like the rock samples, demonstrating that procedural blanks are irrelevant except for a few elements at the 10 ng g⁻¹ concentration level. The LA-ICP-MS PPP analytical procedure was optimised and evaluated using six different SRM powders (JP-1, UB-N, BCR-2, GSP-2, OKUM, and MUH-1). Calibration based on external standardization using SRM 610, SRM 612, BCR-2G, and GSD-1G glasses allows for evaluation of possible matrix effects during LA-ICP-MS analysis. The data accuracy of the PPP LA-ICP-MS analytical procedure compares well with that achieved for liquid ICP-MS and LA-ICP-MS glass analysis, except for element concentrations below ~30 ng g⁻¹, where liquid ICP-MS offers more precise data and in part lower limits of detection. Uncertainties on the external reproducibility of LA-ICP-MS PPP element concentrations are of the order of 0.5 to 2% (1σ standard deviation) for concentrations exceeding ~1 μg g⁻¹. For lower element concentrations these uncertainties increase to 5-10% or higher as the analyte-dependent limits of detection (LODs) are approached; the LODs do not differ significantly from those of glass analysis. Sample homogeneity is demonstrated by the high analytical precision, except for very few elements where grain-size effects can occasionally still be resolved analytically. Matrix effects are demonstrated for PPP analysis of diverse rock compositions and for basalt glass analysis when externally calibrated with SRM 610 and SRM 612 glasses; employing basalt glass GSD-1G or BCR-2G for external standardisation essentially eliminates these problems. Perhaps the most prominent advance of the LA-ICP-MS PPP analytical procedure presented here is that trace elements not commonly analysed, i.e. new, unconventional geochemical tracers, can be measured straightforwardly, including volatile elements, the flux elements Li and B, the chalcophile elements As, Sb, Tl and Bi, and elements that alloy with the metal containers employed in conventional glass production. The method thus overcomes many common problems and limitations in analytical geochemistry and is shown to be an efficient alternative for bulk rock trace element analysis.

  7. Assessment methods in human body composition.

    PubMed

    Lee, Seon Yeong; Gallagher, Dympna

    2008-09-01

    The present study reviews the most recently developed and commonly used methods for the determination of human body composition in vivo with relevance for nutritional assessment. Body composition measurement methods are continuously being perfected with the most commonly used methods being bioelectrical impedance analysis, dilution techniques, air displacement plethysmography, dual energy X-ray absorptiometry, and MRI or magnetic resonance spectroscopy. Recent developments include three-dimensional photonic scanning and quantitative magnetic resonance. Collectively, these techniques allow for the measurement of fat, fat-free mass, bone mineral content, total body water, extracellular water, total adipose tissue and its subdepots (visceral, subcutaneous, and intermuscular), skeletal muscle, select organs, and ectopic fat depots. There is an ongoing need to perfect methods that provide information beyond mass and structure (static measures) to kinetic measures that yield information on metabolic and biological functions. On the basis of the wide range of measurable properties, analytical methods and known body composition models, clinicians and scientists can quantify a number of body components and with longitudinal assessment, can track changes in health and disease with implications for understanding efficacy of nutritional and clinical interventions, diagnosis, prevention, and treatment in clinical settings. With the greater need to understand precursors of health risk beginning in childhood, a gap exists in appropriate in-vivo measurement methods beginning at birth.

  8. Assessment methods in human body composition

    PubMed Central

    Lee, Seon Yeong; Gallagher, Dympna

    2009-01-01

    Purpose of review The present study reviews the most recently developed and commonly used methods for the determination of human body composition in vivo with relevance for nutritional assessment. Recent findings Body composition measurement methods are continuously being perfected with the most commonly used methods being bioelectrical impedance analysis, dilution techniques, air displacement plethysmography, dual energy X-ray absorptiometry, and MRI or magnetic resonance spectroscopy. Recent developments include three-dimensional photonic scanning and quantitative magnetic resonance. Collectively, these techniques allow for the measurement of fat, fat-free mass, bone mineral content, total body water, extracellular water, total adipose tissue and its subdepots (visceral, subcutaneous, and intermuscular), skeletal muscle, select organs, and ectopic fat depots. Summary There is an ongoing need to perfect methods that provide information beyond mass and structure (static measures) to kinetic measures that yield information on metabolic and biological functions. On the basis of the wide range of measurable properties, analytical methods and known body composition models, clinicians and scientists can quantify a number of body components and with longitudinal assessment, can track changes in health and disease with implications for understanding efficacy of nutritional and clinical interventions, diagnosis, prevention, and treatment in clinical settings. With the greater need to understand precursors of health risk beginning in childhood, a gap exists in appropriate in-vivo measurement methods beginning at birth. PMID:18685451

  9. Highly Sensitive GMO Detection Using Real-Time PCR with a Large Amount of DNA Template: Single-Laboratory Validation.

    PubMed

    Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi

    2018-03-01

    Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.

  10. Methods for investigating biosurfactants and bioemulsifiers: a review.

    PubMed

    Satpute, Surekha K; Banpurkar, Arun G; Dhakephalkar, Prashant K; Banat, Ibrahim M; Chopade, Balu A

    2010-06-01

    Microorganisms produce biosurfactants (BSs) and bioemulsifiers (BEs) with wide structural and functional diversity, which has consequently led to the adoption of different techniques to investigate these diverse amphiphilic molecules. This review aims to compile information on different microbial screening methods, extraction procedures for surface-active products, and analytical terminologies used in this field. Different methods for screening microbial culture broth or cell biomass for the production of surface-active compounds are presented, and their possible advantages and disadvantages highlighted. In addition, the most common methods for purification, detection, and structure determination of a wide range of BSs and BEs are introduced. Simple techniques such as precipitation using acetone or ammonium sulphate, solvent extraction, ultrafiltration, ion exchange, dialysis, lyophilization, isoelectric focusing (IEF), and thin layer chromatography (TLC) are described. Other more elaborate techniques, including high-pressure liquid chromatography (HPLC), infrared (IR) spectroscopy, gas chromatography-mass spectrometry (GC-MS), nuclear magnetic resonance (NMR), fast atom bombardment mass spectrometry (FAB-MS), protein digestion and amino acid sequencing, are also elucidated. Various experimental strategies, including static light scattering and hydrodynamic characterization of micelles, are discussed. A combination of various analytical methods is often essential in this area of research, and a number of trials and errors are required to isolate, purify and characterize the various surface-active agents. This review introduces the various methodologies that are indispensable for studying biosurfactants and bioemulsifiers.

  11. Determination of Parabens by Injection-Port Derivatization Coupled With Gas-Chromatography-Mass Spectrometry and Matrix Solid Phase Dispersion

    NASA Astrophysics Data System (ADS)

    Djatmika, Rosalina; Ding, Wang-Hsien; Sulistyarti, Hermin

    2018-01-01

    A rapid determination of four paraben preservatives (methyl paraben, ethyl paraben, propyl paraben, and butyl paraben) in marketed seafood is presented. Analytes were extracted and purified using the matrix solid-phase dispersion (MSPD) method, followed by injection-port acylation gas chromatography-mass spectrometry (GC-MS) with acetic anhydride as the derivatization reagent. In this method, acylation of the parabens is performed with acetic anhydride at the GC injection port, reducing the time-consuming sample-processing steps and the amounts of toxic reagents and solvents. The parameters affecting this method, such as injection-port temperature, purge-off time and acylation (acetic anhydride) volume, were studied. In addition, the factors influencing MSPD (including the amounts of dispersant and clean-up co-sorbent, as well as the volume of elution solvent) were also investigated. With MSPD and injection-port acylation applied, good linearity was achieved for the analytes. The limits of quantitation (LOQs) were 0.2 to 1.0 ng/g (dry weight). Compared with the offline derivatization commonly performed, injection-port acylation provides a rapid, simple, low-cost and environmentally friendly derivatization process. The optimized method has been successfully applied to the analysis of parabens in four kinds of marketed seafood. Preliminary results showed that the total concentrations of the four selected parabens ranged from 16.7 to 44.7 ng/g (dry weight).

  12. Sensitive screening of abused drugs in dried blood samples using ultra-high-performance liquid chromatography-ion booster-quadrupole time-of-flight mass spectrometry.

    PubMed

    Chepyala, Divyabharathi; Tsai, I-Lin; Liao, Hsiao-Wei; Chen, Guan-Yuan; Chao, Hsi-Chun; Kuo, Ching-Hua

    2017-03-31

    An increased rate of drug abuse is a major social problem worldwide. The dried blood spot (DBS) sampling technique offers many advantages over urine or whole blood sampling. This study developed a simple and efficient ultra-high-performance liquid chromatography-ion booster-quadrupole time-of-flight mass spectrometry (UHPLC-IB-QTOF-MS) method for the analysis of abused drugs and their metabolites using DBS. Fifty-seven compounds covering the most commonly abused drugs, including amphetamines, opioids, cocaine, benzodiazepines, barbiturates, and many other new and emerging abused drugs, were selected as the target analytes of this study. An 80% acetonitrile solvent with a 5-min extraction by Geno grinder was used for sample extraction. A Poroshell column provided efficient separation, and under optimal conditions the analytical times were 15 and 5 min in positive and negative ionization modes, respectively. Ionization parameters of both the electrospray ionization source and the ion booster (IB) source, which contains an extra heated zone, were optimized to achieve the best ionization efficiency for the investigated abused drugs. In spite of their structural diversity, most of the abused drugs showed an enhanced mass response with the high-temperature ionization from the extra heated zone of the IB source. Compared with electrospray ionization, the IB greatly improved the detection sensitivity for 86% of the analytes, by 1.5- to 14-fold, and allowed the developed method to detect trace amounts of compounds on the DBS cards. The validation results showed that the coefficients of variation of intra-day and inter-day precision in terms of signal intensity were lower than 19.65%. The extraction recovery of all analytes was between 67.21 and 115.14%. The limits of detection of all analytes were between 0.2 and 35.7 ng mL⁻¹.
The stability study indicated that 7% of compounds showed poor stability (below 50%) on the DBS cards after 6 months of storage at room temperature and -80°C. The reported method provides a new direction for abused drug screening using DBS. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Rapid screening of mixed edible oils and gutter oils by matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Ng, Tsz-Tsun; So, Pui-Kin; Zheng, Bo; Yao, Zhong-Ping

    2015-07-16

    Authentication of edible oils is a long-standing issue in food safety, and has become particularly important with the emergence and wide spread of gutter oils in recent years. Owing to the very high analytical demand and the diversity of gutter oils, a high-throughput analytical method and a versatile strategy for the authentication of mixed edible oils and gutter oils are highly desirable. In this study, an improved matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) method has been developed for the direct analysis of edible oils. The method involves on-target sample loading, automatic data acquisition and simple data processing. MALDI-MS spectra of high quality and high reproducibility have been obtained using this method, and a preliminary spectral database of edible oils has been set up. The authenticity of an edible oil sample can be determined by comparing its MALDI-MS spectrum and principal component analysis (PCA) results with those of its labeled oil in the database. The method is simple, and the whole process takes only several minutes per oil sample. We demonstrated that the method is sensitive to changes in oil composition and can be used for measuring the compositions of mixed oils. Its capability to detect mislabeling makes it suitable for rapid screening of gutter oils, since fraudulent mislabeling is a common feature of gutter oils. Copyright © 2015 Elsevier B.V. All rights reserved.
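
    The comparison step described above, projecting a query spectrum into the PCA space of reference spectra, can be sketched as follows. All spectra here are synthetic stand-ins; the paper's actual database, binning and preprocessing are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical reference set: 20 spectra of one labeled oil type, 50 m/z bins each
reference = rng.normal(0.0, 1.0, (20, 50)) + np.linspace(0.0, 5.0, 50)

# Centre the data and extract the first two principal components via SVD
mean = reference.mean(axis=0)
U, S, Vt = np.linalg.svd(reference - mean, full_matrices=False)
components = Vt[:2]

def pca_score(spectrum):
    """Coordinates of a spectrum in the reference PCA space."""
    return components @ (spectrum - mean)

ref_scores = np.array([pca_score(s) for s in reference])  # scores centre on the origin
query = reference[0] + rng.normal(0.0, 0.1, 50)           # a query resembling the reference
dist = np.linalg.norm(pca_score(query) - ref_scores.mean(axis=0))
```

    A query whose score falls far from the cloud of reference scores would be flagged as potentially mislabeled, which is the screening criterion the abstract describes.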

  14. Regularization of Instantaneous Frequency Attribute Computations

    NASA Astrophysics Data System (ADS)

    Yedlin, M. J.; Margrave, G. F.; Van Vorst, D. G.; Ben Horin, Y.

    2014-12-01

    We compare two different methods of computation of a temporally local frequency: 1) a stabilized instantaneous frequency using the theory of the analytic signal, and 2) a temporally variant centroid (or dominant) frequency estimated from a time-frequency decomposition. The first method derives from Taner et al. (1979) as modified by Fomel (2007) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method computes the power centroid (Cohen, 1995) of the time-frequency spectrum, obtained using either the Gabor or Stockwell transform. Common to both methods is the necessity of division by a diagonal matrix, which requires appropriate regularization. We modify Fomel's (2007) method by explicitly penalizing the roughness of the estimate. Following Farquharson and Oldenburg (2004), we employ both the L-curve and GCV methods to obtain the smoothest model that fits the data in the L2 norm. Using synthetic data, quarry blasts, earthquakes and the DPRK tests, our results suggest that the optimal method depends on the data. One of the main applications for this work is the discrimination between blast events and earthquakes. References: Fomel, Sergey. "Local seismic attributes." Geophysics 72.3 (2007): A29-A33. Cohen, Leon. Time-Frequency Analysis: Theory and Applications. USA: Prentice Hall, 1995. Farquharson, Colin G., and Douglas W. Oldenburg. "A comparison of automatic techniques for estimating the regularization parameter in non-linear inverse problems." Geophysical Journal International 156.3 (2004): 411-425. Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics 44.6 (1979): 1041-1063.
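
The first method, instantaneous frequency from the derivative of the analytic-signal phase, can be sketched as below. The FFT construction is equivalent to scipy.signal.hilbert; the test signal and sampling rate are arbitrary choices for illustration, and this bare phase-derivative estimate omits the stabilization and regularization that are the subject of the abstract.

```python
import numpy as np

fs = 500.0                       # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
f0 = 30.0                        # true frequency of the test signal
x = np.cos(2 * np.pi * f0 * t)

# Analytic signal via the FFT (equivalent to scipy.signal.hilbert):
# zero the negative frequencies, double the positive ones.
n = len(x)
X = np.fft.fft(x)
h = np.zeros(n)
h[0] = 1.0
h[1:n // 2] = 2.0
h[n // 2] = 1.0                  # Nyquist bin (n is even here)
analytic = np.fft.ifft(X * h)

# Instantaneous frequency = d(phase)/dt / (2*pi), from the unwrapped phase.
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)

# Away from the edges the estimate should sit at the true frequency.
print(round(float(np.median(inst_freq[50:-50])), 1))  # → 30.0
```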

  15. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current methods. Analytical methods are often used to ensure safety, efficacy, and quality of medicinal products. According to government regulations and regulatory guidelines, these methods need to be validated through well-designed studies to minimize the risk of accepting unsuitable methods. This article describes a novel statistical test for analytical method validation, which provides better protection against the risk of accepting unsuitable analytical methods. © PDA, Inc. 2015.
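
The total-error idea above, judging a method by the probability that a future single result falls within acceptance limits, can be illustrated with a plug-in Monte Carlo sketch. This is not the generalized pivotal quantity construction itself, and the true value, limits, bias, and precision are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical validation data: replicate measurements of a sample
# whose true value is 100, with acceptance limits of +/- 15%.
true_value = 100.0
lower, upper = 85.0, 115.0
measurements = rng.normal(loc=102.0, scale=4.0, size=30)  # bias 2, sd 4

# Total-error view: estimate the probability that a *future* single
# result lies inside the acceptance limits, given the fitted bias and
# precision (a plug-in approximation, not a GPQ or tolerance interval).
bias_hat = measurements.mean() - true_value
sd_hat = measurements.std(ddof=1)

sims = rng.normal(loc=true_value + bias_hat, scale=sd_hat, size=100_000)
p_within = np.mean((sims >= lower) & (sims <= upper))
print(f"estimated P(result within limits) = {p_within:.3f}")
```

A formal test would additionally account for the uncertainty in the estimated bias and standard deviation, which is exactly what the generalized pivotal quantity and β-content tolerance interval approaches do.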

  16. Historical demography of common carp estimated from individuals collected from various parts of the world using the pairwise sequentially Markovian coalescent approach.

    PubMed

    Yuan, Zihao; Huang, Wei; Liu, Shikai; Xu, Peng; Dunham, Rex; Liu, Zhanjiang

    2018-04-01

    The inference of the historical demography of a species is helpful for understanding species differentiation and population dynamics. However, such inference has previously been difficult due to the lack of proper analytical methods and the limited availability of genetic data. A recently developed method called Pairwise Sequentially Markovian Coalescent (PSMC) offers the capability to estimate the trajectories of historical populations over considerable time periods using genomic sequences. In this study, we applied this approach to infer the historical demography of the common carp using samples collected from Europe, Asia and the Americas. Comparison between Asian and European common carp populations showed that the last glacial period starting 100 ka BP likely caused a significant decline in population size of the wild common carp in Europe, while it did not have much of an impact on its counterparts in Asia. This was probably caused by differences in glacial activity in East Asia and Europe, suggesting a separation of the European and Asian clades before the last glacial maximum. The North American clade, which is an invasive population, shared a similar demographic history with those from Europe, consistent with the idea that the North American common carp probably had European ancestral origins. Our analysis represents the first reconstruction of the historical population demography of the common carp, which is important for elucidating the separation of the European and Asian common carp clades during the Quaternary glaciation, as well as the dispersal of the common carp across the world.

  17. Method and apparatus for optimized sampling of volatilizable target substances

    DOEpatents

    Lindgren, Eric R.; Phelan, James M.

    2002-01-01

    An apparatus for capturing target analytes from gases such as soil gas. Target analytes may include emanations from explosive materials or from residues of explosive materials. The apparatus employs principles of sorption common to solid phase microextraction, and is best used in conjunction with analysis means such as a gas chromatograph. To capture target analytes, the apparatus employs various sorptive structures. Depending upon the embodiment, those structures may include 1) a conventional solid-phase microextraction (SPME) fiber, 2) a SPME fiber suspended in a capillary tube (with means provided for moving gases through the capillary tube so that the gases come into close proximity to the suspended fiber), and 3) a capillary tube including an interior surface on which sorptive material (similar to that on the surface of a SPME fiber) is supported (along with means for moving gases through the capillary tube so that the gases come into close proximity to the sorptive material). In one disclosed embodiment, at least one such sorptive structure is associated with an enclosure including an opening in communication with the surface of a soil region potentially contaminated with buried explosive material such as unexploded ordnance. Emanations from explosive materials can pass into and accumulate in the enclosure, where they are sorbed by the sorptive structures. Also disclosed is the use of heating means such as microwave horns to drive target analytes into the soil gas from solid and liquid phase components of the soil.

  18. Multiplexed Paper Analytical Device for Quantification of Metals using Distance-Based Detection

    PubMed Central

    Cate, David M.; Noblitt, Scott D.; Volckens, John; Henry, Charles S.

    2015-01-01

    Exposure to metal-containing aerosols has been linked with adverse health outcomes for almost every organ in the human body. Commercially available techniques for quantifying particulate metals are time-intensive, laborious, and expensive; analysis of a single sample often exceeds $100. We report a simple technique, based upon a distance-based detection motif, for quantifying metal concentrations of Ni, Cu, and Fe in airborne particulate matter using microfluidic paper-based analytical devices. Paper substrates are used to create sensors that are self-contained, self-timing, and require only a drop of sample for operation. Unlike other colorimetric approaches in paper microfluidics that rely on optical instrumentation for analysis, with distance-based detection the analyte is quantified visually based on the distance of a colorimetric reaction, similar to reading temperature on a thermometer. To demonstrate the effectiveness of this approach, Ni, Cu, and Fe were measured individually in single-channel devices; detection limits as low as 0.1, 0.1, and 0.05 µg were reported for Ni, Cu, and Fe, respectively. Multiplexed analysis of all three metals was achieved with detection limits of 1, 5, and 1 µg for Ni, Cu, and Fe. We also extended the dynamic range for multi-analyte detection by printing concentration gradients of colorimetric reagents using an off-the-shelf inkjet printer. Analyte selectivity was demonstrated for common interferences. To demonstrate the utility of the method, Ni, Cu, and Fe were measured from samples of certified welding fume; levels measured with paper sensors matched known values determined gravimetrically. PMID:26009988
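
The thermometer-style readout described above reduces, computationally, to a linear calibration between color-development distance and deposited mass. The calibration points below are hypothetical and the fit is a minimal sketch, not the authors' calibration.

```python
import numpy as np

# Hypothetical calibration: distance (mm) of color development measured
# for known masses (µg) of a metal on the paper device.
mass_ug = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
distance_mm = np.array([3.1, 6.2, 12.0, 24.5, 48.8])

# Distance-based detection behaves like a thermometer: fit a simple
# linear calibration, then invert it to read an unknown sample.
slope, intercept = np.polyfit(mass_ug, distance_mm, 1)

def mass_from_distance(d_mm):
    """Invert the linear calibration to estimate the deposited mass (µg)."""
    return (d_mm - intercept) / slope

# A sample developing color over 12 mm should read close to 2 µg.
print(round(mass_from_distance(12.0), 2))
```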

  19. 40 CFR 158.355 - Enforcement analytical method.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Enforcement analytical method. 158.355 Section 158.355 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An...

  20. 40 CFR 158.355 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 158.355 Section 158.355 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An...
