Sample records for analytical procedure based

  1. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups - metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with the more widely accepted green analytical chemistry tools NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
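
    As a point of reference, the sketch below shows the mechanics of a PROMETHEE II net-flow ranking in Python; the alternatives, criteria values and weights are hypothetical and a simple usual-criterion preference function is assumed, so it illustrates the method rather than reproducing the paper's data or settings.

      import numpy as np

      # rows = alternatives (analytical procedures), columns = criteria
      X = np.array([[0.5, 12.0, 3.0],    # hypothetical LOD, solvent volume, analysis time
                    [0.2, 40.0, 8.0],
                    [0.8,  5.0, 2.0]])
      weights = np.array([0.4, 0.4, 0.2])        # hypothetical expert weights (sum to 1)
      minimize = np.array([True, True, True])    # all three criteria: lower is better

      n = len(X)
      phi_plus, phi_minus = np.zeros(n), np.zeros(n)
      for a in range(n):
          for b in range(n):
              if a == b:
                  continue
              # usual-criterion preference: 1 where a is strictly better than b
              better = np.where(minimize, X[a] < X[b], X[a] > X[b])
              pi_ab = np.sum(weights * better)   # aggregated preference of a over b
              phi_plus[a] += pi_ab / (n - 1)     # positive (leaving) flow of a
              phi_minus[b] += pi_ab / (n - 1)    # negative (entering) flow of b

      net_flow = phi_plus - phi_minus            # PROMETHEE II net outranking flow
      print(np.argsort(-net_flow))               # alternatives ranked best to worst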

  2. Standard Errors of Equating for the Percentile Rank-Based Equipercentile Equating with Log-Linear Presmoothing

    ERIC Educational Resources Information Center

    Wang, Tianyou

    2009-01-01

    Holland and colleagues derived a formula for analytical standard error of equating using the delta-method for the kernel equating method. Extending their derivation, this article derives an analytical standard error of equating procedure for the conventional percentile rank-based equipercentile equating with log-linear smoothing. This procedure is…
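
    For context, the generic delta-method form that derivations of this kind build on can be written as follows (generic notation, not necessarily the article's):

      \mathrm{SE}\bigl[\hat e_Y(x)\bigr] \;\approx\; \sqrt{\left(\frac{\partial e_Y(x)}{\partial \boldsymbol{\theta}}\right)^{\top} \boldsymbol{\Sigma}_{\hat{\boldsymbol{\theta}}}\, \left(\frac{\partial e_Y(x)}{\partial \boldsymbol{\theta}}\right)}

    where e_Y(x) is the equating function viewed as a function of the (presmoothed) score-distribution parameters theta, and Sigma is the estimated covariance matrix of those parameters.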

  3. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry

    ERIC Educational Resources Information Center

    Adami, Gianpiero

    2006-01-01

    A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest in analytical chemistry and environmental sciences and…

  4. Exploring the Efficacy of Behavioral Skills Training to Teach Basic Behavior Analytic Techniques to Oral Care Providers

    ERIC Educational Resources Information Center

    Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella

    2012-01-01

    Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…

  5. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic strategy for the quality control and standardization of analytical procedures, was inaugurated. A fast and simple prevalidation methodology based on the mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms were given in detail. This strategy has been produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which occurs very often in practice, could be the most appropriate to fit the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step was suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, obtained as prevalidation figures of merit, confirmed prevalidation as a valuable concept for the preliminary evaluation of the quality of analytical procedures.

  6. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by applying different weights to the decision-making criteria. All three scenarios indicate a capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between the three scenarios. A second run of rankings was done for scenarios that include only metrological, only economic or only environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs and is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    PubMed

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widely spread across all domains of activity where measurements are made. Nevertheless, the simple question of whether an analytical procedure is acceptable for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the validation criteria to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development apply those regulatory recommendations. Although these two first guides contributed widely to the use and progress of analytical validation, they nevertheless have weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile, uses the notion of total error, and simplifies the validation of an analytical procedure while checking the risk associated with its usage. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method, as stated in all regulatory documents.

  9. A paper-based analytical device for the determination of hydrogen sulfide in fuel oils based on headspace liquid-phase microextraction and cyclic voltammetry.

    PubMed

    Nechaeva, Daria; Shishov, Andrey; Ermakov, Sergey; Bulatov, Andrey

    2018-06-01

    An easily performed, miniaturized, cheap, selective and sensitive procedure for the determination of H2S in fuel oil samples, based on headspace liquid-phase microextraction followed by cyclic voltammetric detection using a paper-based analytical device (PAD), was developed. A modified wax dipping method was applied to fabricate the PAD. The PAD included hydrophobic zones for the sample and the supporting electrolyte connected by a hydrophilic channel. The sample and supporting-electrolyte zones were connected with nickel working, platinum auxiliary and Ag/AgCl reference electrodes. The analytical procedure included separation of H2S from the fuel oil sample by headspace liquid-phase microextraction into an alkaline solution. The obtained sulfide ion solution and the supporting electrolyte were then dropped onto the zones, followed by analyte detection at +0.45 V. Under the optimized conditions, the H2S concentration in the range from 2 to 20 mg kg⁻¹ had a good linear relation with the peak current. The limit of detection (3σ) was 0.6 mg kg⁻¹. The procedure was successfully applied to the analysis of fuel oil samples. Copyright © 2018 Elsevier B.V. All rights reserved.
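
    As a rough illustration of the calibration and 3-sigma detection-limit arithmetic mentioned above, a short Python sketch follows; the standard concentrations, peak currents and blank noise are invented for the example and are not the paper's data.

      import numpy as np

      conc = np.array([2.0, 5.0, 10.0, 15.0, 20.0])            # standards, mg kg-1
      peak_current = np.array([0.41, 1.00, 2.05, 3.10, 4.00])   # signal (arbitrary units)
      slope, intercept = np.polyfit(conc, peak_current, 1)      # linear calibration fit

      blank_sd = 0.04               # assumed standard deviation of the blank signal
      lod = 3 * blank_sd / slope    # 3-sigma limit of detection, mg kg-1
      print(f"slope = {slope:.3f}, LOD = {lod:.2f} mg kg-1")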

  10. Recent advancements in chemical luminescence-based lab-on-chip and microfluidic platforms for bioanalysis.

    PubMed

    Mirasoli, Mara; Guardigli, Massimo; Michelini, Elisa; Roda, Aldo

    2014-01-01

    Miniaturization of analytical procedures through microchips, lab-on-a-chip or micro total analysis systems is one of the most recent trends in chemical and biological analysis. These systems are designed to perform all the steps in an analytical procedure, with the advantages of low sample and reagent consumption, fast analysis, reduced costs, and the possibility of extra-laboratory application. A range of detection technologies have been employed in miniaturized analytical systems, but most applications have relied on fluorescence and electrochemical detection. Chemical luminescence (which includes chemiluminescence, bioluminescence, and electrogenerated chemiluminescence) represents an alternative detection principle that offers comparable (or better) analytical performance and easier implementation in miniaturized analytical devices. Nevertheless, chemical luminescence-based devices represent only a small fraction of the microfluidic devices reported in the literature, and until now no review has focused on them. Here we review the most relevant applications (since 2009) of miniaturized analytical devices based on chemical luminescence detection. After a brief overview of the main chemical luminescence systems and of the recent technological advancements regarding their implementation in miniaturized analytical devices, analytical applications are reviewed according to the nature of the device (microfluidic chips, microchip electrophoresis, lateral flow- and paper-based devices) and the type of application (micro-flow injection assays, enzyme assays, immunoassays, gene probe hybridization assays, cell assays, whole-cell biosensors). Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Quality-assurance results for routine water analyses in U.S. Geological Survey laboratories, water year 1998

    USGS Publications Warehouse

    Ludtke, Amy S.; Woodworth, Mark T.; Marsh, Philip S.

    2000-01-01

    The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for two laboratories: the National Water Quality Laboratory and the Quality of Water Service Unit. Reference samples that contain selected inorganic, nutrient, and low-level constituents are prepared and submitted to the laboratory as disguised routine samples. The program goal is to estimate precision and bias for as many analytical methods offered by the participating laboratories as possible. Blind reference samples typically are submitted at a rate of 2 to 5 percent of the annual environmental-sample load for each constituent. The samples are distributed to the laboratories throughout the year. The reference samples are subject to the same laboratory handling, processing, and analytical procedures as environmental samples and, therefore, have been used as an independent source to verify bias and precision of laboratory analytical methods and ambient water-quality measurements. The results are stored permanently in the National Water Information System and the Blind Sample Project's database. During water year 1998, 95 analytical procedures were evaluated at the National Water Quality Laboratory and 63 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic and low-level constituent data for water year 1998 indicated that 77 of 78 analytical procedures at the National Water Quality Laboratory met the criteria for precision. Silver (dissolved, inductively coupled plasma-mass spectrometry) was determined to be imprecise. Five of 78 analytical procedures showed bias throughout the range of reference samples: chromium (dissolved, inductively coupled plasma-atomic emission spectrometry), dissolved solids (dissolved, gravimetric), lithium (dissolved, inductively coupled plasma-atomic emission spectrometry), silver (dissolved, inductively coupled plasma-mass spectrometry), and zinc (dissolved, inductively coupled plasma-mass spectrometry). At the National Water Quality Laboratory during water year 1998, lack of precision was indicated for 2 of 17 nutrient procedures: ammonia as nitrogen (dissolved, colorimetric) and orthophosphate as phosphorus (dissolved, colorimetric). Bias was indicated throughout the reference sample range for ammonia as nitrogen (dissolved, colorimetric, low level) and nitrate plus nitrite as nitrogen (dissolved, colorimetric, low level). All analytical procedures tested at the Quality of Water Service Unit during water year 1998 met the criteria for precision. One of the 63 analytical procedures indicated a bias throughout the range of reference samples: aluminum (whole-water recoverable, inductively coupled plasma-atomic emission spectrometry, trace).

  12. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It used rule-based analytical and cognitive task analysis methods to break down the operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  13. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, from which a control strategy can be set.

  14. General Procedure for the Easy Calculation of pH in an Introductory Course of General or Analytical Chemistry

    ERIC Educational Resources Information Center

    Cepriá, Gemma; Salvatella, Luis

    2014-01-01

    All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKa values of the acids involved in the…
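
    The abstract is cut off above, but one familiar special case of an "average pKa" pH expression, offered here purely as an assumed illustration of the idea, is the result for an amphiprotic species HA- of a diprotic acid H2A:

      \mathrm{pH} \;\approx\; \tfrac{1}{2}\left(\mathrm{p}K_{a1} + \mathrm{p}K_{a2}\right)

    which holds under the usual simplifying approximations taught in introductory courses; the paper's general procedure evidently adds further terms to this average-pKa contribution, but the truncated abstract does not specify them.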

  15. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that the data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data-analysis procedures for geoscientists. PMID:25742012

  16. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that the data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data-analysis procedures for geoscientists.
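
    As a toy illustration of the map/reduce pattern the framework parallelizes, the Python sketch below computes a per-cell mean over many chunks of gridded output with plain multiprocessing; it is an assumed stand-in, not the authors' HBase/Hadoop implementation.

      from functools import reduce
      from multiprocessing import Pool
      import numpy as np

      def map_chunk(chunk):
          # chunk: 2-D array (time steps x grid cells); emit per-cell sums and a count
          return chunk.sum(axis=0), chunk.shape[0]

      def reduce_pair(a, b):
          # merge two partial results
          return a[0] + b[0], a[1] + b[1]

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          chunks = [rng.normal(size=(1000, 50)) for _ in range(8)]  # fake model output
          with Pool(4) as pool:
              partials = pool.map(map_chunk, chunks)                # "map" in parallel
          total, count = reduce(reduce_pair, partials)              # "reduce" the partials
          print((total / count)[:5])                                # per-cell global mean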

  17. A Fuzzy-Based Decision Support Model for Selecting the Best Dialyser Flux in Haemodialysis.

    PubMed

    Oztürk, Necla; Tozan, Hakan

    2015-01-01

    Decision making is an important procedure for every organization. The procedure is particularly challenging for complicated multi-criteria problems. Selection of dialyser flux is one of the decisions routinely made for haemodialysis treatment provided to chronic kidney failure patients. This study provides a decision support model for selecting the best dialyser flux between high-flux and low-flux dialyser alternatives. The preferences of decision makers were collected via a questionnaire. A total of 45 questionnaires filled in by dialysis physicians and nephrologists were assessed. A hybrid fuzzy-based decision support software that enables the use of the Analytic Hierarchy Process (AHP), Fuzzy Analytic Hierarchy Process (FAHP), Analytic Network Process (ANP), and Fuzzy Analytic Network Process (FANP) was used to evaluate the flux selection model. In conclusion, the results showed that a high-flux dialyser is the best option for haemodialysis treatment.

  18. A review of selected inorganic surface water quality-monitoring practices: are we really measuring what we think, and if so, are we doing it right?

    USGS Publications Warehouse

    Horowitz, Arthur J.

    2013-01-01

    Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and of some of the underlying assumptions that form the bases for these programs, seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions of current programs, such as calendar-based sampling and stationarity, are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.

  19. Liquefaction Resistance Based on Shear Wave Velocity

    DOT National Transportation Integrated Search

    1999-01-01

    This report reviews the current simplified procedures for evaluating the liquefaction resistance of granular soil deposits using small-strain shear wave velocity. These procedures were developed from analytical studies, laboratory studies, or very li...

  20. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of the probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure, because it is more closely related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process that allow QC planning based on the risk of harm to a patient due to the reporting of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used to estimate the probability of patient harm, allowing the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
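
    To make the error-detection probability concrete, the Python sketch below (scipy assumed) computes the rejection probability of a simple "any control outside +/-2 SD" rule under a systematic shift; it is illustrative of the PEDC idea only and is not the authors' model for Max E(NUF).

      from scipy.stats import norm

      def p_reject(shift_sd, n_controls=2, limit_sd=2.0):
          # probability that a single control stays within limits after a mean shift
          p_in = norm.cdf(limit_sd - shift_sd) - norm.cdf(-limit_sd - shift_sd)
          # reject the run if at least one of the controls falls outside the limits
          return 1 - p_in ** n_controls

      # e.g. probability of detecting a critical systematic error of 3 SD with 2 controls
      print(f"P(EDC) ~ {p_reject(3.0):.3f}")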

  1. Quality-assurance results for routine water analysis in US Geological Survey laboratories, water year 1991

    USGS Publications Warehouse

    Maloney, T.J.; Ludtke, A.S.; Krizman, T.L.

    1994-01-01

    The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for the National Water Quality Laboratory in Arvada, Colorado, and the Quality of Water Service Unit in Ocala, Florida. Reference samples containing selected inorganic, nutrient, and low ionic-strength constituents are prepared and disguised as routine samples. The program goal is to determine precision and bias for as many analytical methods offered by the participating laboratories as possible. The samples typically are submitted at a rate of approximately 5 percent of the annual environmental sample load for each constituent. The samples are distributed to the laboratories throughout the year. Analytical data for these reference samples reflect the quality of environmental sample data produced by the laboratories because the samples are processed in the same manner for all steps from sample login through data release. The results are stored permanently in the National Water Data Storage and Retrieval System. During water year 1991, 86 analytical procedures were evaluated at the National Water Quality Laboratory and 37 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic (major ion and trace metal) constituent data for water year 1991 indicated analytical imprecision in the National Water Quality Laboratory for 5 of 67 analytical procedures: aluminum (whole-water recoverable, atomic emission spectrometric, direct-current plasma); calcium (atomic emission spectrometric, direct); fluoride (ion-exchange chromatographic); iron (whole-water recoverable, atomic absorption spectrometric, direct); and sulfate (ion-exchange chromatographic). The results for 11 of 67 analytical procedures had positive or negative bias during water year 1991. Analytical imprecision was indicated in the determination of two of the five National Water Quality Laboratory nutrient constituents: orthophosphate as phosphorus and phosphorus. A negative or positive bias condition was indicated in three of five nutrient constituents. There was acceptable precision and no indication of bias for the 14 low ionic-strength analytical procedures tested in the National Water Quality Laboratory program and for the 32 inorganic and 5 nutrient analytical procedures tested in the Quality of Water Service Unit during water year 1991.

  2. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, analytical procedures applied to furan determination in food samples, described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
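
    For reference, a minimal TOPSIS sketch in Python is given below; the decision matrix, weights and benefit/cost orientation are hypothetical, so it demonstrates only the closeness-to-ideal ranking step, not the SOM grouping or the paper's furan dataset.

      import numpy as np

      # rows: procedures; columns: recovery, solvent use, cost (hypothetical values)
      X = np.array([[0.90, 30.0, 5.0],
                    [0.80, 10.0, 2.0],
                    [0.95, 50.0, 8.0]])
      w = np.array([0.5, 0.3, 0.2])              # hypothetical criteria weights
      benefit = np.array([True, False, False])   # recovery is the only benefit criterion

      R = X / np.linalg.norm(X, axis=0)          # vector-normalize each criterion
      V = R * w                                  # weighted normalized matrix
      ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
      anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
      d_plus  = np.linalg.norm(V - ideal, axis=1)
      d_minus = np.linalg.norm(V - anti_ideal, axis=1)
      closeness = d_minus / (d_plus + d_minus)   # higher = closer to the ideal solution
      print(np.argsort(-closeness))              # procedures ranked best to worst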

  3. Digital forensics: an analytical crime scene procedure model (ACSPM).

    PubMed

    Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut

    2013-12-10

    In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner safeguarding the accuracy and reliability of the evidence, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models conforming to chain-of-custody requirements, which rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough digital forensics (DF) process depends on sequential DF phases, each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. There are numerous DF process models that define DF phases in the literature, but no DF model that defines the phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than the whole digital investigation process and phases that end up in a court. When reviewing the relevant literature and consulting with law enforcement agencies, only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court were found. After analyzing the needs of law enforcement organizations and realizing the absence of a crime scene digital investigation procedure model for crime scene activities, we decided to inspect the relevant literature in an analytical way. The outcome of this inspection is the model suggested here, which is intended to provide guidance for the thorough and secure implementation of digital forensic procedures at a crime scene. Since in digital forensic investigations each case is unique and needs special examination, it is not possible to cover every aspect of crime scene digital forensics, but the proposed procedure model is intended as a general guideline for practitioners. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity-based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
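
    A minimal sketch of the coupled rate equations commonly written for a bivalent analyte is shown below (Python/scipy); the rate constants and surface capacity are invented and the statistical-factor convention is one of several in use, so this is not presented as the authors' exact formulation.

      import numpy as np
      from scipy.integrate import solve_ivp

      ka1, kd1 = 1e5, 1e-3      # 1/(M*s), 1/s: first binding step (assumed values)
      ka2, kd2 = 1e-4, 1e-3     # 1/(RU*s), 1/s: second (bridging) step (assumed values)
      C, Bmax = 50e-9, 100.0    # analyte concentration (M) and surface capacity (RU)

      def rhs(t, y):
          ab, ab2 = y                           # singly and doubly bound complexes
          b_free = Bmax - ab - 2 * ab2          # unoccupied ligand sites
          dab  = ka1 * C * b_free - kd1 * ab - ka2 * ab * b_free + kd2 * ab2
          dab2 = ka2 * ab * b_free - kd2 * ab2
          return [dab, dab2]

      sol = solve_ivp(rhs, (0.0, 300.0), [0.0, 0.0])   # association phase, 300 s
      response = sol.y[0] + sol.y[1]                   # each bound analyte counted once
      print(response[-1])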

  6. Automatic computer procedure for generating exact and analytical kinetic energy operators based on the polyspherical approach: General formulation and removal of singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ndong, Mamadou; Lauvergnat, David; Nauts, André

    2013-11-28

    We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al. [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.

  7. [Basic research on digital logistic management of hospital].

    PubMed

    Cao, Hui

    2010-05-01

    This paper analyzes and explores the possibilities of digital, information-based management realized by the equipment department, general services department, supply room and other material flow departments in different hospitals, in order to optimize the procedures of information-based asset management. Various analytical methods for medical supplies business models are considered, providing analytical data for correct decisions by hospital departments, hospital leaders and the governing authorities.

  8. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  9. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  10. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  11. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  12. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied to the determination of biogenic amines in wine samples and to polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
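
    A small sketch of the dominance relation that a Hasse diagram encodes is given below in Python; the score matrix is hypothetical and every criterion is oriented so that larger values are better.

      import numpy as np

      X = np.array([[3, 2, 5],     # hypothetical scores: 4 procedures x 3 criteria
                    [1, 1, 4],
                    [3, 3, 5],
                    [2, 1, 1]])
      n = len(X)
      dominates = np.zeros((n, n), dtype=bool)
      for a in range(n):
          for b in range(n):
              # a dominates b: at least as good everywhere, strictly better somewhere
              if a != b and np.all(X[a] >= X[b]) and np.any(X[a] > X[b]):
                  dominates[a, b] = True

      # Hasse "cover" edges: drop relations that follow by transitivity
      reach2 = (dominates.astype(int) @ dominates.astype(int)) > 0
      cover = dominates & ~reach2
      print(np.argwhere(cover))    # edges to draw in the Hasse diagram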

  14. Towards European urinalysis guidelines. Introduction of a project under European Confederation of Laboratory Medicine.

    PubMed

    Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G

    2000-07-01

    Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.

  15. Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project

    ERIC Educational Resources Information Center

    Farrell, John J.

    1977-01-01

    An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…

  16. Magnetic scavengers as carriers of analytes for flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS).

    PubMed

    Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz

    2015-09-07

    In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.

  17. Direct structural parameter identification by modal test results

    NASA Technical Reports Server (NTRS)

    Chen, J.-C.; Kuo, C.-P.; Garba, J. A.

    1983-01-01

    A direct identification procedure is proposed to obtain the mass and stiffness matrices based on test-measured eigenvalues and eigenvectors. The method is based on the theory of matrix perturbation, in which the correct mass and stiffness matrices are expanded in terms of the analytical values plus a modification matrix. The simplicity of the procedure enables real-time operation during structural testing.

  18. Development of an Analytical Procedure for the Determination of Multiclass Compounds for Forensic Veterinary Toxicology.

    PubMed

    Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej

    2018-04-01

    Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven to be effective in diagnosing fatal poisoning cases in animals. The method has been developed for the determination of analytes in liver samples, comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate, centrifuged again, and the supernatant was filtered and analyzed by liquid chromatography-tandem mass spectrometry. A validation of the procedure was performed. Repeatability coefficients of variation <15% were achieved for most of the analyzed substances. The analytical conditions allowed successful separation of a variety of poisons, with typical screening detection limits at the ≤10 μg/kg level. The method was used to investigate more than 100 animal poisoning incidents and proved useful in animal forensic toxicology cases.

  19. New procedure of quantitative mapping of Ti and Al released from dental implant and Mg, Ca, Fe, Zn, Cu, Mn as physiological elements in oral mucosa by LA-ICP-MS.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Koczorowski, Ryszard; Barałkiewicz, Danuta

    2017-12-01

    A new procedure for the determination of elements derived from titanium implants, and of physiological elements, in soft tissues by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is presented. The analytical procedure involved preparation of in-house matrix-matched solid standards with analyte addition, based on the certified reference material (CRM) MODAS-4 Cormorant Tissue. Addition of gelatin, serving as a binding agent, substantially improved the physical properties of the standards. The performance of the analytical method was assessed and validated by calculating parameters such as precision, detection limits, trueness and recovery of the analyte addition using an additional CRM, ERM-BB184 Bovine Muscle. The analyte addition was additionally confirmed by microwave digestion of the solid standards and analysis by solution nebulization ICP-MS. The detection limits range from 1.8 μg g⁻¹ to 450 μg g⁻¹ for Mn and Ca, respectively. The precision values range from 7.3% to 42% for Al and Zn, respectively. The estimated recoveries of the analyte addition lie within the range of 83%-153% for Mn and Cu, respectively. Oral mucosa samples taken from patients treated with titanium dental implants were examined using the developed analytical method. Standards and tissue samples were cryocut into 30 µm thin sections. LA-ICP-MS made it possible to obtain two-dimensional maps of the distribution of elements in the tested samples, which revealed a high content of Ti and Al derived from the implants. Photographs from an optical microscope displayed numerous µm-sized particles in the oral mucosa samples, which suggests that they are residues from the implantation procedure. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Chan, Wai

    2005-01-01

    Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…

  1. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster students' intrinsic motivation to learn, improve their confidence in self-directed learning activities and enhance their…

  2. Flexible pavement overlay design procedures. Volume 1: Evaluation and modification of the design methods

    NASA Astrophysics Data System (ADS)

    Majidzadeh, K.; Ilves, G. J.

    1981-08-01

    A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.

  3. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss replica analytic continuation using several simple models in order to prove mathematically the validity of replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques - the replica trick (or replica analytic continuation) and the thermodynamic limit (and/or order parameter expansion) - we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models and examine the properties of replica analytic continuation. Based on the positive results for these models, we propose that replica analytic continuation is a robust procedure in replica analysis.
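
    The replica identity at the core of the replica trick can be stated, in standard notation, as

      \left\langle \ln Z \right\rangle \;=\; \lim_{n \to 0} \frac{\left\langle Z^{n} \right\rangle - 1}{n} \;=\; \lim_{n \to 0} \frac{\partial}{\partial n} \ln \left\langle Z^{n} \right\rangle ,

    where the angle brackets denote the average over quenched disorder; the analytic continuation of the moments from integer n to n -> 0 is exactly the step whose validity the paper examines on simple models.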

  4. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling-qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  5. 40 CFR 1065.715 - Natural gas.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROCEDURES Engine Fluids, Test Fuels, Analytical Gases and Other Calibration Standards § 1065.715 Natural gas... specifications in the following table: Table 1 of § 1065.715—Test Fuel Specifications for Natural Gas Property....051 mol/mol. 1 Demonstrate compliance with fuel specifications based on the reference procedures in...

  6. A new method for constructing analytic elements for groundwater flow.

    NASA Astrophysics Data System (ADS)

    Strack, O. D.

    2007-12-01

    The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41, 2/1005, 2003, by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, it requires the existence of the elementary solutions and, second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure to be presented is used to generalize this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach will be applied, along with numerical examples, to the modified Helmholtz equation and the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.

  7. Recent developments in nickel electrode analysis

    NASA Technical Reports Server (NTRS)

    Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.

    1991-01-01

    Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.

  8. Systematic investigation of ion suppression and enhancement effects of fourteen stable-isotope-labeled internal standards by their native analogues using atmospheric-pressure chemical ionization and electrospray ionization and the relevance for multi-analyte liquid chromatographic/mass spectrometric procedures.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Meyer, Markus R; Maurer, Hans H

    2010-04-15

    In clinical and forensic toxicology, multi-analyte procedures are very useful to quantify drugs and poisons of different classes in one run. For liquid chromatographic/tandem mass spectrometric (LC/MS/MS) multi-analyte procedures, often only a limited number of stable-isotope-labeled internal standards (SIL-ISs) are available. If an SIL-IS is used for quantification of other analytes, it must be ensured that the co-eluting native analyte does not influence its ionization. Therefore, the ion suppression and enhancement effects of fourteen SIL-ISs caused by their native analogues have been studied. It could be shown that the native analyte concentration influenced the extent of ion suppression and enhancement effects, leading to more suppression with increasing analyte concentration, especially when electrospray ionization (ESI) was used. Using atmospheric-pressure chemical ionization (APCI), methanolic solutions showed mainly enhancement effects, whereas, with one exception, no ion suppression and enhancement effects occurred when plasma extracts were used under these conditions. Such differences were not observed using ESI. With ESI, eleven SIL-ISs showed relevant suppression effects, but only one analyte showed suppression effects when APCI was used. The presented study showed that ion suppression and enhancement tests using matrix-based samples of different sources are essential for the selection of ISs, particularly if they are used for several analytes, to avoid incorrect quantification. In conclusion, only SIL-ISs should be selected for which no suppression and enhancement effects can be observed. If not enough ISs are free of ionization interferences, a different ionization technique should be considered. Copyright © 2010 John Wiley & Sons, Ltd.

  9. Career Decision Statuses among Portuguese Secondary School Students: A Cluster Analytical Approach

    ERIC Educational Resources Information Center

    Santos, Paulo Jorge; Ferreira, Joaquim Armando

    2012-01-01

    Career indecision is a complex phenomenon and an increasing number of authors have proposed that undecided individuals do not form a group with homogeneous characteristics. This study examines career decision statuses among a sample of 362 12th-grade Portuguese students. A cluster-analytical procedure, based on a battery of instruments designed to…

  10. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which reduces sample manipulation. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  11. Design optimization of an axial-field eddy-current magnetic coupling based on magneto-thermal analytical model

    NASA Astrophysics Data System (ADS)

    Fontchastagner, Julien; Lubin, Thierry; Mezani, Smaïl; Takorabet, Noureddine

    2018-03-01

    This paper presents a design optimization of an axial-flux eddy-current magnetic coupling. The design procedure is based on a torque formula derived from a 3D analytical model and a population algorithm method. The main objective of this paper is to determine the best design in terms of magnets volume in order to transmit a torque between two movers, while ensuring a low slip speed and a good efficiency. The torque formula is very accurate and computationally efficient, and is valid for any slip speed values. Nevertheless, in order to solve more realistic problems, and then, take into account the thermal effects on the torque value, a thermal model based on convection heat transfer coefficients is also established and used in the design optimization procedure. Results show the effectiveness of the proposed methodology.
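
    As an illustration only (not the authors' model), the sketch below shows how a population-based algorithm can drive such a design: a placeholder torque expression stands in for the paper's 3D analytical torque formula, and scipy's differential evolution searches for magnet dimensions that minimize magnet volume while penalizing designs that miss the required torque. All coefficients, bounds, and the pole count are assumptions.

        import numpy as np
        from scipy.optimize import differential_evolution

        T_REQUIRED = 50.0   # required transmitted torque, N*m (assumed)

        def torque(x):
            """Stand-in for an analytical torque formula; x = (magnet thickness, pole span, mean radius), in metres."""
            h, w, r = x
            return 4.0e6 * h * w * r**2   # hypothetical scaling, NOT the paper's 3D model

        def magnet_volume(x):
            h, w, r = x
            n_poles = 8                      # assumed pole count
            return n_poles * h * w * (0.3 * r)   # rough magnet volume per design (assumed)

        def objective(x):
            # Minimize magnet volume; quadratic penalty if the torque target is missed.
            shortfall = max(0.0, T_REQUIRED - torque(x))
            return magnet_volume(x) + 1e3 * shortfall**2

        bounds = [(0.002, 0.02), (0.01, 0.05), (0.05, 0.15)]   # h, w, r in metres (assumed)
        result = differential_evolution(objective, bounds, seed=1)
        print(result.x, magnet_volume(result.x), torque(result.x))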

  12. Matrix-enhanced secondary ion mass spectrometry: The Alchemist's solution?

    NASA Astrophysics Data System (ADS)

    Delcorte, Arnaud

    2006-07-01

    Because of the requirements of large molecule characterization and high-lateral resolution SIMS imaging, the possibility of improving molecular ion yields by the use of specific sample preparation procedures has recently generated a renewed interest in the static SIMS community. In comparison with polyatomic projectiles, however, signal enhancement by a matrix might appear to some as the alchemist's versus the scientist's solution to the current problems of organic SIMS. In this contribution, I would like to discuss critically the pros and cons of matrix-enhanced SIMS procedures, in the new framework that includes polyatomic ion bombardment. This discussion is based on a short review of the experimental and theoretical developments achieved in the last decade with respect to the three following approaches: (i) blending the analyte with a low-molecular weight organic matrix (MALDI-type preparation procedure); (ii) mixing alkali/noble metal salts with the analyte; (iii) evaporating a noble metal layer on the analyte sample surface (organic molecules, polymers).

  13. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications namely, gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulation such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
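
    For reference, a small sketch (mine, not the authors' code) of the Kreisselmeier-Steinhauser aggregation mentioned above: several normalized objectives or constraints g_i are folded into one smooth envelope function KS = (1/rho) * ln(sum_i exp(rho * g_i)), which approaches max_i g_i as the draw-down factor rho grows.

        import numpy as np
        from scipy.special import logsumexp

        def ks_function(g, rho=50.0):
            """Kreisselmeier-Steinhauser envelope of the values in g (rho is the draw-down factor)."""
            g = np.asarray(g, dtype=float)
            # logsumexp keeps the evaluation numerically stable for large rho.
            return logsumexp(rho * g) / rho

        # Example: three normalized objective values; KS is a smooth upper bound on their maximum.
        g = [0.12, -0.30, 0.05]
        print(ks_function(g), max(g))   # KS sits slightly above max(g) and is differentiable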

  14. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

    Background Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. Results This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples. PMID:23050842

  15. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    PubMed

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples.

  16. Determination of novel brominated flame retardants and polybrominated diphenyl ethers in serum using gas chromatography-mass spectrometry with two simplified sample preparation procedures.

    PubMed

    Gao, Le; Li, Jian; Wu, Yandan; Yu, Miaohao; Chen, Tian; Shi, Zhixiong; Zhou, Xianqing; Sun, Zhiwei

    2016-11-01

    Two simple and efficient pretreatment procedures have been developed for the simultaneous extraction and cleanup of six novel brominated flame retardants (NBFRs) and eight common polybrominated diphenyl ethers (PBDEs) in human serum. The first sample pretreatment procedure was a quick, easy, cheap, effective, rugged, and safe (QuEChERS)-based approach. An acetone/hexane mixture was employed to isolate the lipid and analytes from the serum with a combination of MgSO 4 and NaCl, followed by a dispersive solid-phase extraction (d-SPE) step using C18 particles as a sorbent. The second sample pretreatment procedure was based on solid-phase extraction. The sample extraction and cleanup were conducted directly on an Oasis HLB SPE column using 5 % aqueous isopropanol, concentrated sulfuric acid, and 10 % aqueous methanol, followed by elution with dichloromethane. The NBFRs and PBDEs were then detected using gas chromatography-negative chemical ionization mass spectrometry (GC-NCI MS). The methods were assessed for repeatability, accuracy, selectivity, limits of detection (LODs), and linearity. The results of spike recovery experiments in fetal bovine serum showed that average recoveries ranged from 77.9 % to 128.8 % with relative standard deviations (RSDs) from 0.73 % to 12.37 % for most of the analytes. The LODs for the analytes in fetal bovine serum ranged from 0.3 to 50.8 pg/mL except for decabromodiphenyl ethane. The proposed method was successfully applied to the determination of the 14 brominated flame retardants in human serum. The two pretreatment procedures described here are simple, accurate, and precise, and are suitable for the routine analysis of human serum. Graphical Abstract Workflow of a QuEChERS-based approach (top) and an SPE-based approach (bottom) for the detection of PBDEs and NBFRs in serum.

  17. Laboratory Analytical Procedures | Bioenergy | NREL

    Science.gov Websites

    Laboratory analytical procedures (LAPs) provide validated methods for biofuels and pyrolysis bio-oil research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing

  18. In situ ionic liquid dispersive liquid-liquid microextraction and direct microvial insert thermal desorption for gas chromatographic determination of bisphenol compounds.

    PubMed

    Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel

    2016-01-01

    A new procedure based on direct insert microvial thermal desorption injection allows the direct analysis of ionic liquid extracts by gas chromatography and mass spectrometry (GC-MS). For this purpose, an in situ ionic liquid dispersive liquid-liquid microextraction (in situ IL DLLME) has been developed for the quantification of bisphenol A (BPA), bisphenol Z (BPZ) and bisphenol F (BPF). Different parameters affecting the extraction efficiency of the microextraction technique and the thermal desorption step were studied. The optimized procedure, determining the analytes as acetyl derivatives, provided detection limits of 26, 18 and 19 ng L(-1) for BPA, BPZ and BPF, respectively. The release of the three analytes from plastic containers was monitored using this newly developed analytical method. Analysis of the migration test solutions for 15 different plastic containers in daily use identified the presence of the analytes at concentrations ranging between 0.07 and 37 μg L(-1) in six of the samples studied, BPA being the most commonly found and at higher concentrations than the other analytes.

  19. Multi-ion detection by one-shot optical sensors using a colour digital photographic camera.

    PubMed

    Lapresta-Fernández, Alejandro; Capitán-Vallvey, Luis Fermín

    2011-10-07

    The feasibility and performance of a procedure to evaluate previously developed one-shot optical sensors as single and selective analyte sensors for potassium, magnesium and hardness are presented. The procedure uses a conventional colour digital photographic camera as the detection system for simultaneous multianalyte detection. A 6.0 megapixel camera was used, and the procedure describes how it is possible to quantify potassium, magnesium and hardness simultaneously from the images captured, using multianalyte one-shot sensors based on ionophore-chromoionophore chemistry, employing the colour information computed from a defined region of interest on the sensing membrane. One of the colour channels in the red, green, blue (RGB) colour space is used to build the analytical parameter, the effective degree of protonation (1-α(eff)), in good agreement with the theoretical model. The linearization of the sigmoidal response function increases the limit of detection (LOD) and analytical range in all cases studied. The increases were from 5.4 × 10(-6) to 2.7 × 10(-7) M for potassium, from 1.4 × 10(-4) to 2.0 × 10(-6) M for magnesium and from 1.7 to 2.0 × 10(-2) mg L(-1) of CaCO(3) for hardness. The method's precision was determined in terms of the relative standard deviation (RSD%) which was from 2.4 to 7.6 for potassium, from 6.8 to 7.8 for magnesium and from 4.3 to 7.8 for hardness. The procedure was applied to the simultaneous determination of potassium, magnesium and hardness using multianalyte one-shot sensors in different types of waters and beverages in order to cover the entire application range, statistically validating the results against atomic absorption spectrometry as the reference procedure. Accordingly, this paper is an attempt to demonstrate the possibility of using a conventional digital camera as an analytical device to measure this type of one-shot sensor based on ionophore-chromoionophore chemistry instead of using conventional lab instrumentation.

  20. New evidences on efficacy of boronic acid-based derivatization method to identify sugars in plant material by gas chromatography-mass spectrometry.

    PubMed

    Faraco, Marianna; Fico, Daniela; Pennetta, Antonio; De Benedetto, Giuseppe E

    2016-10-01

    This work presents an analytical procedure based on gas chromatography-mass spectrometry which allows the determination of aldoses (glucose, mannose, galactose, arabinose, xylose, fucose, rhamnose) and ketoses (fructose) in plant material. One peak for each target carbohydrate was obtained by using an efficient derivatization employing methylboronic acid and acetic anhydride sequentially, whereas the baseline separation of the analytes was accomplished using an ionic liquid capillary column. First, the proposed method was optimized and validated. Subsequently, it was applied to identify the carbohydrates present in plant material. Finally, the procedure was successfully applied to samples from a 17th-century painting, thus highlighting the occurrence of starch glue and fruit tree gum as polysaccharide materials. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  2. Computer-aided diagnostic strategy selection.

    PubMed

    Greenes, R A

    1986-03-01

    Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.

  3. Fast analytical model of MZI micro-opto-mechanical pressure sensor

    NASA Astrophysics Data System (ADS)

    Rochus, V.; Jansen, R.; Goyvaerts, J.; Neutens, P.; O’Callaghan, J.; Rottenberg, X.

    2018-06-01

    This paper presents a fast analytical procedure in order to design a micro-opto-mechanical pressure sensor (MOMPS) taking into account the mechanical nonlinearity and the optical losses. A realistic model of the photonic MZI is proposed, strongly coupled to a nonlinear mechanical model of the membrane. Based on the membrane dimensions, the residual stress, the position of the waveguide, the optical wavelength and the phase variation due to the opto-mechanical coupling, we derive an analytical model which allows us to predict the response of the total system. The effect of the nonlinearity and the losses on the total performance are carefully studied and measurements on fabricated devices are used to validate the model. Finally, a design procedure is proposed in order to realize fast design of this new type of pressure sensor.
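
    A highly simplified sketch of the kind of coupled model described above (illustrative assumptions only, not the authors' equations): the membrane deflection w under pressure P is taken from a Duffing-type load-deflection law P = a*w + b*w**3 that mimics the mechanical nonlinearity, the deflection is converted into an MZI phase shift through an assumed opto-mechanical coefficient, and the interferometer output follows the usual two-beam interference law, with a visibility term standing in for the optical losses. All numerical coefficients are placeholders.

        import numpy as np

        # Assumed lumped coefficients (placeholders, not fitted to any device)
        A_LIN = 5.0e10    # linear membrane stiffness term, Pa per metre
        B_NL = 1.0e22     # cubic (hardening) term, Pa per metre^3
        DPHI_DW = 3.0e6   # opto-mechanical phase sensitivity, rad per metre (assumed)
        VISIBILITY = 0.9  # fringe visibility, lumping optical losses (assumed)

        def deflection(p):
            """Solve A_LIN*w + B_NL*w**3 = p for the single physical (real) root."""
            roots = np.roots([B_NL, 0.0, A_LIN, -p])
            return roots[np.argmin(np.abs(roots.imag))].real

        def mzi_transmission(p, bias_phase=np.pi / 2):
            """Normalized MZI output for an applied pressure p (Pa)."""
            phi = DPHI_DW * deflection(p) + bias_phase
            return 0.5 * (1.0 + VISIBILITY * np.cos(phi))

        pressures = np.linspace(0.0, 5e4, 6)   # 0 to 50 kPa
        print([round(mzi_transmission(p), 3) for p in pressures])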

  4. Analytical and experimental procedures for determining propagation characteristics of millimeter-wave gallium arsenide microstrip lines

    NASA Technical Reports Server (NTRS)

    Romanofsky, Robert R.

    1989-01-01

    In this report, a thorough analytical procedure is developed for evaluating the frequency-dependent loss characteristics and effective permittivity of microstrip lines. The technique is based on the measured reflection coefficient of microstrip resonator pairs. Experimental data, including quality factor Q, effective relative permittivity, and fringing, for 50-ohm lines on gallium arsenide (GaAs) from 26.5 to 40.0 GHz are presented. The effects of an imperfect open circuit, coupling losses, and loading of the resonant frequency are considered. A cosine-tapered ridge-guide test fixture is described. It was found to be well suited to the device characterization.
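
    A small worked sketch of the kind of extraction such measurements allow (generic textbook relations with made-up numbers, not the report's full procedure): a half-wavelength resonator of physical length L resonating at f_r gives an effective permittivity via eps_eff = (c / (2 * L * f_r))**2 when end fringing is ignored, and the loaded quality factor follows from the 3 dB bandwidth.

        C0 = 299_792_458.0          # speed of light, m/s

        def eps_eff_from_resonance(length_m, f_res_hz, n=1):
            """Effective permittivity of a half-wave (n=1) microstrip resonator, fringing neglected."""
            return (n * C0 / (2.0 * length_m * f_res_hz)) ** 2

        def loaded_q(f_res_hz, f_lo_hz, f_hi_hz):
            """Loaded Q from the 3 dB points either side of resonance."""
            return f_res_hz / (f_hi_hz - f_lo_hz)

        # Hypothetical 1.55 mm resonator on GaAs measured near 35 GHz
        L = 1.55e-3
        f0, f_lo, f_hi = 34.8e9, 34.65e9, 34.95e9
        print(eps_eff_from_resonance(L, f0))   # ~7.7, plausible for microstrip on GaAs
        print(loaded_q(f0, f_lo, f_hi))        # ~116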

  5. Combination of magnetic dispersive micro solid-phase extraction and supramolecular solvent-based microextraction followed by high-performance liquid chromatography for determination of trace amounts of cholesterol-lowering drugs in complicated matrices.

    PubMed

    Arghavani-Beydokhti, Somayeh; Rajabi, Maryam; Asghari, Alireza

    2017-07-01

    A novel, efficient, rapid, simple, sensitive, selective, and environmentally friendly method termed magnetic dispersive micro solid-phase extraction combined with supramolecular solvent-based microextraction (Mdμ-SPE-SSME) followed by high-performance liquid chromatography (HPLC) with UV detection is introduced for the simultaneous microextraction of cholesterol-lowering drugs in complicated matrices. In the first microextraction procedure, using layered double hydroxide (LDH)-coated Fe 3 O 4 magnetic nanoparticles, an efficient sample cleanup is simply and rapidly provided without the need for time-consuming centrifugation and elution steps. In the first step, desorption of the target analytes is easily performed through dissolution of the LDH-coated magnetic nanoparticles containing the target analytes in an acidic solution. In the next step, an emulsification microextraction method based on a supramolecular solvent is used for excellent preconcentration, ultimately resulting in an appropriate determination of the target analytes in real samples. Under the optimal experimental conditions, the Mdμ-SPE-SSME-HPLC-UV detection procedure provides good linearity in the ranges of 1.0-1500 ng mL -1 , 1.5-2000 ng mL -1 , and 2.0-2000 ng mL -1 with coefficients of determination of 0.995 or less, low limits of detection (0.3, 0.5, and 0.5 ng mL -1 ), and good extraction repeatabilities (relative standard deviations below 7.8%, n = 5) in deionized water for rosuvastatin, atorvastatin, and gemfibrozil, respectively. Finally, the proposed method is successfully applied for the determination of the target analytes in complicated matrices. Graphical Abstract Mdμ-SPE-SSME procedure.

  6. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...

  7. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  8. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision-making process, whether for power plant siting, load forecasting, or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by the implementation of techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.

  9. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448

  10. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work, a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both the drug entrapped in the polymeric matrix (assay:content test) and the drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method used to extract the drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures.

    PubMed

    Ferrario, J; Byrne, C; Dupuy, A E

    1997-06-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
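
    The blank-correction logic described here can be illustrated with a short sketch (a generic scheme with made-up numbers, not the authors' exact procedure): the method-blank population defines a mean background and its variability, sample results are corrected by the mean blank, and any result that does not exceed the blank mean by at least three standard deviations is flagged as indistinguishable from background.

        import numpy as np

        # Hypothetical method-blank results for one coplanar PCB congener, pg per sample
        blanks = np.array([1.8, 2.4, 2.1, 1.6, 2.9, 2.2, 1.9, 2.5])
        blank_mean = blanks.mean()
        blank_sd = blanks.std(ddof=1)          # sample standard deviation
        reporting_threshold = blank_mean + 3.0 * blank_sd

        def blank_corrected(raw_pg):
            """Subtract the mean blank; flag results not distinguishable from background."""
            corrected = raw_pg - blank_mean
            detected = raw_pg > reporting_threshold
            return corrected, detected

        for raw in (2.6, 4.1, 9.7):            # hypothetical sample results, pg
            value, ok = blank_corrected(raw)
            print(f"raw={raw:4.1f} pg  corrected={value:4.1f} pg  above background: {ok}")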

  12. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures

    NASA Technical Reports Server (NTRS)

    Ferrario, J.; Byrne, C.; Dupuy, A. E. Jr

    1997-01-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.

  13. Determination of Total Lipids as Fatty Acid Methyl Esters (FAME) by in situ Transesterification: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Ramirez, Kelsey; Laurens, Lieve M. L.

    2016-01-13

    This procedure is based on a whole biomass transesterification of lipids to fatty acid methyl esters to represent an accurate reflection of the potential of microalgal biofuels. Lipids are present in many forms and play various roles within an algal cell, from cell membrane phospholipids to energy stored as triacylglycerols.

  14. Development of a new procedure for the determination of captopril in pharmaceutical formulations employing chemiluminescence and a multicommuted flow analysis approach.

    PubMed

    Lima, Manoel J A; Fernandes, Ridvan N; Tanaka, Auro A; Reis, Boaventura F

    2016-02-01

    This paper describes a new technique for the determination of captopril in pharmaceutical formulations, implemented by employing multicommuted flow analysis. The analytical procedure was based on the reaction between hypochlorite and captopril. The remaining hypochlorite oxidized luminol, which generated electromagnetic radiation that was detected using a homemade luminometer. To the best of our knowledge, this is the first time that this reaction has been exploited for the determination of captopril in pharmaceutical products, offering a clean analytical procedure with minimal reagent usage. The effectiveness of the proposed procedure was confirmed by analyzing a set of pharmaceutical formulations. Application of the paired t-test showed that there was no significant difference between the data sets at a 95% confidence level. The useful features of the new analytical procedure included a linear response for captopril concentrations in the range 20.0-150.0 µmol/L (r = 0.997), a limit of detection (3σ) of 2.0 µmol/L, a sample throughput of 164 determinations per hour, reagent consumption of 9 µg luminol and 42 µg hypochlorite per determination and generation of 0.63 mL of waste. A relative standard deviation of 1% (n = 6) for a standard solution containing 80 µmol/L captopril was also obtained. Copyright © 2015 John Wiley & Sons, Ltd.
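
    As a generic illustration of the figures of merit quoted above (made-up data, not from the paper), the sketch below fits a straight-line calibration of chemiluminescence signal versus captopril concentration and estimates a 3-sigma detection limit from the blank noise and the calibration slope.

        import numpy as np

        # Hypothetical calibration standards (umol/L) and chemiluminescence readings (a.u.)
        conc = np.array([20.0, 40.0, 60.0, 80.0, 100.0, 150.0])
        signal = np.array([105.0, 212.0, 316.0, 427.0, 529.0, 801.0])
        blank_readings = np.array([3.1, 2.6, 3.4, 2.9, 3.2, 2.8, 3.0, 3.3, 2.7, 3.1])

        slope, intercept = np.polyfit(conc, signal, 1)
        r = np.corrcoef(conc, signal)[0, 1]

        lod = 3.0 * blank_readings.std(ddof=1) / slope      # 3-sigma detection limit, umol/L

        print(f"slope = {slope:.3f} a.u. per umol/L, r = {r:.4f}")
        print(f"LOD (3 sigma) = {lod:.2f} umol/L")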

  15. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York--July 1999 through June 2001

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2006-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's LabMaster data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1999 through June 2001. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, calcium, chloride and nitrate (ion chromatography and colormetric method) and sulfate. The total aluminum and dissolved organic carbon procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits. The calcium and specific conductance procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The magnesium procedure was biased for the high-concentration and low concentration samples, but was within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 14 of 15 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 17 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except ammonium (81 percent of samples met objectives), chloride (75 percent of samples met objectives), and sodium (86 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with most ratings for each sample in the good to excellent range. The P-sample (low-ionic-strength constituents) analysis had one satisfactory rating for the specific conductance procedure in one study. The T-sample (trace constituents) analysis had one satisfactory rating for the aluminum procedure in one study and one unsatisfactory rating for the sodium procedure in another. The remainder of the samples had good or excellent ratings for each study. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 89 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were ammonium, total aluminum, dissolved organic carbon, and sodium. Results indicate a positive bias for the ammonium procedure in all studies. Data-quality objectives were not met in 50 percent of samples analyzed for total aluminum, 38 percent of samples analyzed for dissolved organic carbon, and 27 percent of samples analyzed for sodium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 91 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, and sulfate. Data-quality objectives were met by 75 percent of the samples analyzed for sodium and 58 percent of the samples analyzed for specific conductance.

  16. Mindfulness-Based Approaches in the Treatment of Disordered Gambling: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Wilson, Alyssa N.; Labuzienski, Elizabeth; Whiting, Seth W.

    2018-01-01

    Background and Aims: To examine the effects of mindfulness-based interventions on gambling behavior and symptoms, urges, and financial outcomes. Method: Systematic review and meta-analytic procedures were employed to search, select, code, and analyze studies conducted between 1980 and 2014, assessing the effects of mindfulness-based interventions…

  17. General Analytical Procedure for Determination of Acidity Parameters of Weak Acids and Bases

    PubMed Central

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new, convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for the components of mixtures of weak organic acids and bases of diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied. PMID:25692072

  18. General analytical procedure for determination of acidity parameters of weak acids and bases.

    PubMed

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new, convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for the components of mixtures of weak organic acids and bases of diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied.
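
    For the simplest single-acid case, the numerical processing of microtitration data can be sketched as follows (a toy example under Henderson-Hasselbalch assumptions with fabricated readings, not the multi-component algorithm of the paper): within the buffer region, pH = pKa + log10(V / (Veq - V)), so a linear fit of pH against log10(V / (Veq - V)) returns the pKa as its intercept.

        import numpy as np

        V_EQ = 10.0   # equivalence volume of titrant, mL (assumed known from the titration curve)

        # Hypothetical (volume of NaOH added in mL, measured pH) pairs inside the buffer region
        volume = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
        ph = np.array([4.16, 4.41, 4.61, 4.78, 4.95, 5.14, 5.38])

        x = np.log10(volume / (V_EQ - volume))
        slope, pka = np.polyfit(x, ph, 1)      # ideally slope ~ 1; the intercept is the pKa estimate

        print(f"fitted slope = {slope:.2f}, estimated pKa = {pka:.2f}")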

  19. Modeling of vortex generated sound in solid propellant rocket motors

    NASA Technical Reports Server (NTRS)

    Flandro, G. A.

    1980-01-01

    There is considerable evidence based on both full scale firings and cold flow simulations that hydrodynamically unstable shear flows in solid propellant rocket motors can lead to acoustic pressure fluctuations of significant amplitude. Although a comprehensive theoretical understanding of this problem does not yet exist, procedures were explored for generating useful analytical models describing the vortex shedding phenomenon and the mechanisms of coupling to the acoustic field in a rocket combustion chamber. Since combustion stability prediction procedures cannot be successful without incorporation of all acoustic gains and losses, it is clear that a vortex driving model comparable in quality to the analytical models currently employed to represent linear combustion instability must be formulated.

  20. Quality-assurance data for routine water analyses by the U.S. Geological Survey laboratory in Troy, New York - July 2003 through June 2005

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2009-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2003 through June 2005. Results for the quality-control samples for 20 analytical procedures were evaluated for bias and precision. Control charts indicate that data for five of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, pH, silicon, and sodium. Seven of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: dissolved organic carbon, chloride, nitrate (ion chromatograph), nitrite, silicon, sodium, and sulfate. The calcium and magnesium procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 17 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 22 analytes. At least 85 percent of the samples met data-quality objectives for all analytes except total monomeric aluminum (82 percent of samples met objectives), total aluminum (77 percent of samples met objectives), chloride (80 percent of samples met objectives), fluoride (76 percent of samples met objectives), and nitrate (ion chromatograph) (79 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with ratings for each sample in the satisfactory, good, and excellent ranges or less than 10 percent error. The P-sample (low-ionic-strength constituents) analysis had one marginal and two unsatisfactory ratings for the chloride procedure. The T-sample (trace constituents)analysis had two unsatisfactory ratings and one high range percent error for the aluminum procedure. The N-sample (nutrient constituents) analysis had one marginal rating for the nitrate procedure. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 84 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were ammonium, total aluminum, and acid-neutralizing capacity. 
The ammonium procedure did not meet data-quality objectives in all studies. Data-quality objectives were not met in 23 percent of samples analyzed for total aluminum and 45 percent of samples analyzed for acid-neutralizing capacity. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, sodium, and sulfate. Data-quality objectives were not met by samples analyzed for fluoride.

  1. 40 CFR 140.5 - Analytical procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...

  2. Human health risk assessment: models for predicting the effective exposure duration of on-site receptors exposed to contaminated groundwater.

    PubMed

    Baciocchi, Renato; Berardi, Simona; Verginelli, Iason

    2010-09-15

    Clean-up of contaminated sites is usually based on a risk-based approach for the definition of the remediation goals, which relies on the well-known ASTM-RBCA standard procedure. In this procedure, migration of contaminants is described through simple analytical models and the source contaminants' concentration is assumed to be constant throughout the entire exposure period, i.e. 25-30 years. The latter assumption may often prove over-protective of human health, leading to unrealistically low remediation goals. The aim of this work is to propose an alternative model that takes into account source depletion, while keeping the original simplicity and analytical form of the ASTM-RBCA approach. The results obtained by the application of this model are compared with those provided by the traditional ASTM-RBCA approach, by a model based on the source depletion algorithm of the RBCA ToolKit software and by a numerical model, allowing its feasibility for inclusion in risk analysis procedures to be assessed. The results discussed in this work are limited to on-site exposure to contaminated water by ingestion, but the approach proposed can be extended to other exposure pathways. Copyright 2010 Elsevier B.V. All rights reserved.
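
    One simple way to see the effect of relaxing the constant-source assumption is sketched below (illustrative first-order depletion, which is a common choice but not necessarily the model proposed in the paper; all values are hypothetical): with a source that decays as C(t) = C0*exp(-k*t), the concentration averaged over the exposure duration ED is C0*(1 - exp(-k*ED)) / (k*ED), which replaces the constant C0 in the intake calculation.

        import math

        def avg_concentration_constant(c0, ed_years):
            """ASTM-RBCA-style assumption: source concentration constant over the exposure period."""
            return c0

        def avg_concentration_depleting(c0, k_per_year, ed_years):
            """Time-averaged concentration for a first-order depleting source (illustrative)."""
            return c0 * (1.0 - math.exp(-k_per_year * ed_years)) / (k_per_year * ed_years)

        c0 = 0.5        # mg/L in groundwater at the start of exposure (hypothetical)
        ed = 25.0       # exposure duration, years
        k = 0.15        # first-order source depletion constant, 1/year (hypothetical)

        print(avg_concentration_constant(c0, ed))                 # 0.5 mg/L
        print(round(avg_concentration_depleting(c0, k, ed), 3))   # ~0.13 mg/L, a less conservative exposure term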

  3. Speciation of Mn(II), Mn(VII) and total manganese in water and food samples by coprecipitation-atomic absorption spectrometry combination.

    PubMed

    Citak, Demirhan; Tuzen, Mustafa; Soylak, Mustafa

    2010-01-15

    A speciation procedure based on the coprecipitation of manganese(II) with zirconium(IV) hydroxide has been developed for the investigation of levels of manganese species. The determination of manganese levels was performed by flame atomic absorption spectrometry (FAAS). Total manganese was determined after the reduction of Mn(VII) to Mn(II) by ascorbic acid. The analytical parameters including pH, amount of zirconium(IV), sample volume, etc., were investigated for the quantitative recoveries of manganese(II). The effects of matrix ions were also examined. The recoveries for manganese(II) were in the range of 95-98%. Preconcentration factor was calculated as 50. The detection limit for the analyte ions based on 3 sigma (n=21) was 0.75 microg L(-1) for Mn(II). The relative standard deviation was found to be lower than 7%. The validation of the presented procedure was performed by analysis of certified reference material having different matrices, NIST SRM 1515 (Apple Leaves) and NIST SRM 1568a (Rice Flour). The procedure was successfully applied to natural waters and food samples.

  4. Availability Simulation of AGT Systems

    DOT National Transportation Integrated Search

    1975-02-01

    The report discusses the analytical and simulation procedures that were used to evaluate the effects of failures in a complex dual-mode transportation system based on a worst-case steady-state condition. The computed results are an availability figure ...

  5. Improvement of analytical dynamic models using modal test data

    NASA Technical Reports Server (NTRS)

    Berman, A.; Wei, F. S.; Rao, K. V.

    1980-01-01

    A method developed to determine maximum changes in analytical mass and stiffness matrices to make them consistent with a set of measured normal modes and natural frequencies is presented. The corrected model will be an improved base for studies of physical changes, boundary condition changes, and for prediction of forced responses. The method features efficient procedures not requiring solutions of the eigenvalue problem, and the ability to have more degrees of freedom than the test data. In addition, modal displacements are obtained for all analytical degrees of freedom, and the frequency dependence of the coordinate transformations is properly treated.
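
    A compact sketch of a correction of this general type (a standard mass-orthogonalization update in the spirit of the approach, applied to a random toy model; it is not the report's own algorithm): given an analytical mass matrix Ma and a set of measured mode shapes Phi, the update dM = Ma Phi ma^(-1) (I - ma) ma^(-1) Phi^T Ma, with ma = Phi^T Ma Phi, yields a corrected matrix satisfying Phi^T (Ma + dM) Phi = I without solving an eigenvalue problem.

        import numpy as np

        rng = np.random.default_rng(0)

        n, m = 8, 3                                           # analytical DOFs and measured modes
        Ma = np.diag(rng.uniform(1.0, 3.0, n))                # toy analytical mass matrix
        Phi, _ = np.linalg.qr(rng.standard_normal((n, m)))    # toy "measured" mode shapes
        Phi += 0.05 * rng.standard_normal((n, m))             # perturb so they are not mass-orthonormal

        ma = Phi.T @ Ma @ Phi
        ma_inv = np.linalg.inv(ma)
        I = np.eye(m)

        # Mass correction enforcing Phi.T @ (Ma + dM) @ Phi = I
        dM = Ma @ Phi @ ma_inv @ (I - ma) @ ma_inv @ Phi.T @ Ma
        M_updated = Ma + dM

        print(np.allclose(Phi.T @ M_updated @ Phi, I))   # True: measured modes are now mass-orthonormal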

  6. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    PubMed

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare passive drag during underwater gliding obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computed tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent. A set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and drag coefficient (CD) were computed between 1.3 and 2.5 m/s by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: from 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R2 = 0.98) and after log-log transformation (R2 = 0.99). The CD also showed a very high adjustment for both absolute (R2 = 0.97) and log-log plots (R2 = 0.97). The bias for Df+pr was 8.37 N, and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by CFD and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gaining insight into a swimmer's hydrodynamic characteristics.
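
    The analytical side of such a comparison can be sketched with textbook relations (generic formulas and made-up inputs, not the specific set of procedures selected in the paper): total passive drag from D = 0.5 * rho * v**2 * S * CD, and a friction component from a flat-plate turbulent skin-friction estimate applied to the wetted surface area.

        RHO = 998.0       # water density, kg/m^3
        NU = 1.0e-6       # kinematic viscosity of water, m^2/s

        def total_drag(v, frontal_area_m2=0.09, cd=0.65):
            """Total passive drag (N) at speed v (m/s); CD and area are assumed values."""
            return 0.5 * RHO * v**2 * frontal_area_m2 * cd

        def friction_drag(v, wetted_area_m2=1.8, length_m=2.3):
            """Flat-plate turbulent skin-friction estimate of the friction component (N)."""
            re = v * length_m / NU
            cf = 0.074 / re**0.2
            return 0.5 * RHO * v**2 * wetted_area_m2 * cf

        for v in (1.3, 1.9, 2.5):
            d_total = total_drag(v)
            d_fric = friction_drag(v)
            print(f"v={v:.1f} m/s  D={d_total:6.1f} N  Df={d_fric:5.1f} N  Df/D={100 * d_fric / d_total:4.1f}%")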

  7. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  8. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  9. Analytic H I-to-H2 Photodissociation Transition Profiles

    NASA Astrophysics Data System (ADS)

    Bialy, Shmuel; Sternberg, Amiel

    2016-05-01

    We present a simple analytic procedure for generating atomic (H I) to molecular (H2) density profiles for optically thick hydrogen gas clouds illuminated by far-ultraviolet radiation fields. Our procedure is based on the analytic theory for the structure of one-dimensional H I/H2 photon-dominated regions, presented by Sternberg et al. Depth-dependent atomic and molecular density fractions may be computed for arbitrary gas density, far-ultraviolet field intensity, and the metallicity-dependent H2 formation rate coefficient and dust absorption cross section in the Lyman-Werner photodissociation band. We use our procedure to generate a set of H I-to-H2 transition profiles for a wide range of conditions, from the weak- to strong-field limits, and from super-solar down to low metallicities. We show that if presented as functions of dust optical depth, the H I and H2 density profiles depend primarily on the Sternberg "αG parameter" (dimensionless) that determines the dust optical depth associated with the total photodissociated H I column. We derive a universal analytic formula for the H I-to-H2 transition points as a function of just αG. Our formula will be useful for interpreting emission-line observations of H I/H2 interfaces, for estimating star formation thresholds, and for sub-grid components in hydrodynamics simulations.
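
    The scaling with αG can be illustrated with a short sketch (using the closed-form relation from the Sternberg et al. theory on which this work builds; the dust cross section and parameter values below are assumptions, and this is not the authors' new transition-point formula): in the optically thick limit the total photodissociated H I column obeys sigma_g * N_HI = ln(alphaG/2 + 1), so the dust optical depth of the atomic layer, and hence the depth of the H I-to-H2 transition, grows only logarithmically with αG.

        import numpy as np

        SIGMA_G = 1.9e-21    # LW-band dust absorption cross section per H nucleon, cm^2 (solar-metallicity value, assumed)

        def hi_column(alpha_G):
            """Total H I column (cm^-2) of the photodissociated layer, optically thick limit."""
            return np.log(alpha_G / 2.0 + 1.0) / SIGMA_G

        for aG in (0.1, 1.0, 10.0, 100.0):
            N = hi_column(aG)
            print(f"alphaG = {aG:6.1f}  ->  N(HI) ~ {N:.2e} cm^-2, tau_dust ~ {SIGMA_G * N:.2f}")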

  10. Pre-analytic evaluation of volumetric absorptive microsampling and integration in a mass spectrometry-based metabolomics workflow.

    PubMed

    Volani, Chiara; Caprioli, Giulia; Calderisi, Giovanni; Sigurdsson, Baldur B; Rainer, Johannes; Gentilini, Ivo; Hicks, Andrew A; Pramstaller, Peter P; Weiss, Guenter; Smarason, Sigurdur V; Paglia, Giuseppe

    2017-10-01

    Volumetric absorptive microsampling (VAMS) is a novel approach that allows single-drop (10 μL) blood collection. Integration of VAMS with mass spectrometry (MS)-based untargeted metabolomics is an attractive solution for both human and animal studies. However, to boost the use of VAMS in metabolomics, key pre-analytical questions need to be addressed. Therefore, in this work, we integrated VAMS into an MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability under different storage conditions. We first evaluated the best extraction procedure for the polar metabolome and found that the highest number and amount of metabolites were recovered upon extraction with acetonitrile/water (70:30). In contrast, basic conditions (pH 9) resulted in divergent metabolite profiles, mainly owing to the extraction of intracellular metabolites originating from red blood cells. In addition, prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but once the VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months. The time used for drying the sample also affected the metabolome: some metabolites were rapidly degraded or accumulated in the sample during the first 48 h at room temperature, indicating that a longer drying step will significantly change the concentrations measured in the sample.

  11. Metrological approach to quantitative analysis of clinical samples by LA-ICP-MS: A critical review of recent studies.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta

    2018-05-15

    Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. To obtain reliable results, the developed procedure must be based not only on properly prepared samples and properly performed calibration; it is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, the establishment of the traceability of the measurement results and the estimation of their uncertainty. This review discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles into chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a rigorous approach to the development of the analytical procedure in order to obtain reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Green procedure using limonene in the Dean-Stark apparatus for moisture determination in food products.

    PubMed

    Veillet, Sébastien; Tomao, Valérie; Ruiz, Karine; Chemat, Farid

    2010-07-26

    In the past 10 years, trends in analytical chemistry have turned toward green chemistry, which endeavours to develop new techniques that reduce the influence of chemicals on the environment. The challenge of green analytical chemistry is to develop techniques that meet the demand for information output while reducing the environmental impact of the analyses. For this purpose petroleum-based solvents have to be avoided, and increasing interest has therefore been given to new green solvents such as limonene and to their potential as alternative solvents in analytical chemistry. In this work limonene was used instead of toluene in the Dean-Stark procedure. Moisture determination on a wide range of food matrices was performed using either toluene or limonene. Both solvents gave similar water percentages in food materials, i.e. 89.3 ± 0.5 and 89.5 ± 0.7 for carrot, 68.0 ± 0.7 and 68.6 ± 1.9 for garlic, and 64.1 ± 0.5 and 64.0 ± 0.3 for minced meat with toluene and limonene, respectively. Consequently, limonene could be used as a good alternative solvent in the Dean-Stark procedure. Copyright 2010 Elsevier B.V. All rights reserved.

  13. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
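
    As an illustration of the allocation step mentioned above, the sketch below sets up a minimal land-allocation linear program; the land uses, returns, costs and limits are hypothetical, and this is not the PUBLIC or MAGE5 tooling described in the report.

```python
from scipy.optimize import linprog

# Hypothetical allocation LP: choose acres of three land uses to maximize
# return subject to total-area and budget limits. Figures are illustrative
# only; this is not the LP formulation used in the cited work.
returns = [-120.0, -80.0, -40.0]     # negated, because linprog minimizes
A_ub = [[1, 1, 1],                   # total acres available
        [60, 25, 10]]                # management cost per acre vs budget
b_ub = [1000, 35000]

res = linprog(c=returns, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, -res.fun)               # acreage per use, total return
```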

  14. Availability Analysis of Dual Mode Systems

    DOT National Transportation Integrated Search

    1974-04-01

    The analytical procedures presented define a method of evaluating the effects of failures in a complex dual-mode system based on a worst case steady-state analysis. The computed result is an availability figure of merit and not an absolute prediction...

  15. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  17. Protocol for Detection of Yersinia pestis in Environmental ...

    EPA Pesticide Factsheets

    Methods report. This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a sample processing procedure for each sample type in a step-by-step manner. The protocol includes real-time PCR, traditional microbiological culture, and the Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because the protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.

  18. On the wing behaviour of the overtones of self-localized modes

    NASA Astrophysics Data System (ADS)

    Dusi, R.; Wagner, M.

    1998-08-01

    In this paper the solutions for self-localized modes in a nonlinear chain are investigated. We present a converging iteration procedure, which is based on analytical information of the wings and which takes into account higher overtones of the solitonic oscillations. The accuracy is controlled in a step by step manner by means of a Gaussian error analysis. Our numerical procedure allows for highly accurate solutions, in all anharmonicity regimes, and beyond the rotating-wave approximation (RWA). It is found that the overtone wings change their analytical behaviour at certain critical values of the energy of the self-localized mode: there is a turnover in the exponent of descent. The results are shown for a Fermi-Pasta-Ulam (FPU) chain with quartic anharmonicity.

  19. Evaluation of Superparamagnetic Silica Nanoparticles for Extraction of Triazines in Magnetic in-Tube Solid Phase Microextraction Coupled to Capillary Liquid Chromatography

    PubMed Central

    González-Fuenzalida, R. A.; Moliner-Martínez, Y.; Prima-Garcia, Helena; Ribera, Antonio; Campins-Falcó, P.; Zaragozá, Ramon J.

    2014-01-01

    The use of magnetic nanomaterials for analytical applications has increased in recent years. In particular, magnetic nanomaterials have shown great potential as adsorbent phases in several extraction procedures owing to significant advantages over conventional methods. In the present work, the influence of magnetic forces on the extraction efficiency of triazines using superparamagnetic silica nanoparticles (NPs) in magnetic in-tube solid phase microextraction (Magnetic-IT-SPME) coupled to CapLC has been evaluated. Atrazine, terbutylazine and simazine were selected as target analytes. The superparamagnetic silica nanomaterial (SiO2-Fe3O4) deposited onto the surface of a capillary column gave rise to a magnetic extraction phase for IT-SPME that enhanced the extraction efficiency for triazines. This improvement is based on two phenomena: the superparamagnetic behavior of the Fe3O4 NPs and the diamagnetic repulsions that take place in a microfluidic device such as a capillary column. A systematic study of analyte adsorption and desorption was conducted as a function of the magnetic field and of its relationship with the magnetic susceptibility of the triazines. The positive influence of magnetism on the extraction procedure was demonstrated. The analytical characteristics of the optimized procedure were established, and the method was applied to the determination of the target analytes in water samples with satisfactory results. When coupling Magnetic-IT-SPME with CapLC, improved adsorption efficiencies (60%–63%) were achieved compared with conventional adsorption materials (0.8%–3%). PMID:28344221

  20. Development and applications of two computational procedures for determining the vibration modes of structural systems. [aircraft structures - aerospaceplanes

    NASA Technical Reports Server (NTRS)

    Kvaternik, R. G.

    1975-01-01

    Two computational procedures for analyzing complex structural systems for their natural modes and frequencies of vibration are presented. Both procedures are based on a substructures methodology, and both employ the finite-element stiffness method to model the constituent substructures. The first procedure is a direct method based on solving the eigenvalue problem associated with a finite-element representation of the complete structure. The second procedure is a component-mode synthesis scheme in which the vibration modes of the complete structure are synthesized from modes of the substructures into which the structure is divided. The analytical basis of the methods combines features that enhance the generality of the procedures, which are versatile, computationally convenient, and straightforward to implement. Both procedures were implemented in special-purpose computer programs. The results of applying these programs to several structural configurations are shown and compared with experiment.
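
    Both procedures ultimately reduce to a generalized eigenvalue problem of the form Kφ = ω²Mφ for the finite-element stiffness and mass matrices. The sketch below solves such a problem for a toy three-degree-of-freedom model; the stiffness and mass values are made up and stand in for an assembled substructure.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF spring-mass model standing in for a finite-element substructure;
# the stiffness and mass values are invented for illustration.
k = 1.0e6
K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)   # stiffness matrix, N/m
M = np.diag([10.0, 10.0, 5.0])                   # lumped mass matrix, kg

eigvals, modes = eigh(K, M)                      # generalized problem K*phi = w^2 * M*phi
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)        # natural frequencies, Hz
print(freqs_hz)
```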

  1. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining importance. It focuses on the total costs of a process from investment through operation to final retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and, finally, retirement of the method. Regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design-based method development results in increased method robustness, which in turn reduces the effort needed for method performance verification and post-approval changes and minimizes the risk of method-related out-of-specification results. This contributes strongly to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
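
    A minimal numerical sketch of the kind of uncertainty budget discussed above is shown below, treating the reproducibility standard deviation as the dominant component and combining it with bias and calibration contributions in quadrature; all figures are hypothetical placeholders rather than values from the paper.

```python
import math

# Minimal sketch of a Eurachem/CITAC-style uncertainty budget in which the
# method's reproducibility standard deviation dominates. All numbers are
# hypothetical placeholders.
s_R    = 0.8    # reproducibility standard deviation, mg/L
u_bias = 0.3    # standard uncertainty associated with method/recovery bias, mg/L
u_cal  = 0.1    # calibration standard uncertainty, mg/L

u_c = math.sqrt(s_R**2 + u_bias**2 + u_cal**2)   # combined standard uncertainty
U = 2 * u_c                                       # expanded uncertainty, k = 2 (~95 %)
print(round(u_c, 2), round(U, 2))
```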

  3. Mass Spectrometric Rapid Diagnosis of Infectious Diseases.

    DTIC Science & Technology

    1980-01-01

    the analytical procedures to warrant reporting anew the whole analytical procedure. A. Sample Collection and Storage Procedure Urine samples were...positives or false-negatives. Next we have carried out a longitudinal study on urine samples obtained from groups of volunteer subjects vaccinated with...sterilization and storage procedures. 2. Developed new, simpler sample preparation techniques including one to handle tissue culture media. 3. Improved on the

  4. Multianalyte imaging in one-shot format sensors for natural waters.

    PubMed

    Lapresta-Fernández, A; Huertas, Rafael; Melgosa, Manuel; Capitán-Vallvey, L F

    2009-03-23

    A one-shot multisensor based on ionophore-chromoionophore chemistry for optical monitoring of potassium, magnesium and hardness in water is presented. The analytical procedure uses a black-and-white non-cooled CCD camera for image acquisition of the one-shot multisensor after reaction, followed by data treatment for quantitation using the grey-value pixel average from a defined region of interest in each sensing area to build the analytical parameter 1-alpha. Under optimised experimental conditions, the procedure shows a large linear range, up to 6 orders of magnitude using the linearised model, and good detection limits: 9.92 × 10⁻⁵ mM, 1.86 × 10⁻³ mM and 1.30 × 10⁻² mg L⁻¹ of CaCO3 for potassium, magnesium and hardness, respectively. The analysis system exhibits good precision in terms of relative standard deviation (RSD%): 2.3-3.8 for potassium, 5.0-6.8 for magnesium and 5.4-5.9 for hardness. The trueness of the multisensor procedure was demonstrated by comparing it with results obtained with a DAD spectrophotometer used as a reference. Finally, it was satisfactorily applied to the analysis of these analytes in miscellaneous samples, such as water and beverage samples of different origins, validating the results against atomic absorption spectrometry (AAS) as the reference procedure.

  5. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE... Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  6. Ultraviolet, Visible, and Fluorescence Spectroscopy

    NASA Astrophysics Data System (ADS)

    Penner, Michael H.

    Spectroscopy in the ultraviolet-visible (UV-Vis) range is one of the most commonly encountered laboratory techniques in food analysis. Diverse examples, such as the quantification of macrocomponents (total carbohydrate by the phenol-sulfuric acid method), quantification of microcomponents (thiamin by the thiochrome fluorometric procedure), estimates of rancidity (lipid oxidation status by the thiobarbituric acid test), and surveillance testing (enzyme-linked immunoassays), are presented in this text. In each of these cases, the analytical signal on which the assay is based is either the emission or the absorption of radiation in the UV-Vis range. This signal may be inherent in the analyte, such as the absorbance of radiation in the visible range by pigments, or the result of a chemical reaction involving the analyte, such as the colorimetric copper-based Lowry method for the analysis of soluble protein.
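
    The quantitative assays listed above ultimately rest on the Beer-Lambert relation between absorbance and concentration, sketched here; the numbers in the worked example are hypothetical.

```latex
% Beer--Lambert law underlying absorbance-based quantification:
\[
  A \;=\; \varepsilon\,b\,c
  \qquad\Longrightarrow\qquad
  c \;=\; \frac{A}{\varepsilon\,b},
\]
% where A is the measured absorbance, \varepsilon the molar absorptivity
% (L mol^{-1} cm^{-1}), b the path length (cm) and c the analyte concentration.
% Worked example with hypothetical numbers: A = 0.42, \varepsilon = 6220,
% b = 1 cm gives c = 0.42/6220 \approx 6.8\times10^{-5} mol/L.
```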

  7. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    PubMed

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question and including patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, which relies on a variety of manual activities, is the most vulnerable part of the total testing process; it is a major determinant of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, or inappropriate mixing of the sample. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. These errors can also have clinical consequences and a significant impact on patient care, especially for specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical, since they directly influence the quality of results and their clinical reliability. Accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable coagulation test results and should reduce the effects of the influencing factors. This review summarizes the most important recommendations regarding pre-analytical factors for coagulation testing and should serve as a tool to increase awareness of their importance.

  8. Major to ultra trace element bulk rock analysis of nanoparticulate pressed powder pellets by LA-ICP-MS

    NASA Astrophysics Data System (ADS)

    Peters, Daniel; Pettke, Thomas

    2016-04-01

    An efficient, clean procedure for bulk rock major to trace element analysis by 193 nm excimer LA-ICP-MS analysis of nanoparticulate pressed powder pellets (PPPs) employing a binder is presented. Sample powders are milled in water suspension in a planetary ball mill, reducing the average grain size by about one order of magnitude compared with common dry milling protocols. Microcrystalline cellulose (MCC) is employed as a binder, improving the mechanical strength of the PPP and the ablation behaviour, because MCC absorbs 193 nm laser light well. Use of the MCC binder allows cohesive pellets to be produced from materials that cannot be pelletized in their pure form, such as quartz powder. Rigorous blank quantification was performed on synthetic quartz treated like the rock samples, demonstrating that procedural blanks are irrelevant except for a few elements at the 10 ng g⁻¹ concentration level. The LA-ICP-MS PPP analytical procedure was optimised and evaluated using six different SRM powders (JP-1, UB-N, BCR-2, GSP-2, OKUM, and MUH-1). Calibration based on external standardization using SRM 610, SRM 612, BCR-2G, and GSD-1G glasses allows for evaluation of possible matrix effects during LA-ICP-MS analysis. The data accuracy of the PPP LA-ICP-MS analytical procedure compares well with that achieved for liquid ICP-MS and LA-ICP-MS glass analysis, except for element concentrations below ~30 ng g⁻¹, where liquid ICP-MS offers more precise data and in part lower limits of detection. Uncertainties on the external reproducibility of LA-ICP-MS PPP element concentrations are of the order of 0.5 to 2% (1σ standard deviation) for concentrations exceeding ~1 μg g⁻¹. For lower element concentrations these uncertainties increase to 5-10% or higher as analyte-dependent limits of detection (LOD) are approached, and LODs do not differ significantly from those of glass analysis. Sample homogeneity is demonstrated by the high analytical precision, except for a few elements where grain-size effects can occasionally still be resolved analytically. Matrix effects are demonstrated for PPP analysis of diverse rock compositions and for basalt glass analysis when externally calibrated against SRM 610 and SRM 612 glasses; employing basalt glass GSD-1G or BCR-2G for external standardisation essentially eliminates these problems. Perhaps the most prominent advance of the LA-ICP-MS PPP analytical procedure presented here is that trace elements not commonly analysed, i.e. new, unconventional geochemical tracers, can be measured straightforwardly, including volatile elements, the flux elements Li and B, the chalcophile elements As, Sb, Tl, and Bi, and elements that alloy with the metal containers employed in conventional glass production approaches. The method presented here thus overcomes many common problems and limitations in analytical geochemistry and is shown to be an efficient alternative for bulk rock trace element analysis.

  9. An investigation of several factors involved in a finite difference procedure for analyzing the transonic flow about harmonically oscillating airfoils and wings

    NASA Technical Reports Server (NTRS)

    Ehlers, F. E.; Sebastian, J. D.; Weatherill, W. H.

    1979-01-01

    Analytical and empirical studies of a finite difference method for the solution of the transonic flow about harmonically oscillating wings and airfoils are presented. The procedure is based on separating the velocity potential into steady and unsteady parts and linearizing the resulting unsteady equations for small disturbances. Since sinusoidal motion is assumed, the unsteady equation is independent of time. Three finite difference investigations are discussed including a new operator for mesh points with supersonic flow, the effects on relaxation solution convergence of adding a viscosity term to the original differential equation, and an alternate and relatively simple downstream boundary condition. A method is developed which uses a finite difference procedure over a limited inner region and an approximate analytical procedure for the remaining outer region. Two investigations concerned with three-dimensional flow are presented. The first is the development of an oblique coordinate system for swept and tapered wings. The second derives the additional terms required to make row relaxation solutions converge when mixed flow is present. A finite span flutter analysis procedure is described using the two-dimensional unsteady transonic program with a full three-dimensional steady velocity potential.
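
    The harmonic splitting described above can be sketched in the following notation (illustrative only, not the report's exact equations):

```latex
% Illustrative notation only: the velocity potential is split into a steady
% part and a small harmonic perturbation,
\[
  \Phi(\mathbf{x},t) \;=\; \phi_0(\mathbf{x}) \;+\;
  \operatorname{Re}\!\left[\,\phi_1(\mathbf{x})\,e^{\,i\omega t}\right],
  \qquad |\phi_1| \ll |\phi_0|,
\]
% so that, after linearization in \phi_1, the unsteady equation contains the
% frequency \omega only as a parameter and is independent of time.
```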

  10. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    PubMed

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partitioning of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were each observed in 12 analytes. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of the 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
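
    A minimal sketch of a parametric, transformation-based reference interval of the kind described above is given below; it uses scipy's standard Box-Cox transform as a stand-in for the study's modified Box-Cox formula, simulated values rather than study data, and omits the latent abnormal values exclusion step.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Sketch of a parametric reference interval via power transformation.
# scipy's standard Box-Cox stands in for the study's "modified Box-Cox
# formula", and the data below are simulated, not study data.
rng = np.random.default_rng(0)
values = rng.lognormal(mean=3.0, sigma=0.25, size=500)   # simulated analyte results

transformed, lam = stats.boxcox(values)
mu, sd = transformed.mean(), transformed.std(ddof=1)
limits_t = np.array([mu - 1.96 * sd, mu + 1.96 * sd])    # central 95 % on transformed scale

lower, upper = inv_boxcox(limits_t, lam)                 # back-transform to original scale
print(round(lower, 1), round(upper, 1))
```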

  11. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China

    PubMed Central

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li’an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-01-01

    Abstract A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partitioning of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box–Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were each observed in 12 analytes. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of the 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China. PMID:26945390

  12. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
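
    The computation defined in this appendix is, in essence, the replicate-based formula MDL = t(n-1, 0.99) × s. The sketch below applies it to made-up replicate results; the spiked concentrations shown are illustrative only.

```python
import numpy as np
from scipy import stats

# Replicate-based MDL: analyze >= 7 replicate spiked aliquots, then
# MDL = t(n-1, 0.99) * s, the one-sided 99 % Student's t times the replicate
# standard deviation. Replicate results below are made-up example values.
replicates = np.array([1.9, 2.2, 2.1, 1.8, 2.3, 2.0, 2.1])   # ug/L, hypothetical

n = replicates.size
s = replicates.std(ddof=1)
t99 = stats.t.ppf(0.99, df=n - 1)     # 3.143 for n = 7
mdl = t99 * s
print(round(mdl, 2))
```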

  13. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  14. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  15. Glyoxal and methylglyoxal as urinary markers of diabetes. Determination using a dispersive liquid-liquid microextraction procedure combined with gas chromatography-mass spectrometry.

    PubMed

    Pastor-Belda, M; Fernández-García, A J; Campillo, N; Pérez-Cárceles, M D; Motas, M; Hernández-Córdoba, M; Viñas, P

    2017-08-04

    Glyoxal (GO) and methylglyoxal (MGO) are α-oxoaldehydes that can be used as urinary diabetes markers. In this study, their levels were measured using a sample preparation procedure based on salting-out assisted liquid-liquid extraction (SALLE) and dispersive liquid-liquid microextraction (DLLME) combined with gas chromatography-mass spectrometry (GC-MS). The derivatization reaction with 2,3-diaminonaphthalene, the addition of acetonitrile and sodium chloride to the urine, and the DLLME step using the acetonitrile extract as dispersant solvent and carbon tetrachloride as extractant solvent were carefully optimized. Quantification was performed by the internal standard method, using 5-bromo-2-chloroanisole. The intraday and interday precisions were lower than 6%. Limits of detection were 0.12 and 0.06 ng mL⁻¹, and enrichment factors were 140 and 130 for GO and MGO, respectively. The concentrations of these α-oxoaldehydes in urine were between 0.9 and 35.8 ng g⁻¹ (creatinine adjusted). A statistical comparison of the analyte contents of urine samples from non-diabetic and diabetic patients pointed to significant differences (P = 0.046, 24 subjects investigated), particularly for MGO, which was higher in diabetic patients. The novelty of this study compared with previous procedures lies in the treatment of the urine sample by SALLE, based on the addition of acetonitrile and sodium chloride to the urine. The DLLME procedure is performed with a sedimented drop of the extractant solvent, without a surfactant reagent, and using acetonitrile as dispersant solvent. Separation was performed by GC with MS detection, allowing unequivocal identification of the analytes. The proposed procedure is the first microextraction method applied to the analysis of urine samples from diabetic and non-diabetic patients that allows a clear differentiation between the two groups using a simple analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
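
    Quantification by the internal standard method, as mentioned above, amounts to regressing the analyte-to-internal-standard peak-area ratio against standard concentrations and reading unknowns off the fitted line. The sketch below is a generic illustration with hypothetical values, not the calibration data of this study.

```python
import numpy as np

# Generic internal-standard quantification sketch (hypothetical values, not
# the cited study's calibration data): regress the analyte/IS peak-area ratio
# against standard concentrations, then interpolate the unknown.
conc_std   = np.array([0.5, 1.0, 2.5, 5.0, 10.0])        # ng/mL, hypothetical standards
area_ratio = np.array([0.11, 0.21, 0.52, 1.05, 2.08])    # analyte area / IS area

slope, intercept = np.polyfit(conc_std, area_ratio, 1)

unknown_ratio = 0.74                                      # measured in a urine extract
conc_unknown = (unknown_ratio - intercept) / slope
print(round(conc_unknown, 2), "ng/mL")
```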

  16. Analytical method for dissolved-organic carbon fractionation

    USGS Publications Warehouse

    Leenheer, Jerry A.; Huffman, Edward W. D.

    1979-01-01

    A standard procedure for analytical-scale dissolved organic carbon fractionation is presented, whereby dissolved organic carbon in water is first fractionated by a nonionic macroreticular resin into acid, base, and neutral hydrophobic organic solute fractions, and next fractionated by ion-exchange resins into acid, base, and neutral hydrophilic solute fractions. The hydrophobic solutes are defined as those sorbed on a nonionic, acrylic-ester macroreticular resin and are differentiated into acid, base, and neutral fractions by sorption/desorption controlled by pH adjustment. The hydrophilic bases are next sorbed on a strong-acid ion-exchange resin, followed by sorption of hydrophilic acids on a strong-base ion-exchange resin. Hydrophilic neutrals are not sorbed and remain dissolved in the deionized water at the end of the fractionation procedure. The complete fractionation can be performed on a 200-milliliter filtered water sample whose dissolved organic carbon content is 5-25 mg/L and whose specific conductance is less than 2,000 μmhos/cm at 25°C. The applications of dissolved organic carbon fractionation analysis range from field studies of changes in organic solute composition with synthetic fossil fuel production to fundamental studies of the nature of sorption processes.

  17. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York - July 2001 Through June 2003

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2009-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. 
Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 78 percent of
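
    The triplicate-based precision measure used throughout these reports is the coefficient of variation; a minimal sketch with placeholder values and a placeholder 10 percent objective is shown below.

```python
import numpy as np

# Precision check of the kind described: coefficient of variation (CV, %) of
# triplicate determinations, compared against a data-quality objective.
# Values and the 10 % objective are illustrative placeholders.
triplicate = np.array([4.1, 4.3, 4.0])        # e.g. chloride, mg/L

cv = 100.0 * triplicate.std(ddof=1) / triplicate.mean()
print(round(cv, 1), "% ->", "meets objective" if cv <= 10.0 else "fails objective")
```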

  18. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling and to the blood sampling procedure, processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are acknowledged to be essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  19. Request Pattern, Pre-Analytical and Analytical Conditions of Urinalysis in Primary Care: Lessons from a One-Year Large-Scale Multicenter Study.

    PubMed

    Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos

    2018-06-01

    To study urinalysis requesting, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study, reporting 232.5 urinalyses/1,000 inhabitants. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8% of cases, between 2 and 4 hours in 78.3%, and between 4 and 6 hours in the remaining 2.9%. 92.5% combined the use of a test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was requested frequently. There was a lack of compliance with guidelines regarding the time between micturition and analysis; testing usually involved a strip test followed by particle analysis.

  20. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to incorrect pipetting volume due to air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
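
    One way to render the verbal definition above as a flagging rule is sketched below; the exact combination of uncertainty and bias is the authors' own, so the form shown should be read as an illustration rather than their formula.

```latex
% Illustrative flagging rule (a sketch, not the authors' exact formula): a
% result x for an individual sample is an irregular analytical error if its
% deviation from the reference-procedure value x_ref exceeds what the routine
% process can explain,
\[
  \bigl| x - x_{\mathrm{ref}} \bigr|
  \;>\;
  k\,u_{\mathrm{proc}} \;+\; \bigl| b_{\mathrm{method}} \bigr|,
\]
% with u_proc the process measurement uncertainty, b_method the method bias
% against the reference measurement system, and k a coverage factor.
```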

  1. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
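
    A minimal sketch of GP-based calibration in a drifting environment is given below, modelling the sensor response as a function of both analyte concentration and an environmental factor; the data are simulated and a generic scikit-learn regressor is used, not the authors' batch-sequential experimental design procedure.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Sketch of GP-based calibration under drift: the sensor response is modelled
# as a function of analyte concentration AND an environmental factor, so that
# predictions carry uncertainty estimates. Data are simulated; this is a
# generic scikit-learn GP, not the authors' batch-sequential design procedure.
rng = np.random.default_rng(1)
conc = rng.uniform(0, 10, 60)          # analyte concentration
temp = rng.uniform(15, 35, 60)         # drifting environmental factor
X = np.column_stack([conc, temp])
y = 2.0 * conc + 0.15 * (temp - 25) + rng.normal(0, 0.2, 60)   # simulated response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=[2.0, 5.0]) + WhiteKernel(0.05),
                              normalize_y=True)
gp.fit(X, y)

y_pred, y_std = gp.predict(np.array([[5.0, 30.0]]), return_std=True)
print(y_pred[0], y_std[0])             # prediction with its uncertainty
```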

  2. Isolation and purification of all-trans diadinoxanthin and all-trans diatoxanthin from diatom Phaeodactylum tricornutum.

    PubMed

    Kuczynska, Paulina; Jemiola-Rzeminska, Malgorzata

    2017-01-01

    Two diatom-specific carotenoids are engaged in the diadinoxanthin cycle, an important mechanism which protects these organisms against photoinhibition caused by absorption of excessive light energy. A high-performance and economical four-step procedure for the isolation and purification of diadinoxanthin and diatoxanthin from the marine diatom Phaeodactylum tricornutum has been developed. It is based on the use of commonly available materials and does not require advanced technology. Extraction of pigments, saponification, separation by partition and then open column chromatography, which together comprise the complete experimental procedure, can be performed within 2 days. The method yields HPLC-grade diadinoxanthin and diatoxanthin with a purity of 99% or more, with an efficiency estimated at 63% for diadinoxanthin and 73% for diatoxanthin. Carefully selected diatom culture conditions, as well as analytical conditions, ensure highly reproducible performance. The protocol can be used to isolate and purify the diadinoxanthin cycle pigments on both analytical and preparative scales.

  3. Ontological Foundations for Tracking Data Quality through the Internet of Things.

    PubMed

    Ceusters, Werner; Bona, Jonathan

    2016-01-01

    Amongst the positive outcomes expected from the Internet of Things for Health are longitudinal patient records that are more complete and less error-prone, because manual data entry is complemented by automatic data feeds from sensors. Unfortunately, devices are fallible too. Quality control procedures such as inspection, testing and maintenance can prevent devices from producing errors. The additional approach envisioned here is to establish constant data quality monitoring through analytics procedures on patient data that exploit not only the ontological principles ascribed to patients and their bodily features, but also those ascribed to the observation and measurement processes in which devices and patients participate, including the, perhaps erroneous, representations that are generated. Using existing realism-based ontologies, we propose a set of categories that such analytics procedures should be able to reason with, and we highlight the importance of uniquely identifying not only patients, caregivers and devices, but everything involved in those measurements. This approach supports the thesis that the majority of what tends to be viewed as 'metadata' are actually data about first-order entities.

  4. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations, which have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to address the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, including "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  5. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York - July 2005 through June 2007

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2009-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2005 through June 2007. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: total aluminum, calcium, magnesium, nitrate (colorimetric method), potassium, silicon, sodium, and sulfate. Eight of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: total aluminum, calcium, dissolved organic carbon, chloride, nitrate (ion chromatograph), potassium, silicon, and sulfate. The magnesium and pH procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The acid-neutralizing capacity, total monomeric aluminum, nitrite, and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicated that the procedures for 16 of 17 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 93 percent of the samples met data-quality objectives for all analytes except acid-neutralizing capacity (85 percent of samples met objectives), total monomeric aluminum (83 percent of samples met objectives), total aluminum (85 percent of samples met objectives), and chloride (85 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project met the Troy Laboratory data-quality objectives for 87 percent of the samples analyzed. The P-sample (low-ionic-strength constituents) analysis had two outliers each in two studies. The T-sample (trace constituents) analysis and the N-sample (nutrient constituents) analysis had one outlier each in two studies. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 85 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were acid-neutralizing capacity, total aluminum and ammonium. Data-quality objectives were not met in 41 percent of samples analyzed for acid-neutralizing capacity, 50 percent of samples analyzed for total aluminum, and 44 percent of samples analyzed for ammonium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 76 percent of the samples analyzed for chloride, 80 percent of the samples analyzed for specific conductance, and 77 percent of the samples analyzed for sulfate.
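    A minimal sketch of the triplicate precision check described above, in Python; the analyte names, replicate concentrations, and the 10-percent objective used here are purely illustrative (the report defines its own objective for each analyte):

      import statistics

      def coefficient_of_variation(values):
          """Coefficient of variation (percent) of replicate results."""
          mean = statistics.mean(values)
          return 100.0 * statistics.stdev(values) / mean if mean else float("inf")

      # Hypothetical triplicate results (mg/L) and a hypothetical precision objective.
      triplicates = {"calcium": [1.02, 1.05, 1.00], "chloride": [0.48, 0.55, 0.51]}
      OBJECTIVE_PERCENT = 10.0

      for analyte, values in triplicates.items():
          cv = coefficient_of_variation(values)
          status = "meets" if cv <= OBJECTIVE_PERCENT else "fails"
          print(f"{analyte}: CV = {cv:.1f}% ({status} objective)")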

  6. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    ERIC Educational Resources Information Center

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  7. Comparison of three multiplex cytokine analysis systems: Luminex, SearchLight and FAST Quant.

    PubMed

    Lash, Gendie E; Scaife, Paula J; Innes, Barbara A; Otun, Harry A; Robson, Steven C; Searle, Roger F; Bulmer, Judith N

    2006-02-20

    Multiplex cytokine analysis technologies have become readily available in the last five years. Two main formats exist: multiplex sandwich ELISA and bead based assays. While these have each been compared to individual ELISAs, there has been no direct comparison between the two formats. We report here the comparison of two multiplex sandwich ELISA procedures (FAST Quant and SearchLight) and a bead based assay (UpState Luminex). All three kits differed from each other for different analytes and there was no clear pattern of one system giving systematically different results than another for any analyte studied. We suggest that each system has merits and several factors including range of analytes available, prospect of development of new analytes, dynamic range of the assay, sensitivity of the assay, cost of equipment, cost of consumables, ease of use and ease of data analysis need to be considered when choosing a system for use. We also suggest that results obtained from different systems cannot be combined.

  8. Nine-analyte detection using an array-based biosensor

    NASA Technical Reports Server (NTRS)

    Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.

    2002-01-01

    A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.

  9. Gas chromatography with mass spectrometry for the determination of phthalates preconcentrated by microextraction based on an ionic liquid.

    PubMed

    Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel

    2017-03-01

    A new procedure is proposed for the analysis of migration test solutions obtained from plastic bottles used in the packaging of edible oils. Ultrasound-assisted emulsification microextraction with ionic liquids was applied for the preconcentration of six phthalate esters: dimethylphthalate, diethylphthalate, di-n-butylphthalate, n-butylbenzylphthalate, di-2-ethylhexylphthalate, and di-n-octylphthalate. The enriched ionic liquid was directly analyzed by gas chromatography and mass spectrometry using direct insert microvial thermal desorption. The different factors affecting the microextraction efficiency, such as volume of the extracting phase (30 μL of the ionic liquid) and ultrasound application time (25 s), and the thermal desorption step, such as desorption temperature and time, and gas flow rate, were studied. Under the selected conditions, detection limits for the analytes were in the 0.012-0.18 μg/L range, while recovery assays provided values ranging from 80 to 112%. The use of butyl benzoate as internal standard increased the reproducibility of the analytical procedure. When the release of the six phthalate esters from the tested plastic bottles to liquid simulants was monitored using the optimized procedure, analyte concentrations of between 1.0 and 273 μg/L were detected. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Analytical one-dimensional model for laser-induced ultrasound in planar optically absorbing layer.

    PubMed

    Svanström, Erika; Linder, Tomas; Löfqvist, Torbjörn

    2014-03-01

    Ultrasound generated by means of laser-based photoacoustic principles is in common use today, and applications can be found in biomedical diagnostics, non-destructive testing, and materials characterisation. For certain measurement applications it could be beneficial to shape the spectral properties and temporal profile of the generated ultrasound. To address this, we studied the generation and propagation of laser-induced ultrasound in a planar, layered structure. We derived an analytical expression for the induced pressure wave, including different physical and optical properties of each layer. A Laplace transform approach was employed in analytically solving the resulting set of photoacoustic wave equations. The results agree with simulations and were compared with experimental results. To enable the comparison between the recorded voltage from the experiments and the calculated pressure, we employed a system identification procedure based on physical properties of the ultrasonic transducer to convert the calculated acoustic pressure to voltages. We found reasonable agreement between experimentally obtained voltages and the voltages determined from the calculated acoustic pressure, for the samples studied. The system identification procedure was found to be unstable, however, possibly because film adhesives and coatings in the experiment violated the material-isotropy assumptions. The presented analytical model can serve as a basis when addressing the inverse problem of shaping an acoustic pulse from absorption of a laser pulse in a planar layered structure of elastic materials. Copyright © 2013 Elsevier B.V. All rights reserved.
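    For orientation, the layer-by-layer problem solved here is of the familiar photoacoustic form; a generic one-dimensional statement (the notation below is assumed for illustration and is not taken verbatim from the paper) is

        \frac{\partial^2 p}{\partial t^2} - c^2 \frac{\partial^2 p}{\partial x^2} = \Gamma \frac{\partial H}{\partial t}, \qquad \Gamma = \frac{\beta c^2}{C_p},

    where p(x,t) is the acoustic pressure, c the sound speed, H(x,t) the absorbed optical power density, beta the thermal expansion coefficient, and C_p the specific heat; a Laplace transform in t reduces this to an ordinary differential equation in x within each layer, which is the step carried out analytically.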

  11. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  12. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  13. Cloud point extraction thermospray flame quartz furnace atomic absorption spectrometry for determination of ultratrace cadmium in water and urine

    NASA Astrophysics Data System (ADS)

    Wu, Peng; Zhang, Yunchang; Lv, Yi; Hou, Xiandeng

    2006-12-01

    A simple, low cost and highly sensitive method based on cloud point extraction (CPE) for separation/preconcentration and thermospray flame quartz furnace atomic absorption spectrometry was proposed for the determination of ultratrace cadmium in water and urine samples. The analytical procedure involved the formation of analyte-entrapped surfactant micelles by mixing the analyte solution with an ammonium pyrrolidinedithiocarbamate (APDC) solution and a Triton X-114 solution. When the temperature of the system was higher than the cloud point of Triton X-114, the complex of cadmium-PDC entered the surfactant-rich phase and thus separation of the analyte from the matrix was achieved. Under optimal chemical and instrumental conditions, the limit of detection was 0.04 μg/L for cadmium with a sample volume of 10 mL. The analytical results of cadmium in water and urine samples agreed well with those by ICP-MS.

  14. Determination of 232Th in urine by ICP-MS for individual monitoring purposes.

    PubMed

    Baglan, N; Cossonnet, C; Ritt, J

    2001-07-01

    Thorium occurs naturally in various ores used for industrial purposes and has numerous applications. This paper sets out to investigate urine analysis as a suitable monitoring approach for workers potentially exposed to thorium. Because of thorium's biokinetic behavior and low solubility, urinary concentrations are generally very low, and high-sensitivity analytical methods are therefore required. An analytical procedure has been developed for detecting 232Th concentrations of below 1 mBq L(-1) quickly and easily. Due to the long half-life (1.41 x 10(10) y) of 232Th, the potential of a procedure based on urine sample dilution and ICP-MS (inductively coupled plasma-mass spectrometry) measurement was investigated first. Two dilution factors were chosen: 100, which is more suitable for long-term measurement trials, and 20, which increases sensitivity. It has been shown that a 100-fold dilution can be used to measure concentrations of below 1 mBq L(-1), whereas a 20-fold one can be used to reach concentrations of below 0.06 mBq L(-1). Finally, on the basis of the limitations of the dilution-based procedure, the suitable field of application of each procedure (100-fold and 20-fold dilution, and also a chemical purification followed by an ICP-MS measurement) was determined in relation to the monitoring objectives.
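    As a rough orientation on why mBq L(-1) reporting levels are within reach of ICP-MS, the long half-life translates a 1 mBq L(-1) activity concentration into a mass concentration of a few tenths of a microgram per litre. A back-of-the-envelope conversion in Python (constants rounded; sketch only):

      import math

      AVOGADRO = 6.022e23                   # atoms per mole
      MOLAR_MASS_TH232 = 232.0              # g per mole
      HALF_LIFE_S = 1.41e10 * 3.156e7       # 1.41e10 years expressed in seconds

      decay_constant = math.log(2) / HALF_LIFE_S                        # s^-1
      specific_activity = decay_constant * AVOGADRO / MOLAR_MASS_TH232  # Bq per gram

      activity_bq_per_l = 1.0e-3            # 1 mBq/L reporting level
      mass_g_per_l = activity_bq_per_l / specific_activity
      print(f"specific activity ~ {specific_activity:.0f} Bq/g")
      print(f"1 mBq/L of 232Th ~ {mass_g_per_l * 1e6:.2f} ug/L")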

  15. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    PubMed

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples measured in a close geometry by gamma spectroscopy. The procedure uses the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The radioactivity measurements made with both detectors using the advanced analytical procedure received 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
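    For context, the coincidence summing correction factor enters the activity calculation in the usual way; one generic form (symbols assumed here, not quoted from the paper) is

        A = \frac{N_{\mathrm{net}} \cdot \mathrm{CSF}}{\varepsilon \, I_{\gamma} \, t \, m},

    where N_net is the net full-energy peak area, epsilon the full-energy peak efficiency, I_gamma the gamma emission probability, t the counting live time, and m the sample mass; CSF is the factor computed with MCNP-CP (and, for validation, ETNA) to compensate for counts lost or gained through cascade summing in close geometry, and depending on the convention adopted it may instead appear as a divisor.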

  16. Task-Analytic Design of Graphic Presentations

    DTIC Science & Technology

    1990-05-18

    important premise of Larkin and Simon's work is that, when comparing alternative presentations, it is fruitful to characterize graphic-based problem solving ... using the same information-processing models used to help understand problem solving using other representations [Newell and Simon, 1972] ... during execution of graphic presentation-based problem-solving procedures. Chapter 2 reviews other work related to the problem of designing graphic

  17. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR NEUTRAL TARGET ANALYTES (SOP-5.25)

    EPA Science Inventory

    This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.

  18. CTEPP STANDARD OPERATING PROCEDURE FOR DETECTION AND QUANTIFICATION OF TARGET ANALYTES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY (GC/MS) (SOP-5.24)

    EPA Science Inventory

    This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.

  19. An analytic survey of signing inventory procedures in Virginia.

    DOT National Transportation Integrated Search

    1972-01-01

    An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...

  20. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied for classical qualitative analysis from the 1950s to the 1970s, is revisited for use in a simple though highly efficient and green procedure for analyte preconcentration prior to its determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, of about 2.0 cm diameter, is formed on the paper to contain most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on an m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.
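    The quoted enrichment is consistent with simple mass bookkeeping, assuming most of the analyte collects in the ring zone; the ring-zone mass below is inferred from the reported numbers rather than stated in the abstract:

        \mathrm{PF} \approx \frac{m_{\mathrm{sample}}}{m_{\mathrm{ring\ zone}}} \;\Rightarrow\; m_{\mathrm{ring\ zone}} \approx \frac{600\ \mu\mathrm{L} \times 0.79\ \mathrm{g\,mL^{-1}}}{250} \approx 1.9\ \mathrm{mg}.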

  1. Magnetic solid-phase extraction using carbon nanotubes as sorbents: a review.

    PubMed

    Herrero-Latorre, C; Barciela-García, J; García-Martín, S; Peña-Crecente, R M; Otárola-Jiménez, J

    2015-09-10

    Magnetic solid-phase extraction (M-SPE) is a procedure based on the use of magnetic sorbents for the separation and preconcentration of different organic and inorganic analytes from large sample volumes. The magnetic sorbent is added to the sample solution and the target analyte is adsorbed onto the surface of the magnetic sorbent particles (M-SPs). Analyte-M-SPs are separated from the sample solution by applying an external magnetic field and, after elution with the appropriate solvent, the recovered analyte is analyzed. This approach has several advantages over traditional solid phase extraction as it avoids time-consuming and tedious on-column SPE procedures and it provides a rapid and simple analyte separation that avoids the need for centrifugation or filtration steps. As a consequence, in the past few years a great deal of research has been focused on M-SPE, including the development of new sorbents and novel automation strategies. In recent years, the use of magnetic carbon nanotubes (M-CNTs) as a sorption substrate in M-SPE has become an active area of research. These materials have exceptional mechanical, electrical, optical and magnetic properties and they also have an extremely large surface area and varied possibilities for functionalization. This review covers the synthesis of M-CNTs and the different approaches for the use of these compounds in M-SPE. The performance, general characteristics and applications of M-SPE based on magnetic carbon nanotubes for organic and inorganic analysis have been evaluated on the basis of more than 110 references. Finally, some important challenges with respect to the use of magnetic carbon nanotubes in M-SPE are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Diffusion of Super-Gaussian Profiles

    ERIC Educational Resources Information Center

    Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.

    2007-01-01

    The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…

  3. Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid:
a systematic review

    PubMed Central

    Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda

    2015-01-01

    This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy and cost-effective collection methods prompt an interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades point towards the possible utility of saliva for monitoring overall health, diagnosing and treating various oral or systemic disorders, and monitoring drugs. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to standardization of pre-analytical and analytical variables, such as collection and storage methods, analyte circadian variation, sample recovery, prevention of sample contamination and analytical procedures. In spite of all these challenges, there is an escalating evolution of knowledge with the use of this biological matrix. PMID:26110030

  4. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York-July 1997 through June 1999

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2006-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data for the time period addressed in this report were stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1997 through June 1999. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for high-concentration and (or) low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, ammonium, calcium, chloride, specific conductance, and sulfate. The data from the potassium and sodium analytical procedures are insufficient for evaluation. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 11 of 13 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. Blank analysis results for chloride showed that 22 percent of blanks did not meet data-quality objectives and results for dissolved organic carbon showed that 31 percent of the blanks did not meet data-quality objectives. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 14 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except total aluminum (70 percent of samples met objectives) and potassium (83 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality for most constituents over the time period. The P-sample (low-ionic-strength constituents) analysis had good ratings in two of the three studies and a satisfactory rating in the third. The results of the T-sample (trace constituents) analysis indicated high data quality with good ratings in all three studies. The N-sample (nutrient constituents) studies had one each of excellent, good, and satisfactory ratings. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 80 percent of the samples met data-quality objectives for 9 of the 13 analytes; the exceptions were dissolved organic carbon, ammonium, chloride, and specific conductance. Data-quality objectives were not met for dissolved organic carbon in two NWRI studies, but all of the samples were within control limits for the last study. Data-quality objectives were not met in 41 percent of samples analyzed for ammonium, 25 percent of samples analyzed for chloride, and 30 percent of samples analyzed for specific conductance. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 84 percent of the samples analyzed for calcium, chloride, magnesium, pH, and potassium. Data-quality objectives were met by 73 percent of those analyzed for sulfate. 
The data-quality objective was not met for sodium. The data are insufficient for evaluation of the specific conductance results.

  5. Field Telemetry of Blade-rotor Coupled Torsional Vibration at Matuura Power Station Number 1 Unit

    NASA Technical Reports Server (NTRS)

    Isii, Kuniyoshi; Murakami, Hideaki; Otawara, Yasuhiko; Okabe, Akira

    1991-01-01

    The quasi-modal reduction technique and finite element model (FEM) were used to construct an analytical model for the blade-rotor coupled torsional vibration of a steam turbine generator of the Matuura Power Station. A single rotor test was executed in order to evaluate umbrella vibration characteristics. Based on the single rotor test results and the quasi-modal procedure, the total rotor system was analyzed to predict coupled torsional frequencies. Finally, field measurement of the vibration of the last stage buckets was made, which confirmed that the double synchronous resonance was 124.2 Hz, meaning that the machine can be safely operated. The measured eigenvalues are very close to the predicted values. The single rotor test and this analytical procedure thus proved to be a valid technique to estimate coupled torsional vibration.

  6. High Rayleigh number convection in rectangular enclosures with differentially heated vertical walls and aspect ratios between zero and unity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kassemi, S.A.

    1988-04-01

    High Rayleigh number convection in a rectangular cavity with insulated horizontal surfaces and differentially heated vertical walls was analyzed for an arbitrary aspect ratio smaller than or equal to unity. Unlike previous analytical studies, a systematic method of solution based on linearization technique and analytical iteration procedure was developed to obtain approximate closed-form solutions for a wide range of aspect ratios. The predicted velocity and temperature fields are shown to be in excellent agreement with available experimental and numerical data.

  7. High Rayleigh number convection in rectangular enclosures with differentially heated vertical walls and aspect ratios between zero and unity

    NASA Technical Reports Server (NTRS)

    Kassemi, Siavash A.

    1988-01-01

    High Rayleigh number convection in a rectangular cavity with insulated horizontal surfaces and differentially heated vertical walls was analyzed for an arbitrary aspect ratio smaller than or equal to unity. Unlike previous analytical studies, a systematic method of solution based on linearization technique and analytical iteration procedure was developed to obtain approximate closed-form solutions for a wide range of aspect ratios. The predicted velocity and temperature fields are shown to be in excellent agreement with available experimental and numerical data.

  8. Liquid-liquid microextraction in a multicommuted flow system for direct spectrophotometric determination of iodine value in biodiesel.

    PubMed

    Pereira, Andréia C; Rocha, Fábio R P

    2014-06-04

    A flow-based procedure was developed for the direct spectrophotometric determination of the iodine value (IV) in biodiesel. The procedure was based on the microextraction/reaction of unsaturated compounds with triiodide ions in an aqueous medium by inserting the reagent solution between the aliquots of biodiesel without any pretreatment. The interaction occurred through the biodiesel film formed on the inner walls of the hydrophobic tube used as the reactor and at the aqueous/biodiesel interfaces. The spectrophotometric detection was based on the discoloration of the I3(-) reagent in the aqueous phase by using a glass tube coupled to a fiber-optic spectrophotometer as the detection cell. Reference solutions were prepared by dilution of biodiesel samples with previously determined IV in hexane. The analytical response was linear for IV from 13 to 135 g I2/100 g with a detection limit of 5 g I2/100 g. A coefficient of variation of 1.7% (n=10) and a sampling rate of 108 determinations per hour were achieved by consuming 224 μL of the sample and 200 μg of I2 per determination. The slopes of analytical curves obtained with three different biodiesel samples were in agreement (variations in slopes lower than 3.1%), thus indicating an absence of any matrix effects. Results for biodiesel samples from different sources agreed with the volumetric official procedure at the 95% confidence level. The proposed procedure is therefore a simple, fast, and reliable alternative for estimating the iodine value of biodiesel. Copyright © 2014. Published by Elsevier B.V.
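    Because calibration relies on reference solutions of known IV prepared by dilution, data reduction comes down to a straight-line fit of detector response against iodine value. A minimal sketch in Python with hypothetical response values (the actual procedure measures discoloration of the I3(-) reagent):

      import numpy as np

      # Hypothetical calibration: iodine value (g I2/100 g) vs. measured response
      iv_reference = np.array([13.0, 40.0, 80.0, 135.0])
      response = np.array([0.05, 0.17, 0.35, 0.58])

      slope, intercept = np.polyfit(iv_reference, response, 1)  # linear calibration

      def iodine_value(measured_response):
          """Invert the calibration line to estimate the IV of an unknown sample."""
          return (measured_response - intercept) / slope

      print(f"Estimated IV: {iodine_value(0.25):.1f} g I2/100 g")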

  9. A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines

    PubMed Central

    Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua

    2018-01-01

    The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxin in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides a good insight regarding the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905

  10. Engine isolation for structural-borne interior noise reduction in a general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Unruh, J. F.; Scheidt, D. C.

    1981-01-01

    Engine vibration isolation for structural-borne interior noise reduction is investigated. A laboratory-based test procedure to simulate engine-induced structure-borne noise transmission, the testing of a range of candidate isolators for relative performance data, and the development of an analytical model of the transmission phenomena for isolator design evaluation are addressed. The isolator relative performance test data show that the elastomeric isolators do not appear to operate as single degree of freedom systems with respect to noise isolation. Noise isolation beyond 150 Hz levels off and begins to decrease somewhat above 600 Hz. Coupled analytical and empirical models were used to study the structure-borne noise transmission phenomena. Correlation of predicted results with measured data shows that (1) the modeling procedures are reasonably accurate for isolator design evaluation, and (2) the frequency-dependent properties of the isolators must be included in the model if reasonably accurate noise prediction beyond 150 Hz is desired. The experimental and analytical studies were carried out in the frequency range from 10 Hz to 1000 Hz.

  11. IFCC approved HPLC reference measurement procedure for the alcohol consumption biomarker carbohydrate-deficient transferrin (CDT): Its validation and use.

    PubMed

    Schellenberg, François; Wielders, Jos; Anton, Raymond; Bianchi, Vincenza; Deenmamode, Jean; Weykamp, Cas; Whitfield, John; Jeppsson, Jan-Olof; Helander, Anders

    2017-02-01

    Carbohydrate-deficient transferrin (CDT) is used as a biomarker of sustained high alcohol consumption. The currently available measurement procedures for CDT are based on various analytical techniques (HPLC, capillary electrophoresis, nephelometry), some differing in the definition of the analyte and using different reference intervals and cut-off values. The Working Group on Standardization of CDT (WG-CDT), initiated by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), has validated an HPLC candidate reference measurement procedure (cRMP) for CDT (% disialotransferrin to total transferrin based on peak areas), demonstrating that it is suitable as a reference measurement procedure (RMP) for CDT. Presented is a detailed description of the cRMP and its calibration. Practical aspects of how to treat genetic variant and so-called di-tri bridge samples are described. Results of method performance characteristics, as demanded by ISO 15189 and ISO 15193, are given, as well as the reference interval, the measurement uncertainty, and how to deal with it in routine use. The correlation of the cRMP with commercial CDT procedures and the performance of the cRMP in a network of laboratories are also presented. The performance of the CDT cRMP in combination with previously developed commutable calibrators allows for standardization of the currently available commercial measurement procedures for CDT. The cRMP has recently been approved by the IFCC and will from now on be known as the IFCC-RMP for CDT, while CDT results standardized according to this RMP should be indicated as CDT IFCC. Copyright © 2016 Elsevier B.V. All rights reserved.
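    Once the HPLC peak areas are available, the measurand itself is straightforward: %CDT is the disialotransferrin peak area relative to the summed transferrin glycoform areas. A minimal sketch with hypothetical peak areas (the cRMP's integration, variant-handling, and calibration rules are of course more detailed):

      # Hypothetical peak areas for the transferrin glycoforms (arbitrary units)
      peak_areas = {
          "asialo": 0.6, "disialo": 28.0, "trisialo": 95.0,
          "tetrasialo": 1450.0, "pentasialo": 210.0, "hexasialo": 30.0,
      }

      total_transferrin = sum(peak_areas.values())
      cdt_percent = 100.0 * peak_areas["disialo"] / total_transferrin
      print(f"%CDT (disialotransferrin) = {cdt_percent:.2f}%")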

  12. Identification and Quantitative Analysis of Acetaminophen, Acetylsalicylic Acid, and Caffeine in Commercial Analgesic Tablets by LC-MS

    ERIC Educational Resources Information Center

    Fenk, Christopher J.; Hickman, Nicole M.; Fincke, Melissa A.; Motry, Douglas H.; Lavine, Barry

    2010-01-01

    An undergraduate LC-MS experiment is described for the identification and quantitative determination of acetaminophen, acetylsalicylic acid, and caffeine in commercial analgesic tablets. This inquiry-based experimental procedure requires minimal sample preparation and provides good analytical results. Students are provided sufficient background…

  13. Selecting Evaluation Comparison Groups: A Cluster Analytic Approach.

    ERIC Educational Resources Information Center

    Davis, Todd Mclin; McLean, James E.

    A persistent problem in the evaluation of field-based projects is the lack of no-treatment comparison groups. Frequently, potential comparison groups are confounded by socioeconomic, racial, or other factors. Among the possible methods for dealing with this problem are various matching procedures, but they are cumbersome to use with multiple…

  14. Item Content of the Group Personality Projective Test

    ERIC Educational Resources Information Center

    Boudreaux, Ronald F.; Dreger, Ralph M.

    1974-01-01

    Examined the content factors of the GPPT using factor analytic procedures based on item intercorrelations, in contrast to the published version's use of part scores from a prior grouping of items. In terms of what it proposes to measure, it was concluded that the GPPT has very limited utility. (Author/RC)

  15. MEASUREMENT OF VOLATILE ORGANIC COMPOUNDS BY THE US ENVIRONMENTAL PROTECTION AGENCY COMPENDIUM METHOD TO-17 - EVALUATION OF PERFORMANCE CRITERIA

    EPA Science Inventory

    An evaluation of performance criteria for US Environmental Protection Agency Compendium Method TO-17 for monitoring volatile organic compounds (VOCs) in air has been accomplished. The method is a solid adsorbent-based sampling and analytical procedure including performance crit...

  16. Manual Solid-Phase Peptide Synthesis of Metallocene-Peptide Bioconjugates

    ERIC Educational Resources Information Center

    Kirin, Srecko I.; Noor, Fozia; Metzler-Nolte, Nils; Mier, Walter

    2007-01-01

    A simple and relatively inexpensive procedure for preparing a biologically active peptide using solid phase peptide synthesis (SPPS) is described. Fourth-year undergraduate students have gained firsthand experience from the solid-phase synthesis techniques and they have become familiar with modern analytical techniques based on the particular…

  17. Structural assessment of a Space Station solar dynamic heat receiver thermal energy storage canister

    NASA Technical Reports Server (NTRS)

    Tong, M. T.; Kerslake, T. W.; Thompson, R. L.

    1988-01-01

    This paper assesses the structural performance of a Space Station thermal energy storage (TES) canister subject to orbital solar flux variation and engine cold start-up operating conditions. The impact of working fluid temperature and salt-void distribution on the canister structure is assessed. Both analytical and experimental studies were conducted to determine the temperature distribution of the canister. Subsequent finite-element structural analyses of the canister were performed using both analytically and experimentally obtained temperatures. The Arrhenius creep law was incorporated into the procedure, using secondary creep data for the canister material, Haynes-188 alloy. The predicted cyclic creep strain accumulations at the hot spot were used to assess the structural performance of the canister. In addition, the structural performance of the canister based on the analytically-determined temperature was compared with that based on the experimentally-measured temperature data.
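    For reference, the secondary (steady-state) creep relation referred to above is commonly written in Arrhenius-Norton form (generic expression; the constants for Haynes-188 come from the creep data cited, not from this summary):

        \dot{\varepsilon}_{ss} = A\,\sigma^{n}\exp\!\left(-\frac{Q}{RT}\right),

    where sigma is the stress, T the absolute temperature, Q the activation energy, R the gas constant, and A and n material constants; the cyclic creep strain accumulation is obtained by integrating this rate over the orbital temperature and stress history at the hot spot.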

  18. Structural assessment of a space station solar dynamic heat receiver thermal energy storage canister

    NASA Technical Reports Server (NTRS)

    Thompson, R. L.; Kerslake, T. W.; Tong, M. T.

    1988-01-01

    The structural performance of a space station thermal energy storage (TES) canister subject to orbital solar flux variation and engine cold start-up operating conditions was assessed. The impact of working fluid temperature and salt-void distribution on the canister structure is assessed. Both analytical and experimental studies were conducted to determine the temperature distribution of the canister. Subsequent finite element structural analyses of the canister were performed using both analytically and experimentally obtained temperatures. The Arrhenius creep law was incorporated into the procedure, using secondary creep data for the canister material, Haynes 188 alloy. The predicted cyclic creep strain accumulations at the hot spot were used to assess the structural performance of the canister. In addition, the structural performance of the canister based on the analytically determined temperature was compared with that based on the experimentally measured temperature data.

  19. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  20. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  1. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  2. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  3. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  4. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    PubMed

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite, an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  5. Influence of transverse-shear and large-deformation effects on the low-speed impact response of laminated composite plates

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Starnes, James H., Jr.; Prasad, Chunchu B.

    1993-01-01

    An analytical procedure is presented for determining the transient response of simply supported, rectangular laminated composite plates subjected to impact loads from airgun-propelled or dropped-weight impactors. A first-order shear-deformation theory is included in the analysis to represent properly any local short-wave-length transient bending response. The impact force is modeled as a locally distributed load with a cosine-cosine distribution. A double Fourier series expansion and the Timoshenko small-increment method are used to determine the contact force, out-of-plane deflections, and in-plane strains and stresses at any plate location due to an impact force at any plate location. The results of experimental and analytical studies are compared for quasi-isotropic laminates. The results indicate that using the appropriate local force distribution for the locally loaded area and including transverse-shear-deformation effects in the laminated plate response analysis are important. The applicability of the present analytical procedure based on small deformation theory is investigated by comparing analytical and experimental results for combinations of quasi-isotropic laminate thicknesses and impact energy levels. The results of this study indicate that large-deformation effects influence the response of both 24- and 32-ply laminated plates, and that a geometrically nonlinear analysis is required for predicting the response accurately.
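    The modal expansion mentioned has the standard form for a simply supported rectangular plate; schematically (symbols assumed here),

        w(x,y,t) = \sum_{m=1}^{M}\sum_{n=1}^{N} W_{mn}(t)\,\sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b},

    with the cosine-cosine contact load expanded in the same basis, so that the modal amplitudes W_mn(t) can be advanced in time with the small-increment method to recover the contact force, deflections, strains, and stresses reported above.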

  6. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, giving examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
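    The recommended equivalence-test approach can be illustrated with a two one-sided t-tests (TOST) sketch on the difference in means between sending and receiving units; the acceptance limit and data below are hypothetical, and the risk-based limits discussed in the article should be used in practice:

      import numpy as np
      from scipy import stats

      sending = np.array([99.8, 100.1, 99.9, 100.3, 100.0, 99.7])      # assay results, % label claim
      receiving = np.array([100.4, 100.6, 100.2, 100.5, 100.1, 100.7])
      THETA = 2.0  # hypothetical acceptance limit for the mean difference (% label claim)

      diff = receiving.mean() - sending.mean()
      se = np.sqrt(sending.var(ddof=1) / sending.size + receiving.var(ddof=1) / receiving.size)
      df = sending.size + receiving.size - 2  # simple approximation; a Welch correction is also common

      # Two one-sided tests of H0: |true mean difference| >= THETA
      p_lower = 1.0 - stats.t.cdf((diff + THETA) / se, df)
      p_upper = stats.t.cdf((diff - THETA) / se, df)
      p_tost = max(p_lower, p_upper)
      verdict = "equivalent" if p_tost < 0.05 else "not shown equivalent"
      print(f"mean difference = {diff:.2f}, TOST p = {p_tost:.4f} ({verdict} within +/-{THETA})")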

  7. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  8. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  9. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  10. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  11. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  12. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...

  13. Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.

    PubMed

    Stupple, Edward J N; Waterhouse, Eleanor F

    2009-08-01

    An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.

  14. Analytic and heuristic processes in the detection and resolution of conflict.

    PubMed

    Ferreira, Mário B; Mata, André; Donkin, Christopher; Sherman, Steven J; Ihmels, Max

    2016-10-01

    Previous research with the ratio-bias task found larger response latencies for conflict trials where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success) when compared to no-conflict trials where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges to the debate between dual-process and single-process accounts, which are discussed.
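    In one common formulation of the process-dissociation procedure for this kind of task (the notation is assumed here, not quoted from the paper), the controlled (analytic) and heuristic estimates are recovered from the proportions of heuristic-consistent (HC) choices on no-conflict and conflict trials:

        C = P(\mathrm{HC}\mid\text{no-conflict}) - P(\mathrm{HC}\mid\text{conflict}), \qquad H = \frac{P(\mathrm{HC}\mid\text{conflict})}{1 - C},

    the idea being that heuristic-consistent responding on conflict trials reflects heuristic processing operating when analytic control fails.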

  15. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Studies of the distribution of metallic elements in biological samples are currently among the most important issues in the field. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, this literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize the division of calibration methods that are different from those hitherto used. This article is the first work in the literature that refers to and emphasizes many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Rapid, specific determination of iodine and iodide by combined solid-phase extraction/diffuse reflectance spectroscopy

    NASA Technical Reports Server (NTRS)

    Arena, Matteo P.; Porter, Marc D.; Fritz, James S.

    2002-01-01

    A new, rapid methodology for trace analysis using solid-phase extraction is described. The two-step methodology is based on the concentration of an analyte onto a membrane disk and on the determination by diffuse reflectance spectroscopy of the amount of analyte extracted on the disk surface. This method, which is adaptable to a wide range of analytes, has been used for monitoring ppm levels of iodine and iodide in spacecraft water. Iodine is used as a biocide in spacecraft water. For these determinations, a water sample is passed through a membrane disk by means of a 10-mL syringe that is attached to a disk holder assembly. The disk, which is a polystyrene-divinylbenzene composite, is impregnated with poly(vinylpyrrolidone) (PVP), which exhaustively concentrates iodine as a yellow iodine-PVP complex. The amount of concentrated iodine is then determined in only 2 s using a hand-held diffuse reflectance spectrometer and comparing the result with a calibration curve based on the Kubelka-Munk function. The same general procedure can be used to determine iodide levels after its facile and exhaustive oxidation to iodine by peroxymonosulfate (i.e., Oxone reagent). For samples containing both analytes, a two-step procedure can be used in which the iodide concentration is calculated from the difference in iodine levels before and after treatment of the sample with peroxymonosulfate. With this methodology, iodine and iodide levels in the 0.1-5.0 ppm range can be determined with a total workup time of approximately 60 s with an RSD of approximately 6%.
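    A minimal sketch of the data reduction implied above, applying the Kubelka-Munk transform to the measured diffuse reflectance and taking iodide by difference; the calibration constants are hypothetical:

      def kubelka_munk(reflectance):
          """Kubelka-Munk function F(R) = (1 - R)^2 / (2R) for diffuse reflectance R in (0, 1]."""
          return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

      # Hypothetical linear calibration: F(R) = SLOPE * [I2, ppm] + INTERCEPT
      SLOPE, INTERCEPT = 0.42, 0.03

      def iodine_ppm(reflectance):
          return (kubelka_munk(reflectance) - INTERCEPT) / SLOPE

      r_before = 0.55  # sample as received (iodine only)
      r_after = 0.35   # same sample after Oxone oxidation (iodine plus oxidized iodide)

      iodine = iodine_ppm(r_before)
      iodide = iodine_ppm(r_after) - iodine  # iodide by difference
      print(f"I2 = {iodine:.2f} ppm, I- = {iodide:.2f} ppm")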

  17. Transmission eigenchannels for coherent phonon transport

    NASA Astrophysics Data System (ADS)

    Klöckner, J. C.; Cuevas, J. C.; Pauly, F.

    2018-04-01

    We present a procedure to determine transmission eigenchannels for coherent phonon transport in nanoscale devices using the framework of nonequilibrium Green's functions. We illustrate our procedure by analyzing a one-dimensional chain, where all steps can be carried out analytically. More importantly, we show how the procedure can be combined with ab initio calculations to provide a better understanding of phonon heat transport in realistic atomic-scale junctions. In particular, we study the phonon eigenchannels in a gold metallic atomic-size contact and different single-molecule junctions based on molecules such as an alkane chain, a brominated benzene-diamine, where destructive phonon interference effects take place, and a C60 junction.
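
    A minimal numerical sketch of the kind of one-dimensional calculation the authors describe: the Caroli/NEGF transmission of a uniform harmonic chain, for which the lead surface Green's function is known in closed form. The names, the unit spring constant and the sampled frequencies are assumptions for illustration; the published procedure additionally resolves the transmission into eigenchannels, which this scalar toy example does not require.

```python
import numpy as np

def surface_gf(w, K, eta=1e-9):
    """Retarded surface Green's function of a semi-infinite 1D harmonic chain
    (unit masses, nearest-neighbour spring constant K): root of
    K^2 g^2 - (z - 2K) g + 1 = 0 with z = (w + i*eta)^2, taking Im g <= 0.
    This branch choice is adequate inside the phonon band 0 < w < 2*sqrt(K)."""
    z = (w + 1j * eta) ** 2
    a = z - 2.0 * K
    root = np.sqrt(a * a - 4.0 * K * K + 0j)
    g1, g2 = (a + root) / (2 * K * K), (a - root) / (2 * K * K)
    return g1 if g1.imag < g2.imag else g2

def transmission(w, K=1.0):
    """Caroli formula T = Gamma_L |G|^2 Gamma_R for a single device site
    coupled to identical left and right leads."""
    g = surface_gf(w, K)
    sigma = K * K * g                        # lead self-energy
    gamma = 1j * (sigma - np.conj(sigma))    # broadening Gamma = i(Sigma - Sigma+)
    G = 1.0 / (w ** 2 - 2.0 * K - 2.0 * sigma)   # retarded device Green's function
    return float(np.real(gamma * G * gamma * np.conj(G)))

# Sanity check: a perfect chain transmits one full channel inside the band.
for w in (0.5, 1.0, 1.5, 1.9):
    print(f"T({w}) = {transmission(w):.3f}")   # all close to 1.0
```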

  18. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term-rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term-rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
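
    To make the term-rewriting setting concrete, here is a toy rewriting system (Peano addition on unary numerals, with terms as nested tuples) reduced to normal form by repeated rule application. It is only a sketch of the computational model the paper builds on; it does not implement goal regression, EBG, or AL-2, and the rule set and encoding are invented for illustration.

```python
def step(t):
    """One leftmost-outermost rewrite step with two Peano rules:
    add(0, y) -> y   and   add(s(x), y) -> s(add(x, y))."""
    if isinstance(t, tuple):
        if t[0] == 'add':
            x, y = t[1], t[2]
            if x == '0':
                return y, True
            if isinstance(x, tuple) and x[0] == 's':
                return ('s', ('add', x[1], y)), True
        # no rule applies at the root: try to rewrite a subterm
        for i, sub in enumerate(t[1:], start=1):
            new, changed = step(sub)
            if changed:
                return t[:i] + (new,) + t[i + 1:], True
    return t, False

def normalize(t):
    """Apply rewrite steps until a normal form (no rule applies) is reached."""
    changed = True
    while changed:
        t, changed = step(t)
    return t

two = ('s', ('s', '0'))
three = ('s', two)
print(normalize(('add', two, three)))   # five applications of 's' around '0'
```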

  19. Ion chromatography characterization of polysaccharides in ancient wall paintings.

    PubMed

    Colombini, Maria Perla; Ceccarini, Alessio; Carmignani, Alessia

    2002-08-30

    An analytical procedure for the characterisation of polysaccharides and the identification of plant gums in old polychrome samples is described. The procedure is based on hydrolysis with 2 M trifluoroacetic acid assisted by microwaves (20 min, 120 degrees C, 500 W), clean-up of the hydrolysate by an ion-exchange resin, and analysis by high-performance anion-exchange chromatography with pulsed amperometric detection. Using this method, the hydrolysis time was reduced to 20 min and the chromatographic separation of seven monosaccharides (fucose, rhamnose, arabinose, galactose, glucose, mannose, xylose) and two uronic acids (galacturonic and glucuronic) was achieved in 40 min. The whole analytical procedure allows sugar determination in plant gums at picomole levels, with an average recovery of 72% and an RSD of 8%, as tested on arabic gum. The analytical procedure was tested with several raw gums, watercolour samples and reference painting specimens prepared according to old recipes at the Opificio delle Pietre Dure of Florence (Italian Ministry of Cultural Heritage, Italy). All the collected data, expressed as relative sugar percentage contents, were submitted to principal components analysis for gum identification: five groups were spatially separated, which enabled the identification of arabic, tragacanth, karaya, cherry + ghatti, and guar + locust bean gums. Wall painting samples from Macedonian tombs (Greece) of the 4th-3rd centuries B.C., processed by the suggested method, showed the presence of complex paint media mainly consisting of tragacanth and fruit tree gums. Moreover, starch had probably been added to the plaster, as highlighted by the presence of a large amount of glucose.
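
    A small sketch of the final chemometric step, principal components analysis of relative sugar percentages, using invented data: the composition matrix, gum labels and the "unknown" sample are hypothetical stand-ins for the measured monosaccharide and uronic acid profiles.

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative only: relative sugar percentages (fucose, rhamnose, arabinose,
# galactose, glucose, mannose, xylose, galacturonic, glucuronic) for a few
# hypothetical reference gums and one unknown sample.
labels = ["arabic", "arabic", "tragacanth", "tragacanth", "karaya", "unknown"]
X = np.array([
    [1,  2, 30, 40, 2, 1,  2, 15,  7],
    [1,  3, 28, 42, 1, 1,  2, 15,  7],
    [2,  1, 35, 10, 5, 1, 25, 20,  1],
    [3,  1, 33, 12, 4, 1, 26, 19,  1],
    [1, 25,  2, 25, 1, 1,  1, 30, 14],
    [1,  2, 29, 41, 2, 1,  2, 15,  7],   # should fall near the arabic cluster
], dtype=float)

scores = PCA(n_components=2).fit_transform(X)
for name, (pc1, pc2) in zip(labels, scores):
    print(f"{name:>11s}: PC1={pc1:7.2f}  PC2={pc2:7.2f}")
```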

  20. Nonlinear modelling of high-speed catenary based on analytical expressions of cable and truss elements

    NASA Astrophysics Data System (ADS)

    Song, Yang; Liu, Zhigang; Wang, Hongrui; Lu, Xiaobing; Zhang, Jing

    2015-10-01

    Due to the intrinsic nonlinear characteristics and complex structure of the high-speed catenary system, a modelling method is proposed based on the analytical expressions of nonlinear cable and truss elements. The calculation procedure for solving the initial equilibrium state is proposed based on the Newton-Raphson iteration method. The deformed configuration of the catenary system as well as the initial length of each wire can be calculated. The accuracy and validity of the computed initial equilibrium state are verified by comparison with the separate model method, the absolute nodal coordinate formulation and other methods in the previous literature. Then, the proposed model is combined with a lumped pantograph model and a dynamic simulation procedure is proposed. The accuracy is guaranteed by the multiple iterative calculations in each time step. The dynamic performance of the proposed model is validated by comparison with EN 50318, the results of finite element method software and a SIEMENS simulation report, respectively. Finally, the influence of the catenary design parameters (such as the reserved sag and pre-tension) on the dynamic performance is preliminarily analysed using the proposed model.
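
    The initial-equilibrium solver rests on standard Newton-Raphson iteration; a generic sketch is shown below on a toy two-equation system standing in for the catenary equilibrium equations. The residual and Jacobian here are illustrative assumptions and not the cable/truss element formulation of the paper.

```python
import numpy as np

def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson iteration: solve residual(x) = 0 by repeatedly
    solving J(x) dx = -r(x) and updating x <- x + dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            return x
        dx = np.linalg.solve(jacobian(x), -r)
        x = x + dx
    raise RuntimeError("Newton-Raphson did not converge")

# Toy 2-DOF example standing in for the catenary equilibrium equations.
def residual(x):
    return np.array([x[0] ** 2 + x[1] - 3.0,
                     x[0] + x[1] ** 2 - 5.0])

def jacobian(x):
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]]])

print(newton_raphson(residual, jacobian, x0=[1.0, 1.0]))   # converges to (1, 2)
```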

  1. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  2. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  3. How Much Can We Learn from a Single Chromatographic Experiment? A Bayesian Perspective.

    PubMed

    Wiczling, Paweł; Kaliszan, Roman

    2016-01-05

    In this work, we proposed and investigated a Bayesian inference procedure to find the desired chromatographic conditions based on known analyte properties (lipophilicity, pKa, and polar surface area) using one preliminary experiment. A previously developed nonlinear mixed effect model was used to specify the prior information about a new analyte with known physicochemical properties. Further, the prior (no preliminary data) and posterior predictive distribution (prior + one experiment) were determined sequentially to search towards the desired separation. The following isocratic high-performance reversed-phase liquid chromatographic conditions were sought: (1) retention time of a single analyte within the range of 4-6 min and (2) baseline separation of two analytes with retention times within the range of 4-10 min. The empirical posterior Bayesian distribution of parameters was estimated using the "slice sampling" Markov Chain Monte Carlo (MCMC) algorithm implemented in Matlab. The simulations with artificial analytes and experimental data of ketoprofen and papaverine were used to test the proposed methodology. The simulation experiment showed that for a single and for two randomly selected analytes, there is a 97% and 74% probability, respectively, of obtaining a successful chromatogram using no or one preliminary experiment. The desired separation for ketoprofen and papaverine was established based on a single experiment. It was confirmed that the search for a desired separation rarely requires a large number of chromatographic analyses, at least for a simple optimization problem. The proposed Bayesian-based optimization scheme is a powerful method of finding a desired chromatographic separation based on a small number of preliminary experiments.
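
    For readers unfamiliar with the sampler named in the abstract, a minimal univariate slice sampler (stepping-out and shrinkage, after Neal) is sketched below on a Gaussian stand-in for a posterior over a retention parameter; the target density, starting point and step width are illustrative assumptions, not the mixed-effect chromatographic model used in the study.

```python
import numpy as np

def slice_sample(log_pdf, x0, n_samples, w=1.0, rng=None):
    """Univariate slice sampling with stepping-out and shrinkage.
    `log_pdf` is the (unnormalised) log target density."""
    rng = rng or np.random.default_rng(0)
    x = float(x0)
    samples = []
    for _ in range(n_samples):
        log_y = log_pdf(x) + np.log(rng.random())   # vertical level defining the slice
        # stepping out to bracket the slice
        left = x - w * rng.random()
        right = left + w
        while log_pdf(left) > log_y:
            left -= w
        while log_pdf(right) > log_y:
            right += w
        # shrinkage: draw uniformly until a point inside the slice is found
        while True:
            x_new = rng.uniform(left, right)
            if log_pdf(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return np.array(samples)

# Toy posterior for a retention-related parameter: a Gaussian stand-in.
log_post = lambda t: -0.5 * ((t - 1.2) / 0.3) ** 2
draws = slice_sample(log_post, x0=0.0, n_samples=5000)
print(draws.mean(), draws.std())   # approximately 1.2 and 0.3
```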

  4. A new procedure for investigating three-dimensional stress fields in a thin plate with a through-the-thickness crack

    NASA Astrophysics Data System (ADS)

    Yi, Dake; Wang, TzuChiang

    2018-06-01

    In the paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure includes a new analytical method and highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed in order to derive the three-dimensional crack tip fields. Based on the theoretical analysis, an equation describing the relationship among the three-dimensional J-integral J(z), the stress intensity factor K(z) and the tri-axial stress constraint level Tz(z) is derived first. In the finite element simulations, a fine mesh comprising 153,360 elements is constructed to compute the stress field near the crack front, J(z) and Tz(z). Numerical results show that in the plane very close to the free surface, the K field solution is still valid for in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.

  5. Solvent-free MALDI-MS for the analysis of a membrane protein via the mini ball mill approach: case study of bacteriorhodopsin.

    PubMed

    Trimpin, Sarah; Deinzer, Max L

    2007-01-01

    A mini ball mill (MBM) solvent-free matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) method allows for the analysis of bacteriorhodopsin (BR), an integral membrane protein that previously presented special analytical problems. For well-defined signals in the molecular ion region of the analytes, a desalting procedure of the MBM sample directly on the MALDI target plate was used to reduce adduction by sodium and other cations that are normally attendant with hydrophobic peptides and proteins as a result of the sample preparation procedure. Mass analysis of the intact hydrophobic protein and the few hydrophobic and hydrophilic tryptic peptides available in the digest is demonstrated with this robust new approach. MS and MS/MS spectra of BR tryptic peptides and intact protein were generally superior to the traditional solvent-based method using the desalted "dry" MALDI preparation procedure. The solvent-free method expands the range of peptides that can be effectively analyzed by MALDI-MS to those that are hydrophobic and solubility-limited.

  6. Feature construction can improve diagnostic criteria for high-dimensional metabolic data in newborn screening for medium-chain acyl-CoA dehydrogenase deficiency.

    PubMed

    Ho, Sirikit; Lukacs, Zoltan; Hoffmann, Georg F; Lindner, Martin; Wetter, Thomas

    2007-07-01

    In newborn screening with tandem mass spectrometry, multiple intermediary metabolites are quantified in a single analytical run for the diagnosis of fatty-acid oxidation disorders, organic acidurias, and aminoacidurias. Published diagnostic criteria for these disorders normally incorporate a primary metabolic marker combined with secondary markers, often analyte ratios, for which the markers have been chosen to reflect metabolic pathway deviations. We applied a procedure to extract new markers and diagnostic criteria for newborn screening to the data of newborns with confirmed medium-chain acyl-CoA dehydrogenase deficiency (MCADD) and a control group from the newborn screening program, Heidelberg, Germany. We validated the results with external data of the screening center in Hamburg, Germany. We extracted new markers by performing a systematic search for analyte combinations (features) with high discriminatory performance for MCADD. To select feature thresholds, we applied automated procedures to separate controls and cases on the basis of the feature values. Finally, we built classifiers from these new markers to serve as diagnostic criteria in screening for MCADD. On the basis of chi(2) scores, we identified approximately 800 of >628,000 new analyte combinations with superior discriminatory performance compared with the best published combinations. Classifiers built with the new features achieved diagnostic sensitivities and specificities approaching 100%. Feature construction methods provide ways to disclose information hidden in the set of measured analytes. Other diagnostic tasks based on high-dimensional metabolic data might also profit from this approach.
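
    A schematic version of the feature-construction idea: build candidate analyte ratios and rank them by a chi-square score for separating cases from controls. The simulated acylcarnitine data, the median-split contingency-table scoring and the analyte names are assumptions for illustration and not the authors' exact scoring procedure.

```python
import numpy as np
from itertools import combinations
from scipy.stats import chi2_contingency

rng = np.random.default_rng(1)

# Simulated acylcarnitine concentrations: 200 controls and 10 MCADD-like cases
# in which C8 (and, to a lesser degree, C6) is elevated.
analytes = ["C6", "C8", "C10", "C2", "C16"]
controls = rng.lognormal(mean=0.0, sigma=0.3, size=(200, 5))
cases = rng.lognormal(mean=0.0, sigma=0.3, size=(10, 5))
cases[:, 1] *= 8.0
cases[:, 0] *= 3.0

X = np.vstack([controls, cases])
y = np.array([0] * 200 + [1] * 10)

def chi2_score(feature, y):
    """Chi-square statistic of a 2x2 table from a median split of the feature."""
    above = feature > np.median(feature)
    table = np.array([[np.sum(above & (y == 1)), np.sum(~above & (y == 1))],
                      [np.sum(above & (y == 0)), np.sum(~above & (y == 0))]])
    return chi2_contingency(table)[0]

# Constructed features: every pairwise ratio of the measured analytes.
scored = [(chi2_score(X[:, i] / X[:, j], y), f"{analytes[i]}/{analytes[j]}")
          for i, j in combinations(range(len(analytes)), 2)]

for score, name in sorted(scored, reverse=True)[:3]:
    print(f"{name:8s} chi2 = {score:.1f}")   # C8-based ratios should rank highest
```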

  7. Solid-phase extraction of heavy metal ions on bucky tubes disc in natural water and herbal plant samples.

    PubMed

    Soylak, Mustafa; Unsal, Yunus Emre

    2011-10-01

    A preconcentration-separation procedure has been established based on solid-phase extraction of Fe(III) and Pb(II) on a bucky tubes (BTs) disc. Fe(III) and Pb(II) ions were quantitatively recovered at pH 6. The influences of analytical parameters such as sample volume and flow rate on the recoveries of the analytes on the BT disc were investigated. The effects of co-existing ions on the recoveries were also studied. The detection limits for iron and lead were found to be 1.6 and 4.9 μg L⁻¹, respectively. The validation of the presented method was checked by the analysis of the TMDA-51.3 fortified water certified reference material. The presented procedure was successfully applied to the separation-preconcentration and determination of the iron and lead content of some natural water and herbal plant samples from Kayseri, Turkey.

  8. Comparison of analytical and experimental performance of a wind-tunnel diffuser section

    NASA Technical Reports Server (NTRS)

    Shyne, R. J.; Moore, R. D.; Boldman, D. R.

    1986-01-01

    Wind tunnel diffuser performance is evaluated by comparing experimental data with analytical results predicted by a one-dimensional integration procedure with skin friction coefficient, a two-dimensional interactive boundary layer procedure for analyzing conical diffusers, and a two-dimensional, integral, compressible laminar and turbulent boundary layer code. Pressure, temperature, and velocity data for a 3.25 deg equivalent cone half-angle diffuser (37.3 in., 94.742 cm outlet diameter) were obtained from the one-tenth scale Altitude Wind Tunnel modeling program at the NASA Lewis Research Center. The comparison is performed at Mach numbers of 0.162 (Re = 3.097x10(6)), 0.326 (Re = 6.2737x10(6)), and 0.363 (Re = 7.0129x10(6)). The Reynolds numbers are all based on an inlet diffuser diameter of 32.4 in. (82.296 cm), and reasonable quantitative agreement was obtained between the experimental data and computational codes.

  9. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.

  10. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  11. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  12. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  13. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  14. 77 FR 39895 - New Analytic Methods and Sampling Procedures for the United States National Residue Program for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...

  15. 40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 9 2011-07-01 2011-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...

  16. 40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 10 2013-07-01 2013-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...

  17. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES... to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater... times the standard deviation of replicate instrumental measurements of the analyte in reagent water. (c...

  18. Meta-Analysis of Mathematic Basic-Fact Fluency Interventions: A Component Analysis

    ERIC Educational Resources Information Center

    Codding, Robin S.; Burns, Matthew K.; Lukito, Gracia

    2011-01-01

    Mathematics fluency is a critical component of mathematics learning yet few attempts have been made to synthesize this research base. Seventeen single-case design studies with 55 participants were reviewed using meta-analytic procedures. A component analysis of practice elements was conducted and treatment intensity and feasibility were examined.…

  19. Oral Reading Fluency Growth: A Sample of Methodology and Findings. Research Brief 6

    ERIC Educational Resources Information Center

    Tindal, Gerald; Nese, Joseph F. T.

    2013-01-01

    For the past 20 years, the growth of students' oral reading fluency has been investigated by a number of researchers using curriculum-based measurement. These researchers have used varied methods (student samples, measurement procedures, and analytical techniques) and yet have converged on a relatively consistent finding: General education…

  20. Analytic tests and their relation to jet fuel thermal stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heneghan, S.P.; Kauffman, R.E.

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results covering a range of flow and temperature conditions show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  1. Quality Assurance of RNA Expression Profiling in Clinical Laboratories

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L.

    2012-01-01

    RNA expression profiles are increasingly used to diagnose and classify disease, based on expression patterns of as many as several thousand RNAs. To ensure quality of expression profiling services in clinical settings, a standard operating procedure incorporates multiple quality indicators and controls, beginning with preanalytic specimen preparation and proceeding through analysis, interpretation, and reporting. Before testing, histopathological examination of each cellular specimen, along with optional cell enrichment procedures, ensures adequacy of the input tissue. Other tactics include endogenous controls to evaluate adequacy of RNA and exogenous or spiked controls to evaluate run- and patient-specific performance of the test system, respectively. Unique aspects of quality assurance for array-based tests include controls for the pertinent outcome signatures that often supersede controls for each individual analyte, built-in redundancy for critical analytes or biochemical pathways, and software-supported scrutiny of abundant data by a laboratory physician who interprets the findings in a manner facilitating appropriate medical intervention. Access to high-quality reagents, instruments, and software from commercial sources promotes standardization and adoption in clinical settings, once an assay is vetted in validation studies as being analytically sound and clinically useful. Careful attention to the well-honed principles of laboratory medicine, along with guidance from government and professional groups on strategies to preserve RNA and manage large data sets, promotes clinical-grade assay performance. PMID:22020152

  2. A simple spectrophotometric method for determination of zirconium or hafnium in selected molybdenum-base alloys

    NASA Technical Reports Server (NTRS)

    Dupraw, W. A.

    1972-01-01

    A simple analytical procedure is described for accurately and precisely determining the zirconium or hafnium content of molybdenum-base alloys. The procedure is based on the reaction of the reagent Arsenazo III with zirconium or hafnium in strong hydrochloric acid solution. The colored complexes of zirconium or hafnium are formed in the presence of molybdenum. Titanium and rhenium in the alloy have no adverse effect on the zirconium or hafnium complex at the following levels in the selected aliquot: Mo, 10 mg; Re, 10 mg; Ti, 1 mg. The spectrophotometric measurement of the zirconium or hafnium complex is accomplished without prior separation, with a relative standard deviation of 1.3 to 2.7 percent.

  3. Use of fractional factorial design for optimization of digestion procedures followed by multi-element determination of essential and non-essential elements in nuts using ICP-OES technique.

    PubMed

    Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A

    2007-01-15

    Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO(3)/H(2)SO(4) and HNO(3)/H(2)SO(4)/H(2)O(2). The latter is recommended for better analyte recoveries (relative error < 11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO(3), H(2)SO(4) and H(2)O(2) volumes, digestion time, pre-digestion time, temperature of the hot plate and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO(3) and H(2)O(2) volume, and the digestion time were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 μg g⁻¹), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%) and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error < 11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
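
    For reference, an eight-run Plackett-Burman design for seven two-level factors can be generated from a single cyclic generator row plus a row of low levels, as sketched below; the factor names mirror those listed in the abstract, while the generator choice and level coding follow the usual textbook convention rather than details taken from the paper.

```python
import numpy as np

def plackett_burman_8():
    """8-run Plackett-Burman design for up to 7 two-level factors:
    cyclic shifts of the generator row plus a final row of all -1."""
    gen = np.array([+1, +1, +1, -1, +1, -1, -1])
    rows = [np.roll(gen, s) for s in range(7)]
    rows.append(-np.ones(7, dtype=int))
    return np.array(rows, dtype=int)

design = plackett_burman_8()
# Columns are balanced and mutually orthogonal (X^T X = 8*I), which is what
# allows 7 main effects to be screened with only 8 digestions.
assert np.array_equal(design.T @ design, 8 * np.eye(7, dtype=int))

factors = ["HNO3 vol", "H2SO4 vol", "H2O2 vol", "digestion time",
           "pre-digestion time", "hot-plate temp", "sample weight"]
for run, levels in enumerate(design, start=1):
    settings = ", ".join(f"{f}:{'+' if l > 0 else '-'}" for f, l in zip(factors, levels))
    print(f"run {run}: {settings}")
```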

  4. Net analyte signal-based simultaneous determination of ethanol and water by quartz crystal nanobalance sensor.

    PubMed

    Mirmohseni, A; Abdollahi, H; Rostamizadeh, K

    2007-02-28

    A net analyte signal (NAS)-based method called HLA/GO was applied for the selective determination of a binary mixture of ethanol and water by a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of calibration and prediction sets in the concentration ranges 5.5-22.2 microg mL(-1) for ethanol and 7.01-28.07 microg mL(-1) for water. An optimal time range was selected by a procedure based on the calculation of the net analyte signal regression plot in any considered time window for each test sample. A moving-window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the obtained results, the differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. The calculation of the net analyte signal using the HLA/GO method allows determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of adsorption profile data in the presence of methanol was also described. The results showed that the method was successfully applied to the determination of ethanol and water.
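
    The core of any NAS-based method is a projection of the measured response onto the space orthogonal to the interferents' responses. The sketch below shows that projection for simulated sensor adsorption profiles; the exponential profiles, noise level and component amounts are invented for illustration, and the HLA/GO time-window search itself is not reproduced.

```python
import numpy as np

def net_analyte_signal(r, S_other):
    """Net analyte signal: the part of the response vector r that is orthogonal
    to the responses of all other components (columns of S_other)."""
    S = np.atleast_2d(S_other)
    if S.shape[0] != r.shape[0]:
        S = S.T                                   # ensure components are columns
    P = np.eye(len(r)) - S @ np.linalg.pinv(S)    # projector onto the orthogonal complement
    return P @ r

rng = np.random.default_rng(0)
t = np.linspace(0, 600, 200)                      # sensor time axis, s
s_ethanol = 1.0 - np.exp(-t / 80.0)               # hypothetical pure adsorption profiles
s_water = 1.0 - np.exp(-t / 250.0)

# Mixture response: 2 "units" of ethanol + 3 of water + noise
r = 2.0 * s_ethanol + 3.0 * s_water + rng.normal(0, 0.01, t.size)

nas = net_analyte_signal(r, s_water[:, None])            # remove the water contribution
nas_pure = net_analyte_signal(s_ethanol, s_water[:, None])
print("estimated ethanol amount:", (nas @ nas_pure) / (nas_pure @ nas_pure))  # ~2.0
```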

  5. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  6. Handheld magnetic sensor for measurement of tension

    NASA Astrophysics Data System (ADS)

    Singal, K.; Rajamani, R.

    2012-04-01

    This letter develops an analytical formulation for measurement of tension in a string using a handheld sensor. By gently pushing the sensor against the string, the tension in the string can be obtained. An experimental sensor prototype is constructed to verify the analytical formulation. The centimeter-sized prototype utilizes three moving pistons and magnetic field based measurements of their positions. Experimental data show that the sensor can accurately measure tension on a bench top rig. The developed sensor could be useful in a variety of orthopedic surgical procedures, including knee replacement, hip replacement, ligament repair, shoulder stabilization, and tendon repair.

  7. Biosensors for the determination of environmental inhibitors of enzymes

    NASA Astrophysics Data System (ADS)

    Evtugyn, Gennadii A.; Budnikov, Herman C.; Nikolskaya, Elena B.

    1999-12-01

    Characteristic features of functioning and practical application of enzyme-based biosensors for the determination of environmental pollutants as enzyme inhibitors are considered with special emphasis on the influence of the methods used for the measurement of the rates of enzymic reactions, of enzyme immobilisation procedure and of the composition of the reaction medium on the analytical characteristics of inhibitor assays. The published data on the development of biosensors for detecting pesticides and heavy metals are surveyed. Special attention is given to the use of cholinesterase-based biosensors in environmental and analytical monitoring. The approaches to the estimation of kinetic parameters of inhibition are reviewed and the factors determining the selectivity and sensitivity of inhibitor assays in environmental objects are analysed. The bibliography includes 195 references.

  8. Using functional neuroimaging combined with a think-aloud protocol to explore clinical reasoning expertise in internal medicine.

    PubMed

    Durning, Steven J; Graner, John; Artino, Anthony R; Pangaro, Louis N; Beckman, Thomas; Holmboe, Eric; Oakes, Terrance; Roy, Michael; Riedy, Gerard; Capaldi, Vincent; Walter, Robert; van der Vleuten, Cees; Schuwirth, Lambert

    2012-09-01

    Clinical reasoning is essential to medical practice, but because it entails internal mental processes, it is difficult to assess. Functional magnetic resonance imaging (fMRI) and think-aloud protocols may improve understanding of clinical reasoning as these methods can more directly assess these processes. The objective of our study was to use a combination of fMRI and think-aloud procedures to examine fMRI correlates of a leading theoretical model in clinical reasoning based on experimental findings to date: analytic (i.e., actively comparing and contrasting diagnostic entities) and nonanalytic (i.e., pattern recognition) reasoning. We hypothesized that there would be functional neuroimaging differences between analytic and nonanalytic reasoning. Seventeen board-certified experts in internal medicine answered and reflected on validated U.S. Medical Licensing Exam and American Board of Internal Medicine multiple-choice questions (easy and difficult) during an fMRI scan. This procedure was followed by completion of a formal think-aloud procedure. fMRI findings provide some support for the presence of analytic and nonanalytic reasoning systems. Statistically significant activation of prefrontal cortex distinguished answering incorrectly versus correctly (p < 0.01), whereas activation of precuneus and midtemporal gyrus distinguished not guessing from guessing (p < 0.01). We found limited fMRI evidence to support analytic and nonanalytic reasoning theory, as our results indicate functional differences with correct vs. incorrect answers and guessing vs. not guessing. However, our findings did not suggest one consistent fMRI activation pattern of internal medicine expertise. This model of employing fMRI correlates offers opportunities to enhance our understanding of theory, as well as improve our teaching and assessment of clinical reasoning, a key outcome of medical education.

  9. Space proton transport in one dimension

    NASA Technical Reports Server (NTRS)

    Lamkin, S. L.; Khandelwal, G. S.; Shinn, J. L.; Wilson, J. W.

    1994-01-01

    An approximate evaluation procedure is derived for a second-order theory of coupled nucleon transport in one dimension. An analytical solution with a simplified interaction model is used to determine quadrature parameters to minimize truncation error. Effects of the improved method on transport solutions with the BRYNTRN data base are evaluated. Comparisons with Monte Carlo benchmarks are given. Using different shield materials, the computational procedure is used to study the physics of space protons. A transition effect occurs in tissue near the shield interface and is most important in shields of high atomic number.

  10. SRC-I demonstration plant analytical laboratory methods manual. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klusaritz, M.L.; Tewari, K.C.; Tiedge, W.F.

    1983-03-01

    This manual is a compilation of analytical procedures required for operation of a Solvent-Refined Coal (SRC-I) demonstration or commercial plant. Each method reproduced in full includes a detailed procedure, a list of equipment and reagents, safety precautions, and, where possible, a precision statement. Procedures for the laboratory's environmental and industrial hygiene modules are not included. Required American Society for Testing and Materials (ASTM) methods are cited, and ICRC's suggested modifications to these methods for handling coal-derived products are provided.

  11. Background Contamination by Coplanar Polychlorinated Biphenyls (PCBS) in Trace Level High Resolution Gas Chromatography/High Resolution Mass Spectrometry (HRGC/HRMS) Analytical Procedures

    EPA Science Inventory

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...

  12. High-throughput biomonitoring of dioxins and polychlorinated biphenyls at the sub-picogram level in human serum.

    PubMed

    Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin

    2006-10-13

    We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small-size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and successive series of samples are being produced. Four analysts are required to ensure proper performance of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability regarding the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.

  13. Linearized semiclassical initial value time correlation functions with maximum entropy analytic continuation.

    PubMed

    Liu, Jian; Miller, William H

    2008-09-28

    The maximum entropy analytic continuation (MEAC) method is used to extend the range of accuracy of the linearized semiclassical initial value representation (LSC-IVR)/classical Wigner approximation for real time correlation functions. LSC-IVR provides a very effective "prior" for the MEAC procedure since it is very good for short times, exact for all time and temperature for harmonic potentials (even for correlation functions of nonlinear operators), and becomes exact in the classical high temperature limit. This combined MEAC+LSC/IVR approach is applied here to two highly nonlinear dynamical systems, a pure quartic potential in one dimension and liquid para-hydrogen at two thermal state points (25 and 14 K under nearly zero external pressure). The former example shows the MEAC procedure to be a very significant enhancement of the LSC-IVR for correlation functions of both linear and nonlinear operators, and especially at low temperature where semiclassical approximations are least accurate. For liquid para-hydrogen, the LSC-IVR is seen already to be excellent at T=25 K, but the MEAC procedure produces a significant correction at the lower temperature (T=14 K). Comparisons are also made as to how the MEAC procedure is able to provide corrections for other trajectory-based dynamical approximations when used as priors.

  14. A deterministic model of electron transport for electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Bünger, J.; Richter, S.; Torrilhon, M.

    2018-01-01

    Within the last decades, significant improvements in the spatial resolution of electron probe microanalysis (EPMA) were obtained by instrumental enhancements. In contrast, the quantification procedures essentially remained unchanged. As the classical procedures assume either homogeneity or a multi-layered structure of the material, they limit the spatial resolution of EPMA. The possibilities of improving the spatial resolution through more sophisticated quantification procedures are therefore almost untouched. We investigate a new analytical model (M1-model) for the quantification procedure based on fast and accurate modelling of electron-X-ray-matter interactions in complex materials using a deterministic approach to solve the electron transport equations. We outline the derivation of the model from the Boltzmann equation for electron transport using the method of moments with a minimum entropy closure and present first numerical results for three different test cases (homogeneous, thin film and interface). Taking Monte Carlo as a reference, the results for the three test cases show that the M1-model is able to reproduce the electron dynamics in EPMA applications very well. Compared to classical analytical models like XPP and PAP, the M1-model is more accurate and far more flexible, which indicates the potential of deterministic models of electron transport to further increase the spatial resolution of EPMA.

  15. Gas chromatographic-mass spectrometric characterisation of plant gums in samples from painted works of art.

    PubMed

    Bonaduce, Ilaria; Brecoulaki, Hariclia; Colombini, Maria Perla; Lluveras, Anna; Restivo, Vincenzo; Ribechini, Erika

    2007-12-21

    This paper presents an analytical GC-MS procedure to study the chemical composition of plant gums, determining aldoses and uronic acids in one step. The procedure is based on the silylation of aldoses and uronic acids, released from plant gums by microwave-assisted hydrolysis, and previously converted into the corresponding diethyl-dithioacetals and diethyl-dithioacetal lactones. Using this method, only one peak for each compound is obtained, thus providing simple and highly reproducible chromatograms. The analytical procedure was optimised using reference samples of raw plant gums (arabic, karaya, ghatti, guar, locust bean and tragacanth, cherry, plum and peach gums), commercial watercolours and paint layers prepared according to ancient recipes at the Opificio delle Pietre Dure of Florence (Italy). To identify gum media in samples of unknown composition, a decision scheme for gum identification and principal component analysis of the relative sugar percentage contents were employed. The procedure was used to study samples collected from wall paintings from Macedonian tombs (4th-3rd centuries BC) and from the Mycenaean "Palace of Nestor" (13th century BC) in Pylos, Greece. The presence of carbohydrates was ascertained and plant gum binders (fruit and a mixture of tragacanth and fruit tree gums) were identified in some of the samples.

  16. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
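
    A minimal sketch of the ratio-chromatogram idea: take the point-by-point ratio of two baseline-corrected chromatograms over a window of pure elution and combine it as a variance-weighted average. The synthetic Gaussian peak, noise variance and window are illustrative assumptions, not data from the paper.

```python
import numpy as np

def ratio_value(chrom1, chrom2, window, noise_var):
    """Point-by-point ratio of two baseline-corrected chromatograms over a window
    of pure analyte elution, combined as a variance-weighted average so that
    low signal-to-noise points contribute less."""
    c1, c2 = chrom1[window], chrom2[window]
    r = c2 / c1
    # first-order variance propagation for each point of the ratio
    var_r = noise_var * (1.0 / c1**2 + (c2**2) / c1**4)
    w = 1.0 / var_r
    return np.sum(w * r) / np.sum(w)

# Synthetic example: a Gaussian peak whose concentration rises by 25% between injections.
t = np.linspace(0, 10, 500)
peak = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)
rng = np.random.default_rng(2)
noise_var = 1e-4
chrom1 = 1.00 * peak + rng.normal(0, noise_var**0.5, t.size)
chrom2 = 1.25 * peak + rng.normal(0, noise_var**0.5, t.size)

window = slice(230, 270)   # region of pure elution of the analyte
print(f"concentration ratio ~ {ratio_value(chrom1, chrom2, window, noise_var):.3f}")
```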

  17. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between the current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  18. Electrostatic potential map modelling with COSY Infinity

    NASA Astrophysics Data System (ADS)

    Maloney, J. A.; Baartman, R.; Planche, T.; Saminathan, S.

    2016-06-01

    COSY Infinity (Makino and Berz, 2005) is a differential-algebra-based simulation code which allows accurate calculation of transfer maps to arbitrary order. COSY's existing internal procedures were modified to allow electrostatic elements to be specified using an array of field potential data from the midplane. Additionally, a new procedure was created allowing electrostatic elements and their fringe fields to be specified by an analytic function. This allows greater flexibility in accurately modelling electrostatic elements and their fringe fields. Applied examples of these new procedures are presented, including the modelling of a shunted electrostatic multipole designed with OPERA, a spherical electrostatic bender, and the effects of differently shaped apertures in an electrostatic beam line.

  19. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of the tumor sample for downstream NGS-based mutational analysis is critical to generating accurate results and guiding therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed and paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  20. Design and construction of an Offner spectrometer based on geometrical analysis of ring fields.

    PubMed

    Kim, Seo Hyun; Kong, Hong Jin; Lee, Jong Ung; Lee, Jun Ho; Lee, Jai Hoon

    2014-08-01

    A method to obtain an aberration-corrected Offner spectrometer without ray obstruction is proposed. A new, more efficient spectrometer optics design is suggested in order to increase its spectral resolution. The derivation of a new ring equation to eliminate ray obstruction is based on geometrical analysis of the ring fields for various numerical apertures. The analytical design applying this equation was demonstrated using the optical design software Code V in order to manufacture a spectrometer working at wavelengths of 900-1700 nm. The simulation results show that the new concept offers an analytical initial design requiring minimal calculation time. The simulated spectrometer exhibited a modulation transfer function over 80% at the Nyquist frequency, root-mean-square spot diameters under 8.6 μm, and a spectral resolution of 3.2 nm. The final design and realization of a high-resolution Offner spectrometer were demonstrated based on the simulation results. The equation and analytical design procedure shown here can be applied to most Offner systems regardless of the wavelength range.

  1. Procedures For Microbial-Ecology Laboratory

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1993-01-01

    The Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on the routine technical procedures to be followed in a microbiological laboratory to ensure safety, analytical control, and validity of results.

  2. Meta-Analysis of Single-Case Design Research: Introduction to the Special Issue

    ERIC Educational Resources Information Center

    Burns, Matthew K.

    2012-01-01

    Single-case design (SCD) research focuses on finding powerful effects, but the influence of this methodology on the evidence-based practice (EBP) movement is questionable. Meta-analytic procedures may help facilitate the role of SCD research in the EBP movement, but meta-analyses of SCDs are controversial. The current article provides an…

  3. An Analytical Chemistry Experiment in Simultaneous Spectrophotometric Determination of Fe(III) and Cu(II) with Hexacyanoruthenate(II) Reagent.

    ERIC Educational Resources Information Center

    Mehra, M. C.; Rioux, J.

    1982-01-01

    Experimental procedures, typical observations, and results for the simultaneous analysis of Fe(III) and Cu(II) in a solution are discussed. The method is based on selective interaction between the two ions and potassium hexacyanoruthenate(II) in acid solution involving no preliminary sample preparations. (Author/JN)

  4. Evaluation of a proposed standardized analytical method for the determination of humic and fulvic acids in commercial products

    USDA-ARS?s Scientific Manuscript database

    A constraint to growth of the commercial humic products industry has been the lack of a widely accepted procedure for determining humic acid and fulvic acid concentrations of the products, which has raised regulatory issues. On behalf of the U.S.-based Humic Products Trade Association, we developed ...

  5. Profiling Differences in Achievement and Social Goals of Students at Different Levels of Expertise

    ERIC Educational Resources Information Center

    O'Malley, Patricia Tenowich; Sonnenschein, Susan

    2010-01-01

    The purpose of this study was to integrate domain-learning theory and goal theory to investigate the learning processes, achievement goals, social goals, and achievement of 141 college students. Cluster-analytic procedures were used to categorize participants at different levels of expertise based on their responses on knowledge, interest, and…

  6. MULTIRESIDUE DETERMINATION OF ACIDIC PESTICIDES IN WATER BY HPLC/DAD WITH CONFIRMATION BY GC/MS USING CONVERSION TO THE METHYL ESTER WITH TRIMETHYLSILYDIAZOMETHANE

    EPA Science Inventory

    A multiresidue pesticide methodology has been studied and results for acidics are reported here with base/neutral to follow. This work studies a literature procedure as a possible general approach to many pesticides and potentially other analytes that are considered to be liquid...

  7. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The analytical basis of all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable specific differences within the sample preparation procedures used in the three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water-soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods were compared for equivalence. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
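
    The statistical comparison described here is a one-way ANOVA on log-transformed concentrations, which can be sketched as follows; the numerical values and group sizes are invented for illustration and do not reproduce the study's measurements.

```python
import numpy as np
from scipy.stats import f_oneway

# Illustrative side-by-side Cr(VI) results (ug/m^3) for the three methods at one
# operation; real values spanned roughly 0.6-960 ug/m^3 across the two sites.
osha_id215 = np.array([12.1, 15.4, 9.8, 20.3, 14.7])
niosh_7605 = np.array([11.5, 16.0, 10.2, 19.1, 15.3])
niosh_7703 = np.array([13.0, 14.2, 9.1, 21.0, 13.8])

# Compare the means of the log-transformed concentrations, as in the study.
F, p = f_oneway(np.log(osha_id215), np.log(niosh_7605), np.log(niosh_7703))
print(f"ANOVA on log concentrations: F = {F:.3f}, p = {p:.3f}")
```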

  8. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  9. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
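
    As a rough illustration of ontology-driven retrieval of analytical goals and techniques, the sketch below builds a tiny RDF graph with rdflib and queries it with SPARQL. The class and property names are invented placeholders, not the DM³ ontology itself.

        from rdflib import Graph, Namespace

        EX = Namespace("http://example.org/kdda#")   # hypothetical namespace
        g = Graph()

        # Hypothetical assertions: an ERM objective maps to an analytical goal,
        # which is supported by candidate analytical techniques.
        g.add((EX.ExposureShiftObjective, EX.hasAnalyticalGoal, EX.Classification))
        g.add((EX.Classification, EX.supportedBy, EX.DecisionTree))
        g.add((EX.Classification, EX.supportedBy, EX.LogisticRegression))

        query = """
        SELECT ?goal ?technique WHERE {
            ex:ExposureShiftObjective ex:hasAnalyticalGoal ?goal .
            ?goal ex:supportedBy ?technique .
        }
        """
        for goal, technique in g.query(query, initNs={"ex": EX}):
            print(goal.split("#")[-1], "->", technique.split("#")[-1])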

  10. SAM Companion Documents

    EPA Pesticide Factsheets

    SAM Companion Documents and Sample Collection Procedures provide information intended to complement the analytical methods listed in Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  11. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first 2015 round of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated, 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results had |Z-scores| ≤ 3.
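
    For reference, a proficiency-test Z-score of the kind summarised above is the laboratory result minus the assigned value, divided by the standard deviation for proficiency assessment. A short sketch with invented numbers:

        def z_score(lab_value, assigned_value, sigma_p):
            """Proficiency-test Z-score for one element in one sample."""
            return (lab_value - assigned_value) / sigma_p

        # Hypothetical result for, say, Zn in a soil sample (mg/kg)
        z = z_score(lab_value=102.0, assigned_value=95.0, sigma_p=8.0)
        print(f"Z = {z:.2f}")   # |Z| <= 2 is usually satisfactory, |Z| <= 3 questionable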

  12. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  13. Determining a carbohydrate profile for Hansenula polymorpha

    NASA Technical Reports Server (NTRS)

    Petersen, G. R.

    1985-01-01

    The determination of the levels of carbohydrates in the yeast Hansenula polymorpha required the development of new analytical procedures. Existing fractionation and analytical methods were adapted to deal with the problems involved with the lysis of whole cells. Using these new procedures, the complete carbohydrate profiles of H. polymorpha and selected mutant strains were determined and shown to correlate favourably with previously published results.

  14. Performance-based, cost- and time-effective pcb analytical methodology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarado, J. S.

    1998-06-11

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  15. Determination of lead in bone tissues by axially viewed inductively coupled plasma multichannel-based emission spectrometry.

    PubMed

    Grotti, Marco; Abelmoschi, Maria Luisa; Dalla Riva, Simona; Soggia, Francesco; Frache, Roberto

    2005-04-01

    A new procedure for determining low levels of lead in bone tissues has been developed. After wet acid digestion in a pressurized microwave-heated system, the solution was analyzed by inductively coupled plasma multichannel-based emission spectrometry. Internal standardization using the Co 228.615 nm reference line was chosen as the optimal method to compensate for the matrix effects from the presence of calcium and nitric acid at high concentration levels. The detection limit of the procedure was 0.11 microg Pb g(-1) dry mass. Instrumental precision at the analytical concentration of approximately 10 microg l(-1) ranged from 6.1 to 9.4%. Precision of the sample preparation step was 5.4%. The concentration of lead in SRM 1486 (1.32+/-0.04 microg g(-1)) found using the new procedure was in excellent agreement with the certified level (1.335+/-0.014 microg g(-1)). Finally, the method was applied to determine the lead in various fish bone tissues, and the analytical results were found to be in good agreement with those obtained through differential pulse anodic stripping voltammetry. The method is therefore suitable for the reliable determination of lead at concentration levels of below 1 microg g(-1) in bone samples. Moreover, the multi-element capability of the technique allows us to simultaneously determine other major or trace elements in order to investigate inter-element correlation and to compute enrichment factors, making the proposed procedure particularly useful for investigating lead occurrence and pathways in fish bone tissues in order to find suitable biomarkers for the Antarctic marine environment.
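
    The internal standardization described above amounts to ratioing the analyte line to the Co 228.615 nm reference line so that multiplicative matrix effects largely cancel. A simplified sketch with invented intensities, not the published measurements:

        import numpy as np

        # Hypothetical intensities for Pb calibration standards and the Co internal standard
        pb_std = np.array([1200., 2400., 4800., 9600.])
        co_std = np.array([5000., 5100., 4950., 5050.])
        conc_std = np.array([5., 10., 20., 40.])        # Pb, microg/l

        slope, intercept = np.polyfit(conc_std, pb_std / co_std, 1)

        # Sample in a calcium/nitric acid matrix that depresses both lines similarly
        pb_sample, co_sample = 2100., 4400.
        conc_sample = (pb_sample / co_sample - intercept) / slope
        print(f"Pb = {conc_sample:.1f} microg/l")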

  16. Selecting Statistical Quality Control Procedures for Limiting the Impact of Increases in Analytical Random Error on Patient Safety.

    PubMed

    Yago, Martín

    2017-05-01

    QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
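
    A hedged sketch of the underlying idea: if within-run imprecision inflates by a factor k, the fraction of results exceeding the allowable total error grows, and how fast depends on the method's sigma capability (allowable error divided by the in-control SD). This is a generic illustration for an unbiased method, not the authors' exact model or charts.

        from statistics import NormalDist

        def fraction_unacceptable(sigma_metric, k):
            """P(|error| > TEa) when the in-control SD is inflated by factor k."""
            z = sigma_metric / k          # TEa expressed in inflated-SD units
            return 2 * (1 - NormalDist().cdf(z))

        for k in (1.0, 1.5, 2.0, 3.0):
            print(f"k = {k:.1f}: defect fraction = {fraction_unacceptable(4.0, k):.4%}")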

  17. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
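
    The "analytic interaction" approach the paper moves beyond is the familiar product-term regression. A small simulated example of that conventional test, for contrast with the broader framework proposed above:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        x = rng.normal(size=n)            # predictor
        m = rng.normal(size=n)            # moderator
        y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + rng.normal(size=n)

        X = sm.add_constant(np.column_stack([x, m, x * m]))
        fit = sm.OLS(y, X).fit()
        print(fit.summary(xname=["const", "x", "m", "x:m"]))
        # A significant x:m coefficient is the usual (narrow) evidence of moderation.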

  18. Flow-batch analysis of clenbuterol based on analyte extraction on molecularly imprinted polymers coupled to an in-system chromogenic reaction. Application to human urine and milk substitute samples.

    PubMed

    González, Natalia; Grünhut, Marcos; Šrámková, Ivana; Lista, Adriana G; Horstkotte, Burkhard; Solich, Petr; Sklenářová, Hana; Acebal, Carolina C

    2018-02-01

    A fully automated spectrophotometric method based on flow-batch analysis has been developed for the determination of clenbuterol including an on-line solid phase extraction using a molecularly imprinted polymer (MIP) as the sorbent. The molecularly imprinted solid phase extraction (MISPE) procedure allowed analyte extraction from complex matrices at low concentration levels and with high selectivity towards the analyte. The MISPE procedure was performed using a commercial MIP cartridge that was introduced into a guard column holder and integrated in the analyzer system. Optimized parameters included the volume of the sample, the type and volume of the conditioning and washing solutions, and the type and volume of the eluent. Quantification of clenbuterol was carried out by spectrophotometry after in-system post-elution analyte derivatization based on azo-coupling using N-(1-naphthyl)ethylenediamine as the coupling agent to yield a red-colored compound with maximum absorbance at 500 nm. Both the chromogenic reaction and spectrophotometric detection were performed in a lab-made flow-batch mixing chamber that replaced the cuvette holder of the spectrophotometer. The calibration curve was linear in the 0.075-0.500 mg L(-1) range with a correlation coefficient of 0.998. The precision of the proposed method was evaluated in terms of the relative standard deviation, obtaining 1.1% and 3.0% for intra-day precision and inter-day precision, respectively. The detection limit was 0.021 mg L(-1) and the sample throughput for the entire process was 3.4 h(-1). The proposed method was applied for the determination of clenbuterol in human urine and milk substitute samples, obtaining recovery values within a range of 94.0-100.0%. Copyright © 2017 Elsevier B.V. All rights reserved.
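
    The calibration and detection-limit figures quoted above can be reproduced in outline with a least-squares fit; the sketch below uses invented absorbances, not the published data, and takes the LOD as 3.3 times the residual standard deviation over the slope.

        import numpy as np

        conc = np.array([0.075, 0.10, 0.20, 0.30, 0.40, 0.50])        # mg/l standards
        absorb = np.array([0.061, 0.080, 0.158, 0.240, 0.322, 0.400])

        slope, intercept = np.polyfit(conc, absorb, 1)
        pred = slope * conc + intercept
        r = np.corrcoef(conc, absorb)[0, 1]

        s_res = np.sqrt(np.sum((absorb - pred) ** 2) / (len(conc) - 2))
        lod = 3.3 * s_res / slope
        print(f"slope = {slope:.3f}, r = {r:.4f}, LOD = {lod:.3f} mg/l")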

  19. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
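
    The verification step mentioned above, checking semi-analytical sensitivities against finite differences, can be illustrated with a toy objective function; the function and variables below are stand-ins, not the parabolized Navier-Stokes analysis.

        import numpy as np

        def objective(x):
            # Hypothetical smooth response of two design variables
            return np.sin(x[0]) * x[1] ** 2 + 0.5 * x[0] * x[1]

        def analytical_gradient(x):
            return np.array([np.cos(x[0]) * x[1] ** 2 + 0.5 * x[1],
                             2.0 * np.sin(x[0]) * x[1] + 0.5 * x[0]])

        def finite_difference_gradient(x, h=1e-6):
            g = np.zeros_like(x)
            for i in range(len(x)):
                xp, xm = x.copy(), x.copy()
                xp[i] += h
                xm[i] -= h
                g[i] = (objective(xp) - objective(xm)) / (2 * h)   # central difference
            return g

        x0 = np.array([0.7, 1.3])
        print("analytical:", analytical_gradient(x0))
        print("central FD:", finite_difference_gradient(x0))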

  20. Determination of steroid hormones and related compounds in filtered and unfiltered water by solid-phase extraction, derivatization, and gas chromatography with tandem mass spectrometry

    USGS Publications Warehouse

    Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.; Barber, Larry B.

    2012-01-01

    A new analytical method has been developed and implemented at the U.S. Geological Survey National Water Quality Laboratory that determines a suite of 20 steroid hormones and related compounds in filtered water (using laboratory schedule 2434) and in unfiltered water (using laboratory schedule 4434). This report documents the procedures and initial performance data for the method and provides guidance on application of the method and considerations of data quality in relation to data interpretation. The analytical method determines 6 natural and 3 synthetic estrogen compounds, 6 natural androgens, 1 natural and 1 synthetic progestin compound, and 2 sterols: cholesterol and 3-beta-coprostanol. These two sterols have limited biological activity but typically are abundant in wastewater effluents and serve as useful tracers. Bisphenol A, an industrial chemical used primarily to produce polycarbonate plastic and epoxy resins, and which has been shown to have estrogenic activity, also is determined by the method. A technique referred to as isotope-dilution quantification is used to improve quantitative accuracy by accounting for sample-specific procedural losses in the determined analyte concentration. Briefly, deuterium- or carbon-13-labeled isotope-dilution standards (IDSs), all of which are direct or chemically similar isotopic analogs of the method analytes, are added to all environmental and quality-control and quality-assurance samples before extraction. Method analytes and IDS compounds are isolated from filtered or unfiltered water by solid-phase extraction onto an octadecylsilyl disk, overlain with a graded glass-fiber filter to facilitate extraction of unfiltered sample matrices. The disks are eluted with methanol, and the extract is evaporated to dryness, reconstituted in solvent, passed through a Florisil solid-phase extraction column to remove polar organic interferences, and again evaporated to dryness in a reaction vial. The method compounds are reacted with activated N-methyl-N-(trimethylsilyl)trifluoroacetamide at 65 degrees Celsius for 1 hour to form trimethylsilyl or trimethylsilyl-enol ether derivatives that are more amenable to gas chromatographic separation than the underivatized compounds. Analysis is carried out by gas chromatography with tandem mass spectrometry using calibration standards that are derivatized concurrently with the sample extracts. Analyte concentrations are quantified relative to specific IDS compounds in the sample, which directly compensate for procedural losses (incomplete recovery) in the determined and reported analyte concentrations. Thus, reported analyte concentrations (or analyte recoveries for spiked samples) are corrected based on recovery of the corresponding IDS compound during the quantification process. Recovery for each IDS compound is reported for each sample and represents an absolute recovery in a manner comparable to surrogate recoveries for other organic methods used by the National Water Quality Laboratory. Thus, IDS recoveries provide a useful tool for evaluating sample-specific analytical performance from an absolute mass recovery standpoint. IDS absolute recovery will differ and typically be lower than the corresponding analyte’s method recovery in spiked samples. However, additional correction of reported analyte concentrations is unnecessary and inappropriate because the analyte concentration (or recovery) already is compensated for by the isotope-dilution quantification procedure.
Method analytes were spiked at 10 and 100 nanograms per liter (ng/L) for most analytes (10 times greater spike levels were used for bisphenol A and 100 times greater spike levels were used for 3-beta-coprostanol and cholesterol) into the following validation-sample matrices: reagent water, wastewater-affected surface water, a secondary-treated wastewater effluent, and a primary (no biological treatment) wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100 percent, with overall relative standard deviation of 28 percent. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples and analyzed in 2009–2010 ranged from 84–104 percent, with relative standard deviations of 6–36 percent. Concentrations for two analytes, equilin and progesterone, are reported as estimated because these analytes had excessive bias or variability, or both. Additional database coding is applied to other reported analyte data as needed, based on sample-specific IDS recovery performance. Detection levels were derived statistically by fortifying reagent water at six different levels (0.1 to 4 ng/L) and range from about 0.4 to 4 ng/L for 16 analytes. Interim reporting levels applied to analytes in this report range from 0.8 to 8 ng/L. Bisphenol A and the sterols (cholesterol and 3-beta-coprostanol) were consistently detected in laboratory and field blanks. The minimum reporting levels were set at 100 ng/L for bisphenol A and at 200 ng/L for the two sterols to prevent any bias associated with the presence of these compounds in the blanks. A minimum reporting level of 2 ng/L was set for 11-ketotestosterone to minimize false positive risk from an interfering siloxane compound emanating as chromatographic-column bleed, from vial septum material, or from other sources at no more than 1 ng/L.
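
    In outline, the isotope-dilution quantification described above ratios the analyte response to its labeled standard, so procedural losses cancel from the reported concentration while the IDS absolute recovery is still reported as a performance check. The sketch below is a simplified, hypothetical illustration, not the method's actual calculation scheme; all values and the relative response factor are invented.

        def isotope_dilution_conc(analyte_area, ids_area, rrf, ids_spiked_ng, sample_volume_l):
            """Concentration (ng/L) from the analyte/IDS area ratio and a relative response factor."""
            mass_ng = (analyte_area / ids_area) / rrf * ids_spiked_ng
            return mass_ng / sample_volume_l

        def ids_absolute_recovery(ids_area, ids_area_expected):
            """Fraction of the spiked IDS recovered through the whole procedure."""
            return ids_area / ids_area_expected

        conc = isotope_dilution_conc(analyte_area=5400, ids_area=18000, rrf=1.1,
                                     ids_spiked_ng=50.0, sample_volume_l=1.0)
        rec = ids_absolute_recovery(ids_area=18000, ids_area_expected=30000)
        print(f"reported concentration = {conc:.1f} ng/L, IDS recovery = {rec:.0%}")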

  1. Genetics-based methods for detection of Salmonella spp. in foods.

    PubMed

    Mozola, Mark A

    2006-01-01

    Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.

  2. Evaluation of Transverse Thermal Stresses in Composite Plates Based on First-Order Shear Deformation Theory

    NASA Technical Reports Server (NTRS)

    Rolfes, R.; Noor, A. K.; Sparr, H.

    1998-01-01

    A postprocessing procedure is presented for the evaluation of the transverse thermal stresses in laminated plates. The analytical formulation is based on the first-order shear deformation theory and the plate is discretized by using a single-field displacement finite element model. The procedure is based on neglecting the derivatives of the in-plane forces and the twisting moments, as well as the mixed derivatives of the bending moments, with respect to the in-plane coordinates. The calculated transverse shear stiffnesses reflect the actual stacking sequence of the composite plate. The distributions of the transverse stresses through-the-thickness are evaluated by using only the transverse shear forces and the thermal effects resulting from the finite element analysis. The procedure is implemented into a postprocessing routine which can be easily incorporated into existing commercial finite element codes. Numerical results are presented for four- and ten-layer cross-ply laminates subjected to mechanical and thermal loads.

  3. An analytical method for free vibration analysis of functionally graded beams with edge cracks

    NASA Astrophysics Data System (ADS)

    Wei, Dong; Liu, Yinghua; Xiang, Zhihai

    2012-03-01

    In this paper, an analytical method is proposed for solving the free vibration of cracked functionally graded material (FGM) beams with axial loading, rotary inertia and shear deformation. The governing differential equations of motion for an FGM beam are established and the corresponding solutions are found first. The discontinuity of rotation caused by the cracks is simulated by means of the rotational spring model. Based on the transfer matrix method, the recurrence formula is then developed to obtain the eigenvalue equations of free vibration of FGM beams. The main advantage of the proposed method is that the eigenvalue equation for vibrating beams with an arbitrary number of cracks can be conveniently determined from a third-order determinant. Due to the decrease in the determinant order as compared with previous methods, the developed method is simpler and more convenient to analytically solve the free vibration problem of cracked FGM beams. Moreover, free vibration analyses of the Euler-Bernoulli and Timoshenko beams with any number of cracks can be conducted using the unified procedure based on the developed method. These advantages of the proposed procedure become more pronounced as the number of cracks increases. A comprehensive analysis is conducted to investigate the influences of the location and total number of cracks, material properties, axial load, inertia and end supports on the natural frequencies and vibration mode shapes of FGM beams. The present work may be useful for the design and control of damaged structures.

  4. An alternative analytical method based on ultrasound micro bath hydrolysis and GC-MS analysis for the characterization of organic biomarkers in archaeological ceramics.

    PubMed

    Blanco-Zubiaguirre, Laura; Olivares, Maitane; Castro, Kepa; Iñañez, Javier G; Madariaga, Juan Manuel

    2016-11-01

    The analysis of organic biomarkers in ancient and valuable archaeological remains provides a worthwhile source of information regarding their management. This work was focused on the development of an analytical procedure to characterize organic residues that have remained in archaeological ceramic samples. A novel analytical approach based on an alkaline hydrolysis by means of an ultrasound micro bath followed by liquid extraction was proposed to isolate saturated and unsaturated fatty acids, degradation products such as dihydroxy acids or dienoic fatty acids, isoprenoid fatty acids, and many other biomarkers from archaeological remains. This main goal has been achieved after the optimization of the main parameters affecting the hydrolysis step, the extraction procedure, and the derivatization step prior to the gas chromatography-mass spectrometry analysis. In this work, archaeological ceramic remains suspected to have been used by Basque Whalers to store whale oil in the period from the sixteenth to the seventeenth century were studied. Nevertheless, the proposed method is useful to determine the organic remains preserved in many other archaeological ceramic remains. Moreover, this methodology can be used to determine organic remains in any porous ceramic, archaeological or not. The preliminary results of the analysis of ceramic vessels led to the determination of some interesting unsaturated compounds such as 11-eicosenoic acid, an important biomarker of marine commodities, and several saturated fatty acids, which could be indicative of having used the vessels to store whale oil.

  5. Dextroamphetamine: a pharmacologic countermeasure for space motion sickness and orthostatic dysfunction

    NASA Technical Reports Server (NTRS)

    Snow, L. Dale

    1996-01-01

    Dextroamphetamine has potential as a pharmacologic agent for the alleviation of two common health effects associated with microgravity. As an adjuvant to Space Motion Sickness (SMS) medication, dextroamphetamine can enhance treatment efficacy by reducing undesirable Central Nervous System (CNS) side effects of SMS medications. Secondly, dextroamphetamine may be useful for the prevention of symptoms of post-mission orthostatic intolerance caused by cardiovascular deconditioning during spaceflight. There is interest in developing an intranasal delivery form of dextroamphetamine for use as a countermeasure in microgravity conditions. Development of this dosage form will require an analytical detection method with sensitivity in the low ng range (1 to 100 ng/mL). During the 1995 Summer Faculty Fellowship Program, two analytical methods were developed and evaluated for their suitability as quantitative procedures for dextroamphetamine in studies of product stability, bioavailability assessment, and pharmacokinetic evaluation. In developing some of the analytical methods, beta-phenylethylamine, a primary amine structurally similar to dextroamphetamine, was used. The first analytical procedure to be evaluated involved hexane extraction and subsequent fluorescamine labeling of beta-phenylethylamine. The second analytical procedure to be evaluated involved quantitation of dextroamphetamine by an Enzyme-Linked ImmunoSorbent Assay (ELISA).

  6. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    NASA Astrophysics Data System (ADS)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the experiment implementation quality and analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. This study was a classroom action research conducted in three cycles. The study has been carried out with 38 undergraduate students of the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and the undergraduate students' experimental procedure. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using the normalized gain score and a paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the experiment implementation quality and the analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03, which falls in the low-gain category, as indicated by the experimental reports. Some undergraduate students had difficulties in detecting the relation of one part to another and to an overall structure. The findings suggested that giving feedback on procedural knowledge and experimental reports was important, and that revising the experimental procedure and supplementing it with scaffolding questions was also needed.
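
    The two statistics mentioned, the normalized gain and the paired t-test, can be computed as in the short sketch below; the scores are invented for illustration.

        import numpy as np
        from scipy import stats

        pre = np.array([55, 60, 48, 70, 62, 58], dtype=float)
        post = np.array([60, 66, 50, 78, 65, 63], dtype=float)
        max_score = 100.0

        # Normalized gain g = (post - pre) / (max - pre), averaged over students
        n_gain = np.mean((post - pre) / (max_score - pre))
        t_stat, p_value = stats.ttest_rel(post, pre)
        print(f"mean normalized gain = {n_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
        # Gains below about 0.3 fall in the conventional low-gain category.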

  7. An efficient and numerically stable procedure for generating sextic force fields in normal mode coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sibaev, M.; Crittenden, D. L., E-mail: deborah.crittenden@canterbury.ac.nz

    In this paper, we outline a general, scalable, and black-box approach for calculating high-order strongly coupled force fields in rectilinear normal mode coordinates, based upon constructing low order expansions in curvilinear coordinates with naturally limited mode-mode coupling, and then transforming between coordinate sets analytically. The optimal balance between accuracy and efficiency is achieved by transforming from 3 mode representation quartic force fields in curvilinear normal mode coordinates to 4 mode representation sextic force fields in rectilinear normal modes. Using this reduced mode-representation strategy introduces an error of only 1 cm(-1) in fundamental frequencies, on average, across a sizable test set of molecules. We demonstrate that if it is feasible to generate an initial semi-quartic force field in curvilinear normal mode coordinates from ab initio data, then the subsequent coordinate transformation procedure will be relatively fast with modest memory demands. This procedure facilitates solving the nuclear vibrational problem, as all required integrals can be evaluated analytically. Our coordinate transformation code is implemented within the extensible PyPES library program package, at http://sourceforge.net/projects/pypes-lib-ext/.

  8. A finite-element method for large-amplitude, two-dimensional panel flutter at hypersonic speeds

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Gray, Carl E.

    1989-01-01

    The nonlinear flutter behavior of a two-dimensional panel in hypersonic flow is investigated analytically. An FEM formulation based on unsteady third-order piston theory (Ashley and Zartarian, 1956; McIntosh, 1970) and taking nonlinear structural and aerodynamic phenomena into account is derived; the solution procedure is outlined; and typical results are presented in extensive tables and graphs. A 12-element finite-element solution obtained using an alternative method for linearizing the assumed limit-cycle time function is shown to give predictions in good agreement with classical analytical results for large-amplitude vibration in a vacuum and large-amplitude panel flutter, using linear aerodynamics.

  9. Fluorometric method for the determination of gas-phase hydrogen peroxide

    NASA Technical Reports Server (NTRS)

    Kok, Gregory L.; Lazrus, Allan L.

    1986-01-01

    The fluorometric gas-phase hydrogen peroxide procedure is based on the technique used by Lazrus et al. for the determination of H2O2 in the liquid phase. The analytical method utilizes the reaction of H2O2 with horseradish peroxidase and p-hydroxyphenylacetic acid (POPHA) to form the fluorescent dimer of POPHA. The analytical reaction responds stoichiometrically to both H2O2 and some organic hydroperoxides. To discriminate H2O2 from organic hydroperoxides, catalase is used to preferentially destroy H2O2. Using a dual-channel flow system, the H2O2 concentration is determined by difference.
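
    The dual-channel logic reduces to a difference measurement: one channel responds to all peroxides, the catalase channel to organic hydroperoxides only, and H2O2 is the difference. A minimal sketch with invented signals and an assumed calibration factor:

        def h2o2_from_dual_channel(total_signal, catalase_signal, cal_factor):
            """H2O2 amount from the difference of the two fluorescence channels."""
            return (total_signal - catalase_signal) * cal_factor

        # Hypothetical channel readings and a hypothetical calibration factor (e.g., ppbv per unit)
        print(h2o2_from_dual_channel(total_signal=0.85, catalase_signal=0.22, cal_factor=2.0))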

  10. Numerical design of streamlined tunnel walls for a two-dimensional transonic test

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Anderson, E. C.

    1978-01-01

    An analytical procedure is discussed for designing wall shapes for streamlined, nonporous, two-dimensional, transonic wind tunnels. It is based upon currently available 2-D inviscid transonic and boundary layer analysis computer programs. Predicted wall shapes are compared with experimental data obtained from the NASA Langley 6 by 19 inch Transonic Tunnel where the slotted walls were replaced by flexible nonporous walls. Comparisons are presented for the empty tunnel operating at a Mach number of 0.9 and for a supercritical test of an NACA 0012 airfoil at zero lift. Satisfactory agreement is obtained between the analytically and experimentally determined wall shapes.

  11. Analytical solutions for systems of partial differential-algebraic equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2014-01-01

    This work presents the application of the power series method (PSM) to find solutions of partial differential-algebraic equations (PDAEs). Two systems of index-one and index-three are solved to show that PSM can provide analytical solutions of PDAEs in convergent series form. What is more, we present the post-treatment of the power series solutions with the Laplace-Padé (LP) resummation method as a useful strategy to find exact solutions. The main advantage of the proposed methodology is that the procedure is based on a few straightforward steps and it does not generate secular terms or depend on a perturbation parameter.
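
    As a toy illustration of the two ingredients, a truncated power series followed by Padé resummation, the sketch below treats the simple function exp(-x) rather than a PDAE; the point is only the mechanics of the post-treatment, here with a plain Padé approximant standing in for the Laplace-Padé variant.

        from mpmath import mp, taylor, pade, polyval, exp

        mp.dps = 25
        coeffs = taylor(lambda x: exp(-x), 0, 8)     # Taylor coefficients about x = 0

        p, q = pade(coeffs, 4, 4)                    # [4/4] Padé approximant
        x = 3.0
        approx = polyval(p[::-1], x) / polyval(q[::-1], x)
        print(approx, exp(-x))   # the rational form typically extends the useful range of the series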

  12. Contamination in food from packaging material.

    PubMed

    Lau, O W; Wong, S K

    2000-06-16

    Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, anti-static and anti-blocking agents, have also been developed to improve the performance of polymeric packaging materials. Recently, packaging itself has been found to represent a source of contamination through the migration of substances from the packaging into food. Various analytical methods have been developed to analyze the migrants in the foodstuff, and migration evaluation procedures based on theoretical prediction of migration from plastic food contact material were also introduced recently. In this paper, the regulatory control, analytical methodology, factors affecting the migration and migration evaluation are reviewed.

  13. Delay-tunable gap-soliton-based slow-light system

    NASA Astrophysics Data System (ADS)

    Mok, Joe T.; de Sterke, C. Martijn; Eggleton, Benjamin J.

    2006-12-01

    We numerically and analytically evaluate the delay of solitons propagating slowly, and without broadening, in an apodized Bragg grating. Simulations indicate that a 100 mm Bragg grating with Δn = 10⁻³ can delay sub-nanosecond pulses by nearly 20 pulse widths without any change in the output pulse width. Delay tunability is achieved by simultaneously adjusting the launch power and detuning. A simple analytic model is developed to describe the monotonic dependence of delay on Δn and compared with simulations. As the intensity may be greatly enhanced due to a reduced velocity, a procedure for improving the delay while avoiding material damage is outlined.

  14. Parametric study of minimum reactor mass in energy-storage dc-to-dc converters

    NASA Technical Reports Server (NTRS)

    Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.

    1981-01-01

    Closed-form analytical solutions for the design equations of a minimum-mass reactor for a two-winding voltage-or-current step-up converter are derived. A quantitative relationship between the three parameters - minimum total reactor mass, maximum output power, and switching frequency - is extracted from these analytical solutions. The validity of the closed-form solution is verified by a numerical minimization procedure. A computer-aided design procedure using commercially available toroidal cores and magnet wires is also used to examine how the results from practical designs follow the predictions of the analytical solutions.

  15. X-Graphs: Language and Algorithms for Heterogeneous Graph Streams

    DTIC Science & Technology

    2017-09-01

    [Only report front matter and section titles are recoverable from this record.] The listed contents include: Methods, Assumptions, and Procedures; Software Abstractions for Graph Analytic Applications; High-Performance Platforms for Graph Processing (with graph data stored in a distributed file system); and implementations of novel methods for network analysis, including detection of overlapping communities, personalized PageRank, and node embeddings.

  16. Lab-on-chip systems for integrated bioanalyses

    PubMed Central

    Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia

    2016-01-01

    Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042

  17. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  18. Simple and fast analysis of tetrabromobisphenol A, hexabromocyclododecane isomers, and polybrominated diphenyl ethers in serum using solid-phase extraction or QuEChERS extraction followed by tandem mass spectrometry coupled to HPLC and GC.

    PubMed

    Li, Jian; Chen, Tian; Wang, Yuwei; Shi, Zhixiong; Zhou, Xianqing; Sun, Zhiwei; Wang, Dejun; Wu, Yongning

    2017-02-01

    Two simplified sample preparation procedures for simultaneous extraction and clean-up of tetrabromobisphenol A, α-, β-, and γ-hexabromocyclododecane and polybrominated diphenyl ethers in human serum were developed and validated. The first procedure was based on solid-phase extraction. Sample extraction, purification, and lipid removal were carried out directly on an Oasis HLB cartridge. The second procedure was a quick, easy, cheap, effective, rugged, and safe (QuEChERS)-based approach using octadecyl-modified silica particles as a sorbent. After sample extraction and cleanup, tetrabromobisphenol A/hexabromocyclododecane was separated from polybrominated diphenyl ethers by using a Si-based cartridge. Tetrabromobisphenol A and hexabromocyclododecane were then detected by high-performance liquid chromatography coupled to tandem mass spectrometry, while polybrominated diphenyl ethers were detected by gas chromatography coupled to tandem mass spectrometry. The results of the spike recovery test using fetal bovine serum showed that the average recoveries of the analytes ranged from 87.3 to 115.3% with relative standard deviations equal to or lower than 13.4%. Limits of detection of the analytes were in the range of 0.4-19 pg/mL except for decabromodiphenyl ether. The developed method was successfully applied to routine analysis of human serum samples from occupational workers and the general population. Extremely high serum polybrominated diphenyl ethers levels up to 3.32 × 10⁴ ng/g lipid weight were found in occupational workers. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Rapid screening procedure based on headspace solid-phase microextraction and gas chromatography-mass spectrometry for the detection of many recreational drugs in hair.

    PubMed

    Gentili, Stefano; Cornetta, Maria; Macchia, Teodora

    2004-03-05

    An increasing number of synthetic drugs are appearing on the illicit market and on the scene of drug use by youngsters. Official figures are underestimated. In addition, immunochemical tests are blind to many of these drugs and appropriate analytical procedures for routine clinical and epidemiological purposes are lacking. Therefore, the perceived increasing abuse of recreational drugs has not been proved yet. In a previous paper, we proposed a procedure for the preliminary screening of several recreational substances in hair and other biological matrices. Unfortunately, this procedure cannot be applied to cocaine. Consequently, we developed a new headspace solid-phase microextraction and gas chromatography-mass spectrometry (HS-SPME-GC-MS) procedure for the simultaneous detection of cocaine, amphetamine (A), methamphetamine (MA), methylenedioxyamphetamine (MDA), methylenedioxymethamphetamine (MDMA), methylenedioxyethylamphetamine (MDE), N-methyl-1-(1,3-benzodioxol-5-yl)-2-butanamine (MBDB), ketamine, and methadone in human hair. Hair was washed with water and acetone in an ultrasonic bath. A short acid extraction with 1M hydrochloric acid was needed; the fiber was exposed to a 5 min absorption at 90 degrees C and thermal desorption was performed at 250 degrees C for 3 min. The procedure was simple, rapid, required small quantities of sample and no derivatization. Good linearity was obtained over the 0.1-20.0 ng/mg range for the target compounds. Sensitivity was adequate: limits of detection (LOD) were 0.7 ng/mg of hair for the majority of substances. The intra-day precision ranged between 7 and 20%. This paper deals with the analytical performance of this procedure and its preliminary application to hair samples obtained on a voluntary basis from 183 young people (138 males and 45 females) in the Rome area.

  20. Development and Validation of a Fast Procedure to Analyze Amoxicillin in River Waters by Direct-Injection LC-MS/MS

    ERIC Educational Resources Information Center

    Homem, Vera; Alves, Arminda; Santos, Lu´cia

    2014-01-01

    A laboratory application with a strong component in analytical chemistry was designed for undergraduate students, in order to introduce a current problem in the environmental science field, the water contamination by antibiotics. Therefore, a simple and rapid method based on direct injection and high performance liquid chromatography-tandem mass…

  1. Determination of Hypochlorite in Bleaching Products with Flower Extracts to Demonstrate the Principles of Flow Injection Analysis

    ERIC Educational Resources Information Center

    Ramos, Luiz Antonio; Prieto, Katia Roberta; Carvalheiro, Eder Tadeu Gomes; Carvalheiro, Carla Cristina Schmitt

    2005-01-01

    Crude flower extracts were applied to demonstrate the principles of analytical chemistry automation, using a flow injection analysis (FIA) procedure developed to determine hypochlorite in household bleaching products. FIA comprises a group of techniques based on injection of a liquid sample into a moving, nonsegmented carrier stream of a…

  2. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  3. Two and three dimensional grid generation by an algebraic homotopy procedure

    NASA Technical Reports Server (NTRS)

    Moitra, Anutosh

    1990-01-01

    An algebraic method for generating two- and three-dimensional grid systems for aerospace vehicles is presented. The method is based on algebraic procedures derived from homotopic relations for blending between inner and outer boundaries of any given configuration. Stable properties of homotopic maps have been exploited to provide near-orthogonality and specified constant spacing at the inner boundary. The method has been successfully applied to analytically generated blended wing-body configurations as well as discretely defined geometries such as the High-Speed Civil Transport Aircraft. Grid examples representative of the capabilities of the method are presented.
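
    The homotopic blending at the heart of the method can be pictured as interpolating grid layers between an inner and an outer boundary with a parameter t in [0, 1]. The sketch below uses a plain linear homotopy and invented boundary curves; the actual procedure uses blending functions shaped to control spacing and near-orthogonality at the inner boundary.

        import numpy as np

        def homotopy_grid(inner, outer, n_layers):
            """Blend two boundary curves (arrays of shape (N, 2)) into n_layers grid lines."""
            layers = []
            for k in range(n_layers):
                t = k / (n_layers - 1)
                layers.append((1.0 - t) * inner + t * outer)   # linear homotopy in t
            return np.array(layers)                            # shape (n_layers, N, 2)

        theta = np.linspace(0.0, 2.0 * np.pi, 73)
        inner = np.column_stack([np.cos(theta), 0.5 * np.sin(theta)])        # body section
        outer = np.column_stack([4.0 * np.cos(theta), 4.0 * np.sin(theta)])  # far boundary
        print(homotopy_grid(inner, outer, 21).shape)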

  4. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  5. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  6. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  7. 40 CFR 1066.101 - Overview.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROCEDURES Equipment, Fuel, and Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...

  8. 40 CFR 1066.101 - Overview.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROCEDURES Equipment, Fuel, and Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...

  9. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  10. Interleaved Training and Training-Based Transmission Design for Hybrid Massive Antenna Downlink

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Jing, Yindi; Huang, Yongming; Yang, Luxi

    2018-06-01

    In this paper, we study the beam-based training design jointly with the transmission design for hybrid massive antenna single-user (SU) and multiple-user (MU) systems where outage probability is adopted as the performance measure. For SU systems, we propose an interleaved training design to concatenate the feedback and training procedures, thus making the training length adaptive to the channel realization. Exact analytical expressions are derived for the average training length and the outage probability of the proposed interleaved training. For MU systems, we propose a joint design for the beam-based interleaved training, beam assignment, and MU data transmissions. Two solutions for the beam assignment are provided with different complexity-performance tradeoffs. Analytical results and simulations show that for both SU and MU systems, the proposed joint training and transmission designs achieve the same outage performance as the traditional full-training scheme but with significant savings in training overhead.

  11. Prediction of turning stability using receptance coupling

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Marcin; Powałka, Bartosz

    2018-01-01

    This paper addresses machining stability prediction for the dynamic "lathe - workpiece" system evaluated using the receptance coupling method. Dynamic properties of the lathe components (the spindle and the tailstock) are assumed to be constant and can be determined experimentally based on the results of the impact test. Hence, the variable element of the "machine tool - holder - workpiece" system is the machined part, which can be easily modelled analytically. The method of receptance coupling enables a synthesis of the experimental (spindle, tailstock) and analytical (machined part) models, so impact testing of the entire system becomes unnecessary. The paper presents the methodology of analytical and experimental model synthesis, the evaluation of the stability lobes, and an experimental validation procedure involving both the determination of the dynamic properties of the system and cutting tests. Finally, the experimental verification results are presented and discussed.
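
    A hedged sketch of the coupling step: the measured receptance of the machine side and the analytically modelled receptance of the workpiece are combined frequency by frequency. For two substructures rigidly joined at a single point, one common scalar form is G = Ga*Gb/(Ga + Gb); the single-mode receptances below are invented placeholders, not the lathe or workpiece models from the paper.

        import numpy as np

        def sdof_receptance(w, wn, zeta, k):
            """Receptance of a single-degree-of-freedom system."""
            return 1.0 / (k * (1 - (w / wn) ** 2 + 2j * zeta * w / wn))

        def couple_rigid(Ga, Gb):
            """Rigidly couple two point receptances (arrays over frequency)."""
            return Ga * Gb / (Ga + Gb)

        omega = np.linspace(10.0, 2000.0, 500) * 2 * np.pi
        Ga = sdof_receptance(omega, wn=2 * np.pi * 600, zeta=0.03, k=2e7)   # machine side (measured)
        Gb = sdof_receptance(omega, wn=2 * np.pi * 900, zeta=0.02, k=5e7)   # workpiece (analytical)
        G = couple_rigid(Ga, Gb)
        print(abs(G).max())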

  12. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
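
    The basic POI statistic is the proportion of replicates identified, usually reported with a binomial confidence interval. A small sketch using the Wilson score interval and invented counts:

        from math import sqrt

        def poi_with_wilson_ci(identified, n, z=1.96):
            """Probability of identification with an approximate 95% Wilson interval."""
            p = identified / n
            denom = 1 + z ** 2 / n
            centre = (p + z ** 2 / (2 * n)) / denom
            half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
            return p, centre - half, centre + half

        print(poi_with_wilson_ci(identified=28, n=30))   # e.g. target material at one test concentration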

  13. Dynamics and Control of Flexible Space Vehicles

    NASA Technical Reports Server (NTRS)

    Likins, P. W.

    1970-01-01

    The purpose of this report is twofold: (1) to survey the established analytic procedures for the simulation of controlled flexible space vehicles, and (2) to develop in detail methods that employ a combination of discrete and distributed ("modal") coordinates, i.e., the hybrid-coordinate methods. Analytic procedures are described in three categories: (1) discrete-coordinate methods, (2) hybrid-coordinate methods, and (3) vehicle normal-coordinate methods. Each of these approaches is described and analyzed for its advantages and disadvantages, and each is found to have an area of applicability. The hybrid-coordinate method combines the efficiency of the vehicle normal-coordinate method with the versatility of the discrete-coordinate method, and appears to have the widest range of practical application. The results in this report have practical utility in two areas: (1) complex digital computer simulation of flexible space vehicles of arbitrary configuration subject to realistic control laws, and (2) preliminary control system design based on transfer functions for linearized models of dynamics and control laws.

  14. Coupling of Multiple Coulomb Scattering with Energy Loss and Straggling in HZETRN

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Wilson, John W.; Walker, Steven A.; Tweed, John

    2007-01-01

    The new version of the HZETRN deterministic transport code based on Green's function methods, and the incorporation of ground-based laboratory boundary conditions, has led to the development of analytical and numerical procedures to include off-axis dispersion of primary ion beams due to small-angle multiple Coulomb scattering. In this paper we present the theoretical formulation and computational procedures to compute ion beam broadening and a methodology towards achieving a self-consistent approach to coupling multiple scattering interactions with ionization energy loss and straggling. Our initial benchmark case is a 60 MeV proton beam on muscle tissue, for which we can compare various attributes of beam broadening with Monte Carlo simulations reported in the open literature.

  15. Quality assessment of patients’ self-monitoring of blood glucose in community pharmacies

    PubMed Central

    Kjome, Reidun L. S.; Granas, Anne G.; Nerhus, Kari; Sandberg, Sverre

    2009-01-01

    Objective: To evaluate diabetes patients’ self-monitoring of blood glucose using a community pharmacy-based quality assurance procedure, to investigate whether the procedure improved the quality of the patient performance of self-monitoring of blood glucose, and to examine the opinions of the patients taking part in the study. Methods: The results of patient blood glucose measurements were compared to the results obtained with HemoCue Glucose 201+ by pharmacy employees in 16 Norwegian community pharmacies. Patient performance was monitored using an eight-item checklist. Patients whose blood glucose measurements differed from pharmacy measurements by more than 20% were instructed in the correct use of their glucometer. The patients then re-measured their blood glucose. If the results were still outside the set limits, the control procedure was repeated with a new lot of glucometer strips, and then with a new glucometer. The patients returned for a follow-up visit after three months. Results: During the first visit, 5% of the 338 patients had measurements that deviated from pharmacy blood glucose values by more than 20% and user errors were observed for 50% of the patients. At the second visit, there was no significant change in the analytical quality of patient measurements, but the percentage of patients who made user errors had decreased to 29% (p < 0.001). Eighty-five percent of the patients reported that they used their blood glucose results to adjust medication, exercise or meals. Fifty-one percent of the patients reported a greater trust in their measurements after the second visit. Eighty percent of patients wished to have their measurements assessed yearly. Of these patients, 83% preferred to have the assessment done at the community pharmacy. Conclusion: A community pharmacy-based quality assessment procedure of patients’ self-monitoring of blood glucose significantly reduced the number of user errors. The analytical quality of the patients’ measurements was good and did not improve further during the study. The high analytical quality might be explained by a selection bias of participating patients. Patients also reported increased confidence in their blood glucose measurements after their measurements had been assessed at the pharmacy. PMID:25152795
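
    The acceptance rule described above is a simple relative-difference check; a one-function sketch with invented readings:

        def within_20_percent(patient_value, pharmacy_value):
            """True if the patient meter result is within 20% of the pharmacy comparison value."""
            return abs(patient_value - pharmacy_value) / pharmacy_value <= 0.20

        print(within_20_percent(6.9, 6.1), within_20_percent(8.4, 6.1))   # True, False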

  16. Analytical method for the effects of the asteroid belt on planetary orbits

    NASA Technical Reports Server (NTRS)

    Mayo, A. P.

    1979-01-01

    Analytic expressions are derived for the perturbation of planetary orbits due to a thick constant-density asteroid belt. The derivations include extensions and adaptations of Plakhov's (1968) analytic expressions for the perturbations in five of the orbital elements for closed orbits around Saturn's rings. The equations of Plakhov are modified to include the effect of ring thickness, and additional equations are derived for the perturbations in the sixth orbital element, the mean anomaly. The gravitational potential and orbital perturbations are derived for the asteroid belt with and without thickness, and for a hoop approximation to the belt. The procedures are also applicable to Saturn's rings and the newly discovered rings of Uranus. The effects of the asteroid belt thickness on the gravitational potential coefficients and the orbital motions are demonstrated. Comparisons between the Mars orbital perturbations obtained by using the analytic expressions and those obtained by numerical integration are discussed. The effects of the asteroid belt on earth-based ranging to Mars are also demonstrated.

  17. A sampling procedure to guide the collection of narrow-band, high-resolution spatially and spectrally representative reflectance data. [satellite imagery of earth resources

    NASA Technical Reports Server (NTRS)

    Brand, R. R.; Barker, J. L.

    1983-01-01

    A multistage sampling procedure using image processing, geographical information systems, and analytical photogrammetry is presented which can be used to guide the collection of representative, high-resolution spectra and discrete reflectance targets for future satellite sensors. The procedure is general and can be adapted to characterize areas as small as minor watersheds and as large as multistate regions. Beginning with a user-determined study area, successive reductions in size and spectral variation are performed using image analysis techniques on data from the Multispectral Scanner, orbital and simulated Thematic Mapper, low altitude photography synchronized with the simulator, and associated digital data. An integrated image-based geographical information system supports processing requirements.

  18. A design procedure for a tension-wire stiffened truss-column

    NASA Technical Reports Server (NTRS)

    Greene, W. H.

    1980-01-01

    A deployable, tension-wire-stiffened truss-column configuration was considered for space structure applications. An analytical procedure, developed for design of the truss column and exercised in numerical studies, was based on equivalent beam stiffness coefficients in the classical analysis of an initially imperfect beam column. Failure constraints were formulated for use in a combined weight/strength and nonlinear mathematical programming automated design procedure to determine the minimum-mass column for a particular combination of design load and length. Numerical studies gave the mass characteristics of the truss column for broad ranges of load and length. Comparisons of the truss column with a baseline tubular column used a structural efficiency parameter specific to this class of columns.
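
    The classical initially-imperfect beam-column relation referred to here amplifies an initial midspan bow δ0 to δ0/(1 − P/P_E), with P_E = π²EI/L², and the corresponding midspan moment is P·δ. The sketch below evaluates that relation with placeholder section properties; it is not the truss-column model or the failure constraints of the report.

      # Classical pin-ended beam-column with an initial sinusoidal imperfection.
      # Section properties below are placeholder assumptions.
      import math

      E = 70e9            # Pa, aluminium-like modulus (assumed)
      I = 2.0e-5          # m^4, equivalent bending stiffness of the column
      L = 20.0            # m
      delta0 = 0.02       # m, initial midspan imperfection
      A = 4.0e-3          # m^2, equivalent cross-sectional area
      c = 0.25            # m, distance from neutral axis to extreme member

      P_E = math.pi**2 * E * I / L**2             # Euler buckling load
      for P in (0.2 * P_E, 0.5 * P_E, 0.8 * P_E):
          delta = delta0 / (1.0 - P / P_E)        # amplified imperfection
          sigma = P / A + P * delta * c / I       # axial + bending stress
          print(f"P/P_E = {P/P_E:.1f}:  delta = {delta*1000:.1f} mm, "
                f"sigma = {sigma/1e6:.1f} MPa")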

  19. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...

  20. Damage states in laminated composite three-point bend specimens: An experimental-analytical correlation study

    NASA Technical Reports Server (NTRS)

    Starbuck, J. Michael; Guerdal, Zafer; Pindera, Marek-Jerzy; Poe, Clarence C.

    1990-01-01

    Damage states in laminated composites were studied by considering the model problem of a laminated beam subjected to three-point bending. A combination of experimental and theoretical research techniques was used to correlate the experimental results with the analytical stress distributions. The analytical solution procedure was based on the stress formulation approach of the mathematical theory of elasticity. The solution procedure is capable of calculating the ply-level stresses and beam displacements for any laminated beam of finite length using the generalized plane deformation or plane stress state assumption. Prior to conducting the experimental phase, the results from preliminary analyses were examined. Significant effects in the ply-level stress distributions were seen depending on the fiber orientation, aspect ratio, and whether a grouped or interspersed stacking sequence was used. The experimental investigation was conducted to determine the different damage modes in laminated three-point bend specimens. The test matrix consisted of three-point bend specimens of 0 deg unidirectional, cross-ply, and quasi-isotropic stacking sequences. The damage initiation loads and ultimate failure loads were studied, and their relation to the damage susceptibility and damage tolerance of each beam configuration was discussed. Damage modes were identified by visual inspection of the damaged specimens using an optical microscope. The four fundamental damage mechanisms identified were delaminations, matrix cracking, fiber breakage, and crushing. The correlation study between the experimental and analytical results was performed for the midspan deflection, indentation, damage modes, and damage susceptibility.

  1. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  2. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
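
    The contrast drawn here between classical and inverse calibration can be made concrete with a small simulated example. The sketch below uses invented pure-component spectra and concentrations and plain least squares in place of PLS or PCR; it only illustrates the two prediction objectives, not the paper's error analysis.

      # Classical vs. inverse least-squares calibration on synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)

      # Simulated two-component system: rows = calibration samples, cols = channels
      K = np.array([[1.0, 0.8, 0.3, 0.1],      # pure spectrum, analyte
                    [0.2, 0.5, 0.9, 0.6]])     # pure spectrum, interferent
      C = rng.uniform(0.1, 1.0, size=(20, 2))  # true concentrations
      R = C @ K + 0.01 * rng.standard_normal((20, 4))  # responses + noise

      # Classical least squares: model R = C K, predict by projecting onto K
      K_hat = np.linalg.lstsq(C, R, rcond=None)[0]
      C_cls = R @ np.linalg.pinv(K_hat)

      # Inverse least squares: regress analyte concentration directly on R
      b = np.linalg.lstsq(R, C[:, 0], rcond=None)[0]
      c_ils = R @ b

      print("CLS RMSE:", np.sqrt(np.mean((C_cls[:, 0] - C[:, 0]) ** 2)))
      print("ILS RMSE:", np.sqrt(np.mean((c_ils - C[:, 0]) ** 2)))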

  3. F-14 modeling study

    NASA Technical Reports Server (NTRS)

    Levison, William H.

    1988-01-01

    This study explored the application of a closed-loop pilot/simulator model to the analysis of some simulator fidelity issues. The model was applied to two databases: (1) a NASA ground-based simulation of an air-to-air tracking task in which nonvisual cueing devices were explored, and (2) a ground-based and in-flight study performed by the Calspan Corporation to explore the effects of simulator delay on attitude tracking performance. The model predicted the major performance trends obtained in both studies. A combined analytical and experimental procedure for exploring simulator fidelity issues is outlined.

  4. Creep and creep rupture of laminated graphite/epoxy composites. Ph.D. Thesis. Final Report, 1 Oct. 1979 - 30 Sep. 1980

    NASA Technical Reports Server (NTRS)

    Dillard, D. A.; Morris, D. H.; Brinson, H. F.

    1981-01-01

    An incremental numerical procedure based on lamination theory is developed to predict creep and creep rupture of general laminates. Existing unidirectional creep compliance and delayed failure data are used to develop analytical models for lamina response. The compliance model is based on a procedure proposed by Findley, which incorporates the power law for creep into a nonlinear constitutive relationship. The matrix octahedral shear stress is assumed to control the stress interaction effect. A modified superposition principle is used to account for the effect of varying stress level on the creep strain. The lamina failure model is based on a modification of the Tsai-Hill theory which includes the time-dependent creep rupture strength. A linear cumulative damage law is used to monitor the remaining lifetime in each ply.
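
    The Findley-type compliance mentioned above reduces, for a single lamina under constant stress, to a power law in time. The sketch below evaluates that form with made-up constants; the paper's nonlinear stress dependence and cumulative damage law are omitted.

      # Power-law (Findley-type) creep strain, epsilon(t) = eps0 + m * t**n,
      # with illustrative constants for a lamina under constant stress.
      import numpy as np

      eps0, m, n = 0.004, 2.0e-4, 0.25   # instantaneous strain, coefficient, exponent

      def creep_strain(t_hours):
          return eps0 + m * np.power(t_hours, n)

      for t in (1, 10, 100, 1000):
          print(f"t = {t:5d} h  ->  strain = {creep_strain(t):.5f}")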

  5. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers in Eastern Canada such as Old Quebec City are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
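
    Damage-state fragility functions of the kind described here are commonly written as lognormal cumulative distributions of the demand parameter. The sketch below illustrates that functional form with purely hypothetical medians and dispersions; it does not reproduce the capacity model or demand analysis developed in the thesis.

      # Lognormal fragility curves with hypothetical parameters.
      import numpy as np
      from scipy.stats import norm

      def fragility(im, median_im, beta):
          """Probability of reaching or exceeding a damage state given an
          intensity measure value `im` (e.g. spectral displacement)."""
          return norm.cdf(np.log(im / median_im) / beta)

      # Hypothetical damage-state medians (cm of spectral displacement) and
      # dispersions for a stone masonry wall; values are illustrative only.
      damage_states = {"slight": (0.5, 0.5), "moderate": (1.2, 0.55),
                       "extensive": (2.5, 0.6), "complete": (4.0, 0.65)}

      sd = 1.5  # demand, cm
      for ds, (median, beta) in damage_states.items():
          print(f"P(>= {ds:9s} | Sd={sd} cm) = {fragility(sd, median, beta):.2f}")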

  6. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  7. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

    Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectrometry.

  8. Fluorescent aptasensor for detection of four tetracycline veterinary drugs in milk based on catalytic hairpin assembly reaction and displacement of G-quadruplex.

    PubMed

    Zhou, Chen; Zou, Haimin; Sun, Chengjun; Ren, Dongxia; Xiong, Wei; Li, Yongxin

    2018-05-01

    Based on a novel signal amplification strategy by catalytic hairpin assembly and displacement of G-quadruplex DNA, an enzyme-free, label-free fluorescent aptasensing approach was established for sensitive detection of four tetracycline veterinary drugs in milk. The network consisted of a pair of partially complementary DNA hairpins (HP1 and HP2). The DNA aptamer of the four tetracycline veterinary drugs was located at the sticky end of HP1. The ring region of HP1, rich in G and C, could form a stable G-quadruplex structure, which could emit a specific fluorescence signal after binding the fluorescent dye N-methylmesoporphyrin IX (NMM). When present in the system, the target analytes were repeatedly used to trigger a recycling procedure between the hairpins, generating numerous HP1-HP2 duplex complexes and displacing G-quadruplex DNA. Thus, the sensitive detection of target analytes was achieved over a wide linear range (0-1000 μg/L) with a detection limit of 4.6 μg/L. Moreover, this proposed method showed high discrimination efficiency towards target analytes against other common mismatched veterinary drugs, and could be successfully applied to the analysis of milk samples. Graphical abstract Schematic of target analyte detection based on catalytic hairpin assembly reaction and displacement of G-quadruplex.
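
    A calibration-and-detection-limit calculation of the general kind reported here can be sketched as follows. The concentrations and signals are invented, and the 3.3·s/slope convention is only one common way of estimating a detection limit, not necessarily the one used by the authors.

      # Linear calibration fit and a conventional detection-limit estimate.
      import numpy as np

      # Illustrative fluorescence calibration (concentration in ug/L vs. signal);
      # the numbers are made up, not taken from the paper.
      conc = np.array([0, 50, 100, 250, 500, 750, 1000], dtype=float)
      signal = np.array([102, 155, 210, 370, 640, 905, 1170], dtype=float)

      slope, intercept = np.polyfit(conc, signal, 1)
      residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

      lod = 3.3 * residual_sd / slope        # one common LOD convention
      print(f"slope={slope:.3f}, intercept={intercept:.1f}, LOD~{lod:.1f} ug/L")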

  9. Highly sensitive detection of cancer cells with an electrochemical cytosensor based on boronic acid functional polythiophene.

    PubMed

    Dervisevic, Muamer; Senel, Mehmet; Sagir, Tugba; Isik, Sevim

    2017-04-15

    The detection of cancer cells through important molecular recognition targets such as sialic acid is significant for clinical diagnosis and treatment. Many electrochemical cytosensors have been developed for cancer cell detection, but most have complicated fabrication processes, which results in poor reproducibility and reliability. In this study, a simple, low-cost, and highly sensitive electrochemical cytosensor was designed based on boronic acid-functionalized polythiophene. In the cytosensor fabrication, a simple single-step procedure was used in which a pencil graphite electrode (PGE) was coated by electro-polymerization of 3-thienylboronic acid and thiophene. Electrochemical impedance spectroscopy and cyclic voltammetry were used as analytical methods to optimize and measure the analytical performance of the PGE/P(TBA0.5Th0.5)-based electrode. The cytosensor showed very good analytical performance in the detection of cancer cells over a linear range of 1×10¹ to 1×10⁶ cells mL⁻¹, exhibiting a low detection limit of 10 cells mL⁻¹ and an incubation time of 10 min. In addition to this analytical performance, it showed high selectivity towards AGS cancer cells when compared with HEK 293 normal cells and bone marrow mesenchymal stem cells (BM-hMSCs). This method is promising for future applications in early-stage cancer diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Design and evaluation of Continuous Descent Approach as a fuel-saving procedure

    NASA Astrophysics Data System (ADS)

    Jin, Li

    Continuous Descent Approach (CDA), which is among the key concepts of the Next Generation Air Transportation System (NextGen), is a fuel-economical procedure, but it requires increased separation to accommodate spacing uncertainties among arriving aircraft. This negative impact is often overlooked when benefits are estimated. Although considerable research has been devoted to estimating the potential fuel savings of CDA, few studies have attempted to explain the fuel savings observed in field tests from an analytical point of view. This research gives insights into the reasons why CDA saves fuel, and a number of design guidelines for CDA procedures are derived. The analytical relationship between speed, altitude, and time-cumulative fuel consumption is derived from the Base of Aircraft Data (BADA) Total Energy Model. Theoretical analysis implies that the speed profile has an impact on fuel consumption in the terminal area as substantial as, if not more substantial than, that of the vertical profile. In addition, CDA is not intrinsically a fuel-saving procedure: whether CDA saves fuel is contingent upon whether the speed profile is properly designed. Based on this model, the potential fuel savings due to CDA at San Francisco International Airport were estimated, and the accuracy of this estimation was analyzed. Possible uncertainties in this fuel estimation primarily resulted from the modeled CDA procedure and the inaccuracy of BADA. This thesis also investigates the fuel savings due to CDAs under high-traffic conditions, counting not only the savings from optimal vertical profiles but also the extra fuel burn resulting from the increased separations. The simulated CDA traffic is based on radar track data and is deconflicted by a scheduling algorithm that minimizes delays. The delays are absorbed by speed changes and path stretching, accounting for the air traffic control actions entailed by CDAs. Fuel-burn statistics calculated with the BADA Total Energy Model reveal that the CDAs save on average 171.87 kg per arrival, but this figure is reduced by delay absorption. The savings diminish as arrival demand increases and could even become negative under large delays. The throughput analysis demonstrated that the impact of CDA on airport capacity is insignificant and tolerable. Atlanta International Airport was used as the testbed for the sensitivity analysis, and the New York Metroplex as the testbed for the throughput analysis.
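
    The total-energy-model reasoning summarized here can be illustrated with a toy descent segment. The sketch below is not the BADA implementation: the aircraft mass, drag polar, thrust-specific fuel consumption, and trajectory points are all placeholder assumptions.

      # Total-energy-model fuel estimate along a short descent; all constants
      # below are placeholder assumptions, not BADA coefficients.
      import numpy as np

      G0 = 9.80665          # m/s^2
      MASS = 60000.0        # kg, assumed constant over the segment
      RHO = 1.0             # kg/m^3, rough mid-descent air density
      S, CD0, K_IND = 122.0, 0.025, 0.045   # wing area and a simple drag polar
      TSFC = 1.8e-5         # kg fuel per N of thrust per second (placeholder)

      def fuel_burn(time_s, alt_m, tas_ms):
          """Integrate fuel over a trajectory sampled at times `time_s`."""
          fuel = 0.0
          for i in range(len(time_s) - 1):
              dt = time_s[i + 1] - time_s[i]
              v = tas_ms[i]
              hdot = (alt_m[i + 1] - alt_m[i]) / dt
              vdot = (tas_ms[i + 1] - tas_ms[i]) / dt
              cl = 2 * MASS * G0 / (RHO * S * v**2)              # lift = weight
              drag = 0.5 * RHO * S * v**2 * (CD0 + K_IND * cl**2)
              # Total energy model: (T - D) * v = m*g*hdot + m*v*vdot
              thrust = max(drag + MASS * (G0 * hdot + v * vdot) / v, 0.0)
              fuel += TSFC * thrust * dt
          return fuel

      # Toy 3-point descent: ~10,000 ft to ~6,000 ft at decreasing speed
      t = np.array([0.0, 120.0, 240.0])
      h = np.array([3048.0, 2440.0, 1830.0])
      v = np.array([128.0, 118.0, 108.0])
      print(f"estimated fuel burned: {fuel_burn(t, h, v):.1f} kg")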

  11. Comparison of analytical and predictive methods for water, protein, fat, sugar, and gross energy in marine mammal milk.

    PubMed

    Oftedal, O T; Eisert, R; Barrell, G K

    2014-01-01

    Mammalian milks may differ greatly in composition from cow milk, and these differences may affect the performance of analytical methods. High-fat, high-protein milks with a preponderance of oligosaccharides, such as those produced by many marine mammals, present a particular challenge. We compared the performance of several methods against reference procedures using Weddell seal (Leptonychotes weddellii) milk of highly varied composition (by reference methods: 27-63% water, 24-62% fat, 8-12% crude protein, 0.5-1.8% sugar). A microdrying step preparatory to carbon-hydrogen-nitrogen (CHN) gas analysis slightly underestimated water content and had a higher repeatability relative standard deviation (RSDr) than did reference oven drying at 100°C. Compared with a reference macro-Kjeldahl protein procedure, the CHN (or Dumas) combustion method had a somewhat higher RSDr (1.56 vs. 0.60%), but the correlation between methods was high (0.992), means were not different (CHN: 17.2±0.46% dry matter basis; Kjeldahl: 17.3±0.49% dry matter basis), there were no significant proportional or constant errors, and predictive performance was high. A carbon stoichiometric procedure based on CHN analysis failed to adequately predict fat (reference: Röse-Gottlieb method) or total sugar (reference: phenol-sulfuric acid method). Gross energy content, calculated from energetic factors and results from reference methods for fat, protein, and total sugar, accurately predicted gross energy as measured by bomb calorimetry. We conclude that the CHN (Dumas) combustion method and calculation of gross energy are acceptable analytical approaches for marine mammal milk, but fat and sugar require separate analysis by appropriate analytical methods and cannot be adequately estimated by carbon stoichiometry. Some other alternative methods (low-temperature drying for water determination; the Bradford, Lowry, and biuret methods for protein; the Folch and the Bligh and Dyer methods for fat; and enzymatic and reducing sugar methods for total sugar) appear likely to produce substantial error in marine mammal milks. It is important that alternative analytical methods be properly validated against a reference method before being used, especially for mammalian milks that differ greatly from cow milk in analyte characteristics and concentrations. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
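
    The gross-energy calculation mentioned here combines proximate composition with energetic factors. The sketch below shows that arithmetic with commonly cited approximate factors for milk fat, protein, and sugar; the factor values are assumptions, not the ones used in the paper.

      # Gross energy from proximate composition; factors are approximations
      # (kJ per g of fat, crude protein, and sugar) and should be replaced
      # with the authors' values for real use.
      FACTORS_KJ_PER_G = {"fat": 38.1, "protein": 23.9, "sugar": 16.5}

      def gross_energy_kj_per_g(fat_pct, protein_pct, sugar_pct):
          """Gross energy (kJ per g of whole milk) from percentages by mass."""
          grams = {"fat": fat_pct / 100, "protein": protein_pct / 100,
                   "sugar": sugar_pct / 100}
          return sum(FACTORS_KJ_PER_G[k] * grams[k] for k in grams)

      # Example: a hypothetical milk with 45% fat, 10% protein, 1% sugar
      print(f"{gross_energy_kj_per_g(45, 10, 1):.1f} kJ/g")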

  12. A Numerical-Analytical Approach Based on Canonical Transformations for Computing Optimal Low-Thrust Transfers

    NASA Astrophysics Data System (ADS)

    da Silva Fernandes, S.; das Chagas Carvalho, F.; Bateli Romão, J. V.

    2018-04-01

    A numerical-analytical procedure based on infinitesimal canonical transformations is developed for computing optimal time-fixed low-thrust limited power transfers (no rendezvous) between coplanar orbits with small eccentricities in an inverse-square force field. The optimization problem is formulated as a Mayer problem with a set of non-singular orbital elements as state variables. Second order terms in eccentricity are considered in the development of the maximum Hamiltonian describing the optimal trajectories. The two-point boundary value problem of going from an initial orbit to a final orbit is solved by means of a two-stage Newton-Raphson algorithm which uses an infinitesimal canonical transformation. Numerical results are presented for some transfers between circular orbits with moderate radius ratio, including a preliminary analysis of Earth-Mars and Earth-Venus missions.

  13. Multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Craig, R. R., Jr.

    1983-01-01

    Procedures for improving the modal modeling of structures using test data and for determining appropriate analytical models based on substructure experimental data were explored. Two related research topics were considered in modal modeling: the use of several independently acquired columns of frequency response data, and modal modeling using simultaneous multi-point excitation. In component mode synthesis modeling, the emphasis is on determining the best way to employ complex modes and residuals.

  14. Validity of plant fiber length measurement : a review of fiber length measurement based on kenaf as a model

    Treesearch

    James S. Han; Theodore Mianowski; Yi-yu Lin

    1999-01-01

    The efficacy of fiber length measurement techniques such as digitizing, the Kajaani procedure, and NIH Image is compared in order to determine the optimal tool. Kenaf bast fibers and aspen and red pine fibers were collected from different anatomical parts, and the fiber lengths were compared using various analytical tools. A statistical analysis on the validity of the...

  15. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.

  16. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. A low cost and high throughput magnetic bead-based immuno-agglutination assay in confined droplets.

    PubMed

    Teste, Bruno; Ali-Cherif, Anaïs; Viovy, Jean Louis; Malaquin, Laurent

    2013-06-21

    Although passive immuno-agglutination assays consist of simple, one-step procedures, they are usually not suited to high-throughput analyses and require expensive and bulky equipment for the quantitation steps. Here we demonstrate a low-cost, multimodal, and high-throughput immuno-agglutination assay that relies on a combination of magnetic beads (MBs), droplet microfluidics, and magnetic tweezers. Antibody-coated MBs were used as a capture support in the homogeneous phase. Following the immune interaction, water-in-oil droplets containing MBs and analytes were generated and transported in Teflon tubing. When passing between the magnetic tweezers, the MBs contained in the droplets were magnetically confined in order to enhance the agglutination rate and kinetics. When the magnetic field is released, internal recirculation flows in the droplet induce shear forces that favor MB redispersion. In the presence of the analyte, the system preserves specific interactions and the MBs stay in the aggregated state, whereas with a non-specific analyte the particles redisperse. The analyte quantitation procedure relies on the MB redispersion rate within the droplet. The influence of parameters such as magnetic field intensity, flow rate, and MB concentration on the agglutination performance was investigated and optimized. Although the immuno-agglutination assay described in this work may not compete with enzyme-linked immunosorbent assays (ELISA) in terms of sensitivity, it offers major advantages in reagent consumption (analysis is performed in sub-microliter droplets) and platform cost, yielding very inexpensive analyses. Moreover, the fully automated analysis procedure provides reproducible analyses with throughput well above that of existing technologies. We demonstrated the detection of biotinylated alkaline phosphatase in 100 nL sample volumes with an analysis rate of 300 assays per hour and a limit of detection of 100 pM.

  18. Evaluation of management measures of software development. Volume 1: Analysis summary

    NASA Technical Reports Server (NTRS)

    Page, J.; Card, D.; Mcgarry, F.

    1982-01-01

    The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.

  19. Clean Water Act Analytical Methods

    EPA Pesticide Factsheets

    EPA publishes laboratory analytical methods (test procedures) that are used by industries and municipalities to analyze the chemical, physical and biological components of wastewater and other environmental samples required by the Clean Water Act.

  20. Laboratory Workhorse: The Analytical Balance.

    ERIC Educational Resources Information Center

    Clark, Douglas W.

    1979-01-01

    This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)

  1. FDA Bacteriological Analytical Manual, Chapter 10, 2003: Listeria monocytogenes

    EPA Pesticide Factsheets

    FDA Bacteriological Analytical Manual, Chapter 10 describes procedures for analysis of food samples and may be adapted for assessment of solid, particulate, aerosol, liquid and water samples containing Listeria monocytogenes.

  2. Portable microwave assisted extraction: An original concept for green analytical chemistry.

    PubMed

    Perino, Sandrine; Petitcolas, Emmanuel; de la Guardia, Miguel; Chemat, Farid

    2013-11-08

    This paper describes a portable microwave-assisted extraction (PMAE) apparatus for the extraction of bioactive compounds, especially essential oils and aromas, directly in a crop or a forest. The developed procedure, based on the concept of green analytical chemistry, is appropriate for obtaining direct in-field information about the level of essential oils in natural samples and for illustrating green chemistry teaching and research. The efficiency of this approach was validated for the extraction of rosemary essential oil directly in a crop; it provided quantitative information on the essential oil content that was similar to that obtained by conventional methods in the laboratory. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Hybrid-dual-Fourier tomographic algorithm for fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing the computational burden of 3D image processing, in which the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual-Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to provide high-speed inverse computations, allowing fast Fourier inversion of data from multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of the radiative transfer equation; the accurate analytical form of this solution provides an efficient formalism for fast computation of the forward model.

  4. Evaluating bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP])-based polymer ligand film (PLF) for plutonium and uranium extraction

    DOE PAGES

    Rim, Jung H.; Armenta, Claudine E.; Gonzales, Edward R.; ...

    2015-09-12

    This paper describes a new analyte extraction medium called polymer ligand film (PLF) that was developed to rapidly extract radionuclides. PLF is a polymer medium with ligands incorporated in its matrix that selectively and quickly extracts analytes. The main focus of the new technique is to shorten and simplify the procedure for chemically isolating radionuclides for determination through alpha spectroscopy. The PLF system was effective for plutonium and uranium extraction. The PLF was capable of co-extracting or selectively extracting plutonium over uranium depending on the PLF composition. As a result, the PLF and electrodeposited samples had similar alpha spectra resolutions.

  5. Wind flow characteristics in the wakes of large wind turbines. Volume 1: Analytical model development

    NASA Technical Reports Server (NTRS)

    Eberle, W. R.

    1981-01-01

    A computer program to calculate the wake downwind of a wind turbine was developed. Turbine wake characteristics are useful for determining optimum arrays for wind turbine farms. The analytical model is based on the characteristics of a turbulent coflowing jet with modification for the effects of atmospheric turbulence. The program calculates overall wake characteristics, wind profiles, and power recovery for a wind turbine directly in the wake of another turbine, as functions of distance downwind of the turbine. The calculation procedure is described in detail, and sample results are presented to illustrate the general behavior of the wake and the effects of principal input parameters.

  6. SSME single-crystal turbine blade dynamics

    NASA Technical Reports Server (NTRS)

    Moss, Larry A.

    1988-01-01

    A study was performed to determine the dynamic characteristics of the Space Shuttle Main Engine high-pressure fuel turbopump (HPFTP) blades made of single-crystal (SC) material. The first- and second-stage drive turbine blades of the HPFTP were examined. The nonrotating natural frequencies were determined experimentally and analytically. The experimental results for the SC second-stage blade were used to verify the analytical procedures. The study examined the SC first-stage blade natural frequencies with respect to crystal orientation at typical operating conditions. The SC blade dynamic response was predicted to be less than that of the directionally solidified baseline. Crystal axis orientation optimization indicated that the third-mode interference will exist in any SC orientation.

  7. Viscoelastic behavior and lifetime (durability) predictions. [for laminated fiber reinforced plastics

    NASA Technical Reports Server (NTRS)

    Brinson, R. F.

    1985-01-01

    A method for lifetime or durability predictions for laminated fiber-reinforced plastics is given. The procedure is similar to, but not the same as, the well-known time-temperature superposition principle for polymers; it is better described as an analytical adaptation of time-stress superposition methods. The analytical constitutive modeling is based upon a nonlinear viscoelastic constitutive model developed by Schapery. Time-dependent failure models are discussed and related to the constitutive models. Finally, results of an incremental lamination analysis using the constitutive and failure models are compared with experimental results; favorable agreement between theory and experiment is obtained using data from creep tests of about two months' duration.

  8. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Gradient retention prediction of acid-base analytes in reversed phase liquid chromatography: a simplified approach for acetonitrile-water mobile phases.

    PubMed

    Andrés, Axel; Rosés, Martí; Bosch, Elisabeth

    2014-11-28

    In previous work, a two-parameter model to predict the chromatographic retention of ionizable analytes in gradient mode was proposed. However, the procedure required preliminary experimental work to obtain a suitable description of the pKa change with mobile phase composition. In the present study this preliminary experimental work has been simplified. The analyte pKa values are calculated through equations whose coefficients vary with the functional group. This new approach also required further simplifications regarding the retention of the fully neutral and fully ionized species. After the simplifications were applied, new predictions were obtained and compared with the previously acquired experimental data. The simplified model gave good predictions while saving a significant amount of time and resources. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. The analytical solution for drug delivery system with nonhomogeneous moving boundary condition

    NASA Astrophysics Data System (ADS)

    Saudi, Muhamad Hakimi; Mahali, Shalela Mohd; Harun, Fatimah Noor

    2017-08-01

    This paper discusses the development and analytical solution of a mathematical model of drug release from a swelling delivery device. The model is represented by a one-dimensional advection-diffusion equation with a nonhomogeneous moving boundary condition. The solution procedure consists of three major steps. First, a steady-state solution is used to transform the nonhomogeneous moving boundary condition into a homogeneous one. Second, the Landau transformation removes the advection term from the governing equation and converts the moving boundary condition into a fixed boundary condition. Third, the separation of variables method yields the analytical solution of the resulting initial boundary value problem. The results show that the swelling rate of the delivery device and the drug release rate are influenced by the value of the growth factor r.

  11. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

    This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool for solving these DDEs. The method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it relies on a simple procedure of a few straightforward steps and can be combined with analytical methods other than the DTM, such as the homotopy perturbation method.
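
    The method of steps on which this technique builds can be shown on a much simpler problem. The sketch below solves the constant-delay test equation y′(t) = −y(t − 1) with unit history, using a standard numerical integrator in place of the differential transform method and a constant delay in place of the paper's variable delays.

      # Method of steps for y'(t) = -y(t - 1), y(t) = 1 for t <= 0.
      import numpy as np
      from scipy.integrate import solve_ivp

      tau = 1.0

      def history(t):
          return 1.0                         # y(t) for t <= 0

      segments = []                          # dense interpolants, one per step

      def delayed_value(t):
          """Look up y(t - tau) from the history or a solved segment."""
          td = t - tau
          if td <= 0.0:
              return history(td)
          k = min(int(td // tau), len(segments) - 1)
          return segments[k](td)

      y_end = history(0.0)
      for step in range(4):                  # solve on [0,1], [1,2], [2,3], [3,4]
          t0, t1 = step * tau, (step + 1) * tau
          sol = solve_ivp(lambda t, y: [-delayed_value(t)], (t0, t1), [y_end],
                          dense_output=True, max_step=0.01)
          segments.append(lambda t, s=sol: float(s.sol(t)[0]))
          y_end = sol.y[0, -1]
          print(f"y({t1:.0f}) = {y_end:.5f}")   # exact: y(1)=0, y(2)=-0.5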

  12. Study of a vibrating plate: comparison between experimental (ESPI) and analytical results

    NASA Astrophysics Data System (ADS)

    Romero, G.; Alvarez, L.; Alanís, E.; Nallim, L.; Grossi, R.

    2003-07-01

    Real-time electronic speckle pattern interferometry (ESPI) was used for tuning and visualization of natural frequencies of a trapezoidal plate. The plate was excited to resonant vibration by a sinusoidal acoustical source, which provided a continuous range of audio frequencies. Fringe patterns produced during the time-average recording of the vibrating plate—corresponding to several resonant frequencies—were registered. From these interferograms, calculations of vibrational amplitudes by means of zero-order Bessel functions were performed in some particular cases. The system was also studied analytically. The analytical approach developed is based on the Rayleigh-Ritz method and on the use of non-orthogonal right triangular co-ordinates. The deflection of the plate is approximated by a set of beam characteristic orthogonal polynomials generated by using the Gram-Schmidt procedure. A high degree of correlation between computational analysis and experimental results was observed.
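
    The Gram-Schmidt generation of orthogonal polynomials used in the Rayleigh-Ritz analysis can be sketched as below. For simplicity the sketch orthonormalizes plain monomials over [0, 1] with the inner product ∫ p(x) q(x) dx; in the paper the first member is instead chosen to satisfy the plate's geometric boundary conditions.

      # Gram-Schmidt orthonormal polynomial basis over [0, 1].
      import numpy as np
      from numpy.polynomial import Polynomial as P

      def inner(p, q, a=0.0, b=1.0):
          r = (p * q).integ()
          return r(b) - r(a)

      def gram_schmidt(n_terms):
          basis = []
          for k in range(n_terms):
              p = P([0.0] * k + [1.0])               # monomial x**k
              for q in basis:                        # subtract projections
                  p = p - inner(p, q) * q
              basis.append(p / np.sqrt(inner(p, p))) # normalise
          return basis

      phi = gram_schmidt(4)
      # Orthonormality check: the Gram matrix should be ~ identity
      gram = np.array([[inner(pi, pj) for pj in phi] for pi in phi])
      print(np.round(gram, 6))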

  13. Variable-centered and person-centered approaches to studying Mexican-origin mother-daughter cultural orientation dissonance.

    PubMed

    Bámaca-Colbert, Mayra Y; Gayles, Jochebed G

    2010-11-01

    The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern, metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.

  14. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  15. Analytical and experimental investigation of a 1/8-scale dynamic model of the shuttle orbiter. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.

    1974-01-01

    The methods and procedures used in the analysis and testing of the scale model are reported together with the correlation of the analytical and experimental results. The model, the NASTRAN finite element analysis, and results are discussed. Tests and analytical investigations are also reported.

  16. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR POLAR TARGET ANALYTES (SOP-5.26)

    EPA Science Inventory

    This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...

  17. A Computational Procedure for Identifying Bilinear Representations of Nonlinear Systems Using Volterra Kernels

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.; Silva, Walter A.

    2008-01-01

    A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.

  18. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)

  19. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Ventor, Gerharad; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
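
    The effect exploited here, namely that passing a proof test truncates the low tail of the strength distribution, can be illustrated with a small Monte Carlo sketch. The distributions and load levels below are illustrative assumptions, not the structural model or optimization formulation of the paper.

      # Monte Carlo illustration of proof-test truncation of the strength
      # distribution; all distributions and numbers are assumptions.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 1_000_000

      strength = rng.lognormal(mean=np.log(100.0), sigma=0.10, size=N)   # capacity
      service_load = rng.lognormal(mean=np.log(70.0), sigma=0.15, size=N)
      proof_load = 90.0

      passed = strength > proof_load                 # only these enter service
      pf_no_proof = np.mean(strength < service_load)
      pf_with_proof = np.mean(strength[passed] < service_load[passed])

      print(f"fraction failing the proof test : {1 - passed.mean():.3%}")
      print(f"P(failure) without proof test   : {pf_no_proof:.2e}")
      print(f"P(failure) given proof test pass: {pf_with_proof:.2e}")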

  20. Analysis of D-penicillamine by gas chromatography utilizing nitrogen-phosphorus detection.

    PubMed

    Rushing, L G; Hansen, E B; Thompson, H C

    1985-01-11

    A method is presented for the analysis of the "orphan" drug D-penicillamine (D-Pa), which is used for the treatment of the rare inherited copper metabolism dysfunction known as Wilson's disease, by assaying a derivative of the compound by gas chromatography employing a rubidium-sensitized nitrogen-phosphorus detector. Analytical procedures are described for the analysis of residues of the D-Pa·HCl salt in animal feed and for the analysis of the salt or free base in aqueous solutions, utilizing a single-step double derivatization with diazomethane-acetone. Stability data for D-Pa·HCl in animal feed and for the free base in water are presented. An ancillary fluorescence derivatization procedure for the analysis of D-Pa in water is also reported.

  1. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
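
    The forecasting idea described here, scoring each scheduled patient with a logistic model and summing the probabilities into a daily bed forecast, can be sketched with synthetic data as below. The features mirror the predictors named in the abstract, but the coefficients and data are simulated; this is not the authors' fitted model.

      # Logistic admission model plus a daily expected-bed forecast, on
      # synthetic data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(7)
      n = 5000
      X = np.column_stack([
          rng.normal(65, 12, n),          # age
          rng.integers(0, 2, n),          # male sex
          rng.integers(0, 2, n),          # invasive (vs. diagnostic) procedure
          rng.integers(0, 2, n),          # history of congestive heart failure
      ])
      logit = -6.0 + 0.04 * X[:, 0] + 0.3 * X[:, 1] + 1.2 * X[:, 2] + 0.8 * X[:, 3]
      y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated admissions

      model = LogisticRegression(max_iter=1000).fit(X, y)

      # Forecast for one day's schedule of 30 patients
      day = np.column_stack([
          rng.normal(65, 12, 30), rng.integers(0, 2, 30),
          rng.integers(0, 2, 30), rng.integers(0, 2, 30),
      ])
      p = model.predict_proba(day)[:, 1]
      print(f"expected beds needed: {p.sum():.1f}")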

  2. New correction procedures for the fast field program which extend its range

    NASA Technical Reports Server (NTRS)

    West, M.; Sack, R. A.

    1990-01-01

    A fast field program (FFP) algorithm was developed, based on the method of Lee et al., for the prediction of sound pressure level from low-frequency, high-intensity sources. In order to permit accurate predictions at distances greater than 2 km, new correction procedures had to be included in the algorithm. Certain functions, whose Hankel transforms can be determined analytically, are subtracted from the depth-dependent Green's function. The distance response is then obtained as the sum of these transforms and the fast Fourier transform (FFT) of the residual k-dependent function. One procedure, which permits the elimination of most complex exponentials, has allowed significant changes in the structure of the FFP algorithm, resulting in a substantial reduction in computation time.

  3. Modified procedure to determine acid-insoluble lignin in wood and pulp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Effland, M.J.

    1977-10-01

    If wood is treated with strong acid, carbohydrates are hydrolyzed and solubilized. The insoluble residue is by definition lignin and can be measured gravimetrically. The standard method of analysis requires samples of 1 or 2 g of wood or pulp. In research at this laboratory these amounts of sample are often not available for analytical determinations. Thus we developed a modification of the standard procedure suitable for much smaller sample amounts. The modification is based on the procedure of Saeman. Wood samples require extraction prior to lignin analysis to remove acid-insoluble extractives that will be measured as lignin. Usually this involves only a standard extraction with ethanol-benzene. However, woods high in tannin must also be subjected to extraction with alcohol. Pulps seldom require extraction.

  4. Current projects in Pre-analytics: where to go?

    PubMed

    Sapino, Anna; Annaratone, Laura; Marchiò, Caterina

    2015-01-01

    The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardization: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression), and affecting research outcomes through irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into common European databases. At the European level, standardization of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint public-private project on the standardization of tissue handling in pre-analytical procedures has recently been funded in Italy, with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.

  5. Physical Property Analysis and Report for Sediments at 100-BC-5 Operable Unit, Boreholes C7505, C7506, C7507, and C7665

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindberg, Michael J.

    2010-09-28

    Between October 14, 2009 and February 22, 2010 sediment samples were received from 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100-BC-5 OU. The analyses for this project were performed at the 325 building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.

  6. Estimating Aquifer Properties Using Sinusoidal Pumping Tests

    NASA Astrophysics Data System (ADS)

    Rasmussen, T. C.; Haborak, K. G.; Young, M. H.

    2001-12-01

    We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
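
    A commonly quoted closed-form response for a fully penetrating well in a confined aquifer under sinusoidal pumping, written with the modified Bessel function K0 of complex argument, can serve as a sketch of the kind of analytical solution described; the exact formulation and the parameter values below are assumptions, not taken from the paper.

```python
# Sketch (assumed textbook form, not necessarily the authors' exact solution):
# for a fully penetrating well in a confined aquifer pumped at Q(t) = Q0*sin(w*t),
# the complex drawdown phasor at radius r is often written as
#     s(r) = Q0 / (4*pi*T) * K0( r * sqrt(1j * w * S / T) ),
# whose modulus gives the oscillation amplitude and whose argument the phase lag.
import numpy as np
from scipy.special import kv

def sinusoidal_drawdown(r, Q0, T, S, omega):
    phasor = Q0 / (4 * np.pi * T) * kv(0, r * np.sqrt(1j * omega * S / T))
    return abs(phasor), -np.angle(phasor)   # amplitude [m], phase lag [rad]

# Hypothetical aquifer: T = 5e-3 m^2/s, S = 1e-4, 1-hour period, Q0 = 1e-3 m^3/s.
amp, lag = sinusoidal_drawdown(r=30.0, Q0=1e-3, T=5e-3, S=1e-4,
                               omega=2 * np.pi / 3600.0)
print(f"amplitude = {amp:.4f} m, phase lag = {lag:.3f} rad")
```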

  7. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    PubMed

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. A very common procedure, the oral glucose tolerance test, was chosen to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. The appropriateness-of-test-result indicator (QI-1) showed the highest error rate. Although QI-5 for sample collection had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and should be conducted routinely, on a yearly basis, to identify errors, take corrective action, and facilitate the gradual introduction of such evaluations into routine practice.

  8. Comparison of procedures for correction of matrix interferences in the analysis of soils by ICP-OES with CCD detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, D.A.; Sun, F.; Littlejohn, D.

    1995-12-31

    ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct the effects, on a simultaneous multi-element basis. In single element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure, when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.

  9. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated and were caused by ion selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, the MA procedures generated a manageable number of alarms requiring follow-up, and these alarms proved valuable. Several additional features of the MA management software would further simplify the use of MA procedures for managing MA alarms.
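
    The optimization machinery (bias detection curves, simulation) is not reproduced here, but the basic MA mechanics can be sketched: run a moving average over patient results after truncation and raise an alarm when it leaves control limits. All limits and data below are hypothetical.

```python
# Minimal illustration (not the authors' optimization procedure): run a moving
# average of patient results with truncation limits and raise an alarm when the
# MA leaves hypothetical control limits.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
results = rng.normal(140, 4, 400)          # e.g. sodium results, mmol/L
results[250:] += 3.0                       # simulated assay bias after sample 250

TRUNC_LOW, TRUNC_HIGH = 120, 160           # exclude grossly abnormal patients
WINDOW = 20                                # number of results per MA value
CTRL_LOW, CTRL_HIGH = 138.5, 141.5         # hypothetical MA control limits

s = pd.Series(results)
s = s.where(s.between(TRUNC_LOW, TRUNC_HIGH))      # truncation: drop outliers
ma = s.dropna().rolling(WINDOW).mean()

alarms = ma[(ma < CTRL_LOW) | (ma > CTRL_HIGH)]
print(f"first MA alarm at result #{alarms.index[0]}: MA = {alarms.iloc[0]:.2f}")
```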

  10. A comparison of various modes of liquid-liquid based microextraction techniques: determination of picric acid.

    PubMed

    Burdel, Martin; Šandrejová, Jana; Balogh, Ioseph S; Vishnikin, Andriy; Andruch, Vasil

    2013-03-01

    Three modes of liquid-liquid based microextraction techniques--namely auxiliary solvent-assisted dispersive liquid-liquid microextraction, auxiliary solvent-assisted dispersive liquid-liquid microextraction with low-solvent consumption, and ultrasound-assisted emulsification microextraction--were compared. Picric acid was used as the model analyte. The determination is based on the reaction of picric acid with Astra Phloxine reagent to produce an ion associate easily extractable by various organic solvents, followed by spectrophotometric detection at 558 nm. Each of the compared procedures has both advantages and disadvantages. The main benefit of ultrasound-assisted emulsification microextraction is that no hazardous chlorinated extraction solvents and no dispersive solvent are necessary. Therefore, this procedure was selected for validation. Under optimized experimental conditions (pH 3, 7 × 10(-5) mol/L of Astra Phloxine, and 100 μL of toluene), the calibration plot was linear in the range of 0.02-0.14 mg/L and the LOD was 7 μg/L of picric acid. The developed procedure was applied to the analysis of spiked water samples. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
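
    A generic sketch of the calibration and detection-limit arithmetic behind figures such as the 0.02-0.14 mg/L linear range and 7 μg/L LOD is given below; the data are synthetic and the 3.3·s/slope convention is only one common choice, not necessarily the one used by the authors.

```python
# Generic calibration sketch (synthetic data, not the published validation):
# fit the linear calibration plot and estimate a detection limit as
# LOD ~ 3.3 * s(residuals) / slope.
import numpy as np

conc = np.array([0.02, 0.04, 0.06, 0.08, 0.10, 0.12, 0.14])   # mg/L picric acid
absorb = 4.5 * conc + 0.010 + np.random.default_rng(2).normal(0, 0.002, conc.size)

slope, intercept = np.polyfit(conc, absorb, 1)
residual_sd = np.std(absorb - (slope * conc + intercept), ddof=2)
lod = 3.3 * residual_sd / slope

print(f"slope = {slope:.3f} AU per mg/L, LOD ~ {lod*1000:.1f} ug/L")
```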

  11. 10 CFR 26.137 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...

  12. 10 CFR 26.137 - Quality assurance and quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...

  13. 10 CFR 26.137 - Quality assurance and quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...

  14. 10 CFR 26.137 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...

  15. 10 CFR 26.137 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...

  16. Identification of natural indigo in historical textiles by GC-MS.

    PubMed

    Degani, Laura; Riedo, Chiara; Chiantore, Oscar

    2015-02-01

    The possibility of successfully applying a common GC-MS procedure for identification in one step of all types of dyes from plants of unknown origin and from historical objects is particularly attractive due to the high separation efficiency of the capillary columns, the MS detection sensitivity and the reproducibility of results. In this work, GC-MS analysis, previously and successfully used for the characterization of anthraquinones, flavonoids and tannins from plant extracts and historical samples, has been tested on indigoid dyestuffs. An analytical procedure based on the silylating agent N,O-bis-(trimethylsilyl)trifluoroacetamide (BSTFA) with 1% trimethylchlorosilane (TMCS) was applied to pure molecules of indigotin and indirubin and to plant extracts of Indigofera tinctoria L. and Isatis tinctoria L. Preliminary tests have been done to establish the chromatographic conditions and the derivatization amounts most suitable for the simultaneous detection of indigoid molecules and of the other natural compounds, such as fatty acids, carboxylic acids and sugars, contained within the plant extracts. In order to assess the capacity and the sensitivity of the analytical procedure in typical archaeometric applications, wool samples dyed in the laboratory with indigo were analysed by mimicking the sample amounts typically available with historical objects. The electron ionization (EI) spectra of the main silylated derivatives of indigoid molecules obtained in this way constitute the necessary data set for the characterization of natural extracts and historical works of art. Subsequently, the procedure has been applied to historical samples for the detection of indigo and of other dyestuffs eventually contained in samples. Additional information, useful for restoration and preservation of works of art, could be also obtained on the nature of stains and smudges present on the sampled textile material. The GC-MS method turns out to be an efficient and fast analytical tool also for the identification of natural indigo in plants and textile artefacts, providing results complementary to those from high-performance liquid chromatography (HPLC).

  17. An experimental and theoretical study of reaction mechanisms between nitriles and hydroxylamine.

    PubMed

    Vörös, Attila; Mucsi, Zoltán; Baán, Zoltán; Timári, Géza; Hermecz, István; Mizsey, Péter; Finta, Zoltán

    2014-10-28

    The industrially relevant reaction between nitriles and hydroxylamine yielding amidoximes was studied in different molecular solvents and in ionic liquids. In industry, this procedure is carried out on the ton scale in alcohol solutions and the above transformation produces a significant amount of unexpected amide by-product, depending on the nature of the nitrile, which can cause further analytical and purification issues. Although there were earlier attempts to propose mechanisms for this transformation, the real reaction pathway is still under discussion. A new detailed reaction mechanistic explanation, based on theoretical and experimental proof, is given to augment the former mechanisms, which allowed us to find a more efficient, side-product free procedure. Interpreting the theoretical results obtained, it was shown that the application of specific imidazolium, phosphonium and quaternary ammonium based ionic liquids could decrease simultaneously the reaction time while eliminating the amide side-product, leading to the targeted product selectively. This robust and economic procedure now affords a fast, selective amide free synthesis of amidoximes.

  18. Simultaneous determination of PPCPs, EDCs, and artificial sweeteners in environmental water samples using a single-step SPE coupled with HPLC-MS/MS and isotope dilution.

    PubMed

    Tran, Ngoc Han; Hu, Jiangyong; Ong, Say Leong

    2013-09-15

    A high-throughput method for the simultaneous determination of 24 pharmaceuticals and personal care products (PPCPs), endocrine disrupting chemicals (EDCs) and artificial sweeteners (ASs) was developed. The method was based on a single-step solid phase extraction (SPE) coupled with high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) and isotope dilution. In this study, a single-step SPE procedure was optimized for simultaneous extraction of all target analytes. Good recoveries (≥ 70%) were observed for all target analytes when extraction was performed using Chromabond(®) HR-X (500 mg, 6 mL) cartridges under acidic condition (pH 2). HPLC-MS/MS parameters were optimized for the simultaneous analysis of 24 PPCPs, EDCs and ASs in a single injection. Quantification was performed by using 13 isotopically labeled internal standards (ILIS), which allows correcting efficiently the loss of the analytes during SPE procedure, matrix effects during HPLC-MS/MS and fluctuation in MS/MS signal intensity due to instrument. Method quantification limit (MQL) for most of the target analytes was below 10 ng/L in all water samples. The method was successfully applied for the simultaneous determination of PPCPs, EDCs and ASs in raw wastewater, surface water and groundwater samples collected in a local catchment area in Singapore. In conclusion, the developed method provided a valuable tool for investigating the occurrence, behavior, transport, and the fate of PPCPs, EDCs and ASs in the aquatic environment. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Simplified multiple scattering model for radiative transfer in turbid water

    NASA Technical Reports Server (NTRS)

    Ghovanlou, A. H.; Gupta, G. N.

    1978-01-01

    Quantitative analytical procedures for relating selected water quality parameters to the characteristics of the backscattered signals, measured by remote sensors, require the solution of the radiative transport equation in turbid media. Presented is an approximate closed form solution of this equation and based on this solution, the remote sensing of sediments is discussed. The results are compared with other standard closed form solutions such as quasi-single scattering approximations.

  20. Monte Carlo calculation of dynamical properties of the two-dimensional Hubbard model

    NASA Technical Reports Server (NTRS)

    White, S. R.; Scalapino, D. J.; Sugar, R. L.; Bickers, N. E.

    1989-01-01

    A new method is introduced for analytically continuing imaginary-time data from quantum Monte Carlo calculations to the real-frequency axis. The method is based on a least-squares-fitting procedure with constraints of positivity and smoothness on the real-frequency quantities. Results are shown for the single-particle spectral-weight function and density of states for the half-filled, two-dimensional Hubbard model.
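
    A toy version of such a constrained least-squares continuation can be written as a Tikhonov-regularized non-negative least-squares problem: discretize the spectral function, build the kernel linking it to imaginary-time data, and stack a second-difference penalty onto the design matrix. This is only an illustration of the idea, not the authors' procedure.

```python
# Toy sketch of least-squares analytic continuation with positivity and
# smoothness constraints (not the authors' exact procedure): discretize
#   G(tau) = sum_w K(tau, w) A(w) dw,  K = exp(-tau*w) / (1 + exp(-beta*w)),
# and solve min ||G - K A||^2 + lam*||D2 A||^2 subject to A >= 0 via NNLS.
import numpy as np
from scipy.optimize import nnls

beta, n_tau, n_w = 10.0, 40, 120
tau = np.linspace(0, beta, n_tau, endpoint=False)
w = np.linspace(-8, 8, n_w)
dw = w[1] - w[0]

K = np.exp(-np.outer(tau, w)) / (1.0 + np.exp(-beta * w)) * dw

# Synthetic "exact" spectrum (two Gaussian peaks) and noisy G(tau).
A_true = np.exp(-(w - 2) ** 2) + 0.6 * np.exp(-(w + 1.5) ** 2 / 0.5)
A_true /= A_true.sum() * dw
G = K @ A_true + np.random.default_rng(3).normal(0, 1e-4, n_tau)

# Second-difference operator enforcing smoothness of the recovered spectrum.
D2 = np.diff(np.eye(n_w), 2, axis=0)
lam = 1e-3
A_fit, _ = nnls(np.vstack([K, np.sqrt(lam) * D2]),
                np.concatenate([G, np.zeros(D2.shape[0])]))
print("sum rule:", A_fit.sum() * dw)   # should be close to 1
```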

  1. Intrinsic Remediation Engineering Evaluation/Cost Analysis for Car Care Center at Bolling Air Force Base, Washington, DC

    DTIC Science & Technology

    1997-01-01

    supplemented using established literature values for similar aquifer materials. The groundwater sampling activities and analytical results from both...subsurface materials recovered. Observed soil classification types compared very favorably to the soil classifications determined by the CPT tests. ...other similar substances were handled in a manner consistent with accepted safety procedures and standard operating practices. Well completion materials

  2. Unsteady Flow Simulation: A Numerical Challenge

    DTIC Science & Technology

    2003-03-01

    drive to convergence the numerical unsteady term. The time marching procedure is based on the approximate implicit Newton method for systems of non...computed through analytical derivatives of S. The linear system stemming from equation (3) is solved at each integration step by the same iterative method...significant reduction of memory usage, thanks to the reduced dimensions of the linear system matrix during the implicit marching of the solution. The

  3. Sensitivity of fish density estimates to standard analytical procedures applied to Great Lakes hydroacoustic data

    USGS Publications Warehouse

    Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.

    2013-01-01

    Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.
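
    Two of the recommendations discussed -- noise subtraction and a TS-based Sv threshold -- reduce to simple operations on the backscattering data. The sketch below uses hypothetical numbers rather than the GLSOP parameter values.

```python
# Sketch of two GLSOP-style corrections (hypothetical numbers, not the official
# parameter values): subtract background noise from Sv in the linear domain and
# then discard cells below a TS-based Sv threshold.
import numpy as np

sv_measured = np.array([-66.0, -72.0, -78.0, -84.0])   # dB re 1 m^-1
noise = -80.0                                           # estimated background, dB

linear = 10 ** (sv_measured / 10) - 10 ** (noise / 10)
sv_clean = np.where(linear > 0,
                    10 * np.log10(np.clip(linear, 1e-12, None)),
                    -999.0)                             # flag cells at/below noise

sv_threshold = -75.0                    # e.g. derived from a minimum TS of interest
sv_final = np.where(sv_clean >= sv_threshold, sv_clean, -999.0)
print(sv_final)
```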

  4. Analytical approaches to image orientation and stereo digitization applied in the Bundlab software. (Polish Title: Rozwiazania analityczne zwiazane z obsluga procesu orientacji zdjec oraz wykonywaniem opracowan wektorowych w programie Bundlab)

    NASA Astrophysics Data System (ADS)

    Kolecki, J.

    2015-12-01

    The Bundlab software has been developed mainly for academic and research applications. This work can be treated as a report describing the current state of development of the program, focusing especially on the analytical solutions. First, the overall characteristics of the software are provided. The image orientation procedure is then described, starting from relative orientation. The applied solution is based on the coplanarity equation parametrized with the essential matrix; the problem is reformulated so that it can be solved using methods of algebraic geometry, and the solution is then refined by least-squares optimization. The formation of the image block from the oriented models, as well as the absolute orientation procedure, was implemented using the Horn approach as the base algorithm. The second part of the paper is devoted to the tools and methods applied in the stereo digitization module, including the solutions that support the user and improve accuracy. A few exemplary applications and products are mentioned, and the work finishes with concepts for developing and improving the existing functions.
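
    The sketch below illustrates only the coplanarity/essential-matrix step, using the classical linear (8-point) estimate followed by projection onto the essential manifold; it is an alternative to, not a reproduction of, the algebraic-geometry solution implemented in Bundlab.

```python
# Illustrative alternative (not the Bundlab algebraic-geometry solver): a linear
# 8-point estimate of the essential matrix E from normalized image rays, i.e. the
# coplanarity condition x2^T E x1 = 0, followed by projection onto the essential
# manifold (two equal singular values, third equal to zero).
import numpy as np

def essential_8point(x1, x2):
    """x1, x2: (n, 3) arrays of normalized homogeneous image coordinates."""
    # Each correspondence gives one linear equation in the 9 entries of E.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0],            x1[:, 1],            np.ones(len(x1)),
    ])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)            # null-space solution, up to scale
    U, s, Vt = np.linalg.svd(E)
    sigma = (s[0] + s[1]) / 2.0
    return U @ np.diag([sigma, sigma, 0.0]) @ Vt

# Tiny synthetic check: two cameras related by a small rotation and translation;
# verify that x2^T E x1 is ~0 for all correspondences.
rng = np.random.default_rng(4)
X = np.column_stack([rng.uniform(-1, 1, 12), rng.uniform(-1, 1, 12),
                     rng.uniform(4, 8, 12)])
th = np.deg2rad(5)
R = np.array([[np.cos(th), 0, np.sin(th)], [0, 1, 0], [-np.sin(th), 0, np.cos(th)]])
t = np.array([0.2, 0.05, 0.0])
x1 = X / X[:, 2:3]
X2 = X @ R.T + t
x2 = X2 / X2[:, 2:3]
E = essential_8point(x1, x2)
print(np.abs(np.einsum("ij,jk,ik->i", x2, E, x1)).max())   # should be ~0
```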

  5. Evaluation of new natural deep eutectic solvents for the extraction of isoflavones from soy products.

    PubMed

    Bajkacz, Sylwia; Adamek, Jakub

    2017-06-01

    Natural deep eutectic solvents (NADESs) are considered to be new, safe solvents in green chemistry that can be widely used in many chemical processes such as extraction or synthesis. In this study, a simple extraction method based on NADES was used for the isolation of isoflavones (daidzin, genistin, genistein, daidzein) from soy products. Seventeen different NADES systems each including two or three components were tested. Multivariate data analysis revealed that NADES based on a 30% solution of choline chloride: citric acid (molar ratio of 1:1) are the most effective systems for the extraction of isoflavones from soy products. After extraction, the analytes were detected and quantified using ultra-high performance liquid chromatography with ultraviolet detection (UHPLC-UV). The proposed NADES extraction procedure achieved enrichment factors up to 598 for isoflavones and the recoveries of the analytes were in the range 64.7-99.2%. The developed NADES extraction procedure and UHPLC-UV determination method was successfully applied for the analysis of isoflavones in soy-containing food samples. The obtained results indicated that new natural deep eutectic solvents could be an alternative to traditional solvents for the extraction of isoflavones and can be used as sustainable and safe extraction media for another applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, there were only some sketchy steps focusing on policy assessment in the system of Taiwan. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on the theoretical considerations by systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed but the implementation phase of this policy was not begun because the related legislation procedure could not be arranged due to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001 and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to effectively and efficiently assist the related authorities for SEA.
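
    Of the techniques combined in the assessment framework, the analytic hierarchy process step is the easiest to sketch: criterion weights are taken as the normalized principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check. The matrix below is hypothetical and does not reflect the study's actual judgments.

```python
# Standard AHP step only (hypothetical 3x3 pairwise comparison of ecological,
# social and economic criteria; not the study's actual judgments): weights are
# the normalized principal eigenvector, and CR checks judgment consistency.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random consistency index
print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))
```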

  7. New procedure for multielemental speciation analysis of five toxic species: As(III), As(V), Cr(VI), Sb(III) and Sb(V) in drinking water samples by advanced hyphenated technique HPLC/ICP-DRC-MS.

    PubMed

    Marcinkowska, Monika; Komorowicz, Izabela; Barałkiewicz, Danuta

    2016-05-12

    An analytical procedure dedicated to the multielemental determination of toxic species -- As(III), As(V), Cr(VI), Sb(III) and Sb(V) -- in drinking water samples using high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-DRC-MS) was developed. Optimization of the detection and separation conditions was conducted. A dynamic reaction cell (DRC) with oxygen as the reaction gas was used in the experiments. The analytical signals obtained for the species separated by anion-exchange chromatography were symmetrical. The applied mobile phase consisted of 3 mM EDTANa2 and 36 mM ammonium nitrate. Full separation of the species, present as H3AsO3, H2AsO4(-), SbO2(-), Sb(OH)6(-) and CrO4(2-), was achieved in 15 min using a gradient elution program. Detailed validation of the analytical procedure proved the reliability of the analytical measurements. The procedure was characterized by high precision, in the range from 1.7% to 2.4%. Detection limits (LD) were 0.067 μg L(-1), 0.068 μg L(-1), 0.098 μg L(-1), 0.083 μg L(-1) and 0.038 μg L(-1) for As(III), As(V), Cr(VI), Sb(III) and Sb(V), respectively. The recoveries obtained confirmed the lack of interference effects on the analytical signals, as their values were in the range of 91%-110%. The applicability of the proposed procedure was tested on drinking water samples with mineralization up to 650 mg L(-1). Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Design and analysis of composite structures with stress concentrations

    NASA Technical Reports Server (NTRS)

    Garbo, S. P.

    1983-01-01

    An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevancy of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.

  9. Development of an analytical procedure to study linear alkylbenzenesulphonate (LAS) degradation in sewage sludge-amended soils.

    PubMed

    Comellas, L; Portillo, J L; Vaquero, M T

    1993-12-24

    A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol and reflux was compared with a previously described extraction procedure in Soxhlet with methanol and solid sodium hydroxide in the sample. Repeatability results were similar with savings in extraction time, solvents and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The standing procedure was applied to high doses of sewage sludge-amended soils (15%) with increasing quantities of added LASs. Degradation data for a 116-day period are presented.

  10. Violent Video Game Effects on Aggression, Empathy, and Prosocial Behavior in Eastern and Western Countries: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba

    2010-01-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…

  11. 40 CFR 63.786 - Test methods and procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...

  12. 40 CFR 63.786 - Test methods and procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...

  13. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    PubMed

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of the paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS and also, accreditation bodies do not ask the laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three different types; collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or register actual laboratory errors and relate these to quality indicators. These three types have different focus and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  14. Vortex-assisted emulsification semimicroextraction for the analytical control of restricted ingredients in cosmetic products: determination of bronopol by liquid chromatography.

    PubMed

    Miralles, Pablo; Bellver, Raquel; Chisvert, Alberto; Salvador, Amparo

    2016-03-01

    Vortex-assisted emulsification semimicroextraction is proposed as a one-step solution-extraction procedure for sample preparation in cosmetic products. The procedure allows rapid preparation based on dispersion of the sample in a mixture of 1 mL of n-hexane and 0.5 mL of ethanol, followed by the addition of 0.5 mL of water and centrifugation to obtain two separated phases. This procedure provides good sample clean-up with minimum dilution and is very useful for the determination of ingredients with restricted concentrations, such as bronopol. The procedure was applied to the determination of bronopol by liquid chromatography with UV detection. The best chromatographic separation was obtained by using a C18 column set at 40 °C and performing a stepwise elution with a mixture of ethanol/aqueous 1 % acetic acid solution as mobile phase pumped at 0.5 mL min(-1). The detection wavelength was set at 250 nm and the total run time required was 12 min. The method was successfully applied to 18 commercial cosmetic samples including creams, shampoos, and bath gels. Good recoveries and repeatability were obtained, with a limit of detection of 0.9 μg mL(-1), which makes the method suitable for the analytical control of cosmetic products. Moreover, it could be considered environmentally friendly, because water, ethanol, and only a low volume of n-hexane are used as solvents.

  15. Cytological preparations for molecular analysis: A review of technical procedures, advantages and limitations for referring samples for testing.

    PubMed

    da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P

    2018-04-01

    Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with minimum requirements for referring cytological samples for testing. The present manuscript is a review with a comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.

  16. Novel approaches to analysis by flow injection gradient titration.

    PubMed

    Wójtowicz, Marzena; Kozak, Joanna; Kościelniak, Paweł

    2007-09-26

    Two novel procedures for flow injection gradient titration with the use of a single stock standard solution are proposed. In the multi-point single-line (MP-SL) method the calibration graph is constructed on the basis of a set of standard solutions, which are generated in a standard reservoir and subsequently injected into the titrant. According to the single-point multi-line (SP-ML) procedure the standard solution and a sample are injected into the titrant stream from four loops of different capacities, hence four calibration graphs can be constructed and the analytical result is calculated on the basis of a generalized slope of these graphs. Both approaches have been tested on the example of spectrophotometric acid-base titration of hydrochloric and acetic acids, using bromothymol blue and phenolphthalein as indicators, respectively, and sodium hydroxide as the titrant. Under optimized experimental conditions, analytical results with precision better than 1.8 and 2.5% (RSD) and accuracy better than 3.0 and 5.4% (relative error, RE) were obtained for the MP-SL and SP-ML procedures, respectively, in ranges of 0.0031-0.0631 mol L(-1) for samples of hydrochloric acid and of 0.1680-1.7600 mol L(-1) for samples of acetic acid. The feasibility of both methods was illustrated by applying them to the total acidity determination in vinegar samples, with precision better than 0.5 and 2.9% (RSD) for the MP-SL and SP-ML procedures, respectively.

  17. The pitfalls of hair analysis for toxicants in clinical practice: three case reports.

    PubMed Central

    Frisch, Melissa; Schwartz, Brian S

    2002-01-01

    Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463
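
    The last limitation mentioned -- the increased likelihood of false positives when panels of analytes are measured -- follows from elementary probability. Assuming independent tests with a per-analyte false-positive rate of 2.5% (for example, values flagged above a 97.5th-percentile reference limit):

```python
# If each of n independent analytes has false-positive probability alpha
# (e.g. a value above a 97.5th-percentile reference limit), the chance that a
# healthy patient shows at least one "elevated" result is 1 - (1 - alpha)**n.
alpha = 0.025
for n in (1, 10, 20, 40):
    p_any = 1 - (1 - alpha) ** n
    print(f"{n:2d} analytes: P(at least one false positive) = {p_any:.2f}")
```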

  18. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  19. Collaborative trial validation study of two methods, one based on high performance liquid chromatography-tandem mass spectrometry and on gas chromatography-mass spectrometry for the determination of acrylamide in bakery and potato products.

    PubMed

    Wenzl, Thomas; Karasek, Lubomir; Rosen, Johan; Hellenaes, Karl-Erik; Crews, Colin; Castle, Laurence; Anklam, Elke

    2006-11-03

    A European inter-laboratory study was conducted to validate two analytical procedures for the determination of acrylamide in bakery ware (crispbreads, biscuits) and potato products (chips), within a concentration range from about 20 microg/kg to about 9000 microg/kg. The methods are based on gas chromatography-mass spectrometry (GC-MS) of the derivatised analyte and on high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) of native acrylamide. Isotope dilution with isotopically labelled acrylamide was an integral part of both methods. The study was evaluated according to internationally accepted guidelines. The performance of the HPLC-MS/MS method was found to be superior to that of the GC-MS method and to be fit-for-the-purpose.

  20. Evolution of microbiological analytical methods for dairy industry needs.

    PubMed

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.

  1. Development of Amperometric Biosensors Based on Nanostructured Tyrosinase-Conducting Polymer Composite Electrodes

    PubMed Central

    Lupu, Stelian; Lete, Cecilia; Balaure, Paul Cătălin; Caval, Dan Ion; Mihailciuc, Constantin; Lakard, Boris; Hihn, Jean-Yves; del Campo, Francisco Javier

    2013-01-01

    Bio-composite coatings consisting of poly(3,4-ethylenedioxythiophene) (PEDOT) and tyrosinase (Ty) were successfully electrodeposited on conventional size gold (Au) disk electrodes and microelectrode arrays using sinusoidal voltages. Electrochemical polymerization of the corresponding monomer was carried out in the presence of various Ty amounts in aqueous buffered solutions. The bio-composite coatings prepared using sinusoidal voltages and potentiostatic electrodeposition methods were compared in terms of morphology, electrochemical properties, and biocatalytic activity towards various analytes. The amperometric biosensors were tested in dopamine (DA) and catechol (CT) electroanalysis in aqueous buffered solutions. The analytical performance of the developed biosensors was investigated in terms of linear response range, detection limit, sensitivity, and repeatability. A semi-quantitative multi-analyte procedure for simultaneous determination of DA and CT was developed. The amperometric biosensor prepared using sinusoidal voltages showed much better analytical performance. The Au disk biosensor obtained by 50 mV alternating voltage amplitude displayed a linear response for DA concentrations ranging from 10 to 300 μM, with a detection limit of 4.18 μM. PMID:23698270

  2. Determination of Caffeine in Beverages by Capillary Zone Electrophoresis: An Experiment for the Undergraduate Analytical Laboratory

    NASA Astrophysics Data System (ADS)

    Conte, Eric D.; Barry, Eugene F.; Rubinstein, Harry

    1996-12-01

    Certain individuals may be sensitive to specific compounds in consumer products. It is important to quantify these analytes in food products in order to monitor their intake. Caffeine is one such compound. Determination of caffeine in beverages by spectrophotometric procedures requires an extraction procedure, which can prove time-consuming. Although the corresponding determination by HPLC allows for a direct injection, capillary zone electrophoresis provides several advantages such as extremely low solvent consumption, smaller sample volume requirements, and improved sensitivity.

  3. VOFTools - A software package of calculation tools for volume of fluid methods using general convex grids

    NASA Astrophysics Data System (ADS)

    López, J.; Hernández, J.; Gómez, P.; Faura, F.

    2018-02-01

    The VOFTools library includes efficient analytical and geometrical routines for (1) area/volume computation, (2) truncation operations that typically arise in VOF (volume of fluid) methods, (3) area/volume conservation enforcement (VCE) in PLIC (piecewise linear interface calculation) reconstruction and (4) computation of the distance from a given point to the reconstructed interface. The computation of a polyhedron volume uses an efficient formula based on a quadrilateral decomposition and a 2D projection of each polyhedron face. The analytical VCE method is based on coupling an interpolation procedure to bracket the solution with an improved final calculation step based on the above volume computation formula. Although the library was originally created to help develop highly accurate advection and reconstruction schemes in the context of VOF methods, it may have more general applications. To assess the performance of the supplied routines, different tests, which are provided in FORTRAN and C, were implemented for several 2D and 3D geometries.
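
    For comparison with the quadrilateral-decomposition formula used in VOFTools, a generic way to compute the volume of a closed polyhedron is the divergence-theorem sum over triangulated, outward-oriented faces; the sketch below is that generic formula, not the library routine.

```python
# Generic volume of a closed polyhedron via the divergence theorem
# (not VOFTools' quadrilateral-decomposition formula): triangulate each face
# fan-wise and sum signed tetrahedron volumes v0 . (v1 x v2) / 6.
import numpy as np

def polyhedron_volume(vertices, faces):
    """vertices: (n, 3); faces: lists of vertex indices, outward-oriented (CCW)."""
    V = 0.0
    for face in faces:
        v0 = vertices[face[0]]
        for i in range(1, len(face) - 1):
            v1, v2 = vertices[face[i]], vertices[face[i + 1]]
            V += np.dot(v0, np.cross(v1, v2)) / 6.0
    return abs(V)

# Unit cube as a check (faces listed counter-clockwise when seen from outside).
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                  [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
faces = [[0, 3, 2, 1], [4, 5, 6, 7], [0, 1, 5, 4],
         [1, 2, 6, 5], [2, 3, 7, 6], [3, 0, 4, 7]]
print(polyhedron_volume(verts, faces))   # -> 1.0
```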

  4. Estimation of the diagnostic threshold accounting for decision costs and sampling uncertainty.

    PubMed

    Skaltsa, Konstantina; Jover, Lluís; Carrasco, Josep Lluís

    2010-10-01

    Medical diagnostic tests are used to classify subjects as non-diseased or diseased. The classification rule usually consists of classifying subjects using the values of a continuous marker that is dichotomised by means of a threshold. Here, the optimum threshold estimate is found by minimising a cost function that accounts for both decision costs and sampling uncertainty. The cost function is optimised either analytically in a normal distribution setting or empirically in a free-distribution setting when the underlying probability distributions of diseased and non-diseased subjects are unknown. Inference on the threshold estimates is based on approximate analytical standard errors and bootstrap-based approaches. The performance of the proposed methodology is assessed by means of a simulation study, and the sample size required for a given confidence interval precision and sample size ratio is also calculated. Finally, a case example based on previously published data concerning the diagnosis of Alzheimer's patients is provided in order to illustrate the procedure.
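
    A simple numeric stand-in for the cost-minimizing threshold (here a grid search under assumed normal marker distributions and made-up costs and prevalence, rather than the paper's analytical or bootstrap machinery) looks like this:

```python
# Simple numeric illustration (assumed normal marker distributions and made-up
# costs, not the paper's analytical solution): pick the threshold minimizing
#   expected cost = cFP*(1-prev)*P(X > t | healthy) + cFN*prev*P(X <= t | diseased).
import numpy as np
from scipy.stats import norm

mu0, sd0 = 0.0, 1.0          # non-diseased marker distribution
mu1, sd1 = 1.5, 1.2          # diseased marker distribution
prev, c_fp, c_fn = 0.2, 1.0, 5.0

t = np.linspace(-3, 5, 2001)
cost = (c_fp * (1 - prev) * norm.sf(t, mu0, sd0)
        + c_fn * prev * norm.cdf(t, mu1, sd1))
t_opt = t[np.argmin(cost)]
print(f"optimal threshold ~ {t_opt:.3f}, expected cost = {cost.min():.4f}")
```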

  5. Raman spectroscopy for the analytical quality control of low-dose break-scored tablets.

    PubMed

    Gómez, Diego A; Coello, Jordi; Maspoch, Santiago

    2016-05-30

    Quality control of solid dosage forms involves the analysis of end products according to well-defined criteria, including the assessment of the uniformity of dosage units (UDU). However, in the case of break-scored tablets, given that tablet splitting is widespread as a means to adjust doses, the uniform distribution of the active pharmaceutical ingredient (API) in all the possible fractions of the tablet must be assessed. A general procedure to address both issues, using Raman spectroscopy, is presented. It is based on the acquisition of a collection of spectra in different regions of the tablet, which can later be selected to determine the amount of API in the potential fractions that can result after splitting. The procedure has been applied to two commercial products, Sintrom 1 and Sintrom 4, with API (acenocoumarol) mass proportions of 2% and 0.7%, respectively. Partial Least Squares (PLS) calibration models were constructed for the quantification of acenocoumarol in whole tablets using HPLC as a reference analytical method. Once validated, the calibration models were used to determine the API content in the different potential fragments of the scored Sintrom 4 tablets. Fragment mass measurements were also performed to estimate the range of masses of the halves and quarters that could result after tablet splitting. The results show that Raman spectroscopy can be an alternative analytical procedure to assess the uniformity of content, both in whole tablets and in their potential fragments, and that Sintrom 4 tablets can be split cleanly into halves, although some caution is needed when considering fragmentation into quarters. A practical alternative to the use of the UDU test for the assessment of tablet fragments is proposed. Copyright © 2016 Elsevier B.V. All rights reserved.
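
    A bare-bones version of the chemometric step -- PLS calibration of API content from spectra -- is sketched below on fully simulated data; the published model was built against HPLC reference values and real Raman spectra.

```python
# Bare-bones PLS calibration sketch on synthetic "Raman spectra" (the published
# model used HPLC reference values; everything here is simulated).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n_tablets, n_points = 60, 300
api = rng.uniform(0.8, 1.2, n_tablets)                 # relative API content
band = np.exp(-0.5 * ((np.arange(n_points) - 150) / 8.0) ** 2)
spectra = (api[:, None] * band                         # API band scales with content
           + 0.003 * rng.normal(size=(n_tablets, n_points))
           + rng.uniform(0.0, 0.05, (n_tablets, 1)))   # baseline offsets

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, api, cv=5).ravel()
rmsecv = np.sqrt(np.mean((pred - api) ** 2))
print(f"RMSECV ~ {rmsecv:.4f} (relative API units)")
```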

  6. General method of solving the Schroedinger equation of atoms and molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakatsuji, Hiroshi

    2005-12-15

    We propose a general method of solving the Schroedinger equation of atoms and molecules. We first construct the wave function having the exact structure, using the ICI (iterative configuration or complement interaction) method and then optimize the variables involved by the variational principle. Based on the scaled Schroedinger equation and related principles, we can avoid the singularity problem of atoms and molecules and formulate a general method of calculating the exact wave functions in an analytical expansion form. We choose initial function {psi}{sub 0} and scaling g function, and then the ICI method automatically generates the wave function that has the exact structure by using the Hamiltonian of the system. The Hamiltonian contains all the information of the system. The free ICI method provides a flexible and variationally favorable procedure of constructing the exact wave function. We explain the computational procedure of the analytical ICI method routinely performed in our laboratory. Simple examples are given using hydrogen atom for the nuclear singularity case, the Hooke's atom for the electron singularity case, and the helium atom for both cases.

  7. Stressful life events during adolescence and risk for externalizing and internalizing psychopathology: a meta-analysis.

    PubMed

    March-Llanes, Jaume; Marqués-Feixa, Laia; Mezquita, Laura; Fañanás, Lourdes; Moya-Higueras, Jorge

    2017-12-01

    The main objective of the present research was to analyze the relations between stressful life events and the externalizing and internalizing spectra of psychopathology using meta-analytical procedures. After removing the duplicates, a total of 373 papers were found in a literature search using several bibliographic databases, such as the PsycINFO, Medline, Scopus, and Web of Science. Twenty-seven studies were selected for the meta-analytical analysis after applying different inclusion and exclusion criteria in different phases. The statistical procedure was performed using a random/mixed-effects model based on the correlations found in the studies. Significant positive correlations were found in cross-sectional and longitudinal studies. A transactional effect was then found in the present study. Stressful life events could be a cause, but also a consequence, of psychopathological spectra. The level of controllability of the life events did not affect the results. Special attention should be given to the usage of stressful life events in gene-environment interaction and correlation studies, and also for clinical purposes.
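
    One common way to implement the random-effects pooling of correlations described above is DerSimonian-Laird estimation on Fisher-z-transformed values; the sketch below uses made-up study data.

```python
# Compact DerSimonian-Laird random-effects pooling of correlations via Fisher's z
# (one common implementation of the kind of model described; data are made up).
import numpy as np

r = np.array([0.21, 0.15, 0.30, 0.10, 0.25])     # study correlations
n = np.array([250, 180, 400, 90, 320])           # study sample sizes

z = np.arctanh(r)                # Fisher z transform
v = 1.0 / (n - 3)                # within-study variances
w = 1.0 / v

z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(r) - 1)) / C)          # between-study variance

w_star = 1.0 / (v + tau2)
z_pooled = np.sum(w_star * z) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
print(f"pooled r = {np.tanh(z_pooled):.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```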

  8. Determination of different recreational drugs in sweat by headspace solid-phase microextraction gas chromatography mass spectrometry (HS-SPME GC/MS): Application to drugged drivers.

    PubMed

    Gentili, Stefano; Mortali, Claudia; Mastrobattista, Luisa; Berretta, Paolo; Zaami, Simona

    2016-09-10

    A procedure based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography/mass spectrometry (GC/MS) has been developed for the determination of the most commonly used drugs of abuse in the sweat of drivers stopped during roadside controls. A DrugWipe 5A sweat screening device was used to collect sweat with a specific pad rubbed gently over the forehead skin surface. The procedure involved acid hydrolysis, HS-SPME extraction for all drugs of abuse except Δ(9)-tetrahydrocannabinol, which was directly extracted under alkaline-medium HS-SPME conditions, GC separation of the analytes on a capillary column and MS detection by electron impact ionisation. The method was linear from the limit of quantification (LOQ) to 50 ng drug per pad (r(2)≥0.99), with intra- and inter-assay precision and accuracy always less than 15% and an analytical recovery between 95.1% and 102.8%, depending on the analyte considered. Using the validated method, sweat samples from 60 apparently intoxicated drivers were found positive for one or more drugs of abuse, showing sweat patch testing to be a viable, economical and simple alternative to conventional (blood and/or urine) and non-conventional (oral fluid) testing for drugs of abuse in drugged drivers. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Planned Variation in Preanalytical Conditions to Evaluate Biospecimen Stability in the National Children’s Study (NCS)

    PubMed Central

    Mechanic, Leah; Mendez, Armando; Merrill, Lori; Rogers, John; Layton, Marnie; Todd, Deborah; Varanasi, Arti; O’Brien, Barbara; Meyer, William A.; Zhang, Ming; Schleicher, Rosemary L.; Moye, Jack

    2014-01-01

    BACKGROUND Preanalytical conditions encountered during collection, processing, and storage of biospecimens may influence laboratory results. The National Children’s Study (NCS) is a planned prospective cohort study of 100,000 families to examine the influence of a wide variety of exposures on child health. In developing biospecimen collection, processing, and storage procedures for the NCS, we identified several analytes of different biochemical categories for which it was unclear to what extent deviations from NCS procedures could influence measurement results. METHODS A pilot study was performed to examine effects of preanalytic sample handling conditions (delays in centrifugation, freezing delays, delays in separation from cells, additive delay, and tube type) on concentrations of eight different analytes. 2,825 measurements were made to assess 15 unique combinations of analyte and handling conditions in blood collected from 151 women of childbearing age (≥20 individuals per handling condition). RESULTS The majority of analytes were stable under the conditions evaluated. However, levels of plasma interleukin-6 and serum insulin were decreased in response to sample centrifugation delays of up to 5.5 hours post collection (P<0.0001). In addition, delays in freezing centrifuged plasma samples (comparing 24, 48 and 72 hours to immediate freezing) resulted in increased levels of adrenocorticotropic hormone (P=0.0014). CONCLUSIONS Determining stability of proposed analytes in response to preanalytical conditions and handling helps to ensure high-quality specimens for study now and in the future. The results inform development of procedures, plans for measurement of analytes, and interpretation of laboratory results. PMID:23924524

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoenig, M.; Elsen, Y.V.; Cauter, R.V.

    The progressive degradation of the pyrolytic graphite surface of atomizers provides variable and misleading results for molybdenum peak-height measurements. The changes in peak shape cause no analytical problems during the lifetime of the atomizer (approx. 300 firings) when integrated absorbance (A·s signals) is considered and possible base-line drifts are controlled. This was demonstrated on plant samples mineralized by simple digestion with a mixture of HNO3 and H2O2. The value of this method was assessed by comparison with a standard dry-oxidation method and by molybdenum determination in National Bureau of Standards reference plant samples. The relative standard deviations (n = 5) of the full analytical procedure do not exceed 7%. 13 references, 3 figures, 3 tables.

  11. Analysis of high-aspect-ratio jet-flap wings of arbitrary geometry

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    An analytical technique to compute the performance of an arbitrary jet-flapped wing is developed. The solution technique is based on the method of Maskell and Spence in which the well-known lifting-line approach is coupled with an auxiliary equation providing the extra function needed in jet-flap theory. The present method is generalized to handle straight, uncambered wings of arbitrary planform, twist, and blowing (including unsymmetrical cases). An analytical procedure is developed for continuous variations in the above geometric data with special functions to exactly treat discontinuities in any of the geometric and blowing data. A rational theory for the effect of finite wing thickness is introduced as well as simplified concepts of effective aspect ratio for rapid estimation of performance.

  12. A methodology for the assessment of manned flight simulator fidelity

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; Malsbury, Terry N.

    1989-01-01

    A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.

  13. Building analytical three-field cosmological models

    NASA Astrophysics Data System (ADS)

    Santos, J. R. L.; Moraes, P. H. R. S.; Ferreira, D. A.; Neta, D. C. Vilar

    2018-02-01

    The analytical treatment of models composed of three real scalar fields is a difficult task, as their equations of motion are in general coupled and hard to integrate. In order to overcome this problem, we introduce a methodology to construct three-field models based on the so-called "extension method". The fundamental idea of the procedure is to combine three one-field systems in a non-trivial way to construct an effective three-scalar-field model. An interesting scenario in which the method can be implemented is inflationary models, where the Einstein-Hilbert Lagrangian is coupled with the scalar field Lagrangian. We exemplify how a new model constructed from our method can lead to non-trivial behaviors for cosmological parameters.

  14. Analytical sensor redundancy assessment

    NASA Technical Reports Server (NTRS)

    Mulcare, D. B.; Downing, L. E.; Smith, M. K.

    1988-01-01

    The rationale and mechanization of sensor fault tolerance based on analytical redundancy principles are described. The concept involves the substitution of software procedures, such as an observer algorithm, to supplant additional hardware components. The observer synthesizes values of sensor states in lieu of their direct measurement. Such information can then be used, for example, to determine which of two disagreeing sensors is more correct, thus enhancing sensor fault survivability. Here a stability augmentation system is used as an example application, with required modifications being made to a quadruplex digital flight control system. The impact on software structure and the resultant revalidation effort are illustrated as well. Also, the use of an observer algorithm for wind gust filtering of the angle-of-attack sensor signal is presented.
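
    A minimal sketch of the analytical-redundancy idea, assuming a discrete-time Luenberger-style observer whose synthesized output arbitrates between two disagreeing sensors; the system matrices, observer gain, noise levels, and readings are hypothetical, not those of the flight control system described above:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical discrete-time model x[k+1] = A x[k] + B u[k], y[k] = C x[k].
        A = np.array([[1.0, 0.1], [0.0, 0.95]])
        B = np.array([[0.0], [0.1]])
        C = np.array([[1.0, 0.0]])
        L = np.array([[0.5], [0.8]])                 # observer gain (placeholder choice)

        def observer_step(x_hat, u, y_meas):
            """One Luenberger observer update driven by the measurement y_meas."""
            return A @ x_hat + B @ u + L @ (y_meas - C @ x_hat)

        # Simulate the plant and drive the observer with a healthy sensor's noisy output.
        x = np.array([[0.2], [0.0]])
        x_hat = np.zeros((2, 1))
        u = np.array([[1.0]])
        for _ in range(100):
            y_healthy = (C @ x).item() + rng.normal(scale=0.01)
            x_hat = observer_step(x_hat, u, y_healthy)
            x = A @ x + B @ u

        # Two redundant sensors now disagree; keep the one closer to the synthesized value.
        y_hat = (C @ x_hat).item()
        y_a = (C @ x).item() + 0.02                  # healthy sensor
        y_b = (C @ x).item() + 1.50                  # simulated faulty sensor
        selected = y_a if abs(y_a - y_hat) <= abs(y_b - y_hat) else y_b
        print(f"synthesized output {y_hat:.3f}; reading retained {selected:.3f}")

    In practice the observer gain would be designed against the validated vehicle model (for example by pole placement), and the arbitration logic would use persistence and threshold checks rather than a single-sample comparison.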

  15. 42 CFR 493.801 - Condition: Enrollment and testing of samples.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...

  16. 42 CFR 493.801 - Condition: Enrollment and testing of samples.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...

  17. 42 CFR 493.801 - Condition: Enrollment and testing of samples.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...

  18. 42 CFR 493.801 - Condition: Enrollment and testing of samples.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...

  19. 42 CFR 493.801 - Condition: Enrollment and testing of samples.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...

  20. 75 FR 5722 - Procedures for Transportation Workplace Drug and Alcohol Testing Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... drugs in a DOT drug test. You must not test ``DOT specimens'' for any other drugs. (a) Marijuana... test analyte concentration analyte concentration Marijuana metabolites 50 ng/mL THCA \\1\\ 15 ng/mL...

  1. Application of Partial Least Square (PLS) Analysis on Fluorescence Data of 8-Anilinonaphthalene-1-Sulfonic Acid, a Polarity Dye, for Monitoring Water Adulteration in Ethanol Fuel.

    PubMed

    Kumar, Keshav; Mishra, Ashok Kumar

    2015-07-01

    The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R(2)), the root mean square error of calibration (RMSEC) and the root mean square error of prediction (RMSEP), which were found to be well within the acceptable limits.
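
    A minimal sketch of the chemometric step described above, assuming synthetic stand-in spectra and an arbitrary number of latent variables (scikit-learn's PLSRegression is used here purely for illustration; the record does not state which implementation the authors used):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)

        # Synthetic "fluorescence spectra" whose band position drifts with water content (%).
        wavelengths = np.linspace(400, 600, 120)
        water = rng.uniform(0, 20, size=60)                         # % water in ethanol
        peak = 480 + 2.0 * water[:, None]                           # polarity-driven shift
        X = np.exp(-((wavelengths - peak) ** 2) / (2 * 25.0 ** 2))  # Gaussian-like bands
        X += rng.normal(scale=0.01, size=X.shape)                   # instrument noise

        X_cal, X_val, y_cal, y_val = train_test_split(X, water, test_size=0.3,
                                                      random_state=0)

        pls = PLSRegression(n_components=3)                         # latent variables: arbitrary
        pls.fit(X_cal, y_cal)

        rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
        rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
        print(f"RMSEC = {rmsec:.3f} %, RMSEP = {rmsep:.3f} %, "
              f"validation R^2 = {pls.score(X_val, y_val):.3f}")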

  2. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and alpha-spectrometry.

    PubMed

    Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.

  3. Sample Collection Procedures and Strategies

    EPA Pesticide Factsheets

    Individuals responsible for collecting environmental and building material samples following a contamination incident can use these procedures to plan for and/or collect samples for analysis using the analytical methods listed in EPA's SAM

  4. Analysis of trace contamination of phthalate esters in ultrapure water using a modified solid-phase extraction procedure and automated thermal desorption-gas chromatography/mass spectrometry.

    PubMed

    Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu

    2008-04-25

    The present study aimed to develop a procedure, modified from the conventional solid-phase extraction (SPE) method, for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing a hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap is then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step required by the conventional SPE method and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity and the demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1), with recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed onto and poorly desorbed from the Tenax TA sorbent. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes, as well as of actual water samples, showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW, actual UPW, and tap water. A reduction of DEHP through the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 microg l(-1) was still quantified at the point of use, suggesting that phthalate ester contamination could present a barrier to future cleanliness requirements for UPW. The work demonstrated that the proposed modified SPE procedure provides an effective method for rapid analysis and contamination identification in UPW production lines.

  5. Systematically reviewing and synthesizing evidence from conversation analytic and related discursive research to inform healthcare communication practice and policy: an illustrated guide

    PubMed Central

    2013-01-01

    Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures: (1) reviewing existing systematic review methods and our own prior experience of applying these; (2) clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing; (3) holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing; and (4) attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues, including illness progression and dying. Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181

  6. Nanocoating cellulose paper based microextraction combined with nanospray mass spectrometry for rapid and facile quantitation of ribonucleosides in human urine.

    PubMed

    Wan, Lingzhong; Zhu, Haijing; Guan, Yafeng; Huang, Guangming

    2017-07-01

    A rapid and facile analytical method for the quantification of ribonucleosides in human urine was developed by combining nanocoating cellulose paper based microextraction with nanoelectrospray ionization-tandem mass spectrometry (nESI-MS/MS). The cellulose paper used for microextraction was modified by nano-precision deposition of a uniform, ultrathin zirconia gel film using a sol-gel process. Owing to the large surface area of the cellulose paper and the strong affinity between zirconia and cis-diol compounds, the target analytes were selectively extracted from the complex matrix, and the detection sensitivity was greatly improved. Typically, the nanocoating cellulose paper was immersed into diluted urine for selective extraction of the target analytes, which were then subjected to nESI-MS/MS detection. The whole analytical procedure could be completed within 10 min. The method was evaluated by the determination of ribonucleosides (adenosine, cytidine, uridine, guanosine) in urine samples. The signal intensities of the ribonucleosides extracted by the nanocoating cellulose paper were enhanced 136- to 459-fold compared with those from unmodified cellulose paper based microextraction. The limits of detection (LODs) and the limits of quantification (LOQs) of the four ribonucleosides were in the range of 0.0136-1.258 μg L(-1) and 0.0454-4.194 μg L(-1), respectively. The recoveries of the target nucleosides from spiked human urine were in the range of 75.64-103.49%, with relative standard deviations (RSDs) of less than 9.36%. The results demonstrate the potential of the proposed method for rapid and facile determination of endogenous ribonucleosides in urine samples. Copyright © 2017. Published by Elsevier B.V.

  7. A spin column-free approach to sodium hydroxide-based glycan permethylation.

    PubMed

    Hu, Yueming; Borges, Chad R

    2017-07-24

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues-yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.

  8. A spin column-free approach to sodium hydroxide-based glycan permethylation†

    PubMed Central

    Hu, Yueming; Borges, Chad R.

    2018-01-01

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues—yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based “glycan node” analysis results. When applied to blood plasma samples from stage III–IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens. PMID:28635997

  9. Multistate Evaluation of an Ultrafiltration-Based Procedure for Simultaneous Recovery of Enteric Microbes in 100-Liter Tap Water Samples▿

    PubMed Central

    Hill, Vincent R.; Kahler, Amy M.; Jothikumar, Narayanan; Johnson, Trisha B.; Hahn, Donghyun; Cromeans, Theresa L.

    2007-01-01

    Ultrafiltration (UF) is increasingly being recognized as a potentially effective procedure for concentrating and recovering microbes from large volumes of water and treated wastewater. Because of their very small pore sizes, UF membranes are capable of simultaneously concentrating viruses, bacteria, and parasites based on size exclusion. In this study, a UF-based water sampling procedure was used to simultaneously recover representatives of these three microbial classes seeded into 100-liter samples of tap water collected from eight cities covering six hydrologic areas of the United States. The UF-based procedure included hollow-fiber UF as the primary step for concentrating microbes and then used membrane filtration for bacterial culture assays, immunomagnetic separation for parasite recovery and quantification, and centrifugal UF for secondary concentration of viruses. Water samples were tested for nine water quality parameters to investigate whether water quality data correlated with measured recovery efficiencies and molecular detection levels. Average total method recovery efficiencies were 71, 97, 120, 110, and 91% for φX174 bacteriophage, MS2 bacteriophage, Enterococcus faecalis, Clostridium perfringens spores, and Cryptosporidium parvum oocysts, respectively. Real-time PCR and reverse transcription-PCR (RT-PCR) for seeded microbes and controls indicated that tap water quality could affect the analytical performance of molecular amplification assays, although no specific water quality parameter was found to correlate with reduced PCR or RT-PCR performance. PMID:17483281

  10. Loading-unloading response of circular GLARE fiber-metal laminates under lateral indentation

    NASA Astrophysics Data System (ADS)

    Tsamasphyros, George J.; Bikakis, George S.

    2015-01-01

    GLARE is a fiber-metal laminate used in aerospace structures, which are frequently subjected to various impact damage. Hence, the response of GLARE plates subjected to lateral indentation is very important. In this paper, analytical expressions are derived and a non-linear finite element modeling procedure is proposed in order to predict the static load-indentation curves of circular GLARE plates during loading and unloading by a hemispherical indentor. We have recently published analytical formulas and a finite element procedure for the static indentation of circular GLARE plates, which are now used during the loading stage. Here, considering that the aluminum layers are in a state of membrane yield and employing an energy balance during unloading, the unloading path is determined. Using this unloading path, an algebraic equation is derived for calculating the permanent dent depth of the GLARE plate after the indentor's withdrawal. Furthermore, our finite element procedure is modified in order to simulate the unloading stage as well. The derived formulas and the proposed finite element modeling procedure are applied successfully to GLARE 2-2/1-0.3 and GLARE 3-3/2-0.4 circular plates. The analytical results are compared with the corresponding FEM results and good agreement is found. The analytically calculated permanent dent depth is within 6% of the corresponding numerical result for the GLARE 2 plate and within 7% for the GLARE 3 plate. No other solution of this problem is known to the authors.

  11. Influence of analytical bias and imprecision on the number of false positive results using Guideline-Driven Medical Decision Limits.

    PubMed

    Hyltoft Petersen, Per; Klee, George G

    2014-03-20

    Diagnostic decisions based on decision limits according to medical guidelines differ from the majority of clinical decisions because of the strict dichotomization of patients into diseased and non-diseased. Consequently, the influence of analytical performance is more critical than for other diagnostic decisions, where much other information is included. The aim of this opinion paper is to investigate the consequences of analytical quality and other circumstances for the outcome of "Guideline-Driven Medical Decision Limits". The effects of analytical bias and imprecision should be investigated separately, and analytical quality specifications should be estimated accordingly. The use of sharp decision limits does not take biological variation into account, and the effects of this variation are closely connected with the effects of analytical performance. Such relationships are investigated for the guidelines for HbA1c in the diagnosis of diabetes and for coronary heart disease risk based on serum cholesterol. A second sampling at diagnosis gives a dramatic reduction in the effects of analytical quality, showing minimal influence of imprecision up to 3 to 5% for two independent samplings, whereas the reduction in the effect of bias is more moderate: a 2% increase in concentration doubles the percentage of false positive diagnoses, both for HbA1c and for cholesterol. An alternative approach comes from the current application of guidelines for follow-up laboratory tests according to clinical procedure orders, e.g. the frequency of parathyroid hormone requests as a function of serum calcium concentration. Here, the specifications for bias can be evaluated from the functional increase in requests with increasing serum calcium concentration. In consequence of the difficulties with biological variation, and given the concentration-dependent frequency of follow-up laboratory tests already in practical use, a kind of probability function for diagnosis as a function of the key analyte is proposed. Copyright © 2013 Elsevier B.V. All rights reserved.
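
    A hedged Monte Carlo sketch of the kind of effect discussed above: how analytical bias and imprecision shift the false-positive fraction at a sharp decision limit, with and without averaging a second independent sampling. The population distribution, decision limit, and 3% CV below are invented for illustration and are not the HbA1c or cholesterol figures from the paper:

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical non-diseased population: true values below a sharp decision limit.
        limit = 100.0                                   # arbitrary units, placeholder cut-off
        true_values = rng.normal(loc=90.0, scale=8.0, size=200_000)
        true_values = true_values[true_values < limit]

        def false_positive_rate(bias_pct, cv_pct, samplings=1):
            """Fraction of non-diseased subjects classified positive at the limit."""
            total = np.zeros_like(true_values)
            for _ in range(samplings):
                measured = true_values * (1.0 + bias_pct / 100.0)
                measured += rng.normal(scale=cv_pct / 100.0 * true_values)
                total += measured
            return np.mean(total / samplings >= limit)

        for bias in (0.0, 2.0):
            for samplings in (1, 2):
                fpr = false_positive_rate(bias, cv_pct=3.0, samplings=samplings)
                print(f"bias {bias:+.0f}%, CV 3%, samplings {samplings}: FPR = {fpr:.2%}")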

  12. Reprint of "Influence of analytical bias and imprecision on the number of false positive results using Guideline-Driven Medical Decision Limits".

    PubMed

    Hyltoft Petersen, Per; Klee, George G

    2014-05-15

    Diagnostic decisions based on decision limits according to medical guidelines differ from the majority of clinical decisions because of the strict dichotomization of patients into diseased and non-diseased. Consequently, the influence of analytical performance is more critical than for other diagnostic decisions, where much other information is included. The aim of this opinion paper is to investigate the consequences of analytical quality and other circumstances for the outcome of "Guideline-Driven Medical Decision Limits". The effects of analytical bias and imprecision should be investigated separately, and analytical quality specifications should be estimated accordingly. The use of sharp decision limits does not take biological variation into account, and the effects of this variation are closely connected with the effects of analytical performance. Such relationships are investigated for the guidelines for HbA1c in the diagnosis of diabetes and for coronary heart disease risk based on serum cholesterol. A second sampling at diagnosis gives a dramatic reduction in the effects of analytical quality, showing minimal influence of imprecision up to 3 to 5% for two independent samplings, whereas the reduction in the effect of bias is more moderate: a 2% increase in concentration doubles the percentage of false positive diagnoses, both for HbA1c and for cholesterol. An alternative approach comes from the current application of guidelines for follow-up laboratory tests according to clinical procedure orders, e.g. the frequency of parathyroid hormone requests as a function of serum calcium concentration. Here, the specifications for bias can be evaluated from the functional increase in requests with increasing serum calcium concentration. In consequence of the difficulties with biological variation, and given the concentration-dependent frequency of follow-up laboratory tests already in practical use, a kind of probability function for diagnosis as a function of the key analyte is proposed. Copyright © 2014. Published by Elsevier B.V.

  13. 40 CFR 265.92 - Sampling and analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...

  14. 40 CFR 265.92 - Sampling and analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...

  15. 40 CFR 265.92 - Sampling and analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...

  16. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82...

  17. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  18. MPL-A program for computations with iterated integrals on moduli spaces of curves of genus zero

    NASA Astrophysics Data System (ADS)

    Bogner, Christian

    2016-06-01

    We introduce the Maple program MPL for computations with multiple polylogarithms. The program is based on homotopy invariant iterated integrals on moduli spaces M0,n of curves of genus 0 with n ordered marked points. It includes the symbol map and procedures for the analytic computation of period integrals on M0,n. It supports the automated computation of a certain class of Feynman integrals.

  19. Subsurface Stress Fields in FCC Single Crystal Anisotropic Contacts

    NASA Technical Reports Server (NTRS)

    Arakere, Nagaraj K.; Knudsen, Erik; Swanson, Gregory R.; Duke, Gregory; Ham-Battista, Gilda

    2004-01-01

    Single crystal superalloy turbine blades used in high pressure turbomachinery are subject to conditions of high temperature, triaxial steady and alternating stresses, fretting stresses in the blade attachment and damper contact locations, and exposure to high-pressure hydrogen. The blades are also subjected to extreme variations in temperature during start-up and shutdown transients. The most prevalent high cycle fatigue (HCF) failure modes observed in these blades during operation include crystallographic crack initiation/propagation on octahedral planes, and non-crystallographic initiation with crystallographic growth. Numerous cases of crack initiation and crack propagation at the blade leading edge tip, blade attachment regions, and damper contact locations have been documented. Understanding crack initiation/propagation under mixed-mode loading conditions is critical for establishing a systematic procedure for evaluating the HCF life of single crystal turbine blades. This paper presents analytical and numerical techniques for evaluating two- and three-dimensional subsurface stress fields in anisotropic contacts. The subsurface stress results are required for evaluating contact fatigue life at damper contacts and dovetail attachment regions in single crystal nickel-base superalloy turbine blades. An analytical procedure is presented for evaluating the subsurface stresses in the elastic half-space, based on the adaptation of a stress function method outlined by Lekhnitskii. Numerical results are presented for cylindrical and spherical anisotropic contacts, using finite element analysis (FEA). Effects of crystal orientation on stress response and fatigue life are examined. Obtaining accurate subsurface stress results for anisotropic single crystal contact problems requires extremely refined three-dimensional (3-D) finite element grids, especially in the edge-of-contact region. Obtaining resolved shear stresses (RSS) on the principal slip planes also involves considerable post-processing work. For these reasons it is very advantageous to develop analytical solution schemes for subsurface stresses whenever possible.

  20. Inverse Thermal Analysis of Alloy 690 Laser and Hybrid Laser-GMA Welds Using Solidification-Boundary Constraints

    NASA Astrophysics Data System (ADS)

    Lambrakos, S. G.

    2017-08-01

    An inverse thermal analysis of Alloy 690 laser and hybrid laser-GMA welds is presented that uses numerical-analytical basis functions and boundary constraints based on measured solidification cross sections. In particular, the inverse analysis procedure uses three-dimensional constraint conditions such that two-dimensional projections of calculated solidification boundaries are constrained to map within experimentally measured solidification cross sections. Temperature histories calculated by this analysis are input data for computational procedures that predict solid-state phase transformations and mechanical response. These temperature histories can be used for inverse thermal analysis of welds corresponding to other welding processes whose process conditions are within similar regimes.

  1. SAMPLING AND ANALYSIS OF MERCURY IN CRUDE OIL

    EPA Science Inventory

    Sampling and analytical procedures used to determine total mercury content in crude oils were examined. Three analytical methods were compared with respect to accuracy, precision and detection limit. The combustion method and a commercial extraction method were found adequate to...

  2. 40 CFR 1066.101 - Overview.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and... specifications for fuels, engine fluids, and analytical gases; these specifications apply for testing under this...

  3. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of three phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate three commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by familywise error rate correction. FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
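
    A small sketch of two of the multiple-testing procedures compared above (Bonferroni-style familywise error control and Benjamini-Hochberg FDR control) applied to a vector of simulated voxel-level p-values; the cluster-size step and the second-level model itself are not reproduced here:

        import numpy as np

        rng = np.random.default_rng(42)

        # Simulated voxel-level p-values: mostly null, plus a small block of true signal.
        p = rng.uniform(size=10_000)
        p[:300] = rng.beta(0.5, 20.0, size=300)        # signal voxels tend toward small p
        alpha = 0.05
        m = p.size

        # Familywise error rate control (Bonferroni).
        fwer_hits = p < alpha / m

        # Benjamini-Hochberg FDR control.
        order = np.argsort(p)
        thresholds = alpha * np.arange(1, m + 1) / m
        passed = p[order] <= thresholds
        k = passed.nonzero()[0].max() + 1 if passed.any() else 0
        fdr_hits = np.zeros(m, dtype=bool)
        fdr_hits[order[:k]] = True

        print(f"Bonferroni detections: {fwer_hits.sum()}, BH-FDR detections: {fdr_hits.sum()}")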

  4. 3D-MICE: integration of cross-sectional and longitudinal imputation for multi-analyte longitudinal clinical data.

    PubMed

    Luo, Yuan; Szolovits, Peter; Dighe, Anand S; Baron, Jason M

    2018-06-01

    A key challenge in clinical data mining is that most clinical datasets contain missing data. Since many commonly used machine learning algorithms require complete datasets (no missing data), clinical analytic approaches often entail an imputation procedure to "fill in" missing data. However, although most clinical datasets contain a temporal component, most commonly used imputation methods do not adequately accommodate longitudinal time-based data. We sought to develop a new imputation algorithm, 3-dimensional multiple imputation with chained equations (3D-MICE), that can perform accurate imputation of missing clinical time series data. We extracted clinical laboratory test results for 13 commonly measured analytes (clinical laboratory tests). We imputed missing test results for the 13 analytes using 3 imputation methods: multiple imputation with chained equations (MICE), Gaussian process (GP), and 3D-MICE. 3D-MICE utilizes both MICE and GP imputation to integrate cross-sectional and longitudinal information. To evaluate imputation method performance, we randomly masked selected test results and imputed these masked results alongside results missing from our original data. We compared predicted results to measured results for masked data points. 3D-MICE performed significantly better than MICE and GP-based imputation in a composite of all 13 analytes, predicting missing results with a normalized root-mean-square error of 0.342, compared to 0.373 for MICE alone and 0.358 for GP alone. 3D-MICE offers a novel and practical approach to imputing clinical laboratory time series data. 3D-MICE may provide an additional tool for use as a foundation in clinical predictive analytics and intelligent clinical decision support.
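
    A simplified sketch of the general idea of combining cross-sectional and longitudinal imputation. This is not the authors' 3D-MICE algorithm: it merely averages a MICE-style cross-sectional imputation with a per-analyte Gaussian-process fit over time, on synthetic data, to illustrate how the two sources of information can be blended:

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(3)

        # Synthetic longitudinal panel: 40 time points x 4 correlated "analytes".
        t = np.arange(40, dtype=float)
        truth = np.vstack([np.sin(t / 6), np.cos(t / 8), t / 40, np.sin(t / 5 + 1)]).T
        X_full = truth + rng.normal(scale=0.1, size=truth.shape)
        mask = rng.uniform(size=X_full.shape) < 0.2                 # 20% missing at random
        X_missing = np.where(mask, np.nan, X_full)

        # Cross-sectional imputation (MICE-style chained equations across analytes).
        X_cs = IterativeImputer(random_state=0).fit_transform(X_missing)

        # Longitudinal imputation: one Gaussian process over time per analyte.
        X_long = X_missing.copy()
        for j in range(X_missing.shape[1]):
            obs = ~np.isnan(X_missing[:, j])
            gp = GaussianProcessRegressor(kernel=RBF(5.0) + WhiteKernel(0.01),
                                          normalize_y=True)
            gp.fit(t[obs].reshape(-1, 1), X_missing[obs, j])
            X_long[~obs, j] = gp.predict(t[~obs].reshape(-1, 1))

        # Naive combination: average the two estimates at the missing positions.
        X_combined = X_cs.copy()
        X_combined[mask] = 0.5 * (X_cs[mask] + X_long[mask])

        rmse = np.sqrt(np.mean((X_combined[mask] - X_full[mask]) ** 2))
        print(f"RMSE on masked entries: {rmse:.3f}")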

  5. Applied behavior analysis: behavior management of children with autism spectrum disorders in dental environments.

    PubMed

    Hernandez, Purnima; Ikkanda, Zachary

    2011-03-01

    There are a limited number of studies addressing behavior management techniques and procedural modifications that dentists can use to treat people with an autism spectrum disorder (ASD). The authors conducted a search of the dental and behavioral analytic literature to identify management techniques that address problem behaviors exhibited by children with ASDs in dental and other health-related environments. Applied behavior analysis (ABA) is a science in which procedures are based on the principles of behavior through systematic experimentation. Clinicians have used ABA procedures successfully to modify socially significant behaviors of people with ASD. Basic behavior management techniques currently used in dentistry may not encourage people with cognitive and behavioral disabilities, such as ASD, to tolerate simple in-office dental procedures consistently. Instead, dental care providers often are required to use advanced behavior management techniques to complete simple in-office procedures such as prophylaxis, sealant placement and obtaining radiographs. ABA procedures can be integrated in the dental environment to manage problem behaviors often exhibited by children with an ASD. The authors found no evidence-based procedural modifications that address the behavioral characteristics and problematic behaviors of children with an ASD in a dental environment. Further research in this area should be conducted. Knowledge and in-depth understanding of behavioral principles is essential when a dentist is concerned with modifying behaviors. Using ABA procedures can help dentists manage problem behaviors effectively and systematically when performing routine dental treatment. Being knowledgeable about each patient's behavioral characteristics and the parents' level of involvement is important in the successful integration of the procedures and reduction of in-office time.

  6. Determination of nanomolar chromate in drinking water with solid phase extraction and a portable spectrophotometer.

    PubMed

    Ma, Jian; Yang, Bo; Byrne, Robert H

    2012-06-15

    Determination of chromate at low concentration levels in drinking water is an important analytical objective for both human health and environmental science. Here we report the use of solid phase extraction (SPE) in combination with a custom-made portable light-emitting diode (LED) spectrophotometer to achieve detection of chromate in the field at nanomolar levels. The measurement chemistry is based on a highly selective reaction between 1,5-diphenylcarbazide (DPC) and chromate under acidic conditions. The Cr-DPC complex formed in the reaction can be extracted on a commercial C18 SPE cartridge. Concentrated Cr-DPC is subsequently eluted with methanol and detected by spectrophotometry. Optimization of analytical conditions involved investigation of reagent compositions and concentrations, eluent type, flow rate (sample loading), sample volume, and stability of the SPE cartridge. Under optimized conditions, detection limits are on the order of 3 nM. Only 50 mL of sample is required for an analysis, and total analysis time is around 10 min. The targeted analytical range of 0-500 nM can be easily extended by changing the sample volume. Compared to previous SPE-based spectrophotometric methods, this analytical procedure offers the benefits of improved sensitivity, reduced sample consumption, shorter analysis time, greater operational convenience, and lower cost. Copyright © 2012 Elsevier B.V. All rights reserved.
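
    A brief sketch of the kind of calibration that underlies such a spectrophotometric method: a linear fit of absorbance versus chromate standard concentration, with the detection limit estimated as three times the blank standard deviation divided by the slope. All numbers are illustrative placeholders, not the paper's calibration data:

        import numpy as np

        # Hypothetical calibration standards (nM) and measured Cr-DPC absorbances.
        conc = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 500.0])
        absorbance = np.array([0.002, 0.021, 0.041, 0.079, 0.121, 0.198])

        # Ordinary least-squares calibration line A = m*c + b.
        m, b = np.polyfit(conc, absorbance, 1)

        # Detection limit from replicate blanks: LOD = 3 * sd(blank) / slope.
        blanks = np.array([0.0021, 0.0018, 0.0024, 0.0019, 0.0022])
        lod = 3.0 * np.std(blanks, ddof=1) / m

        # Quantify an unknown sample from its absorbance.
        unknown_abs = 0.065
        print(f"slope = {m:.5f} AU/nM, LOD = {lod:.1f} nM, "
              f"unknown = {(unknown_abs - b) / m:.1f} nM")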

  7. Effect of particle inertia on turbulence in a suspension.

    PubMed

    L'vov, Victor S; Ooms, Gijs; Pomyalov, Anna

    2003-04-01

    We propose a one-fluid analytical model for a turbulently flowing dilute suspension, based on a modified Navier-Stokes equation with a k-dependent effective density of the suspension ρeff(k) and an additional damping term proportional to γp(k), representing the fluid-particle friction (described by Stokes law). The statistical description of turbulence within the model is simplified by a modification of the usual closure procedure based on the Richardson-Kolmogorov picture of turbulence with a differential approximation for the energy transfer term. The resulting ordinary differential equation for the energy budget is solved analytically for various important limiting cases and numerically in the general case. In the inertial interval of scales, we describe analytically two competing effects: the energy suppression due to the fluid-particle friction and the energy enhancement during the cascade process due to decrease of the effective density of the small-scale motions. An additional suppression or enhancement of the energy density may occur in the viscous subrange, caused by the variation of the extent of the inertial interval due to the combined effect of the fluid-particle friction and the decrease of the kinematic viscosity of the suspensions. The analytical description of the complicated interplay of these effects supported by numerical calculations is presented. Our findings allow one to rationalize the qualitative picture of the isotropic homogeneous turbulence of dilute suspensions as observed in direct numerical simulations.
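
    Schematically, and using only the quantities named in the abstract (the nonlinear transfer term is left unspecified, and this is not a reproduction of the authors' equations), the one-fluid model amounts to a Fourier-space momentum balance with Stokes-law damping and a scale-dependent effective density entering the energy budget:

        \partial_t \mathbf{u}(\mathbf{k},t) + \mathrm{NL}\{\mathbf{u}\}(\mathbf{k},t)
            = -\bigl[\nu k^{2} + \gamma_p(k)\bigr]\,\mathbf{u}(\mathbf{k},t), \qquad
        E(k) \propto \rho_{\mathrm{eff}}(k)\,\bigl\langle |\mathbf{u}(\mathbf{k},t)|^{2}\bigr\rangle ,

    where γp(k) is the fluid-particle friction rate and ρeff(k) is the k-dependent effective density of the suspension.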

  8. Microchip integrating magnetic nanoparticles for allergy diagnosis.

    PubMed

    Teste, Bruno; Malloggi, Florent; Siaugue, Jean-Michel; Varenne, Anne; Kanoufi, Frederic; Descroix, Stéphanie

    2011-12-21

    We report on the development of a simple and easy-to-use microchip dedicated to allergy diagnosis. This microchip combines the advantages of homogeneous immunoassays (i.e., species diffusion) and heterogeneous immunoassays (i.e., easy separation and preconcentration steps). In vitro allergy diagnosis is based on quantitation of specific Immunoglobulin E (IgE); accordingly, we have developed and integrated magnetic core-shell nanoparticles (MCSNPs) as an IgE capture nanoplatform in a microdevice, taking advantage of both their magnetic and colloidal properties. Integrating such an immunosupport allows the target analyte (IgE) to be captured in the colloidal phase, which increases the analyte capture kinetics since both immunological partners diffuse during the immune reaction. This colloidal approach improves the analyte capture kinetics 1000-fold compared with conventional methods. Moreover, based on the MCSNPs' magnetic properties and on the magnetic chamber we have previously developed, the MCSNPs, and therefore the target, can be confined and preconcentrated within the microdevice prior to the detection step. The MCSNP preconcentration factor achieved was about 35,000, which allows high sensitivity to be reached without catalytic amplification during the detection step. The developed microchip offers many advantages: the analytical procedure is fully integrated on-chip, analyses are performed in a short assay time (20 min), sample and reagent consumption is reduced to a few microlitres (5 μL), and a low limit of detection can be achieved (about 1 ng mL(-1)).

  9. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
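
    As a toy illustration of the simplest approach listed above, the waste generation rate method scales a per-unit-area generation rate by the gross floor area of the activity; the rates and areas below are placeholders, not values from the reviewed studies:

        # Hypothetical per-activity generation rates (kg of waste per m^2 of gross floor area).
        generation_rates = {"construction": 30.0, "renovation": 75.0, "demolition": 900.0}

        def estimate_waste_kg(activity: str, gross_floor_area_m2: float) -> float:
            """Waste generation rate method: waste = rate(activity) * gross floor area."""
            return generation_rates[activity] * gross_floor_area_m2

        projects = [("construction", 12_000.0), ("demolition", 3_500.0)]
        for activity, area in projects:
            tonnes = estimate_waste_kg(activity, area) / 1000.0
            print(f"{activity:>12}: {area:,.0f} m^2 -> about {tonnes:,.0f} t of C&D waste")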

  10. Modal element method for potential flow in non-uniform ducts: Combining closed form analysis with CFD

    NASA Technical Reports Server (NTRS)

    Baumeister, Kenneth J.; Baumeister, Joseph F.

    1994-01-01

    An analytical procedure, called the modal element method, is presented that combines numerical grid-based algorithms with eigenfunction expansions developed by separation of variables. The method is applied to solving potential flow in a channel with two-dimensional cylinder-like obstacles. The infinite computational region is divided into three subdomains: the bounded finite element domain, which contains the cylindrical obstacle, and the surrounding unbounded uniform channel entrance and exit domains. The velocity potential is represented approximately in the grid-based domain by a finite element solution and analytically by an eigenfunction expansion in the uniform semi-infinite entrance and exit domains. The calculated flow fields are in excellent agreement with exact analytical solutions. By eliminating the grid surrounding the obstacle, the modal element method reduces the numerical grid size and employs a more precise far-field boundary condition, as well as giving theoretical insight into the interaction of the obstacle with the mean flow. Although the analysis focuses on a specific geometry, the formulation is general and can be applied to a variety of problems, as seen by comparison with companion theories in aeroacoustics and electromagnetics.
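
    For orientation, a hedged sketch of the kind of eigenfunction expansion used in the uniform entrance and exit ducts, assuming separation of variables for the Laplace equation in a two-dimensional channel of height H with impermeable walls (the notation is generic, not taken from the report):

        \phi(x,y) \;=\; U\,x \;+\; A_0 \;+\; \sum_{n=1}^{\infty} A_n \cos\!\Bigl(\frac{n\pi y}{H}\Bigr)\, e^{-n\pi |x|/H},

    where the decaying exponentials ensure that the disturbance from the obstacle vanishes far upstream and downstream, and the coefficients A_n are matched to the finite element solution on the interfaces between the grid-based and analytical subdomains.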

  11. Novel cellulose-based halochromic test strips for naked-eye detection of alkaline vapors and analytes.

    PubMed

    Abou-Yousef, Hussein; Khattab, Tawfik A; Youssef, Yehia A; Al-Balakocy, Naser; Kamel, Samir

    2017-08-01

    A simple, portable and highly sensitive naked-eye test strip is successfully prepared for optical detection of gaseous and aqueous alkaline analytes. A novel pH-sensory tricyanofuran-hydrazone (TCFH) disperse colorant containing a hydrazone recognition moiety is synthesized via an azo-coupling reaction between an active methyl-containing tricyanofuran (TCF) heterocycle and the diazonium salt of 4-aminobenzaldehyde, followed by Knoevenagel condensation with malononitrile. UV-vis absorption spectra display solvatochromism and reversible color changes of the TCFH solution in dimethyl sulfoxide in response to pH variations. We investigate the preparation of hydrophobic cellulose/polyethylene terephthalate composites characterized by their high affinity for disperse dyes. Composite films made from CA, Cell/CA, PET/CA, and Cell/PET-CA are produced via a solvent-casting procedure using 10-30% modified cellulose or modified polyethylene terephthalate. The mechanical properties and morphologies of these composite films are investigated. The prepared pH-sensory hydrazone-based disperse dye is then applied to dye the cellulose-based composite films using a high-temperature pressure dyeing procedure. The resulting halochromic PET-CA-TCFH test strip provides an instant visible signal, from orange to purple, upon exposure to alkaline conditions, as confirmed by coloration measurements. The sensor strip exhibits high sensitivity and rapid naked-eye detection of ammonia in both the aqueous and vapor phases at room temperature and atmospheric pressure. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Quality-assurance data for routine water quality analyses by the U. S. Geological Survey laboratory in Troy, New York; July 1993 through June 1995

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2001-01-01

    A laboratory for analysis of low-ionic strength water has been developed at the U.S. Geological Survey (USGS) office in Troy, N.Y., to analyze samples collected by USGS projects in the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data are stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of quality-assurance/quality-control data. This report presents and discusses samples analyzed from July 1993 through June 1995. Quality-control results for 18 analytical procedures were evaluated for bias and precision. Control charts show that data from seven of the analytical procedures were biased throughout the analysis period for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, dissolved inorganic carbon, dissolved organic carbon (soil expulsions), chloride, magnesium, nitrate (colorimetric method), and pH. Three of the analytical procedures were occasionally biased but were within control limits; they were: calcium (high for high-concentration samples for May 1995), dissolved organic carbon (high for highconcentration samples from January through September 1994), and fluoride (high in samples for April and June 1994). No quality-control sample has been developed for the organic monomeric aluminum procedure. Results from the filter-blank and analytical-blank analyses indicate that all analytical procedures in which blanks were run were within control limits, although values for a few blanks were outside the control limits. Blanks were not analyzed for acid-neutralizing capacity, dissolved inorganic carbon, fluoride, nitrate (colorimetric method), or pH. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in 14 of the 18 procedures. Data-quality objectives were met by more than 90 percent of the samples analyzed in all procedures except total monomeric aluminum (85 percent of samples met objectives), total aluminum (70 percent of samples met objectives), and dissolved organic carbon (85 percent of samples met objectives). Triplicate samples were not analyzed for ammonium, fluoride, dissolved inorganic carbon, or nitrate (colorimetric method). Results of the USGS interlaboratory Standard Reference Sample Program indicated high data quality with a median result of 3.6 of a possible 4.0. Environment Canada's LRTAP interlaboratory study results indicated that more than 85 percent of the samples met data-quality objectives in 6 of the 12 analyses; exceptions were calcium, dissolved organic carbon, chloride, pH, potassium, and sodium. Data-quality objectives were not met for calcium samples in one LRTAP study, but 94 percent of samples analyzed were within control limits for the remaining studies. Data-quality objectives were not met by 35 percent of samples analyzed for dissolved organic carbon, but 94 percent of sample values were within 20 percent of the most probable value. Data-quality objectives were not met for 30 percent of samples analyzed for chloride, but 90 percent of sample values were within 20 percent of the most probable value. 
Measurements of samples with a pH above 6.0 were biased high in 54 percent of the samples, although 85 percent of the samples met data-quality objectives for pH measurements below 6.0. Data-quality objectives for potassium and sodium were not met in one study (only 33 percent of the samples analyzed met the objectives), although 85 percent of the sample values were within control limits for the other studies. Measured sodium values were above the upper control limit in all studies. Results from blind reference-sample analyses indicated that data
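
    The precision metric used above (coefficient of variation of triplicate analyses) and the control-chart check for bias are straightforward to compute. The following minimal sketch, assuming hypothetical triplicate results, a hypothetical 10 percent data-quality objective, and invented quality-control values, illustrates the idea; none of the numbers are taken from the report.

        import numpy as np

        # Hypothetical triplicate determinations for one analyte (mg/L)
        triplicates = np.array([
            [1.02, 0.98, 1.01],
            [0.55, 0.57, 0.54],
            [2.10, 2.30, 2.05],
        ])

        means = triplicates.mean(axis=1)
        stdevs = triplicates.std(axis=1, ddof=1)
        cv_percent = 100.0 * stdevs / means          # coefficient of variation per sample

        cv_objective = 10.0                          # illustrative data-quality objective (%)
        fraction_meeting = np.mean(cv_percent <= cv_objective)
        print(f"{100 * fraction_meeting:.0f}% of triplicate sets meet the CV objective")

        # Control-chart style check for a quality-control sample with a known target value
        qc_results = np.array([0.99, 1.01, 1.03, 1.00, 1.05, 1.02])   # hypothetical
        target, s = 1.00, qc_results.std(ddof=1)
        upper, lower = target + 3 * s, target - 3 * s                 # 3-sigma control limits
        in_control = (qc_results >= lower) & (qc_results <= upper)
        print("within control limits:", in_control.all())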

  13. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  14. Low level vapor verification of monomethyl hydrazine

    NASA Technical Reports Server (NTRS)

    Mehta, Narinder

    1990-01-01

    The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.

  15. Laboratory and quality assurance protocols for the analysis of herbicides in ground water from the Management Systems Evaluation Area, Princeton, Minnesota

    USGS Publications Warehouse

    Larson, S.J.; Capel, P.D.; VanderLoop, A.G.

    1996-01-01

    Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota, are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction followed by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.

  16. Analytical procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in airport runoff water samples.

    PubMed

    Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek

    2013-12-15

    The purpose of this study is to propose and evaluate new procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in runoff water samples collected from airports located in different regions and characterized by different levels of activity, expressed by the number of flights and the number of passengers per year. The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles, and glycols is the sample preparation stage, owing to the diverse matrix composition and the possibility of interference from components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques such as LLE and SPE were tested. In all examined runoff water samples collected from the airports, the presence of PAH compounds and glycols was observed. In the majority of the samples, BT compounds were also determined. Runoff water samples collected from Polish and British international airports as well as local airports had a similar qualitative composition, but the quantitative composition of the analytes was very diverse. The new, validated analytical methodologies ensure that the information necessary for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.

  17. Development and optimization of SPE-HPLC-UV/ELSD for simultaneous determination of nine bioactive components in Shenqi Fuzheng Injection based on Quality by Design principles.

    PubMed

    Wang, Lu; Qu, Haibin

    2016-03-01

    A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise ratios of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space for SPE-HPLC-UV/ELSD was then constructed from calculated Monte Carlo probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were accomplished at a selected working point. These results revealed that the QbD principles were suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
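
    The Monte Carlo step used to map a probability-based design space can be sketched as follows: given a fitted response-surface model, uncertainty in its coefficients is propagated by random sampling, and the probability of meeting an acceptance criterion is mapped over a grid of operating conditions. The quadratic model, coefficients, factor ranges, and the resolution criterion of 1.5 below are illustrative assumptions only, not values from the study.

        import numpy as np

        rng = np.random.default_rng(0)

        def resolution_model(flow, temp, coeffs):
            """Hypothetical quadratic response-surface model for peak resolution."""
            b0, b1, b2, b11, b22, b12 = coeffs
            return b0 + b1 * flow + b2 * temp + b11 * flow**2 + b22 * temp**2 + b12 * flow * temp

        # Nominal fitted coefficients and their standard errors (invented for illustration)
        beta = np.array([1.8, -0.3, 0.2, -0.1, -0.05, 0.08])
        beta_se = 0.05 * np.abs(beta) + 0.01

        flows = np.linspace(-1, 1, 21)     # coded factor levels
        temps = np.linspace(-1, 1, 21)
        n_draws = 2000
        prob_ok = np.zeros((flows.size, temps.size))

        for i, f in enumerate(flows):
            for j, t in enumerate(temps):
                draws = rng.normal(beta, beta_se, size=(n_draws, beta.size))
                rs = resolution_model(f, t, draws.T)     # vectorized over coefficient draws
                prob_ok[i, j] = np.mean(rs >= 1.5)       # probability of meeting the criterion

        # The design space is the region where the probability exceeds a chosen threshold, e.g. 0.9
        design_space = prob_ok >= 0.9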

  18. Further investigation of a finite difference procedure for analyzing the transonic flow about harmonically oscillating airfoils and wings

    NASA Technical Reports Server (NTRS)

    Weatherill, W. H.; Ehlers, F. E.; Yip, E.; Sebastian, J. D.

    1980-01-01

    Analytical and empirical studies of a finite difference method for the solution of the transonic flow about harmonically oscillating wings and airfoils are presented. The procedure is based on separating the velocity potential into steady and unsteady parts and linearizing the resulting unsteady equations for small disturbances. The steady velocity potential is obtained first from the well-known nonlinear equation for steady transonic flow. The unsteady velocity potential is then obtained from a linear differential equation in complex form with spatially varying coefficients. Since sinusoidal motion is assumed, the unsteady equation is independent of time. An out-of-core direct solution procedure was developed and applied to two-dimensional sections. Results are presented for a section of vanishing thickness in subsonic flow and an NACA 64A006 airfoil in supersonic flow. Good correlation is obtained in the first case at values of Mach number and reduced frequency of direct interest in flutter analyses. Reasonable results are obtained in the second case. Comparisons of two-dimensional finite difference solutions with exact analytic solutions indicate that the accuracy of the difference solution is dependent on the boundary conditions used on the outer boundaries. Homogeneous boundary conditions on the mesh edges that yield complex eigenvalues give the most accurate finite difference solutions. The plane outgoing wave boundary conditions meet these requirements.
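
    The decomposition underlying this procedure can be written compactly as follows (a generic small-disturbance sketch; the specific coefficients used in the report are not reproduced here):

        \Phi(x, y, t) = \phi_0(x, y) + \tilde{\phi}(x, y)\, e^{i\omega t}, \qquad |\tilde{\phi}| \ll |\phi_0|,

    where the steady potential \phi_0 satisfies the nonlinear transonic small-disturbance equation, and substitution followed by linearization in \tilde{\phi} leaves a linear, time-independent equation of the schematic form L[\phi_0;\, i\omega]\, \tilde{\phi} = 0, whose complex, spatially varying coefficients are fixed by the steady solution and the reduced frequency.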

  19. Fast methodology for the reliable determination of nonylphenol in water samples by minimal labeling isotope dilution mass spectrometry.

    PubMed

    Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc

    2013-08-02

    In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure, the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as the labeled compound. This proposed surrogate was able to compensate for the matrix effect even in wastewater samples. An SPE pre-concentration step, together with exhaustive efforts to avoid contamination, was included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. In contrast, a considerable overestimation was obtained with the most usual classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
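
    Isotope pattern deconvolution reduces to a small linear least-squares problem: the measured isotopologue abundances of the spiked sample are expressed as a combination of the natural and labelled patterns, and the fitted molar fractions yield the analyte amount directly from the known spike. The patterns, transitions, and spike amount below are hypothetical placeholders, not data from the paper.

        import numpy as np

        # Hypothetical relative abundances of three monitored isotopologue transitions
        pattern_natural = np.array([0.90, 0.08, 0.02])    # unlabelled analyte
        pattern_labelled = np.array([0.05, 0.90, 0.05])   # labelled surrogate
        measured = np.array([0.40, 0.55, 0.05])           # spiked sample (hypothetical)

        # Solve measured ~= x_nat * pattern_natural + x_lab * pattern_labelled
        A = np.column_stack([pattern_natural, pattern_labelled])
        (x_nat, x_lab), residuals, rank, _ = np.linalg.lstsq(A, measured, rcond=None)

        n_spike_nmol = 10.0                               # amount of labelled spike added (assumed)
        n_analyte_nmol = n_spike_nmol * x_nat / x_lab     # analyte amount from the molar-fraction ratio
        print(f"molar fractions: natural = {x_nat:.3f}, labelled = {x_lab:.3f}")
        print(f"estimated analyte amount: {n_analyte_nmol:.2f} nmol")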

  20. Determination of target fat-soluble micronutrients in rainbow trout's muscle and liver tissues by liquid chromatography with diode array-tandem mass spectrometry detection.

    PubMed

    Pérez Fernández, Virginia; Ventura, Salvatore; Tomai, Pierpaolo; Curini, Roberta; Gentili, Alessandra

    2017-03-01

    This paper describes an analytical approach, based on LC-diode array detector-MS/MS (LC-DAD-MS/MS), for characterizing the fat-soluble micronutrient fraction in rainbow trout (Oncorhynchus mykiss). Two different procedures were applied to isolate the analytes from liver and muscle tissue: overnight cold saponification to hydrolyze bound forms and to simplify the analysis, and matrix solid-phase dispersion to avoid artifacts and to keep the naturally occurring forms unaltered. Analytes were separated on a C30 analytical column using a nonaqueous reversed-phase mobile phase compatible with atmospheric pressure chemical ionization. Compared to other works, the most relevant advantage of the method illustrated here is the large amount of information obtained with few analytical steps: nine fat-soluble vitamins (3,4-dehydroretinol, retinol, cholecalciferol, ergocalciferol, α-tocopherol, γ-tocopherol, δ-tocopherol, phylloquinone, and menaquinone-4) and eight carotenoids (all-trans-lutein, all-trans-astaxanthin, all-trans-zeaxanthin, all-trans-β-cryptoxanthin, all-trans-canthaxanthin, all-trans-ζ-carotene, all-trans-β-carotene, and all-trans-γ-carotene) were quantified after the method validation, while other untargeted carotenoids were tentatively identified by exploiting the identification power of the LC-DAD-MS/MS hyphenation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. The interpretation of hair analysis for drugs and drug metabolites.

    PubMed

    Cuypers, Eva; Flanagan, Robert J

    2018-02-01

    Head hair analysis for drugs and drug metabolites has been used widely with the aim of detecting exposure in the weeks or months prior to sample collection. However, inappropriate interpretation of results has likely led to serious miscarriages of justice, especially in child custody cases. The aim of this review is to assess critically what can, and perhaps more importantly, what cannot be claimed as regards the interpretation of hair test results in a given set of circumstances in order to inform future testing. We searched the PubMed database for papers published 2010-2016 using the terms "hair" and "drug" and "decontamination", the terms "hair" and "drug" and "contamination", the terms "hair" and "drug-facilitated crime", the terms "hair" and "ethyl glucuronide", and the terms "hair", "drug testing" and "analysis". Study of the reference lists of the 46 relevant papers identified 25 further relevant citations, giving a total of 71 citations. Hair samples: Drugs, drug metabolites and/or decomposition products may arise not only from deliberate drug administration, but also via deposition from a contaminated atmosphere if drug(s) have been smoked or otherwise vaporized in a confined area, transfer from contaminated surfaces via food/fingers, etc., and transfer from sweat and other secretions after a single large exposure, which could include anesthesia. Excretion in sweat of endogenous analytes such as γ-hydroxybutyric acid is a potential confounder if its use is to be investigated. Cosmetic procedures such as bleaching or heat treatment of hair may remove analytes prior to sample collection. Hair color and texture, the area of the head the sample is taken from, the growth rate of individual hairs, and how the sample has been stored, may also affect the interpretation of results. Toxicological analysis: Immunoassay results alone do not provide reliable evidence on which to base judicial decisions. Gas or liquid chromatography with mass spectrometric detection (GC- or LC-MS), if used with due caution, can give accurate analyte identification and high sensitivity, but many problems remain. Firstly, it is not possible to prepare assay calibrators or quality control material except by soaking "blank" hair in solutions of appropriate analytes, drying, and then subjecting the dried material to an analysis. The fact that solvents can be used to add analytes to hair points to the fact that analytes can arrive not only on, but also in hair from exogenous sources. A range of solvent-washing procedures have been advocated to "decontaminate" hair by removing adsorbed analytes, but these carry the risk of transporting adsorbed analytes into the medulla of the hair therefore confounding the whole procedure. This is especially true if segmental analysis is being undertaken in order to provide a "time course" of drug exposure. Proposed clinical applications of hair analysis: There have been a number of reports where drugs seemingly administered during the perpetration of a crime have been detected in head hair. However, detailed evaluation of these reports is difficult without full understanding of the possible effects of any "decontamination" procedures used and of other variables such as hair color or cosmetic hair treatment. 
Similarly, in child custody cases and where the aim is to demonstrate abstinence from drug or alcohol use, the issues of possible exogenous sources of analyte, and of the large variations in analyte concentrations reported in known users, continue to confound the interpretation of results in individual cases. Interpretation of results of head hair analysis must take into account all the available circumstantial and other evidence especially as regards the methodology employed and the possibility of surface contamination of the hair prior to collection.

  2. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.
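
    For reference, the optimal control (pilot) model referred to above is typically organized around a quadratic performance index of the general form shown below (a generic sketch; the weighting matrices, observation noise, and time-delay structure used in the report are not reproduced):

        J(u) = E\left\{ \lim_{T\to\infty} \frac{1}{T} \int_0^T \left( \mathbf{y}^{\mathsf T} Q\, \mathbf{y} + \mathbf{u}^{\mathsf T} R\, \mathbf{u} + \dot{\mathbf{u}}^{\mathsf T} G\, \dot{\mathbf{u}} \right) dt \right\},

    where \mathbf{y} collects the displayed (observed) variables, \mathbf{u} is the pilot's control input, and the control-rate term models neuromuscular limitations; the display-design question then becomes which elements of \mathbf{y} the format must make available to the pilot.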

  3. Advances in the analysis of biological samples using ionic liquids.

    PubMed

    Clark, Kevin D; Trujillo-Rodríguez, María J; Anderson, Jared L

    2018-02-12

    Ionic liquids are a class of solvents and materials that hold great promise in bioanalytical chemistry. Task-specific ionic liquids have recently been designed for the selective extraction, separation, and detection of proteins, peptides, nucleic acids, and other physiologically relevant analytes from complex biological samples. To facilitate rapid bioanalysis, ionic liquids have been integrated in miniaturized and automated procedures. Bioanalytical separations have also benefited from the modification of nonspecific magnetic materials with ionic liquids or the implementation of ionic liquids with inherent magnetic properties. Furthermore, the direct detection of the extracted molecules in the analytical instrument has been demonstrated with structurally tuned ionic liquids and magnetic ionic liquids, providing a significant advantage in the analysis of low-abundance analytes. This article gives an overview of these advances that involve the application of ionic liquids and derivatives in bioanalysis. Graphical abstract Ionic liquids, magnetic ionic liquids, and ionic liquid-based sorbents are increasing the speed, selectivity, and sensitivity in the analysis of biological samples.

  4. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).
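
    The harmonic normal coordinate analysis that precedes the VCI step amounts to diagonalizing the mass-weighted Hessian. The sketch below, which is illustrative and not code from the PyVCI package, shows this step in plain NumPy; unit conversion to wavenumbers is omitted and a random symmetric matrix stands in for a real Hessian.

        import numpy as np

        def normal_modes(hessian, masses):
            """Harmonic normal-mode analysis from a Cartesian Hessian.

            hessian : (3N, 3N) array of second derivatives of the energy
            masses  : (N,) array of atomic masses (consistent units assumed)
            Returns harmonic frequencies (unconverted) and normal-mode vectors.
            """
            m = np.repeat(masses, 3)                    # one mass per Cartesian coordinate
            inv_sqrt_m = 1.0 / np.sqrt(m)
            mw_hessian = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)   # M^(-1/2) H M^(-1/2)
            eigvals, eigvecs = np.linalg.eigh(mw_hessian)
            # Frequencies are proportional to the square roots of the eigenvalues;
            # near-zero or negative eigenvalues correspond to translations/rotations.
            freqs = np.sqrt(np.clip(eigvals, 0.0, None))
            return freqs, eigvecs

        # Illustrative call with a random symmetric matrix standing in for a real Hessian
        rng = np.random.default_rng(0)
        H = rng.standard_normal((9, 9)); H = 0.5 * (H + H.T)
        freqs, modes = normal_modes(H, masses=np.array([15.999, 1.008, 1.008]))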

  5. Magnetically-driven medical robots: An analytical magnetic model for endoscopic capsules design

    NASA Astrophysics Data System (ADS)

    Li, Jing; Barjuei, Erfan Shojaei; Ciuti, Gastone; Hao, Yang; Zhang, Peisen; Menciassi, Arianna; Huang, Qiang; Dario, Paolo

    2018-04-01

    Magnetic-based approaches are highly promising for providing innovative solutions in the design of medical devices for diagnostic and therapeutic procedures, such as those in endoluminal districts. Owing to their intrinsic magnetic properties (no current needed) and their high strength-to-size ratio compared with electromagnetic solutions, permanent magnets are usually embedded in medical devices. In this paper, a set of analytical formulas has been derived to model the magnetic forces and torques exerted by an arbitrary external magnetic field on a permanent magnetic source embedded in a medical robot. In particular, the authors modelled cylindrical permanent magnets, the general solution most often embedded in magnetically driven medical devices. The analytical model can be applied to axially and diametrically magnetized, solid and annular cylindrical permanent magnets without severe computational complexity. Using a cylindrical permanent magnet as the selected solution, the model has been applied to a robotic endoscopic capsule as a pilot study in the design of magnetically driven robots.
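
    In the simplest point-dipole limit, which the cylindrical-magnet formulas generalize, the force and torque exerted by an external field \mathbf{B} on an embedded magnet of dipole moment \mathbf{m} are (a standard textbook result, not the paper's full model):

        \mathbf{F} = \nabla\!\left(\mathbf{m}\cdot\mathbf{B}\right), \qquad \boldsymbol{\tau} = \mathbf{m} \times \mathbf{B},

    valid where the field region around the magnet carries no free currents; the analytical model described above refines this for axially and diametrically magnetized, solid and annular cylinders.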

  6. Solvent microextraction-flame atomic absorption spectrometry (SME-FAAS) for determination of ultratrace amounts of cadmium in meat and fish samples.

    PubMed

    Goudarzi, Nasser

    2009-02-11

    A simple, low-cost, and highly sensitive method based on solvent microextraction (SME) for separation/preconcentration, followed by flame atomic absorption spectrometry (FAAS), was proposed for the determination of ultratrace amounts of cadmium in meat and fish samples. The analytical procedure involved the formation of a hydrophobic complex by mixing the analyte solution with an ammonium pyrrolidinedithiocarbamate (APDC) solution. Under suitable conditions, the cadmium-APDC complex entered the organic microphase, and thus separation of the analyte from the matrix was achieved. Under optimal chemical and instrumental conditions, a detection limit (3 sigma) of 0.8 ng L(-1) and an enrichment factor of 93 were achieved. The relative standard deviation for the method was found to be 2.2% for Cd. The interference effects of some anions and cations were also investigated. The developed method has been applied to the determination of trace Cd in meat and fish samples.
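
    For orientation, the two figures of merit quoted above are conventionally defined as follows (a generic sketch, not necessarily the authors' exact working):

        \mathrm{LOD} = \frac{3\, s_{\mathrm{blank}}}{m}, \qquad \mathrm{EF} = \frac{m_{\mathrm{SME}}}{m_{\mathrm{direct}}},

    where s_blank is the standard deviation of replicate blank measurements, m is the calibration slope, and the enrichment factor EF compares the calibration slope obtained after microextraction with that of direct aspiration.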

  7. Surface enhanced Raman scattering imaging of developed thin-layer chromatography plates.

    PubMed

    Freye, Chris E; Crane, Nichole A; Kirchner, Teresa B; Sepaniak, Michael J

    2013-04-16

    A method for hyphenating surface enhanced Raman scattering (SERS) and thin-layer chromatography (TLC) is presented that employs silver-polymer nanocomposites as an interface. Through the process of conformal blotting, analytes are transferred from TLC plates to nanocomposite films before being imaged via SERS. A procedure leading to maximum blotting efficiency was established by investigating various parameters such as time, pressure, and type and amount of blotting solvent. Additionally, limits of detection were established for test analytes malachite green isothiocyanate, 4-aminothiophenol, and Rhodamine 6G (Rh6G) ranging from 10(-7) to 10(-6) M. Band broadening due to blotting was minimal (∼10%) as examined by comparing the spatial extent of TLC-spotted Rh6G via fluorescence and then the SERS-based spot size on the nanocomposite after the blotting process. Finally, a separation of the test analytes was carried out on a TLC plate followed by blotting and the acquisition of distance × wavenumber × intensity three-dimensional TLC-SERS plots.

  8. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2015-01-01

    Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as most recent information about physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration, necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). In reference of the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Applications of computer algebra to distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Storch, Joel A.

    1993-01-01

    In the analysis of vibrations of continuous elastic systems, one often encounters complicated transcendental equations with roots directly related to the system's natural frequencies. Typically, these equations contain system parameters whose values must be specified before a numerical solution can be obtained. This paper presents a method whereby the fundamental frequency can be obtained in analytical form to any desired degree of accuracy. The method is based upon truncation of rapidly converging series involving inverse powers of the system natural frequencies. A straightforward method for developing these series and summing them in closed form is presented. It is demonstrated how Computer Algebra can be exploited to perform the intricate analytical procedures which otherwise would render the technique difficult to apply in practice. We illustrate the method by developing two analytical approximations to the fundamental frequency of a vibrating cantilever carrying a rigid tip body. The results are compared to the numerical solution of the exact (transcendental) frequency equation over a range of system parameters.
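
    The series-truncation idea can be sketched as follows (the construction of the sums from the system's boundary-value problem, which is where the computer algebra enters, is not reproduced here). Defining

        S_p = \sum_{i=1}^{\infty} \omega_i^{-2p},

    the fundamental frequency is bounded and approximated by

        S_p^{-1/(2p)} \le \omega_1, \qquad \omega_1^{2} \approx \frac{S_p}{S_{p+1}},

    with the approximation improving rapidly as p grows, because the lowest frequency dominates the sums; truncating at small p already yields closed-form analytical estimates of the kind described above.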

  10. Fused Deposition Modeling 3D Printing for (Bio)analytical Device Fabrication: Procedures, Materials, and Applications

    PubMed Central

    2017-01-01

    In this work, the use of fused deposition modeling (FDM) in a (bio)analytical/lab-on-a-chip research laboratory is described. First, the specifications of this 3D printing method that are important for the fabrication of (micro)devices were characterized for a benchtop FDM 3D printer. These include resolution, surface roughness, leakage, transparency, material deformation, and the possibilities for integration of other materials. Next, the autofluorescence, solvent compatibility, and biocompatibility of 12 representative FDM materials were tested and evaluated. Finally, we demonstrate the feasibility of FDM in a number of important applications. In particular, we consider the fabrication of fluidic channels, masters for polymer replication, and tools for the production of paper microfluidic devices. This work thus provides a guideline for (i) the use of FDM technology by addressing its possibilities and current limitations, (ii) material selection for FDM, based on solvent compatibility and biocompatibility, and (iii) application of FDM technology to (bio)analytical research by demonstrating a broad range of illustrative examples. PMID:28628294

  11. Development of coring procedures applied to Si, CdTe, and CIGS solar panels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moutinho, H. R.; Johnston, S.; To, B.

    Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.

  12. Development of coring procedures applied to Si, CdTe, and CIGS solar panels

    DOE PAGES

    Moutinho, H. R.; Johnston, S.; To, B.; ...

    2018-01-04

    Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.

  13. Experimental performance and acoustic investigation of modern, counterrotating blade concepts

    NASA Technical Reports Server (NTRS)

    Hoff, G. E.

    1990-01-01

    The aerodynamic, acoustic, and aeromechanical performance of counterrotating blade concepts were evaluated both theoretically and experimentally. Analytical methods development and design are addressed. Utilizing the analytical methods which evolved during the conduct of this work, aerodynamic and aeroacoustic predictions were developed, which were compared to NASA and GE wind tunnel test results. The detailed mechanical design and fabrication of five different composite shell/titanium spar counterrotating blade set configurations are presented. Design philosophy, analyses methods, and material geometry are addressed, as well as the influence of aerodynamics, aeromechanics, and aeroacoustics on the design procedures. Blade fabrication and quality control procedures are detailed; bench testing procedures and results of blade integrity verification are presented; and instrumentation associated with the bench testing also is identified. Additional hardware to support specialized testing is described, as are operating blade instrumentation and the associated stress limits. The five counterrotating blade concepts were scaled to a tip diameter of 2 feet, so they could be incorporated into MPS (model propulsion simulators). Aerodynamic and aeroacoustic performance testing was conducted in the NASA Lewis 8 x 6 supersonic and 9 x 15 V/STOL (vertical or short takeoff and landing) wind tunnels and in the GE freejet anechoic test chamber (Cell 41) to generate an experimental data base for these counterrotating blade designs. Test facility and MPS vehicle matrices are provided, and test procedures are presented. Effects on performance of rotor-to-rotor spacing, angle-of-attack, pylon proximity, blade number, reduced-diameter aft blades, and mismatched rotor speeds are addressed. Counterrotating blade and specialized aeromechanical hub stability test results are also furnished.

  14. Electrothermal atomic absorption spectrometric determination of arsenic in essential lavender and rose oils.

    PubMed

    Karadjova, Irina B; Lampugnani, Leonardo; Tsalev, Dimiter L

    2005-02-28

    Analytical procedures for the electrothermal atomic absorption spectrometric (ETAAS) determination of arsenic in essential oils from lavender (Lavandula angustifolia) and rose (Rosa damascena) are described. For direct ETAAS analysis, oil samples are diluted with ethanol or i-propanol for lavender and rose oil, respectively. Leveling of the responses of four different arsenic species (arsenite, arsenate, monomethylarsonate and dimethylarsinate) is achieved by using a composite chemical modifier: l-cysteine (0.05 g L(-1)) in combination with palladium (2.5 μg) and citric acid (100 μg). A transverse-heated graphite atomizer (THGA) with longitudinal Zeeman-effect background correction and 'end-capped' graphite tubes with integrated pyrolytic graphite platforms, pre-treated with Zr-Ir for permanent modification, is employed as the most appropriate atomizer. Calibration with solvent-matched standard solutions of As(III) is used for four- and five-fold diluted samples of lavender and rose oil, respectively. Lower dilution factors required standard-addition calibration using aqueous (for lavender oil) or i-propanol (for rose oil) solutions of As(III). The limits of detection (LOD) for the whole analytical procedure are 4.4 and 4.7 ng g(-1) As in lavender and rose oil, respectively. The relative standard deviation (R.S.D.) for As at 6-30 ng g(-1) levels is between 8 and 17% for both oils. As an alternative, a procedure based on low-temperature plasma ashing in oxygen followed by ETAAS has been elaborated, providing LODs of 2.5 and 2.7 ng g(-1) As in lavender and rose oil, respectively, and R.S.D. within 8-12% for both oils. Results obtained by both procedures are in good agreement.
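
    The standard-addition calibration mentioned above amounts to fitting signal versus added analyte concentration and extrapolating to zero signal; the sample concentration is the magnitude of the x-intercept, multiplied by any dilution factor. The numbers below are invented for illustration and are not taken from the paper.

        import numpy as np

        added = np.array([0.0, 5.0, 10.0, 20.0])              # added As concentration (ng/g), hypothetical
        absorbance = np.array([0.042, 0.061, 0.079, 0.118])   # hypothetical ETAAS signals

        slope, intercept = np.polyfit(added, absorbance, 1)
        c_measured = intercept / slope          # x-intercept magnitude = concentration in the measured solution
        dilution_factor = 5.0                   # e.g., a five-fold dilution of the oil (assumed)
        print(f"As in the original oil: {c_measured * dilution_factor:.1f} ng/g")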

  15. Study of the use of axial viewed inductively coupled plasma atomic emission spectrometry with ultrasonic nebulization for the determination of select elemental impurities in oral drug products.

    PubMed

    Menoutis, James; Parisi, Angela; Verma, Natasha

    2018-04-15

    In efforts to control the potential presence of heavy metals in pharmaceuticals, the United States Pharmacopeia (USP) and the International Conference on Harmonization (ICH) have put forth new requirements and guidelines for their control. The new requirements and guidelines establish permitted daily exposures (PDEs) for 24 heavy metals/elemental impurities (EIs) based upon their toxicological properties. USP General Chapter 〈233〉 provides a general reference procedure for preparing pharmaceutical samples for analysis employing microwave-assisted digestion (MWAD). It also provides two compendial procedures: Procedure 1 employing ICP-AES, and Procedure 2 employing ICP-MS. Given the extremely low detection limits afforded by ICP-MS, much work has been done in developing and evaluating analytical methods to support the analysis of elemental impurities in finished pharmaceutical products, active pharmaceutical ingredients, and excipients by this technique. In this study, we evaluated the use of axial ICP-AES employing ultrasonic nebulization (UN), rather than traditional pneumatic nebulization, for the determination of Class 1 and 2 EIs. The study also employed closed-vessel MWAD to prepare samples for analysis. Limits of quantitation were element specific and significantly lower than the PDEs for oral drugs. Spike recoveries for the elements studied ranged between 89.3% and 109.25%, except for Os, which was subject to OsO4 formation during MWAD. The use of axial ICP-AES with UN provides an alternative to ICP-MS in the analysis of EIs requiring low detection limits. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.

  17. 1990 National Water Quality Laboratory Services Catalog

    USGS Publications Warehouse

    Pritt, Jeffrey; Jones, Berwyn E.

    1989-01-01

    PREFACE: This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.

  18. Bias and precision of selected analytes reported by the National Atmospheric Deposition Program and National Trends Network, 1984

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1987-01-01

    The U.S. Geological Survey operated a blind audit sample program during 1984 to test the effects of the sample handling and shipping procedures used by the National Atmospheric Deposition Program and National Trends Network on the quality of wet deposition data produced by the combined networks. Blind audit samples, which were dilutions of standard reference water samples, were submitted by network site operators to the central analytical laboratory disguised as actual wet deposition samples. Results from the analyses of blind audit samples were used to calculate estimates of analyte bias associated with all network wet deposition samples analyzed in 1984 and to estimate analyte precision. Concentration differences between double blind samples that were submitted to the central analytical laboratory and separate analyses of aliquots of those blind audit samples that had not undergone network sample handling and shipping were used to calculate analyte masses that apparently were added to each blind audit sample by routine network handling and shipping procedures. These calculated masses indicated statistically significant biases for magnesium, sodium, potassium, chloride, and sulfate. Median calculated masses were 41.4 micrograms (ug) for calcium, 14.9 ug for magnesium, 23.3 ug for sodium, 0.7 ug for potassium, 16.5 ug for chloride, and 55.3 ug for sulfate. Analyte precision was estimated using two different sets of replicate measures performed by the central analytical laboratory. Estimated standard deviations were similar to those previously reported. (Author's abstract)

  19. A multi-targeted liquid chromatography-mass spectrometry screening procedure for the detection in human urine of drugs non-prohibited in sport commonly used by the athletes.

    PubMed

    Mazzarino, Monica; Cesarei, Lorenzo; de la Torre, Xavier; Fiacco, Ilaria; Robach, Paul; Botrè, Francesco

    2016-01-05

    This work presents an analytical method for the simultaneous analysis in human urine of 38 pharmacologically active compounds (19 benzodiazepine-like substances, 7 selective serotonin reuptake inhibitors, 4 azole antifungal drugs, 5 inhibitors of phosphodiesterase type 4, and 3 inhibitors of phosphodiesterase type 5) by liquid chromatography coupled with tandem mass spectrometry. The above substance classes include both the most common "non-banned" drugs used by athletes (based on the information reported on the "doping control form") and those drugs that are suspected to be performance enhancing and/or to act as masking agents under particular conditions. The chromatographic separation was performed on a reversed-phase octadecyl column using acetonitrile and ultra-purified water, both with 0.1% formic acid, as mobile phases. The detection was carried out using a triple quadrupole mass spectrometric analyser, positive electrospray as the ionization source, and selected reaction monitoring as the acquisition mode. Sample pre-treatment consisted of enzymatic hydrolysis followed by liquid-liquid extraction under neutral conditions using tert-butyl methyl ether. The analytical procedure, once developed, was validated in terms of sensitivity (lower limits of detection in the range of 1-50 ng mL(-1)), specificity (no interferences were detected at the retention time of any of the analytes under investigation), recovery (≥60%, with satisfactory repeatability, CV% lower than 10%), matrix effect (lower than 30%), and reproducibility of retention times (CV% lower than 0.1%) and of relative abundances (CV% lower than 15%). The performance and the applicability of the method were evaluated by analyzing real samples containing benzodiazepines (alprazolam, diazepam, zolpidem or zopiclone) or inhibitors of phosphodiesterase type 5 (sildenafil or vardenafil) and samples obtained by incubating two of the phosphodiesterase type 4 inhibitors studied (cilomilast or roflumilast) with pooled human liver microsomes. All the parent compounds, together with their main phase I metabolites, were clearly detected using the analytical procedure developed here. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    PubMed

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.

  1. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  2. Identification and quantification of carbamate pesticides in dried lime tree flowers by means of excitation-emission molecular fluorescence and parallel factor analysis when quenching effect exists.

    PubMed

    Rubio, L; Ortiz, M C; Sarabia, L A

    2014-04-11

    A non-separative, fast and inexpensive spectrofluorimetric method based on the second-order calibration of excitation-emission fluorescence matrices (EEMs) was proposed for the determination of carbaryl, carbendazim and 1-naphthol in dried lime tree flowers. The trilinearity property of three-way data was used to handle the intrinsic fluorescence of lime flowers and the difference in the fluorescence intensity of each analyte. It also made it possible to identify each analyte unequivocally. Trilinearity of the data tensor guarantees the uniqueness of the solution obtained through parallel factor analysis (PARAFAC), so the factors of the decomposition match up with the analytes. In addition, an experimental procedure was proposed to identify, with three-way data, the quenching effect produced by the fluorophores of the lime flowers. This procedure also enabled the selection of an adequate dilution of the lime flower extract to minimize the quenching effect so that the three analytes could be quantified. Finally, the analytes were determined using the standard addition method for a calibration whose standards were chosen with a D-optimal design. The three analytes were unequivocally identified by the correlation between the pure spectra and the PARAFAC excitation and emission spectral loadings. The trueness was established by the accuracy line "calculated concentration versus added concentration" in all cases. Better decision limit values (CCα), at x0 = 0 with the probability of a false positive fixed at 0.05, were obtained for the calibration performed in pure solvent: 2.97 μg L(-1) for 1-naphthol, 3.74 μg L(-1) for carbaryl and 23.25 μg L(-1) for carbendazim. The CCα values for the second calibration carried out in matrix were 1.61, 4.34 and 51.75 μg L(-1), respectively, while the values obtained considering only the pure samples as the calibration set were 2.65, 8.61 and 28.7 μg L(-1), respectively. Copyright © 2014 Elsevier B.V. All rights reserved.
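
    PARAFAC decomposes the three-way EEM array (samples x excitation x emission) into trilinear factors whose loadings can be matched against pure-analyte spectra. The minimal alternating-least-squares sketch below uses plain NumPy, with a synthetic random tensor standing in for real EEM data; unlike a production PARAFAC fit, it applies no non-negativity constraint and no core-consistency diagnostic.

        import numpy as np

        def khatri_rao(B, C):
            """Column-wise Kronecker product of B (J x R) and C (K x R) -> (J*K x R)."""
            J, R = B.shape
            K = C.shape[0]
            return np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)

        def parafac_als(X, rank, n_iter=200, seed=0):
            """Unconstrained PARAFAC of a 3-way array X (I x J x K) by alternating least squares."""
            I, J, K = X.shape
            rng = np.random.default_rng(seed)
            A = rng.standard_normal((I, rank))
            B = rng.standard_normal((J, rank))
            C = rng.standard_normal((K, rank))
            X1 = X.reshape(I, J * K)                      # mode-1 unfolding (C ordering)
            X2 = X.transpose(1, 0, 2).reshape(J, I * K)   # mode-2 unfolding
            X3 = X.transpose(2, 0, 1).reshape(K, I * J)   # mode-3 unfolding
            for _ in range(n_iter):
                A = X1 @ np.linalg.pinv(khatri_rao(B, C).T)
                B = X2 @ np.linalg.pinv(khatri_rao(A, C).T)
                C = X3 @ np.linalg.pinv(khatri_rao(A, B).T)
            return A, B, C     # sample scores, excitation loadings, emission loadings

        # Synthetic stand-in for an EEM data tensor: 12 samples, 30 excitation, 40 emission points
        X = np.random.default_rng(1).random((12, 30, 40))
        scores, excitation, emission = parafac_als(X, rank=3)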

  3. The Importance of Method Selection in Determining Product Integrity for Nutrition Research

    PubMed Central

    Mudge, Elizabeth M; Brown, Paula N

    2016-01-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823

  4. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  5. Is enzymatic hydrolysis a reliable analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites in human fluids?

    PubMed

    Quifer-Rada, Paola; Martínez-Huélamo, Miriam; Lamuela-Raventos, Rosa M

    2017-07-19

    Phenolic compounds are present in human fluids (plasma and urine) mainly as glucuronidated and sulfated metabolites. Up to now, due to the unavailability of standards, enzymatic hydrolysis has been the method of choice in analytical chemistry to quantify these phase II phenolic metabolites. Enzymatic hydrolysis procedures vary in enzyme concentration, pH and temperature; however, there is a lack of knowledge about the stability of polyphenols in their free form during the process. In this study, we evaluated the stability of 7 phenolic acids, 2 flavonoids and 3 prenylflavanoids in urine during enzymatic hydrolysis to assess the suitability of this analytical procedure, using three different concentrations of β-glucuronidase/sulfatase enzymes from Helix pomatia. The results indicate that enzymatic hydrolysis negatively affected the recovery of the precursor and free-form polyphenols present in the sample. Thus, enzymatic hydrolysis does not seem an ideal analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites.

  6. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    PubMed Central

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound. PMID:28116217

  7. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was performed through clearly defined standard operating procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  8. The Importance of Method Selection in Determining Product Integrity for Nutrition Research.

    PubMed

    Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N

    2016-03-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.

  9. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are reinterpreted using this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
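
    The probabilistic modification can be sketched as follows: instead of categorical scores, each failure mode carries an estimated relative frequency of occurrence and an estimated probability of non-detection, and their product gives the expected frequency of undetected failures, which can then be weighed together with the categorical severity. The failure modes and numbers below are invented for illustration and are not taken from the NIR example in the paper.

        # Minimal sketch of the probabilistic FMEA idea (illustrative numbers only)
        failure_modes = [
            # (name, P(occurrence per analysis), P(not detected), severity class 1-5)
            ("wrong reference spectrum loaded", 0.002, 0.10, 5),
            ("sample mix-up",                   0.001, 0.50, 5),
            ("instrument drift not corrected",  0.010, 0.20, 3),
        ]

        total_undetected = 0.0
        for name, p_occ, p_miss, severity in failure_modes:
            freq_undetected = p_occ * p_miss           # expected frequency of an undetected failure
            total_undetected += freq_undetected
            print(f"{name:35s} undetected frequency = {freq_undetected:.5f}  severity = {severity}")

        # Approximate overall frequency of any undetected failure (sum over modes)
        print(f"overall frequency of undetected failures approx. {total_undetected:.5f} per analysis")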

  10. Research in digital adaptive flight controllers

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    A design study of adaptive control logic suitable for implementation in modern airborne digital flight computers was conducted. Both explicit controllers, which directly utilize parameter identification, and implicit controllers, which do not require identification, were considered. Extensive analytical and simulation efforts resulted in the recommendation of two explicit digital adaptive flight controllers. These controllers interface weighted least squares estimation procedures with control logic based on either optimal regulator theory or single-stage performance indices.

  11. Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective

    NASA Astrophysics Data System (ADS)

    Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.

    2009-04-01

    Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed for designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of the objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects resulting from suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, to uncover conflicting objectives, to support the design of better alternatives, and to improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding necessary steps to be taken for rational decision-making in river management, to guidelines about the use of decision-analytical techniques for performing these steps, but also to new insights about the application of decision-analytical techniques in general. In particular, the consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches leads to favoring measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, this results in a river basin oriented management concept as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the societal values that are driving prioritization in river management; it will only support their elicitation and rational use. This is particularly important for the assessment of micro-pollutants because of severe limitations in scientific knowledge of their effects on river ecosystems. This makes the influence of pollution by micro-pollutants on prioritization of measures strongly dependent on the weight of the precautionary principle relative to other societal objectives of river management.

  12. Riccati parameterized self-similar waves in two-dimensional graded-index waveguide

    NASA Astrophysics Data System (ADS)

    Kumar De, Kanchan; Goyal, Amit; Raju, Thokala Soloman; Kumar, C. N.; Panigrahi, Prasanta K.

    2015-04-01

    An analytical method based on a gauge-similarity transformation technique has been employed to map the (2+1)-dimensional variable-coefficient coupled nonlinear Schrödinger equations (vc-CNLSE) with dispersion, nonlinearity and gain to the standard NLSE. Under certain functional relations we construct a large family of self-similar waves in the form of bright similaritons, Akhmediev breathers and rogue waves. We report the effect of dispersion on the intensity of the solitary waves. Further, we illustrate the procedure to amplify the intensity of self-similar waves using the isospectral Hamiltonian approach. This approach provides an efficient mechanism to generate analytically a wide class of tapering profiles and widths by exploiting the Riccati parameter. Equivalently, it enables one to control efficiently the self-similar wave structures and hence their evolution.

  13. Aerodynamic shape optimization of wing and wing-body configurations using control theory

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony

    1995-01-01

    This paper describes the implementation of optimization techniques based on control theory for wing and wing-body design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for airfoils and wings in which the shape and the surrounding body-fitted mesh are both generated analytically, and the control is the mapping function. Recently, the method has been implemented for both potential flows and flows governed by the Euler equations using an alternative formulation which employs numerically generated grids, so that it can more easily be extended to treat general configurations. Here results are presented both for the optimization of a swept wing using an analytic mapping, and for the optimization of wing and wing-body configurations using a general mesh.

  14. Copper(II)-rubeanic acid coprecipitation system for separation-preconcentration of trace metal ions in environmental samples for their flame atomic absorption spectrometric determinations.

    PubMed

    Soylak, Mustafa; Erdogan, Nilgun D

    2006-09-21

    A simple and facile preconcentration procedure based on the coprecipitation of trace heavy metal ions with a copper(II)-rubeanic acid complex has been developed. The analytical parameters, including pH, amount of rubeanic acid, sample volume, etc., were investigated for the quantitative recoveries of Pb(II), Fe(III), Cd(II), Au(III), Pd(II) and Ni(II). No interfering effects were observed from the concomitant ions. The detection limits for the analyte ions, calculated as 3 sigma, ranged from 0.14 microg/l for iron to 3.4 microg/l for lead. The proposed coprecipitation method was successfully applied to water samples from Palas Lake, Kayseri, and to soil and sediment samples from Kayseri and Yozgat, Turkey.
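
    For reference, a short sketch of the 3-sigma convention used above for detection limits, with made-up blank readings and calibration slope (not the authors' data): LOD = 3·s(blank)/slope, and LOQ is often taken as 10·s(blank)/slope.

      import statistics

      blank_signals = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0020, 0.0023]  # replicate blanks
      slope = 0.0125  # hypothetical calibration slope, signal per (microg/l)

      s_blank = statistics.stdev(blank_signals)
      lod = 3 * s_blank / slope    # 3-sigma detection limit
      loq = 10 * s_blank / slope   # 10-sigma quantification limit
      print(f"LOD = {lod:.2f} microg/l, LOQ = {loq:.2f} microg/l")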

  15. Torque Transient of Magnetically Driven Flow for Viscosity Measurement

    NASA Technical Reports Server (NTRS)

    Ban, Heng; Li, Chao; Su, Ching-Hua; Lin, Bochuan; Scripa, Rosalia N.; Lehoczky, Sandor L.

    2004-01-01

    Viscosity is a good indicator of structural changes for complex liquids, such as semiconductor melts with chain or ring structures. This paper discusses the theoretical and experimental results of the transient torque technique for non-intrusive viscosity measurement. Such a technique is essential for high-temperature viscosity measurement of high-pressure and toxic semiconductor melts. In this paper, our previous work on the oscillating cup technique was extended to the transient process of a magnetically driven melt flow in a damped oscillation system. Based on the analytical solution for the fluid flow and cup oscillation, a semi-empirical model was established to extract the fluid viscosity. The analytical and experimental results indicated that such a technique has the advantages of short measurement time and straightforward data analysis procedures.

  16. Toward Worldwide Hepcidin Assay Harmonization: Identification of a Commutable Secondary Reference Material.

    PubMed

    van der Vorm, Lisa N; Hendriks, Jan C M; Laarakkers, Coby M; Klaver, Siem; Armitage, Andrew E; Bamberg, Alison; Geurts-Moespot, Anneke J; Girelli, Domenico; Herkert, Matthias; Itkonen, Outi; Konrad, Robert J; Tomosugi, Naohisa; Westerman, Mark; Bansal, Sukhvinder S; Campostrini, Natascia; Drakesmith, Hal; Fillet, Marianne; Olbina, Gordana; Pasricha, Sant-Rayn; Pitts, Kelly R; Sloan, John H; Tagliaro, Franco; Weykamp, Cas W; Swinkels, Dorine W

    2016-07-01

    Absolute plasma hepcidin concentrations measured by various procedures differ substantially, complicating interpretation of results and rendering reference intervals method dependent. We investigated the degree of equivalence achievable by harmonization and the identification of a commutable secondary reference material to accomplish this goal. We applied the technical procedures developed by the Consortium for Harmonization of Clinical Laboratory Results to achieve harmonization. Eleven plasma hepcidin measurement procedures (5 mass spectrometry based and 6 immunochemical based) quantified native individual plasma samples (n = 32) and native plasma pools (n = 8) to assess analytical performance and current and achievable equivalence. In addition, 8 types of candidate reference materials (3 concentrations each, n = 24) were assessed for their suitability, most notably in terms of commutability, to serve as secondary reference material. Absolute hepcidin values and reproducibility (intra-measurement procedure CVs 2.9%-8.7%) differed substantially between measurement procedures, but all were linear and correlated well. The limited current equivalence between methods (inter-measurement procedure CV 28.6%) was mainly attributable to differences in calibration and could thus be improved by harmonization with a common calibrator. Linear regression analysis and standardized residuals showed that a candidate reference material consisting of native lyophilized plasma with cryolyoprotectant was commutable for all measurement procedures. Mathematically simulated harmonization with this calibrator resulted in a maximum achievable equivalence of 7.7%. The secondary reference material identified in this study has the potential to substantially improve equivalence between hepcidin measurement procedures and contributes to the establishment of a traceability chain that will ultimately allow standardization of hepcidin measurement results. © 2016 American Association for Clinical Chemistry.
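
    The following is a simplified numerical sketch, on synthetic data, of why recalibration against a commutable common calibrator reduces the inter-measurement-procedure CV when methods differ mainly in calibration; the procedure counts, concentrations and error levels are assumptions, not the study's data.

      import numpy as np

      rng = np.random.default_rng(0)
      true_conc = rng.uniform(1, 20, 32)                      # 32 native samples, arbitrary units
      slopes, offsets = np.array([1.0, 1.6, 0.7]), np.array([0.0, 0.5, -0.2])
      measured = slopes[:, None] * true_conc + offsets[:, None] + rng.normal(0, 0.3, (3, 32))

      def inter_procedure_cv(x):
          return np.mean(np.std(x, axis=0, ddof=1) / np.mean(x, axis=0)) * 100

      # Harmonize: regress each procedure against a common calibrator set and rescale.
      cal_true = np.array([5.0, 10.0, 20.0])
      cal_meas = slopes[:, None] * cal_true + offsets[:, None]
      harmonized = np.empty_like(measured)
      for i in range(3):
          b, a = np.polyfit(cal_meas[i], cal_true, 1)          # invert each calibration line
          harmonized[i] = b * measured[i] + a

      print(f"inter-procedure CV before: {inter_procedure_cv(measured):.1f}%")
      print(f"inter-procedure CV after:  {inter_procedure_cv(harmonized):.1f}%")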

  17. Analytic methods for design of wave cycles for wave rotor core engines

    NASA Technical Reports Server (NTRS)

    Resler, Edwin L., Jr.; Mocsari, Jeffrey C.; Nalim, M. R.

    1993-01-01

    A procedure to design a preliminary wave rotor cycle for any application is presented. To complete a cycle with heat addition there are two separate but related design steps that must be followed. The 'wave' boundary conditions determine the allowable amount of heat added in any case, and the ensuing wave pattern requires certain pressure discharge conditions to allow the process to be made cyclic. This procedure, when applied, gives a first estimate of the cycle performance and the necessary information for the next step in the design process, namely the application of a characteristic-based or other appropriate detailed one-dimensional wave calculation that locates the proper porting around the periphery of the wave rotor. Four examples of the design procedure are given to demonstrate its utility and generality. These examples also illustrate the large gains in performance that could be realized with the use of wave rotor enhanced propulsion cycles.

  18. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ticknor, Brian W.; Metzger, Shalina C.; McBay, Eddy H.

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  19. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two dimensions by procedures that can readily be generalized to treat complex shapes in three dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and the drag minimization problem.

  20. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    PubMed

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole-cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
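
    As a minimal illustration of nuclear-based segmentation of the kind described above, the sketch below thresholds a synthetic DNA-stain image and labels the resulting objects with generic scikit-image calls; real HCI pipelines add watershed splitting of touching nuclei, whole-cell masks and quality-control steps.

      import numpy as np
      from skimage import filters, measure, morphology

      rng = np.random.default_rng(0)
      img = rng.normal(0.05, 0.01, (256, 256))                  # background
      for r, c in rng.integers(20, 236, size=(25, 2)):          # 25 synthetic "nuclei"
          rr, cc = np.ogrid[:256, :256]
          img[(rr - r) ** 2 + (cc - c) ** 2 < 49] += 0.6

      mask = img > filters.threshold_otsu(img)                  # global Otsu threshold
      mask = morphology.remove_small_objects(mask, min_size=20) # drop debris
      labels = measure.label(mask)                              # connected-component labelling
      areas = [p.area for p in measure.regionprops(labels)]
      print(f"objects detected: {labels.max()}, mean area: {np.mean(areas):.1f} px")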

  1. Improvement of structural models using covariance analysis and nonlinear generalized least squares

    NASA Technical Reports Server (NTRS)

    Glaser, R. J.; Kuo, C. P.; Wada, B. K.

    1992-01-01

    The next generation of large, flexible space structures will be too light to support their own weight, requiring a system of structural supports for ground testing. The authors have proposed multiple boundary-condition testing (MBCT), using more than one support condition to reduce uncertainties associated with the supports. MBCT would revise the mass and stiffness matrix, analytically qualifying the structure for operation in space. The same procedure is applicable to other common test conditions, such as empty/loaded tanks and subsystem/system level tests. This paper examines three techniques for constructing the covariance matrix required by nonlinear generalized least squares (NGLS) to update structural models based on modal test data. The methods range from a complicated approach used to generate the simulation data (i.e., the correct answer) to a diagonal matrix based on only two constants. The results show that NGLS is very insensitive to assumptions about the covariance matrix, suggesting that a workable NGLS procedure is possible. The examples also indicate that the multiple boundary condition procedure more accurately reduces errors than individual boundary condition tests alone.

  2. Experimental evaluation of tool run-out in micro milling

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.

  3. Automated acid and base number determination of mineral-based lubricants by fourier transform infrared spectroscopy: commercial laboratory evaluation.

    PubMed

    Winterfield, Craig; van de Voort, F R

    2014-12-01

    The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.
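
    A generic sketch of a partial least squares calibration of the type described above, on synthetic spectra (nothing here reproduces the commercial calibration): spectra form X, the reference acid number forms y, and cross-validation gives an RMSECV estimate.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(1)
      n_samples, n_points = 60, 400
      an_ref = rng.uniform(0.1, 4.0, n_samples)                        # mg KOH/g, hypothetical references
      band = np.exp(-0.5 * ((np.arange(n_points) - 150) / 8.0) ** 2)   # one stoichiometric band
      X = np.outer(an_ref, band) + rng.normal(0, 0.02, (n_samples, n_points))

      pls = PLSRegression(n_components=5)
      an_cv = cross_val_predict(pls, X, an_ref, cv=10).ravel()
      rmsecv = np.sqrt(np.mean((an_cv - an_ref) ** 2))
      print(f"RMSECV = {rmsecv:.3f} mg KOH/g")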

  4. A novel second-order standard addition analytical method based on data processing with multidimensional partial least-squares and residual bilinearization.

    PubMed

    Lozano, Valeria A; Ibañez, Gabriela A; Olivieri, Alejandro C

    2009-10-05

    In the presence of analyte-background interactions and a significant background signal, both second-order multivariate calibration and standard addition are required for successful analyte quantitation achieving the second-order advantage. This report discusses a modified second-order standard addition method, in which the test data matrix is subtracted from the standard addition matrices, and quantitation proceeds via the classical external calibration procedure. It is shown that this novel data processing method allows one to apply not only parallel factor analysis (PARAFAC) and multivariate curve resolution-alternating least-squares (MCR-ALS), but also the recently introduced and more flexible partial least-squares (PLS) models coupled to residual bilinearization (RBL). In particular, the multidimensional variant N-PLS/RBL is shown to produce the best analytical results. The comparison is carried out with the aid of a set of simulated data, as well as two experimental data sets: one aimed at the determination of salicylate in human serum in the presence of naproxen as an additional interferent, and the second one devoted to the analysis of danofloxacin in human serum in the presence of salicylate.

  5. Analytical Problems and Suggestions in the Analysis of Behavioral Economic Demand Curves.

    PubMed

    Yu, Jihnhee; Liu, Liu; Collins, R Lorraine; Vincent, Paula C; Epstein, Leonard H

    2014-01-01

    Behavioral economic demand curves (Hursh, Raslear, Shurtleff, Bauman, & Simmons, 1988) are innovative approaches to characterize the relationships between consumption of a substance and its price. In this article, we investigate common analytical issues in the use of behavioral economic demand curves, which can cause inconsistent interpretations of demand curves, and then we provide methodological suggestions to address those analytical issues. We first demonstrate that log transformation with different added values for handling zeros changes model parameter estimates dramatically. Second, demand curves are often analyzed using an overparameterized model that results in an inefficient use of the available data and a lack of assessment of the variability among individuals. To address these issues, we apply a nonlinear mixed effects model based on multivariate error structures that has not previously been used to analyze behavioral economic demand curves in the literature. We also propose analytical formulas for the relevant standard errors of derived values such as Pmax, Omax, and elasticity. The proposed model stabilizes the derived values regardless of the added increment used and provides substantially smaller standard errors. We illustrate the data analysis procedure using data from a relative reinforcement efficacy study of simulated marijuana purchasing.
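
    The first issue noted above is easy to reproduce with a toy purchase-task data set (hypothetical numbers): the increment added to consumption before log transformation, chosen to handle zeros, visibly shifts a crude log-log elasticity estimate.

      import numpy as np

      prices = np.array([0.5, 1, 2, 4, 8, 16, 32], dtype=float)
      consumption = np.array([10, 9, 7, 5, 2, 1, 0], dtype=float)   # zero at the highest price

      for k in (0.1, 0.5, 1.0):                    # different added increments
          slope, intercept = np.polyfit(np.log10(prices), np.log10(consumption + k), 1)
          print(f"added value k = {k:>3}: fitted log-log slope = {slope:.3f}")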

  6. Hydrolysis of Letrozole catalyzed by macrocyclic Rhodium (I) Schiff-base complexes.

    PubMed

    Reddy, P Muralidhar; Shanker, K; Srinivas, V; Krishna, E Ravi; Rohini, R; Srikanth, G; Hu, Anren; Ravinder, V

    2015-03-15

    Ten mononuclear rhodium(I) complexes were synthesized from macrocyclic ligands having N4 and N2O2 donor sites. Square planar geometry was assigned to all complexes based on their analytical and spectral properties. The Rh(I) complexes were investigated as catalysts in the hydrolysis of the nitrile-group-containing pharmaceutical drug Letrozole. A comparative study showed that all the complexes are efficient catalysts. The percent yields of all the catalytic reaction products, viz. drug impurities, were determined by spectrophotometric procedures and characterized by spectral studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-10-24

    This plan incorporates U.S. Department of Energy (DOE) Office of Legacy Management (LM) standard operating procedures (SOPs) into environmental monitoring activities and will be implemented at all sites managed by LM. This document provides detailed procedures for the field sampling teams so that samples are collected in a consistent and technically defensible manner. Site-specific plans (e.g., long-term surveillance and maintenance plans, environmental monitoring plans) document background information and establish the basis for sampling and monitoring activities. Information will be included in site-specific tabbed sections to this plan, which identify sample locations, sample frequencies, types of samples, field measurements, and associated analytes for each site. Additionally, within each tabbed section, program directives will be included, when developed, to establish additional site-specific requirements to modify or clarify requirements in this plan as they apply to the corresponding site. A flowchart detailing project tasks required to accomplish routine sampling is displayed in Figure 1. LM environmental procedures are contained in the Environmental Procedures Catalog (LMS/PRO/S04325), which incorporates American Society for Testing and Materials (ASTM), DOE, and U.S. Environmental Protection Agency (EPA) guidance. Specific procedures used for groundwater and surface water monitoring are included in Appendix A. If other environmental media are monitored, SOPs used for air, soil/sediment, and biota monitoring can be found in the site-specific tabbed sections in Appendix D or in site-specific documents. The procedures in the Environmental Procedures Catalog are intended as general guidance and require additional detail from planning documents in order to be complete; the following sections fulfill that function and specify additional procedural requirements to form SOPs. Routine revision of this Sampling and Analysis Plan will be conducted annually at the beginning of each fiscal year when attachments in Appendix D, including program directives and sampling location/analytical tables, will be reviewed by project personnel and updated. The sampling location/analytical tables in Appendix D, however, may have interim updates according to project direction that are not reflected in this plan. Deviations from location/analytical tables in Appendix D prior to sampling will be documented in project correspondence (e.g., startup letters). If significant changes to other aspects of this plan are required before the annual update, then the plan will be revised as needed.

  8. Nonlocal continuum analysis of a nonlinear uniaxial elastic lattice system under non-uniform axial load

    NASA Astrophysics Data System (ADS)

    Hérisson, Benjamin; Challamel, Noël; Picandet, Vincent; Perrot, Arnaud

    2016-09-01

    The static behavior of the Fermi-Pasta-Ulam (FPU) axial chain under distributed loading is examined. The FPU system examined in the paper is a nonlinear elastic lattice with linear and quadratic spring interaction. A dimensionless parameter controls the possible loss of convexity of the associated quadratic and cubic energy. Exact analytical solutions based on Hurwitz zeta functions are developed in the presence of linear static loading. It is shown that this nonlinear lattice possesses scale effects and possible localization properties in the absence of energy convexity. A continuous approach is then developed to capture the main phenomena observed in the discrete axial problem. The associated continuum is built from a continualization procedure that is mainly based on the asymptotic expansion of the difference operators involved in the lattice problem. This associated continuum is an enriched gradient-based or nonlocal axial medium. A Taylor-based and a rational differential method are both considered in the continualization procedures to approximate the FPU lattice response. The Padé approximant used in the continualization procedure fits the response of the discrete system efficiently, even in the vicinity of the limit load when the non-convex FPU energy is examined. It is concluded that the FPU lattice system behaves as a nonlocal axial system under dynamic as well as static loading.
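
    For orientation, the two continualization routes mentioned above can be summarized, for the centered difference operator on a lattice with spacing a, by the standard pseudo-differential identity and its approximations (a textbook statement, not an excerpt from the paper):

      \frac{u_{i+1}-2u_i+u_{i-1}}{a^{2}}
        \;=\; \frac{4}{a^{2}}\,\sinh^{2}\!\Big(\frac{a\,\partial_x}{2}\Big)u
        \;=\; \Big(\partial_x^{2}+\frac{a^{2}}{12}\,\partial_x^{4}+\cdots\Big)u
        \;\approx\; \frac{\partial_x^{2}}{1-\dfrac{a^{2}}{12}\,\partial_x^{2}}\,u ,

    where truncating the Taylor series yields the gradient-enriched continuum and the [1,1] Padé approximant yields the rational (nonlocal-type) differential model.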

  9. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    PubMed

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    In line with current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the highly toxic and widespread mycotoxins aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation for dried fruit intended for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  10. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and α-spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of ~2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.

  11. Using Analytic Hierarchy Process in Textbook Evaluation

    ERIC Educational Resources Information Center

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…

  12. Analytical solution for the advection-dispersion transport equation in layered media

    USDA-ARS?s Scientific Manuscript database

    The advection-dispersion transport equation with first-order decay was solved analytically for multi-layered media using the classic integral transform technique (CITT). The solution procedure used an associated non-self-adjoint advection-diffusion eigenvalue problem that had the same form and coef...
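
    For reference, the governing one-dimensional equation in each layer k, in a standard form consistent with the description above (not quoted from the database record), is

      \frac{\partial C_k}{\partial t}
        = D_k \frac{\partial^{2} C_k}{\partial x^{2}}
        - v_k \frac{\partial C_k}{\partial x}
        - \lambda_k C_k ,

    where D_k is the dispersion coefficient, v_k the pore-water velocity, and \lambda_k the first-order decay constant of layer k.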

  13. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  14. ANALYTICAL PROCEDURES FOR CHARACTERIZING UNREGULATED EMISSIONS FROM VEHICLES USING MIDDLE-DISTILLATE FUELS

    EPA Science Inventory

    This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e. diesel, gas turbine, Stirling, or Rankin cycle powerplants). It is a ...

  15. Computer program for calculating the flow field of supersonic ejector nozzles

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.

    1974-01-01

    An analytical procedure for computing the performance of supersonic ejector nozzles is presented. This procedure includes real sonic line effects and an interaction analysis for the mixing process between the two streams. The procedure is programmed in FORTRAN 4 and has operated successfully on IBM 7094, IBM 360, CDC 6600, and Univac 1108.

  16. Methodological Research on Knowledge Use and School Improvement. Volume III. Measuring Knowledge Use: A Procedural Inventory.

    ERIC Educational Resources Information Center

    Dunn, William N.; And Others

    This volume presents in one collection a systematic inventory of research and analytic procedures appropriate for generating information on knowledge production, diffusion, and utilization, gathered by the University of Pittsburgh Program for the Study of Knowledge Use. The main concern is with those procedures that focus on the utilization of…

  17. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--NHEXAS FILTER HANDLING, WEIGHING AND ARCHIVING PROCEDURES FOR AEROSOL SAMPLES (RTI/ACS-AP-209-011)

    EPA Science Inventory

    This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...

  18. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Methods with low recovery yields could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its main three metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: the endosulfan isomers (α and β) and its metabolites (endosulfan sulfate, ether and diol) as well as chlorpyrifos. In the first laboratory evaluation of these biobeds endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

    Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy to improve quality by addressing errors after their identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metrics. For this purpose, sigma metric analysis was performed for the analytes using internal and external quality control as quality indicators. The results of the sigma metric analysis were used to identify gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered as 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was found to be <3. The lowest sigma value was found for chloride (1.1) at L2. The highest sigma value was found for creatinine (10.1) at L3. HDL was found with the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of Westgard sigma rules provided a practical solution for an improved and focused design of the QC procedure.
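
    A hedged sketch of the sigma-metric calculation underlying the study, using hypothetical performance figures rather than the laboratory's data: sigma = (TEa% − |bias|%)/CV%, with bias taken from external QC, CV from internal QC, and 3 sigma as the minimum acceptable performance.

      analytes = {
          # name: (allowable total error TEa %, bias %, CV %)  -- hypothetical values
          "glucose":  (10.0, 2.1, 2.4),
          "chloride": (5.0,  2.8, 2.0),
          "HDL":      (30.0, 4.0, 3.0),
      }

      for name, (tea, bias, cv) in analytes.items():
          sigma = (tea - abs(bias)) / cv
          verdict = "acceptable (>=3)" if sigma >= 3 else "needs stricter QC (<3)"
          print(f"{name:9s} sigma = {sigma:4.1f} -> {verdict}")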

  20. Determination of endocrine-disrupting chemicals in human milk by dispersive liquid-liquid microextraction.

    PubMed

    Vela-Soria, Fernando; Jiménez-Díaz, Inmaculada; Díaz, Caridad; Pérez, José; Iribarne-Durán, Luz María; Serrano-López, Laura; Arrebola, Juan Pedro; Fernández, Mariana Fátima; Olea, Nicolás

    2016-09-01

    Human populations are widely exposed to numerous so-called endocrine-disrupting chemicals, exogenous compounds able to interfere with the endocrine system. This exposure has been associated with several health disorders. New analytical procedures are needed for biomonitoring these xenobiotics in human matrices. A quick and inexpensive methodological procedure, based on sample treatment by dispersive liquid-liquid microextraction, is proposed for the determination of bisphenols, parabens and benzophenones in human milk samples. LOQs ranged from 0.4 to 0.7 ng ml(-1) and RSDs from 4.3 to 14.8%. This methodology was satisfactorily applied to the simultaneous determination of a wide range of endocrine-disrupting chemicals in human milk samples and is suitable for application in biomonitoring studies.

  1. Pushover analysis of reinforced concrete frames considering shear failure at beam-column joints

    NASA Astrophysics Data System (ADS)

    Sung, Y. C.; Lin, T. K.; Hsiao, C. C.; Lai, M. C.

    2013-09-01

    Since most current seismic capacity evaluations of reinforced concrete (RC) frame structures are implemented by either static pushover analysis (PA) or dynamic time history analysis, with diverse settings of the plastic hinges (PHs) on such main structural components as columns, beams and walls, the complex behavior of shear failure at beam-column joints (BCJs) during major earthquakes is commonly neglected. This study proposes new nonlinear PA procedures that consider shear failure at BCJs and seek to assess the actual damage to RC structures. Based on the specifications of FEMA-356, a simplified joint model composed of two nonlinear cross struts placed diagonally over the location of the plastic hinge is established, allowing a sophisticated PA to be performed. To verify the validity of this method, the analytical results for the capacity curves and the failure mechanism derived from three different full-size RC frames are compared with the experimental measurements. By considering shear failure at BCJs, the proposed nonlinear analytical procedures can be used to estimate the structural behavior of RC frames, including seismic capacity and the progressive failure sequence of joints, in a precise and effective manner.

  2. Managing heteroscedasticity in general linear models.

    PubMed

    Rosopa, Patrick J; Schaffer, Meline M; Schroeder, Amber N

    2013-09-01

    Heteroscedasticity refers to a phenomenon where data violate a statistical assumption. This assumption is known as homoscedasticity. When the homoscedasticity assumption is violated, this can lead to increased Type I error rates or decreased statistical power. Because this can adversely affect substantive conclusions, the failure to detect and manage heteroscedasticity could have serious implications for theory, research, and practice. In addition, heteroscedasticity is not uncommon in the behavioral and social sciences. Thus, in the current article, we synthesize extant literature in applied psychology, econometrics, quantitative psychology, and statistics, and we offer recommendations for researchers and practitioners regarding available procedures for detecting heteroscedasticity and mitigating its effects. In addition to discussing the strengths and weaknesses of various procedures and comparing them in terms of existing simulation results, we describe a 3-step data-analytic process for detecting and managing heteroscedasticity: (a) fitting a model based on theory and saving residuals, (b) the analysis of residuals, and (c) statistical inferences (e.g., hypothesis tests and confidence intervals) involving parameter estimates. We also demonstrate this data-analytic process using an illustrative example. Overall, detecting violations of the homoscedasticity assumption and mitigating its biasing effects can strengthen the validity of inferences from behavioral and social science data.
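
    A minimal sketch of the three-step process described above, on simulated data (not the article's example): (a) fit a model and save residuals, (b) test the residuals for heteroscedasticity, and (c) draw inferences with heteroscedasticity-consistent standard errors.

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.stats.diagnostic import het_breuschpagan

      rng = np.random.default_rng(42)
      x = rng.uniform(0, 10, 200)
      y = 1.0 + 0.5 * x + rng.normal(0, 0.3 * (1 + x))   # error variance grows with x

      X = sm.add_constant(x)
      fit = sm.OLS(y, X).fit()                            # step (a): fit and save residuals
      lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(fit.resid, X)   # step (b)
      print(f"Breusch-Pagan p-value: {lm_pval:.4f}")

      robust = sm.OLS(y, X).fit(cov_type="HC3")           # step (c): robust inference
      print(robust.summary().tables[1])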

  3. Application of the analytic hierarchy process to a sustainability assessment of coastal beach exploitation: a case study of the wind power projects on the coastal beaches of Yancheng, China.

    PubMed

    Tian, Weijun; Bai, Jie; Sun, Huimei; Zhao, Yangguo

    2013-01-30

    Sustainability assessments of coastal beach exploitation are difficult because the identification of appropriate monitoring methodologies and evaluation procedures is still ongoing. In particular, the most suitable procedure for the application of sustainability assessment to coastal beaches remains uncertain. This paper presents a complete sustainability assessment process for coastal beach exploitation based on the analytic hierarchy process (AHP). We developed an assessment framework consisting of 14 indicators derived from the three dimensions of suitability, economic and social value, and ecosystem. We chose a wind power project on a coastal beach of Yancheng as a case study. The results indicated that the wind power farms on the coastal beach were not completely in keeping with sustainable development theory. The construction of the wind power farms had some negative impacts. Therefore, in the design stage, wind turbines should be designed and planned carefully to minimize these negative impacts. In addition, the case study demonstrated that the AHP was capable of addressing the complexities associated with the sustainability of coastal beaches. Copyright © 2012 Elsevier Ltd. All rights reserved.
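
    As an illustration of the AHP machinery referred to above (the pairwise judgments are hypothetical, not the study's): criterion weights come from the principal eigenvector of the comparison matrix, and consistency is screened with CR = CI/RI.

      import numpy as np

      # Saaty-style pairwise comparisons for three criterion groups
      A = np.array([[1.0, 3.0, 0.5],
                    [1/3, 1.0, 0.25],
                    [2.0, 4.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      weights = w / w.sum()                       # priority vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
      ri = 0.58                                   # random index for n = 3 (Saaty's table)
      print("weights:", np.round(weights, 3), " CR =", round(ci / ri, 3))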

  4. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    PubMed Central

    Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine

    2016-01-01

    The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400

  5. One- and Two-Equation Models to Simulate Ion Transport in Charged Porous Electrodes

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2018-01-19

    Energy storage in porous capacitor materials, capacitive deionization (CDI) for water desalination, capacitive energy generation, geophysical applications, and removal of heavy ions from wastewater streams are some examples of processes where understanding of ionic transport processes in charged porous media is very important. In this work, one- and two-equation models are derived to simulate ionic transport processes in heterogeneous porous media comprising two different pore sizes. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without Faradaic reactions or specific adsorption of ions. A two-step volume averaging technique is used to derive the averaged transport equations for multi-ionic systems without any further assumptions, such as thin electrical double layers or Donnan equilibrium. A comparison between both models is presented. The effective transport parameters for isotropic porous media are calculated by solving the corresponding closure problems. An approximate analytical procedure is proposed to solve the closure problems. Numerical and theoretical calculations show that the approximate analytical procedure yields adequate solutions. Lastly, a theoretical analysis shows that the value of interphase pseudo-transport coefficients determines which model to use.

  6. One- and Two-Equation Models to Simulate Ion Transport in Charged Porous Electrodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabitto, Jorge; Tsouris, Costas

    Energy storage in porous capacitor materials, capacitive deionization (CDI) for water desalination, capacitive energy generation, geophysical applications, and removal of heavy ions from wastewater streams are some examples of processes where understanding of ionic transport processes in charged porous media is very important. In this work, one- and two-equation models are derived to simulate ionic transport processes in heterogeneous porous media comprising two different pore sizes. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without Faradaic reactions or specific adsorption of ions. A two-step volume averaging technique is used to derive the averaged transport equations for multi-ionic systems without any further assumptions, such as thin electrical double layers or Donnan equilibrium. A comparison between both models is presented. The effective transport parameters for isotropic porous media are calculated by solving the corresponding closure problems. An approximate analytical procedure is proposed to solve the closure problems. Numerical and theoretical calculations show that the approximate analytical procedure yields adequate solutions. Lastly, a theoretical analysis shows that the value of interphase pseudo-transport coefficients determines which model to use.

  7. In-tube extraction for the determination of the main volatile compounds in Physalis peruviana L.

    PubMed

    Kupska, Magdalena; Jeleń, Henryk H

    2017-01-01

    An analytical procedure based on in-tube extraction followed by gas chromatography with mass spectrometry has been developed for the analysis of 24 of the main volatile components in cape gooseberry (Physalis peruviana L.) samples. According to their chemical structure, the compounds were organized into different groups: one hydrocarbon, one aldehyde, four alcohols, four esters, and 14 monoterpenes. By single-factor experiments, the incubation temperature, incubation time, extraction volume, extraction strokes, extraction speed, desorption temperature, and desorption speed were set at 60°C, 20 min, 1000 μL, 20, 50:50 μL/s, 280°C, and 100 μL/s, respectively. Quantitative analysis using authentic standards and external calibration curves was performed. The limit of detection and limit of quantification for the analytical procedure were calculated. The results showed that benzaldehyde, ethyl butanoate, 2-methyl-1-butanol, 1-hexanol, 1-butanol, α-terpineol, and terpinen-4-ol were the most abundant volatile compounds in the analyzed fruits (68.6-585 μg/kg). The obtained data may help qualify cape gooseberry as a superfruit and, therefore, increase its popularity. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. On the Temporal Stability of Analyte Recognition with an E-Nose Based on a Metal Oxide Sensor Array in Practical Applications.

    PubMed

    Kiselev, Ilia; Sysoev, Victor; Kaikov, Igor; Koronczi, Ilona; Adil Akai Tegin, Ruslan; Smanalieva, Jamila; Sommer, Martin; Ilicali, Coskan; Hauptmannl, Michael

    2018-02-11

    The paper deals with a functional instability of electronic nose (e-nose) units which significantly limits their real-life applications. Here we demonstrate how to approach this issue with the example of an e-nose based on a metal oxide sensor array developed at the Karlsruhe Institute of Technology (Germany). We consider the instability of e-nose operation at different time scales ranging from minutes to many years. To test the e-nose we employ open-air and headspace sampling of analyte odors. The multivariate recognition algorithm to process the multisensor array signals is based on the linear discriminant analysis method. Based on the results obtained, we argue that the stability of device operation is mostly affected by accidental changes in the ambient air composition. To overcome these instabilities, we introduce an add-training procedure which is found to successfully manage both the temporal changes of the ambient air and the drift of multisensor array properties, even over the long term. The method can be easily implemented in practical applications of e-noses and improves prospects for device marketing.
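
    A schematic sketch of the add-training idea on synthetic sensor-array data (array size, drift and class structure are assumptions, not the KIT device's): an LDA classifier is periodically refit on the original training set plus a few freshly relabelled measurements, so that ambient changes and sensor drift are folded into the model.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(7)
      n_sensors = 38
      centers = rng.normal(0, 1, (3, n_sensors))              # three hypothetical analytes

      def batch(drift):
          X = np.vstack([c + drift + rng.normal(0, 0.3, (30, n_sensors)) for c in centers])
          return X, np.repeat([0, 1, 2], 30)

      X0, y0 = batch(0.0)                                     # initial training data
      lda = LinearDiscriminantAnalysis().fit(X0, y0)

      Xnew, ynew = batch(0.8)                                 # later, drifted measurements
      print("accuracy before add-training:", lda.score(Xnew, ynew))

      idx = rng.choice(len(Xnew), 15, replace=False)          # a few relabelled new samples
      lda = LinearDiscriminantAnalysis().fit(np.vstack([X0, Xnew[idx]]),
                                             np.concatenate([y0, ynew[idx]]))
      print("accuracy after add-training: ", lda.score(Xnew, ynew))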

  9. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    PubMed Central

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules by considering the solubility of the dipyridamole and excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out in accordance with the Q2 guidelines for validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis. PMID:29686931

  10. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique.

    PubMed

    Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules by considering the solubility of the dipyridamole and excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out in accordance with the Q2 guidelines for validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.

  11. On the Temporal Stability of Analyte Recognition with an E-Nose Based on a Metal Oxide Sensor Array in Practical Applications

    PubMed Central

    Kaikov, Igor; Koronczi, Ilona; Adil Akai Tegin, Ruslan; Smanalieva, Jamila; Sommer, Martin; Ilicali, Coskan; Hauptmannl, Michael

    2018-01-01

    The paper deals with a functional instability of electronic nose (e-nose) units which significantly limits their real-life applications. Here we demonstrate how to approach this issue with the example of an e-nose based on a metal oxide sensor array developed at the Karlsruhe Institute of Technology (Germany). We consider the instability of e-nose operation at different time scales ranging from minutes to many years. To test the e-nose we employ open-air and headspace sampling of analyte odors. The multivariate recognition algorithm to process the multisensor array signals is based on the linear discriminant analysis method. Based on the results obtained, we argue that the stability of device operation is mostly affected by accidental changes in the ambient air composition. To overcome these instabilities, we introduce an add-training procedure which is found to successfully manage both the temporal changes of the ambient air and the drift of multisensor array properties, even over the long term. The method can be easily implemented in practical applications of e-noses and improves prospects for device marketing. PMID:29439468

  12. Neutron radiative capture methods for surface elemental analysis

    USGS Publications Warehouse

    Trombka, J.I.; Senftle, F.; Schmadebeck, R.

    1970-01-01

    Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytical procedure are presented. © 1970.
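    As a generic illustration of pulse-height spectrum interpretation (not the authors' procedure), the sketch below unfolds a synthetic NaI(Tl)-like spectrum into elemental contributions by non-negative least squares against a small library of assumed response shapes; the element names and shapes are placeholders.

```python
# Library least-squares unfolding of a synthetic pulse-height spectrum.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_channels = 64

def library_spectrum(peak_channel):
    """Fake per-element response: a Gaussian photopeak on a smooth continuum."""
    ch = np.arange(n_channels)
    return np.exp(-0.5 * ((ch - peak_channel) / 3.0) ** 2) + 0.2 * np.exp(-ch / 40.0)

# Columns of A are the library spectra for three hypothetical elements
elements = ["Si", "Fe", "H"]
A = np.column_stack([library_spectrum(p) for p in (18, 35, 50)])

# "Measured" spectrum: a mixture of the library spectra plus noise
true_weights = np.array([2.0, 0.7, 1.2])
measured = A @ true_weights + rng.normal(0.0, 0.02, n_channels)

# Non-negative least squares recovers the relative elemental contributions
weights, residual = nnls(A, measured)
for element, w in zip(elements, weights):
    print(f"{element}: fitted contribution {w:.2f}")
```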

  13. Estimating and testing mediation and moderation in within-subject designs.

    PubMed

    Judd, C M; Kenny, D A; McClelland, G H

    2001-06-01

    Analyses designed to detect mediation and moderation of treatment effects are increasingly prevalent in research in psychology. The mediation question concerns the processes that produce a treatment effect. The moderation question concerns factors that affect the magnitude of that effect. Although analytic procedures have been reasonably well worked out in the case in which the treatment varies between participants, no systematic procedures for examining mediation and moderation have been developed in the case in which the treatment varies within participants. The authors present an analytic approach to these issues using ordinary least squares estimation.
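    A minimal sketch of the difference-score regression this approach is built on, assuming a two-condition within-subject design with simulated data; the variable names and effect sizes are illustrative, and the paper should be consulted for the full set of tests.

```python
# Within-subject mediation/moderation via OLS on difference scores
# (after Judd, Kenny & McClelland, 2001). Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 80
m1 = rng.normal(0, 1, n)                    # mediator, condition 1
m2 = m1 + 0.8 + rng.normal(0, 0.5, n)       # mediator, condition 2 (treatment raises M)
y1 = 0.6 * m1 + rng.normal(0, 1, n)         # outcome, condition 1
y2 = 0.6 * m2 + 0.3 + rng.normal(0, 1, n)   # outcome, condition 2

y_diff = y2 - y1
m_diff = m2 - m1
m_sum_c = (m1 + m2) - (m1 + m2).mean()      # centered sum of the mediator

X = sm.add_constant(pd.DataFrame({"m_diff": m_diff, "m_sum_c": m_sum_c}))
fit = sm.OLS(y_diff, X).fit()
print(fit.params)
print(fit.pvalues)
# Reading the output under this approach:
#   - the m_diff coefficient speaks to mediation of the treatment effect,
#   - a nonzero intercept indicates a residual (unmediated) treatment effect,
#   - the m_sum_c coefficient speaks to moderation by the mediator's overall level.
```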

  14. Analytical studies of the Space Shuttle orbiter nose-gear tire

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.; Robinson, Martha P.

    1991-01-01

    A computational procedure is presented for evaluating the analytic sensitivity derivatives of the tire response with respect to material and geometrical properties of the tire. The tire is modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The computational procedure is applied to the case of the Space Shuttle orbiter nose-gear tire subjected to uniform inflation pressure. Numerical results are presented which show the sensitivity of the different tire response quantities to variations in the material characteristics of both the cord and rubber.
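    As a toy illustration of what a sensitivity derivative of a response quantity is (deliberately far simpler than the laminated shell model in the paper), the sketch below differentiates a one-parameter response analytically and checks it against a finite-difference estimate.

```python
# Analytic vs. finite-difference sensitivity of a simple response quantity.
def deflection(pressure, stiffness):
    """Response quantity: deflection of a linear spring-like model."""
    return pressure / stiffness

def d_deflection_d_stiffness(pressure, stiffness):
    """Analytic sensitivity derivative d(deflection)/d(stiffness)."""
    return -pressure / stiffness**2

p, k, h = 2.0e5, 5.0e6, 1.0
analytic = d_deflection_d_stiffness(p, k)
finite_diff = (deflection(p, k + h) - deflection(p, k - h)) / (2 * h)
print(f"analytic: {analytic:.3e}, finite difference: {finite_diff:.3e}")
```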

  15. Study of a heat rejection system using capillary pumping

    NASA Technical Reports Server (NTRS)

    Neal, L. G.; Wanous, D. J.; Clausen, O. W.

    1971-01-01

    Results of an analytical study investigating the application of capillary pumping to the heat rejection loop of an advanced Rankine cycle power conversion system are presented. The feasibility of capillary pumping as an alternative to electromagnetic pumping is analytically demonstrated. Capillary pumping is shown to offer potential savings in weight and electrical power, as well as improved reliability through the use of redundant systems. A screen wick pump design with arterial feed lines was analytically developed; its advantages are high thermodynamic and hydrodynamic efficiency, which yield a lightweight, easily packaged system. Operational problems were identified that must be solved for successful application of capillary pumping, the most important being the development of startup and shutdown procedures, a means of keeping noncondensables out of the system, and earth-bound testing procedures.

  16. Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.

    PubMed

    Blake, Christopher J

    2007-09-01

    Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly determine water-soluble vitamin content in fortified foods for compliance monitoring, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for the analysis of B vitamins. However, they are no longer considered the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it covers the status of the official methods and highlights new developments in chromatographic procedures and detection methods. An overview of multivitamin extraction and analysis for foods and supplements is also given.

  17. Analytic modeling of aerosol size distributions

    NASA Technical Reports Server (NTRS)

    Deepack, A.; Box, G. P.

    1979-01-01

    Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.
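    A hedged sketch of fitting one commonly used analytic model, a lognormal number distribution, to synthetic size-distribution data; the parameter values are illustrative, and the visual-matching catalog described above is not reproduced here.

```python
# Least-squares fit of a lognormal size-distribution model dN/dlnr.
import numpy as np
from scipy.optimize import curve_fit

def lognormal_dist(r, n_total, r_mode, sigma_g):
    """dN/dlnr for a lognormal distribution with mode radius r_mode and
    geometric standard deviation sigma_g."""
    return (n_total / (np.sqrt(2 * np.pi) * np.log(sigma_g))
            * np.exp(-0.5 * (np.log(r / r_mode) / np.log(sigma_g)) ** 2))

# Synthetic "measured" size distribution (radii in micrometres)
radii = np.logspace(-2, 1, 40)
truth = lognormal_dist(radii, 1000.0, 0.3, 1.8)
data = truth * (1 + 0.05 * np.random.default_rng(0).normal(size=radii.size))

params, _ = curve_fit(lognormal_dist, radii, data, p0=[500.0, 0.1, 2.0])
print("fitted N_total, r_mode, sigma_g:", params)
```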

  18. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of large matrices.

  19. The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)

    DTIC Science & Technology

    2000-11-20

    development of an analytical procedure for toxic organic compounds, including TBT (tributyltin), whose turnaround time would be on the order of minutes... Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT: This Subtask performed a preliminary investigation leading to the... "Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parrado, G., E-mail: gparrado@sgc.gov.co; Cañón, Y.; Peña, M., E-mail: mlpena@sgc.gov.co

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first 2015 round of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results obtained |Z-scores| ≤ 3.
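    A minimal sketch of the Z-score check used in proficiency testing such as the Wepal rounds mentioned above; the assigned values, standard deviations, and reported results below are invented, and the 2/3 cutoffs follow the usual proficiency-testing convention rather than anything stated in this record.

```python
# Proficiency-test Z-scores: z = (x_lab - x_assigned) / sigma_p.
assigned = {"Fe": 3.2e4, "Zn": 85.0, "Co": 12.5}   # assigned values (mg/kg)
sigma_p  = {"Fe": 2.4e3, "Zn": 7.0,  "Co": 1.1}    # standard deviation for proficiency
reported = {"Fe": 3.35e4, "Zn": 79.0, "Co": 15.9}  # laboratory results

for element, x_lab in reported.items():
    z = (x_lab - assigned[element]) / sigma_p[element]
    verdict = ("satisfactory" if abs(z) <= 2
               else "questionable" if abs(z) <= 3
               else "unsatisfactory")
    print(f"{element}: z = {z:+.2f} ({verdict})")
```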
